US20150112159A1 - Alertness Detection - Google Patents


Info

Publication number
US20150112159A1
Authority
US
United States
Prior art keywords
subject
implementations
data
acquired
sleep
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Abandoned
Application number
US14/522,398
Other languages
English (en)
Inventor
David Da He
Richard Robehr Bijjani
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
Robert F Dudley As Trustee Of Quanttus Liquidating Trust
Original Assignee
Quanttus Inc
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by Quanttus Inc
Priority to US14/522,398
Assigned to QUANTTUS, INC. (Assignors: BIJJANI, RICHARD ROBEHR; HE, DAVID DA)
Publication of US20150112159A1
Assigned to ROBERT F. DUDLEY, AS TRUSTEE OF THE QUANTTUS LIQUIDATING TRUST (Assignor: QUANTTUS, INC.)
Legal status: Abandoned


Classifications

    • G08B 21/0453 Sensor means for detecting, worn on the body to detect health condition by physiological monitoring, e.g. electrocardiogram, temperature, breathing
    • A61B 5/16 Devices for psychotechnics; Testing reaction times; Devices for evaluating the psychological state
    • A61B 5/0022 Monitoring a patient using a global network, e.g. telephone networks, internet
    • A61B 5/02028 Determining haemodynamic parameters not otherwise provided for, e.g. cardiac contractility or left ventricular ejection fraction
    • A61B 5/0205 Simultaneously evaluating both cardiovascular conditions and different types of body conditions, e.g. heart and respiratory condition
    • A61B 5/02055 Simultaneously evaluating both cardiovascular condition and temperature
    • A61B 5/02108 Measuring pressure in heart or blood vessels from analysis of pulse wave characteristics
    • A61B 5/02125 Measuring pressure in heart or blood vessels from analysis of pulse wave propagation time
    • A61B 5/02416 Detecting, measuring or recording pulse rate or heart rate using photoplethysmograph signals, e.g. generated by infrared radiation
    • A61B 5/02427 Details of sensor
    • A61B 5/0285 Measuring or recording phase velocity of blood waves
    • A61B 5/029 Measuring or recording blood output from the heart, e.g. minute volume
    • A61B 5/08 Detecting, measuring or recording devices for evaluating the respiratory organs
    • A61B 5/11 Measuring movement of the entire body or parts thereof, e.g. head or hand tremor, mobility of a limb
    • A61B 5/1102 Ballistocardiography
    • A61B 5/1118 Determining activity level
    • A61B 5/113 Measuring movement of the entire body or parts thereof occurring during breathing
    • A61B 5/117 Identification of persons
    • A61B 5/14542 Measuring characteristics of blood in vivo for measuring blood gases
    • A61B 5/165 Evaluating the state of mind, e.g. depression, anxiety
    • A61B 5/4809 Sleep detection, i.e. determining whether a subject is asleep or not
    • A61B 5/4815 Sleep quality
    • A61B 5/4818 Sleep apnoea
    • A61B 5/4833 Assessment of subject's compliance to treatment
    • A61B 5/486 Bio-feedback
    • A61B 5/6801 Sensor arrangements specially adapted to be attached to or worn on the body surface
    • A61B 5/681 Wristwatch-type devices
    • A61B 5/6824 Sensors specially adapted to be attached to a specific body part: arm or wrist
    • A61B 5/6898 Sensors mounted on external non-worn devices: portable consumer electronic devices, e.g. music players, telephones, tablet computers
    • A61B 5/7275 Determining trends in physiological measurement data; Predicting development of a medical condition based on physiological measurements, e.g. determining a risk factor
    • A61B 5/7278 Artificial waveform generation or derivation, e.g. synthesising signals from measured signals
    • A61B 5/7282 Event detection, e.g. detecting unique waveforms indicative of a medical condition
    • A61B 5/742 Notification to user or communication with user or patient using visual displays
    • A61B 5/746 Alarms related to a physiological condition, e.g. details of setting alarm thresholds or avoiding false alarms
    • G06F 21/32 User authentication using biometric data, e.g. fingerprints, iris scans or voiceprints
    • G06F 3/0481 Interaction techniques based on graphical user interfaces [GUI] based on specific properties of the displayed interaction object or a metaphor-based environment
    • G08B 21/06 Alarms for ensuring the safety of persons indicating a condition of sleep, e.g. anti-dozing alarms
    • A61B 5/02007 Evaluating blood vessel condition, e.g. elasticity, compliance
    • A61B 5/02405 Determining heart rate variability
    • A61B 5/02438 Detecting, measuring or recording pulse rate or heart rate with portable devices, e.g. worn by the patient
    • A61B 5/0531 Measuring skin impedance
    • A61B 5/0816 Measuring devices for examining respiratory frequency
    • A61B 5/363 Detecting tachycardia or bradycardia
    • H04L 67/12 Protocols specially adapted for proprietary or special-purpose networking environments, e.g. medical networks, sensor networks, networks in vehicles or remote metering networks

Definitions

  • This document describes technology related to consumer biometric devices.
  • Various types of sensors can be used for sensing biometric parameters.
  • In one aspect, a method includes obtaining, using a first sensor, a first data set representing time-varying information on at least one pulse pressure wave within vasculature at a first body part of a subject. The method also includes obtaining, using a second sensor, a second data set representing time-varying information about motion of the subject at the first body part. The method also includes identifying, using one or more processors, a first point in the first data set, the first point representing an arrival time of the pulse pressure wave at the first body part. The method also includes identifying, using the one or more processors, a second point in the second data set, the second point representing an earlier time at which the pulse pressure wave traverses a second body part of the subject. The method also includes computing a pulse transit time (PTT) as the difference between the first and second points, the PTT representing the time taken by the pulse pressure wave to travel from the second body part to the first body part of the subject.
  • In another aspect, one or more machine-readable storage devices store instructions that are executable by one or more processing devices to perform operations including obtaining a first data set representing time-varying information on at least one pulse pressure wave within vasculature at a first body part of a subject.
  • The operations also include obtaining a second data set representing time-varying information about motion of the subject at the first body part.
  • The operations also include identifying a first point in the first data set. The first point represents an arrival time of the pulse pressure wave at the first body part.
  • The operations also include identifying a second point in the second data set. The second point represents an earlier time at which the pulse pressure wave traverses a second body part of the subject.
  • The operations also include computing a pulse transit time (PTT) as the difference between the first and second points.
  • The PTT represents the time taken by the pulse pressure wave to travel from the second body part to the first body part of the subject.
  • In another aspect, a device includes a first sensor configured to obtain a first data set representing time-varying information on at least one pulse pressure wave within vasculature at a first body part of a subject, and a second sensor configured to obtain a second data set representing time-varying information about motion of the subject at the first body part.
  • The device also includes memory.
  • The device also includes one or more processors.
  • The one or more processors are configured to receive the first and second data sets.
  • The one or more processors are also configured to identify a first point in the first data set, the first point representing an arrival time of the pulse pressure wave at the first body part.
  • The one or more processors are also configured to identify a second point in the second data set, the second point representing an earlier time at which the pulse pressure wave traverses a second body part of the subject.
  • The one or more processors are also configured to compute a pulse transit time (PTT) as the difference between the first and second points.
  • The PTT represents the time taken by the pulse pressure wave to travel from the second body part to the first body part of the subject.
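Once the two points are identified, the PTT computation described above is a simple time difference. A minimal sketch; the function name, units, and example times are illustrative, not from the patent:

```python
def compute_ptt(t_arrival_first: float, t_traversal_second: float) -> float:
    """Pulse transit time in seconds: arrival time of the pulse pressure wave
    at the first body part (e.g. the wrist, from PPG data) minus the earlier
    time it traverses the second body part (from motion/MoCG data)."""
    ptt = t_arrival_first - t_traversal_second
    if ptt <= 0:
        # The arrival at the first body part must follow the traversal.
        raise ValueError("arrival time must be later than traversal time")
    return ptt

# Hypothetical example: wave traverses the second body part at t = 0.120 s
# and arrives at the wrist at t = 0.185 s, giving a PTT of about 65 ms.
ptt = compute_ptt(0.185, 0.120)
```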
  • Implementations can include one or more of the following features.
  • The information about the at least one pulse pressure wave includes photoplethysmographic (PPG) data, and the information about motion of the subject includes one or both of motioncardiogram (MoCG) data and gross motion data.
  • Data including at least one of the first data set and the second data set is acquired continuously.
  • The data is acquired at a frequency of at least 16 Hz.
  • The data is acquired at a frequency of between 75 Hz and 85 Hz.
  • The data is acquired by a device worn by the subject.
  • The device is mobile and does not reduce the mobility of the subject.
  • The device processes the data.
  • The first body part is an arm of the subject.
  • The first body part is a wrist of the subject.
  • The first sensor includes an optical sensor, and the second sensor includes an accelerometer or a gyroscope.
  • Identifying the first point includes computing, by the one or more processors, a cross-correlation of a template segment with each of multiple segments of the first data set. Identifying the first point also includes identifying, based on the computed cross-correlations, at least one candidate segment of the first data set as including the first point. Identifying the first point also includes identifying, by the one or more processors, a first feature within the identified candidate segment as the first point.
  • Identifying the second point includes determining a reference point in the second data set, the reference point corresponding to substantially the same point in time as the first point in the first data set. Identifying the second point also includes identifying one or more target features within a predetermined time range relative to the reference point. Identifying the second point also includes selecting a time point corresponding to one of the target features as the second point.
  • The target features include at least one of a peak and a valley.
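The two identification steps above can be sketched as follows: normalized cross-correlation of a template beat against sliding windows of the first (PPG) data set, then selection of a peak or valley of the motion data within a window around the reference index. All names, thresholds, and window sizes are illustrative, not from the patent:

```python
import numpy as np

def find_candidate_segments(signal, template, threshold=0.8):
    """Score each sliding window of `signal` by normalized cross-correlation
    with `template`; windows scoring above `threshold` are candidate
    segments that may contain the first point (pulse arrival)."""
    n = len(template)
    t = (template - template.mean()) / (template.std() + 1e-12)
    candidates = []
    for i in range(len(signal) - n + 1):
        w = signal[i:i + n]
        w = (w - w.mean()) / (w.std() + 1e-12)
        if float(np.dot(t, w) / n) > threshold:
            candidates.append(i)
    return candidates

def second_point_near(motion, ref_idx, window):
    """Within +/- `window` samples of the reference index, pick the largest
    extremum (peak or valley) of the motion signal as the second point."""
    lo, hi = max(0, ref_idx - window), min(len(motion), ref_idx + window)
    seg = motion[lo:hi]
    peak = lo + int(np.argmax(seg))
    valley = lo + int(np.argmin(seg))
    return peak if abs(motion[peak]) >= abs(motion[valley]) else valley
```

In practice the template would be a previously observed pulse waveform, and the threshold would be tuned to the sensor's noise level.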
  • The method also includes computing a blood pressure of the subject as a function of the PTT.
  • The blood pressure includes a systolic pressure and a diastolic pressure.
  • The diastolic pressure is calculated as a linear function of the logarithm of the PTT.
  • The systolic pressure is calculated as a linear function of the diastolic pressure.
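The two relations just stated have the form diastolic = a·log(PTT) + b and systolic = c·diastolic + d. A sketch with made-up calibration constants; this summary does not disclose coefficient values, and in practice they would be fit per subject:

```python
import math

# a, b, c, d are hypothetical calibration constants, not values from the patent.
def diastolic_from_ptt(ptt_s: float, a: float = -30.0, b: float = 30.0) -> float:
    """Diastolic pressure (mmHg) as a linear function of log(PTT):
    shorter transit times (stiffer, higher-pressure vessels) give higher values."""
    return a * math.log(ptt_s) + b

def systolic_from_diastolic(dia: float, c: float = 1.1, d: float = 35.0) -> float:
    """Systolic pressure (mmHg) as a linear function of diastolic pressure."""
    return c * dia + d
```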
  • The predetermined time range is associated with the systole portion of the subject's heartbeat.
  • The method also includes accepting user input for initiating computation of the PTT.
  • The method also includes computing arterial stiffness as a function of the PTT.
  • The device also includes a mechanism that allows the device to be worn by the subject.
  • The mechanism does not reduce the mobility of the subject.
  • The one or more processors are also configured to compute a blood pressure of the subject as a function of the PTT.
  • The device also includes an input mechanism configured to accept user input for initiating computation of the PTT.
  • The one or more processors are also configured to compute arterial stiffness as a function of the PTT.
  • In another aspect, a method includes processing data in a first dataset that represents time-varying information about at least one pulse pressure wave propagating through blood in a subject acquired at a location of the subject. The method also includes processing data in a second dataset that represents time-varying information about motion of the subject acquired at the location of the subject. The method also includes detecting arrhythmia of the subject based on the data.
  • one or more machine-readable storage devices store instructions that are executable by one or more processing devices to perform operations including processing data in a first dataset that represents time-varying information about at least one pulse pressure wave propagating through blood in a subject acquired at a location of the subject.
  • the operations also include processing data in a second dataset that represents time-varying information about motion of the subject acquired at the location of the subject.
  • the operations also include detecting arrhythmia of the subject based on the data.
  • the device also includes an optical sensor configured to receive the emitted light after the emitted light reflects off of the skin of the subject.
  • the optical sensor is also configured to provide data that corresponds to a characteristic of the received light, the data representing time-varying information about at least one pulse pressure wave propagating through blood in the subject acquired by the optical sensor at a location of the subject.
  • the device also includes a motion sensor configured to provide data that represents time-varying information about motion of the subject acquired by the motion sensor at the location of the subject.
  • the device also includes a processor configured to receive data from one or more of the light-emitting element, the optical sensor, and the motion sensor. The processor is also configured to detect arrhythmia of the subject based on the data.
  • Implementations can include one or more of the following features.
  • the information about at least one pulse pressure wave propagating through blood in the subject includes photoplethysmographic (PPG) data and the information about motion of the subject includes one or both of motioncardiogram (MoCG) data and gross motion data.
  • the data is acquired continuously.
  • the data is acquired at a frequency of at least 16 Hz.
  • the data is acquired at a frequency of between 75 Hz and 85 Hz.
  • the data is acquired at a single location of the subject.
  • the data is acquired by a device worn by the subject.
  • the device is mobile and does not reduce a mobility of the subject.
  • the device processes the data.
  • the single location is an arm of the subject.
  • the single location is a wrist of the subject.
  • the arrhythmia includes atrial fibrillation (AFIB).
  • the arrhythmia includes atrial flutter.
  • the method also includes identifying, based on gross motion data of the subject, one or more periods of high activity of the subject.
  • the data that the arrhythmia detection is based on does not include data collected during the one or more periods of high activity.
  • the data that the arrhythmia detection is based on includes data collected during the one or more periods of high activity.
  • processing the data includes plotting R wave to R wave intervals (RR_i) versus next consecutive R wave to R wave intervals (RR_i+1).
  • processing the data includes determining whether a spread of plotted data points exceeds a predetermined spread value.
  • the method also includes determining that the subject experienced atrial fibrillation (AFIB) if the spread of the plotted data points exceeds the predetermined spread value.
  • processing the data includes determining whether multiple clusters of plotted data points are offset from a diagonal.
  • the method also includes determining that the subject experienced atrial flutter if there are multiple clusters of plotted data points offset from the diagonal.
  • processing the data includes determining one or more of heart rate, heart rate variability, and blood pressure of the subject.
  • determining the heart rate of the subject includes calculating a distance between two consecutive reference points in the first dataset, the distance representing a time that has elapsed between two consecutive heartbeats of the subject.
  • the reference points are local maxima or local minima.
  • the reference points are peaks or valleys.
  • determining the heart rate variability of the subject includes calculating distances between multiple pairs of consecutive reference points in the first dataset, each distance representing a time that has elapsed between two consecutive heartbeats of the subject.
  • Atrial fibrillation is detected if the heart rate variability of the subject crosses a threshold.
  • determining the blood pressure of the subject includes identifying a first point in the first dataset, the first point representing an arrival time of the pulse pressure wave at a first body part of the subject. Determining the blood pressure of the subject also includes identifying a second point in the second dataset, the second point representing an earlier time at which the pulse pressure wave traverses a second body part of the subject. Determining the blood pressure of the subject also includes computing a pulse transit time (PTT) as a difference between the first and second points, the PTT representing a time taken by the pulse pressure wave to travel from the second body part to the first body part of the subject, wherein the PTT is related to an elasticity of one or more blood vessels of the subject. Determining the blood pressure of the subject also includes determining the blood pressure of the subject based on the elasticity of the one or more blood vessels.
  • the first body part is the location of the subject at which the data in the first data set is acquired, and the second body part is the heart of the subject.
  • processing the data includes plotting R wave to R wave intervals (RR_i) versus next consecutive R wave to R wave intervals (RR_i+1).
  • processing the data includes determining whether a spread of plotted data points exceeds a predetermined spread value.
  • the processor is also configured to determine that the subject experienced atrial fibrillation (AFIB) if the spread of the plotted data points exceeds the predetermined spread value.
  • processing the data includes determining whether multiple clusters of plotted data points are offset from a diagonal.
  • the processor is also configured to determine that the subject experienced atrial flutter if there are multiple clusters of plotted data points offset from the diagonal.
  • In another aspect, a method includes processing data that represents time-varying information about at least one pulse pressure wave propagating through blood in each of one or more subjects acquired at a location of each of the subjects.
  • the method also includes processing data that represents time-varying information about motion of the one or more subjects acquired at the location of each of the subjects.
  • the method also includes determining, based on the data, a quality of care provided to the one or more subjects by a care facility that cares for the one or more subjects.
  • one or more machine-readable storage devices store instructions that are executable by one or more processing devices to perform operations including processing data that represents time-varying information about at least one pulse pressure wave propagating through blood in each of one or more subjects acquired at a location of each of the subjects.
  • the operations also include processing data that represents time-varying information about motion of the one or more subjects acquired at the location of each of the subjects.
  • the operations also include determining, based on the data, a quality of care provided to the one or more subjects by a care facility that cares for the one or more subjects.
  • a biofeedback device configured to be worn by one or more subjects includes a light source configured to emit light toward the skin of the subject.
  • the device also includes an optical sensor configured to receive the emitted light after the emitted light reflects off of the skin of the subject.
  • the optical sensor is also configured to provide data that corresponds to a characteristic of the received light, the data representing time-varying information about at least one pulse pressure wave propagating through blood in the subject acquired by the optical sensor at a location of the subject.
  • the device also includes a motion sensor configured to provide data that represents time-varying information about motion of the subject acquired by the motion sensor at the location of the subject.
  • the device also includes a processor configured to receive data from one or more of the light-emitting element, the optical sensor, and the motion sensor.
  • the processor is also configured to determine, based on the data, a quality of care provided to one or more subjects by a care facility that cares for the one or more subjects.
  • Implementations can include one or more of the following features.
  • the information about at least one pulse pressure wave propagating through blood in the subjects includes photoplethysmographic (PPG) data and the information about motion of the subjects includes one or both of motioncardiogram (MoCG) data and gross motion data.
  • the data is acquired continuously.
  • the data is acquired at a frequency of at least 16 Hz.
  • the data is acquired at a frequency of between 75 Hz and 85 Hz.
  • the data is acquired at single locations of each of the subjects.
  • the data is acquired by devices worn by the subjects.
  • the devices are mobile and do not reduce mobility of the subjects.
  • the devices process the data.
  • the single location of each of the subjects is an arm of the subject.
  • the single location is a wrist of the subject.
  • determining a quality of care provided to the one or more subjects includes determining a level of physical activity experienced by each of the one or more subjects by comparing gross motion data of each subject to a threshold value.
  • the threshold is based on a metric defined by a health organization.
  • the level of physical activity includes an amount of time that each subject has exercised over a particular time period.
  • the level of physical activity includes an amount of time or a distance that each subject has walked over a particular time period.
  • the method also includes processing data that represents information about an amount of ultraviolet light that each of the one or more subjects has been exposed to over a particular time period.
  • the method also includes determining an amount of time that each of the one or more subjects has spent outside over the particular time period based on the information about the ultraviolet light.
  • the method also includes comparing the quality of care provided by the care facility to a quality of care provided by another care facility that cares for one or more other subjects.
  • the device also includes an ultraviolet light sensor configured to measure levels of ultraviolet light that each of the one or more subjects is exposed to over a particular time period.
  • the processor is also configured to process data that represents information about the levels of ultraviolet light that each of the one or more subjects is exposed to over the particular time period.
  • the processor is also configured to determine an amount of time that each of the one or more subjects has spent outside over the particular time period based on the information about the levels of ultraviolet light.
  • determining the quality of care provided to the one or more subjects includes determining a level of physical activity experienced by each of the one or more subjects by comparing gross motion data of each subject to a threshold value.
  • In another aspect, a method includes processing data in a first dataset that represents time-varying information about at least one pulse pressure wave propagating through blood in a subject acquired at a location of the subject. The data is acquired while the subject is in a situation associated with risk indicated by the data.
  • one or more machine-readable storage devices store instructions that are executable by one or more processing devices to perform operations including processing data in a first dataset that represents time-varying information about at least one pulse pressure wave propagating through blood in a subject acquired at a location of the subject. The data is acquired while the subject is in a situation associated with risk indicated by the data.
  • the device also includes an optical sensor configured to receive the emitted light after the emitted light reflects off of the skin of the subject.
  • the optical sensor is also configured to provide data that corresponds to a characteristic of the received light, the data representing time-varying information about at least one pulse pressure wave propagating through blood in the subject acquired by the optical sensor at a location of the subject.
  • the device also includes a processor configured to receive data from one or both of the light-emitting element and the optical sensor.
  • the processor is also configured to process the data to determine whether the subject is in a situation associated with risk and to derive a measure of a level of risk associated with the subject.
  • Implementations can include one or more of the following features.
  • the method also includes processing data in a second dataset that represents time-varying information about motion of the subject acquired at the location of the subject.
  • the information about at least one pulse pressure wave propagating through blood in the subject includes photoplethysmographic (PPG) data and the information about motion of the subject includes one or both of motioncardiogram (MoCG) data and gross motion data.
  • the data is acquired continuously.
  • the data is acquired at a frequency of at least 16 Hz.
  • the data is acquired at a frequency of between 75 Hz and 85 Hz.
  • the data is acquired at a single location of the subject.
  • the data is acquired by a device worn by the subject.
  • the device is mobile and does not reduce a mobility of the subject.
  • the device processes the data.
  • the single location is an arm of the subject.
  • the single location is a wrist of the subject.
  • the method also includes using the processed data to derive a measure of a level of risk associated with the subject.
  • the method also includes identifying a first point in the first dataset, the first point representing an arrival time of the pulse pressure wave at a first body part of the subject.
  • the method also includes identifying a second point in the second dataset, the second point representing an earlier time at which the pulse pressure wave traverses a second body part of the subject.
  • the method also includes computing a pulse transit time (PTT) as a difference between the first and second points, the PTT representing a time taken by the pulse pressure wave to travel from the second body part to the first body part of the subject.
  • the first body part is the location of the subject at which the data in the first data set is acquired, and the second body part is the heart of the subject.
  • the method also includes determining a blood pressure of the subject based on the PTT.
  • the risk includes trauma to the subject and the data is indicative of the existence of the trauma.
  • the method also includes providing the processed data to a party that is responding to the trauma.
  • the processed data is transmitted from a device worn by the subject to a remote device.
  • the remote device is a server associated with an emergency service provider.
  • the processed data is provided to the party before the party has reached the subject.
  • the method also includes processing data that represents time-varying information about at least one pulse pressure wave propagating through blood in additional subjects acquired at a location of each of the subjects.
  • the method also includes processing data that represents time-varying information about motion of the additional subjects acquired at the location of each of the subjects.
  • the data is acquired while the additional subjects are in the situation associated with the risk, and the risk includes trauma.
  • the method also includes providing the processed data for the subject and the additional subjects to a party that is responding to the trauma, before the party has reached the subjects.
  • the processed data is transmitted from devices worn by the subjects to a remote device.
  • the remote device is a server associated with an emergency service provider.
  • the method also includes providing information to the party that enables the party to assess a level of risk associated with each of the subjects before the party has reached the subjects.
  • the method also includes providing the processed data to a medical facility to which the subject is taken for medical care.
  • the risk includes trauma.
  • providing the processed data to a medical facility includes providing the processed data to an urgent care division of the medical facility.
  • the information is provided to the urgent care division before the subject is treated by the urgent care division.
  • the method also includes processing data that represents time-varying information about at least one pulse pressure wave propagating through blood in additional subjects acquired at a location of each of the subjects.
  • the method also includes processing data that represents time-varying information about motion of the additional subjects acquired at the location of each of the subjects. The data is acquired while the additional subjects are in the situation associated with the risk.
  • providing the processed data to a medical facility includes providing the processed data to an urgent care division of the medical facility.
  • the information is provided to the urgent care division before one or more of the subjects are treated by the urgent care division.
  • the subjects are treated in an order that is based on a severity of an injury.
  • relatively more severely injured subjects are treated before relatively less severely injured subjects.
  • the processed data is used to determine the subject's compliance with a particular standard of care throughout a progression of steps of the standard of care.
  • the processed data is used to determine whether the subject is receiving care that is appropriate according to a particular standard of care.
  • the data is processed after the subject is in the situation associated with risk.
  • the processing of the data occurs after the data has been acquired and with a short enough delay to enable an effect of the risk to be resolved.
  • the situation includes firefighting.
  • the situation includes a natural disaster or a sudden act of violence.
  • the risk includes one or more of heart failure, emotional stress, abnormal skin temperature, abnormal body temperature, hypertension, heart attack, stroke, arrhythmia, exhaustion, and anxiety.
  • the method also includes determining one or more of a blood pressure, a skin temperature, a body temperature, a heart rate, and a heart rate variability of the subject based on the datasets.
  • the method also includes detecting emotional stress in the subject by determining whether one or more of the determined blood pressure, heart rate, and heart rate variability of the subject is a predetermined amount above a threshold.
  • the data indicates that the subject is about to experience an effect of one of the risks.
  • the risk includes overexposure of the subject to ultraviolet light.
  • the method also includes processing data that represents information about an amount of ultraviolet light that the subject has been exposed to.
  • the method also includes comparing the amount of ultraviolet light to which the subject has been exposed against a threshold to determine whether the subject has been overexposed to ultraviolet light.
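The overexposure check just described (accumulate measured ultraviolet exposure and compare it to a threshold) can be sketched as below; the dose units and limit value are placeholders, not values from any health guideline.

```python
def uv_overexposed(uv_index_samples, sample_period_s, dose_limit):
    """Integrate sampled UV index over time and compare the cumulative
    dose against a threshold; True would trigger an alert to the
    subject, as the following feature describes."""
    dose = sum(uv_index_samples) * sample_period_s
    return dose > dose_limit
```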
  • the method also includes alerting the subject if the subject has been overexposed to ultraviolet light.
  • the risk includes trauma to the subject and the data is indicative of the existence of the trauma.
  • the operations also include processing data in a second dataset that represents time-varying information about motion of the subject acquired at the location of the subject.
  • the device also includes a motion sensor configured to provide data that represents time-varying information about motion of the subject acquired by the motion sensor at the location of the subject.
  • the processor is also configured to receive and process the data from the motion sensor.
  • the processor is also configured to cause the biofeedback device to provide the processed data to a party that is responding to the trauma.
  • the processor is also configured to cause the biofeedback device to provide the processed data to a remote device.
  • the remote device is a server associated with an emergency service provider.
  • the processor is also configured to cause the biofeedback device to provide the processed data to a medical facility to which the subject is taken for medical care.
  • the device also includes a transceiver configured to provide the processed data.
  • the processed data is used to determine the subject's compliance with a particular standard of care throughout a progression of steps of the standard of care.
  • the processed data is used to determine whether the subject is receiving care that is appropriate according to a particular standard of care.
  • the risk includes overexposure of the subject to ultraviolet light.
  • the device also includes an ultraviolet light sensor configured to measure an amount of ultraviolet light that the subject is exposed to.
  • the processor is also configured to process data that represents information about the amount of ultraviolet light that the subject is exposed to.
  • the processor is also configured to compare the amount of ultraviolet light to which the subject is exposed against a threshold to determine whether the subject has been overexposed to ultraviolet light.
  • the device is also configured to alert the subject if the subject has been overexposed to ultraviolet light.
  • In another aspect, a method includes processing data in a first dataset that represents time-varying information about at least one pulse pressure wave propagating through blood in a subject acquired at a location of the subject. The method also includes providing information related to the data to a remote device.
  • In another aspect, a system includes a remote device and a biofeedback device configured to be worn by a subject.
  • the biofeedback device includes a light source configured to emit light toward the skin of the subject.
  • the biofeedback device also includes an optical sensor configured to receive the emitted light after the emitted light reflects off of the skin of the subject.
  • the optical sensor is also configured to provide data that corresponds to a characteristic of the received light, the data representing time-varying information about at least one pulse pressure wave propagating through blood in the subject acquired by the optical sensor at a location of the subject.
  • the biofeedback device also includes a processor configured to receive data from one or both of the light-emitting element and the optical sensor.
  • the processor is also configured to provide information related to the data to a remote device.
  • one or more machine-readable storage devices store instructions that are executable by one or more processing devices to perform operations including processing data in a first dataset that represents time-varying information about at least one pulse pressure wave propagating through blood in a subject acquired at a location of the subject.
  • the operations also include providing information related to the data to a remote device.
  • the biofeedback device also includes an optical sensor configured to receive the emitted light after the emitted light reflects off of the skin of the subject.
  • the optical sensor is also configured to provide data that corresponds to a characteristic of the received light, the data representing time-varying information about at least one pulse pressure wave propagating through blood in the subject acquired by the optical sensor at a location of the subject.
  • the biofeedback device also includes a processor configured to receive data from one or both of the light-emitting element and the optical sensor.
  • the processor is also configured to provide information related to the data to a remote device.
  • Implementations can include one or more of the following features.
  • the method also includes processing data in a second dataset that represents time-varying information about motion of the subject acquired at the location of the subject.
  • the information about at least one pulse pressure wave propagating through blood in the subject includes photoplethysmographic (PPG) data and the information about motion of the subject includes one or both of motioncardiogram (MoCG) data and gross motion data.
  • the data is acquired continuously.
  • the data is acquired at a frequency of at least 16 Hz.
  • the data is acquired at a frequency of between 75 Hz and 85 Hz.
  • the data is acquired at a single location of the subject.
  • the data is acquired by a device worn by the subject.
  • the device is mobile and does not reduce a mobility of the subject.
  • the device processes the data.
  • the single location is an arm of the subject.
  • the single location is a wrist of the subject.
  • the remote device is a server.
  • the method also includes determining, based on the data in the first and second datasets, that the subject is experiencing or has experienced a health-related problem.
  • the method also includes causing the remote device to alert one or both of a caregiver and the subject that the subject is experiencing or has experienced a health-related problem.
  • the method also includes causing the remote device to alert the subject that the subject is experiencing a health-related problem.
  • the remote device sends an alert to a device worn by the subject that acquires the data.
  • the remote device sends an alert to a mobile phone of the subject.
  • determining that the subject is experiencing or has experienced a health-related problem includes determining whether a blood pressure of the subject satisfies a threshold.
  • the health-related problem is hypertension.
  • determining that the subject is experiencing or has experienced a health-related problem includes determining a rate of change of a blood pressure of the subject.
  • the health-related problem is a stroke.
  • the subject is determined to be having a stroke if the rate of change of the blood pressure of the subject is positive and above a threshold.
  • the health-related problem is abnormal heart function.
  • the subject is determined to be experiencing abnormal heart function if the rate of change of the blood pressure of the subject is negative and below a threshold.
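The rate-of-change test in the features above (steep positive blood pressure slope flags a possible stroke, steep negative slope flags possible abnormal heart function) can be sketched as below; the units (mmHg per second) and both thresholds are illustrative placeholders.

```python
def classify_bp_trend(bp_mmhg, times_s, rise_thresh=2.0, drop_thresh=-2.0):
    """Compute the rate of change of blood pressure over a window and
    compare it against positive and negative thresholds."""
    rate = (bp_mmhg[-1] - bp_mmhg[0]) / (times_s[-1] - times_s[0])
    if rate > rise_thresh:
        return "possible stroke"                    # rapid rise
    if rate < drop_thresh:
        return "possible abnormal heart function"   # rapid drop
    return "no alert"
```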
  • the method also includes identifying a first point in the first dataset, the first point representing an arrival time of the pulse pressure wave at a first body part of the subject.
  • the method also includes identifying a second point in the second dataset, the second point representing an earlier time at which the pulse pressure wave traverses a second body part of the subject.
  • the method also includes computing a pulse transit time (PTT) as a difference between the first and second points, the PTT representing a time taken by the pulse pressure wave to travel from the second body part to the first body part of the subject.
  • the blood pressure of the subject is determined based on the PTT.
  • the first body part is the location of the subject at which the data in the first data set is acquired, and the second body part is the heart of the subject.
  • determining that the subject is experiencing a health-related problem includes determining whether a heart rate of the subject satisfies a threshold.
  • the health-related problem is tachycardia.
  • determining the heart rate of the subject includes calculating a distance between two consecutive reference points in the first dataset, the distance representing a time that has elapsed between two consecutive heartbeats of the subject.
  • the reference points are local maxima or local minima.
  • the reference points are peaks or valleys in the first dataset.
  • determining that the subject is experiencing a health-related problem includes determining whether a heart rate variability of the subject satisfies a threshold.
  • the threshold is based on whether the subject experiences arrhythmia.
  • determining the heart rate variability of the subject includes calculating distances between multiple pairs of consecutive reference points in the first dataset, each distance representing a time that has elapsed between two consecutive heartbeats of the subject.
  • the reference points are local maxima or local minima.
  • the reference points are peaks or valleys.
  • determining that the subject has experienced a health-related problem includes determining whether the subject has sustained an impact of a magnitude that satisfies a threshold.
  • determining the magnitude of the impact includes analyzing gross motion data of the subject at the time of the impact.
  • the health-related problem is a concussion.
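The impact test above (analyze gross motion data at the time of impact and check whether the magnitude satisfies a threshold) can be sketched as below; the 60 g default is an illustrative placeholder, not a clinical concussion limit.

```python
import numpy as np

def impact_flagged(accel_g, threshold_g=60.0):
    """Peak magnitude of the gross-motion acceleration vector over the
    window; an impact is flagged when it crosses the threshold.
    `accel_g` has shape (n, 3): x, y, z acceleration in g."""
    magnitudes = np.linalg.norm(np.asarray(accel_g, dtype=float), axis=1)
    return bool(magnitudes.max() >= threshold_g)
```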
  • the method also includes determining, based on the data in the first and second datasets, that the subject is about to experience a health-related problem.
  • the method also includes causing the remote device to alert a caregiver that the subject is about to experience a health-related problem.
  • the method also includes causing the remote device to alert the subject that the subject is about to experience a health-related problem.
  • the remote device sends an alert to a device worn by the subject that acquires the data.
  • the remote device sends an alert to a mobile phone of the subject.
  • determining that the subject is about to experience a health-related problem includes determining whether a blood pressure of the subject satisfies a threshold.
  • the method also includes identifying a first point in the first dataset, the first point representing an arrival time of the pulse pressure wave at a first body part of the subject.
  • the method also includes identifying a second point in the second dataset, the second point representing an earlier time at which the pulse pressure wave traverses a second body part of the subject.
  • the method also includes computing a pulse transit time (PTT) as a difference between the first and second points, the PTT representing a time taken by the pulse pressure wave to travel from the second body part to the first body part of the subject.
  • the blood pressure of the subject is determined based on the PTT.
  • the first body part is the location of the subject at which the data in the first data set is acquired, and the second body part is the heart of the subject.
  • determining that the subject is about to experience a health-related problem includes determining whether a heart rate of the subject satisfies a threshold.
  • determining the heart rate of the subject includes calculating a distance between two consecutive reference points in the first dataset, the distance representing a time that has elapsed between two consecutive heartbeats of the subject.
  • the reference points are local maxima or local minima.
  • the reference points are peaks or valleys in the first dataset.
  • determining that the subject is about to experience a health-related problem includes determining whether a heart rate variability of the subject satisfies a threshold.
  • determining the heart rate variability of the subject includes calculating distances between multiple pairs of consecutive reference points in the first dataset, each distance representing a time that has elapsed between two consecutive heartbeats of the subject.
  • the reference points are local maxima or local minima.
  • the reference points are peaks or valleys.
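The interval-based quantities described in the features above — heart rate and heart rate variability from distances between consecutive reference points, and pulse transit time as the difference between arrival times at two body parts — can be sketched as follows. This is an illustrative aid only, not part of the claims: the function names are hypothetical, timestamps are assumed to be in seconds, and SDNN (standard deviation of beat-to-beat intervals) is used as one common HRV metric since the claims do not specify a particular one.

```python
from statistics import mean, stdev

def heart_rate_bpm(peak_times):
    """Average heart rate from timestamps of consecutive reference
    points (peaks); each interval is the time elapsed between two
    consecutive heartbeats."""
    intervals = [b - a for a, b in zip(peak_times, peak_times[1:])]
    return 60.0 / mean(intervals)

def hrv_sdnn(peak_times):
    """Heart rate variability as the standard deviation of the
    beat-to-beat intervals (SDNN), in seconds."""
    intervals = [b - a for a, b in zip(peak_times, peak_times[1:])]
    return stdev(intervals)

def pulse_transit_time(arrival_at_first_part, traversal_at_second_part):
    """PTT: time taken by the pulse pressure wave to travel from the
    second body part (e.g. the heart) to the first body part (e.g.
    the wrist where the data is acquired)."""
    return arrival_at_first_part - traversal_at_second_part

# Peaks exactly 0.8 s apart correspond to 75 bpm with near-zero HRV.
peaks = [0.0, 0.8, 1.6, 2.4, 3.2]
```

Blood pressure would then be estimated from the PTT (shorter transit times generally correspond to higher pressure), but the claims do not specify the mapping, so none is assumed here.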
  • the method also includes providing location information related to the subject to the remote device.
  • the location information is provided by a location module of a device worn by the subject that acquires the data.
  • the location module is a GPS transponder.
  • the method also includes providing temperature information related to the subject to the remote device.
  • the remote device is a thermostat.
  • the subject is remote from a location that is temperature-controlled by the thermostat.
  • the thermostat is configured to adjust its temperature settings based on the temperature information related to the subject.
  • a time when the thermostat adjusts its temperature settings is based on the location information related to the subject.
  • the thermostat adjusts its temperature settings when the location information indicates that the subject is within a predefined distance from a location that is temperature-controlled by the thermostat.
  • the remote device is a light.
  • the subject is remote from a location that can be illuminated by the light.
  • the light is configured to adjust its lighting settings at a time that is based on the location information related to the subject.
  • the light adjusts its lighting settings when the location information indicates that the subject is within a predefined distance from a location that is lighting-controlled by the light.
  • the method also includes determining that the subject is interacting with a particular object based on a location of the subject.
  • the remote device is a server.
  • the particular object is an advertisement.
  • the particular object is a product display.
  • the particular object is a retail product.
  • the location of the subject is determined by a GPS module of a device worn by the subject that acquires the data.
  • the location of the subject is determined based on a strength of a wireless connection between a device worn by the subject that acquires the data and one or more proximity sensors.
  • a relatively higher strength of the wireless connection between the device and the proximity sensor indicates that the device is relatively closer to the proximity sensor.
  • the wireless connection is a Bluetooth connection.
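The proximity determination above — a relatively higher wireless connection strength indicates the device is relatively closer to a sensor — can be sketched with received signal strength (RSSI) readings. Sensor names and dBm values below are hypothetical; RSSI values closer to zero indicate a stronger connection.

```python
def nearest_sensor(rssi_by_sensor):
    """Pick the proximity sensor with the strongest connection to the
    worn device. Keys are sensor identifiers; values are RSSI in dBm,
    where a higher (less negative) value means a stronger signal and
    therefore a closer sensor."""
    return max(rssi_by_sensor, key=rssi_by_sensor.get)

# Hypothetical readings from three in-store proximity sensors.
readings = {"shelf_display": -48, "entrance_ad": -71, "checkout": -63}
```

Here `nearest_sensor(readings)` would identify the shelf display as the object the subject is interacting with.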
  • the method also includes determining, based on the processed data, that the subject is experiencing one or more of an increase in heart rate, blood pressure, and respiratory rate while the subject is interacting with the particular object.
  • the method also includes inferring that the subject is interested in the particular object based on one or more of the heart rate, the blood pressure, and the respiratory rate of the subject while the subject is interacting with the particular object.
  • the remote device is an entertainment device.
  • the entertainment device is a television.
  • the entertainment device is an audio output device.
  • the entertainment device is a gaming device.
  • the processed data indicates whether the subject has exercised for a predetermined length of time, and the entertainment device can be turned on only if the subject has exercised for the predetermined length of time.
  • the entertainment device is configured to provide content personalized for the subject based on a state of the subject as determined from the processed data.
  • the state of the subject includes a level of interest in the content provided by the entertainment device.
  • a rise in one or more of a heart rate, a heart rate variability, an electrical skin impedance, a respiratory rate, and a blood pressure of the subject while the subject is experiencing the content indicates an increased level of interest in the content.
  • the heart rate, the heart rate variability, the electrical skin impedance, and the respiratory rate of the subject are determined from the processed data.
  • the method also includes processing data in a second dataset that represents time-varying information about motion of the subject acquired at the location of the subject.
  • the blood pressure of the subject is determined from the processed data.
  • the entertainment device provides content designed to excite the subject if the heart rate variability of the subject is within a predefined range.
  • the entertainment device provides content designed to excite the subject if one or more of the heart rate, the electrical skin impedance, the respiratory rate, and the blood pressure of the subject is below a respective threshold.
  • the state of the subject includes a level of stress of the subject while the subject is experiencing the content.
  • a rise in one or more of a heart rate, a heart rate variability, an electrical skin impedance, a respiratory rate, and a blood pressure of the subject while the subject is experiencing the content indicates an increased level of stress.
  • the heart rate, the heart rate variability, the electrical skin impedance, and the respiratory rate of the subject are determined from the processed data.
  • the method also includes processing data in a second dataset that represents time-varying information about motion of the subject acquired at the location of the subject, wherein the blood pressure of the subject is determined from the processed data.
  • the entertainment device provides content designed to calm the subject if the heart rate variability of the subject is within a predefined range.
  • the entertainment device provides content designed to calm the subject if one or more of the heart rate, the electrical skin impedance, the respiratory rate, and the blood pressure of the subject is above a respective threshold.
  • the entertainment device is a television and the content includes one or more of television shows, movies, and games.
  • the entertainment device is a gaming device that is configured to adjust game settings based on a state of the subject as determined from the processed data.
  • game settings include one or more of difficulty settings, sound settings, and situational settings.
  • the entertainment device is configured to turn off based on a state of the subject as determined from the processed data.
  • the method also includes causing the remote device to adjust a dating preference in a dating profile of the subject based on a state of the subject as determined from the processed data.
  • the method also includes processing data that represents time-varying information about at least one pulse pressure wave propagating through blood in one or more other subjects acquired at locations on the other subjects.
  • the method also includes processing data that represents time-varying information about motion of the one or more other subjects acquired at the locations on the other subjects.
  • the method also includes determining a compatibility between the subject and each of the other subjects based on states of the subjects as determined from the data.
  • the method also includes ranking the compatibilities between the subject and each of the other subjects.
  • the remote device is a device operated by the subject.
  • the method also includes determining, based on the data in the first and second datasets, that the subject is not adequately alert.
  • determining that the subject is not adequately alert is based on one or more of a heart rate, a respiratory rate, a blood pressure, and an activity level of the subject.
  • determining that the subject is not adequately alert includes determining, based on the processed data, whether one or more of the heart rate, the respiratory rate, the blood pressure, and the activity level of the subject is below a threshold.
  • the method also includes causing the device to activate an alarm if the subject is not adequately alert.
  • the method also includes causing the device to slow down if the subject is not adequately alert.
  • the device is a vehicle.
  • the data is acquired by the device and the device is wearable by the subject.
  • the method also includes causing an alarm of the wearable device to be activated if the subject is not adequately alert.
  • the biofeedback device also includes a motion sensor configured to provide data that represents time-varying information about motion of the subject acquired by the motion sensor at the location of the subject.
  • the processor is also configured to receive data from the motion sensor.
  • the operations also include processing data in a second dataset that represents time-varying information about motion of the subject acquired at the location of the subject.
  • the biofeedback device also includes a motion sensor configured to provide data that represents time-varying information about motion of the subject acquired by the motion sensor at the location of the subject.
  • the processor is also configured to receive data from the motion sensor.
  • the processor is also configured to determine, based on the received data, that the subject is experiencing or has experienced a health-related problem.
  • the processor is also configured to determine, based on the received data, that the subject is about to experience a health-related problem.
  • the processor is also configured to cause the remote device to alert a caregiver that the subject is experiencing, has experienced, or is about to experience a health-related problem.
  • the processor is also configured to cause the remote device to alert the subject that the subject is experiencing, has experienced, or is about to experience a health-related problem.
  • the remote device sends an alert to the biofeedback device.
  • the remote device sends an alert to a mobile phone of the subject.
  • the processor is also configured to provide location information related to the subject to the remote device.
  • the biofeedback device also includes a location module configured to provide the location information related to the subject to the remote device.
  • the location module is a GPS transponder.
  • the processor is also configured to provide temperature information related to the subject to the remote device.
  • the processor is also configured to determine that the subject is interacting with a particular object based on a location of the subject.
  • the remote device is a server.
  • the particular object is an advertisement.
  • the particular object is a product display.
  • the particular object is a retail product.
  • the location of the subject is determined by the GPS module of the biofeedback device.
  • the location of the subject is determined based on a strength of a wireless connection between the biofeedback device and one or more proximity sensors.
  • a relatively higher strength of the wireless connection between the biofeedback device and the proximity sensor indicates that the biofeedback device is relatively closer to the proximity sensor.
  • the wireless connection is a Bluetooth connection.
  • the remote device is a device operated by the subject.
  • the processor is also configured to determine, based on the received data, that the subject is not adequately alert.
  • the processor is also configured to cause the biofeedback device to activate an alarm if the subject is not adequately alert.
  • the processor is also configured to cause the device operated by the subject to slow down if the subject is not adequately alert.
  • the device is a vehicle.
  • In another aspect, a method includes deriving a score associated with a state of a subject, the state of the subject being one or more members selected from the group consisting of health, sleep, fitness, and stress. Deriving the score is based on data in a first dataset that represents time-varying information about at least one pulse pressure wave propagating through blood in the subject acquired at a location of the subject.
  • one or more machine-readable storage devices store instructions that are executable by one or more processing devices to perform operations including deriving a score associated with a state of a subject.
  • the state of the subject is one or more members selected from the group consisting of health, sleep, fitness, and stress. Deriving the score is based on data in a first dataset that represents time-varying information about at least one pulse pressure wave propagating through blood in the subject acquired at a location of the subject.
  • the device also includes an optical sensor configured to receive the emitted light after the emitted light reflects off of the skin of the subject.
  • the optical sensor is also configured to provide data that corresponds to a characteristic of the received light, the data representing time-varying information about at least one pulse pressure wave propagating through blood in the subject acquired by the optical sensor at a location of the subject.
  • the device also includes a motion sensor configured to provide data that represents time-varying information about motion of the subject acquired by the motion sensor at the location of the subject.
  • the device also includes a processor configured to receive data from one or more of the light-emitting element, the optical sensor, and the motion sensor.
  • the processor is also configured to derive a score associated with a state of the subject, the state of the subject being one or more members selected from the group consisting of health, sleep, fitness, and stress.
  • Implementations can include one or more of the following features.
  • deriving the score is also based on data in a second dataset that represents time-varying information about motion of the subject acquired at the location of the subject.
  • the information about at least one pulse pressure wave propagating through blood in the subject includes photoplethysmographic (PPG) data and the information about motion of the subject includes one or both of motioncardiogram (MoCG) data and gross motion data.
  • the data is acquired continuously.
  • the data is acquired at a frequency of at least 16 Hz.
  • the data is acquired at a frequency of between 75 Hz and 85 Hz.
  • the data is acquired at a single location of the subject.
  • the data is acquired by a device worn by the subject.
  • the device is mobile and does not reduce a mobility of the subject.
  • the device processes the data.
  • the single location is an arm of the subject.
  • the single location is a wrist of the subject.
  • the score is a numerical value.
  • the numerical value is between 1 and 100.
  • the numerical value is between 1 and 10.
  • the data is acquired by a device that is worn by the subject and that displays the score.
  • the device worn by the subject derives the score.
  • the device worn by the subject provides the data to a remote device that derives the score.
  • the remote device is a server.
  • the remote device provides the score to the device worn by the subject.
  • the remote device provides the score to a mobile phone of the subject.
  • the score is provided to one or both of the subject and another party.
  • the state of the subject includes a sleep state, the score includes a sleep score, and the sleep score is associated with a level of quality of the subject's sleep.
  • deriving the score includes identifying one or more potential sleep rest periods of the subject based on gross motion data of the subject.
  • deriving the score also includes calculating one or more of an average heart rate, a standard deviation of the average heart rate, and an average heart rate variability of the subject during each of the one or more potential sleep rest periods based on the information about at least one pulse pressure wave propagating through blood in the subject.
  • one or more of the potential sleep rest periods are identified as sleep rest periods by comparing one or more of the average heart rate, the standard deviation of the average heart rate, and the average heart rate variability of the subject during the respective potential sleep rest period to a threshold.
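The two-stage sleep detection described above — identify potential rest periods from gross motion data, then confirm them by comparing average heart rate and heart rate variability to thresholds — can be sketched as follows. The thresholds, field names, and window parameters are illustrative assumptions, not values taken from the claims.

```python
def low_motion_windows(motion, threshold=0.1, min_len=3):
    """Identify potential sleep rest periods: runs of at least
    min_len consecutive samples whose gross-motion magnitude stays
    below threshold. Returns (start, end) index pairs."""
    windows, start = [], None
    for i, m in enumerate(motion):
        if m < threshold:
            if start is None:
                start = i
        else:
            if start is not None and i - start >= min_len:
                windows.append((start, i))
            start = None
    if start is not None and len(motion) - start >= min_len:
        windows.append((start, len(motion)))
    return windows

def confirm_sleep_periods(candidates, hr_threshold=60.0, hrv_threshold=0.05):
    """Keep candidates whose average heart rate (bpm) is below
    hr_threshold and average HRV (s) is above hrv_threshold; both
    threshold values are assumptions for illustration."""
    return [c for c in candidates
            if c["avg_hr"] < hr_threshold and c["avg_hrv"] > hrv_threshold]
```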
  • the sleep state of the subject is associated with one or more of sleep duration, sleep latency, and sleep staging.
  • deriving the score includes determining one or more of the sleep duration, the sleep latency, and the sleep staging of the subject.
  • the method also includes determining the sleep duration of the subject.
  • determining the sleep duration of the subject includes determining a total length of time during which the subject was asleep based on information related to one or more sleep rest periods of the subject.
  • the information related to the one or more sleep rest periods includes a time associated with a beginning of each sleep rest period, a time associated with an end of each sleep rest period, gross motion data of the subject during each sleep rest period, and heart rate data of the subject during each sleep rest period.
  • determining the sleep duration of the subject includes determining a percentage of time that the subject was asleep between a time when the subject started to try to fall asleep and a time when the subject awoke based on information related to one or more sleep rest periods of the subject and gross motion data of the subject before the subject fell asleep.
  • the method also includes determining the sleep latency of the subject.
  • determining the sleep latency of the subject includes determining a length of time that it takes for the subject to transition from a state of wakefulness to the sleep state based on information related to one or more sleep rest periods of the subject and gross motion data of the subject before the subject fell asleep.
  • the method also includes determining the sleep staging of the subject.
  • determining the sleep staging of the subject includes determining a deepness of the subject's sleep during a portion of each of one or more sleep rest periods of the subject based on information related to the one or more sleep rest periods.
  • the sleep staging of the subject is determined based on at least a heart rate and gross motion data of the subject during one or more of the portions of the sleep rest periods.
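The sleep duration and latency measures above can be sketched once confirmed sleep rest periods are available as (start, end) timestamps. The function names, the second-based timestamps, and the percentage formulation are assumptions for illustration.

```python
def sleep_duration(rest_periods):
    """Total length of time asleep: the sum of the durations of the
    confirmed sleep rest periods, given as (start_s, end_s) pairs."""
    return sum(end - start for start, end in rest_periods)

def sleep_latency(try_to_sleep_at, rest_periods):
    """Time to transition from wakefulness to sleep: from when gross
    motion data indicates the subject started trying to fall asleep
    to the start of the first sleep rest period."""
    return min(start for start, _ in rest_periods) - try_to_sleep_at

def sleep_efficiency(try_to_sleep_at, wake_at, rest_periods):
    """Percentage of the window between trying to fall asleep and
    waking that the subject actually spent asleep."""
    return 100.0 * sleep_duration(rest_periods) / (wake_at - try_to_sleep_at)
```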
  • the data is acquired by a device that is worn by the subject.
  • the method also includes causing the device to calculate and display the sleep score when the subject is determined to have awoken.
  • the method also includes providing information to the subject that assists the subject in improving the sleep score.
  • the information includes a recommended sleep schedule.
  • the information is provided to a device that is worn by the subject that acquires the data.
  • the information is provided to a mobile phone of the subject.
  • the state of the subject includes a fitness state, the score includes a fitness score, and the fitness score is associated with one or more of a degree of physical fitness, cardiac condition, coaching, dehydration, social interaction, adherence to a regimen, and coaching effectiveness of the subject.
  • deriving the score includes calculating a resting heart rate of the subject while the subject is inactive based on the information about at least one pulse pressure wave propagating through blood in the subject and gross motion data of the subject.
  • deriving the score also includes calculating a heart rate of the subject based on the information about at least one pulse pressure wave propagating through blood in the subject. Deriving the score also includes determining that the subject is in the fitness state based on the heart rate and the gross motion data of the subject.
  • deriving the score includes determining a length of time that it takes for the subject's heart rate to transition from the heart rate in the fitness state to the resting heart rate.
  • deriving the score includes determining a length of time that it takes for the subject's heart rate to transition from the resting heart rate to the heart rate in the fitness state.
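The recovery measure above — the length of time for the subject's heart rate to transition from its level in the fitness state back to the resting heart rate — can be sketched as follows. The tolerance band and sample format are assumptions; the claims specify only the transition time itself.

```python
def recovery_time(hr_samples, resting_hr, tolerance=2.0):
    """Time for heart rate to return from the fitness-state level to
    within `tolerance` bpm of the resting rate.

    hr_samples: list of (timestamp_s, bpm) pairs recorded after the
    subject leaves the fitness state. Returns elapsed seconds from
    the first sample, or None if the resting rate is not reached.
    """
    t0 = hr_samples[0][0]
    for t, bpm in hr_samples:
        if bpm <= resting_hr + tolerance:
            return t - t0
    return None
```

A faster return to resting heart rate would then contribute to a higher fitness score, though the claims leave the scoring function unspecified.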
  • the data is acquired by a device that is worn by the subject.
  • the method also includes causing the device to calculate and display the fitness score when the subject is determined to be in the fitness state.
  • the method also includes causing the device to calculate and display the fitness score when the subject is determined to have transitioned from the fitness state to a non-fitness state.
  • the method also includes providing information to the subject that assists the subject in improving the fitness score.
  • the information includes a recommended fitness routine.
  • the information is provided to a device that is worn by the subject that acquires the data.
  • the information is provided to a mobile phone of the subject.
  • the method also includes embedding a visual indication of one or more of the fitness score, a heart rate, a respiratory rate, and a blood pressure of the subject into a video showing the subject performing a fitness routine.
  • the visual indications are updated throughout the video according to the fitness score, the heart rate, the respiratory rate, and the blood pressure of the subject during the fitness routine.
  • the method also includes predicting an outcome of an athletic event that the subject is participating in based on one or more of the fitness score, a heart rate, a respiratory rate, and a blood pressure of the subject during the athletic event.
  • the method also includes comparing one or more of the fitness score, the heart rate, the respiratory rate, and the blood pressure of the subject to fitness scores, heart rates, respiratory rates, and blood pressures of other individuals who are participating in the athletic event.
  • the method also includes, while the subject is performing physical activity, comparing one or more of the fitness score, a heart rate, a respiratory rate, and a blood pressure of the subject to fitness scores, heart rates, respiratory rates, and blood pressures of one or more individuals who have previously performed the physical activity.
  • performing the physical activity includes performing an athletic event, and the one or more individuals are professional athletes who compete in the athletic event.
  • the state of the subject includes a stress state and the score includes a stress score.
  • deriving the score includes calculating one or more of a heart rate, a heart rate variability, a blood pressure, an electrical skin impedance, and a respiratory rate of the subject based on the information about at least one pulse pressure wave propagating through blood in the subject and information about motion of the subject.
  • the stress state of the subject is associated with hypertension, and deriving the score includes determining whether the subject is experiencing hypertension by comparing a blood pressure of the subject to a threshold.
  • the stress state of the subject is associated with emotional stress, and deriving the score includes determining a level of emotional stress experienced by the subject by comparing one or more of a heart rate, a heart rate variability, a blood pressure, an electrical skin impedance, and a respiratory rate of the subject to a threshold.
  • determining the level of emotional stress experienced by the subject is based at least in part on audio data.
  • the audio data is captured by a microphone of a device that acquires the data in the first dataset.
  • the audio data includes one or both of environmental noise and a tonality of the subject's voice.
  • determining the level of emotional stress experienced by the subject includes analyzing the environmental noise to determine whether the subject is in an environment attributed to an increased emotional stress level.
  • determining the level of emotional stress experienced by the subject includes analyzing the tonality of the subject's voice to determine whether the subject is in a confrontational situation attributed to an increased emotional stress level.
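The threshold comparison above — determining a level of emotional stress by comparing one or more vital signs to respective thresholds — can be sketched as a simple fraction-of-thresholds-exceeded score. The scoring rule and threshold values are illustrative assumptions; the claims do not prescribe how the comparisons combine into a level.

```python
def stress_level(vitals, thresholds):
    """Crude emotional-stress indicator: the fraction of monitored
    vital signs (e.g. heart rate, HRV, blood pressure, electrical
    skin impedance, respiratory rate) exceeding their thresholds.
    Returns a value between 0.0 (none exceeded) and 1.0 (all)."""
    exceeded = sum(1 for key, value in vitals.items()
                   if value > thresholds[key])
    return exceeded / len(vitals)

# Hypothetical readings: elevated heart rate and respiratory rate.
level = stress_level({"hr": 95, "rr": 22}, {"hr": 90, "rr": 20})
```

Audio cues (environmental noise, vocal tonality) could then raise or lower the computed level, but any weighting would be an additional assumption.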
  • the data is acquired by a device that is worn by the subject.
  • the method also includes causing the device to calculate and display the stress score when the subject is determined to be in the stress state.
  • the method also includes providing information to the subject that assists the subject in improving the stress score.
  • the information includes a recommended stress-reducing routine.
  • the information is provided to a device that is worn by the subject that acquires the data.
  • the information is provided to a mobile phone of the subject.
  • the state of the subject includes a sleep state, the score includes a sleep score, the sleep state of the subject is associated with one or more of sleep duration, sleep latency, and sleep staging, and deriving the score includes determining one or more of the sleep duration, the sleep latency, and the sleep staging of the subject.
  • the processor is also configured to determine the sleep duration of the subject.
  • determining the sleep duration of the subject includes determining a total length of time during which the subject was asleep based on information related to one or more sleep rest periods of the subject.
  • determining the sleep duration of the subject includes determining a percentage of time that the subject was asleep between a time when the subject started to try to fall asleep and a time when the subject awoke based on information related to one or more sleep rest periods of the subject and gross motion data of the subject before the subject fell asleep.
  • the processor is also configured to determine the sleep latency of the subject.
  • determining the sleep latency of the subject includes determining a length of time that it takes for the subject to transition from a state of wakefulness to the sleep state based on information related to one or more sleep rest periods of the subject and gross motion data of the subject before the subject fell asleep.
  • the processor is also configured to determine the sleep staging of the subject.
  • determining the sleep staging of the subject includes determining a deepness of the subject's sleep during a portion of each of one or more sleep rest periods of the subject based on information related to the one or more sleep rest periods.
  • the sleep staging of the subject is determined based on at least a heart rate and gross motion data of the subject during one or more of the portions of the sleep rest periods.
  • the biofeedback device also includes a display, and the processor is also configured to cause the display to display the sleep score.
  • the processor causes the display to display the sleep score when the subject is determined to have awoken.
  • the state of the subject includes a fitness state, the score includes a fitness score, and the fitness score is associated with one or more of a degree of physical fitness, cardiac condition, coaching, dehydration, social interaction, adherence to a regimen, and coaching effectiveness of the subject.
  • deriving the score includes calculating a resting heart rate of the subject while the subject is inactive based on the information about at least one pulse pressure wave propagating through blood in the subject and gross motion data of the subject.
  • deriving the score also includes calculating a heart rate of the subject based on the information about at least one pulse pressure wave propagating through blood in the subject. Deriving the score also includes determining that the subject is in the fitness state based on the heart rate and the gross motion data of the subject.
  • deriving the score also includes determining a length of time that it takes for the subject's heart rate to transition from the heart rate in the fitness state to the resting heart rate.
  • deriving the score also includes determining a length of time that it takes for the subject's heart rate to transition from the resting heart rate to the heart rate in the fitness state.
  • the processor is also configured to cause the display to display the fitness score.
  • the processor causes the display to display the fitness score when the subject is determined to be in the fitness state.
  • the processor causes the display to display the fitness score when the subject is determined to have transitioned from the fitness state to a non-fitness state.
  • the processor is also configured to determine one or more of a heart rate, a respiratory rate, and a blood pressure of the subject based on data received from one or more of the light-emitting element, the optical sensor, and the motion sensor.
  • the device also includes a transceiver, and the processor is configured to cause the transceiver to provide one or more of the fitness score, the heart rate, the respiratory rate, and the blood pressure of the subject to a remote device.
  • the processor causes the transceiver to provide one or more of the fitness score, the heart rate, the respiratory rate, and the blood pressure of the subject to a video that shows the subject performing a fitness routine.
  • a visual indication of one or more of the fitness score, the heart rate, the respiratory rate, and the blood pressure of the subject is embedded into the video.
  • the visual indications are updated throughout the video according to the fitness score, the heart rate, the respiratory rate, and the blood pressure of the subject during the fitness routine.
  • the processor is also configured to predict an outcome of an athletic event that the subject is participating in based on one or more of the fitness score, the heart rate, the respiratory rate, and the blood pressure of the subject during the athletic event.
  • the transceiver is configured to communicate with transceivers of other biofeedback devices.
  • the processor is also configured to compare one or more of the fitness score, the heart rate, the respiratory rate, and the blood pressure of the subject to fitness scores, heart rates, respiratory rates, and blood pressures of other individuals who are participating in the athletic event.
  • the processor is also configured to, while the subject is performing physical activity, compare one or more of the fitness score, the heart rate, the respiratory rate, and the blood pressure of the subject to fitness scores, heart rates, respiratory rates, and blood pressures of one or more individuals who have previously performed the physical activity.
  • performing the physical activity includes performing an athletic event, and the one or more individuals are professional athletes who compete in the athletic event.
  • the state of the subject includes a stress state.
  • the score includes a stress score.
  • the stress state of the subject is associated with emotional stress.
  • deriving the score includes determining a level of emotional stress experienced by the subject by comparing one or more of a heart rate, a heart rate variability, a blood pressure, an electrical skin impedance, and a respiratory rate of the subject to a threshold.
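The threshold comparison described above can be sketched in code. This is an illustrative assumption, not the patent's implementation: the metric names, threshold values, and scoring rule below are all hypothetical.

```python
# Hypothetical sketch: derive an emotional-stress score as the fraction of
# monitored vitals that exceed a limit. All names and values here are
# illustrative assumptions, not taken from this document.

STRESS_THRESHOLDS = {
    "heart_rate_bpm": 100,
    "systolic_bp_mmhg": 140,
    "respiratory_rate_bpm": 20,
}

def stress_score(vitals):
    """Return the fraction of monitored vitals that exceed their threshold."""
    exceeded = sum(
        1 for name, limit in STRESS_THRESHOLDS.items()
        if vitals.get(name, 0) > limit
    )
    return exceeded / len(STRESS_THRESHOLDS)
```

For example, a subject with an elevated heart rate and blood pressure but a normal respiratory rate would score 2/3 under this hypothetical rule.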
  • the biofeedback device also includes an audio input device.
  • determining the level of emotional stress experienced by the subject is based at least in part on audio data provided to the processor by the audio input device.
  • the audio data includes one or both of environmental noise and a tonality of the subject's voice.
  • determining the level of emotional stress experienced by the subject includes analyzing the environmental noise to determine whether the subject is in an environment attributed to an increased emotional stress level.
  • determining the level of emotional stress experienced by the subject includes analyzing the tonality of the subject's voice to determine whether the subject is in a confrontational situation attributed to an increased emotional stress level.
  • the processor is also configured to cause the display to display the stress score.
  • the processor causes the display to display the stress score when the subject is determined to be in the stress state.
  • In another aspect, a method includes processing data in a first dataset that represents time-varying information about at least one pulse pressure wave propagating through blood in a subject acquired at a location of the subject. The method also includes processing data in a second dataset that represents time-varying information about motion of the subject acquired at the location of the subject. The method also includes deriving information about a psychological state of the subject from the processed data.
  • one or more machine-readable storage devices store instructions that are executable by one or more processing devices to perform operations including processing data in a first dataset that represents time-varying information about at least one pulse pressure wave propagating through blood in a subject acquired at a location of the subject.
  • the operations also include processing data in a second dataset that represents time-varying information about motion of the subject acquired at the location of the subject.
  • the operations also include deriving information about a psychological state of the subject from the processed data.
  • the device also includes an optical sensor configured to receive the emitted light after the emitted light reflects off of the skin of the subject.
  • the optical sensor is also configured to provide data that corresponds to a characteristic of the received light, the data representing time-varying information about at least one pulse pressure wave propagating through blood in the subject acquired by the optical sensor at a location of the subject.
  • the device also includes a motion sensor configured to provide data that represents time-varying information about motion of the subject acquired by the motion sensor at the location of the subject.
  • the device also includes a processor configured to receive data from one or more of the light-emitting element, the optical sensor, and the motion sensor. The processor is also configured to derive information about a psychological state of the subject from the processed data.
  • Implementations can include one or more of the following features.
  • the information about at least one pulse pressure wave propagating through blood in the subject includes photoplethysmographic (PPG) data and the information about motion of the subject includes one or both of motioncardiogram (MoCG) data and gross motion data.
  • the data is acquired continuously.
  • the data is acquired at a frequency of at least 16 Hz.
  • the data is acquired at a frequency of between 75 Hz and 85 Hz.
  • the data is acquired at a single location of the subject.
  • the data is acquired by a device worn by the subject.
  • the device is mobile and does not reduce a mobility of the subject.
  • the device processes the data.
  • the single location is an arm of the subject.
  • the single location is a wrist of the subject.
  • the psychological state of the subject includes a state of stress.
  • the method also includes determining one or more of a blood pressure, a heart rate, and a heart rate variability of the subject based on the datasets.
  • the method also includes deriving information about the state of stress of the subject based on one or more of the determined blood pressure, heart rate, and heart rate variability of the subject.
  • the method also includes correlating a level of stress of the subject to an amount of ultraviolet light that the subject has been exposed to.
  • deriving the information includes inferring a relationship between at least some of the processed data and one psychological state of the subject.
  • the method also includes inferring an existence of a second psychological state of the subject by comparing other processed data with the processed data related to the one psychological state.
  • the one psychological state includes a state of relatively lower stress.
  • the one psychological state includes a baseline state of the subject, and the relationship between at least some of the processed data and the one psychological state is inferred prior to the subject performing a polygraph test.
  • the psychological state includes a malicious intent.
  • the psychological state includes lying.
  • a device worn by the subject acquires the data.
  • deriving information about the psychological state of the subject includes determining a baseline state of the subject based on one or more of a blood pressure, a heart rate, a heart rate variability, a respiratory rate, and an electrical skin impedance.
  • the device is worn by the subject for an extended period of time to determine the baseline state of the subject.
  • the device is continuously worn by the subject for more than one day.
  • the processor is also configured to determine one or more of a blood pressure, a heart rate, and a heart rate variability of the subject based on the received data.
  • the processor is also configured to derive information about a state of stress of the subject based on one or more of the determined blood pressure, heart rate, and heart rate variability of the subject.
  • the device also includes an ultraviolet light sensor configured to measure an amount of ultraviolet light that the subject is exposed to.
  • the processor is also configured to correlate a level of stress of the subject to an amount of ultraviolet light that the subject has been exposed to.
  • In another aspect, a method includes processing data in a dataset that represents time-varying information about at least one pulse pressure wave propagating through blood in a subject acquired at a location of the subject. The method also includes determining whether one or more segments of the dataset were captured from a subject other than an expected subject by analyzing morphological features of the segments.
  • In another aspect, a method includes processing data in a dataset that represents time-varying information about motion of a subject acquired at a location of the subject. The method also includes determining whether one or more segments of the dataset were captured from a subject other than an expected subject by analyzing morphological features of the segments.
  • In another aspect, a method includes processing data in a first dataset that represents time-varying information about at least one pulse pressure wave propagating through blood in a subject acquired at a location of the subject.
  • the method also includes processing data in a second dataset that represents time-varying information about motion of the subject acquired at the location of the subject.
  • the method also includes, based on the first and second datasets, determining at least two parameters of the subject, the parameters selected from the group consisting of blood pressure, respiratory rate, blood oxygen levels, heart rate, heart rate variability, stroke volume, cardiac output, MoCG morphology, and PPG morphology.
  • the method also includes determining a biometric signature of the subject, the biometric signature represented by a multi-dimensional space that is defined by at least two axes, each axis corresponding to at least one of the determined parameters.
  • the method also includes determining whether the biometric signature was captured from a subject who is an expected subject by analyzing features of the biometric signature.
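The biometric-signature check described above — a point in a multi-dimensional space whose axes are the determined parameters, compared against an enrolled reference — can be sketched as follows. The axis names, distance metric, and tolerance are illustrative assumptions, not the patent's method.

```python
import math

# Hypothetical sketch: represent a biometric signature as a point in a
# multi-dimensional space (one axis per determined parameter) and accept
# the subject when it lies close to the enrolled signature. Axis names
# and the tolerance are illustrative assumptions.

AXES = ("blood_pressure", "heart_rate", "respiratory_rate")

def signature(params):
    """Build a point in the multi-dimensional space from named parameters."""
    return tuple(params[axis] for axis in AXES)

def is_expected_subject(current, enrolled, tolerance=10.0):
    """Accept when the Euclidean distance between signatures is small."""
    return math.dist(current, enrolled) <= tolerance
```

A production system would normalize each axis and learn a per-subject tolerance rather than use a raw Euclidean distance, but the sketch shows the comparison the claim describes.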
  • one or more machine-readable storage devices store instructions that are executable by one or more processing devices to perform operations including processing data in a dataset that represents time-varying information about at least one pulse pressure wave propagating through blood in a subject acquired at a location of the subject.
  • the operations also include determining whether one or more segments of the dataset were captured from a subject other than an expected subject by analyzing morphological features of the segments.
  • one or more machine-readable storage devices store instructions that are executable by one or more processing devices to perform operations including processing data in a dataset that represents time-varying information about motion of a subject acquired at a location of the subject.
  • the operations also include determining whether one or more segments of the dataset were captured from a subject other than an expected subject by analyzing morphological features of the segments.
  • one or more machine-readable storage devices store instructions that are executable by one or more processing devices to perform operations including processing data in a first dataset that represents time-varying information about at least one pulse pressure wave propagating through blood in a subject acquired at a location of the subject.
  • the operations also include processing data in a second dataset that represents time-varying information about motion of the subject acquired at the location of the subject.
  • the operations also include determining at least two parameters of the subject based on the first and second datasets. The parameters are selected from the group consisting of blood pressure, respiratory rate, blood oxygen levels, heart rate, heart rate variability, stroke volume, cardiac output, MoCG morphology, and PPG morphology.
  • the operations also include determining a biometric signature of the subject.
  • the biometric signature is represented by a multi-dimensional space that is defined by at least two axes. Each axis corresponds to at least one of the determined parameters.
  • the operations also include determining whether the biometric signature was captured from a subject who is an expected subject by analyzing features of the biometric signature.
  • the biofeedback device also includes an optical sensor configured to receive the emitted light after the emitted light reflects off of the skin of the subject.
  • the optical sensor is also configured to provide data that corresponds to a characteristic of the received light, the data representing time-varying information about at least one pulse pressure wave propagating through blood in the subject acquired by the optical sensor at a location of the subject.
  • the biofeedback device also includes a processor configured to receive data from one or both of the light-emitting element and the optical sensor.
  • the processor is also configured to determine whether one or more segments of the data were captured from a subject other than an expected subject by analyzing morphological features of the segments.
  • Implementations can include one or more of the following features.
  • the data is acquired continuously.
  • the data is acquired at a frequency of at least 16 Hz.
  • the data is acquired at a frequency of between 75 Hz and 85 Hz.
  • the data is acquired at a single location of the subject.
  • the data is acquired by a device worn by the subject.
  • the device is mobile and does not reduce a mobility of the subject.
  • the device processes the data.
  • the single location is an arm of the subject.
  • the single location is a wrist of the subject.
  • the determining includes analyzing other biometric data.
  • the other biometric data includes one or more of electrical skin impedance, respiratory rate, heart rate, heart rate variability, PPG morphology, and vocal sound frequency of the subject.
  • analyzing the other biometric data includes determining whether the subject is under distress.
  • the determining includes analyzing confidential information provided by the subject.
  • the confidential information includes one or more of a password, a personal identification number, and a predefined gesture.
  • the analyzing includes comparing morphological features of different segments of biometric data.
  • the method also includes taking an action when it is determined that one or more of the segments were captured from a subject other than the expected subject.
  • taking an action includes prompting the subject to provide confidential information to authenticate the subject as the expected subject.
  • the expected subject is a subject associated with a particular device that captures the data segments at a location on the expected subject.
  • the determining includes taking account of one or both of a changing level of activity and a changing heart rate of the subject.
  • the method also includes sending information to a device upon determining that the subject is the expected subject.
  • the device is a payment gateway, and the information includes a payment authorization.
  • the device is a lock.
  • the information causes a lock to unlock.
  • causing the lock to unlock is also based on a location of the subject.
  • the method also includes sending information to a device upon determining that the subject is under distress.
  • the subject is determined to be under distress if one or more of a heart rate, a blood pressure, and a respiratory rate of the subject surpasses a threshold.
  • the device is a payment gateway.
  • the information includes instructions for the payment gateway to prevent the subject from accessing the payment gateway.
  • the device is a lock.
  • the information includes instructions for the lock to remain locked.
  • the method also includes processing data in a second dataset that represents time-varying information about motion of the subject acquired at the location of the subject.
  • the method also includes determining whether one or more segments of the datasets were captured from a subject other than an expected subject by analyzing morphological features of the segments.
  • the information about at least one pulse pressure wave propagating through blood in the subject includes photoplethysmographic (PPG) data and the information about motion of the subject includes one or both of motioncardiogram (MoCG) data and gross motion data.
  • the method also includes determining a pulse transit time (PTT) based on the datasets, the PTT representing a transit time of a pulse pressure wave within the subject.
  • the method also includes determining a blood pressure of the subject based on the datasets.
  • the determining includes analyzing other biometric data.
  • the other biometric data includes one or more of electrical skin impedance, respiratory rate, heart rate, heart rate variability, stroke volume, cardiac output, MoCG morphology, PPG morphology, and vocal sound frequency of the subject.
  • analyzing the other biometric data includes determining whether the subject is under distress.
  • the morphological features include differences in blood pressure at specific times during each of the data segments.
  • the specific times include times of peaks or valleys in blood pressure during the data segments.
  • the morphological features include differences in blood pressure at successive peaks of blood pressure, successive valleys of blood pressure, or successive peaks and valleys of blood pressure.
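The morphological comparison just described — differences in value at successive peaks and valleys of a segment — can be sketched as follows. The local-extremum rule and matching tolerance are illustrative assumptions, not the patent's algorithm.

```python
# Hypothetical sketch: compare two data segments by the differences in
# value at successive local peaks and valleys, as described above. The
# matching tolerance is an illustrative assumption.

def extrema_deltas(segment):
    """Differences between successive local peaks/valleys of a waveform."""
    extrema = [
        segment[i] for i in range(1, len(segment) - 1)
        if (segment[i] - segment[i - 1]) * (segment[i + 1] - segment[i]) < 0
    ]
    return [b - a for a, b in zip(extrema, extrema[1:])]

def segments_match(seg_a, seg_b, tolerance=2.0):
    """True when both segments show similar successive-extremum deltas."""
    da, db = extrema_deltas(seg_a), extrema_deltas(seg_b)
    if len(da) != len(db):
        return False
    return all(abs(x - y) <= tolerance for x, y in zip(da, db))
```

Segments whose peak-to-valley deltas diverge beyond the tolerance would then be flagged as possibly captured from someone other than the expected subject.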
  • determining whether one or more segments of the data were captured from a subject other than an expected subject includes analyzing confidential information provided by the subject.
  • the confidential information includes one or more of a password, a personal identification number, and a predefined gesture.
  • the biofeedback device also includes a motion sensor configured to provide data that represents time-varying information about motion of the subject acquired by the motion sensor at the location of the subject.
  • the processor is also configured to receive data from the motion sensor.
  • the processor is also configured to take an action when it is determined that one or more of the segments were captured from a subject other than the expected subject.
  • taking an action includes prompting the subject to provide confidential information to authenticate the subject as the expected subject.
  • the motion sensor is also configured to determine when a subject performs the predefined gesture.
  • the biofeedback device also includes a transceiver configured to send information to a device upon determining that the subject is the expected subject.
  • the device is a payment gateway, and the information includes a payment authorization.
  • the device is a lock.
  • the information causes a lock to unlock.
  • the biofeedback device also includes a location module, and causing the lock to unlock is also based on a location of the subject as determined by the location module.
  • the transceiver is also configured to send information to a device upon determining that the subject is under distress.
  • the device is a payment gateway.
  • the information includes instructions for the payment gateway to prevent the subject from accessing the payment gateway.
  • the device is a lock.
  • the information includes instructions for the lock to remain locked.
  • In another aspect, a method includes processing data in a first dataset that represents time-varying information about at least one pulse pressure wave propagating through blood in a subject acquired at a location of the subject. The method also includes providing, based on the data, information about a medication regimen of the subject.
  • one or more machine-readable storage devices store instructions that are executable by one or more processing devices to perform operations including processing data in a first dataset that represents time-varying information about at least one pulse pressure wave propagating through blood in a subject acquired at a location of the subject.
  • the operations also include providing, based on the data, information about a medication regimen of the subject.
  • the device also includes an optical sensor configured to receive the emitted light after the emitted light reflects off of the skin of the subject.
  • the optical sensor is also configured to provide data that corresponds to a characteristic of the received light, the data representing time-varying information about at least one pulse pressure wave propagating through blood in the subject acquired by the optical sensor at a location of the subject.
  • the device also includes a processor configured to receive data from one or both of the light-emitting element and the optical sensor.
  • the processor is also configured to provide, based on the data, information about a medication regimen of the subject.
  • Implementations can include one or more of the following features.
  • the method also includes processing data in a second dataset that represents time-varying information about motion of the subject acquired at the location of the subject.
  • the information about at least one pulse pressure wave propagating through blood in the subject includes photoplethysmographic (PPG) data and the information about motion of the subject includes one or both of motioncardiogram (MoCG) data and gross motion data.
  • the data is acquired continuously.
  • the data is acquired at a frequency of at least 16 Hz.
  • the data is acquired at a frequency of between 75 Hz and 85 Hz.
  • the data is acquired at a single location of the subject.
  • the data is acquired by a device worn by the subject.
  • the device is mobile and does not reduce a mobility of the subject.
  • the device processes the data.
  • the single location is an arm of the subject.
  • the single location is a wrist of the subject.
  • the method also includes determining, based on the data, that the subject has potentially missed a dose of a medication.
  • the method also includes providing a notification indicating that the subject has potentially missed the dose of the medication.
  • determining that the subject has potentially missed a dose of a medication includes determining that a blood pressure of the subject has crossed a threshold.
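The missed-dose check above can be sketched directly. The threshold value and notification text are illustrative assumptions (e.g. for an antihypertensive whose omission lets blood pressure rise), not the patent's values.

```python
# Hypothetical sketch: flag a potentially missed medication dose when a
# blood pressure reading crosses a threshold, as described above. The
# threshold and message are illustrative assumptions.

MISSED_DOSE_SYSTOLIC_MMHG = 150

def check_missed_dose(systolic_readings):
    """Return a notification string if any reading crosses the threshold."""
    if any(bp > MISSED_DOSE_SYSTOLIC_MMHG for bp in systolic_readings):
        return "You may have missed a dose of your medication."
    return None
```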
  • the method also includes identifying a first point in the first dataset, the first point representing an arrival time of the pulse pressure wave at a first body part of the subject.
  • the method also includes identifying a second point in the second dataset, the second point representing an earlier time at which the pulse pressure wave traverses a second body part of the subject.
  • the method also includes computing a pulse transit time (PTT) as a difference between the first and second points, the PTT representing a time taken by the pulse pressure wave to travel from the second body part to the first body part of the subject.
  • the first body part is the location of the subject at which the data in the first data set is acquired, and the second body part is the heart of the subject.
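The pulse transit time computation in the bullets above — the time difference between the pulse's traversal of the heart (seen in the motion dataset) and its arrival at the measurement location (seen in the optical dataset) — can be sketched as follows. The 80 Hz sampling rate (within the 75–85 Hz range mentioned earlier) and the simple peak-picking rule are illustrative assumptions.

```python
# Hypothetical sketch: compute pulse transit time (PTT) as the difference
# between the ejection point in the motion (MoCG) dataset and the arrival
# point in the optical (PPG) dataset. Sampling rate and peak-picking rule
# are illustrative assumptions.

SAMPLE_RATE_HZ = 80.0

def pulse_transit_time(ppg, mocg):
    """PTT in seconds between the MoCG ejection peak and the PPG arrival peak."""
    ejection_index = mocg.index(max(mocg))  # second point: at the heart
    arrival_index = ppg.index(max(ppg))     # first point: at the wrist
    return (arrival_index - ejection_index) / SAMPLE_RATE_HZ
```

Because PTT shortens as arterial pressure rises, a PTT series like this is what the surrounding bullets use to derive a blood pressure estimate.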
  • determining that the subject has potentially missed a dose of a medication includes determining that a heart rate of the subject has crossed a threshold.
  • determining that the subject has potentially missed a dose of a medication includes determining that a respiratory rate of the subject has crossed a threshold.
  • the method also includes determining, based on the data, a reaction of the subject to a medication.
  • the method also includes providing a recommended medication regimen of the medication based on the reaction of the subject to the medication.
  • the recommended medication regimen includes one or more recommended dosage timings.
  • the recommended medication regimen also includes one or more recommended dosage amounts. Each of the recommended dosage amounts corresponds to one of the dosage timings.
  • determining a reaction of the subject to a medication includes determining a blood pressure of the subject.
  • the blood pressure of the subject is determined periodically.
  • the recommended dosage timings and amounts are determined so as to maintain a blood pressure of the subject within a defined range.
  • determining a reaction of the subject to a medication includes determining a heart rate of the subject.
  • the heart rate of the subject is determined periodically.
  • determining a reaction of the subject to a medication includes determining a regularity of a heart rate of the subject.
  • the recommended dosage timings and amounts are determined so as to maintain a heart rate of the subject within a defined range.
  • determining a reaction of the subject to a medication includes determining a cardiac output of the subject.
  • the recommended dosage timings and amounts are determined so as to maintain a cardiac output of the subject within a defined range.
  • determining a reaction of the subject to a medication includes determining a temperature of the subject.
  • the recommended dosage timings and amounts are determined so as to maintain the temperature of the subject within a defined range.
  • the recommended dosage timings and amounts are determined so as to maintain a heart rate of the subject within a defined range.
  • determining a reaction of the subject to a medication includes determining a respiratory rate of the subject.
  • the respiratory rate of the subject is determined periodically.
  • the recommended dosage timings and amounts are determined so as to maintain a respiratory rate of the subject within a defined range.
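The regimen recommendations above all share one pattern: adjust dosage so a measured reaction stays within a defined range. A minimal sketch of that feedback rule, with a hypothetical target range and step size (here for systolic blood pressure under an antihypertensive), might look like this.

```python
# Hypothetical sketch: adjust a recommended dose so that a measured
# reaction (here systolic blood pressure) stays within a defined range.
# The range and step size are illustrative assumptions.

TARGET_RANGE = (110, 130)  # systolic mmHg
DOSE_STEP_MG = 5

def adjust_dose(current_dose_mg, measured):
    """Raise the dose when the measurement is above range, lower it when
    below, and keep it unchanged when within range."""
    low, high = TARGET_RANGE
    if measured > high:
        return current_dose_mg + DOSE_STEP_MG
    if measured < low:
        return max(0, current_dose_mg - DOSE_STEP_MG)
    return current_dose_mg
```

The same rule applies with different ranges for heart rate, cardiac output, temperature, or respiratory rate, per the bullets above.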
  • the biofeedback device also includes a motion sensor configured to provide data that represents time-varying information about motion of the subject acquired by the motion sensor at the location of the subject.
  • the processor is also configured to receive data from the motion sensor.
  • the processor is also configured to determine, based on the data, that the subject has potentially missed a dose of a medication and provide a notification indicating that the subject has potentially missed the dose of the medication.
  • the processor is also configured to determine, based on the data, a reaction of the subject to a medication and provide a recommended medication regimen of the medication based on the reaction of the subject to the medication.
  • the recommended medication regimen includes one or more recommended dosage timings.
  • the recommended medication regimen also includes one or more recommended dosage amounts, each of which corresponds to one of the dosage timings.
  • the operations also include processing data in a second dataset that represents time-varying information about motion of the subject acquired at the location of the subject.
  • In another aspect, a method includes processing data that represents time-varying information about at least one pulse pressure wave propagating through blood in each of two or more subjects acquired at a location of each of the subjects. The method also includes processing data that represents time-varying information about motion of the two or more subjects acquired at the location on each of the subjects. The method also includes providing information to a user that reports relative states of the subjects.
  • one or more machine-readable storage devices store instructions that are executable by one or more processing devices to perform operations including processing data that represents time-varying information about at least one pulse pressure wave propagating through blood in each of two or more subjects acquired at a location of each of the subjects.
  • the operations also include processing data that represents time-varying information about motion of the two or more subjects acquired at the location on each of the subjects.
  • the operations also include providing information to a user that reports relative states of the subjects.
  • In another aspect, a biofeedback device configured to be worn by each of two or more subjects includes a light source configured to emit light toward the skin of the subject.
  • the device also includes an optical sensor configured to receive the emitted light after the emitted light reflects off of the skin of the subject.
  • the optical sensor is also configured to provide data that corresponds to a characteristic of the received light, the data representing time-varying information about at least one pulse pressure wave propagating through blood in the subject acquired by the optical sensor at a location of the subject.
  • the device also includes a motion sensor configured to provide data that represents time-varying information about motion of the subject acquired by the motion sensor at the location of the subject.
  • the device also includes a processor configured to receive data from one or more of the light-emitting element, the optical sensor, and the motion sensor. The processor is also configured to provide information to a user that reports relative states of the subjects.
  • Implementations can include one or more of the following features.
  • the information about at least one pulse pressure wave propagating through blood in the subjects includes photoplethysmographic (PPG) data and the information about motion of the subjects includes one or both of motioncardiogram (MoCG) data and gross motion data.
  • the data is acquired continuously.
  • the data is acquired at a frequency of at least 16 Hz.
  • the data is acquired at a frequency of between 75 Hz and 85 Hz.
  • the data is acquired at single locations of each of the subjects.
  • the data is acquired by devices worn by the subjects.
  • the devices are mobile and do not reduce mobility of the subjects.
  • the devices process the data.
  • the single location of each of the subjects is an arm of the subject.
  • the single location is a wrist of the subject.
  • the relative states of the subjects are determined based on one or more of respiratory rates, heart rates, and blood pressures of the subjects.
  • the relative states of the subjects are determined by comparing one or more of the respiratory rates, the heart rates, and the blood pressures of the subjects to respective threshold values.
  • devices worn by the subjects acquire the data, and the respiratory rates, the heart rates, and the blood pressures of the subjects are determined according to the data.
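The relative-state determination described above can be sketched as a ranking of subjects by how many of their vitals satisfy threshold tests. The metric names, thresholds, and readiness rule are illustrative assumptions, not the patent's criteria.

```python
# Hypothetical sketch: report relative states of several subjects by
# comparing vitals to thresholds and ranking by the count of vitals in a
# "ready" range. Names, thresholds, and the rule are illustrative
# assumptions.

READINESS_THRESHOLDS = {"heart_rate": 120, "respiratory_rate": 25}

def relative_states(subjects):
    """Return subjects ordered from most to least ready."""
    def readiness(vitals):
        return sum(
            1 for name, limit in READINESS_THRESHOLDS.items()
            if vitals.get(name, limit) < limit
        )
    return sorted(subjects, key=lambda s: readiness(s["vitals"]), reverse=True)
```

A coach or commander could then use the resulting ordering to assign contest entries or combat tasks, per the bullets that follow.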
  • the method also includes managing the subjects based on the relative states.
  • the method also includes assigning tasks to the subjects based on the relative states of the subjects.
  • one or more of the subjects are put into an athletic contest according to the relative states of the subjects.
  • a subject is put into the athletic contest if one or more of the respiratory rate, the heart rate, and the blood pressure of the subject is above a respective threshold.
  • one or more of the subjects are assigned particular combat tasks according to the relative states of the subjects.
  • a subject is assigned a particular combat task if one or more of the respiratory rate, the heart rate, and the blood pressure of the subject is above a respective threshold.
  • the relative states include one or more of relative psychological states, relative physical states, and relative states of readiness.
  • the two or more subjects are managed based on the relative states.
  • the processor is also configured to assign tasks to the subjects based on the relative states of the subjects.
  • one or more of the subjects are put into an athletic contest according to the relative states of the subjects.
  • one or more of the subjects are assigned particular combat tasks according to the relative states of the subjects.
  • In another aspect, a method includes processing data in a first dataset that represents time-varying information about at least one pulse pressure wave propagating through blood in a subject acquired at a location of the subject while the subject is sleeping. The method also includes processing data in a second dataset that represents time-varying information about motion of the subject acquired at the location of the subject while the subject is sleeping. The method also includes determining, based on the data, information about a characteristic of the subject's sleep.
  • one or more machine-readable storage devices store instructions that are executable by one or more processing devices to perform operations including processing data in a first dataset that represents time-varying information about at least one pulse pressure wave propagating through blood in a subject acquired at a location of the subject while the subject is sleeping.
  • the operations also include processing data in a second dataset that represents time-varying information about motion of the subject acquired at the location of the subject while the subject is sleeping.
  • the operations also include determining, based on the data, information about a characteristic of the subject's sleep.
  • the device also includes an optical sensor configured to receive the emitted light after the emitted light reflects off of the skin of the subject.
  • the optical sensor is also configured to provide data that corresponds to a characteristic of the received light, the data representing time-varying information about at least one pulse pressure wave propagating through blood in the subject acquired by the optical sensor at a location of the subject.
  • the device also includes a motion sensor configured to provide data that represents time-varying information about motion of the subject acquired by the motion sensor at the location of the subject.
  • the device also includes a processor configured to receive data from one or more of the light-emitting element, the optical sensor, and the motion sensor. The processor is also configured to determine, based on the data, information about a characteristic of the subject's sleep.
  • Implementations can include one or more of the following features.
  • the information about at least one pulse pressure wave propagating through blood in the subject includes photoplethysmographic (PPG) data and the information about motion of the subject includes one or both of motioncardiogram (MoCG) data and gross motion data.
  • the data is acquired continuously.
  • the data is acquired at a frequency of at least 16 Hz.
  • the data is acquired at a frequency of between 75 Hz and 85 Hz.
  • the data is acquired at a single location of the subject.
  • the data is acquired by a device worn by the subject.
  • the device is mobile and does not reduce a mobility of the subject.
  • the device processes the data.
  • the single location is an arm of the subject.
  • the single location is a wrist of the subject.
  • the method also includes generating a reduced set of data by excluding data associated with non-sleep periods of the subject.
  • a period of time is identified as a non-sleep period based on gross motion data of the subject.
  • identifying the period of time as a non-sleep period includes determining that the gross motion data during the period of time is above a threshold.
  • identifying the period of time as a non-sleep period includes determining that the gross motion data during the period of time is substantially irregular.
  • a period of time is identified as a sleep period based on gross motion data of the subject.
  • identifying the period of time as a sleep period includes determining that the gross motion data during the period of time is below a threshold.
  • identifying the period of time as a sleep period includes determining that the gross motion data during the period of time is substantially flat.
  • the method also includes determining a start and an end of the sleep period.
  • determining the start of the sleep period includes identifying a time when the gross motion data falls below a threshold, and determining the end of the sleep period includes identifying a time when the gross motion data rises above a threshold.
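The motion-threshold segmentation described in the bullets above can be sketched as follows. The function name and threshold value are illustrative assumptions; a practical implementation would also apply the irregularity and flatness checks described above.

```python
def find_sleep_periods(gross_motion, threshold=0.05):
    """Classify each sample as sleep/non-sleep by comparing gross motion
    against a threshold, then return (start, end) index pairs for the
    contiguous runs classified as sleep.  Threshold is a placeholder."""
    periods = []
    start = None
    for i, m in enumerate(gross_motion):
        asleep = m < threshold               # below threshold -> candidate sleep sample
        if asleep and start is None:
            start = i                        # gross motion fell below threshold: sleep starts
        elif not asleep and start is not None:
            periods.append((start, i))       # gross motion rose above threshold: sleep ends
            start = None
    if start is not None:
        periods.append((start, len(gross_motion)))
    return periods
```

For example, a motion trace of `[0.3, 0.2, 0.01, 0.02, 0.01, 0.4, 0.02, 0.03]` yields two sleep periods, `(2, 5)` and `(6, 8)`.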
  • the method also includes calculating a property of the sleep of the subject based on the data.
  • the property is associated with one or more of heart rate, heart rate variability, activity level, respiratory rate, and blood pressure of the subject.
  • one or more of the heart rate, the heart rate variability, the activity level, the respiratory rate, and the blood pressure of the subject are determined based on the processed data.
  • determining the heart rate of the subject includes calculating a distance between two consecutive reference points in the first dataset, the distance representing a time that has elapsed between two consecutive heartbeats of the subject.
  • the reference points are local maxima or local minima.
  • the reference points are peaks or valleys.
  • determining the heart rate variability of the subject includes calculating distances between multiple pairs of consecutive reference points in the first dataset, each distance representing a time that has elapsed between two consecutive heartbeats of the subject.
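The reference-point arithmetic above can be sketched as follows, with the peak timestamps assumed to be already extracted from the first dataset. SDNN (standard deviation of beat-to-beat intervals) is used here as one possible variability metric; the disclosure does not specify which metric is used.

```python
def heart_rate_bpm(peak_times):
    """Mean heart rate from timestamps (seconds) of consecutive reference
    points (e.g., PPG peaks); each gap is one inter-beat interval."""
    intervals = [b - a for a, b in zip(peak_times, peak_times[1:])]
    mean_interval = sum(intervals) / len(intervals)
    return 60.0 / mean_interval

def heart_rate_variability(peak_times):
    """A simple HRV estimate: population standard deviation of the
    inter-beat intervals (SDNN).  Other HRV metrics could be used."""
    intervals = [b - a for a, b in zip(peak_times, peak_times[1:])]
    mean = sum(intervals) / len(intervals)
    return (sum((x - mean) ** 2 for x in intervals) / len(intervals)) ** 0.5
```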
  • determining the blood pressure of the subject includes identifying a first point in the first dataset, the first point representing an arrival time of the pulse pressure wave at a first body part of the subject. Determining the blood pressure of the subject also includes identifying a second point in the second dataset, the second point representing an earlier time at which the pulse pressure wave traverses a second body part of the subject. Determining the blood pressure of the subject also includes computing a pulse transit time (PTT) as a difference between the first and second points, the PTT representing a time taken by the pulse pressure wave to travel from the second body part to the first body part of the subject, wherein the PTT is related to an internal pressure of one or more blood vessels of the subject. Determining the blood pressure of the subject also includes determining the blood pressure of the subject based on the internal pressure of the one or more blood vessels.
  • the first body part is the location of the subject at which the data in the first data set is acquired, and the second body part is the heart of the subject.
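The PTT computation and the PTT-to-pressure step above can be sketched as below. The inverse-linear pressure model and its coefficients are illustrative assumptions; the disclosure states only that the PTT is related to the internal pressure of the blood vessels.

```python
def pulse_transit_time(arrival_time, origination_time):
    """PTT: time for the pulse pressure wave to travel from the second
    body part (e.g., the heart, seen in the second dataset) to the
    first body part (e.g., the wrist, seen in the first dataset)."""
    return arrival_time - origination_time

def estimate_blood_pressure(ptt, a, b):
    """Illustrative calibrated mapping from PTT to blood pressure.
    The inverse-linear form and the coefficients a, b are assumptions,
    to be obtained from a per-subject calibration."""
    return a / ptt + b
```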
  • the characteristic of the subject's sleep is determined based on the property.
  • the characteristic includes sleep apnea.
  • determining that the subject is experiencing sleep apnea includes identifying a simple signal in a heart rate signal of the subject that is acquired during a sleep period of the subject.
  • determining that the subject is experiencing sleep apnea includes identifying recurring simple signals in the heart rate signal of the subject.
  • the simple signals recur at least every two minutes during the sleep period of the subject.
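The recurring-signal criterion above can be sketched as a check that detected heart-rate signal events recur at least every two minutes; detection of the events themselves is assumed to have already produced the timestamps.

```python
def recurs_within(event_times, max_gap_s=120.0):
    """True if detected heart-rate signal events recur at least every
    two minutes (the recurring-signal apnea criterion above).
    event_times are timestamps in seconds, in ascending order."""
    if len(event_times) < 2:
        return False                     # a single event cannot recur
    gaps = [b - a for a, b in zip(event_times, event_times[1:])]
    return max(gaps) <= max_gap_s
```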
  • the characteristic includes a quality of the sleep, including one or more of a sleep duration, a sleep latency, a sleep staging, a number of disturbances, and a number of tosses and turns.
  • determining information about a characteristic of the subject's sleep includes determining the sleep duration of the subject.
  • determining the sleep duration of the subject includes determining a total length of time during which the subject was asleep based on information related to one or more sleep rest periods of the subject.
  • the information related to the one or more sleep rest periods includes a time associated with a beginning of each sleep rest period, a time associated with an end of each sleep rest period, gross motion data of the subject during each sleep rest period, and heart rate data of the subject during each sleep rest period.
  • determining information about a characteristic of the subject's sleep includes determining the sleep latency of the subject.
  • determining the sleep latency of the subject includes determining a length of time that it takes for the subject to transition from a state of wakefulness to the sleep state based on information related to one or more sleep rest periods of the subject and gross motion data of the subject before the subject fell asleep.
  • determining information about a characteristic of the subject's sleep includes determining the sleep staging of the subject.
  • determining the sleep staging of the subject includes determining a deepness of the subject's sleep during a portion of each of one or more sleep rest periods of the subject based on information related to the one or more sleep rest periods.
  • the sleep staging of the subject is determined based on at least a heart rate and gross motion data of the subject during one or more of the portions of the sleep rest periods.
  • the method also includes alerting the subject when the sleep duration exceeds a threshold while the subject is in a light sleep stage.
  • the characteristic includes a sleep disorder.
  • the characteristic includes a level of nocturnal dip of blood pressure.
  • the characteristic includes a sleep period.
  • the method also includes deriving a value representing an evaluation of a state of the subject based on the data.
  • the state of the subject includes a health-related state.
  • the state of the subject is associated with one or more of sleep quality, sleep duration, sleep latency, and sleep staging.
  • the value is provided to the subject or to another party.
  • the value is derived based on data related to motion of the subject.
  • the data is acquired by a device that is worn by the subject and that displays the value.
  • the device derives the value.
  • the device provides the data to a remote device that derives the value.
  • the method also includes processing data that represents information about an amount of ultraviolet light that the subject has been exposed to.
  • the method also includes correlating a characteristic of the subject's sleep to the amount of ultraviolet light that the subject has been exposed to.
  • the method also includes correlating a quality of the subject's sleep to the amount of ultraviolet light that the subject has been exposed to.
  • the method also includes correlating a duration of the subject's sleep to the amount of ultraviolet light that the subject has been exposed to.
  • the processor is also configured to identify a period of time as a non-sleep period based on gross motion data of the subject measured by the motion sensor.
  • identifying the period of time as a non-sleep period includes determining that the gross motion data during the period of time is above a threshold.
  • identifying the period of time as a non-sleep period includes determining that the gross motion data during the period of time is substantially irregular.
  • the processor is also configured to identify a period of time as a sleep period based on gross motion data of the subject measured by the motion sensor.
  • identifying the period of time as a sleep period includes determining that the gross motion data during the period of time is below a threshold.
  • identifying the period of time as a sleep period includes determining that the gross motion data during the period of time is substantially flat.
  • the processor is also configured to determine a start and an end of the sleep period.
  • determining the start of the sleep period includes identifying a time when the gross motion data falls below a threshold, and determining the end of the sleep period includes identifying a time when the gross motion data rises above a threshold.
  • the processor is also configured to calculate a property of the sleep of the subject based on the data.
  • the characteristic of the subject's sleep is determined based on the property, and the characteristic of the subject's sleep includes sleep apnea.
  • the processor is also configured to determine that the subject is experiencing sleep apnea. Determining that the subject is experiencing sleep apnea includes identifying a simple signal in a heart rate signal of the subject that is acquired during a sleep period of the subject.
  • determining that the subject is experiencing sleep apnea includes identifying recurring simple signals in the heart rate signal of the subject.
  • the simple signals recur at least every two minutes during the sleep period of the subject.
  • the characteristic includes a quality of the sleep, including one or more of latency to sleep, number of disturbances, and number of tosses and turns.
  • In another aspect, a method includes processing data in a first dataset that represents time-varying information about at least one pulse pressure wave propagating through blood in a subject acquired at a location of the subject.
  • the data in the first and second datasets is acquired while the subject is in a situation that requires at least a predetermined amount of alertness of the subject.
  • one or more machine-readable storage devices stores instructions that are executable by one or more processing devices to perform operations including processing data in a first dataset that represents time-varying information about at least one pulse pressure wave propagating through blood in a subject acquired at a location of the subject. The data is acquired while the subject is in a situation that requires at least a predetermined amount of alertness of the subject.
  • the device also includes an optical sensor configured to receive the emitted light after the emitted light reflects off of the skin of the subject.
  • the optical sensor is also configured to provide data that corresponds to a characteristic of the received light, the data representing time-varying information about at least one pulse pressure wave propagating through blood in the subject acquired by the optical sensor at a location of the subject.
  • the device also includes a processor configured to receive data from one or both of the light-emitting element and the optical sensor.
  • the processor is also configured to process the data to derive a measure of alertness of the subject.
  • Implementations can include one or more of the following features.
  • the method also includes processing data in a second dataset that represents time-varying information about motion of the subject acquired at the location of the subject.
  • the information about at least one pulse pressure wave propagating through blood in the subject includes photoplethysmographic (PPG) data and the information about motion of the subject includes one or both of motioncardiogram (MoCG) data and gross motion data.
  • the data is acquired continuously.
  • the data is acquired at a frequency of at least 16 Hz.
  • the data is acquired at a frequency of between 75 Hz and 85 Hz.
  • the data is acquired at a single location of the subject.
  • the data is acquired by a device worn by the subject.
  • the device is mobile and does not reduce a mobility of the subject.
  • the device processes the data.
  • the single location is an arm of the subject.
  • the single location is a wrist of the subject.
  • the situation includes one in which a likelihood of harm to one or more human lives is increased if the alertness of the subject is below the predetermined amount.
  • the situation is one in which a likelihood of damage to one or more properties is increased if the alertness of the subject is below the predetermined amount.
  • the situation is one in which a likelihood of economic damage is increased if the alertness of the subject is below the predetermined amount.
  • the situation is one or more of air traffic control, intelligence analysis, vehicle driving, machinery driving, security guarding, baggage screening, and aircraft piloting.
  • the method also includes using the processed data to derive a measure of alertness of the subject.
  • the measure of alertness of the subject is based on one or more of a heart rate, a respiratory rate, a blood pressure, and an activity level of the subject.
  • the method also includes activating an alarm on a device worn by the subject if the measure of alertness of the subject falls below a threshold.
  • the device worn by the subject acquires the data.
  • the device worn by the subject processes the data.
  • the method also includes causing a speed of a vehicle being operated by the subject to be decreased if the measure of alertness of the subject falls below a threshold.
  • the method also includes causing an alarm in a vehicle being operated by the subject to be activated if the measure of alertness of the subject falls below a threshold.
  • the method also includes causing a device being operated by the subject to be turned off if the measure of alertness of the subject falls below a threshold.
  • the method also includes causing an operation switch of a vehicle being operated by the subject to be turned off if the measure of alertness of the subject falls below a threshold.
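The threshold-triggered interventions enumerated above can be sketched as a single dispatch function; the action names and the threshold value are illustrative placeholders, not part of the disclosure.

```python
def responses_for_alertness(alertness, threshold=0.5):
    """Map a low alertness measure to the interventions listed above
    (wearable alarm, vehicle slow-down, in-vehicle alarm, device
    shut-off).  Returns no actions when alertness meets the threshold."""
    if alertness >= threshold:
        return []
    return ["activate_wearable_alarm",
            "decrease_vehicle_speed",
            "activate_vehicle_alarm",
            "turn_off_operated_device"]
```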
  • the method also includes assigning a task to the subject based on the measure of alertness.
  • the subject is put into an athletic contest if the measure of alertness of the subject is above a threshold.
  • the subject is assigned a particular combat task if the measure of alertness of the subject is above a threshold.
  • the biofeedback device also includes a motion sensor configured to provide data that represents time-varying information about motion of the subject acquired by the motion sensor at the location of the subject.
  • the processor is also configured to receive and process the data from the motion sensor.
  • the biofeedback device also includes a transceiver configured to provide one or both of the processed data and the measure of alertness.
  • the transceiver is also configured to cause a speed of a vehicle being operated by the subject to be decreased if the measure of alertness of the subject falls below a threshold.
  • the transceiver is also configured to cause an alarm in a vehicle being operated by the subject to be activated if the measure of alertness of the subject falls below a threshold.
  • the transceiver is also configured to cause a device being operated by the subject to be turned off if the measure of alertness of the subject falls below a threshold.
  • the transceiver is also configured to cause an operation switch of a vehicle being operated by the subject to be turned off if the measure of alertness of the subject falls below a threshold.
  • the processor is also configured to assign a task to the subject based on the measure of alertness.
  • the subject is put into an athletic contest if the measure of alertness of the subject is above a threshold.
  • the subject is assigned a particular combat task if the measure of alertness of the subject is above a threshold.
  • operations also include processing data in a second dataset that represents time-varying information about motion of the subject acquired at the location of the subject.
  • In another aspect, a method includes processing data in a first dataset that represents time-varying information about at least one pulse pressure wave propagating through blood in a subject acquired at a location of the subject. The method also includes predicting a medical event of the subject based on the processed data.
  • one or more machine-readable storage devices stores instructions that are executable by one or more processing devices to perform operations including processing data in a first dataset that represents time-varying information about at least one pulse pressure wave propagating through blood in a subject acquired at a location of the subject.
  • the operations also include predicting a medical event of the subject based on the processed data.
  • the device also includes an optical sensor configured to receive the emitted light after the emitted light reflects off of the skin of the subject.
  • the optical sensor is also configured to provide data that corresponds to a characteristic of the received light, the data representing time-varying information about at least one pulse pressure wave propagating through blood in the subject acquired by the optical sensor at a location of the subject.
  • the device also includes a processor configured to receive data from one or both of the light-emitting element and the optical sensor.
  • the processor is also configured to predict a medical event of the subject based on the data.
  • Implementations can include one or more of the following features.
  • the method also includes processing data in a second dataset that represents time-varying information about motion of the subject acquired at the location of the subject.
  • the information about at least one pulse pressure wave propagating through blood in the subject includes photoplethysmographic (PPG) data and the information about motion of the subject includes one or both of motioncardiogram (MoCG) data and gross motion data.
  • the data is acquired continuously.
  • the data is acquired at a frequency of at least 16 Hz.
  • the data is acquired at a frequency of between 75 Hz and 85 Hz.
  • the data is acquired at a single location of the subject.
  • the data is acquired by a device worn by the subject.
  • the device is mobile and does not reduce a mobility of the subject.
  • the device processes the data.
  • the single location is an arm of the subject.
  • the single location is a wrist of the subject.
  • the method also includes alerting a caregiver when a medical event of the subject is predicted.
  • processing the data includes determining one or more of heart rate, heart rate variability, blood pressure, blood pressure variability, body temperature, skin temperature, vocal tonality, electrical skin impedance, respiratory rate, blood oxygen level, stroke volume, cardiac output, MoCG morphology, and PPG morphology of the subject.
  • predicting the medical event of the subject includes determining whether a heart rate of the subject satisfies a threshold.
  • the medical event is tachycardia.
  • determining the heart rate of the subject includes calculating a distance between two consecutive reference points in the first dataset, the distance representing a time that has elapsed between two consecutive heartbeats of the subject.
  • the reference points are local maxima or local minima.
  • the reference points are peaks or valleys.
  • predicting the medical event of the subject includes determining whether a heart rate variability of the subject satisfies a threshold.
  • the threshold is based on whether the subject experiences arrhythmia.
  • determining the heart rate variability of the subject includes calculating distances between multiple pairs of consecutive reference points in the first dataset, each distance representing a time that has elapsed between two consecutive heartbeats of the subject.
  • the reference points are local maxima or local minima.
  • the reference points are peaks or valleys.
  • predicting the medical event of the subject includes determining whether a blood pressure of the subject satisfies a threshold.
  • the medical event is hypertension.
  • predicting the medical event of the subject includes determining a rate of change of a blood pressure of the subject.
  • the medical event is a stroke.
  • a stroke is predicted if the rate of change of the blood pressure of the subject is positive and above a threshold.
  • the medical event is abnormal heart function.
  • abnormal heart function is predicted if the rate of change of the blood pressure of the subject is negative and below a threshold.
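The prediction rules above can be sketched as threshold checks; all numeric limits here are illustrative placeholders rather than clinical values from the disclosure.

```python
def predict_medical_events(heart_rate, blood_pressure, bp_rate_of_change,
                           hr_limit=120.0, bp_limit=140.0, slope_limit=5.0):
    """Apply the threshold rules enumerated above and return the list
    of predicted events.  Limits are illustrative placeholders."""
    events = []
    if heart_rate > hr_limit:
        events.append("tachycardia")             # heart rate above threshold
    if blood_pressure > bp_limit:
        events.append("hypertension")            # blood pressure above threshold
    if bp_rate_of_change > slope_limit:
        events.append("stroke")                  # positive BP slope above threshold
    elif bp_rate_of_change < -slope_limit:
        events.append("abnormal heart function") # negative BP slope below threshold
    return events
```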
  • the method also includes identifying a first point in the first dataset, the first point representing an arrival time of the pulse pressure wave at a first body part of the subject.
  • the method also includes identifying a second point in the second dataset, the second point representing an earlier time at which the pulse pressure wave traverses a second body part of the subject.
  • the method also includes computing a pulse transit time (PTT) as a difference between the first and second points, the PTT representing a time taken by the pulse pressure wave to travel from the second body part to the first body part of the subject.
  • the blood pressure of the subject is determined based on the PTT.
  • the first body part is the location of the subject at which the data in the first data set is acquired, and the second body part is the heart of the subject.
  • the device also includes a motion sensor configured to provide data that represents time-varying information about motion of the subject acquired by the motion sensor at the location of the subject.
  • the processor is also configured to receive data from the motion sensor.
  • the device also includes a transceiver configured to alert a caregiver when a medical event of the subject is predicted.
  • the operations also include processing data in a second dataset that represents time-varying information about motion of the subject acquired at the location of the subject.
  • Blood pressure and/or other biometric parameters may be measured based on continuously acquired data, without the need for cuffs, pressure points or electrodes.
  • Acquiring data continuously means acquiring data at a sufficient frequency (e.g., a sufficient number of times per second) to allow for the derivation of the parameters described herein from that data.
  • the data can, for example, be collected at a frequency ranging from 16 Hz to 256 Hz. In certain implementations, the data is acquired at a frequency of between 75 Hz and 85 Hz.
  • Vital signs can be measured at one location, using a comfortable and unobtrusive device.
  • the disclosed technology may be integrated with third-party devices (for example, mobile devices), thereby allowing the use of external sensors, such as motion detectors and light sensors, disposed in the third-party devices.
  • FIG. 1A illustrates pulse transit time (PTT) calculation using an example BCG plot and a photoplethysmogram (PPG) plot.
  • FIGS. 1B and 1C are example block diagrams of a device that performs biometric measurements based on MoCG and PPG data.
  • FIGS. 1D-1F are plots generated based on data collected using sensors of the device of FIGS. 1B and 1C.
  • FIG. 1G illustrates side and top views of an example configuration of optical sensors that can be used in the device of FIGS. 1B and 1C.
  • FIGS. 2A-2C, 3, and 4 illustrate plots generated based on data collected by the sensors of the device of FIGS. 1B and 1C.
  • FIGS. 5A-5E illustrate examples of cardiac signals.
  • FIGS. 6A-6C are flowcharts depicting example processes for biometric authentication.
  • FIG. 7A is a flowchart depicting an example of a process for calculating motion pulse transit time (MPTT).
  • FIG. 7B is a flowchart depicting an example of another process for calculating MPTT.
  • FIG. 8 shows examples of heat maps that relate to data collected from the motion sensors of the device of FIGS. 1B and 1C, and are used in determining weights for data corresponding to accelerometers oriented along different axes.
  • FIGS. 9, 10A-10C, 11A, and 11B illustrate plots used in calculating MPTT.
  • FIG. 12 is a flowchart depicting an example of a process for calibration of the device of FIGS. 1B and 1C.
  • FIGS. 13 and 14 illustrate examples related to calibration of the device of FIGS. 1B and 1C.
  • FIGS. 15A-15D and 16A-16C show examples of plots used in detecting various heart conditions.
  • FIG. 17 is a flowchart of an example of a process for detecting arrhythmia.
  • FIG. 18 is an example of a plot of arterial stiffness vs. exercise frequency.
  • FIGS. 19A and 19B are examples of plots used in determining sleep quality and/or sleep disorders.
  • FIG. 20 is an example of a screenshot for showing sleep quality.
  • FIG. 21 is a flowchart depicting an example of a process for determining sleep quality.
  • FIG. 22 is an example of a screenshot for showing a fitness-related metric.
  • FIG. 23 is an example of a screenshot for showing a stress-related metric.
  • FIG. 24 is a flowchart depicting an example of a process for deriving information about a psychological state of a subject.
  • FIG. 25 is a flowchart depicting an example of a process for determining a metric for quality of care provided at a care facility.
  • FIG. 26 shows an example where the technology described is used by emergency responders.
  • FIG. 27 is a flowchart depicting an example of a process for determining relative states of multiple subjects.
  • FIG. 28 is a flowchart depicting an example of a process for predicting a medical event.
  • FIG. 29 is a flowchart depicting an example of a process for determining information about a medication regimen.
  • FIG. 30 shows an example where the technology is used at a medical or caregiving facility.
  • FIG. 31 shows an example of the technology being used with a proximity system.
  • FIGS. 32A and 32B show an example implementation of the device of FIGS. 1B and 1C in the form of a wearable watch.
  • FIG. 33 shows an example of an environment where the technology is used for access control.
  • FIG. 34 shows an example where the technology is used for allowing a user to access/operate a vehicle or other machinery.
  • FIG. 35 shows an example where the technology is used for controlling gaming and/or entertainment systems.
  • FIG. 36 shows an example where the technology is used for controlling various devices connected to a network.
  • FIG. 37 is an example of a screenshot that displays and allows sharing of blood pressure results.
  • FIG. 38 is a flowchart depicting an example of a process for controlling remote devices using the technology described in this document.
  • FIGS. 39A-39C show examples of user interfaces of an application that makes data collected by the device of FIGS. 1B and 1C available to a user.
  • FIG. 40 is an example of a block diagram of a computer system.
  • This document describes technology for determining pulse transit time (PTT) of blood based on motion data such as motioncardiogram (MoCG) data (which is related to, and also referred to in this document as ballistocardiogram (BCG) data) and optical data such as photoplethysmographic (PPG) data.
  • the terms PTT and MPTT may be used interchangeably.
  • This document also describes technology for performing various biometric measurements (e.g., blood pressure, respiratory rate, blood oxygen level, stroke volume, cardiac output, arterial stiffness, and temperature) based on the MoCG data and the PPG data.
  • the MoCG is an example of a motion of the subject.
  • MoCG is a pulsatile motion signal of the body measurable, for example, by a motion sensor such as an accelerometer or a gyroscope.
  • the pulsatile motion signal results from a mechanical motion of portions of the body that occurs in response to mechanical motion of the heart.
  • the pulsatile motion signal can result from mechanical motion of portions of the body that occurs in response to blood being pumped during a heartbeat. This motion is a mechanical reaction of the body to the internal pumping of blood and is externally measurable.
  • the MoCG signal therefore corresponds to, but is delayed from, the heartbeat.
  • the MoCG signal recorded at a given portion of the body therefore represents the motion of the blood due to a heartbeat, but is delayed from the heart's electrical activation (e.g., when the ventricles are electrically depolarized).
  • PPG data is data optically obtained via a plethysmogram, a volumetric measurement of the vasculature.
  • PPG data can be obtained using an optical device which illuminates the skin and measures changes in light absorption. With each cardiac cycle the heart pumps blood resulting in a pressure pulse wave within the vasculature. This causes time-varying changes in the volume of the vasculature. The changes can be detected, for example, by illuminating the skin with light from a light-emitting diode (LED) and then measuring the amount of light either transmitted or reflected to a detector such as a photodiode. Each cardiac cycle is therefore represented as a pattern of crests and troughs. The shape of the PPG waveform differs from subject to subject, and varies with the location and manner in which the waveform is recorded.
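The reflected-light measurement described above yields a raw intensity signal whose cardiac crests and troughs ride on a slow baseline. A minimal sketch of isolating the pulsatile component by subtracting a moving-average baseline follows; the window length and function name are illustrative assumptions, not details from the disclosure.

```python
def pulsatile_component(raw, window=5):
    """Subtract a centered moving-average baseline from raw
    reflected-light samples, leaving the pulsatile (cardiac)
    component of the PPG.  Window length is a placeholder."""
    half = window // 2
    out = []
    for i in range(len(raw)):
        lo, hi = max(0, i - half), min(len(raw), i + half + 1)
        baseline = sum(raw[lo:hi]) / (hi - lo)  # local DC level
        out.append(raw[i] - baseline)           # AC (pulsatile) residue
    return out
```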
  • FIG. 1A illustrates pulse transit time (PTT) calculation using an example BCG plot 102 and a photoplethysmogram (PPG) plot 103.
  • BCG plot 102 can be analyzed to determine points at which a pulse (or pressure wave) originates at a first location on the body. The BCG, however, may be measured at a second location on the body.
  • the points (e.g., local maxima) 108 a , 108 b , and 108 c in the BCG plot 102 may represent time points at which corresponding pulses originate at or near the chest. These points are often referred to in this document as pulse origination points.
  • the time of arrival of the pulse at a second location can be determined from PPG data obtained at the second location.
  • the PPG data can be measured at the wrist using one or more optical sensors.
  • Light from the optical sensors (i.e., the light sources such as LEDs of the optical sensors) is directed toward the skin, and the reflected light, which is modulated by blood volume changes underneath the skin, is received by a photo-detector.
  • the output of the photo-detector may be amplified by an amplifier before being converted to a digital signal (for example, by an analog to digital converter (ADC)) that represents the PPG.
  • the plot 103 of FIG. 1A represents PPG data that can be used to determine the arrival time of the pulses at the wrist.
  • the maximum slope points 109 a , 109 b , and 109 c represent the arrival times of the pulses that originated at the chest at time points represented by 108 a , 108 b , and 108 c , respectively. These points may in general be referred to in this document as pulse arrival points 109 .
  • the plot 103 is synchronized with the BCG plot 102 such that the PTT (or MPTT) 113 between the chest and the wrist can be determined as a time difference between the originating point at the chest and the corresponding arrival point at the wrist.
  • the time difference between 108 b and 109 b represents the PTT 113 .
  • the time difference between 108 a and 109 a , or the time difference between 108 c and 109 c can be used in determining the PTT 113 .
  • the technology described in this document allows for determination of PTT from MoCG (or BCG) and PPG data measured at substantially the same location on a human body (e.g., the wrist). This includes identifying, from the PPG data, a time point (e.g., the time points 109 ) at which a pulse wave arrives at the location, identifying, from the BCG data, a time point (e.g., the time points 108 ) at which the pulse originated at a different location on the body (e.g., the heart) from the MoCG data, and determining the PTT 113 as a difference between the two identified time points.
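As a sketch of this pairing step, assume per-beat time stamps have already been extracted from the two synchronized signals (the pulse origination points from the MoCG/BCG and the maximum-slope arrival points from the PPG); the time-stamp values below are hypothetical:

```python
import numpy as np

def pulse_transit_times(origin_times, arrival_times):
    """Pair each pulse origination time (e.g., a BCG local maximum)
    with the next pulse arrival time (e.g., a PPG maximum-slope point)
    and return the per-beat differences, i.e., the PTT."""
    ptts = []
    for t0 in origin_times:
        later = arrival_times[arrival_times > t0]
        if later.size:
            ptts.append(later[0] - t0)
    return np.array(ptts)

# Hypothetical time stamps in seconds: pulses originate at the chest
# and arrive at the wrist roughly 0.12 s later.
origins = np.array([0.50, 1.45, 2.40])
arrivals = np.array([0.62, 1.57, 2.52])
print(pulse_transit_times(origins, arrivals))   # ~0.12 s per beat
```

The pairing rule (nearest later arrival) is a simplification; a real implementation would also guard against missed or spurious beats.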
  • FIG. 1B is a block diagram of an example of a device 100 that performs biometric measurements based on MoCG and PPG data.
  • the biometric measurements can be used for monitoring health related parameters, as well as in diagnosing conditions and predicting an onset of such conditions.
  • the device 100 can be a wearable device that a subject can wear on the body.
  • the device 100 can be disposed in a wearable watch, bracelet, anklet, armband, chest-patch, or belt.
  • An example implementation of the device in the form of a wearable watch 3200 is shown in FIGS. 32A and 32B .
  • the watch 3200 includes a case 3202 that is configured to hold the internal components of the watch, including light sources 3204 a , 3204 b , an optical sensor 3206 , a motion sensor 3208 , a processor 3210 , and an ultraviolet light sensor 3212 .
  • the device may also be disposed as a part of a garment worn by the subject.
  • the device 100 may also be disposed in a rug or mat (e.g., a bathroom mat or a shower mat).
  • the device 100 may also be disposed in a separate device carried or worn by the subject.
  • the device 100 can be disposed internally or externally in a watch or mobile device used by the subject.
  • the device 100 can include a transceiver that is configured to communicate wirelessly with another device to perform a biometric monitoring process. For example, data collected and/or computed by the device 100 may be transmitted to an application executing on a mobile device for additional analysis or storage.
  • alerts and messages may be transmitted from a server or mobile device for display on the device 100 .
  • Devices similar to the device 100 are described in U.S. patent application Ser. Nos. 13/166,388 and 13/803,165, and 61/660,987, the contents of which are incorporated by reference herein.
  • Various combinations of the operations described in this document may also be performed by a general purpose computing device that executes appropriate instructions encoded on a non-transitory computer readable storage device such as an optical disk, a hard disk, or a memory device.
  • the device 100 can be configured to make MoCG and PPG measurements either directly (such as when implemented as a part of an armband, wristband, chest patch, undergarment) or indirectly (such as when implemented as part of a mobile device) from a portion of the body proximate to the location of the device.
  • the MoCG data can be measured using one or more motion sensors 105 such as an accelerometer or a gyroscope.
  • the motion sensors 105 include multiple accelerometers (e.g., one for each of the x, y, and z axes) and/or multiple gyroscopes (e.g., one each for measuring tilt, rotation, and yaw).
  • the device 100 can include one or more sensors to measure or detect ambient conditions.
  • sensors can include, for example, a microphone (e.g., to measure environmental noise), an altimeter, a humidity sensor, a GPS device (for determining geographical location), and an ultraviolet light sensor (e.g. to detect level of sun exposure).
  • the device 100 can be configured to warn the user (for example, by displaying a message) if a measured, derived, or inferred health parameter is outside an acceptable range for the parameter.
  • health parameters can include (without being limited to the following) measured parameters such as heart rate, respiratory rate, or arrhythmia, derived parameters such as blood pressure, stroke volume, or arterial stiffness, and inferred parameters such as mood, stress level, or sleep deprivation.
  • the level of sun exposure (as measured by the ultraviolet light sensor) can be correlated to the mood or stress level of the user, and related suggestions and recommendations can be provided accordingly. For example, if sun exposure above a certain threshold level is known to decrease stress for a particular user, the user may be asked to increase sun exposure during a period when a stress level detected by the device 100 is high.
  • environmental sounds captured by the microphone can be used to contextualize or interpret vital signs data captured using the device 100 .
  • environmental noise can be detected during a user's commute to determine, for example, if, and to what extent driving (or rush hour subway) affects the user's health parameters.
  • the data captured by the microphone can be used to determine and/or confirm whether a detected condition is attributable to environmental noise (e.g., snoring, or an alarm clock going off).
  • an unacceptable condition (e.g., a user's increased stress level) may be correlated with construction activity (determined, for example, via pile driver sounds captured by the microphone).
  • the data captured using the motion sensors 105 includes both MoCG data and motion data associated with an activity of the subject.
  • the MoCG data can be filtered out from the combination using, for example, one or more band pass filters (BPF) 125 shown in FIG. 1C .
  • a pass band of the BPF 125 can be designed to filter out constant components (e.g., acceleration due to gravity) and high frequency noise components.
  • a pass band of 3-12 Hz may be used for the band pass filter 125 .
  • multiple band pass filters may be used concurrently.
  • a filter with a 3-12 Hz passband and another filter with a 10-30 Hz passband can be used simultaneously to measure different parameters measurable in the two different bands.
  • the band pass filtered accelerometer outputs can be combined to obtain an activity index 127 , which in turn is used in calculating appropriate weights 130 for obtaining updated biometric measurements 132 .
  • the activity index 127 can be less than a threshold value (e.g., 5) indicating, for example, that the band pass filtered accelerometer outputs can be used directly in determining the biometric measurements.
  • the activity index 127 can be higher (e.g., between 5 and 15), indicating that the band pass filtered accelerometer data may need to be adjusted (e.g., by applying a threshold) before being used in determining the biometric measurements.
  • the band pass filtered accelerometer data may be discarded as being unreliable.
  • weights 130 may be adjusted to reflect if and how the band pass filtered data from the accelerometer 105 is used. Examples of band pass filtered accelerometer data are illustrated in FIG. 1F , where plots 170 , 172 , and 174 represent outputs of accelerometers in the x, y, and z axes, respectively.
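A minimal sketch of this activity-index weighting, using the 5 and 15 example thresholds from above; the linear de-weighting between the two thresholds is an assumption for illustration:

```python
def mocg_weight(activity_index, low=5.0, high=15.0):
    """Map an activity index to a weight for the band pass filtered
    accelerometer data. The 5/15 thresholds follow the examples in the
    text; the linear ramp between them is an illustrative choice."""
    if activity_index < low:
        return 1.0          # data can be used directly
    if activity_index <= high:
        # data needs adjustment: linearly de-weight it
        return 1.0 - (activity_index - low) / (high - low)
    return 0.0              # discard as unreliable

print(mocg_weight(2.0), mocg_weight(10.0), mocg_weight(20.0))  # -> 1.0 0.5 0.0
```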
  • the PPG data can be measured using one or more optical sensors 110 .
  • the optical sensors 110 can include one or more light emitting diodes (LEDs) whose output can be controlled, for example, by a microcontroller.
  • Example configurations of the optical sensors 110 are depicted in FIG. 1G .
  • the optical sensors include a 7.5 mm² photodiode with two green LEDs placed within 1.5 mm of either side. The photodiode has an opaque optical shield surrounding the sides. The LEDs can have a peak wavelength of 525 nm and a viewing angle of 60 degrees.
  • In operation, light from the optical sensors 110 (i.e., from the light sources such as LEDs of the optical sensors) is directed toward the skin of the subject, and the reflected light is modulated by blood flow underneath the skin.
  • the optical sensors 110 also include one or more photo-detectors (e.g., photodiodes) that receive the reflected light and provide a resulting signal to the microcontroller.
  • the resulting signal may be amplified by an amplifier before being converted to a digital signal (for example, by an analog to digital converter (ADC)) that is provided to the microcontroller.
  • the PPG signal is synchronized with the heartbeat and can therefore be used to determine the heart rate (HR) 112 of a wearer of the device. This is shown in additional detail in FIG. 1C .
  • the heart rate signal can be within a particular range of the spectrum (e.g., 0 to one half of the sampling frequency) of the PPG signal 150 , and can be isolated using, for example, a band pass filter (BPF) 154 .
  • An example of this is shown in FIG. 1D , where the plot 160 represents raw PPG data, and the plot 162 represents the output of the BPF 154 .
  • the pass band of the filter used for the example depicted in FIG. 1D is 0.4-4 Hz. As seen from FIG. 1D , the low frequency portion of the raw data, as well as the high frequency variations, are filtered out in the output plot 162 .
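The 0.4-4 Hz band-pass step can be sketched with a simple FFT-masking filter standing in for the BPF 154 (a real device would likely use a causal FIR/IIR filter); the sampling rate and the synthetic signal below are assumptions:

```python
import numpy as np

def bandpass_fft(x, fs, lo=0.4, hi=4.0):
    """Zero out spectral components outside [lo, hi] Hz -- a simple,
    non-causal stand-in for the band pass filter in the text."""
    X = np.fft.rfft(x)
    f = np.fft.rfftfreq(len(x), d=1.0 / fs)
    X[(f < lo) | (f > hi)] = 0.0
    return np.fft.irfft(X, n=len(x))

fs = 80.0                               # assumed sampling rate
t = np.arange(800) / fs                 # 10 s of samples
# Synthetic "PPG": 1.2 Hz heartbeat, slow respiratory sway, DC drift
raw = np.sin(2 * np.pi * 1.2 * t) + 0.5 * np.sin(2 * np.pi * 0.25 * t) + 0.3
filt = bandpass_fft(raw, fs)
spectrum = np.abs(np.fft.rfft(filt))
peak_hz = np.fft.rfftfreq(len(filt), 1.0 / fs)[np.argmax(spectrum)]
print(round(peak_hz, 1))                # dominant survivor is the heart rate
```

After filtering, the dominant component is the 1.2 Hz heartbeat (72 bpm); the drift and sub-0.4 Hz respiration are removed, mirroring the plot 160 to plot 162 transformation.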
  • a sampling frequency between 75 and 85 Hz may be chosen such that reasonable power saving is achieved and optical interferers are aliased into a non-biological optical signal frequency range (>10 Hz). For example, if 80 Hz is chosen, then the aliased interferers would appear at frequencies such as 20 Hz, 30 Hz, and/or 40 Hz.
  • An appropriate low pass filter (e.g., a filter with a cut-off frequency of 10 Hz) can then be used to remove the aliased interferers, and the filtered PPG signal can be interpolated accordingly in the time domain without signal loss.
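The folding arithmetic behind the example alias frequencies can be checked directly; the interferer frequencies below are hypothetical inputs, not values from the text:

```python
def alias(f, fs):
    """Frequency at which a component at f Hz appears after sampling
    at fs Hz (standard folding about multiples of fs)."""
    f = f % fs
    return min(f, fs - f)

# With an 80 Hz sampling rate, hypothetical interferers at 100, 110,
# and 120 Hz fold to 20, 30, and 40 Hz -- all above the ~10 Hz
# biological band, so a 10 Hz low pass filter removes them.
print(alias(100, 80), alias(110, 80), alias(120, 80))  # -> 20 30 40
```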
  • the output of the BPF 154 can be used to determine a heart rate 144 of the subject, and can also be combined with the output of the BPF 125 to determine other biometric parameters such as pulse transit time (MPTT) and stroke volume (SV) 145 , as well as other parameters 146 , including, for example, systolic and diastolic blood pressure, stroke volume (SV), and cardiac output (CO).
  • calibration data 155 is used in computing one or more of the parameters 146 .
  • the calibration data 155 can include user-specific calibration information (e.g., constants used in equations) that may be used in computing one or more of the parameters 146 .
  • the calibration data 155 can be computed based on user-provided data. For example, a user may be asked to provide biographical data such as age, height, and weight for use in computing the calibration data. In some implementations, the user can be asked to provide his/her last-known blood-pressure data to determine one or more constants or parameters included in the calibration data 155 . In some cases, a medical professional may measure a user's blood pressure during set up of the device 100 .
  • calibration data 155 can be calculated based on a user action. For example, the user may be asked to hold the device 100 at or near chest level to equalize hydrostatic pressure effects and sense chest vibrations that are used in computing a calibration point. This way, a delay between a chest vibration and the time of arrival of a pulse wave at the wrist (if the device 100 is worn on the wrist) can be used to calibrate for blood pressure for a scenario where there is no height difference between the heart and the measuring point.
  • the calibration data 155 can include information related to skin tone calibration where LED intensity and amplifier gain are adjusted until an optimal DC level is reached.
  • Default calibration data may be included in the calibration data 155 .
  • the calibration factors may be adjusted retroactively once the user enters valid calibration data.
  • Calibration data may also be imported from the user's medical records if, for example, the device is dispensed to the user by their medical professional.
  • a signal representing respiratory rate is typically within the 0-1 Hz range of PPG, and can be obtained using low pass filtering.
  • This is illustrated in FIG. 1C , where the PPG data 150 is passed through the low pass filter (LPF) 152 and optionally combined with the output of another LPF 135 (used for low pass filtering the MoCG data) to obtain biometric parameters such as sleep data 142 and respiratory rate 143 .
  • An example of determining the respiratory rate 143 from the PPG data 150 is illustrated in FIG. 1E .
  • the plot 166 represents the raw PPG data
  • the plot 168 shows the output of the LPF 152 representing the low frequency variations due to respiration.
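A rough sketch of this respiratory-rate path, using a spectral peak below 1 Hz as a stand-in for the LPF 152 output; the band edges, sampling rate, and synthetic signal are assumptions:

```python
import numpy as np

def respiratory_rate_bpm(ppg, fs):
    """Estimate respiratory rate from the sub-1 Hz band of a PPG
    segment (illustrative stand-in for the low-pass-filter path)."""
    X = np.abs(np.fft.rfft(ppg - np.mean(ppg)))
    f = np.fft.rfftfreq(len(ppg), 1.0 / fs)
    band = (f > 0.05) & (f < 1.0)        # assumed respiration band
    return 60.0 * f[band][np.argmax(X[band])]

fs = 80.0
t = np.arange(2400) / fs                 # 30 s of samples
# Synthetic PPG: 1.1 Hz pulse plus a 0.2 Hz (12 breaths/min) baseline sway
ppg = np.sin(2 * np.pi * 1.1 * t) + 0.4 * np.sin(2 * np.pi * 0.2 * t)
print(round(respiratory_rate_bpm(ppg, fs)))   # ~12 breaths per minute
```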
  • Other biometric parameters, such as blood oxygenation (SpO 2 ), may also be computed.
  • the device 100 can also include a computing device 115 that can be configured to compute the biometric parameters, including, for example, blood pressure, respiratory rate, blood oxygen, stroke volume, cardiac output, and temperature.
  • an activity index 148 (which may be the activity index 127 , also shown in FIG. 1C ) can be used in determining a set of weights 147 used in calculating one or more of the biometric parameters 146 .
  • the heart rate information 144 is used in calculating one or more of the biometric parameters 146 .
  • the heart rate information 144 can be obtained from the PPG by detecting peaks and/or valleys in a graphical representation (e.g., the plot 162 shown in FIG. 1C ) of the PPG data 150 . This can include, for example, cross-correlating a portion of the PPG data (e.g., samples or data corresponding to a two second segment of the plot 162 of FIG. 1C ) with similar segments to produce a plot 180 (shown in FIG. 2A ) representing a series of cross-correlation products.
  • two-second segments from the plot 162 are cross-correlated with adjacent (possibly with some partial overlap) two-second segments to produce the plot 180 of FIG. 2A .
  • a particular cross correlation result (for example, one that produces the highest cross-correlation amplitude) can then be selected as a template.
  • the plot 178 shown in FIG. 2B is an example of a template.
  • the template can be adjusted to conform to a desired morphology, allowing for a beat to beat natural variation but discounting noise and non-heartbeat signals.
  • the selected template can then be correlated with segments from the plot 162 (shown in FIG. 1C ) to identify locations of correlation peaks. This is illustrated in FIG. 2A , where the plot 180 represents a series of such peaks.
  • the location of the correlation signal peaks can be used to direct a search for valleys, inflection points, and/or peaks within the band pass filtered PPG signal.
  • the inflection point in this case is defined as the point of maximum slope.
  • FIG. 2C illustrates an example of a PPG signal with identified peaks 181 , inflection points 183 and valleys 185 . For brevity, only a few of the peaks, inflection points, and valleys are marked using the reference numbers 181 , 183 , and 185 , respectively.
  • the instantaneous heart rate for each of the heartbeats can be plotted as shown in FIG. 3 , and can be used for other purposes such as computing other parameters and diagnosing conditions such as arrhythmia.
  • confidence levels associated with a calculated instantaneous heart rate can be determined before being used in any subsequent analysis. For example, if a person suddenly stands up from a sitting position, the instantaneous heart rate during the transition may shoot up. In some implementations, the rate of such rapid increase can include meaningful information. However, in some implementations, the information obtained during this transition may not be reliable as an indicator of the person's health status. Determining confidence levels associated with the computed heart rates can allow for discarding such outliers in subsequent analyses.
  • a given computed instantaneous heart rate can be compared, for example, to the average (or median) instantaneous heart rate over a predetermined time range (e.g., ±10 seconds) to determine whether the given instantaneous heart rate is reliable. If the given instantaneous heart rate differs (e.g., differs by more than a predetermined amount) from the average heart rate over the predetermined time range, the given instantaneous heart rate may be determined to be unreliable and therefore de-weighted in subsequent computations. This allows for selecting reliable data points at the expense of a short latency (10 seconds in the above example).
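One way to sketch this reliability check, using a local median and an illustrative window size (in samples rather than the text's ±10 seconds) and deviation threshold:

```python
import numpy as np

def reliable_mask(inst_hr, window=5, max_dev=10.0):
    """Flag instantaneous heart-rate samples that stay within max_dev
    bpm of the local median. Both the window (in samples) and the
    deviation threshold are illustrative assumptions."""
    hr = np.asarray(inst_hr, dtype=float)
    keep = np.ones(len(hr), dtype=bool)
    for i in range(len(hr)):
        lo, hi = max(0, i - window), min(len(hr), i + window + 1)
        if abs(hr[i] - np.median(hr[lo:hi])) > max_dev:
            keep[i] = False
    return keep

hr = [62, 63, 61, 95, 62, 64, 63]   # one spurious jump to 95 bpm
print(reliable_mask(hr))            # the outlier is flagged as unreliable
```

Unreliable samples would then be de-weighted or discarded in downstream computations, as described above.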
  • the instantaneous heart-rate data as shown in FIG. 3 can be used for computing instantaneous heart-rate variability (HRV).
  • An example of HRV plotted against the corresponding heartbeats is shown in FIG. 4 .
  • the HRV data can be used to calculate a mean HRV for a set of heartbeats.
  • HRV data can be used in detecting conditions such as stress. For example, if the mean HRV is above a certain threshold, the subject may be determined to be under higher than usual stress.
  • HRV can be calculated by computing a variance of individual RR intervals (distance between the ‘R’ points of two consecutive QRS complex curves representing heartbeats, or alternatively the distance between valleys as shown in FIG. 2C ) from the average RR interval, over a period of time (e.g., 5 minutes).
  • the HRV can also be calculated in the frequency domain by comparing the power spectrum at very low frequencies (e.g., 0.04-0.15 Hz) with the power spectrum at slightly higher frequencies (e.g., 0.18 to 0.4 Hz).
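Both calculations can be sketched as follows. The frequency-domain sketch assumes an evenly resampled RR series, which is a simplification; the band edges follow the text:

```python
import numpy as np

def hrv_time_domain(rr_intervals):
    """Variance of individual RR intervals about their mean."""
    rr = np.asarray(rr_intervals, dtype=float)
    return np.mean((rr - rr.mean()) ** 2)

def hrv_lf_ratio(rr_signal, fs):
    """Compare power in the low band (0.04-0.15 Hz) with power in the
    higher band (0.18-0.4 Hz) of an evenly resampled RR series."""
    X = np.abs(np.fft.rfft(rr_signal - np.mean(rr_signal))) ** 2
    f = np.fft.rfftfreq(len(rr_signal), 1.0 / fs)
    low = X[(f >= 0.04) & (f <= 0.15)].sum()
    high = X[(f >= 0.18) & (f <= 0.4)].sum()
    return low / high

rr = [0.80, 0.82, 0.78, 0.81, 0.79]          # seconds between beats
print(hrv_time_domain(rr))                   # small variance -> low HRV

# Frequency-domain sketch on a synthetic, evenly resampled RR series
t = np.arange(400) / 4.0                     # 100 s resampled at 4 Hz
sway = 2 * np.sin(2 * np.pi * 0.1 * t) + np.sin(2 * np.pi * 0.3 * t)
print(round(hrv_lf_ratio(sway, 4.0), 1))     # ~4.0 (low band dominates)
```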
  • Cardiac waveform morphology (also referred to as cardiac morphology) can be defined as the shape of a plot representing cardiac activity.
  • FIG. 5A represents a Wiggers diagram, which is a standard diagram used in cardiac physiology.
  • the shape of an electro-cardiogram (ECG) QRS complex 505 represents a morphology associated with a heartbeat.
  • Cardiac morphology depends on where and how cardiac activity is measured.
  • the morphology 510 of a phonocardiogram signal is different from that of the ECG morphology 505 .
  • the morphology associated with ventricular volume 515 is different from the morphology associated with ventricular pressure 520 .
  • FIG. 5B shows an example of a cardiac signal illustrating the morphology 525 associated with a PPG signal.
  • the morphology of a measured PPG signal can be checked to determine whether the measured PPG signal reliably represents heartbeats.
  • the relative separations of the peaks and valleys of the PPG signal are analyzed to determine whether the PPG signal reliably represents heartbeats. For example, a segment of the PPG signal can be determined to represent heartbeats if the following threshold condition is satisfied:
  • the condition above uses the range [0.25, 0.4] as an example, and other values can also be used.
  • the range (or threshold) could be determined for an individual user by using, for example, a range considered to be normal for the particular user.
  • the ratio from the above condition can vary within the range for various conditions of the subject. For example, the ratio can be at a low portion of the range during relaxation or sleep conditions, and at a high portion of the range during stressful events such as anger or fear.
  • other morphology checks can also be performed. For example, one morphology check can involve verifying that at a resting position, the user's systolic amplitude is approximately half of the diastolic amplitude. In some implementations, segments that do not satisfy the morphology check conditions are discarded from being used in biometric parameter computations.
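The threshold condition itself is not reproduced in this excerpt, so the ratio below is an assumption chosen purely for illustration: treat the valley-to-peak rise time, as a fraction of the full beat period, as the quantity checked against the [0.25, 0.4] range:

```python
def morphology_ok(valley_t, peak_t, next_valley_t, lo=0.25, hi=0.4):
    """Accept a beat if an assumed morphology ratio -- here the
    valley-to-peak rise time divided by the beat period -- falls within
    [lo, hi]. Only the range comes from the text; the specific ratio
    is a hypothetical stand-in."""
    ratio = (peak_t - valley_t) / (next_valley_t - valley_t)
    return lo <= ratio <= hi

print(morphology_ok(0.00, 0.30, 1.00))   # rise fraction 0.3 -> accepted
print(morphology_ok(0.00, 0.60, 1.00))   # rise fraction 0.6 -> rejected
```

Segments failing the check would be discarded from biometric parameter computations, as described above.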
  • Cardiac morphology also typically varies from one person to another due to, for example, unique heart beat signatures, breathing patterns and the unique ‘transmission line’ reflection signatures that are caused by the lengths and stiffness of an individual's arteries.
  • the main peak represents the first systolic peak which is followed by the secondary peak (or bump) representing the early diastolic peak (or reflection).
  • the time between the two peaks is also inversely proportional to arterial stiffness. This is easier to visualize from the first and/or the second derivatives of the PPG signal.
  • FIGS. 5C and 5D show examples of cardiac signals illustrating morphology based on PPG signals.
  • cardiac morphology can be used as a biometric identifier.
  • the device 100 described with reference to FIG. 1B can be configured to verify, based on a determined cardiac morphology, that the person wearing the device is the person for whom the device was assigned.
  • the determined cardiac morphology may also be used to uniquely identify a wearer of the device 100 .
  • biometric identification can be used, for example, in security and accessibility applications.
  • the device 100 can be configured to transmit a cardiac morphology based signature to a receiver (e.g., on a mobile phone, or at secured access point) to gain access to a secure resource.
  • the wearer of the device may be identified based on the identified cardiac morphology of the wearer.
  • FIG. 5E shows examples of cardiac signals illustrating morphology for four different individuals, and illustrates how the cardiac morphology varies from one person to another.
  • multiple measured or derived parameters can be used as a biometric signature to uniquely identify a wearer.
  • a wearer can be identified based on a multi-dimensional space defined based on the measured or derived parameters. Because the parameters vary from one person to another, each person would be mapped to a different region within the multi-dimensional space.
  • a simple two-dimensional example of such a space can be defined, for example, by using heart rate as one axis and PPG shape as the second axis. Because the PPG shape and heart rate varies from one person to another, each person can typically be mapped to a separate region on the two-dimensional plane, and can be identified based on a location of the region. Higher dimensional spaces can be used for robustly identifying individuals among a large population.
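A toy version of this two-dimensional matching, with hypothetical enrolled users, feature values, and distance threshold (none of these specifics come from the text):

```python
import math

def identify(sample, enrolled, max_dist=5.0):
    """Map a (heart_rate, ppg_shape_feature) sample to the nearest
    enrolled user's region; the feature names and the distance
    threshold are illustrative assumptions."""
    best, best_d = None, float("inf")
    for name, center in enrolled.items():
        d = math.dist(sample, center)
        if d < best_d:
            best, best_d = name, d
    return best if best_d <= max_dist else None

enrolled = {"alice": (62.0, 0.31), "bob": (74.0, 0.38)}
print(identify((63.0, 0.30), enrolled))   # -> alice
print(identify((90.0, 0.90), enrolled))   # no region close enough -> None
```

Higher-dimensional versions would simply add axes (e.g., MoCG morphology features) to the tuples, improving separability across a large population.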
  • parameters that can be used as axes for such spaces include cardiac morphology, heart rate, cardiac volume, PPG, or other parameters derived as a function of one or more of these parameters.
  • cardiac morphology can be combined with another parameter such as the MoCG morphology to achieve increased accuracy and/or resolution for bio-authentication applications. Examples of such applications include access control, digital wallet authorization, digital passwords/signature and environmental control.
  • MoCG data can be used to provide a MPTT signature and/or a MoCG signature waveform that may be unique to a particular user.
  • the biometric signature based user identification can be used in electronic payment applications.
  • the device 100 can be configured to communicate with a payment gateway using, for example, near field communication (NFC) or Bluetooth Low Energy (BLE) protocols.
  • the payment gateway can be configured to identify the user based on a corresponding biometric signature to initiate the payment process.
  • the payment gateway can communicate the identification information to a server that stores credit card or bank information of the corresponding user, for example, within a corresponding user account.
  • the server may initiate communications with the payment gateway that result in the credit card being charged or the bank account being debited.
  • the biometric signature based user identification is disabled if the device determines that the wearer is under distress.
  • the device can determine whether the wearer is under distress based on the wearer's vital signs (e.g., such as heart rate (HR), heart rate variability (HRV), blood pressure (BP), and respiratory rate). For example, if a wearer of the device is being forced to access a payment gateway, the device can detect the wearer's distress, as indicated by a sudden increase in HR, BP, and/or respiratory rate, and prevent him or her from accessing the payment gateway.
  • the device can detect the wearer's distress, as indicated by a sudden increase in HR, BP, and/or respiratory rate, and prevent him or her from unlocking the lock.
  • the wearer's vital signs do not produce a match of the wearer's biometric signature when the wearer is under distress.
  • When the wearer is under distress, the multi-dimensional space defined based on the measured or derived parameters takes on a modified form that does not match the wearer's biometric signature. As such, a wearer under distress is unable to be identified by the biometric signature.
  • a wearer may exhibit signs that are synonymous with distress when the wearer is not in fact in distress. For example, if the wearer is involved in a non-dangerous and exciting event, such as buying an extremely expensive item, the wearer may experience an increase in HR, BP, and/or respiratory rate that may mistakenly be interpreted by the device as signs of distress. Thus, in some implementations, the wearer is provided with an opportunity to authenticate himself or herself in the event that the device detects false signs of distress or fails to identify the biometric signature of the wearer. The wearer can authenticate himself or herself using confidential information such as a password or a personal identification number that is communicated to the device or a server in communication with the device. In some implementations, the wearer can authenticate himself or herself by performing a private, predefined gesture. The one or more motion sensors of the device can be configured to determine whether the authenticating gesture matches the predefined gesture.
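A sketch of this distress gate with the PIN fallback described above; the 25% rise threshold, the baseline values, and the vital-sign names are assumptions:

```python
def under_distress(baseline, current, rise=0.25):
    """Treat a sudden rise of more than `rise` (fractional) in any
    vital sign as possible distress; the 25% threshold is assumed."""
    return any(current[k] > baseline[k] * (1.0 + rise) for k in baseline)

def authorize(baseline, current, fallback_pin=None, entered_pin=None):
    """Disable signature-based authorization under distress, but allow
    the fallback authentication (e.g., a PIN) for false positives such
    as excitement over an expensive purchase."""
    if not under_distress(baseline, current):
        return True
    return fallback_pin is not None and entered_pin == fallback_pin

baseline = {"hr": 65, "bp_sys": 118, "resp": 14}
calm =     {"hr": 70, "bp_sys": 120, "resp": 15}
scared =   {"hr": 110, "bp_sys": 160, "resp": 24}
print(authorize(baseline, calm))                    # -> True
print(authorize(baseline, scared))                  # -> False
print(authorize(baseline, scared, "4321", "4321"))  # PIN fallback -> True
```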
  • An example process 600 of bio-authenticating a subject is shown in FIG. 6A .
  • a machine, such as a processor, that receives information from the optical sensors 110 of the device 100 can perform one or more steps of the process 600 .
  • the machine can include the computing device 115 described above with reference to FIG. 1B .
  • data in a dataset that represents time-varying information about at least one pulse pressure wave propagating through blood in a subject can be processed ( 602 ).
  • the data can be acquired at a location of the subject (e.g., the arm or the wrist of the subject).
  • a determination can then be made of whether one or more segments of the dataset were captured from a subject other than an expected subject ( 604 ). The determination can be made by analyzing morphological features of the segments.
  • Another example process 610 of bio-authenticating a subject using information about motion of the subject is shown in FIG. 6B .
  • a machine, such as a processor, that receives information from the motion sensor 105 of the device 100 can perform one or more steps of the process 610 .
  • the machine can include the computing device 115 described above with reference to FIG. 1B .
  • data in a dataset that represents time-varying information about motion of a subject can be processed ( 612 ).
  • the data can be acquired at a location of the subject (e.g., the arm or the wrist of the subject).
  • a determination can then be made of whether one or more segments of the dataset were captured from a subject other than an expected subject ( 614 ). The determination can be made by analyzing morphological features of the segments.
  • Another example process 620 of bio-authenticating a subject is shown in FIG. 6C .
  • a machine, such as a processor, that receives information from the motion sensor 105 and the optical sensors 110 of the device 100 can perform one or more steps of the process 620 .
  • the machine can include the computing device 115 described above with reference to FIG. 1B .
  • data in a first dataset that represents time-varying information about at least one pulse pressure wave propagating through blood in a subject can be processed ( 622 ).
  • Data in a second dataset that represents time-varying information about motion of the subject can also be processed ( 624 ).
  • the data can be acquired at a location of the subject (e.g., the arm or the wrist of the subject).
  • at least two parameters of the subject can be determined ( 626 ).
  • the parameters can include one or more of blood pressure, respiratory rate, blood oxygen levels, heart rate, heart rate variability, stroke volume, cardiac output, MoCG morphology, and PPG morphology.
  • a biometric signature of the subject can then be determined ( 628 ).
  • the biometric signature can be represented in a multi-dimensional space. Each axis can correspond to at least one of the determined parameters.
  • a determination can then be made of whether the biometric signature was captured from a subject who is an expected subject ( 630 ). The determination can be made by analyzing features of the biometric signature.
  • the biometric signature based user identification can be used in providing rewards and/or discounts to a user. For example, if the identified user is determined to be adhering to a particular exercise regimen, reward points or incentives such as discounts on particular products can be credited to the corresponding user account. Therefore, a user can be motivated to keep adhering to particular good practices to keep getting such rewards or discounts.
  • the information collected from the motion sensors 105 and the optical sensors 110 of FIG. 1B is used to calculate the MPTT, which can be used to further calculate the biometric parameters, such as blood pressure, stroke volume, etc.
  • An example process 700 for the MPTT calculation is shown in FIG. 7A .
  • a machine, such as a processor, that receives the information from the motion sensors 105 and the optical sensors 110 can perform one or more steps of the process 700 .
  • the machine may further provide the calculated results to, for example, the wearer, another person who is interested and authorized to receive the information, or another machine for further data processing or data storage.
  • the machine can include the computing device 115 described above with reference to FIG. 1B .
  • the machine may also use the calculated MPTT to further generate additional biometric measurements, the processes for which are discussed below.
  • the MoCG data for use in the MPTT calculation can be preprocessed ( 702 ).
  • the motion sensor or sensors (e.g., the accelerometers) collect three sets of MoCG data along three orthogonal axes x, y, and z, or along polar coordinates.
  • the three sets may be combined by selecting a weight w x , w y , w z for each respective set and summing the weighted sets.
  • An example of the weight selection is shown in FIG. 8 , which illustrates two dimensional heat-map diagrams 800 , 802 , and 804 produced from power spectra of MoCG ensembles collected over time.
  • each row in the diagrams represents the power spectrum of a corresponding frame of MoCG data.
  • the colors represent the values of the energy level.
  • the weights w x , w y , w z can be assigned, using respective diagrams, based on the ratio of energy inside the heart rate range to the energy outside the heart rate range. If the power spectrum is consistent across the different frames and/or is a harmonic of the already calculated heart rate (as illustrated in the diagram 804 ), the corresponding axis (the z axis in this example) is assigned a higher weight than the other axes.
  • the lines 806 , 808 , and 810 in FIG. 8 represent the first, second, and third harmonics, respectively, of the measured heart rate in this time segment.
  • the MoCG data for the MPTT calculation is then calculated as the weighted sum of the three sets of MoCG data for the three axes.
  • a single axis can be selected (e.g., the axis with the highest weight) while ignoring the others.
  • only the z axis can be selected for the example shown in FIG. 8 .
  • axis selection can be performed by independently analyzing each axis and then combining the axes based on agreement of the candidate MPTT values. This may be done, for example, to avoid calculating a power spectrum without sacrificing accuracy.
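The axis-weighting scheme described above can be sketched as follows. This is an illustrative implementation, not the patented algorithm: the band width, the number of harmonics considered, and the normalization are assumptions, and the in-band/out-of-band energy ratio stands in for the heat-map inspection of FIG. 8.

```python
import numpy as np

def axis_weights(mocg_xyz, fs, heart_rate_hz, band=0.2):
    """Weight each MoCG axis by the ratio of spectral energy inside the
    heart-rate band (fundamental plus first two harmonics) to the energy
    outside it.  mocg_xyz has shape (3, n_samples); fs is the sample rate."""
    weights = []
    for axis in mocg_xyz:
        spectrum = np.abs(np.fft.rfft(axis)) ** 2
        freqs = np.fft.rfftfreq(len(axis), d=1.0 / fs)
        in_band = np.zeros_like(freqs, dtype=bool)
        for k in (1, 2, 3):  # fundamental and first two harmonics
            in_band |= np.abs(freqs - k * heart_rate_hz) <= band
        inside = spectrum[in_band].sum()
        outside = spectrum[~in_band].sum() + 1e-12  # avoid divide-by-zero
        weights.append(inside / outside)
    weights = np.asarray(weights)
    return weights / weights.sum()  # normalize so the weights sum to 1

def combine_axes(mocg_xyz, weights):
    """Weighted sum of the three axes into a single MoCG trace."""
    return np.tensordot(weights, mocg_xyz, axes=1)
```

An axis whose energy is concentrated at the heart rate and its harmonics (the z axis in the FIG. 8 example) receives the largest weight; passing the weights to `combine_axes` yields the single MoCG trace used for the MPTT calculation.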
  • a representative segment of the PPG data is generated ( 704 ) for calculating the MPTT.
  • the representative PPG segment is generated by averaging across multiple PPG segments of the same length.
  • FIG. 9 shows an example of the representative segment 904 of the PPG data used in determining the MPTT.
  • the representative segment 904 in this example is calculated by averaging across multiple segments 906 of equal duration.
  • the MoCG data is then analyzed using the representative segment ( 706 ) to calculate candidate MPTT values.
  • the representative segment can be calculated, for example, by averaging across multiple segments of equal duration arranged on the same time grid as a representative PPG signal.
  • a short segment of the MoCG data 902 (of equal duration to the representative segment 904 ) and the representative segment 904 are aligned in time, for example, by aligning inflection points (or valleys or peaks).
  • the length of the segment 904 and the corresponding MoCG data can be on the order of several seconds. In the example shown in FIG. 9 , the length of the segment 904 is 2 seconds; however, segments of other lengths (e.g., 1.5-5 seconds) can also be used.
  • the representative segment is generated from data collected when a user is stationary, so that the data does not include a significant amount of unwanted noise.
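The representative-segment step can be sketched as below. This is a minimal illustration assuming beat-aligned segment start indices are already known (in practice they would come from beat detection on the PPG); the function name and signature are this example's own.

```python
import numpy as np

def representative_segment(ppg, fs, beat_starts, seg_seconds=2.0):
    """Average multiple equal-duration PPG segments, one per detected beat,
    into a single representative segment (as with segment 904 of FIG. 9)."""
    n = int(seg_seconds * fs)
    # keep only segments that fit entirely inside the recording
    segments = [ppg[s:s + n] for s in beat_starts if s + n <= len(ppg)]
    return np.mean(segments, axis=0)
```

Averaging suppresses beat-to-beat noise, which is why, as noted above, the underlying data should come from a period when the user is stationary.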
  • the MPTT is measured as the difference between a time point t 0 at which a mid-systole portion 908 of the representative PPG segment 904 is measured, and a second time point representing the portion of MoCG data corresponding to the mid-systole.
  • Because the MoCG data represents the motion due to an actual heartbeat, while the PPG data represents a pulse wave arrival recorded at a distance from the heart, the second time point generally occurs before t 0 . Since a human body is not a rigid body, as defined by the laws of mechanics, the MoCG pulse arrives at the location where the device is located in a somewhat delayed (but constant per individual) fashion.
  • the portion of MoCG data corresponding to the mid-systole is typically manifested as a peak or valley in the MoCG data, and the MPTT can be determined by identifying the correct peak or valley corresponding to the mid-systole. While mid-systole is used as a reference point in this example, other portions of the cardiac morphology can also be used as the reference point. Based on a priori knowledge of typical MPTT, a predetermined time range relative to t 0 is searched and the peaks and valleys detected within the predetermined time range are flagged as potential candidates for being the correct peak or valley corresponding to the mid-systole. Therefore, the difference between the time point corresponding to each such valley or peak and the time t 0 represents a hypothetical MPTT. The correct MPTT value is determined based on the hypothetical MPTTs, as described using the example below.
  • the predetermined time range can be chosen to be, for example, between 10 and 400 ms, or another range wider than the actually expected range.
  • seven peaks and valleys 910 , 912 , 913 , 914 , 916 , 918 , 920 , corresponding to time points t 1 , t 2 , t 3 , t 4 , t 5 , t 6 , t 7 , respectively, are identified on the MoCG plot 902 .
  • The hypothetical MPTT values are then: h 1 = t 0 −t 1 , h 2 = t 0 −t 2 , h 3 = t 0 −t 3 , h 4 = t 0 −t 4 , h 5 = t 0 −t 5 , h 6 = t 0 −t 6 , and h 7 = t 0 −t 7 .
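The peak/valley flagging that produces the hypothetical MPTTs h 1 …h 7 can be sketched as follows. The simple extremum finder and the search-window handling are illustrative assumptions of this sketch, not the patented implementation.

```python
import numpy as np

def local_extrema(x):
    """Indices of interior peaks and valleys of a 1-D signal."""
    d = np.diff(x)
    return [i for i in range(1, len(x) - 1)
            if (d[i - 1] > 0 and d[i] < 0) or (d[i - 1] < 0 and d[i] > 0)]

def hypothetical_mptts(mocg, fs, t0_idx, search_ms=(10, 400)):
    """Flag every peak/valley in the MoCG trace that falls within the
    predetermined window before the mid-systole time t0 of the PPG segment;
    each flagged extremum k yields a hypothetical MPTT h_k = t0 - t_k."""
    lo = t0_idx - int(search_ms[1] * fs / 1000)  # earliest allowed sample
    hi = t0_idx - int(search_ms[0] * fs / 1000)  # latest allowed sample
    candidates = []
    for i in local_extrema(mocg):
        if lo <= i <= hi:
            candidates.append((t0_idx - i) / fs)  # hypothetical MPTT, seconds
    return candidates
```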
  • a longer segment 1000 of the MoCG data (e.g., of 20 second duration, as shown in FIG. 10A ) is aligned with the corresponding PPG data, and the time points corresponding to mid-systoles in the PPG pulses are identified as reference points.
  • the MoCG data is checked at each time point preceding the reference points by h 1 (and possibly within a small time range around such time points) for the presence of a peak or valley. If a peak or valley is detected, it is flagged, and the total number of flagged peaks and valleys for the entire segment of MoCG data is recorded.
  • FIG. 10A illustrates a 20 second segment of MoCG data, along with flagged peaks and valleys corresponding to one particular hypothetical MPTT.
  • the flagged peaks and valleys are identified by markers (e.g., circles) 1008 , 1010 .
  • one of the hypothetical MPTTs is chosen as the true MPTT value, based on the recorded number of peaks or valleys.
  • the hypothetical MPTT that yields the maximum number of peaks or valleys can be chosen as the true MPTT value.
  • the hypothetical MPTTs can be combined together as a weighted sum to obtain the true MPTT value.
  • the weights can be assigned based on, for example, a ratio of the number of flagged peaks (or valleys) to the total number of reference points, and a consistency of the flagged peaks (or valleys) defined as a signal-to-noise ratio:
  • a weight for a given hypothetical MPTT can then be determined as:
  • Weight = ((Number of flagged peaks)/(Total number of reference points)) × log(SNR)
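The weight formula reads as the fraction of reference points with a matching flagged extremum, scaled by the log of the consistency SNR. A minimal sketch follows; the tuple layout passed to `true_mptt` is an assumption of this example.

```python
import math

def hypothesis_weight(n_flagged, n_reference, snr):
    """Weight for one hypothetical MPTT: the fraction of reference points
    at which a matching MoCG peak/valley was flagged, scaled by log(SNR)."""
    return (n_flagged / n_reference) * math.log(snr)

def true_mptt(hypotheses):
    """Weighted combination of hypothetical MPTTs.  `hypotheses` is a list
    of (mptt_seconds, n_flagged, n_reference, snr) tuples."""
    weights = [hypothesis_weight(f, r, s) for _, f, r, s in hypotheses]
    total = sum(weights)
    return sum(w * h[0] for w, h in zip(weights, hypotheses)) / total
```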
  • the predetermined time range can be the duration for which a user wears the device 100 .
  • An example of such a histogram is shown in FIG. 11A , where the y axis represents a calculated MPTT value (averaged over 60 seconds), the x axis represents time, and the darkness of each point represents a calculated confidence measure associated with the calculated MPTT.
  • the different horizontal sets represent candidate MPTT values for different time ranges.
  • a representative set can be selected from the candidate sets based on, for example, a priori knowledge about the expected MPTT, and/or confidence measures associated with the points in the set. For example, from FIG. 11A , the sets 1111 or 1112 can be selected as the best representative sets for the MPTT, based on the confidence levels associated with the points (as represented by the darkness of the points), as well as a priori knowledge that the MPTT is expected to be within a 250-350 ms range. Therefore, more consistent (and hence reliable) estimates of MPTT values can be identified from the histograms, and the average MPTT value over the predetermined time range can be calculated ( 710 ), for example, as an average of the consistent MPTT values. Inconsistent MPTT values can be discarded from being included in computing the average MPTT. Other parameters such as average SV can also be calculated using similar plots. Before generating such plots, individual estimates of SV (in ml/heartbeat) can be calculated from the amplitude of the MoCG signal, based on the fact that SV varies directly with the average amplitude of the MoCG.
  • only one candidate MPTT value can be selected.
  • the candidate MPTT value having the highest weights and/or an appropriate or expected morphology can be selected.
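The selection of consistent candidates from a plot like FIG. 11A can be sketched as follows, assuming each candidate carries a confidence measure. The 250-350 ms expected range comes from the text above; the 0.5 confidence cutoff is an illustrative placeholder.

```python
import numpy as np

def average_mptt(candidates_ms, confidences, expected_ms=(250, 350),
                 min_conf=0.5):
    """Average only the consistent candidate MPTT values: those inside the
    a-priori expected range that carry a sufficient confidence measure.
    Inconsistent candidates are discarded from the average."""
    c = np.asarray(candidates_ms, dtype=float)
    w = np.asarray(confidences, dtype=float)
    keep = (c >= expected_ms[0]) & (c <= expected_ms[1]) & (w >= min_conf)
    return float(c[keep].mean())
```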
  • a confidence measure can be determined for each measurement of MPTT (or other biometric parameters) to indicate the confidence one has in the reading.
  • An example is shown in FIG. 11B , which illustrates computation of confidence measures 1120 corresponding to the calculated values of MPTT 1115 .
  • the confidence measures can be used, for example, to determine whether a calculated value can be used for subsequent computations.
  • An example process for calculating MPTT is shown in FIG. 7B .
  • the process can be executed, for example by the device 100 described above with reference to FIG. 1B .
  • Operations of the process can include obtaining a first data set representing time-varying information on at least one pulse pressure wave within vasculature at a first body part of a subject ( 722 ).
  • the first data set can be obtained from a first sensor such as a PPG sensor.
  • the operations also include obtaining a second data set representing time-varying information about motion of the subject at the first body part of a subject ( 724 ).
  • the second data set can be obtained from a second sensor such as a motion sensor.
  • the operations further include identifying a first point in the first data set, the first point representing an arrival time of the pulse pressure wave at the first body part ( 726 ) and identifying a second point in the second dataset, the second point representing an earlier time at which the pulse pressure wave traverses a second body part of the subject ( 728 ).
  • Identifying the first point can include, for example, computing a cross-correlation of a template segment with each of multiple segments of the first dataset, identifying, based on the computed cross-correlations, at least one candidate segment of the first dataset as including the first point, and identifying a first feature within the identified candidate segment as the first point.
  • Identifying the second point can include, for example, determining a reference point in the second data set, wherein the reference point corresponds to substantially the same point in time as the first point in the first data set.
  • One or more target features can then be identified within a predetermined time range relative to the reference point, and a time point corresponding to one of the target features can be selected as the second point.
  • the operations also include computing MPTT as a difference between the first and second time points ( 730 ).
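The cross-correlation step for identifying the first point ( 726 ) can be sketched as below. A brute-force mean-removed correlation is used for clarity, and taking the maximum of the best-matching segment as the "first feature" is an illustrative choice, not the only possible one.

```python
import numpy as np

def find_first_point(ppg, template):
    """Identify the arrival of the pulse pressure wave (the 'first point')
    by cross-correlating a template segment against the PPG, taking the
    best-matching candidate segment, and returning the index of a feature
    (here, the maximum) inside that segment."""
    n = len(template)
    t = template - template.mean()
    best_score, best_start = -np.inf, 0
    for start in range(len(ppg) - n + 1):
        seg = ppg[start:start + n]
        score = float(np.dot(seg - seg.mean(), t))  # mean-removed correlation
        if score > best_score:
            best_score, best_start = score, start
    seg = ppg[best_start:best_start + n]
    return best_start + int(np.argmax(seg))  # sample index of the first point
```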
  • the MPTT, which represents the time taken by the pulse pressure wave to travel from the second body part to the first body part of the subject, can then be used in computing various parameters such as blood pressure or arterial stiffness.
  • the calculated MPTT value is related to the elasticity of the blood vessels through the Moens-Korteweg relationship:
  • MPTT = L/PWV, with PWV = √(E·h/(2·ρ·r)),
  • where L is the vessel length, PWV is the pulse wave velocity, E is the Young's modulus of the vessel wall, h is the vessel wall thickness, ρ is the blood density, and r is the vessel radius.
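Reading the listed symbols as the Moens-Korteweg relationship (PWV = √(Eh/(2ρr)), MPTT = L/PWV), a numeric sketch with representative, assumed arterial values:

```python
import math

def pulse_wave_velocity(E, h, rho, r):
    """Moens-Korteweg pulse wave velocity, PWV = sqrt(E*h / (2*rho*r))."""
    return math.sqrt(E * h / (2.0 * rho * r))

def mptt_seconds(L, E, h, rho, r):
    """MPTT = L / PWV: travel time of the pulse wave over vessel length L."""
    return L / pulse_wave_velocity(E, h, rho, r)

# Illustrative values (assumptions, not measured data):
# E = 4e5 Pa, h = 1 mm, rho = 1060 kg/m^3, r = 4 mm, L = 0.6 m
pwv = pulse_wave_velocity(4.0e5, 1.0e-3, 1060.0, 4.0e-3)  # a few m/s
t = mptt_seconds(0.6, 4.0e5, 1.0e-3, 1060.0, 4.0e-3)      # tens to hundreds of ms
```

A stiffer vessel (larger E) raises PWV and shortens MPTT, which is the direction of the MPTT-pressure relationship used in the text.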
  • the elasticity is in turn related to the vessel pressure P as:
  • the vessel pressure P can be derived as:
  • the pressure value calculated using (3) represents diastolic pressure (Dia).
  • the systolic pressure (Sys) can then be computed as:
  • A is a universal constant that applies to all users and is unitless
  • B is an individual constant in units of mmHg
  • C is an individual constant in units of mmHg/ml
  • SV is the stroke volume
  • the parameters B and C for calculating the diastolic and systolic pressures may vary from one person to another. Accordingly, a process or device may need to be calibrated for an individual before use. Generally, the calibration is performed the first time the accelerometer and the optical sensor are used for measuring and the algorithms are used for calculating the MPTT, SV, and the other parameters.
  • An example process 1200 of calibration performed by a machine, such as a processor, is shown in FIG. 12 .
  • the machine receives ( 1202 ) known reference systolic and diastolic pressures (Sys 0 and Dia 0 ), e.g., as input from a wearer. If the pressures are unknown to the wearer, generic values of 120/80 mmHg are used. In such cases, the wearer may be allowed to alter the calibration at a later time when the actual pressures become known.
  • the machine also calculates ( 1204 ) the MPTT and the SV using methods described above. The machine then calculates the constants B and C ( 1206 ) for this particular wearer based on the following equations:
  • the values of the parameters are saved or stored ( 1208 ) for the individual.
  • a device (e.g., the device 100 including the accelerometer and the optical sensor) can be used by multiple people.
  • a calibration is performed for each individual following the process 1200 and a set of calculated parameters are stored in association with the corresponding person.
  • the device may automatically choose a set of stored parameters for use with an individual based on biometric identifications of the individual, or may ask the individual to self-identify and choose the correct set of parameters for use, in case the device is shared among multiple users.
  • blood pressure measurements based on continuously acquired data can be made available for each individual by converting the MPTT and SV into systolic and diastolic pressures as described above.
  • the systolic and diastolic pressures can also be calculated by adding time-varying parameter estimations based on second order parameters.
  • the diastolic pressure can be calculated as:
  • where f(·) and g(·) are predetermined functions, and the parameters D and E are time dependent and individual dependent.
  • the parameters can be calibrated when at least two calibration points (e.g., two known sets of systolic and diastolic pressures) at different times are available.
  • the calibrated parameters do not change frequently. These parameters may be affected by arterial diameters, arterial wall thicknesses, arterial lengths, arterial elasticity, and other physical parameters related to the cardiovascular system of a human body.
  • the majority of the volume of blood related to MPTT travels through large arteries, and is less susceptible to hydrostatic changes, temperature, or peripheral tone.
  • Curves representing relationships between MPTT and blood pressure are illustrated in FIG. 13 . As seen from this example, while the curves may differ from one person to another, the general shapes of the curves are similar.
  • FIG. 14 illustrates systolic pressure measured over 90 days after a single calibration, and in the absence of any additional recalibration.
  • a processor (e.g., a processor of the computing device 115 shown in FIG. 1B , or of an external computing device to which the PPG data and the MoCG data are transmitted) can use the acquired data to calculate parameters such as blood pressure (BP), heart rate (HR), heart rate variability (HRV), respiratory rate, and cardiac output (CO).
  • the processor can be programmed to use the PPG data and accelerometer data to detect arrhythmia or irregular heart rhythms, such as atrial fibrillation (AFIB) or atrial flutter.
  • FIGS. 15A-15D show graphs in which heart rate data of the wearer of the device 100 is plotted.
  • the graphs show heart rate data plotted over a 24 hour period ( FIG. 15A ), during the day ( FIG. 15B ), and during the night ( FIG. 15C ).
  • each of these graphs includes R wave to R wave interval (RR i ) along the x-axis and RR i+1 along the y-axis.
  • the plotted data can be used to determine whether the subject has a normal heart rhythm or an irregular heart rhythm, as described below.
  • the plots can be updated after predetermined intervals (e.g., every 5-10 minutes) in order to capture any transient anomaly.
  • the PPG and accelerometer signals are used in the manner described above to determine the instantaneous heart rate of the wearer for each heartbeat of the wearer over a period of time (e.g., 20 seconds).
  • the RR values are then determined by examining the instantaneous heart rate curve to determine the time between each of the successive heartbeats. Each RR value is equal to the time between two consecutive heartbeats. Each RR value (RR i ) is then plotted versus the subsequent RR value (RR i+1 ).
  • the graphs shown in FIGS. 15A-15D represent plots of a subject with a normal heart rhythm.
  • In a normal heart rhythm, the time between beats tends to be fairly consistent, and during activity the heart rate tends to increase gradually over time. While the individual's heart rate may be significantly higher during such activities (as compared to his or her heart rate at rest), the difference in time between consecutive heartbeats should be fairly consistent over the course of a small number of consecutive heart beats.
  • the RR i vs. RR i+1 plot will typically be fairly linear along a diagonal, as shown in FIG. 15D .
  • FIGS. 16A-16C show heart rate data for individuals with different heart conditions.
  • FIG. 16A shows heart rate data taken over a 24 hour period from an individual having atrial fibrillation (AFIB).
  • FIG. 16B shows heart rate data taken over a 24 hour period from an individual having atrial flutter
  • FIG. 16C shows heart rate data taken over a 24 hour period from an individual having a normal heart rhythm.
  • In FIG. 16A , AFIB is apparent because the spread of the various RR data points from the expected diagonal is greater than a predetermined spread value.
  • AFIB causes erratic beating of the heart, resulting in the time between consecutive heartbeats varying significantly from one pair of heartbeats to the next. It is this characteristic that causes the plot of RR i vs. RR i+1 to spread significantly from the expected diagonal (i.e., the diagonal plot of an individual who has a regular heart rhythm, as shown in FIG. 16C ).
  • Atrial flutter can be seen in FIG. 16B by the multiple clusters of data that are offset from the diagonal; atrial flutter changes the heart rate in multiples, which produces these clusters.
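The RR-interval spread test can be sketched as follows. The diagonal-distance statistic and the 0.08 s threshold are illustrative assumptions of this sketch, not the patented "predetermined spread value".

```python
import numpy as np

def rr_pairs(rr):
    """Pair each RR interval with the next one, as in an RR_i vs RR_{i+1}
    plot.  rr is a sequence of RR intervals in seconds."""
    rr = np.asarray(rr, dtype=float)
    return rr[:-1], rr[1:]

def afib_suspected(rr, spread_threshold=0.08):
    """Flag possible AFIB when the spread of the (RR_i, RR_{i+1}) points
    about the diagonal exceeds a predetermined spread value.  The distance
    of a point (a, b) from the diagonal b = a is |a - b| / sqrt(2)."""
    a, b = rr_pairs(rr)
    spread = float(np.mean(np.abs(a - b)) / np.sqrt(2))
    return spread > spread_threshold
```

A regular rhythm keeps the points tight on the diagonal (small spread); an erratic rhythm scatters them, which is the AFIB signature described above. Atrial flutter would instead show as multiple off-diagonal clusters, which this simple spread statistic does not distinguish.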
  • the processor can be programmed to alert the wearer in response to detecting such irregular heart rhythms.
  • the processor can activate an audio or visual alarm of the device, which can, for example, instruct the wearer to seek medical attention.
  • An example process 1700 of detecting arrhythmia of a subject is shown in FIG. 17 .
  • a machine, such as a processor, that receives information from the motion sensor 105 and the optical sensors 110 of the device 100 can perform one or more steps of the process 1700 .
  • the machine can include the computing device 115 described above with reference to FIG. 1B .
  • data in a first dataset that represents time-varying information about at least one pulse pressure wave propagating through blood in a subject can be processed ( 1702 ).
  • Data in a second dataset that represents time-varying information about motion of the subject can also be processed ( 1704 ).
  • the data can be acquired at a location of the subject (e.g., the arm or the wrist of the subject).
  • Arrhythmia of the subject can be detected based on the processed data ( 1706 ).
  • Arrhythmia can include atrial fibrillation or atrial flutter.
  • Processing the data can include determining whether a spread of plotted R wave to R wave intervals versus next consecutive R wave to R wave intervals exceeds a predetermined spread value.
  • Processing the data can also include determining whether multiple clusters of plotted data points are offset from a diagonal.
  • arterial stiffness is an indicator of vascular health (e.g., arteriosclerosis) and of the risk for hypertension, stroke, and heart attack.
  • the processor can therefore be programmed to calculate arterial stiffness as a function of the pulse transit time (MPTT).
  • Certain conventional devices that are used to assess arterial stiffness require devices to be placed at two different locations of the subject (e.g., at the carotid and leg of the subject).
  • the device described herein, which is able to collect from a single location of the subject all necessary data for determining arterial stiffness, tends to be more convenient than those conventional devices.
  • the processor can be programmed to inform the wearer of the device of his or her arterial stiffness value by, for example, causing that value to be displayed on the display of the device.
  • the arterial stiffness value can be used as one of multiple factors for assessing the overall health of the wearer.
  • the processor is programmed to use arterial stiffness of the wearer to determine a health metric (e.g., a health score) for the wearer.
  • the health score may be a numerical value. In some cases, the numerical value is between 1 and 10 or between 1 and 100.
  • As shown in FIG. 18 , the arterial stiffness of a subject tends to decrease as the activity level of the subject (e.g., the number of times per week that the subject exercises) increases.
  • arterial stiffness is one parameter that can be monitored by the device and shared with the user to track the progress of a subject involved in an exercise regimen. This can serve as positive feedback for the user in addition to conventional feedback, such as weight loss.
  • the processor can also be programmed to use the PPG data and accelerometer data to detect sleep disorders, such as sleep apnea, and to deduce sleep quality and sleep stages.
  • the processor first analyzes the low frequency components of the accelerometer data to identify sleep rest periods (SRPs), which are periods in which the accelerometer data is substantially flat for a minimum period of time (e.g., 90 seconds).
  • the flatness of the accelerometer data indicates that the wearer of the device is not moving during the SRPs.
  • SRPs are periods during which the wearer of the device is likely to be asleep.
  • FIG. 19 illustrates three separate SRPs (SRP1, SRP2, and SRP3).
  • SRP1 and SRP2, and SRP2 and SRP3, are each separated from one another by a brief period of motion by the wearer of the device.
  • the three SRPs are treated as a single sleep cycle.
  • the processor can, for example, be programmed to treat periods of motion that last less than five minutes as not interrupting a sleep cycle during which that motion occurs.
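The SRP rules above (at least 90 seconds of substantially flat accelerometer data, with motion gaps under five minutes not interrupting a sleep cycle) can be sketched as follows. The window size and flatness threshold are illustrative assumptions.

```python
import numpy as np

def sleep_rest_periods(accel_mag, fs, flat_std=0.02, min_flat_s=90,
                       max_gap_s=300, win_s=10):
    """Find sleep rest periods: stretches where the accelerometer magnitude
    is substantially flat for at least min_flat_s seconds, then merge SRPs
    separated by motion gaps shorter than max_gap_s (five minutes)."""
    win = int(win_s * fs)
    flat = []
    for start in range(0, len(accel_mag) - win + 1, win):
        flat.append(np.std(accel_mag[start:start + win]) < flat_std)
    # collect runs of consecutive flat windows that are long enough
    periods, run_start = [], None
    for i, f in enumerate(flat + [False]):  # sentinel closes a trailing run
        if f and run_start is None:
            run_start = i
        elif not f and run_start is not None:
            if (i - run_start) * win_s >= min_flat_s:
                periods.append([run_start * win_s, i * win_s])
            run_start = None
    # merge SRPs separated by brief motion into a single sleep cycle
    merged = []
    for p in periods:
        if merged and p[0] - merged[-1][1] < max_gap_s:
            merged[-1][1] = p[1]
        else:
            merged.append(p)
    return merged  # list of [start_seconds, end_seconds]
```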
  • the processor uses the PPG data and the accelerometer data collected during the SRPs to calculate the average heart rate, the standard deviation of the heart rate, the average heart rate variability (HRV), and the average activity level for each of the SRPs.
  • the processor analyzes the complexity of the heart rate signal and the deviation from the diagonal of values plotted on an RR i vs. RR i+1 plot. These parameters can be used to confirm that the wearer of the device was sleeping during the SRP being analyzed and to identify certain sleep conditions and sleep disorders, as discussed below.
  • jetlag can also be detected by analyzing heart rate during sleep. For example, an upward-trending heart rate during sleep can indicate a presence of jetlag, and a flat heart rate during sleep can indicate that the subject is not jetlagged.
  • the processor can be programmed to consider the average heart rate, the standard deviation of the heart rate, and the average heart rate variability (HRV) to confirm that the wearer was sleeping during the SRP being considered. For example, the average heart rate, the standard deviation of the heart rate, and the average HRV of the subject over the SRP being analyzed are compared to the baselines of these values in the subject. If they fall below the baseline by a predetermined amount, this confirms that the subject was asleep during the period being analyzed.
  • the processor can determine the number of hours slept by the wearer, the sleep latency of the wearer (e.g., the length of time that it took for the subject to transition from wakefulness to sleep), the number of times that the wearer tossed and turned, and the percent of time that the wearer was asleep between the time that he or she went to bed and got up. In some cases, the processor can further determine the deepness of the sleep of the wearer during each of the SRPs.
  • the deepness of the sleep is sometimes referred to as the sleep stage. For example, if the accelerometer detected minimal movement and the patient's heart rate variability was a predetermined amount below the wearer's baseline heart rate during a portion of the SRP, it can be concluded that the wearer was in a deep sleep during that portion of the SRP. If the accelerometer detected some movement and the patient's heart rate was higher than can be expected of a deep sleep during a portion of the SRP, it can be concluded that the wearer was in REM sleep during that portion of the SRP. Otherwise, it can be concluded that the wearer was in a light sleep during that portion of the SRP.
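The staging rules above can be sketched as a simple rule-based classifier. The numeric thresholds here are illustrative placeholders standing in for the "predetermined amounts" of the description, and the function signature is this example's own.

```python
def sleep_stage(movement, hrv, hr, baseline_hrv, baseline_hr,
                deep_hrv_drop=0.2, rem_hr_margin=1.05):
    """Classify a portion of an SRP:
    - minimal movement and HRV a predetermined amount below baseline -> deep;
    - some movement and HR higher than expected for deep sleep -> REM;
    - otherwise -> light sleep."""
    if not movement and hrv <= baseline_hrv * (1 - deep_hrv_drop):
        return "deep"
    if movement and hr > baseline_hr * rem_hr_margin:
        return "rem"
    return "light"
```

Running the classifier over successive portions of each SRP yields the per-portion staging that feeds the sleep-score parameters described below.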
  • the processor is programmed to use the above-noted parameters (e.g., the number of hours slept by the wearer, the number of times that the wearer tossed and turned, the percent of time that the wearer was asleep between the time that he or she went to bed and got up, and the deepness of sleep) to derive a quality of sleep metric or sleep score.
  • the wearer can monitor his or her sleep score over time in an effort to modify his or her sleep habits and maximize the quality of his or her sleep. It has been found that such scores, as opposed to the various different related parameters, are more easily understood by users.
  • the processor can cause the device to automatically display the sleep score when the wearer is determined to have awoken.
  • the device can determine when the wearer has awoken based on information related to the SRPs. Based on characteristics related to the wearer's sleep, information can be provided to the wearer to assist the wearer in improving his or her sleep score.
  • the wearer can be provided with a recommended sleep schedule. For example, if the wearer is determined to have been getting too little sleep, the recommended sleep schedule may suggest that the wearer go to bed earlier in the evening or sleep in later into the morning.
  • the information can be provided on the display of the device or on a separate device, such as a mobile phone of the wearer.
  • FIG. 19B illustrates the heart rate signal of the wearer during a period of time in which the wearer experienced an episode of sleep apnea.
  • the heart rate signal of the wearer is complex from 2:54 AM until about 3:16 AM, at which time the heart rate of the wearer spikes suddenly. From 3:16 AM until about 3:30 AM, the heart rate signal is simple (i.e., includes periodicity or a repeating pattern). The presence of a simple heart rate signal at least every two minutes during an SRP can be indicative of sleep apnea.
  • the processor can be programmed to carry out a multi-step test to detect sleep apnea.
  • the processor analyzes the heart rate throughout the SRP being analyzed. If the difference between the minimum heart rate and the maximum heart rate during the SRP is less than a threshold heart rate differential, then the processor determines that there was no sleep apnea and the test is concluded. If, however, the minimum-maximum heart rate differential exceeds the threshold heart rate differential, then the processor determines that sleep apnea could be the cause and carries out a further analysis of the SRP. Specifically, the processor analyzes the heart rate variability, the plotted RR points, the complexity of the signal, and the activity level of the subject during the SRP.
  • If the heart rate variability is lower during the SRP than in neighboring periods, then this weighs against a finding of sleep apnea. If, however, the heart rate variability during the SRP exceeds the heart rate variability during neighboring periods, then this weighs in favor of a finding of sleep apnea.
  • Another factor used to determine whether the wearer has sleep apnea is the complexity of the heart rate signal. If the heart rate signal is complex during the SRP, then this weighs against a finding of sleep apnea. If, however, at least every two minutes, the heart rate signal becomes simple (i.e., has periodicity or a repeating pattern), then this weighs in favor of sleep apnea.
  • Activity level is another factor used to identify sleep apnea. If the activity level of the wearer during the SRP being analyzed (as determined using the accelerometer data) is greater than the activity level of the wearer during neighboring periods, this weighs against a finding of sleep apnea. If, however, the activity level of the wearer during the SRP being analyzed is less than the activity level of the wearer during neighboring periods, this weighs in favor of a finding of sleep apnea.
  • the processor can be programmed to determine the presence or absence of sleep apnea as a function of heart rate, heart rate variability, the location of data points on the RR i vs. RR i+1 plot, the complexity of the heart rate signal, and the activity level of the subject.
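The multi-step test can be sketched as a simple scoring function. The heart-rate-differential threshold and the equal weighting of the factors are illustrative assumptions; the description leaves the exact combination unspecified.

```python
def apnea_score(hr_min, hr_max, hrv_srp, hrv_neighbor,
                signal_simple_every_2min, activity_srp, activity_neighbor,
                hr_diff_threshold=25.0):
    """Multi-step sleep-apnea screen.
    Step 1: a min-max heart-rate differential below the threshold rules
    apnea out.  Step 2: count the factors weighing in favor of apnea:
    elevated HRV versus neighboring periods, a heart-rate signal that turns
    'simple' at least every two minutes, and reduced activity."""
    if hr_max - hr_min < hr_diff_threshold:
        return 0  # no apnea suspected; test concluded
    score = 0
    if hrv_srp > hrv_neighbor:
        score += 1
    if signal_simple_every_2min:
        score += 1
    if activity_srp < activity_neighbor:
        score += 1
    return score  # higher score -> more factors weigh in favor of apnea
```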
  • FIG. 20 shows an example screenshot 2000 on a mobile phone 2002 of a wearer that displays qualities of the wearer's sleep in conjunction with light levels during various times.
  • In this example, the wearer slept for 7 hours and 52 minutes total, awoke 4 times, and has a sleep score of 74.
  • the screenshot also includes two bars: one bar shows times when the wearer had low-quality sleep, and another bar shows the measured light levels during those times. In this way, a correlation is made between the wearer's sleep quality and light levels experienced by the wearer.
  • the screenshot 2000 also includes a link 2004 for the wearer to receive sleeping environment tips that can improve his or her sleep quality.
  • the processor can alert the wearer that he or she may have experienced an irregular sleep pattern.
  • An example process 2100 of determining information about a characteristic of a subject's sleep is shown in FIG. 21 .
  • a machine, such as a processor, that receives information from the motion sensor 105 and the optical sensors 110 of the device 100 can perform one or more steps of the process 2100 .
  • the machine can include the computing device 115 described above with reference to FIG. 1B .
  • data in a first dataset that represents time-varying information about at least one pulse pressure wave propagating through blood in a subject can be processed ( 2102 ).
  • Data in a second dataset that represents time-varying information about motion of the subject can also be processed ( 2104 ).
  • the data can be acquired at a location of the subject (e.g., the arm or the wrist of the subject).
  • the information about at least one pulse pressure wave propagating through blood in the subject can include photoplethysmographic (PPG) data, and the information about motion of the subject can include one or both of motioncardiogram (MoCG) data and gross motion data.
  • information about a characteristic of the subject's sleep can be determined ( 2106 ).
  • the characteristic can include a quality of the sleep of the subject.
  • the quality of the sleep of the subject can include one or more of a sleep duration, a latency to sleep, a sleep staging, a number of disturbances, and a number of tosses and turns.
  • the characteristic of the subject's sleep can also include sleep apnea.
  • the processor can also be programmed to perform various fitness applications that allow the wearer to monitor his or her fitness level.
  • the processor can be programmed to analyze the accelerometer data over a given period of time (e.g., 15 minutes) to determine the total number of steps taken by the wearer during that time.
  • the processor is programmed to look for rhythm/cadence to detect walking as opposed to other ordinary motion, such as hand motions and vibrations.
  • the absolute value of the accelerometer data will typically be higher during periods of walking than during periods of most other daily activities.
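The rhythm/cadence check described above can be sketched as a peak detector followed by an inter-peak-interval filter: a peak in the acceleration magnitude counts as a step only if it follows the previous peak at a plausible walking cadence, which rejects isolated hand motions and vibrations. The sampling rate, amplitude threshold, and cadence band below are all illustrative assumptions.

```python
def count_steps(accel_mag, sample_hz=50, peak_thresh=1.2,
                min_cadence_hz=1.0, max_cadence_hz=3.0):
    """Count steps from an accelerometer magnitude trace (in g).

    A sample is a step candidate if it is a local peak above peak_thresh;
    a candidate is counted only when the gap to the previous peak falls
    within a plausible walking cadence (1-3 Hz here). Thresholds are
    illustrative.
    """
    peaks = [i for i in range(1, len(accel_mag) - 1)
             if accel_mag[i] > peak_thresh
             and accel_mag[i] >= accel_mag[i - 1]
             and accel_mag[i] > accel_mag[i + 1]]
    min_gap = sample_hz / max_cadence_hz   # samples between peaks at fastest cadence
    max_gap = sample_hz / min_cadence_hz   # samples between peaks at slowest cadence
    steps = 0
    for prev, cur in zip(peaks, peaks[1:]):
        if min_gap <= cur - prev <= max_gap:
            steps += 1
    return steps
```

A single jolt produces one isolated peak and therefore no counted steps, while a rhythmic 2 Hz walking signal yields one step per interval.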
  • the processor can calculate calories burned over a given period of time by analyzing the activity level of the wearer and/or the heart rate of the user. Using both the activity level and the heart rate to determine calories burned can lead to a more accurate estimation of caloric output.
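A minimal sketch of blending the two signals might average an activity-based estimate (activity counts mapped to METs) with a heart-rate-based estimate. The counts-to-MET mapping and the HR coefficients below are hypothetical placeholders, not validated values.

```python
def calories_burned(activity_counts, heart_rates, weight_kg, minutes_per_sample=1):
    """Estimate kilocalories over a period by blending two estimators.

    activity_counts: per-sample accelerometer activity levels (arbitrary units)
    heart_rates:     per-sample heart rate in bpm
    Averaging the two estimators (i.e., using both signals, as the text
    suggests) can reduce the error of either alone. All coefficients
    below are illustrative.
    """
    kcal = 0.0
    for counts, hr in zip(activity_counts, heart_rates):
        mets = 1.0 + 0.002 * counts                       # hypothetical counts->MET map
        act_kcal = mets * 3.5 * weight_kg / 200 * minutes_per_sample
        hr_kcal = max(0.0, 0.07 * (hr - 60)) * minutes_per_sample  # hypothetical HR model
        kcal += (act_kcal + hr_kcal) / 2
    return kcal
```

At rest (zero counts, 60 bpm) the HR term vanishes and the estimate reduces to half the resting-MET expenditure; both terms grow with effort.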
  • the processor is programmed to provide a fitness score based on certain fitness-related parameters, such as resting heart rate. The more fit an individual is, the lower his or her baseline HR will be. Thus, in some cases, the processor is programmed to determine a fitness score based on the average heart rate of the wearer during sleep periods or periods of inactivity. Additionally, the speed of heart rate recovery can be a strong indicator of a person's fitness level. For example, the more fit an individual is, the faster his or her heart rate returns to the baseline after exercising. Similarly, the more fit an individual is, the longer it takes for his or her heart rate to increase during exercise.
  • the processor is programmed to determine an individual's fitness score based on the amount of time that it takes for the individual's heart rate to reach a maximum during exercise and the amount of time that it takes for his or her heart rate to return to the baseline after exercise.
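The three indicators just described (slower rise to maximum HR, faster recovery to baseline, lower resting HR) can be folded into a single score. The component weights, time scales, and 0-100 scaling below are illustrative assumptions, not values from the specification.

```python
def fitness_score(time_to_max_s, recovery_time_s, resting_hr):
    """Compute a 0-100 fitness score from the indicators named above.

    Fitter individuals take longer to reach max HR during exercise,
    recover to baseline faster afterward, and have a lower resting HR.
    All scaling constants are hypothetical.
    """
    rise_component = min(time_to_max_s / 600.0, 1.0) * 40            # slower rise is fitter
    recovery_component = max(0.0, 1.0 - recovery_time_s / 300.0) * 40  # faster recovery is fitter
    resting_component = min(max(0.0, (100 - resting_hr) / 60.0), 1.0) * 20  # lower resting HR is fitter
    return round(rise_component + recovery_component + resting_component)
```

A fit profile (slow rise, quick recovery, low resting HR) scores well above an unfit one under this scheme.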
  • the processor can cause the device to automatically display the fitness score when the wearer is determined to be in the fitness state.
  • the fitness score may be displayed when the wearer starts to go for a run, and may be displayed throughout the run.
  • the fitness score may be displayed when the wearer transitions from a fitness state to a non-fitness state.
  • the fitness score may be displayed when the wearer finishes a run.
  • the device can determine when the wearer is in the fitness state based on the gross motion data and the vitals of the wearer, such as the wearer's heart rate. Based on characteristics related to the wearer's fitness, information can be provided to the wearer to assist the wearer in improving his or her fitness score.
  • FIG. 22 shows an example screenshot 2200 displaying a fitness score on a mobile phone 2202 of a wearer.
  • the information on the screenshot indicates that the wearer has improved his or her fitness score by two points.
  • the screenshot also provides the wearer with updated personalized training zones.
  • the personalized training zones represent the heart rate that the wearer should strive to achieve under various exercise conditions. For example, if the wearer is performing extreme exercise, he or she should strive to have a heart rate of more than 151 beats per minute.
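Personalized zones like these are often derived as fixed percentages of a wearer's maximum heart rate; under that common convention (assumed here, not stated in the specification), the screenshot's 151+ bpm "extreme" zone would correspond to roughly 85% of a maximum heart rate near 178 bpm. The zone boundaries below are illustrative.

```python
def training_zones(max_hr):
    """Split heart rate into personalized training zones.

    Zones are fixed percentages of the wearer's maximum heart rate, a
    common convention; boundary percentages are illustrative.
    """
    return {
        "light":    (round(0.50 * max_hr), round(0.65 * max_hr)),
        "moderate": (round(0.65 * max_hr), round(0.75 * max_hr)),
        "hard":     (round(0.75 * max_hr), round(0.85 * max_hr)),
        "extreme":  (round(0.85 * max_hr), max_hr),
    }
```

The zones update automatically as the wearer's measured maximum heart rate changes.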
  • the wearer can be provided with a recommended fitness routine. For example, it may be determined that the wearer has trouble completing a three-mile run, as indicated by an abnormally high heart rate during the run.
  • the recommended fitness schedule may suggest that the wearer run one mile twice a week for a week in order to improve his or her fitness, thereby allowing the wearer to work up to a fitness level appropriate for safely completing a three-mile run.
  • the information for assisting the wearer can be provided on the display of the device or on a separate device, such as a mobile phone of the wearer.
  • the device may have access to other users' vital information and fitness scores, such that a wearer of the device can compare his or her fitness score to those of other people.
  • a wearer of the device may want to follow the same training regimen as the one that the professional athlete follows. However, following the same training regimen does not necessarily produce the same results.
  • a wearer of the device may follow the same training regimen as a professional athlete, but he may not exhibit the same level of effort as the professional athlete.
  • the device can determine the degree of similarity between the wearer's training level and the professional athlete's training level.
  • vital information of a professional athlete from when the athlete performed or is performing a particular training routine is presented to the wearer while the wearer performs the same training routine.
  • a video showing the athlete performing the training routine can include a visual indication of the athlete's BP, HR, and respiratory rate over the course of the training routine.
  • the wearer can determine whether he or she is experiencing a similar BP, HR, and respiratory rate as the athlete, thereby indicating whether the wearer is training with the same intensity as the athlete.
  • the video may be configured to interact with the device such that the video encourages the wearer to try harder if the wearer's intensity is below that of the athlete.
  • the device can continue to monitor the BP, HR, and respiratory rate of the wearer to determine whether the wearer is physically recovering as well as the athlete.
  • the vital information of the professional athlete can be used to determine the athlete's physical state at particular times during competition.
  • the athlete's vital information can represent how the athlete physically feels while completing the last 20 meters of a 100 meter dash, or while catching a game-winning touchdown as time expires. A wearer may desire to recreate this feeling for himself or herself.
  • the device is configured to assist the wearer in recreating similar competition situations.
  • the athlete's vital information may indicate that a wide receiver had a particular BP, HR, and respiratory rate while catching a game-winning touchdown in a championship game.
  • the particular BP, HR, and respiratory rate may be significantly higher than they typically would be due to the intensity and importance of the game situation.
  • In order to recreate the situation, a wearer cannot simply go to a local football field and catch a pass from a friend because the wearer would not be in the same physical state that the wide receiver was in at the time of the catch. Rather, the user needs to match the wide receiver's BP, HR, and respiratory rate before recreating the catch.
  • the wearer may perform various actions or activities to artificially match the wide receiver's vitals (e.g., running, listening to loud or exciting music, etc.).
  • the device can alert the wearer. At that point, the wearer can recreate the game situation with improved accuracy.
  • the wearer can recreate the game situation with the aid of a virtual reality device, such as a stereoscopic device that creates a computer-simulated environment.
  • the stereoscopic device can be used to aid the wearer in artificially matching his or her vitals with the athlete's by presenting to the wearer the same visuals and sounds that the athlete experienced before the game situation.
  • the stereoscopic device can also be used to recreate the particular game situation or play. That is, rather than catching a real football from a real person, the stereoscopic device can display visuals that simulate the action of catching the game-winning touchdown.
  • a person in a real combat situation typically exhibits increases in BP, HR, and respiratory rate due to the danger of the situation. Training for these situations does not involve the same risk of danger. Thus, such training is typically not performed under the same physical conditions. That is, a trainee does not have the same BP, HR, and respiratory rate that he would otherwise have in a real combat situation.
  • a person's vital information can be used to determine the person's physical state at particular times during a real combat situation. For example, a Navy SEAL may exhibit a particular BP, HR, and respiratory rate while performing a raid of a terrorist hideout. A trainee who is wearing the device may perform various actions or activities to artificially match the Navy SEAL's vitals. When the trainee has achieved a physical state that matches the Navy SEAL's, the device can alert the trainee, who can then recreate a training scenario with improved accuracy.
  • the processor can also be programmed to analyze the PPG data and the accelerometer data in a way to determine the stress level of the wearer of the device.
  • Heart rate (HR), heart rate variability (HRV), blood pressure (BP), and respiratory rate are all indicators of stress. Specifically, the values of these parameters increase as stress levels increase. Thus, by comparing these values to baseline values of the wearer for associated parameters, the level of stress of the wearer can be estimated.
  • the stress level can, for example, be provided to the wearer as a stress score.
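The baseline comparison described above can be sketched as follows: each parameter's fractional elevation above its own baseline contributes to the score. The parameter set follows the text (which treats HR, HRV, BP, and respiratory rate as all increasing with stress); the 0-100 scaling is an illustrative assumption.

```python
def stress_score(current, baseline):
    """Estimate a stress level by comparing current vitals to baselines.

    `current` and `baseline` are dicts with keys 'hr', 'hrv', 'bp', 'rr'
    (heart rate, heart rate variability, systolic blood pressure,
    respiratory rate). Each parameter's fractional elevation above its
    baseline contributes; the 0-100 scaling is illustrative.
    """
    elevation = 0.0
    for key in ("hr", "hrv", "bp", "rr"):
        if baseline[key] > 0:
            elevation += max(0.0, current[key] / baseline[key] - 1.0)
    return min(100, round(100 * elevation / 4))
```

Vitals at baseline yield a score of zero; uniformly elevated vitals push the score toward 100.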
  • the processor can cause the device to automatically display the stress score when the wearer is determined to be in a stress state.
  • the device can determine when the wearer is in a stress state based on the vitals of the wearer, such as the wearer's heart rate, heart rate variability, blood pressure, and respiratory rate. Based on characteristics related to the wearer's stress, information can be provided to the wearer to assist the wearer in improving his or her stress score. In some implementations, the wearer can be provided with a recommended stress-reducing routine.
  • the recommended stress-reducing routine may suggest that the wearer meditate at particular times (e.g., once a day) or adjust his or her daily schedule to minimize circumstances that are generally attributed to stress (e.g., sitting in traffic, working too much, etc.).
  • the information can be provided on the display of the device or on a separate device, such as a mobile phone of the wearer.
  • FIG. 23 shows an example screenshot 2300 on a mobile phone 2302 of a wearer that includes a number of stress moments experienced by the wearer.
  • the wearer has experienced four stress moments on the current day.
  • a graph indicates the number of stress moments that the wearer has experienced throughout the week.
  • the screenshot includes recommendations for the wearer to reduce his or her stress.
  • the screenshot recommends that the wearer plan some rest, relaxation, and/or a meditation session to reduce stress.
  • the screenshot also includes a link 2304 to a 1-minute relaxation session, during which the mobile phone guides the wearer on a relaxation session.
  • An example process 2400 of deriving information about a psychological state of a subject is shown in FIG. 24 .
  • a machine, such as a processor, that receives information from the motion sensor 105 and the optical sensors 110 of the device 100 can perform one or more steps of the process 2400 .
  • the machine can include the computing device 115 described above with reference to FIG. 1B .
  • data in a first dataset that represents time-varying information about at least one pulse pressure wave propagating through blood in a subject can be processed ( 2402 ).
  • Data in a second dataset that represents time-varying information about motion of the subject can also be processed ( 2404 ).
  • the data can be acquired at a location of the subject (e.g., the arm or the wrist of the subject).
  • Information about a psychological state of the subject can be derived from the processed data ( 2406 ).
  • the psychological state of the subject can be a state of stress, a malicious intent, or a state of lying. Relationships between at least some of the processed data and a psychological state of the subject can be inferred.
  • one or more scores can be derived based on data collected by the device 100 .
  • a machine, such as a processor, that receives information from the optical sensors 110 of the device 100 can perform one or more steps of the process.
  • the machine can include the computing device 115 described above with reference to FIG. 1B .
  • Operations of the process can include deriving a score that is associated with a state of a subject.
  • the state of the subject can be one or more of a health state, a sleep metric, a fitness state, and a stress state.
  • Deriving the score can be based on data in a first dataset that represents time-varying information about at least one pulse pressure wave propagating through blood in the subject.
  • the data can be acquired at a location of the subject (e.g., the arm or the wrist of the subject). Deriving the score can also be based on data in a second dataset that represents time-varying information about motion of the subject.
  • the machine can receive information from the motion sensor 105 of the device 100 .
  • the data produced by the device can be used to assist triage medical personnel in various settings.
  • the device could be worn by military personnel in battle to provide medical personnel with valuable information regarding the vital signs of the military personnel.
  • the devices worn by the military personnel can, for example, be configured to transmit data regarding their vital signs to a central computer manned by medical personnel.
  • the medical personnel can view the vital signs of the various military personnel to prioritize medical care. As a result, the people that most need urgent treatment will receive it first, while those who have less threatening injuries will be attended to later.
  • the devices described herein could be used to assist medical personnel in various other triage settings, such as sites of natural disasters or terrorist attacks.
  • the medical personnel could be provided with a number of devices that could be put on patients in the triage setting as those patients are being assessed.
  • the medical personnel can leave that victim and focus their efforts on victims in more urgent need of medical care. While doing so, the vital signs of those victims who were initially assessed and determined not to require urgent medical care will be monitored and transmitted to a central monitoring station.
  • medical personnel in the area can be directed to that victim to provide the necessary medical care.
  • a machine, such as a processor, that receives information from the optical sensors 110 of the device 100 can perform a process for risk assessment.
  • the machine can include the computing device 115 described above with reference to FIG. 1B .
  • the process can include processing data from a first dataset that represents time-varying information about at least one pulse pressure wave propagating through blood in the subject.
  • the data can be acquired at a location of the subject (e.g., the arm or the wrist of the subject).
  • Data in a second dataset that represents time-varying information about motion of the subject can also be processed.
  • the machine can receive information from the motion sensor 105 of the device 100 .
  • the data can be acquired while the subject is in a situation associated with risk. Whether the subject is in a situation associated with risk can be indicated by the data.
  • the risk can be trauma to the subject, and the data can be indicative of the existence of the trauma.
  • the devices described herein could be used to assist medical personnel in a hospital setting. Once a patient is stabilized following triage, he or she is typically monitored based on a provider's standard of care or mandate (e.g., according to an accountable care organization (ACO)). In some implementations, the device can continue to monitor the vital signs of the patient outside of the triage context to ensure that the care that the patient is receiving is appropriate in view of the patient's vitals. A provider's standard of care may require a patient to go through a progression of steps before the patient is deemed to be ready for discharge. The device can monitor the vital signs of the patient during each step of the progression.
  • the first step of the progression may involve monitoring the patient's vitals while the patient is resting (e.g., lying down and/or sleeping)
  • the second step of the progression may involve monitoring the patient's vitals while the patient is sitting up in bed
  • the third step of the progression may involve monitoring the patient's vitals while the patient is standing up while being supported
  • the fourth step of the progression may involve monitoring the patient's vitals while the patient is standing up unassisted
  • the fifth step of the progression may involve monitoring the patient's vitals while the patient is walking.
  • the device continuously monitors the patient's vitals throughout each of these stages and can present a notification if the vitals indicate that the patient is in a dangerous state (e.g., if the patient is progressing through each step too quickly without giving his or her body a chance to recover). In this way, the device monitors the patient's compliance with the provider's standard of care.
  • the patient's vitals can also serve as an indicator of the quality of care that the patient is receiving at a care facility.
  • the device can monitor the vitals of residents at a nursing home to determine the level of activity that the residents are experiencing. Data from the motion sensor of the device may indicate that the residents typically walk or perform other exercises one hour per day, and data from the ultraviolet light sensor of the device may indicate that the residents typically spend two hours per day outdoors.
  • the monitored vitals can be compared to metrics defined by a health organization (e.g., the American Heart Association) to determine whether the residents are adhering to the organization's recommendations regarding physical activity and other health-related actions.
  • the residents' level of compliance with the organization's recommendations can be used to assess the quality of care at the nursing home.
  • the nursing home may be assigned a quality score based on the monitored vitals and the level of compliance with the organization's recommendations, and multiple nursing homes may be compared and/or ranked according to their quality scores. Similar concepts can also apply in the context of child care.
  • An example process 2500 of determining a quality of care provided to the one or more subjects by a care facility is shown in FIG. 25 .
  • a machine, such as a processor, that receives information from the motion sensor 105 and the optical sensors 110 of the device 100 can perform one or more steps of the process 2500 .
  • the machine can include the computing device 115 described above with reference to FIG. 1B .
  • data that represents time-varying information about at least one pulse pressure wave propagating through blood in each of one or more subjects can be processed ( 2502 ).
  • Data that represents time-varying information about motion of the one or more subjects can also be processed ( 2504 ).
  • the data can be acquired at a location of the subject (e.g., the arm or the wrist of the subject).
  • a quality of care provided to the one or more subjects by a care facility that cares for the one or more subjects can be determined ( 2506 ). Determining a quality of care can include determining a level of physical activity experienced by each of the one or more subjects. The level of physical activity can be determined by comparing gross motion data gathered by the motion sensor 105 to a threshold. Data that represents information about an amount of ultraviolet light that each of the one or more subjects has been exposed to over a particular time period can also be processed, and an amount of time that each of the one or more subjects has spent outside can be determined.
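The determination just described could be sketched as a per-resident compliance check against activity and outdoor-time targets, with the facility scored by the fraction of compliant residents. The record fields, targets, and percentage scoring below stand in for a health organization's recommendations and are illustrative assumptions.

```python
def care_quality(residents, activity_target_min=60, outdoor_target_min=120):
    """Score a care facility from its residents' monitored data.

    Each resident record holds minutes of gross motion above an activity
    threshold ('active_minutes', from the motion sensor) and minutes of
    UV exposure per day ('uv_minutes', from the ultraviolet light
    sensor). A resident complies when both meet the targets; the facility
    score is the percentage of compliant residents. Targets are
    illustrative.
    """
    if not residents:
        return 0
    compliant = 0
    for r in residents:
        active_enough = r["active_minutes"] >= activity_target_min
        outdoors_enough = r["uv_minutes"] >= outdoor_target_min
        if active_enough and outdoors_enough:
            compliant += 1
    return round(100 * compliant / len(residents))
```

Scores computed this way could then be compared across facilities, as the text suggests for ranking nursing homes.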
  • the devices described herein can also be beneficial to first responders, such as firefighters and police officers. By wearing the devices, the first responders will ensure that their vital signs are monitored before, during, and after any stressful events that they experience to ensure that they receive the help they need. This is illustrated in the example of FIG. 26 , where health parameters of one or more firefighters 2605 on a potentially hazardous mission are obtained via devices 100 worn or carried by the firefighters 2605 .
  • the firefighters' vital signs could be obtained by the devices 100 and transmitted to a central monitoring station (e.g., within a fire truck 2610 , or at a fire station) where the vital signs can be monitored to determine whether the firefighters 2605 are well enough to continue fighting a fire or otherwise responding to an emergency.
  • the devices 100 worn or carried by the firefighters 2605 further include GPS transponders.
  • Such devices are particularly beneficial for situations in which one or more first responders 2605 become incapacitated in a dangerous setting.
  • the device could not only send the firefighter's vital sign data to the central monitoring station to alert someone that the firefighter is in need of medical care, but could also identify the location of the firefighter 2605 to a rescuer 2620 (possibly via a device 100 ) sent to assist the incapacitated firefighter 2605 , such that the rescuer 2620 knows exactly where to go.
  • the communications about the health parameters of the one or more firefighters 2605 can be sent directly to the central monitoring station, or via a server 2630 .
  • if the server 2630 determines that a firefighter's mental/physical state is not suitable for continuing the mission, the server 2630 can send a signal to the firefighter (e.g., via the device 100 , or via another communication device) to alert the firefighter 2605 about the situation. For example, if the health condition of the firefighter deteriorates during the mission (e.g., because of excessive smoke inhalation), a signal can be sent to the device 100 to alert the firefighter to take corrective measures.
  • the device 100 can be configured to communicate with the central monitoring station on the fire truck 2610 .
  • the data from the devices 100 can be transmitted to the server 2630 (possibly via the central monitoring station) for determining whether a firefighter 2605 is safe. The determination can also be made at the central monitoring station.
  • the data from the device 100 may also indicate whether the wearer of the device 100 requires assistance from a rescuer 2620 .
  • the server 2630 and/or the central monitoring station can then alert the firefighter 2605 and/or a rescuer 2620 accordingly.
  • a firefighter's location may also be tracked using information transmitted from the corresponding device.
  • the processor can also be programmed to monitor the alertness of the wearer. This can be particularly advantageous for personnel who perform tasks that require attention and concentration, and could result in serious harm or danger if carried out incorrectly. Examples of such personnel include air traffic controllers, pilots, military truck drivers, tanker drivers, security guards, TSA agents, intelligence analysts, etc.
  • the processor can analyze the respiratory rate, heart rate, blood pressure, and activity level of the wearer. Each of these parameters tends to decrease as a subject falls asleep. Thus, the processor can be programmed to conclude that the wearer's alertness level has dropped to an unacceptable level when one or more of those parameters falls a predetermined amount from the baseline of those parameters.
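The drop-from-baseline check just described can be sketched as flagging each parameter that has fallen by more than a predetermined fraction below its baseline. The 15% drop fraction and the one-flag trigger are illustrative assumptions.

```python
def alertness_alarm(current, baseline, drop_fraction=0.15, min_flags=1):
    """Decide whether to trigger the device's alertness alarm.

    Respiratory rate ('rr'), heart rate ('hr'), blood pressure ('bp'),
    and activity level ('activity') all tend to decrease as a wearer
    falls asleep, so each parameter that has fallen more than
    drop_fraction below its baseline raises a flag. Thresholds are
    illustrative.
    """
    flags = 0
    for key in ("rr", "hr", "bp", "activity"):
        if current[key] < baseline[key] * (1.0 - drop_fraction):
            flags += 1
    return flags >= min_flags
```

In a deployed system the drop fraction would likely be tuned per parameter, since activity level can vary far more than blood pressure.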
  • the processor can be programmed so that, upon determining that the wearer's alert level has dropped to an unacceptable level, an alarm (e.g., an audible, visual, or tactile alarm) on the device is activated.
  • the alarm can raise the alertness level of the wearer and thus reduce risk of harm to the wearer and others.
  • the processor can be configured to communicate with the vehicle or machinery for which the wearer is responsible.
  • the device worn by a truck driver can transmit data regarding his or her alertness level to a controller of the truck.
  • the controller can be configured to disable operation of the truck if the alertness level is below an acceptable threshold. For example, the controller can warn the driver that he or she has a certain period of time to pull the truck over before it is disabled. This will encourage the driver to pull off the road and either get some sleep or otherwise increase his or her alertness level before driving the truck again.
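The warn-then-disable behavior could be implemented as a small state machine on the controller: a low alertness report starts a grace period, a recovered report cancels it, and expiry of the period disables operation. The alertness threshold, grace period, and interface below are all hypothetical.

```python
import time

class TruckController:
    """Illustrative controller-side logic for the warn-then-disable behavior.

    Threshold, grace period, and message strings are hypothetical; the
    clock is injectable so the countdown can be tested deterministically.
    """
    def __init__(self, threshold=0.5, grace_s=300, clock=time.monotonic):
        self.threshold = threshold
        self.grace_s = grace_s
        self.clock = clock
        self.warned_at = None
        self.disabled = False

    def report_alertness(self, level):
        now = self.clock()
        if level >= self.threshold:
            self.warned_at = None            # driver recovered; cancel countdown
            return "ok"
        if self.warned_at is None:
            self.warned_at = now             # start the grace period
            return "warn: pull over"
        if now - self.warned_at >= self.grace_s:
            self.disabled = True             # grace period expired
            return "disabled"
        return "warn: pull over"
```

Injecting the clock also makes it easy to swap in a vehicle's own timebase.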
  • the alertness data can be stored in a database for later analysis. Studying the alertness data from a large sampling of personnel in a given industry can help regulatory bodies for those industries to draft safety standards that increase or maximize safety while maintaining productivity.
  • alertness data over a period of time for a particular wearer of the device can be analyzed to determine the overall physical and/or mental state of a given wearer (e.g., as opposed to the instantaneous state of the given user). Such information can be used to detect a trend of regressing physical and/or mental state of the given wearer. For example, although a wearer of the device may exhibit vitals that indicate that he is alert enough to perform a particular task (e.g., fly a plane) at a particular time, the wearer's alertness data over a period of time may indicate that the wearer's general alertness is on the decline. This may be due to the wearer's old age. The device can detect such a trend and alert the wearer and/or an external entity that the wearer should be closely monitored.
  • a process can be configured to acquire data while a subject is in a situation that requires a predetermined amount of alertness of the subject.
  • a machine, such as a processor, that receives information from the optical sensors 110 of the device 100 can perform one or more steps of such a process.
  • the machine can include the computing device 115 described above with reference to FIG. 1B .
  • Operations of the process can include processing data in a first dataset that represents time-varying information about at least one pulse pressure wave propagating through blood in the subject.
  • the data can be acquired at a location of the subject (e.g., the arm or the wrist of the subject).
  • the operations can also include processing data in a second dataset that represents time-varying information about motion of the subject.
  • the machine can receive information from the motion sensor 105 of the device 100 .
  • the data can be acquired while the subject is in a situation that requires at least a predetermined amount of alertness of the subject.
  • the situation can include one or more of air traffic control, intelligence analysis, vehicle driving, machinery driving, security guarding, baggage screening, and aircraft piloting.
  • the devices described herein can also be used as polygraph devices. Like conventional polygraph devices, the devices described herein gather baselines for the wearer's vital signs (e.g., respiratory rate, electrical skin impedance, heart rate, heart rate variability, and blood pressure) and those baselines can later be compared to associated vital signs recorded during questioning. Because the devices described herein are wearable, untethered, and non-cumbersome, and thus do not reduce the mobility of the wearer, the individual being tested can be required to wear the device for a specified period of time (e.g., 24 hours) before and after questioning without hindering the normal, everyday activities of the individual.
  • the baselines for the subject's vital signs can be more accurately determined. For example, it is less likely that the subject could artificially adjust his or her vital baselines due to the large amounts of data collected to form those baselines. Therefore, the accuracy of the polygraph test can be increased relative to certain conventional polygraph devices.
  • the accelerometer data can be analyzed to identify movements or lack of movements that may indicate that the subject is lying. It is believed, for example, that individuals freeze for a moment when they are caught doing something wrong. In the case of polygraph examinations, it is believed that a subject will freeze when asked a question about the subject's wrongdoing. Thus, by analyzing the accelerometer data of the device, it is possible to identify those times during questioning that the subject freezes. This information can be used to further assess the truthfulness of the subject's response during that time.
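Identifying "freezes" in the accelerometer data amounts to finding runs of unusual stillness long enough to be meaningful, which can then be aligned with question timestamps. The stillness threshold, sampling rate, and minimum duration below are illustrative assumptions.

```python
def freeze_intervals(accel_mag, sample_hz=50, still_thresh=0.02,
                     min_freeze_s=1.0):
    """Find moments of unusual stillness in accelerometer data.

    Returns (start, end) sample-index pairs for runs where motion
    magnitude stays below still_thresh for at least min_freeze_s.
    During a polygraph session these runs can be compared against
    the times questions were asked. Thresholds are illustrative.
    """
    min_len = int(min_freeze_s * sample_hz)
    intervals, start = [], None
    for i, m in enumerate(accel_mag):
        if m < still_thresh:
            if start is None:
                start = i                     # stillness begins
        else:
            if start is not None and i - start >= min_len:
                intervals.append((start, i))  # run was long enough to keep
            start = None
    if start is not None and len(accel_mag) - start >= min_len:
        intervals.append((start, len(accel_mag)))
    return intervals
```

An examiner-facing tool would convert the returned sample indices to timestamps for comparison with the questioning record.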
  • the processor can be programmed to analyze the PPG data and the accelerometer data to determine the physical and mental readiness of a subject to perform a certain task.
  • General fatigue and stress, which can result in a drop in physical and mental readiness, are generally evidenced by an increase in respiratory rate, heart rate, and blood pressure.
  • the processor can be programmed to analyze the wearer's respiratory rate, heart rate, and blood pressure and to indicate a state of unreadiness if those parameters rise a certain amount above the baseline for those parameters.
  • the processor is programmed to also consider other factors in this readiness assessment, including the quality of the wearer's sleep (e.g., the wearer's sleep score) over a period of time (e.g., 24 hours or 48 hours) leading up to the assessment.
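The readiness assessment could combine both inputs: vitals elevated beyond a tolerance relative to baseline (consistent with the fatigue indicators discussed above), or poor sleep scores over the preceding day or two, indicate unreadiness. The 10% deviation limit and the sleep-score floor are illustrative assumptions.

```python
def readiness(current, baseline, sleep_scores, deviation_limit=0.10,
              sleep_floor=70):
    """Assess readiness from vitals relative to baseline plus recent sleep.

    `current` and `baseline` are dicts with keys 'rr', 'hr', 'bp'
    (respiratory rate, heart rate, blood pressure); `sleep_scores` holds
    the wearer's sleep scores over the preceding period (e.g., the last
    one or two nights). Limits are illustrative.
    """
    for key in ("rr", "hr", "bp"):
        if current[key] > baseline[key] * (1.0 + deviation_limit):
            return "not ready"               # vitals elevated beyond tolerance
    if sleep_scores and sum(sleep_scores) / len(sleep_scores) < sleep_floor:
        return "not ready"                   # recent sleep quality too low
    return "ready"
```

A leader's dashboard could rank unit members by this assessment when staffing a mission, as the following passage describes.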
  • the determination of readiness of wearers of the device can assist leaders of those wearers with maximizing their human resources during taxing situations. For example, military leaders can analyze the data of soldiers in their units to determine which of those soldiers is most physically and mentally able to successfully carry out a mission and can staff the mission accordingly. Similarly, coaches may analyze the data of their team members to determine which of those athletes are best physically and mentally fit to play at their top level at any given time during a competition and can use those players that are able to perform at their top level.
  • the physical and mental readiness of a subject can be used by the device to predict a winner of a competition. For example, by analyzing vital signs (e.g., BP, HR, respiratory rate) of the contestant before and during a track race, a change in physical and mental readiness can be inferred.
  • the device can also consider information such as the force exerted against the ground by the contestant and the velocity of the contestant at various points during the race to determine a likelihood that the contestant will win the race.
  • the contestant's device can also consider similar information related to other contestants in determining the likelihood that the contestant will win the race.
  • the device may determine that a first contestant got off to a quicker start than a second contestant in a 100 meter dash based on collected motion data. Historical data may indicate that the contestant who is “first out of the blocks” has a 65% chance of winning the race. Thus, the device can predict the winner of the race within milliseconds of the start of the race.
  • the device can monitor a contestant's performance at arbitrarily many intervals while correlating the contestant's performance to the measured vitals.
  • in a mile race, for example, a contestant typically keeps track of his or her lap times for each of the four laps.
  • the contestant does not typically have access to more detailed data, such as his or her performance over the first 100 meters, the last 100 meters, at various points in the middle of the race, etc.
  • the device can be configured to keep track of the contestant's performance at any time or range of times during the race, and can also correlate the contestant's performance to the vitals measured by the device. For example, the contestant may complete the first lap of the mile in 50 seconds, putting him or her on pace to easily break the world record.
  • the device may determine that the contestant has a BP, HR, and respiratory rate significantly higher than what would typically be seen in someone who has only completed 25% of the race, and thus determine that the contestant likely will not win the race. By exhibiting so much effort early in the race, the contestant burns out and finishes the race with a mediocre time.
  • the contestant can use the performance data and the measured vitals to improve his or her training in the future. For example, the next time the contestant runs a mile, the device may detect that the contestant is exhibiting too much effort early in the race by measuring a high BP, HR, and respiratory rate. The device can be configured to notify the contestant to reserve energy in order to optimize his or her performance.
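The early-effort warning described above might be implemented by comparing heart-rate reserve usage to the fraction of the race completed. The linear pacing model and the alert margin below are assumptions for illustration, not values from the disclosure.

```python
def overexertion_alert(fraction_done, hr, hr_rest, hr_max, margin=0.10):
    """Flag likely over-exertion for the current point in a race (a sketch).

    Uses the fraction of heart-rate reserve in play, (hr - hr_rest) /
    (hr_max - hr_rest), against an assumed linear pacing model.
    """
    reserve_used = (hr - hr_rest) / (hr_max - hr_rest)
    expected = 0.5 + 0.5 * fraction_done  # assumed pacing curve
    return reserve_used > expected + margin
```

A runner a quarter of the way through the race whose heart rate already approaches maximum would trigger the alert; the same heart rate near the finish would not.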
  • the device can be used to monitor the performance of an entire team of individuals wearing the device. For example, the collective physical and mental readiness of a football team, as well as motion sensor data and information related to other factors, can be used to determine whether the football team is performing to its potential.
  • Information related to the vitals of a first team, such as the team's collective BP, HR, and respiratory rate, may indicate that the first team is exhibiting a large amount of effort.
  • Information related to the vitals of a second team may indicate that the second team is exhibiting minimal effort.
  • despite this effort gap, the second team may be winning the football game against the first team, indicating that the first team may have inferior technique or coaching.
  • Information related to a team's vitals can also be used to ensure that the team does not exhibit too much effort early in the season, thereby making it susceptible to “burning out” towards the end of the season.
  • An example process 2700 of providing information to a user that reports relative states of subjects is shown in FIG. 27 .
  • a machine, such as a processor, that receives information from the motion sensor 105 and the optical sensors 110 of the device 100 can perform one or more steps of the process 2700 .
  • the machine can include the computing device 115 described above with reference to FIG. 1B .
  • data that represents time-varying information about at least one pulse pressure wave propagating through blood in each of two or more subjects can be processed ( 2702 ).
  • Data that represents time-varying information about motion of the two or more subjects can also be processed ( 2704 ).
  • the data can be acquired at a location of the subject (e.g., the arm or the wrist of the subject).
  • Information can be provided to a user that reports relative states of the subjects ( 2706 ).
  • the information can be based on the processed data.
  • the relative states of the subjects can include one or more of relative psychological states, relative physical states, and relative states of readiness.
  • the subjects can be put into an athletic contest or assigned a particular combat task according to the relative states of the subjects.
  • the processor can be programmed to analyze the vital signs of multiple users in the moments leading up to a collision. For example, when two players collide during a sporting competition, a large amount of force is absorbed by each player. Force data can be measured by the motion sensor of the device, and the device can determine the magnitude of force absorbed by each player. The device can determine the effect of the force on each player by analyzing the players' vitals (e.g., BP, HR, respiratory rate, body temperature) before, during, and after the collision. The vitals and the force information can be used to determine whether a player has sustained bodily damage due to the impact force. For example, if a player experiences a sudden increase in HR, respiratory rate, and body temperature following a collision, it may be an indication that the player has sustained a concussion.
  • a player's bodily reaction to sustaining a concussion is delayed.
  • a player may experience a sudden increase in HR, respiratory rate, and body temperature at some time following a collision, or the player may experience a gradual increase in HR, respiratory rate, and body temperature beginning at the time of the collision.
  • the device can monitor the player's vitals for an extended time following the collision and compare the monitored vital information to vital information of a player who was previously diagnosed with a concussion. In this way, the device can determine vital patterns that are indicative of a person who sustains a concussion. If the device determines that a player has sustained a concussion, the device may be configured to alert the player or a third party.
  • the player may be required to pass a protocol before reentering the game. If the device determines that there is a possibility that the player has sustained a concussion, the device may enter a mode where the player is monitored more closely in order to make a more definitive determination.
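A concussion screen along these lines could compare vitals averaged over windows before and after the detected impact. The threshold deltas below are placeholders for illustration, not clinically validated values.

```python
def concussion_flag(pre, post, thresholds=None):
    """Flag a possible concussion from pre/post-collision vitals (a sketch).

    `pre` and `post` map vital names to window averages; the deltas in
    `thresholds` (bpm, breaths/min, degrees C) are illustrative only.
    """
    if thresholds is None:
        thresholds = {"hr": 15.0, "rr": 4.0, "temp": 0.5}
    exceeded = [k for k in thresholds if post[k] - pre[k] > thresholds[k]]
    # Require a joint rise across at least two vitals before alerting.
    return len(exceeded) >= 2
```

Requiring agreement across multiple vitals reduces false alerts from a single noisy sensor reading.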
  • the devices described herein can also be used as human flight recorders. While accident investigators (e.g., National Transportation Safety Board (NTSB) investigators) have traditionally been limited to analyzing voice recorders and, in some cases, black boxes after airplane and train crashes, the devices described herein, when worn by the operators of those vehicles, will provide insight into the state of the operator at the time of the crash. For example, by analyzing vital signs of the operator (e.g., the respiratory rate, heart rate, heart rate variability, and blood pressure of the operator) in the moments leading up to the crash, the investigators can learn whether the operator fell asleep, experienced some form of medical emergency, etc. This information is valuable for the investigators to determine whether the crash was the result of the operator's actions as opposed to some other cause, such as mechanical failure.
  • information related to the vital signs of the operator as well as information related to the operating characteristics of the car can be used to determine the cause of the accident, the mechanism of injury to the operator, and the impact of the injury to the operator.
  • the mental and/or physiological state of the operator before, during, and/or after the accident can be ascertained.
  • the 60 minutes following a traumatic injury is generally referred to as the “golden hour,” during which there is the highest likelihood that prompt medical treatment will prevent death. It is especially important to quickly gather vital information during this time to assist first responders and doctors in diagnosing and treating the operator.
  • the human flight recorder information can be used by third parties to determine who was at fault in creating the accident. For example, a law enforcement body may analyze the human flight recorder information to determine whether a tort or a crime was committed by an operator. In some implementations, the human flight recorder information can be used to determine an exact time when an event occurred. For example, the information can be used to determine an exact time of death, an exact time when a person went missing (e.g., by being abducted), or an exact time when a person fell down.
  • if a wearer experiences a medical event or condition, the data could be analyzed by his or her physician to help diagnose the condition. For example, if a wearer has a heart attack, the data could be analyzed to investigate the variation in the vital signs leading up to the attack. Other data can also be considered, such as the wearer's genetics, epigenetics, diet, exercise practice, and environmental circumstances surrounding the event or condition. This information may be correlated and used to prevent onset of similar conditions in the future, for example, by alerting the user of such a possibility upon detecting similar variations in vital signs.
  • the device is able to determine a “baseline biorhythm” of a wearer based on the wearer's vital signs in various circumstances and environmental conditions.
  • the baseline biorhythm is typically unique to each individual.
  • the device is able to detect when the wearer's vital signs are shifting away from the baseline biorhythm. For example, the device may detect that a wearer's biorhythm has gradually shifted over a particular time period, as indicated by variations in the wearer's vital signs.
  • the device may also detect that the wearer has spent minimal time outside over the same time period, as indicated by measurements from the device's ultraviolet light sensor. The device can identify a correlation between the wearer's changed biorhythm and the change in ultraviolet light exposure.
  • the device can identify a correlation between the wearer's changed biorhythm and changes in the weather. For example, the device can consider the wearer's location information in conjunction with weather information from the National Oceanic and Atmospheric Administration to determine the type of weather experienced by the wearer over a particular period of time. The device may identify that the wearer experiences higher BP and HR when the weather is cold and/or rainy and determine that such weather causes increased stress in the wearer.
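Identifying such a correlation reduces to a standard statistic. A plain Pearson correlation between daily shifts in a vital sign and an environmental series (UV exposure, outdoor temperature) is one simple way to do it; the function below is a generic sketch, not the method from the disclosure.

```python
def pearson_r(xs, ys):
    """Pearson correlation between two equal-length series, e.g. daily
    heart-rate deltas vs. daily UV exposure readings."""
    n = len(xs)
    mx, my = sum(xs) / n, sum(ys) / n
    cov = sum((x - mx) * (y - my) for x, y in zip(xs, ys))
    vx = sum((x - mx) ** 2 for x in xs)
    vy = sum((y - my) ** 2 for y in ys)
    return cov / (vx * vy) ** 0.5
```

A value near +1 or -1 would support flagging the environmental factor as a likely driver of the biorhythm shift.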
  • the device can include a temperature sensor for determining the skin temperature of the wearer and an ambient temperature sensor for detecting the ambient temperature.
  • the processor can be programmed to estimate the wearer's core temperature as a function of the measured skin temperature and ambient temperature (e.g., based on the difference between the skin temperature and the ambient temperature).
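A minimal version of that estimate, assuming a linear model and a made-up transfer coefficient `k`:

```python
def estimate_core_temp(skin_c, ambient_c, k=0.16):
    """Estimate core temperature (deg C) from skin and ambient readings.

    Implements the skin-minus-ambient idea above; the linear form and
    the coefficient k are illustrative assumptions, not calibrated values.
    """
    return skin_c + k * (skin_c - ambient_c)
```

With k = 0.16, a skin reading of 35.0 C in a 22.5 C room yields an estimated core temperature of 37.0 C; a real device would calibrate the model per wearer and sensor placement.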
  • the processor can be programmed to use this data to predict medical conditions before they happen. For example, the heart rate, heart rate variability, and blood pressure of the wearer can be monitored and processed by the processor to make such predictions.
  • a medical event that can be predicted in a subject is tachycardia. Tachycardia occurs when a subject's heart rate exceeds 100 beats per minute. If a subject's heart rate is trending upwards, a prediction can be made as to when the subject will experience tachycardia.
  • Other examples of medical events that can be predicted are hypertension and stroke.
  • Hypertension is diagnosed when a subject's blood pressure exceeds 140/90 mmHg. If the subject's blood pressure is increasing rapidly, a prediction can be made as to when the subject will have a high likelihood of experiencing a stroke.
  • if a subject's blood pressure is decreasing rapidly (e.g., if the rate of change of the blood pressure is negative and below a threshold), a prediction can be made as to whether the subject will have a heart condition.
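One way to turn an upward heart-rate trend into a time-to-tachycardia prediction is a least-squares extrapolation over recent samples. This is a sketch of the idea; only the 100 bpm threshold comes from the text, while the sampling scheme and function name are assumptions.

```python
def minutes_until_tachycardia(times_min, hr_samples, threshold=100.0):
    """Extrapolate when heart rate will cross `threshold` (100 bpm).

    Fits a least-squares line to (time, heart rate) samples and returns
    the projected crossing time in minutes, or None if not trending up.
    """
    n = len(times_min)
    mt, mh = sum(times_min) / n, sum(hr_samples) / n
    num = sum((t - mt) * (h - mh) for t, h in zip(times_min, hr_samples))
    den = sum((t - mt) ** 2 for t in times_min)
    slope = num / den  # bpm rise per minute
    if slope <= 0:
        return None
    intercept = mh - slope * mt
    return (threshold - intercept) / slope
```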
  • when the heart rate variability of the subject is used to predict a medical event, whether the subject experiences arrhythmia (e.g., atrial fibrillation) can determine what an appropriate heart rate variability for the subject is. For example, a subject who experiences arrhythmia may have a high heart rate variability, but this may be normal given the subject's condition.
  • An example process 2800 of predicting a medical event of a subject is shown in FIG. 28 .
  • a machine, such as a processor, that receives information from the optical sensors 110 of the device 100 can perform one or more steps of the process 2800 .
  • the machine can include the computing device 115 described above with reference to FIG. 1B .
  • data in a first dataset that represents time-varying information about at least one pulse pressure wave propagating through blood in a subject can be processed ( 2802 ).
  • Data in a second dataset that represents time-varying information about motion of the subject can also be processed.
  • the data can be acquired at a location of the subject (e.g., the arm or the wrist of the subject).
  • a medical event of the subject can then be predicted ( 2804 ).
  • the medical event can be predicted based on the processed data.
  • Medical events that can be predicted include tachycardia, hypertension, stroke, and heart conditions.
  • the processor can also be programmed to ensure that the wearer of the device is adhering to a prescribed medication regimen. For example, for wearers who are prescribed blood pressure medication, the processor can be programmed to monitor the blood pressure of the wearer and to alert the wearer if, based on the blood pressure data, it appears that the wearer forgot to take his or her medication.
  • the device can be used in this manner to monitor a wearer's adherence to a prescribed medication schedule for any of various other medications that impact the various different vital signs monitored by the device.
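A missed-dose check along those lines could watch for blood pressure staying above the wearer's medicated baseline. The windowing logic and tolerance below are assumptions for illustration:

```python
def missed_dose_alert(systolic_readings, medicated_baseline, tolerance=10.0):
    """Return True when every recent systolic reading sits above the
    wearer's on-medication baseline plus a tolerance (values assumed),
    suggesting the wearer may have forgotten a dose.
    """
    return all(bp > medicated_baseline + tolerance
               for bp in systolic_readings)
```

Requiring the whole window to be elevated avoids alerting on a single transient spike (e.g., right after exercise).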
  • the processor can also be programmed to determine the effectiveness of a medication. For example, in the context of inhalation medications, it is unknown if generic inhalation medications have the same effectiveness as brand name inhalation medications. One reason for this is that environmental and genetic makeups are generally different between users.
  • the processors can be programmed to monitor the heart rate and the blood oxygenation (SpO2) of wearers of devices who are prescribed generic inhalation medication and wearers of devices who are prescribed name brand inhalation medication.
  • the processors can also consider data related to environment and genetic makeups of the wearers. Data related to the effects of the inhalation medication on the wearers can be used to determine the effectiveness of the generic inhalation medication compared to the effectiveness of the name brand inhalation medication.
  • the device can be used in this manner to monitor the effectiveness of any of various other medications that impact the various different vital signs monitored by the device.
  • the processor can determine a correlation between a particular medication's effectiveness and environmental factors. For example, two wearers of the device who reside in two different extreme environments (e.g., Alaska and Florida) may experience different effects from the particular medication. Differences in the medication's effectiveness may be attributed to the different extreme environments experienced by the wearers. For example, the processor can determine a correlation between the particular medication's effectiveness and the environmental temperature experienced by the wearer.
  • the device may identify a correlation between a particular medication's effectiveness and other environmental factors. For example, differences in a medication's effectiveness between two users may be attributed to the food that people generally eat in a particular region, thereby allowing the device to identify food-drug interaction information related to the medication.
  • the device can be configured to determine an optimal timing and dosage regimen for a particular wearer by monitoring the wearer's vitals while the wearer is under the influence of the medication. For example, a wearer may take a medication to maintain his or her blood pressure below a particular level.
  • the device may determine that the wearer's blood pressure was reduced too much, and recommend that the wearer take a smaller dose the next day. The following day, the wearer may take the dosage amount recommended by the device. The device may determine that the wearer's blood pressure was reduced to the ideal level, but that the wearer may need to take a second small dose of the medication to maintain his or her blood pressure at the ideal level over the course of the day. In this way, the device can continuously refine the wearer's dosage regimen to be custom tailored to the wearer. The device can be used in this manner to determine an optimal dosage regimen for any of various other medications that impact the various different vital signs monitored by the device as described herein.
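The day-over-day refinement described above amounts to a simple titration rule. The step size and tolerance band here are illustrative assumptions, and any real regimen change would of course come from a clinician:

```python
def adjust_dose(dose_mg, observed_bp, target_bp, band=5.0, step_mg=2.5):
    """Suggest the next dose from the observed blood-pressure response.

    Reduces the dose if pressure fell more than `band` below target
    (reduced too much), raises it if still above target + band, and
    otherwise keeps the current dose.
    """
    if observed_bp < target_bp - band:
        return max(0.0, dose_mg - step_mg)
    if observed_bp > target_bp + band:
        return dose_mg + step_mg
    return dose_mg
```

Applied daily, the rule converges toward the smallest dose that holds the wearer's blood pressure inside the target band.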
  • the processor can determine an optimal time for a wearer of the device to take a medication. For example, a doctor typically tells a patient to take particular medications at particular times of the day or under particular circumstances (e.g., in the morning, in the evening, with food, etc.). Such blanket directions do not typically apply to all patients under all circumstances.
  • the processor can monitor the vital signs of the wearer of the device to determine the optimal time for the wearer to take the medication under the current circumstances.
  • the processor can consider characteristics of the particular medication when making the determination.
  • the wearer of the device may take a medication that has a tendency to cause the wearer to be energetic.
  • a doctor may suggest that the medication be taken no later than 3:00 pm to prevent disruption of the wearer's sleep.
  • the processor may determine that the wearer is more energized than usual.
  • the processor may recommend that the wearer take the medication earlier than usual to prevent the wearer from becoming too energized and having his or her sleep disrupted later.
  • An example process 2900 of providing information about a medication regimen of a subject is shown in FIG. 29 .
  • a machine, such as a processor, that receives information from the optical sensors 110 of the device 100 can perform one or more steps of the process 2900 .
  • the machine can include the computing device 115 described above with reference to FIG. 1B .
  • data in a first dataset that represents time-varying information about at least one pulse pressure wave propagating through blood in a subject can be processed ( 2902 ).
  • Data in a second dataset that represents time-varying information about motion of the subject can also be processed.
  • the data can be acquired at a location of the subject (e.g., the arm or the wrist of the subject).
US14/522,398 2013-10-23 2014-10-23 Alertness Detection Abandoned US20150112159A1 (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
US14/522,398 US20150112159A1 (en) 2013-10-23 2014-10-23 Alertness Detection

Applications Claiming Priority (3)

Application Number Priority Date Filing Date Title
US201361894884P 2013-10-23 2013-10-23
US201462002531P 2014-05-23 2014-05-23
US14/522,398 US20150112159A1 (en) 2013-10-23 2014-10-23 Alertness Detection

Publications (1)

Publication Number Publication Date
US20150112159A1 true US20150112159A1 (en) 2015-04-23

Family

ID=52825691

Family Applications (10)

Application Number Title Priority Date Filing Date
US14/521,897 Abandoned US20150112208A1 (en) 2013-10-23 2014-10-23 Medication management
US14/522,230 Expired - Fee Related US9396643B2 (en) 2013-10-23 2014-10-23 Biometric authentication
US14/522,398 Abandoned US20150112159A1 (en) 2013-10-23 2014-10-23 Alertness Detection
US14/521,829 Abandoned US20150112606A1 (en) 2013-10-23 2014-10-23 Calculating Pulse Transit Time
US14/522,132 Abandoned US20150112158A1 (en) 2013-10-23 2014-10-23 Health Metrics
US14/521,823 Abandoned US20150112156A1 (en) 2013-10-23 2014-10-23 Predicting medical events
US14/521,767 Abandoned US20150112154A1 (en) 2013-10-23 2014-10-23 Biometrics in risk situations
US14/521,907 Abandoned US20150112157A1 (en) 2013-10-23 2014-10-23 Arrhythmia detection
US14/522,157 Expired - Fee Related US9396642B2 (en) 2013-10-23 2014-10-23 Control using connected biometric devices
US14/521,822 Abandoned US20150112155A1 (en) 2013-10-23 2014-10-23 Sleep parameters

Family Applications Before (2)

Application Number Title Priority Date Filing Date
US14/521,897 Abandoned US20150112208A1 (en) 2013-10-23 2014-10-23 Medication management
US14/522,230 Expired - Fee Related US9396643B2 (en) 2013-10-23 2014-10-23 Biometric authentication

Family Applications After (7)

Application Number Title Priority Date Filing Date
US14/521,829 Abandoned US20150112606A1 (en) 2013-10-23 2014-10-23 Calculating Pulse Transit Time
US14/522,132 Abandoned US20150112158A1 (en) 2013-10-23 2014-10-23 Health Metrics
US14/521,823 Abandoned US20150112156A1 (en) 2013-10-23 2014-10-23 Predicting medical events
US14/521,767 Abandoned US20150112154A1 (en) 2013-10-23 2014-10-23 Biometrics in risk situations
US14/521,907 Abandoned US20150112157A1 (en) 2013-10-23 2014-10-23 Arrhythmia detection
US14/522,157 Expired - Fee Related US9396642B2 (en) 2013-10-23 2014-10-23 Control using connected biometric devices
US14/521,822 Abandoned US20150112155A1 (en) 2013-10-23 2014-10-23 Sleep parameters

Country Status (6)

Country Link
US (10) US20150112208A1 (fr)
EP (1) EP3060107A1 (fr)
JP (1) JP2016538097A (fr)
KR (1) KR20160075677A (fr)
CA (1) CA2928197A1 (fr)
WO (1) WO2015061579A1 (fr)


Families Citing this family (210)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US9946844B2 (en) * 2013-02-22 2018-04-17 Cloud Dx, Inc. Systems and methods for monitoring patient medication adherence
US11872053B1 (en) * 2013-02-22 2024-01-16 Cloud Dx, Inc. Systems and methods for monitoring medication effectiveness
US10463299B1 (en) * 2013-02-22 2019-11-05 Cloud Dx, Inc. Systems and methods for monitoring medication effectiveness
US11612352B1 (en) * 2013-02-22 2023-03-28 Cloud Dx, Inc. Systems and methods for monitoring medication effectiveness
RU2522400C1 (ru) * 2013-04-05 2014-07-10 Общество С Ограниченной Ответственностью "Хилби" Способ определения фазы сна человека, благоприятной для пробуждения
US11693383B1 (en) * 2013-05-31 2023-07-04 Signify Holding B.V. Systems and methods for providing hub-based motion detection using distributed, light-based motion sensors
US9674949B1 (en) 2013-08-27 2017-06-06 Flextronics Ap, Llc Method of making stretchable interconnect using magnet wires
US9554465B1 (en) 2013-08-27 2017-01-24 Flextronics Ap, Llc Stretchable conductor design and methods of making
US10231333B1 (en) 2013-08-27 2019-03-12 Flextronics Ap, Llc. Copper interconnect for PTH components assembly
US10786161B1 (en) 2013-11-27 2020-09-29 Bodymatter, Inc. Method for collection of blood pressure measurement
JP6075277B2 (ja) * 2013-12-04 2017-02-08 オムロンヘルスケア株式会社 ユーザ認証システム
US9521748B1 (en) 2013-12-09 2016-12-13 Multek Technologies, Ltd. Mechanical measures to limit stress and strain in deformable electronics
US9338915B1 (en) 2013-12-09 2016-05-10 Flextronics Ap, Llc Method of attaching electronic module on fabrics by stitching plated through holes
US9659478B1 (en) * 2013-12-16 2017-05-23 Multek Technologies, Ltd. Wearable electronic stress and strain indicator
US9575560B2 (en) 2014-06-03 2017-02-21 Google Inc. Radar-based gesture-recognition through a wearable device
US9694156B2 (en) 2014-06-05 2017-07-04 Eight Sleep Inc. Bed device system and methods
US9981107B2 (en) 2014-06-05 2018-05-29 Eight Sleep Inc. Methods and systems for gathering and analyzing human biological signals
EP3157414B1 (fr) 2014-06-18 2019-07-24 Nokia Technologies Oy Procédé, dispositif et agencement pour déterminer un temps de transit d'impulsion
US10278638B2 (en) * 2014-07-21 2019-05-07 Withings System and method to monitor and assist individual's sleep
US9921660B2 (en) 2014-08-07 2018-03-20 Google Llc Radar-based gesture recognition
US9811164B2 (en) 2014-08-07 2017-11-07 Google Inc. Radar-based gesture sensing and data transmission
US9588625B2 (en) 2014-08-15 2017-03-07 Google Inc. Interactive textiles
US10268321B2 (en) 2014-08-15 2019-04-23 Google Llc Interactive textiles within hard objects
US9451893B2 (en) 2014-08-18 2016-09-27 Cameron Health, Inc. Calculation of self-correlation in an implantable cardiac device
US11169988B2 (en) 2014-08-22 2021-11-09 Google Llc Radar recognition-aided search
US9778749B2 (en) 2014-08-22 2017-10-03 Google Inc. Occluded gesture recognition
CN107072538B (zh) 2014-09-08 2021-07-13 苹果公司 将脉搏传导时间(ptt)测量系统电耦接到心脏以用于血压测量
US10517489B2 (en) 2014-09-08 2019-12-31 Apple Inc. Wrist worn accelerometer for pulse transit time (PTT) measurements of blood pressure
CN107106054B (zh) 2014-09-08 2021-11-02 苹果公司 使用多功能腕戴式设备进行血压监测
US10702171B2 (en) 2014-09-08 2020-07-07 Apple Inc. Systems, devices, and methods for measuring blood pressure of a user
KR20170057313A (ko) 2014-09-09 2017-05-24 토르벡 인코포레이티드 웨어러블 디바이스를 활용하여 개인의 각성을 모니터링하고 통보를 제공하기 위한 방법들 및 장치
US20170277858A1 (en) * 2014-09-19 2017-09-28 Shinano Kenshi Co., Ltd. System for predicting risk of onset of cerebrovascular disease
US9600080B2 (en) 2014-10-02 2017-03-21 Google Inc. Non-line-of-sight radar-based gesture recognition
WO2016065476A1 (fr) * 2014-10-30 2016-05-06 2352409 Ontario Inc. Dispositif portable et procédé de surveillance non invasive continue de la pression artérielle et d'autres paramètres physiologiques, avec réduction des artéfacts de mouvement
USD772903S1 (en) * 2014-11-14 2016-11-29 Volvo Car Corporation Display screen with transitional graphical user interface
USD772905S1 (en) * 2014-11-14 2016-11-29 Volvo Car Corporation Display screen with graphical user interface
USD772260S1 (en) * 2014-11-14 2016-11-22 Volvo Car Corporation Display screen with graphical user interface
USD772904S1 (en) * 2014-11-14 2016-11-29 Volvo Car Corporation Display screen with transitional graphical user interface
US10168430B2 (en) 2014-11-17 2019-01-01 Adam Sobol Wireless devices and systems for tracking patients and methods for using the like
US20170354331A1 (en) 2014-11-17 2017-12-14 Rochester Institute Of Technology Blood Pressure and Arterial Compliance Estimation from Arterial Segments
KR102364842B1 (ko) * 2014-12-02 2022-02-18 삼성전자주식회사 맥파 측정 장치 및 방법
CN107427267B (zh) * 2014-12-30 2021-07-23 日东电工株式会社 用于导出主体的精神状态的方法和装置
US10064582B2 (en) 2015-01-19 2018-09-04 Google Llc Noninvasive determination of cardiac health and other functional states and trends for human physiological systems
US10016162B1 (en) 2015-03-23 2018-07-10 Google Llc In-ear health monitoring
EP3073400B1 (fr) * 2015-03-25 2022-05-04 Tata Consultancy Services Limited Système et procédé permettant de déterminer le stress psychologique d'une personne
WO2016154256A1 (fr) * 2015-03-25 2016-09-29 Quanttus, Inc. Mesure de pression sanguine sans contact
US9983747B2 (en) 2015-03-26 2018-05-29 Google Llc Two-layer interactive textiles
US9848780B1 (en) * 2015-04-08 2017-12-26 Google Inc. Assessing cardiovascular function using an optical sensor
KR102002112B1 (ko) 2015-04-30 2019-07-19 구글 엘엘씨 제스처 추적 및 인식을 위한 rf―기반 마이크로―모션 추적
CN111880650A (zh) 2015-04-30 2020-11-03 谷歌有限责任公司 基于宽场雷达的手势识别
KR102327044B1 (ko) 2015-04-30 2021-11-15 구글 엘엘씨 타입-애그노스틱 rf 신호 표현들
CA2985452A1 (fr) * 2015-05-08 2016-11-17 Eight Sleep Inc. Systeme d'alarme par vibration et procedes de fonctionnement
US20160361032A1 (en) * 2015-05-14 2016-12-15 Abraham Carter Systems and Methods for Wearable Health Alerts
US10080528B2 (en) 2015-05-19 2018-09-25 Google Llc Optical central venous pressure measurement
US9693592B2 (en) 2015-05-27 2017-07-04 Google Inc. Attaching electronic components to interactive textiles
US10088908B1 (en) 2015-05-27 2018-10-02 Google Llc Gesture detection and interactions
US11147505B1 (en) 2015-06-01 2021-10-19 Verily Life Sciences Llc Methods, systems and devices for identifying an abnormal sleep condition
US10376195B1 (en) 2015-06-04 2019-08-13 Google Llc Automated nursing assessment
US10376163B1 (en) 2015-06-14 2019-08-13 Facense Ltd. Blood pressure from inward-facing head-mounted cameras
US10638938B1 (en) 2015-06-14 2020-05-05 Facense Ltd. Eyeglasses to detect abnormal medical events including stroke and migraine
US11103139B2 (en) 2015-06-14 2021-08-31 Facense Ltd. Detecting fever from video images and a baseline
US10349887B1 (en) 2015-06-14 2019-07-16 Facense Ltd. Blood pressure measuring smartglasses
US11154203B2 (en) 2015-06-14 2021-10-26 Facense Ltd. Detecting fever from images and temperatures
US11103140B2 (en) 2015-06-14 2021-08-31 Facense Ltd. Monitoring blood sugar level with a comfortable head-mounted device
US10799122B2 (en) 2015-06-14 2020-10-13 Facense Ltd. Utilizing correlations between PPG signals and iPPG signals to improve detection of physiological responses
US11064892B2 (en) 2015-06-14 2021-07-20 Facense Ltd. Detecting a transient ischemic attack using photoplethysmogram signals
US10667697B2 (en) 2015-06-14 2020-06-02 Facense Ltd. Identification of posture-related syncope using head-mounted sensors
US10791938B2 (en) 2015-06-14 2020-10-06 Facense Ltd. Smartglasses for detecting congestive heart failure
US11903680B2 (en) * 2015-06-14 2024-02-20 Facense Ltd. Wearable-based health state verification for physical access authorization
KR102436726B1 (ko) * 2015-06-15 2022-08-26 Samsung Electronics Co., Ltd. Method and apparatus for assessing physiological aging level
US10194871B2 (en) 2015-09-25 2019-02-05 Sanmina Corporation Vehicular health monitoring system and method
US10307100B2 (en) * 2015-07-20 2019-06-04 iFeel Healthy Ltd. Methods and systems of controlling a subject's body feature having a periodic wave function
US9955303B2 (en) 2015-07-21 2018-04-24 IP Funding Group, LLC Determining relative position with a BLE beacon
US10678890B2 (en) 2015-08-06 2020-06-09 Microsoft Technology Licensing, Llc Client computing device health-related suggestions
FI126600B (en) * 2015-08-10 2017-03-15 Murata Manufacturing Co Detection of sleep phenomena using ballistocardiography
KR101696602B1 (ko) * 2015-08-11 2017-01-23 Suprema Inc. Biometric authentication using gestures
US10582890B2 (en) 2015-08-28 2020-03-10 Awarables Inc. Visualizing, scoring, recording, and analyzing sleep data and hypnograms
US10321871B2 (en) 2015-08-28 2019-06-18 Awarables Inc. Determining sleep stages and sleep events using sensor data
US10709371B2 (en) 2015-09-09 2020-07-14 WellBrain, Inc. System and methods for serving a custom meditation program to a patient
CN107847164B (zh) 2015-09-29 2021-01-08 Apple Inc. Pressure measurement design
US10881307B1 (en) 2015-09-29 2021-01-05 Apple Inc. Devices and systems for correcting errors in blood pressure measurements
ES2607721B2 (es) * 2015-10-02 2019-07-04 Univ Catalunya Politecnica Method and apparatus for estimating aortic pulse transit time from time intervals measured between fiducial points of the ballistocardiogram
US10817065B1 (en) 2015-10-06 2020-10-27 Google Llc Gesture recognition using multiple antenna
CN105167759A (zh) * 2015-10-09 2015-12-23 Xie Hongwu Smartphone-based human pulse wave velocity measurement method and system
US20170126613A1 (en) * 2015-11-03 2017-05-04 Joiiup Technology Inc. Instant information exchange system and method for online sports teams
US9837760B2 (en) 2015-11-04 2017-12-05 Google Inc. Connectors for connecting electronics embedded in garments to external devices
US10105092B2 (en) 2015-11-16 2018-10-23 Eight Sleep Inc. Detecting sleeping disorders
US10154932B2 (en) 2015-11-16 2018-12-18 Eight Sleep Inc. Adjustable bedframe and operating methods for health monitoring
US10321831B2 (en) 2015-11-25 2019-06-18 Texas Instruments Incorporated Heart rate estimation apparatus with state sequence optimization
US10758185B2 (en) 2015-11-25 2020-09-01 Texas Instruments Incorporated Heart rate estimation apparatus using digital automatic gain control
USD781881S1 (en) * 2015-12-09 2017-03-21 Facebook, Inc. Display screen with animated graphical user interface
US9892247B2 (en) * 2015-12-30 2018-02-13 Motorola Mobility Llc Multimodal biometric authentication system and method with photoplethysmography (PPG) bulk absorption biometric
US10973422B2 (en) * 2016-01-22 2021-04-13 Fitbit, Inc. Photoplethysmography-based pulse wave analysis using a wearable device
US11589758B2 (en) 2016-01-25 2023-02-28 Fitbit, Inc. Calibration of pulse-transit-time to blood pressure model using multiple physiological sensors and various methods for blood pressure variation
JP6642055B2 (ja) * 2016-02-02 2020-02-05 Fujitsu Limited Sensor information processing device, sensor unit, and sensor information processing program
US10803145B2 (en) * 2016-02-05 2020-10-13 The Intellectual Property Network, Inc. Triggered responses based on real-time electroencephalography
KR20170096323A (ko) * 2016-02-16 2017-08-24 Samsung Electronics Co., Ltd. Method and apparatus for providing the degree of alignment between a user's actual life rhythm and circadian rhythm
CA3014812A1 (fr) * 2016-02-18 2017-08-24 Curaegis Technologies, Inc. Alertness prediction system and method
CN105748051B (zh) * 2016-02-18 2018-10-09 BOE Technology Group Co., Ltd. Blood pressure measurement device
US11179049B2 (en) 2016-02-29 2021-11-23 Fitbit, Inc. Intelligent inflatable cuff for arm-based blood pressure measurement
US10747850B2 (en) 2016-03-29 2020-08-18 International Business Machines Corporation Medication scheduling and alerts
CN105877941A (zh) * 2016-04-06 2016-08-24 Jilin University Neurosurgical bed auxiliary device
US20170290526A1 (en) * 2016-04-07 2017-10-12 Oregon Health & Science University Telecentive spirometer
US9762581B1 (en) 2016-04-15 2017-09-12 Striiv, Inc. Multifactor authentication through wearable electronic device
JP6721155B2 (ja) 2016-04-15 2020-07-08 Omron Corporation Biological information analysis device, system, and program
CA3027168C (fr) 2016-04-27 2021-03-30 BRYX, Inc. Method, apparatus, and computer-readable medium for facilitating an emergency response
US10492302B2 (en) 2016-05-03 2019-11-26 Google Llc Connecting an electronic component to an interactive textile
KR20170124943A (ko) * 2016-05-03 2017-11-13 Samsung Electronics Co., Ltd. Apparatus and method for extracting cardiovascular characteristics
CN107872965B (zh) 2016-05-09 2021-08-06 Belun Technology Company Limited Wearable device for healthcare and method thereof
WO2017200570A1 (fr) 2016-05-16 2017-11-23 Google Llc Interactive object with multiple electronics modules
US10463295B2 (en) * 2016-06-13 2019-11-05 Medtronic, Inc. Multi-parameter prediction of acute cardiac episodes and attacks
DE102016211197B4 (de) * 2016-06-22 2018-11-08 Audi Ag Authentication by means of vital parameters
US10426411B2 (en) * 2016-06-29 2019-10-01 Samsung Electronics Co., Ltd. System and method for providing a real-time signal segmentation and fiducial points alignment framework
JP6694649B2 (ja) * 2016-07-07 2020-05-20 National Institute of Advanced Industrial Science and Technology (AIST) Physiological state determination device, physiological state determination method, program for a physiological state determination device, and physiological state determination system
EP3481293A4 (fr) * 2016-07-11 2020-03-04 Mc10, Inc. Multi-sensor blood pressure measurement system
US11064893B2 (en) 2016-07-20 2021-07-20 Samsung Electronics Co., Ltd. Real time authentication based on blood flow parameters
KR101814382B1 (ko) * 2016-08-05 2018-01-04 University of Ulsan Industry-Academic Cooperation Foundation Apparatus and method for diagnosing blood circulation disorders
DE102016215250A1 (de) * 2016-08-16 2018-02-22 Audi Ag Method for operating a motor vehicle using a user's mobile terminal and physiological vital data
US10602964B2 (en) * 2016-08-17 2020-03-31 Koninklijke Philips N.V. Location, activity, and health compliance monitoring using multidimensional context analysis
US11419509B1 (en) 2016-08-18 2022-08-23 Verily Life Sciences Llc Portable monitor for heart rate detection
US11207021B2 (en) * 2016-09-06 2021-12-28 Fitbit, Inc. Methods and systems for labeling sleep states
US10517527B2 (en) 2016-09-16 2019-12-31 Bose Corporation Sleep quality scoring and improvement
US10653856B2 (en) 2016-09-16 2020-05-19 Bose Corporation Sleep system
US10963146B2 (en) 2016-09-16 2021-03-30 Bose Corporation User interface for a sleep system
US10434279B2 (en) 2016-09-16 2019-10-08 Bose Corporation Sleep assistance device
US10478590B2 (en) 2016-09-16 2019-11-19 Bose Corporation Sleep assistance device for multiple users
US10561362B2 (en) * 2016-09-16 2020-02-18 Bose Corporation Sleep assessment using a home sleep system
US11594111B2 (en) 2016-09-16 2023-02-28 Bose Corporation Intelligent wake-up system
CN107865647B (zh) * 2016-09-28 2020-01-14 BOE Technology Group Co., Ltd. Blood pressure detection device and calibration method for the blood pressure detection device
CN106473750B (zh) * 2016-10-08 2019-03-26 Xidian University Identity recognition method based on the optimal periodic waveform of the photoplethysmographic pulse wave
TWI594728B (zh) * 2016-10-14 2017-08-11 麗寶大數據股份有限公司 Carpet-type body fat scale structure
KR102655670B1 (ko) 2016-10-25 2024-04-05 Samsung Electronics Co., Ltd. Apparatus and method for assessing biosignal quality, and apparatus and method for optimizing biosignal measurement parameters
US10716518B2 (en) 2016-11-01 2020-07-21 Microsoft Technology Licensing, Llc Blood pressure estimation by wearable computing device
US10579150B2 (en) 2016-12-05 2020-03-03 Google Llc Concurrent detection of absolute distance and relative movement for sensing action gestures
US20180174146A1 (en) * 2016-12-15 2018-06-21 Parveen Bansal Situational access override
US11670422B2 (en) 2017-01-13 2023-06-06 Microsoft Technology Licensing, Llc Machine-learning models for predicting decompensation risk
JP6702559B2 (ja) * 2017-02-10 2020-06-03 Toshiba Corporation Electronic device, method, and program
US10749863B2 (en) * 2017-02-22 2020-08-18 Intel Corporation System, apparatus and method for providing contextual data in a biometric authentication system
KR102002638B1 (ko) * 2017-03-06 2019-07-22 Keimyung University Industry-Academic Cooperation Foundation Method and apparatus for diagnosing a driver's arrhythmia using electrocardiogram signals
US11123014B2 (en) 2017-03-21 2021-09-21 Stryker Corporation Systems and methods for ambient energy powered physiological parameter monitoring
US10939833B2 (en) * 2017-05-01 2021-03-09 Samsung Electronics Company, Ltd. Determining artery location using camera-based sensing
US10699247B2 (en) 2017-05-16 2020-06-30 Under Armour, Inc. Systems and methods for providing health task notifications
US11266346B2 (en) * 2017-06-07 2022-03-08 Electronics And Telecommunications Research Institute Method and apparatus for determining sleep state using biometric information and motion information
WO2018235440A1 (fr) * 2017-06-22 2018-12-27 Sharp Corporation Device and method for managing biological states
EP3417770A1 (fr) * 2017-06-23 2018-12-26 Koninklijke Philips N.V. Device, system, and method for detecting a pulse and/or pulse-related information of a patient
US10869627B2 (en) * 2017-07-05 2020-12-22 Osr Enterprises Ag System and method for fusing information related to a driver of a vehicle
DE102017211631A1 (de) * 2017-07-07 2019-01-10 Bundesdruckerei Gmbh Electronic system and method for classifying a physiological state
US10896375B2 (en) * 2017-07-11 2021-01-19 International Business Machines Corporation Cognitive replication through augmented reality
US11944416B2 (en) 2017-07-26 2024-04-02 Nitto Denko Corporation Photoplethysmography (PPG) apparatus and method for determining physiological changes
CN107516075B (zh) 2017-08-03 2020-10-09 Anhui Huami Intelligent Technology Co., Ltd. Electrocardiogram signal detection method, apparatus, and electronic device
CN110996796B (zh) * 2017-08-08 2023-01-31 Sony Corporation Information processing device, method, and program
CN107707525B (zh) * 2017-08-24 2020-06-19 Datang Terminal Technology Co., Ltd. Authentication method and device for an intercom terminal
WO2019046602A1 (fr) * 2017-08-30 2019-03-07 P Tech, Llc Artificial intelligence and/or virtual reality for activity optimization/personalization
KR101962812B1 (ko) * 2017-10-13 2019-03-28 Ajou University Industry-Academic Cooperation Foundation PPG-based REM sleep detection method and apparatus
JP7028002B2 (ja) * 2017-12-06 2022-03-02 Sintokogio, Ltd. Industrial machine start-up control system, start-up control method, and program
US10244985B1 (en) 2017-12-28 2019-04-02 Saleem Sayani Wearable diagnostic device
EP3505051A1 (fr) * 2017-12-29 2019-07-03 Sanmina Corporation In-vehicle health monitoring system and method
GB2584242B (en) 2018-01-09 2022-09-14 Eight Sleep Inc Systems and methods for detecting a biological signal of a user of an article of furniture
WO2019143953A1 (fr) 2018-01-19 2019-07-25 Eight Sleep Inc. Rest capsule
US11445986B2 (en) * 2018-01-30 2022-09-20 Gaia Connect Inc. Health monitor wearable device
US10284552B1 (en) 2018-06-06 2019-05-07 Capital One Services, Llc Systems and methods for using micro accelerations as a biometric identification factor
CN111867458B (zh) * 2018-03-15 2023-12-19 勃劢亚公司 System and method for cardiovascular health monitoring
JP2019187678A (ja) * 2018-04-23 2019-10-31 Nippon Telegraph and Telephone Corporation Exercise performance estimation device, exercise performance estimation method, and program
US11707225B2 (en) * 2018-04-27 2023-07-25 Samsung Electronics Co., Ltd. Bio-sensing based monitoring of health
CN108937860B (zh) * 2018-06-06 2021-02-02 Goertek Technology Co., Ltd. Motion state monitoring method, system, device, and storage medium
US10587615B2 (en) 2018-06-06 2020-03-10 Capital One Services, Llc Systems and methods for using micro accelerations as a biometric identification factor
KR102564269B1 (ko) 2018-06-07 2023-08-07 Samsung Electronics Co., Ltd. Electronic device for providing exercise information using biometric information and operating method thereof
US10380813B1 (en) 2018-07-19 2019-08-13 Capital One Services, Llc Systems and methods for using motion pattern of a user for authentication
US10621322B2 (en) 2018-08-02 2020-04-14 Capital One Services, Llc Platform for distinguishing human from machine input
TW202021528A (zh) * 2018-12-05 2020-06-16 Acer Inc. Method for obtaining arrhythmia information based on photoplethysmography signals and device for detecting arrhythmia
GB2579656A (en) 2018-12-11 2020-07-01 Ge Aviat Systems Ltd Method of assessing a pilot emotional state
EP3900630A4 (fr) * 2018-12-19 2021-12-22 NEC Corporation Information processing apparatus, wearable device, information processing method, and storage medium
KR20200078795A (ko) 2018-12-21 2020-07-02 Samsung Electronics Co., Ltd. Apparatus and method for estimating blood pressure
US10957140B2 (en) * 2018-12-28 2021-03-23 Intel Corporation Multi-factor biometric authentication
US11445927B2 (en) 2019-02-13 2022-09-20 Viavi Solutions Inc. Baseline correction and extraction of heartbeat profiles
KR102243012B1 (ko) * 2019-02-13 2021-04-22 YKC Tech Co., Ltd. Method for diagnosing vascular elasticity and arrhythmia using skin images
EP3701863A1 (fr) * 2019-02-26 2020-09-02 Polar Electro Oy Electrocardiogram measurements
US10748656B1 (en) 2019-03-12 2020-08-18 Harmonize Inc. Population health platform
CN110025321B (zh) * 2019-03-20 2021-08-31 Huawei Technologies Co., Ltd. Psychological stress assessment method and related device
EP3941339A4 (fr) * 2019-03-22 2022-12-21 Sibel Inc. Wireless communication system for wearable medical sensors
JP7246609B2 (ja) * 2019-03-28 2023-03-28 Kyocera Document Solutions Inc. Image forming apparatus
US20220157467A1 (en) * 2019-04-02 2022-05-19 Myelin Foundry Private Limited System and method for predicting wellness metrics
KR102259285B1 (ko) * 2019-04-25 2021-06-01 Seoul National University R&DB Foundation Blood pressure measurement apparatus and method
KR102277105B1 (ko) * 2019-06-03 2021-07-14 Keimyung University Industry-Academic Cooperation Foundation Non-contact blood pressure measurement system using a camera and driving method thereof
CN112089423A (zh) * 2019-06-18 2020-12-18 Beijing Jingdong Shangke Information Technology Co., Ltd. Sleep information determination method, apparatus, and device
CN112168139B (zh) * 2019-07-05 2022-09-30 Tencent Technology (Shenzhen) Co., Ltd. Health monitoring method, device, and storage medium
US10559145B1 (en) * 2019-07-17 2020-02-11 Abdulaziz Mohammed Almehmadi Systems and methods for providing behavioral based intention detection
EP3998950A4 (fr) * 2019-07-19 2023-07-19 Barnacka, Anna System and method for heart rhythm detection and reporting
AU2020366348A1 (en) 2019-10-15 2022-05-12 Imperative Care, Inc. Systems and methods for multivariate stroke detection
CN114585300A (zh) * 2019-10-29 2022-06-03 Omron Healthcare Co., Ltd. Sphygmomanometer, blood pressure measurement method, and program
KR20210078283A (ko) 2019-12-18 2021-06-28 Samsung Electronics Co., Ltd. Electronic device for recognizing a user's gesture from sensor signals and gesture recognition method using the same
US20210275110A1 (en) * 2019-12-30 2021-09-09 RubyElf, LLC Systems For Synchronizing Different Devices To A Cardiac Cycle And For Generating Pulse Waveforms From Synchronized ECG and PPG Systems
US11599826B2 (en) * 2020-01-13 2023-03-07 International Business Machines Corporation Knowledge aided feature engineering
RU2740601C1 (ru) * 2020-03-05 2021-01-15 Chita State Medical Academy of the Ministry of Health of the Russian Federation Method for predicting the risk of developing ischemic stroke in women over 50
CA3177643A1 (fr) * 2020-04-14 2021-10-21 Rubyelf Llc System and method for measuring venous oxygen saturation using intelligent pulse averaging with integrated ECG and PPG sensors
US20230198779A1 (en) * 2020-05-04 2023-06-22 Hewlett-Packard Development Company, L.P. Partial signatures based on environmental characteristics
US11357411B2 (en) 2020-07-08 2022-06-14 Nec Corporation Of America Sensor fusion for measurement of physiological parameters
US11361445B2 (en) 2020-07-08 2022-06-14 Nec Corporation Of America Image analysis for detecting mask compliance
US20230346234A1 (en) * 2020-09-16 2023-11-02 Oregon Health & Science University Wearable photoplethysmography device for detecting clinical decompensation based on heart rate variability
US11666271B2 (en) * 2020-12-09 2023-06-06 Medtronic, Inc. Detection and monitoring of sleep apnea conditions
CN112967801A (zh) * 2021-01-28 2021-06-15 Anhui Huami Health Technology Co., Ltd. PAI value processing method, apparatus, device, and storage medium
KR102560787B1 (ko) 2021-02-04 2023-07-26 Samsung Electronics Co., Ltd. Apparatus and method for estimating biometric information, and electronic device including the same
US11468992B2 (en) 2021-02-04 2022-10-11 Harmonize Inc. Predicting adverse health events using a measure of adherence to a testing routine
KR102570742B1 (ko) * 2021-04-23 2023-08-24 Woo Jung-ha Smart lifesaving alarm system
KR20230078415A (ko) * 2021-11-26 2023-06-02 Samsung Electronics Co., Ltd. Electronic device and control method thereof
WO2023150749A1 (fr) * 2022-02-07 2023-08-10 Zoll Medical Corporation Patient engagement for wearable medical devices
GB2615359B (en) * 2022-02-08 2024-02-07 Xpo Health Ltd A wearable monitor with photoplethysmogram sensor for determining emotion level
JP2023158983A (ja) * 2022-04-19 2023-10-31 A&D Company, Limited Blood pressure measurement device
TWI822234B (zh) * 2022-08-08 2023-11-11 簡國隆 Disaster site control device and disaster site control system
WO2024049973A1 (fr) * 2022-09-02 2024-03-07 Board Of Regents Of The University Of Nebraska Systems and methods for determining pulse arrival time with wearable electronic devices

Citations (5)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US6791462B2 (en) * 2002-09-18 2004-09-14 Sang J. Choi Sleepy alarm system activated by heart pulse meter
US20120075122A1 (en) * 2010-09-24 2012-03-29 Honeywell International Inc. Alert generation and related aircraft operating methods
US20120203077A1 (en) * 2011-02-09 2012-08-09 David Da He Wearable Vital Signs Monitor
US20140073486A1 (en) * 2012-09-04 2014-03-13 Bobo Analytics, Inc. Systems, devices and methods for continuous heart rate monitoring and interpretation
US8725311B1 (en) * 2011-03-14 2014-05-13 American Vehicular Sciences, LLC Driver health and fatigue monitoring system and method

Family Cites Families (110)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US4913150A (en) 1986-08-18 1990-04-03 Physio-Control Corporation Method and apparatus for the automatic calibration of signals employed in oximetry
EP0267884A3 (fr) 1986-11-10 1990-01-17 Saccinni ved. Vigna, Luisa Device for connecting electrical cables
US4854699A (en) 1987-11-02 1989-08-08 Nippon Colin Co., Ltd. Backscatter oximeter
US5289824A (en) 1991-12-26 1994-03-01 Instromedix, Inc. Wrist-worn ECG monitor
FI92139C (fi) 1992-02-28 1994-10-10 Matti Myllymaeki Wrist-worn health monitoring device
US5692501A (en) 1993-09-20 1997-12-02 Minturn; Paul Scientific wellness personal/clinical/laboratory assessments, profile and health risk management system with insurability rankings on cross-correlated 10-point optical health/fitness/wellness scales
US5836884A (en) 1993-12-17 1998-11-17 Pulse Metric, Inc. Method for diagnosing, monitoring and treating hypertension and other cardiac problems
US5622178A (en) * 1994-05-04 1997-04-22 Spacelabs Medical, Inc. System and method for dynamically displaying cardiac interval data using scatter-plots
US6266623B1 (en) 1994-11-21 2001-07-24 Phatrat Technology, Inc. Sport monitoring apparatus for determining loft time, speed, power absorbed and other factors such as height
WO1997047239A1 (fr) 1996-06-12 1997-12-18 Seiko Epson Corporation Device for measuring calorie expenditure and device for measuring body temperature
US8734339B2 (en) * 1996-12-16 2014-05-27 Ip Holdings, Inc. Electronic skin patch for real time monitoring of cardiac activity and personal health management
US6008703A (en) 1997-01-31 1999-12-28 Massachusetts Institute Of Technology Digital compensation for wideband modulation of a phase locked loop frequency synthesizer
US5818788A (en) 1997-05-30 1998-10-06 Nec Corporation Circuit technique for logic integrated DRAM with SIMD architecture and a method for controlling low-power, high-speed and highly reliable operation
JP2004513669A (ja) 1999-10-08 2004-05-13 Healthetech, Inc. Integrated calorie management system
US6527711B1 (en) 1999-10-18 2003-03-04 Bodymedia, Inc. Wearable human physiological data sensors and reporting system therefor
US6480733B1 (en) 1999-11-10 2002-11-12 Pacesetter, Inc. Method for monitoring heart failure
FI115289B (fi) 2000-02-23 2005-04-15 Polar Electro Oy Measurement of the body's energy metabolism and glucose level
US6452149B1 (en) 2000-03-07 2002-09-17 Kabushiki Kaisha Toshiba Image input system including solid image sensing section and signal processing section
US7261690B2 (en) 2000-06-16 2007-08-28 Bodymedia, Inc. Apparatus for monitoring health, wellness and fitness
US7689437B1 (en) 2000-06-16 2010-03-30 Bodymedia, Inc. System for monitoring health, wellness and fitness
US6605038B1 (en) 2000-06-16 2003-08-12 Bodymedia, Inc. System for monitoring health, wellness and fitness
BR0111918B1 (pt) 2000-06-23 2010-11-30 Apparatus for monitoring and reporting human physiological information.
US6600471B2 (en) 2000-07-28 2003-07-29 Smal Camera Technologies, Inc. Precise MOS imager transfer function control for expanded dynamic range imaging
AU2002255568B8 (en) 2001-02-20 2014-01-09 Adidas Ag Modular personal network systems and methods
US7054679B2 (en) 2001-10-31 2006-05-30 Robert Hirsh Non-invasive method and device to monitor cardiac parameters
US7946959B2 (en) 2002-05-30 2011-05-24 Nike, Inc. Training scripts
US7020508B2 (en) 2002-08-22 2006-03-28 Bodymedia, Inc. Apparatus for detecting human physiological and contextual information
CZ2005209A3 (cs) * 2002-09-10 2005-12-14 Ivi Smart Technologies, Inc. Secure biometric identity verification
ES2562933T3 (es) 2002-10-09 2016-03-09 Bodymedia, Inc. Apparatus for detecting, receiving, deriving, and displaying human physiological and contextual information
US7349574B1 (en) 2002-10-11 2008-03-25 Sensata Technologies, Inc. System and method for processing non-linear image data from a digital imager
US7218966B2 (en) 2003-04-11 2007-05-15 Cardiac Pacemakers, Inc. Multi-parameter arrhythmia discrimination
US20070159926A1 (en) 2003-04-17 2007-07-12 Nike, Inc. Adaptive Watch
EP2319410A1 (fr) 2003-09-12 2011-05-11 BodyMedia, Inc. Apparatus for measuring cardiac parameters
DE102004032812B4 (de) 2003-11-11 2006-07-20 Dräger Safety AG & Co. KGaA Combination sensor for physiological measured variables
US7717848B2 (en) * 2004-03-16 2010-05-18 Medtronic, Inc. Collecting sleep quality information via a medical device
US20050209512A1 (en) 2004-03-16 2005-09-22 Heruth Kenneth T Detecting sleep
DK1734858T3 (da) 2004-03-22 2014-10-20 Bodymedia Inc Non-invasive temperature monitoring device
US8172761B1 (en) * 2004-09-28 2012-05-08 Impact Sports Technologies, Inc. Monitoring device with an accelerometer, method and system
JP4487730B2 (ja) 2004-11-02 2010-06-23 Hitachi, Ltd. Living condition notification system
US7254516B2 (en) 2004-12-17 2007-08-07 Nike, Inc. Multi-sensor monitoring of athletic performance
US7319425B2 (en) 2005-03-21 2008-01-15 Massachusetts Institute Of Technology Comparator-based switched capacitor circuit for scaled semiconductor fabrication processes
US20070010748A1 (en) * 2005-07-06 2007-01-11 Rauch Steven D Ambulatory monitors
US20070032731A1 (en) 2005-08-05 2007-02-08 Lovejoy Jeffrey L Non-invasive pulse rate detection via headphone mounted electrodes / monitoring system
US7534206B1 (en) 2005-09-19 2009-05-19 Garmin Ltd. Navigation-assisted fitness and dieting device
US7657307B2 (en) * 2005-10-31 2010-02-02 Medtronic, Inc. Method of and apparatus for classifying arrhythmias using scatter plot analysis
US20070197881A1 (en) * 2006-02-22 2007-08-23 Wolf James L Wireless Health Monitor Device and System with Cognition
US20070232454A1 (en) 2006-03-28 2007-10-04 David Kagan Fitness assessment
US8684922B2 (en) 2006-05-12 2014-04-01 Bao Tran Health monitoring system
US7539532B2 (en) 2006-05-12 2009-05-26 Bao Tran Cuffless blood pressure monitoring appliance
US8500636B2 (en) 2006-05-12 2013-08-06 Bao Tran Health monitoring appliance
US7558622B2 (en) 2006-05-24 2009-07-07 Bao Tran Mesh network stroke monitoring appliance
US8684900B2 (en) 2006-05-16 2014-04-01 Bao Tran Health monitoring appliance
US7539533B2 (en) 2006-05-16 2009-05-26 Bao Tran Mesh network monitoring appliance
US20090273467A1 (en) 2006-09-18 2009-11-05 Koninklijke Philips Electronics N. V. Ip based monitoring and alarming
US20080076972A1 (en) * 2006-09-21 2008-03-27 Apple Inc. Integrated sensors for tracking performance metrics
DE102006057709B4 (de) 2006-12-07 2015-04-02 Dräger Medical GmbH Device and method for determining a respiratory rate
KR20080069851A (ko) 2007-01-24 2008-07-29 Samsung Electronics Co., Ltd. Biosignal measurement sensor device, and headset device and pendant device including the sensor device
US7846104B2 (en) 2007-02-08 2010-12-07 Heart Force Medical Inc. Monitoring physiological condition and detecting abnormalities
US8275635B2 (en) 2007-02-16 2012-09-25 Bodymedia, Inc. Integration of lifeotypes with devices and systems
US20080287821A1 (en) * 2007-03-30 2008-11-20 Searete Llc, A Limited Liability Corporation Of The State Of Delaware Computational user-health testing
US8140154B2 (en) * 2007-06-13 2012-03-20 Zoll Medical Corporation Wearable medical treatment device
EP2185063B1 (fr) 2007-08-21 2017-11-08 University College Dublin, National University of Ireland Dublin Method and system for monitoring sleepiness
US8764653B2 (en) 2007-08-22 2014-07-01 Bozena Kaminska Apparatus for signal detection, processing and communication
WO2009108228A1 (fr) * 2008-02-25 2009-09-03 Kingsdown, Inc. Systems and methods for controlling a bedroom environment and providing sleep data
EP2116183B1 (fr) 2008-05-07 2012-02-01 CSEM Centre Suisse d'Electronique et de Microtechnique SA Robust ear-located opto-electronic cardiovascular monitoring device
US20100056878A1 (en) 2008-08-28 2010-03-04 Partin Dale L Indirectly coupled personal monitor for obtaining at least one physiological parameter of a subject
US20100076276A1 (en) 2008-09-25 2010-03-25 Nellcor Puritan Bennett Llc Medical Sensor, Display, and Technique For Using The Same
US8355769B2 (en) 2009-03-17 2013-01-15 Advanced Brain Monitoring, Inc. System for the assessment of sleep quality in adults and children
WO2010108287A1 (fr) 2009-03-23 2010-09-30 Hongyue Luo Wearable intelligent healthcare system and method
WO2010111489A2 (fr) 2009-03-27 2010-09-30 LifeWatch Corp. Methods and apparatuses for processing physiological data acquired from an ambulatory physiological monitoring unit
WO2010128500A2 (fr) * 2009-05-04 2010-11-11 Wellsense Technologies System and method for non-invasively monitoring blood glucose levels
US20100292589A1 (en) 2009-05-13 2010-11-18 Jesse Bruce Goodman Hypothenar sensor
US8738118B2 (en) * 2009-05-20 2014-05-27 Sotera Wireless, Inc. Cable system for generating signals for detecting motion and measuring vital signs
US8114026B2 (en) * 2009-06-23 2012-02-14 Infarct Reduction Technologies Inc. Methods and devices for remote ischemic preconditioning and near-continuous blood pressure monitoring
US20100331631A1 (en) 2009-06-30 2010-12-30 Nellcor Puritan Bennett Llc Oxygen saturation ear sensor design that optimizes both attachment method and signal quality
US20110066041A1 (en) * 2009-09-15 2011-03-17 Texas Instruments Incorporated Motion/activity, heart-rate and respiration from a single chest-worn sensor, circuits, devices, processes and systems
US8715206B2 (en) 2009-10-15 2014-05-06 Masimo Corporation Acoustic patient sensor
EP2490587A1 (fr) 2009-10-20 2012-08-29 Widemed Ltd. Method and system for detecting cardiac arrhythmia
US9585589B2 (en) * 2009-12-31 2017-03-07 Cerner Innovation, Inc. Computerized systems and methods for stability-theoretic prediction and prevention of sudden cardiac death
EP2523625B1 (fr) 2010-01-14 2017-03-08 PhysIQ Inc. Multivariate residual-based health index for human health monitoring
WO2011109716A2 (fr) * 2010-03-04 2011-09-09 Neumitra LLC Devices and methods for treating psychological disorders
WO2011113070A1 (fr) 2010-03-07 2011-09-15 Centauri Medical, INC. Systems, devices and methods for preventing, detecting and treating pressure-induced ischemia, pressure ulcers and other conditions
US8591411B2 (en) 2010-03-10 2013-11-26 Sotera Wireless, Inc. Body-worn vital sign monitor
JP5937072B2 (ja) 2010-07-21 2016-06-22 Koninklijke Philips N.V. Detection and monitoring of abdominal aortic aneurysm
US10137245B2 (en) 2010-08-17 2018-11-27 University Of Florida Research Foundation, Inc. Central site photoplethysmography, medication administration, and safety
US9167991B2 (en) 2010-09-30 2015-10-27 Fitbit, Inc. Portable monitoring devices and methods of operating same
US20120220835A1 (en) * 2011-02-14 2012-08-30 Wayne Chung Wireless physiological sensor system and method
US8519835B2 (en) 2011-03-02 2013-08-27 Htc Corporation Systems and methods for sensory feedback
US8568330B2 (en) 2011-03-08 2013-10-29 Pulsaw Informatics, Inc. Composite human physiological stress index based on heart beat and sleep and/or activity history data including actigraphy
US20140089672A1 (en) 2012-09-25 2014-03-27 Aliphcom Wearable device and method to generate biometric identifier for authentication using near-field communications
US20140257058A1 (en) 2011-10-19 2014-09-11 Scanadu Incorporated Automated personal medical diagnostic system, method, and arrangement
US10006896B2 (en) * 2011-11-14 2018-06-26 University of Pittsburgh—of the Commonwealth System of Higher Education Method, apparatus and system for food intake and physical activity assessment
EP2747649A1 (fr) 2011-12-20 2014-07-02 Koninklijke Philips N.V. Method and apparatus for monitoring a user's baroreceptor reflex
WO2013109188A1 (fr) 2012-01-16 2013-07-25 Agency For Science, Technology And Research Method and system for an optical blood pressure measurement apparatus
US9186077B2 (en) 2012-02-16 2015-11-17 Google Technology Holdings LLC Method and device with customizable power management
RU2518134C2 (ru) 2012-02-24 2014-06-10 Healbe Corporation Method for determining glucose concentration in human blood
US10219709B2 (en) * 2012-03-28 2019-03-05 Wayne State University Sensor and method for continuous health monitoring
US20130338460A1 (en) 2012-06-18 2013-12-19 David Da He Wearable Device for Continuous Cardiac Monitoring
US8954135B2 (en) * 2012-06-22 2015-02-10 Fitbit, Inc. Portable biometric monitoring devices and methods of operating same
US9044171B2 (en) * 2012-06-22 2015-06-02 Fitbit, Inc. GPS power conservation using environmental data
WO2014022906A1 (fr) 2012-08-10 2014-02-13 Cnv Systems Ltd. Mobile device system for cardiovascular health measurement
WO2014047528A1 (fr) 2012-09-21 2014-03-27 Cardiomems, Inc. Method and system for trend-based patient management
US20140085050A1 (en) 2012-09-25 2014-03-27 Aliphcom Validation of biometric identification used to authenticate identity of a user of wearable sensors
US20140089673A1 (en) 2012-09-25 2014-03-27 Aliphcom Biometric identification method and apparatus to authenticate identity of a user of a wearable device that includes sensors
US9098991B2 (en) * 2013-01-15 2015-08-04 Fitbit, Inc. Portable monitoring devices and methods of operating the same
US10314496B2 (en) 2013-02-20 2019-06-11 Tosense, Inc. Necklace-shaped physiological monitor
US9320434B2 (en) * 2013-03-04 2016-04-26 Hello Inc. Patient monitoring systems and messages that send alerts to patients only when the patient is awake
US20140275883A1 (en) 2013-03-14 2014-09-18 Covidien Lp Wireless sensors
US9558336B2 (en) 2013-10-04 2017-01-31 Salutron Inc. Persistent authentication using sensors of a user-wearable device
US20150173674A1 (en) * 2013-12-20 2015-06-25 Diabetes Sentry Products Inc. Detecting and communicating health conditions

Cited By (25)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
TWI673035B (zh) * 2014-10-16 2019-10-01 Teijin Limited Protective equipment provided with an alarm system
US10019882B2 (en) * 2014-10-16 2018-07-10 Teijin Limited Protective equipment comprising alarm system
US10497244B2 (en) * 2014-12-18 2019-12-03 Wearable Technology Limited Issuing alarm signal to operatives
WO2016174634A1 (fr) * 2015-04-30 2016-11-03 Pontificia Universidad Católica De Chile Method and device for detecting and recording at least one emotional event, and the environmental conditions of the setting in which the individual is located before, during, and after said emotional event, for subsequent analysis and with low energy requirements
US11478215B2 (en) 2015-06-15 2022-10-25 The Research Foundation for the State University of New York System and method for infrasonic cardiac monitoring
US10542961B2 (en) 2015-06-15 2020-01-28 The Research Foundation For The State University Of New York System and method for infrasonic cardiac monitoring
US20170135593A1 (en) * 2015-11-13 2017-05-18 Acme Portable Corp. Wearable device which diagnoses personal cardiac health condition by monitoring and analyzing heartbeat and the method thereof
US10028672B2 (en) * 2015-11-13 2018-07-24 Acme Portable Corp. Wearable device which diagnoses personal cardiac health condition by monitoring and analyzing heartbeat and the method thereof
US10768009B2 (en) 2015-12-29 2020-09-08 Ebay Inc. Proactive re-routing of vehicles using passive monitoring of occupant frustration level
US11747158B2 (en) 2015-12-29 2023-09-05 Ebay Inc. Proactive re-routing of vehicles using passive monitoring of occupant frustration level
US11774252B2 (en) 2015-12-29 2023-10-03 Ebay Inc. Proactive re-routing of vehicles to control traffic flow
US20180033299A1 (en) * 2015-12-29 2018-02-01 Ebay Inc. Traffic disruption detection using passive monitoring of vehicle occupant frustration level
US10909846B2 (en) * 2015-12-29 2021-02-02 Ebay Inc. Traffic disruption detection using passive monitoring of vehicle occupant frustration level
US10914598B2 (en) 2015-12-29 2021-02-09 Ebay Inc. Proactive re-routing of vehicles to control traffic flow
US11574540B2 (en) 2015-12-29 2023-02-07 Ebay Inc. Traffic disruption detection using passive monitoring of vehicle occupant frustration level
US11326896B2 (en) 2015-12-29 2022-05-10 Ebay Inc. Proactive re-routing of vehicles using passive monitoring of occupant frustration level
US20190122517A1 (en) * 2016-04-19 2019-04-25 Teijin Limited Article provided with warning system
US10980428B2 (en) 2016-12-15 2021-04-20 ViviPulse, LLC Wearable pulse waveform measurement system and method
US10166421B2 (en) * 2017-05-02 2019-01-01 International Business Machines Corporation Cognitive solution to enhance firefighting capabilities
US11602656B2 (en) 2017-05-02 2023-03-14 Kyndryl, Inc. Cognitive solution to enhance firefighting capabilities
US20180318622A1 (en) * 2017-05-02 2018-11-08 International Business Machines Corporation Cognitive solution to enhance firefighting capabilities
US20190021633A1 (en) * 2017-11-21 2019-01-24 Ling Wang Detecting respiratory rates in audio using an adaptive low-pass filter
US11006875B2 (en) 2018-03-30 2021-05-18 Intel Corporation Technologies for emotion prediction based on breathing patterns
WO2022107845A1 (fr) * 2020-11-19 2022-05-27 Jvckenwood Corporation Biometric authentication through vascular monitoring
TWI799821B (zh) * 2021-03-30 2023-04-21 許維綸 Danger prediction and prevention system

Also Published As

Publication number Publication date
US20150112156A1 (en) 2015-04-23
US9396643B2 (en) 2016-07-19
US20150112155A1 (en) 2015-04-23
JP2016538097A (ja) 2016-12-08
WO2015061579A1 (fr) 2015-04-30
US20150112158A1 (en) 2015-04-23
US20150112154A1 (en) 2015-04-23
US20150112452A1 (en) 2015-04-23
CA2928197A1 (fr) 2015-04-30
US20150109124A1 (en) 2015-04-23
KR20160075677A (ko) 2016-06-29
US20150112157A1 (en) 2015-04-23
US20150112606A1 (en) 2015-04-23
EP3060107A1 (fr) 2016-08-31
US20150112208A1 (en) 2015-04-23
US9396642B2 (en) 2016-07-19

Similar Documents

Publication Publication Date Title
US9396643B2 (en) Biometric authentication
US20160302677A1 (en) Calibrating for Blood Pressure Using Height Difference
US20150164351A1 (en) Calculating pulse transit time from chest vibrations
RU2656559C2 (ru) Method and device for determining vital signs
CN107708548B (zh) System and method for quantification and prediction of smoking behavior
KR102318887B1 (ko) Wearable electronic device and control method thereof
EP4032469A1 (fr) Systems and methods for multispectral blood measurement
US20210219923A1 (en) System for monitoring and providing alerts of a fall risk by predicting risk of experiencing symptoms related to abnormal blood pressure(s) and/or heart rate
US20190261855A1 (en) Systems and methods for quantification of, and prediction of smoking behavior
US20200035337A1 (en) Method and product for determining a state value, a value representing the state of a subject
US20220296847A1 (en) Wearable device operable to detect and/or manage user stress
US20210244365A1 (en) Non-invasive epidermal health-monitoring sensor, patch system and method, and epidemiological monitoring and tracking system related thereto
US20160000365A1 (en) Anxiety meter
WO2016137698A1 (fr) Calculating pulse transit time from chest vibrations
US20220167859A1 (en) System and method for blood pressure monitoring with subject awareness information
CA3115419A1 (fr) Non-invasive epidermal health-monitoring sensor, patch system and method, and related epidemiological monitoring and tracking system
US20200196878A1 (en) System and method for blood pressure monitoring with subject awareness information
US20240081647A1 (en) Systems and methods for assisting in smoking cessation
US20210196194A1 (en) Unobtrusive symptoms monitoring for allergic asthma patients
Ishaque Heart-rate Variability Analysis for Stress Assessment in a Video-Game Setup
Alharbi Non-invasive Wearable Solutions to Identify Individuals with Mild Cognitive Impairments from Healthy Controls

Legal Events

Date Code Title Description
AS Assignment

Owner name: QUANTTUS, INC., MASSACHUSETTS

Free format text: ASSIGNMENT OF ASSIGNORS INTEREST;ASSIGNORS:HE, DAVID DA;BIJJANI, RICHARD ROBEHR;REEL/FRAME:034118/0156

Effective date: 20141030

AS Assignment

Owner name: ROBERT F. DUDLEY, AS TRUSTEE OF THE QUANTTUS LIQUIDATING TRUST

Free format text: ASSIGNMENT OF ASSIGNORS INTEREST;ASSIGNOR:QUANTTUS, INC.;REEL/FRAME:041019/0850

Effective date: 20161228

STCB Information on status: application discontinuation

Free format text: ABANDONED -- FAILURE TO RESPOND TO AN OFFICE ACTION