WO2023214957A1 - Machine learning models for estimating physiological biomarkers - Google Patents

Machine learning models for estimating physiological biomarkers

Info

Publication number
WO2023214957A1
Authority
WO
WIPO (PCT)
Prior art keywords
data
hrv
user
machine learning
time series
Prior art date
Application number
PCT/US2022/027267
Other languages
French (fr)
Inventor
Kamiar Kordari
Jason D. MOORE
Alyssa R. MOORE
Vivek Menon
Original Assignee
Elite HRV, Inc.
Priority date
Filing date
Publication date
Application filed by Elite HRV, Inc.
Priority to PCT/US2022/027267
Publication of WO2023214957A1


Classifications

    • A61B5/7267: Classification of physiological signals or data, e.g. using neural networks, statistical classifiers, expert systems or fuzzy systems, involving training the classification device
    • A61B5/0077: Devices for viewing the surface of the body, e.g. camera, magnifying lens
    • A61B5/02405: Determining heart rate variability
    • A61B5/02416: Detecting, measuring or recording pulse rate or heart rate using photoplethysmograph signals, e.g. generated by infrared radiation
    • A61B5/02438: Detecting, measuring or recording pulse rate or heart rate with portable devices, e.g. worn by the patient
    • A61B5/353: Detecting P-waves
    • A61B5/6826: Sensors specially adapted to be attached to a specific body part (finger)
    • A61B5/6898: Sensors mounted on portable consumer electronic devices, e.g. music players, telephones, tablet computers
    • A61B5/0075: Measuring for diagnostic purposes using light, by spectroscopy, i.e. measuring spectra, e.g. Raman spectroscopy, infrared absorption spectroscopy
    • A61B5/02055: Simultaneously evaluating both cardiovascular condition and temperature
    • A61B5/021: Measuring pressure in heart or blood vessels
    • A61B5/0816: Measuring devices for examining respiratory frequency
    • A61B5/1103: Detecting eye twinkling
    • A61B5/1118: Determining activity level
    • A61B5/14542: Measuring characteristics of blood in vivo for measuring blood gases
    • A61B5/163: Evaluating the psychological state by tracking eye movement, gaze, or pupil change
    • A61B5/165: Evaluating the state of mind, e.g. depression, anxiety
    • A61B5/332: Heart-related electrical modalities, e.g. electrocardiography [ECG]; portable devices specially adapted therefor
    • A61B5/4842: Monitoring progression or stage of a disease
    • A61B5/7207: Signal processing for noise prevention, reduction or removal of noise induced by motion artifacts
    • A61B5/7221: Determining signal validity, reliability or quality
    • A61B5/7282: Event detection, e.g. detecting unique waveforms indicative of a medical condition
    • A61B5/742: Details of notification to user or communication with user or patient using visual displays
    • G06N3/0442: Recurrent networks characterised by memory or gating, e.g. long short-term memory [LSTM] or gated recurrent units [GRU]
    • G06N3/0464: Convolutional networks [CNN, ConvNet]
    • G06N3/09: Supervised learning

Definitions

  • the present disclosure relates to systems, devices, and methods for detecting and tracking biomarkers, and more specifically, to using one or more sensors associated with a user to measure vitals and provide customized insights, including health-, wellness-, and fitness-related parameters, over a predetermined period.
  • Heart Rate Variability is determined from heart beat data and represents variability in inter-beat timing.
  • a heart rate monitor or other sensor detects the ECG (electrocardiograph) or the PPG (photoplethysmography) signal, e.g., a data measure that varies in relation to the heart’s contraction and relaxation. From this, the peaks of the heart contraction can be derived and plotted against time. That is, the time between successive contractions of the heart can be derived and plotted against time. This, in turn, allows the interval between peaks to be reported as a time (in milliseconds) between peaks.
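  • By way of non-limiting illustration of the peak-to-peak timing described above, the following minimal sketch derives inter-beat intervals in milliseconds from a sampled PPG trace using a generic peak detector; the sampling rate, detector parameters, and function names are illustrative assumptions rather than part of the disclosed system.

        import numpy as np
        from scipy.signal import find_peaks

        def ppg_to_rr_intervals(ppg, fs=60.0):
            """Estimate inter-beat (R-R) intervals in milliseconds from a PPG trace.

            ppg : 1-D array of samples (e.g., mean pixel intensity per video frame)
            fs  : sampling rate in Hz (assumed; e.g., the camera frame rate)
            """
            # Detect systolic peaks; a refractory distance of ~0.33 s caps HR near 180 bpm.
            peaks, _ = find_peaks(ppg, distance=int(0.33 * fs),
                                  prominence=0.3 * np.std(ppg))
            peak_times_ms = peaks / fs * 1000.0
            return np.diff(peak_times_ms)  # inter-beat intervals in ms

        if __name__ == "__main__":
            fs = 60.0
            t = np.arange(0, 30, 1 / fs)
            ppg = np.sin(2 * np.pi * 1.2 * t) + 0.05 * np.random.randn(t.size)  # ~72 bpm toy signal
            rr = ppg_to_rr_intervals(ppg, fs)
            print(round(float(rr.mean()), 1), "ms mean inter-beat interval")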
  • the systems, devices and methods provided for in the present disclosure are directed to measuring and tracking biomarker data.
  • Data can be acquired using one or more sensors that determine vitals such as heart rate, HRV, blood pressure, oxygen levels, CO2 levels, glucose levels, ketone levels, general awareness or alertness, stress, reflex time, resilience, training, body temperature, or related capacity or capability, or a combination of the foregoing.
  • the one or more sensors can include a finger-over-camera sensor that can measure vital signs from contact with the finger of a user and/or a face-over-camera sensor that can measure biomarkers via the camera feature of the smartphone, and/or a standalone camera, including those in IoT devices.
  • the sensors can communicate with a smartphone application that can be toggled to take measurements of predetermined biomarkers at desired intervals.
  • the data from each sensor can be combined to build a user profile where the data from each sensor is used to corroborate or correct the measurement taken by the other sensors.
  • Biomarker data can be saved by the system to the user profile that can track a history of the biomarkers over an extended period. Throughout this document, the terms biomarker, metric, and biometric will be used interchangeably.
  • a system for measuring biomarkers of a user includes: a communication device configured to make biomarker determinations, the device having one or more cameras thereon configured to capture an image of one or more users; a plurality of sensors coupled to a hardware component, the plurality of sensors being configured to collect biomarker data from the one or more users; a storage system configured to store the biomarker data to a profile that corresponds to a user of the one or more users; and a display unit associated with the communication device configured to display the biomarker data, wherein: the biomarker data collected by a sensor of the plurality of sensors is compared to biomarker data collected by another sensor of the plurality of sensors to validate values of the biomarker data; and repeated storage of the biomarker data to the profile trains a sensor of the plurality of sensors to stop collecting biomarker data during subsequent collections of biomarker data.
  • the sensor of the plurality of sensors may include an optical sensor configured to detect the face of the one or more users to collect biomarker data therefrom.
  • Another sensor of the plurality of sensors may include a pressure sensor configured to contact the one or more users to collect biomarker data therefrom.
  • the pressure sensor may further comprise a camera configured to collect a supplementary signal by measuring a force applied to the camera by the user.
  • the optical sensor and the pressure sensor may be configured to conduct readings substantially simultaneously. The readings may be conducted over a period of from approximately 10 seconds to approximately ten minutes.
  • the communication device may provide a warning to adjust a position of the plurality of sensors if biomarker data is not being collected.
  • the optical sensor may collect biomarker data by one or more of color extraction, pixel movement, and/or eye-specific movement of a face of the user.
  • the color extracted may include visible and/or invisible wavelengths.
  • the system may further include at least a third sensor configured to receive data from one or more of the first sensor, the second sensor, or the user, the third sensor having a hardware component configured to make biomarker determinations.
  • the third sensor may include one or more of a chest strap or a patch, a smartwatch, a wrist or arm wearable, a ring wearable, or a hat.
  • the biomarker data may include one or more of heart rate, HRV, blood pressure, oxygen levels, CO2 levels, glucose levels, ketone levels, general awareness or alertness, stress, reflex time, resilience, training, body temperature, or related capacity or capability, or a combination of the foregoing.
  • a method for determining biomarkers includes: placing a first sensor relative to a first body part of a user; placing a second sensor relative to a second body part of the user, the second sensor being configured to capture an image of a face of the user; measuring a value of a first biomarker of the first body part of the user using the first sensor to determine a first reading; measuring a value of the first biomarker of the second body part of the user using the second sensor to determine a second reading, the first biomarker being measured by the first sensor and the second sensor substantially simultaneously; analyzing the first reading and the second reading to determine the existence of a correlation between the values; and displaying a final value of the first biomarker based on the analysis of the first reading and the second reading.
  • the first body part may be a finger of the user and the second body part may be a face of the user.
  • the method may further include storing the final value of the first biomarker in a user profile that includes a history of the final readings of biomarkers taken by the user.
  • the second reading may be compared to the first reading to verify accuracy of the first reading.
  • the method may further include adjusting the first reading based on a measured output of the second reading.
  • the method may further include using the measured values to calibrate the profile of the user.
  • the first and second readings may be taken over a predetermined time interval.
  • the predetermined time interval may range from approximately 10 seconds to approximately ten minutes.
  • the method may further include filtering the first reading and the second reading to determine an average value of each of the first reading and the second reading.
  • the method may further include measuring a second biomarker of the user using one or more of the first sensor and the second sensor, and combining the first biomarker with the second biomarker.
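  • By way of non-limiting illustration of the comparison and analysis steps in the method above, the sketch below checks two substantially simultaneous readings of the same biomarker for agreement and reports a final value only when they corroborate one another; the tolerance and the averaging rule are illustrative assumptions, not the claimed analysis.

        def fuse_readings(first_reading, second_reading, rel_tolerance=0.10):
            """Compare two simultaneous readings of the same biomarker.

            Returns (final_value, corroborated). Readings that agree within the
            tolerance are averaged; otherwise no final value is reported and the
            caller may prompt the user to adjust the sensors.
            """
            reference = max(abs(first_reading), abs(second_reading), 1e-9)
            corroborated = abs(first_reading - second_reading) / reference <= rel_tolerance
            final_value = (first_reading + second_reading) / 2.0 if corroborated else None
            return final_value, corroborated

        # Example: a finger-over-camera heart rate reading vs. a face-over-camera reading.
        print(fuse_readings(62.0, 64.0))   # -> (63.0, True)
        print(fuse_readings(62.0, 90.0))   # -> (None, False)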
  • a method for biometric monitoring and scoring includes: collecting biometric data from a plurality of users; collecting data measuring physiological data and environmental data associated with said plurality of users over time; storing all collected data in an electronic storage system; analyzing said collected data to create a composite score that comprises at least heart rate activity data, biometric data, and environmental data; comparing said composite score against historical composite scores to determine activity modifications that will impact the behavior of said plurality of users prior to collecting additional biometric data; and presenting said activity modifications to said plurality of users on a display device in combination with recommended actions to accomplish said activity modifications.
  • the method may further include utilizing one or more signal cleaning algorithms to detect artifacts in the collected data and improve the collected data by removing any detected artifacts that impair the signal quality of the biometric and/or physiological data.
  • the signal cleaning of the collected data may be performed during a data collection action and the cleaned collected data is stored in an electronic format prior to analysis of said cleaned collected data.
  • the method may further include tracking said composite score over a pre-configured time span.
  • the composite score may comprise Heart Rate Variability (HRV) data, biometric data, and changes in environment data.
  • the method may further include receiving from a user or medical practitioner a threshold composite score or composite score range that is preferred for the user to maintain.
  • the method may further include transmitting to a user recommended actions comprising events, interventions, and/or planned steps in accordance with maintaining said user’s particular composite score.
  • the sensors may comprise any of a finger sensor, an LED sensor, a chest-strap electrocardiogram sensor, or sensors contained within or attached to a mobile device associated with said user.
  • the composite score may be presented to a user as a numeric value and a gauge graphic to permit the user to visually understand changes in the composite score over time.
  • the composite score, recommendations, and guidance may be provided as a report, as part of an ongoing data display, or in real-time as live biofeedback to a user during an activity.
  • a system for biometric monitoring and scoring includes: collecting biometric data from a plurality of users; collecting data measuring physiological data and environmental data associated with said plurality of users over time; storing all collected data in an electronic storage system; analyzing said collected data to create a composite score that comprises at least heart rate activity data, biometric data, and environmental data; comparing said composite score against historical composite scores to determine activity modifications that will impact the behavior of said plurality of users prior to collecting additional biometric data; presenting said activity modifications to said plurality of users on a display device in combination with recommended actions to accomplish said activity modifications.
  • the system may further include utilizing one or more signal cleaning algorithms to detect artifacts in the collected data and improve the collected data by removing any detected artifacts that impair the signal quality of the biometric and/or physiological data.
  • the signal cleaning of the collected data may be performed during a data collection action and the cleaned collected data is stored in an electronic format prior to analysis of said cleaned collected data.
  • the sensors may comprise any of a finger sensor, an LED sensor, a chest-strap electrocardiogram sensor, or sensors contained within or attached to a mobile device associated with said user.
  • the system may further include tracking said composite score over a preconfigured time span.
  • the composite score may comprise Heart Rate Variability (HRV) data, biometric data, and changes in environment data.
  • the system may further include receiving from a user or medical practitioner a threshold composite score or composite score range that is preferred for the user to maintain.
  • the system may further include transmitting to a user recommended actions comprising events, interventions, and/or planned steps in accordance with maintaining said user’s particular composite score.
  • the composite score may be presented to a user as a numeric value and a gauge graphic to permit the user to visually understand changes in the composite score over time.
  • the composite score, recommendations, and guidance may be provided as a report, as part of an ongoing data display, or in real-time as live biofeedback to a user during an activity.
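  • By way of non-limiting illustration of the composite scoring and historical comparison described above, the sketch below blends component scores and compares the result with the user’s historical scores to suggest an activity modification; the weights, scales, and thresholds are illustrative assumptions, as the disclosure does not specify them.

        import statistics

        def composite_score(hrv_score, biometric_score, environment_score,
                            weights=(0.5, 0.3, 0.2)):
            """Weighted blend of component scores, each assumed to be on a 0-100 scale."""
            w_hrv, w_bio, w_env = weights
            return w_hrv * hrv_score + w_bio * biometric_score + w_env * environment_score

        def suggest_modification(today, history):
            """Compare today's composite score against the historical distribution."""
            mean = statistics.mean(history)
            sd = statistics.pstdev(history) or 1.0
            z = (today - mean) / sd
            if z < -1.0:
                return "score well below baseline: recommend lighter activity and recovery"
            if z > 1.0:
                return "score well above baseline: normal or increased activity supported"
            return "score near baseline: maintain planned activity"

        history = [71, 68, 74, 70, 69, 72, 73]
        today = composite_score(hrv_score=55, biometric_score=70, environment_score=65)
        print(round(today, 1), "-", suggest_modification(today, history))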
  • a method for training machine learning models to transform camera-based images into estimates of physiological biomarkers includes: providing a time series of video frames collected from a region of a body of a subject that is associated with a blood volume change within a tissue of the body; providing a time series of R-R intervals, the R-R intervals provided as ground truth data; synchronizing the time series of the video frames and the time series of the R-R intervals to provide an input pair to provide a synchronized time series of video frames and time series of R-R intervals; and training a machine learning model to estimate R-R intervals from the synchronized time series of video frames and time series of R-R intervals using the time series of R-R intervals as the ground truth data.
  • the time series of video frames may be recorded by finger-over-camera or face-over-camera video cameras.
  • the time series of R-R intervals may be calculated from inter-beat intervals measured by an ECG.
  • the machine learning model may be configured as a deep learning network.
  • the deep learning network may be a convolutional neural network.
  • the training may include supervised learning.
  • the supervised learning may be configured to operate without exact known features of the regions of the body of the subject.
  • the synchronizing may be performed before training the machine learning model by using one or more signal processing peak detection methods.
  • the synchronized time series of video frames and time series of R-R intervals may include a first synchronized segment of the synchronized time series of video frames and time series of R-R intervals having a first duration.
  • the first synchronized segment may be subdivided into one or more subdivided synchronized segments having one or more durations.
  • a moveable time window may be provided to subdivide the first synchronized segment into the one or more subdivided synchronized segments.
  • the one or more subdivided synchronized segments may be used to train the machine learning model.
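  • By way of non-limiting illustration of the synchronization and moveable-window subdivision described above, the sketch below aligns ECG-derived R-R intervals with video frame timestamps on a shared clock and yields fixed-duration training segments; the window length, step, frame rate, and alignment convention are illustrative assumptions.

        import numpy as np

        def beat_times_from_rr(rr_ms, ecg_start_s=0.0):
            """Timestamps (s) of the beat ending each R-R interval, on the shared clock."""
            return ecg_start_s + np.cumsum(rr_ms) / 1000.0

        def windowed_pairs(frames, frame_times_s, beat_times_s, rr_ms,
                           window_s=30.0, step_s=5.0):
            """Yield (frame_window, rr_window) input pairs from a moveable time window."""
            start, t_end = frame_times_s[0], frame_times_s[-1]
            while start + window_s <= t_end:
                stop = start + window_s
                f_idx = (frame_times_s >= start) & (frame_times_s < stop)
                b_idx = (beat_times_s >= start) & (beat_times_s < stop)
                yield frames[f_idx], rr_ms[b_idx]
                start += step_s

        # Toy example: 60 s of 30 fps per-frame features and ~75 bpm ground truth.
        fs = 30.0
        frame_times = np.arange(0, 60, 1 / fs)
        frames = np.random.rand(frame_times.size)      # stand-in for per-frame features
        rr = np.full(74, 800.0)                        # 800 ms R-R intervals
        beats = beat_times_from_rr(rr)
        pairs = list(windowed_pairs(frames, frame_times, beats, rr))
        print(len(pairs), "synchronized 30 s training segments")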
  • a method for training machine learning models to learn time-dependent patterns in a time-series signal includes: providing a time domain biometric value; providing a time series of R-R intervals as ground truth data; and training a machine learning model to determine a time domain biometric value from the time series of R-R intervals.
  • the time domain biometric value may include at least one of an HRV score or an RMSSD value.
  • the machine learning model may include a long short term memory model.
  • the method may further include training the machine learning model to produce a frequency domain biometric value.
  • the frequency domain biometric value may include at least a high-frequency power value.
  • the machine learning model may be configured as a deep learning network.
  • the deep learning network may be configured as a convolutional neural network.
  • the machine learning model may be a hybrid machine learning model.
  • the hybrid machine learning model may be trained by a stacking model.
  • the stacking model may include a convolutional neural network and a long short term memory model.
  • the machine learning model may be trained to detect and remove artifacts from the time series of R-R intervals.
  • the artifacts may be caused by one or more of motion, arrhythmias, premature ectopic beats, atrial fibrillation, measurement, or signal noise.
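  • By way of non-limiting illustration of the long short term memory model referenced above, the sketch below (assuming the PyTorch library) regresses a time domain value, here RMSSD, from sequences of R-R intervals used as ground truth; the layer sizes, learning rate, and synthetic data are illustrative assumptions rather than the disclosed model.

        import torch
        import torch.nn as nn

        class RRSequenceRegressor(nn.Module):
            """LSTM that maps a sequence of R-R intervals to one time domain value."""
            def __init__(self, hidden_size=32):
                super().__init__()
                self.lstm = nn.LSTM(input_size=1, hidden_size=hidden_size, batch_first=True)
                self.head = nn.Linear(hidden_size, 1)

            def forward(self, rr_seq):                  # rr_seq: (batch, seq_len, 1), in seconds
                _, (h_n, _) = self.lstm(rr_seq)
                return self.head(h_n[-1]).squeeze(-1)   # (batch,)

        def rmssd_ms(rr_ms):
            """Ground-truth RMSSD (ms) from each R-R interval series, used as the label."""
            diffs = rr_ms[:, 1:] - rr_ms[:, :-1]
            return torch.sqrt((diffs ** 2).mean(dim=1))

        # One illustrative training step on synthetic data.
        model = RRSequenceRegressor()
        optimizer = torch.optim.Adam(model.parameters(), lr=1e-3)
        rr_batch_ms = 800.0 + 40.0 * torch.randn(16, 60)     # 16 sequences of 60 intervals
        target = rmssd_ms(rr_batch_ms)
        pred = model((rr_batch_ms / 1000.0).unsqueeze(-1))
        loss = nn.functional.mse_loss(pred, target)
        loss.backward()
        optimizer.step()
        print("training loss:", float(loss))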
  • a method for training machine learning models to determine a subset of pixels in video frames to generate one or more biomarker values includes: providing a time series of video frames collected from a region of a body of a subject that is associated with a blood volume change within a tissue of the body; dividing each of the time series of video frames into cells of an N x M grid; detecting one or more cells having the highest information content; calculating estimates of the one or more biomarkers from each of the detected cells; and training a machine learning model to fuse the estimates of the one or more biomarkers from each of the detected cells to generate the one or more biomarker values.
  • the one or more biomarker values may include at least one of R-R intervals or an HRV value.
  • the dividing may increase the signal-to-noise ratio in the detected cells.
  • the dividing may reduce the amount of computation required to generate the one or more biomarker values.
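  • By way of non-limiting illustration of the grid division and cell selection described above, the sketch below splits each frame into N x M cells, ranks cells by the temporal variance of their mean intensity as a stand-in for information content, and averages the top cell traces as a simple fusion; the variance criterion and averaging are illustrative assumptions, whereas the disclosure trains a machine learning model to perform the fusion.

        import numpy as np

        def cell_signals(frames, n=4, m=4):
            """frames: (T, H, W) grayscale video -> (n*m, T) per-cell mean-intensity traces."""
            t, h, w = frames.shape
            cells = frames[:, : h - h % n, : w - w % m].reshape(t, n, h // n, m, w // m)
            return cells.mean(axis=(2, 4)).reshape(t, n * m).T

        def fuse_top_cells(frames, n=4, m=4, k=3):
            """Average the k cell traces with the highest temporal variance."""
            signals = cell_signals(frames, n, m)
            top = np.argsort(signals.var(axis=1))[-k:]
            return signals[top].mean(axis=0)     # fused 1-D trace for downstream peak detection

        video = np.random.rand(300, 120, 160)    # 10 s of 30 fps toy frames
        print(fuse_top_cells(video).shape)       # -> (300,)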
  • a method for training machine learning models to determine a subset of pixels in video frames to generate one or more biomarker values includes: providing a time series of video frames collected from a region of a body of a subject that is associated with a blood volume change within a tissue of the body; using an attention mechanism to perform spatial segmentation in each of the time series of video frames; detecting the spatial segmentation in each of the time series of video frames; and training a machine learning model to calculate estimates of the one or more biomarkers from the spatial segmentation in each of the time series of video frames.
  • the one or more biomarker values may include at least one of R-R intervals or an HRV value.
  • the spatial segmentation in each of the time series of video frames may increase the signal-to-noise ratio in the detected cells.
  • the spatial segmentation in each of the time series of video frames may reduce the amount of computation required to generate the one or more biomarker values.
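  • By way of non-limiting illustration of attention-based spatial segmentation, the sketch below (assuming the PyTorch library) learns per-pixel weights that softly segment each frame and pool it into a single value per frame for downstream biomarker estimation; the architecture is an illustrative assumption and not the specific attention mechanism of the disclosure.

        import torch
        import torch.nn as nn

        class SpatialAttentionPool(nn.Module):
            """Soft spatial segmentation: weight each pixel location, then pool per frame."""
            def __init__(self):
                super().__init__()
                self.score = nn.Conv2d(1, 1, kernel_size=1)   # per-pixel attention logits

            def forward(self, frames):                        # frames: (batch, T, H, W)
                b, t, h, w = frames.shape
                x = frames.reshape(b * t, 1, h, w)
                weights = torch.softmax(self.score(x).reshape(b * t, -1), dim=-1)
                pooled = (x.reshape(b * t, -1) * weights).sum(dim=-1)
                return pooled.reshape(b, t)                   # one attended value per frame

        frames = torch.rand(2, 300, 36, 36)                   # 2 clips, 300 frames each
        trace = SpatialAttentionPool()(frames)
        print(trace.shape)                                    # -> torch.Size([2, 300])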
  • FIG. 1A is a schematic illustration of a pulse waveform from a PPG signal;
  • FIG. 1B is an example of a PPG signal;
  • FIG. 2A is a schematic illustration of use of a finger-over-camera sensor of a smartphone in accordance with the present embodiments;
  • FIG. 2B is a schematic illustration of use of a face-over-camera sensor of the smartphone of FIG. 2A;
  • FIG. 3A is a schematic perspective view of an exemplary embodiment of a wearable sensor attached to a finger of a user;
  • FIG. 3B is a schematic top view of the wearable sensor of FIG. 3A attached to the finger of the user;
  • FIG. 3C is a schematic perspective view of alternate embodiments of wearable devices that can be used with the embodiments of the present disclosure
  • FIG. 4 is a view of artifact detection accuracy in terms of the detection of false positive artifact detection consistent with certain embodiments of the present invention
  • FIG. 5 is a view of artifact detection accuracy in terms of the detection of true positive artifact detection consistent with certain embodiments of the present invention
  • FIG. 6 is a view of artifact impact on the system consistent with certain embodiments of the present invention.
  • FIG. 7 is a view of the display of HRV statistics for a user post reading consistent with certain embodiments of the present invention.
  • FIG. 8 is a view of the display of the continuation of HRV statistics for a user post reading consistent with certain embodiments of the present invention.
  • FIG. 9 is an exemplary embodiment of an interface displaying readings that can be performed with the sensors of the present embodiments.
  • FIG. 10 is an exemplary embodiment of an interface for taking an HRV snapshot in accordance with the present embodiments.
  • FIG. 11A is a tutorial on how to use the finger-over-camera sensor of FIG. 1A;
  • FIG. 11B is a status update of use of the finger-over-camera sensor of FIG. 1A;
  • FIG. 11C is an interface indicating that the system is waiting to receive pulse data from a wearable device in communication with the system of the present disclosure;
  • FIG. 11D is an interface for performing an accuracy check performed by the application;
  • FIG. 11E is an interface for indicating that the accuracy check is complete;
  • FIG. 12 is a status screen 1200 indicating that the finger-over-camera sensor is taking a reading
  • FIG. 13 is an interface that displays results of the finger-over-camera sensor reading
  • FIG. 14 is an exemplary embodiment of reminders prior to taking a face-over-camera sensor reading
  • FIG. 15 is an interface indicating that the system is waiting to receive pulse data from a wearable device in communication with the system of the present disclosure
  • FIG. 16A is an exemplary embodiment of calibration of the face-over-camera sensor to take readings from a user
  • FIG. 16B is an accuracy check of the calibration of FIG. 16A
  • FIG. 16C is an interface indicating that calibration of the face-over-camera sensor has been completed;
  • FIG. 16D is an interface for taking an HRV snapshot using the face-over-camera sensor;
  • FIG. 16E is an interface that displays biomarker information when using the face-over-camera sensor;
  • FIG. 16F is an interface showing the user their face
  • FIG. 17 is an exemplary embodiment of an interface for indicating an error message during a reading
  • FIG. 18 is an exemplary embodiment of an interface for personalization after a reading is taken
  • FIG. 19 is an exemplary embodiment of an interface for enabling additional tagging features
  • FIG. 20 is an interface showing the results of the reading of FIGS. 16A-16F;
  • FIG. 21 is an exemplary embodiment of a home screen of the application.
  • FIG. 22A is an exemplary embodiment of a calibration of a scanner function that can collect biomarker data live via the face-over-camera sensor;
  • FIG. 22B is an interface for performing the face-over-camera live reading of FIG. 22A;
  • FIG. 23 is an exemplary embodiment of a web browser version using face-over-camera for measuring biomarkers via video conference;
  • FIG. 24A is an exemplary embodiment of detection for a reading using the face-over-camera sensor with no live reading;
  • FIG. 24B is an exemplary embodiment of calibration for a reading using the face-over-camera sensor with no live reading;
  • FIG. 24C is an exemplary embodiment of an interface indicating that calibration is complete;
  • FIG. 24D is an exemplary embodiment of a reading being formed;
  • FIG. 24E is an exemplary embodiment of results of the reading
  • FIG. 24F is an exemplary embodiment of an interface comparing results of the user profile with other users
  • FIG. 25 is a view of the display of the data integration connections for the user device consistent with certain embodiments of the present invention.
  • FIG. 26 is a view of the display of the historical log for a user consistent with certain embodiments of the present invention.
  • FIG. 27 is a view of the historical trends for HRV statistics for a user consistent with certain embodiments of the present invention.
  • FIG. 28 is a view of the connection capability for the sensors associated with the HRV monitoring system consistent with certain embodiments of the present invention.
  • FIG. 29 is a view of the historical trends for HRV statistics for a user related to morning readiness scores and HRV values consistent with certain embodiments of the present invention.
  • FIG. 30 is a view of the detailed data values for HRV statistics for a user related to morning readiness scores and HRV values consistent with certain embodiments of the present invention.
  • FIG. 31 is a view of the informational data for a user related to morning readiness scores and HRV values expressed as autonomic balance consistent with certain embodiments of the present invention
  • FIG. 32 is a view of the historical trends for HRV statistics for a population related to morning readiness scores and HRV values consistent with certain embodiments of the present invention
  • FIG. 33 is a view of the historical trends for HRV statistics for a population related to morning readiness scores and HRV values consistent with certain embodiments of the present invention.
  • FIG. 34 is a view of the raw data captured for RR intervals and HRV values consistent with certain embodiments of the present invention.
  • FIG. 35 is a view of the relationship between RR intervals and HRV values consistent with certain embodiments of the present invention.
  • FIG. 36 is a view of the display of HRV statistics and signal quality for a user post reading consistent with certain embodiments of the present invention.
  • FIG. 37 is a view of the user feedback and tagging display consistent with certain embodiments of the present invention.
  • FIG. 38 is a view of the display of HRV statistics and signal quality for a user post reading consistent with certain embodiments of the present invention.
  • FIG. 39 is a view of the composite score reading and data collection process consistent with certain embodiments of the present invention.
  • FIG. 40 presents a view of the HRV system configuration consistent with certain embodiments of the present disclosure.
  • the terms “a” or “an”, as used herein, are defined as one or more than one.
  • the term “plurality”, as used herein, is defined as two or more than two.
  • the term “another”, as used herein, is defined as at least a second or more.
  • the terms “including” and/or “having”, as used herein, are defined as comprising (i.e., open language).
  • the term “coupled”, as used herein, is defined as connected, although not necessarily directly, and not necessarily mechanically.
  • parasympathetic refers to the portion of the autonomic nervous system that conserves energy as it slows the heart rate, increases intestinal and gland activity, and relaxes sphincter muscles in the gastrointestinal tract.
  • HRV scoring refers to the development of a score that is calculated utilizing various algorithms to present a scaled score from which comparisons over time may be made.
  • Reference throughout this document to “Morning Readiness” refers to a scaled score related to a user’s particular balance of parasympathetic and sympathetic nervous system activity.
  • Reference throughout this document to “Composite Scoring” refers to composite scores that have HRV Scoring, Morning Readiness scoring, and additional scoring parameters that are not necessarily related to HRV scores.
  • Readiness Score refers to a novel readiness score based upon ANS activity changes over time and indicates the user’s readiness to tackle life’s challenges each day.
  • the inter-beat intervals or R-R intervals are transmitted to the Elite HRV software wirelessly (currently via Bluetooth) from the finger sensor and used to calculate variability over time in the inter-beat or R-R intervals, i.e., the HRV data.
  • Changes in the inter-beat or RR intervals are associated with activity and/or changes in the parasympathetic and sympathetic branches of the Autonomic Nervous System (which influence and can control heart rate, blood pressure, pupil dilation, blood glucose, muscle tension, sexual function, digestion, and energy regulation), and have been used as an indicator of stress levels, inflammation levels, and post-exercise recovery status, among other conditions.
  • HRV data has been correlated to all major causes of death.
  • the research literature evaluating algorithms for the detection and correction of artifacts in an Inter Beat Interval (e.g., IBI) series focuses on data sourced from specific, homogeneous subpopulations rather than broad cross-sections. Additionally, source data used in the comparative literature is typically derived from low-noise, research-grade ECG sensors. While this evaluation strategy may suffice for limited clinical contexts in which the population and experimental conditions can be controlled, large-scale consumer HRV applications must satisfy broader requirements. In particular, those consumer applications supporting open compatibility with third-party Bluetooth sensors are liable to face significant variance in both the population parameters (e.g., age, athleticism, pathology of users) and the sensor platforms used. Thus, it is insufficient to evaluate artifact detection algorithms using the traditional, narrow source data parameters.
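  • By way of non-limiting example of the class of artifact detection algorithms discussed above (and not of the evaluation itself), the sketch below flags inter-beat intervals that deviate from a local median by more than a threshold; the window size and threshold are illustrative assumptions.

        import numpy as np

        def flag_ibi_artifacts(ibi_ms, window=5, threshold=0.25):
            """Flag IBIs deviating from the local median by more than `threshold` (fraction).

            A simple, commonly used style of detector; as noted above, consumer
            pipelines must be validated across diverse users and sensor platforms.
            """
            ibi = np.asarray(ibi_ms, dtype=float)
            flags = np.zeros(ibi.size, dtype=bool)
            half = window // 2
            for i in range(ibi.size):
                lo, hi = max(0, i - half), min(ibi.size, i + half + 1)
                local_median = np.median(np.delete(ibi[lo:hi], i - lo))
                flags[i] = abs(ibi[i] - local_median) > threshold * local_median
            return flags

        ibi = [810, 805, 820, 1650, 815, 800, 790, 400, 805]   # two injected artifacts
        print(flag_ibi_artifacts(ibi))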
  • HRV4Training offers a suite of estimations geared towards athletes, including the following: Lactate threshold estimation (providing advice on pacing strategies for racing and workouts); Half and full marathon time estimation; VO2max estimation (cardiorespiratory fitness level); Functional Threshold Power (FTP) estimation (providing advice on pacing strategies for racing and workouts); and Aerobic Endurance (efficiency and cardiac decoupling).
  • HRV data has been and continues to be actively researched to determine if HRV data can act as an indicator or biomarker of certain conditions and used for predictions.
  • HRV is used to diagnose autonomic dysfunction, or more generally it is used with other data to assess and/or track health and performance when implementing new protocols.
  • Other current methods use HRV data (alone or in combination with other data) to make specific recommendations about lactate threshold estimation (providing advice on pacing strategies for racing and workouts); half and full marathon time estimation; VO2max estimation (cardiorespiratory fitness level); Functional Threshold Power (FTP) estimation (providing advice on pacing strategies for racing and workouts); and Aerobic Endurance (efficiency and cardiac decoupling).
  • Other current methods use HRV data to generate a variety of scores, e.g., EHRV’s Morning Readiness Score, EHRV’s HRV Score (a normalized manipulation of lnRMSSD), Stress Score, Productivity Score, Emotional State, and Illness Risk, etc.
  • A primary use context for HRV data remains athletic training and health predictions. Competitors, such as HRV4Training, have a suite of services geared towards this context. These include team views, calendar views, various customized scores (some calculated on the basis of third-party data in combination with HRV data), and rudimentary predictions (e.g., estimated marathon time).
  • the systems, devices, and methods provided for in the present disclosure are directed to transforming raw sensor data into personalized guidance to users.
  • Raw sensor data detected by sensors and/or received sensor data transformed by firmware into physiological signals is processed, through a series of steps, into personalized guidance to users.
  • the raw sensor data may be acquired using one or more sensors that measure physiological signals, like electrical waves.
  • an optical sensor such as a camera, detects colors and intensity of brightness as measured by pixels in an array.
  • These signals may be transformed into estimates of peaks and valleys representative of blood volume changes that can be transformed into the biomarkers of heart rate (e.g., HR), heart rate variability (e.g., HRV), pulse oximetry, respiration rate, blood pressure, body temperature, and so forth.
  • the detected color data is transformed by the device firmware into peaks and valleys representative of blood flow for estimating HR and HRV biomarkers.
  • similar transformations occur on electrical sensors like chest straps. Except, instead of color detection, the sensors are detecting electrical waves propagated by the beating of the heart. These electrical signals are transformed into heart beat data that can be processed into the biomarkers described above.
  • camera-based sensing of facial data can also be processed into estimates of heart beat data.
  • camera-based data that is based on acquired video images, e.g., time series data, of a user’s face also may be known as image-based data. That is, the use of “image” in the context of image-based videos refers to using video data to capture the time-series information associated with time-changing physiological signals.
  • the arrays of pixels in the camera detect raw color and intensity data that can be transformed by machine learning (e.g., ML) and/or machine learning methods (e.g., ML methods) and signal processing into estimates of heart beat data.
  • the machine learning algorithms can be trained by combining simultaneous data streams from multiple sensors to make the ML processing and training more robust than is possible with only the one data stream coming from the image-based data. This process helps “train” the algorithm to transform the camera-based images into useful estimates of physiological biomarkers.
  • combinatorial methods may be employed to determine human user biomarkers. For example, finger-over-camera sensors, face-over-camera sensors, and wider-angle cameras that capture a large portion of a user’s body may be combined as a multi-sensor input to scan various locations of a user’s body. For example, camera-based images may be used to make measurements of a user’s body.
  • These measurements may include the length of a user’s torso, a distance between a finger and the chest of a user, a distance between the face and the chest of a user, and the like. These distance measurements may be combined with heart beat data to estimate the blood pressure of a user, and/or other biomarkers.
  • Biomarkers such as HRV, RMSSD (root mean square of successive differences between normal heartbeats), high frequency power, low frequency power, and the like can be estimated from the heart beat data. Relationships between these biomarkers have been linked to various physiological states, including inflammation, certain kinds of disease, training recovery, and the like.
  • Customized Insights can be calculated from one or more biomarkers, and from historical trends observed in historical biomarker data.
  • a “Stress Score” reading is a Customized Insight.
  • the “Stress Score” reading is an output that is a calculated value that is calculated using algorithms that receive trended HRV, HR, and respiration as inputs.
  • Additional Customized Insights include EHRV’s Morning Readiness Score, EHRV’s HRV Score (a normalized manipulation of lnRMSSD), Stress Score, Productivity Score, Emotional State, and Illness Risk. These are calculated from HRV, HR, respiration, blood pressure (e.g., BP) and other input biomarkers.
  • Personalized Guidance may be calculated by algorithms that analyze the Customized Insights together with various biomarkers.
  • the Personalized Guidance provides behavioral advice taking into account the user’s Customized Insights and the various biomarkers. For example, if the Customized Insights algorithm receives a low value for a Readiness Score, then Personalized Guidance might be to train with a lower intensity. While a Personalized Guidance is not medical advice nor is it a medical diagnosis, it may still be valuable to a user.
  • HRV refers to “Heart Rate Variability,” which is a measure of the variability in inter-beat timing of a heart as it is actively beating. While heart rate (e.g., HR) is simply measured as the number of beats of the heart over a given period of time, such as 70 beats per minute, HRV refers to the variations of the precise time durations between the beats. That variation can range from around 10 milliseconds (ms) between beats to more than 100 ms between beats. These times between beats are referred to as inter-beat intervals, i.e., IBIs.
  • When the IBIs are measured by an ECG, the IBIs may also be known as R-R intervals (used interchangeably with “RR intervals”).
  • There are several algorithms used to calculate an HRV value from IBIs and/or RR values. Regardless of the algorithm used, determining an HRV value requires precise heart beat data. Importantly, the biomarkers are not measured directly; they are estimated from raw sensor data, as described above.
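  • For reference, one standard time domain calculation consistent with the description above: RMSSD is the root mean square of successive differences of the inter-beat intervals, and lnRMSSD is its natural log. The sketch below implements only this standard formula, not any proprietary score.

        import math

        def rmssd(ibi_ms):
            """Root mean square of successive differences of inter-beat intervals (ms)."""
            diffs = [b - a for a, b in zip(ibi_ms, ibi_ms[1:])]
            return math.sqrt(sum(d * d for d in diffs) / len(diffs))

        def ln_rmssd(ibi_ms):
            """Natural log of RMSSD, a common basis for scaled HRV scores."""
            return math.log(rmssd(ibi_ms))

        ibi = [812, 790, 845, 805, 830, 798, 821]   # illustrative R-R intervals in ms
        print(round(rmssd(ibi), 1), "ms RMSSD;", round(ln_rmssd(ibi), 2), "lnRMSSD")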
  • Morning Readiness refers to a score related to a user's relative balance of parasympathetic and sympathetic nervous system activity based on the user's trended data and baseline status.
  • the present disclosure generally relates to systems, devices, and methods to measure and track human user biomarkers.
  • the systems and devices of the present disclosure can utilize multiple sensor types, such as optical sensors, pressure sensors, electric-passive sensors, and the like.
  • the various sensors may be disposed on parts of a user’s body, such as the upper or lower parts of a user’s arm or leg, on one or more of a user’s fingers, and/or on the chest of a user. Some sensors may be positioned separate and away from the user, such as a camera sensor located on a phone, laptop, desktop, webcam, security camera, and the like.
  • the miniaturization of technology allows a user to measure more biomarker data than previously available.
  • the increased number of sensors, and the related sensor data, can be incorporated by HRV algorithms to deliver more advanced insights into biomarker physiology.
  • some optical sensors worn on the arm and/or the finger of a user can measure PPG (photoplethysmography) signals that can be transformed into data that displays the variations of a heart’s contraction and relaxation.
  • Electric-passive sensors such as chest straps or multi-lead ECG sensors can measure electrical heart signals that are transformed into ECG traces.
  • Pressure sensors can be integrated into smartphones and wearable devices. Sensors worn against the skin of a user may be able to measure skin temperature, and skin pH. Furthermore, continuous glucose monitors can measure blood glucose levels automatically, and nearly continuously.
  • each sensor can be used to measure and/or track one or more parts of a user’s body to collect readings therefrom. These readings can be combined with one another to provide a detailed analysis of user physiology. Use of a plurality of sensors can substantially increase the statistical power of any of these methods and yield greater accuracy of user data.
  • the collection of camera image-based physiological data may allow ML algorithms to augment, supplement, and/or validate sensor data. Collecting the image-based data may be performed utilizing visual light and/or infrared cameras pointed at the face of one or more users.
  • the collected image data may provide insight into biomarker data such as heart rate, HRV, blood pressure, oxygen levels, CO2 levels, glucose levels, or ketone levels or a combination of the foregoing.
  • the resultant biomarker data may be combined to provide a score that may include HRV data, or may consist of collected biomarker data as a corollary to a calculated HRV score.
  • This data may provide insight into a user’s general awareness or alertness, stress, reflex time, resilience, training, body temperature, or related capacity or capability, or a combination of the foregoing.
  • the HRV system and method herein disclosed describe improved techniques for obtaining usable HRV data using generally available sensors (e.g., physiological sensors, smartphone cameras, or other sensors as herein described) and techniques for improving the quality of HRV data or using lower quality HRV data, including improving signal-to-noise ratios. This will permit more confident HRV scoring when using lower quality sensors such as cameras and permit HRV scoring to be produced from smaller data samples, i.e., reducing the time needed to take useful readings.
  • HRV data has been collected for single (not longitudinal) snapshots of the user's autonomic nervous system, usually derived with longer readings (5 minutes to hours long) with clinical grade equipment.
  • the user has the opportunity to tag the HRV data collected during any reading with contextual information. This tagged information doesn’t affect the scoring, but it can affect the Customized Insights and Personalized Guidance that are provided, and may assist the user in understanding the data and how it relates to any of a user’s goals.
  • the tag data types may include sleep data, exercise data, mood ratings, questionnaires, custom tags/notes, blood glucose level, body weight, as well as other relevant data to be shared with the user.
  • the user may also link the data with third-party apps and services to automate any contextual data collection and display other types of data alongside the HRV data and Morning Readiness scores.
  • the HRV system utilizes captured data from one or more HRV readings to calculate an HRV Score, scaled on a 1-100 basis, based on the natural log of the Root Mean Square of Successive Differences (RMSSD) for the HRV data collected.
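  • The exact scaling behind the 1-100 HRV Score is not reproduced here; by way of non-limiting illustration only, the sketch below maps lnRMSSD linearly onto a 1-100 scale over an assumed range, merely to show the idea of a scaled score derived from the natural log of RMSSD.

        import math

        def hrv_score(rmssd_ms, ln_lo=0.0, ln_hi=6.5):
            """Map ln(RMSSD) onto a 1-100 scale; the bounds ln_lo/ln_hi are assumptions."""
            ln_val = math.log(max(rmssd_ms, 1e-6))
            scaled = (ln_val - ln_lo) / (ln_hi - ln_lo) * 100.0
            return round(min(100.0, max(1.0, scaled)))

        print(hrv_score(20.0), hrv_score(65.0), hrv_score(150.0))   # low / mid / high examples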
  • Changes in the HRV Score correlate with changes in: breathing and respiratory patterns; physical stress; recovery from physical stress; physical performance; psychological stress and health; emotion and mood; cognitive performance; immune system function; inflammation, posture and structural health; injury; biological age; general health and wellbeing; resilience and adaptability; risk of disease; morbidity and mortality; motivation and willpower; and digestive stress.
  • a user may compare the calculated HRV score and other non-proprietary HRV parameters to general population and/or demographic-filtered population data to provide an indication of how the user compares with the general population as a whole or with specific filtered portions of the general population. This comparison may provide a user with some indication as to changes in their HRV values with respect to their own historic values as well as historic values for a given population.
  • the HRV system may also create a daily expressed score for use in tracking a user’s HRV values over time.
  • This daily expressed score is known as a Morning Readiness score.
  • the Morning Readiness score is a scaled score (1-10) that shows the relative balance or imbalance in the user’s sympathetic and parasympathetic nervous system, as well as providing a number and color indicator of the user’s ability to handle the stress and challenges of the day.
  • the Morning Readiness score correlates with day-to-day fluctuations in the nervous system for an individual, highlighting to the user when major changes may have occurred in the body, based on the user’s own unique individual patterns.
  • data must be collected from at least two HRV readings to establish a true baseline and to begin calculating the Morning Readiness score.
  • the Morning Readiness score may be generated through automated pattern recognition applied to the user’s HRV scores over time.
  • the pattern recognition is based on research and uses statistical methods such as standard deviation and mean over time to create the Morning Readiness score each day.
  • This pattern recognition is further refined by research in the HRV system’s unique database of HRV data collected and stored for each user registered with the HRV system.
  • Machine learning algorithms may be applied as testing and data analysis prove machine learning algorithms to be of equal or greater accuracy than the HRV system’s human-generated algorithms.
  • the machine learning algorithms utilized by the HRV system may be trained using the HRV system’s database in order to produce an algorithm that automatically detects a user’s HRV trend and assigns a morning readiness score.
  • the HRV system may also create custom algorithms to determine customized scores. Such customized scores may include an inflammation score or other scoring parameters that may utilize HRV collected data in combination with other scoring data, or other parameters that are collected from outside data sources.
  • the HRV system has access to large electronic data stores containing large amounts of collected HRV data from users of the system as well as metadata from HRV collected data. Much of the HRV data is labeled with contextual tags (metadata) and can be reviewed to label appropriate portions of the data as training data. For example, a user may tag that they are sick, have ingested some caffeine, or have just exercised. In this way, the retained HRV data and metadata may be used to create models that classify HRV data into contextual tag categories, proxies and/or equivalents, or new categories. This data categorization and metadata may permit the HRV system to detect conditions of interest in newly collected HRV data through the use of machine learning algorithms.
  • the HRV system may utilize such machine learning algorithms to predict various conditions of interest to the users of the HRV system.
  • Such conditions may include the ability to predict physical conditioning, physical performance levels, stress levels, EHRV’s Morning Readiness Score, EHRV’s HRV Score (a normalized manipulation of lnRMSSD), Stress Score, Productivity Score, Emotional State, and Illness Risk, and other physical conditions affecting one or more users.
  • the HRV system may utilize data organized into any number of statistical or visual patterns.
  • the data may be organized into one or more Poincare plots (a two-dimensional plotting of beat variability), geometric plots, DFA plots, Power Spectral Density charts, spectrograms, or any number of unique data visualizations that could potentially be run by machine learning (ML) algorithms to analyze data patterns and derive insights, or to identify conditions based on patterns present in the data.
  • the plotted data may be derived from single-reading statistical and visual analysis, and/or from long-term trending patterns. Utilizing machine learning algorithms to perform an analysis of such data plots may permit the HRV system to discover new conditions, such as, in a non-limiting example, conditions associated with stress, athletic performance, fatigue, and other conditions affecting the users of the HRV system.
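  • As a non-authoritative illustration of the Poincare plotting referenced above, the following Python sketch plots each R-R interval against the next and reports the common SD1/SD2 descriptors; the function name, the SD1/SD2 reporting, and the synthetic data are assumptions for illustration and are not part of the disclosed system.

```python
import numpy as np
import matplotlib.pyplot as plt

def poincare_plot(rr_ms):
    """Plot each R-R interval against the next one (RR[n] vs RR[n+1])."""
    rr = np.asarray(rr_ms, dtype=float)
    x, y = rr[:-1], rr[1:]

    # SD1/SD2 are the usual Poincare descriptors: spread across and along
    # the line of identity, respectively.
    sd1 = np.std((y - x) / np.sqrt(2.0), ddof=1)
    sd2 = np.std((y + x) / np.sqrt(2.0), ddof=1)

    plt.scatter(x, y, s=10, alpha=0.6)
    plt.xlabel("RR[n] (ms)")
    plt.ylabel("RR[n+1] (ms)")
    plt.title(f"Poincare plot  SD1={sd1:.1f} ms  SD2={sd2:.1f} ms")
    plt.show()

# Example with synthetic R-R intervals centered around 1,000 ms.
poincare_plot(1000 + 50 * np.random.randn(300))
```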
  • to date, only general guidance has been provided to users once HRV data has been collected. This lack of specific guidance stems from a lack of certainty in the reliability of the input HRV data itself and the inability to leverage reliable outcome data that correlates HRV data with specific plans, courses of action, and outcomes. This results in HRV data being used to generally estimate the user’s nervous system state and to provide equally general feedback. In a non-limiting example, HRV data has been used to provide daily feedback regarding the body’s apparent ability to handle a stressful workout, a binary categorization of a user’s risk for a condition, etc.
  • an application for a training plan may guide a user through a series of HRV measurements, scores, and plan steps to customize the training for the user based on his or her actual HRV data.
  • An initial HRV reading is taken, followed by a programmed event (e.g., a workout). Thereafter, the application guides the user to take a subsequent, updated HRV reading.
  • the plan may be modified. The decision as to how the plan is to be modified may be programmed into the application, e.g., based on HRV data research, learning from the community, etc.
  • the application may provide data feedback related to the HRV score, the HRV scoring trend, contextual feedback, the Morning Readiness scores, or a combination of the foregoing. This allows the user(s) to understand, based on HRV data and other contextual data, the effectiveness of the plan, why it has been modified, etc.
  • this general technique is applicable to a wide variety of possible applications, ranging from near-term or planned applications for guided breathing and meditation, to exercise and fitness plan modification and food sensitivity validation based on HRV scores combined with other data, and even long term plans to use HRV data in novel contexts.
  • additional applications may include modifying the behavior of systems like gaming systems, content recommendation systems, or vehicles using HRV data.
  • HRV data tends to be somewhat difficult to understand. This lack of understanding of what HRV data specifically indicates regarding a user’s physical condition has resulted in the use of various scores. While these various scores are quite useful in driving home the meaning of a user’s current HRV readings, current existing scores may also serve as a defined endpoint to guidance or advice that could flow from the HRV data.
  • the system and method described herein plans to provide an improved set of recommendations that include specific guidance based on HRV data and other physiological, behavioral, and outcome-based data.
  • the improved recommendations may be provided periodically, as part of an ongoing plan, or provided in real-time for live biofeedback. This will allow users to more confidently approach a myriad of tasks that could be improved by monitoring HRV data as well as other aforementioned data and tailoring specific feedback on the basis thereof. This may include provision of various custom scores and directed, goal-oriented applications for individuals or groups.
  • the HRV system may expand on the scores or indices that are provided to users by leveraging the proprietary database of collected HRV data.
  • Scores of interest to the HRV system include a recovery score, an inflammation score, a cognitive function score, a readiness score for specific goals (e.g., triathlon readiness), a health score, a fitness score, a stress index, a “tilt” score (gaming/poker term for being stressed), a self-awareness score, and a glucose/HRV/ketones index.
  • a cognitive function score may have sub-categories that may include: chronic/acute stress; attention/focus; productivity; and so on.
  • Examples of health score subcategories may include: inflammation; resilience; anxiety; and so on.
  • Examples of fitness score sub-categories may include: recovery, readiness, VO2Max, and so on.
  • Existing research may be useful in designing algorithms to make these predictions.
  • a cognitive function score may be created based on research indicating HRV scores are related to cognitive capability.
  • HRV data quality remains a concern, particularly when attempting to offer anything but general guidance. While high quality HRV data can be obtained using a biosensor specifically designed for the task, such as a finger sensor or chest ECG strap, collection of high quality HRV data remains cumbersome due to the need for biosensors and somewhat extensive collection times.
  • the system and method herein described may provide improved techniques for obtaining usable HRV data, as well as additional data input by a user or a medical service provider, using generally available sensors and input methods (e.g., smartphone cameras). This will permit more confident HRV scoring and combination scoring when using lower quality sensors such as cameras and permit HRV scoring to be produced from smaller data samples, thus reducing the time needed to take useful readings.
  • the efficacy of any heart rate variability metric critically depends upon the signal to noise ratio of its source data.
  • the source data is a series of so-called IBIs (inter-beat intervals), i.e., the time between consecutive R waves in the QRS complex, or the time between detected beats in a PPG pulse waveform.
  • the QRS complex is the combination of three of the graphical deflections seen on a typical electrocardiogram (ECG or EKG). It is usually the central and most visually obvious part of the tracing. It corresponds to the depolarization of the right and left ventricles of the heart and contraction of the large ventricular muscles.
  • the Q, R, and S waves occur in rapid succession, do not all appear in all leads, and reflect a single event and thus are usually considered together.
  • PPG functions by using a flash on the camera to emit light, by using the ambient light behind a finger or earlobe as the light source, or by a combination of both light sources. Because the arterioles and arteries distend when blood is pumped by each heartbeat, the opacity of the tissue varies with the cardiac cycle.
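  • As a minimal sketch of the PPG principle described above, the following Python fragment averages one colour channel per camera frame to obtain a pulse waveform and band-passes it to plausible heart-rate frequencies; the channel choice, filter design, and frame format are assumptions for illustration, not the system's processing chain.

```python
import numpy as np
from scipy.signal import butter, filtfilt

def frames_to_pulse_waveform(frames, fps, channel=1):
    """frames: array of shape (n_frames, height, width, 3), RGB.
    Average one colour channel per frame (tissue opacity varies with the
    cardiac cycle), remove the DC component, and band-pass to heart-rate
    frequencies."""
    frames = np.asarray(frames, dtype=float)
    raw = frames[..., channel].mean(axis=(1, 2))        # one sample per frame
    raw = raw - raw.mean()
    b, a = butter(3, [0.7, 3.5], btype="band", fs=fps)  # roughly 42 to 210 bpm
    return filtfilt(b, a, raw)

# Hypothetical usage: ten seconds of 30 fps finger-over-camera frames.
# pulse = frames_to_pulse_waveform(video_frames, fps=30)
```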
  • HRV monitoring and related analytics may be provided through a proprietary finger sensor attached to a user and used for collecting HRV data (photoplethysmogram (PPG) data collected using LEDs), although the mobile app allows users to input HRV data using third-party sensors (e.g., chest strap that collects electrocardiogram (ECG) data and provides interbeat intervals for calculating HRV).
  • the system may utilize the physiological sensor, either an electrocardiogram (ECG) or a photoplethysmogram (PPG), to detect heart beats of a user of the system.
  • upon collection of the measurements from the physiological sensor, the peak of each heart contraction is derived and the time, in milliseconds, between peaks is reported. This derived set of measurements defines the inter-beat intervals or, as commonly known, the R-R intervals.
  • the physiological sensor may be specified as a finger mounted sensor, although other sensors applied to different parts of a user’s body may be equally effective in capturing the sensor measurements.
  • the physiological sensors may be wearable, and may include chest straps or patches, smartwatches, wrist or arm wearables, ring wearables, hats, clothing, or other devices that can contact a portion of the body known to one skilled in the art.
  • the wearable sensors may be based on ECG or PPG physiological sensors.
  • the HRV system may couple integrated sensor data, such as pupil dilation information, with other collected data, such as user-supplied blood pressure readings, and with device data, such as readings from an accelerometer or other sensors installed within an electronic device such as a smart watch, smartphone, tablet, or other computer-based sensor.
  • an integrated sensor may include a wearable device that collects HRV data natively (e.g., an Apple Watch) or a smartphone or other computer-based camera that facilitates image-based HRV data collection.
  • finger-based physiology detection sensors are currently in use for collecting HRV data. These sensor readings may be improved by reducing finger movement via reduced reading times, finger stabilization techniques, or a magnetic accessory that attaches to the finger to stabilize it, among other methods for finger stabilization. Sensor readings may also be improved with the use of machine learning algorithms.
  • camera-based, face-based (e.g., image-based) physiology detection is another mechanism for collecting image data that can be used to derive HRV data, HRV scores, and other biometric data.
  • the image-based, HRV-derived biomarker data may be combined with other biomarker data, such as heart rate, HRV, blood pressure, oxygen levels, CO2 levels, glucose levels, ketone levels.
  • the image-based, HRV-derived biomarker data may also provide insight into a user’s general awareness or alertness, stress, reflex time, resilience, training, body temperature, or related capacity or capability, or a combination of the foregoing.
  • the image-based detection may analyze still images, video images, or a combination of both.
  • image/video analysis permits a user to point any camera with high-enough resolution and/or sufficient speed (e.g., frames per second) at his or her face and to detect signals that may be used to estimate heartrate and HRV, as well as other data such as blood pressure, pupil dilation, etc., some or all of which may be combined into a composite score of which HRV data is only a portion.
  • the image/video analysis may operate in situations wherein the image/video data was collected with environmental lighting (e.g., ambient lighting). In use cases where the environmental lighting was low, the image/video analysis may use image-correcting algorithms to adjust the intensity and brightness of the pixels in the frames.
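  • One simple, assumed form of such an image-correcting adjustment is a gamma-based brightening of dim frames, sketched below; the target mean and gamma limits are illustrative placeholders, not values used by the HRV system.

```python
import numpy as np

def brighten_low_light_frame(frame, target_mean=0.45, gamma_limits=(0.3, 1.0)):
    """Gamma-correct a dim RGB frame (values scaled to [0, 1]) so its mean
    intensity moves toward target_mean; the limits keep the correction modest.
    All parameter values here are illustrative assumptions."""
    frame = np.clip(np.asarray(frame, dtype=float), 1e-6, 1.0)
    mean = frame.mean()
    if mean >= target_mean:
        return frame                               # already bright enough
    gamma = np.log(target_mean) / np.log(mean)     # frame ** gamma raises the mean
    gamma = float(np.clip(gamma, *gamma_limits))
    return frame ** gamma
```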
  • remote cameras can be used to determine this data, such as at a sporting event.
  • the image-based data may be used to derive the various composite or customized scores that use HRV data, breathing rates, oxygen levels, blood pressure, pupil dilation, eye movement/blinking rate, emotional state and additional such parameters.
  • Face-based detection is important because it can be used more naturally during certain activities, such as driving a car, where finger-over-phone-camera or other contact sensing is not possible or not preferable.
  • the HRV system has the ability to use one or more existing databases of HRV data and HRV metadata to improve the accuracy of image-based HRV scoring and other determinations.
  • the HRV system can utilize high quality HRV data and related HRV scores to learn which images or features are associated with the HRV scores. That is, with machine learning, the HRV system can learn which features in an image/video are most closely correlated with higher HRV scores. This may involve the use of one or more machine learning algorithms, for example, to categorize images or image features and associate these categories with particular HRV data and scores.
  • the R-R intervals may be transmitted to a system server using a wireless protocol such as Bluetooth, although this should not be considered limiting as alternative wireless protocols may be used such as BLE, Wi-Fi, NFC, ZigBee, or other such protocols developed in the future, for storage and analysis.
  • the system may have a plurality of software modules that analyze the data to determine hourly measurements, daily measurements, and/or measurements throughout the day for heart rate variability (HRV) in an individual.
  • the raw waveform is processed by a beat detection algorithm to determine where true heart beats occurred. For finger-over-camera or face-over-camera PPG measurements, there are two levels of beat detection and artifact detection.
  • the raw color signal is de-noised, and then cleaned up prior to generating the pulse waveform and estimating the beats from the signals.
  • the IBIs are run through another artifact detection algorithm that detects artifacts such as arrhythmias or ectopic beats.
  • the artifact detection/correction algorithms may be performed directly on estimated IBIs.
  • the denoising step may be performed directly on the pulse waveform, followed by estimating the beats and IBIs, and then finally performing the artifact detection step.
  • the R-R intervals are calculated on the hardware device, and then artifact detection and correction is performed on the R-R interval signal.
  • beat detection algorithms take the form of QRS complex detection algorithms. QRS detection may utilize wavelet analysis or some other pattern matching system. Algorithms for beat detection for both ECG and PPG vary across devices and software applications. After being processed at this stage, what remains is a series of interbeat-intervals (IBIs). An IBI is simply the amount of time (usually milliseconds) between two subsequent beats. A typical value might be 1,000 ms, which would in turn correspond to an instantaneous heart rate of 60 bpm. Given a noise free recording under perfect conditions, the IBIs could be used to directly calculate HRV as they are. As such is rarely the case, it is at this point that artifact detection algorithms should be applied to the IBIs to test for any errors or artifacts that may have entered the signal thus far.
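  • The IBI arithmetic described above can be illustrated with a short sketch that converts detected beat timestamps into inter-beat intervals and instantaneous heart rate; the function names are illustrative only.

```python
import numpy as np

def beats_to_ibis(beat_times_ms):
    """Convert detected beat timestamps (ms) into inter-beat intervals (ms)."""
    return np.diff(np.asarray(beat_times_ms, dtype=float))

def instantaneous_hr(ibis_ms):
    """60,000 ms per minute divided by each interval gives beats per minute."""
    return 60000.0 / np.asarray(ibis_ms, dtype=float)

ibis = beats_to_ibis([0, 1000, 1980, 3010, 4000])
print(ibis)                    # [1000.  980. 1030.  990.]
print(instantaneous_hr(ibis))  # a 1,000 ms interval corresponds to 60 bpm
```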
  • the HRV system software manages connections to multiple sensors, assisting the user in selecting the appropriate sensor for the current measurement.
  • upon receipt of R-R intervals from the hardware, the Elite HRV software, in real time (within a second or two), displays the beat patterns and/or pulse waveforms and the received data visually to the user for live or real-time biofeedback in the form of calculated heart rate, calculated HRV values, and visual charts of heart rate patterns and R-R interval patterns.
  • the Elite HRV software also checks, again in real-time, the received data for accuracy and quality.
  • the data quality checks are based on published research standards (typically done manually by physiologists or research teams), historical population data, and patterns in prior data received within the same reading or session, i.e., beats are analyzed recursively throughout the reading as new beat intervals are received.
  • the HRV system also assists the user visually and algorithmically in identifying when the user's heart rate has stabilized at the beginning of a reading.
  • the RMSSD analytical method is the industry standard time domain measurement for detecting Autonomic Nervous System (specifically Parasympathetic) activity in short-term measurements, where short-term is defined as approximately 5 minutes or less.
  • a natural log (ln) is applied to the RMSSD calculation.
  • RMSSD does not chart in a linear fashion, so it can be difficult to conceptualize the magnitude of changes as it rises and falls. Therefore, it is common practice in the application of RMSSD calculations to apply a natural log to produce a number that behaves in a more linearly distributed fashion.
  • the ln(RMSSD) is expanded to generate a useful 0 to 100 score.
  • the ln(RMSSD) value typically ranges from 0 to 6.5.
  • the system may be able to sift out anomalous readings and create a much more accurate scale where everyone fits in a 0 to 100 range - even Olympians and elite endurance athletes.
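  • A minimal sketch of the RMSSD-to-score pipeline described above is given below, assuming a simple linear mapping of ln(RMSSD) from its typical 0 to 6.5 range onto 0 to 100; the actual expansion and anomaly handling used by the HRV system are proprietary and are not reproduced here.

```python
import numpy as np

def rmssd(ibis_ms):
    """Root Mean Square of Successive Differences of the inter-beat intervals (ms)."""
    diffs = np.diff(np.asarray(ibis_ms, dtype=float))
    return np.sqrt(np.mean(diffs ** 2))

def hrv_score(ibis_ms, ln_max=6.5):
    """Map ln(RMSSD), which typically falls between 0 and 6.5, onto a 0-100 scale.
    The linear mapping and ln_max are assumptions, not the proprietary expansion."""
    ln_rmssd = np.log(rmssd(ibis_ms))
    return float(np.clip(ln_rmssd / ln_max * 100.0, 0.0, 100.0))

print(hrv_score([1000, 980, 1030, 990, 1010]))   # roughly 55 for this small example
```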
  • the HRV score may correlate with changes in breathing and respiratory patterns, physical stress, recovery from physical stress, physical performance, psychological stress and health, emotion and mood, cognitive performance, immune system function, inflammation, posture and structural health, injury, biological age, general health and wellbeing, resilience and adaptability, risk of disease, morbidity and mortality, motivation and willpower, and/or digestive stress.
  • the customized HRV score may be transmitted to a medical practitioner or directly to a user, where the medical practitioner or user may compare the customized HRV score, and other non-proprietary HRV parameters, to population data and/or demographic- filtered population data to provide a basis in comparison to a selected population.
  • when measuring HRV changes before or after specific events, it is recommended that HRV readings be taken for at least 60 seconds immediately pre- and post-activity or event. For better accuracy in the HRV readings, it is recommended that the user keep the same body position between readings that the user wishes to compare to past or future readings. In an alternative non-limiting example, HRV readings can gather relevant HRV data in as little as 30 to 60 seconds or over durations as long as 24 hours. However, for Morning Readiness type readings, the HRV system recommends that the user take at least a two-minute reading to collect HRV data for that time.
  • Morning Readiness may be estimated from readings of 60 seconds (e.g., 1 minute) to 300 seconds (e.g., 5 minutes), but certain HRV indices (such as frequency domain metrics) are only available for readings over 120 seconds (e.g., 2 minutes)
  • This data collection effort should be performed after the user’s period of longest sleep.
  • the HRV system recommends taking data collection readings of between 4 and 20 minutes in duration and repeating as often as required by the user. During this data collection period the user has the option to turn on audio and/or visual cues for guided breathing patterns, mindfulness, and meditations.
  • the system has a mobile application (app) for use in capturing and transmitting information between the user and the HRV monitoring system.
  • the mobile app currently focuses on providing general data (e.g., heart rate), scaled HRV score, and Morning Readiness score coupled to high-level or general feedback based on the HRV data.
  • the mobile app also provides more detailed HRV metrics for every reading, such as different time domain and frequency domain indices. For example, a Morning Readiness score may be presented as a numeric value and a gauge graphic.
  • the software modules in the system server may be active to manage connections to multiple sensors, assisting the user in selecting the appropriate sensor for any desired measurement.
  • upon receipt of the R-R intervals from the sensor(s), regardless of the sensor utilized, the system software immediately performs a set of functions in real time, where real time is specified as an interval of less than two seconds from the receipt of the R-R interval information.
  • the system software displays the beat patterns and received measurement data visually to the user for live feedback to the user in the form of calculated heart rate, calculated HRV values, visual charts of heart rate patterns and R-R interval patterns. This feedback to the user is also known as biofeedback.
  • the system software is operative to check the received measurement data for accuracy and quality. In a non-limiting example, data quality checks are based upon published research standards, historical population data, and/or patterns in prior data received within the same “reading” or measurement collection activity. Heartbeats are analyzed recursively throughout the reading as new beat intervals are received.
  • readings can be as little as 30 to 60 seconds in duration or as long as 24 hours.
  • recommendations to the user are to take a reading between 60 and 180 seconds in duration, with the average being approximately 120 seconds (2 minutes) after the period of longest sleep, which is typically a morning reading.
  • the recommendation is to take a reading of between 4 and 20 minutes in duration, repeating as often as the user desires to foster user actions.
  • the Morning Readiness Score may correlate with day-to-day fluctuations in the nervous system for an individual, highlighting to the user when major changes may have occurred in the body, based on their own unique individual patterns.
  • the ranges of the Morning Readiness score provide information to the user on whether the user is in a Sympathetic or a Parasympathetic status on that given day.
  • values in the 1-3 portion of the range are in the red zone of a gauge as represented on a gauge score graphic. This indicates a wide swing in balance either towards the Sympathetic or Parasympathetic side. A wide acute swing in either direction is usually in reaction to a strong acute stressor or reaching a threshold of accumulated stress.
  • Values in the 4-6 range are in the yellow zone. Yellow indicates a similar, but not as drastic, change in relative balance as a red indication. Yellow days are often nothing to worry about in isolation. Values in the 7-10 range are the green zone.
  • the readiness scores use standard deviation thresholds for an individual’s historical (baseline) HRV data to determine the readiness for the day.
  • a perfect 10 score is achieved when the relative balance is slightly Parasympathetic leaning. This means that if the user normally scores around a 45 on the HRV score, then an HRV score of 46 may produce a relative balance score of 10. For example, one user could produce a score of 10 with an HRV of 46 (and a baseline around 45) while another user might produce a 10 with an HRV of 47 (and a baseline around 45), because their historical data would produce different thresholds (i.e., thresholds personalized to them).
  • the sensitivity of the 1-10 relative balance score depends on a user’s individual patterns. If the user often fluctuates widely day-to-day, then the user’s relative balance gauge will become less sensitive to change. If the user’s HRV scores hardly fluctuate at all during the baseline period, the relative balance gauge will become more sensitive to small changes. Additionally, utilizing proprietary data analysis algorithms and machine learning systems the sensitivity to small changes may be increased further permitting greater accuracy for the relative balance score and the reporting of any fluctuations in a user's relative balance gauge.
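  • A rough, non-authoritative sketch of such a standard-deviation-based relative balance score is shown below, using the 1-3 red / 4-6 yellow / 7-10 green zoning described above; the thresholds, zone values, and example baseline are assumptions, not the proprietary values.

```python
import numpy as np

def morning_readiness(todays_hrv_score, baseline_scores):
    """Rough 1-10 relative balance score from how far today's HRV score sits from
    the user's recent baseline, in units of the baseline's own variability.
    The thresholds and zone values below are illustrative assumptions."""
    baseline = np.asarray(baseline_scores, dtype=float)
    mean = baseline.mean()
    std = max(baseline.std(ddof=1), 1e-6)
    z = (todays_hrv_score - mean) / std      # signed deviation from baseline

    dev = abs(z)
    if dev > 2.0:
        score = 2       # red zone (1-3): wide swing in either direction
    elif dev > 1.0:
        score = 5       # yellow zone (4-6): moderate change in relative balance
    else:
        # green zone (7-10): a slight parasympathetic (positive) lean scores highest
        score = 10 if 0.0 < z <= 0.5 else 8
    return score, z

# Baseline around 45 with day-to-day variation; today's 46 lands a 10 here.
print(morning_readiness(46, [45, 42, 48, 44, 47, 43, 46]))
```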
  • the data analysis results in an instant HRV score and a morning readiness score that can be used for spot checks, and can be used as a parameter to be analyzed over time to determine long term HRV measurements for an individual.
  • the instant HRV scores are also accumulated and analyzed over time to help physicians and users in tracking HRV, forming a part of the health tracking data for the user.
  • This instant HRV score is also used by professional and elite athletes to analyze their heart rate variability to optimize performance, and may be used by a coach or an automated software algorithm to also create training plans based upon the athlete’s performance as shown by the HRV score.
  • the morning readiness score provides a daily baseline indication for the user. This score is trended and charted over time, to help the user understand how acute, short-term, medium-term, and long-term choices and events impact their score over time.
  • HRV data readings may currently be taken utilizing various devices, where such devices may include a mobile device having network connection capability, such as a smartphone, iPad, tablet, wearable mobile device, laptop, or other mobile device, as well as cameras and sensors incorporated directly into fitness equipment. Data may also be generated by sensors built directly into a mobile device and may not require a connection, wired or wireless, to external sensors. During the reading, the user may also have access to audio and/or visual cues to present guidance on breathing patterns, mindfulness, and meditations.
  • the user may have an opportunity to tag the HRV reading with contextual information.
  • the tag information may be attached to the completed score derived from the HRV reading but does not affect the calculation, analysis, or creation of the completed score.
  • An optimized future system may utilize the tagged HRV reading with contextual information to discover meaningful patterns identified in data analysis or by machine learning algorithms to generate composite metrics that utilize the contextual information in the metric generation.
  • the tag information may be attached to the score record to assist the user in understanding the HRV reading and score data and how the data relates to any goals that have been expressed by the user.
  • a user may add tag information consisting of sleep data, exercise data, mood ratings, questionnaires, custom tags/notes, blood glucose, body weight, or any other data that is useful for assisting the user in achieving their goals.
  • the user may link an established account maintained on the system server with third-party applications and services to automate the collection and display of other types of data that may be associated with the collected HRV reading data and scores, including an established morning readiness score and composite scores reflecting HRV data and ancillary data collected from one or more users.
  • the signal quality of the sensor is analyzed in conjunction with the full data captured by the sensor during the HRV reading action.
  • the system server is operative to create a novel, customized signal quality rating.
  • This signal quality rating may be provided to inform and educate the user on the validity and quality of the reading, i.e., the quality of the data and therefore the quality of the calculated values and insights, when received and reviewed by the user.
  • the signal quality of the HRV measurement apparatus is currently analyzed initially and again with the full collected data from an HRV reading.
  • a proprietary signal quality rating is provided to the user to educate them on the validity and quality of the reading.
  • This signal quality rating is based on internal research determining the degree of confidence in a result given a certain frequency, total amount, and magnitude of signal artifacts from all sources, as compared to the total duration of the reading and the detected patterns present within the reading.
  • the signal quality score is based on published research standards that have been previously created by physiologists and/or research teams, historical population data that has been collected over time, and patterns in prior data received within the same data collection reading or session.
  • customized scoring may be generated from the analysis of the received signal data upon determination that the received HRV reading data is in a form that is ready for analysis by the receiving device, where the receiving device may consist of a system server, a smartphone with or without an internet connection, and/or fitness equipment having an internet connection or having an embedded analysis software module. This may occur when the received signal data is free of artifacts and signal corruption. While there are many potential sources of signal corruption, the net effect, regardless of corruption source or sensor type, can be classified as one of two fundamental types. Either:
  • the beat detection algorithm missed one or more beats that actually occurred (type 1), or
  • the beat detection algorithm detected one or more false beats that did not actually occur (type 2).
  • Type 1 is sometimes referred to as a false negative, type 2 as a false positive.
  • these two types of artifacts have their own distinct waveform patterns and properties, such that the corrupted signal can be analyzed and, often, the impact of corruption can be mitigated or eliminated entirely.
  • ectopic beats can exhibit properties of both false positives and false negatives.
  • the HRV system uses a proprietary blend of algorithms for different scenarios to analyze and clean the signal from the data collection sensor or device, as needed. Attempts to clean the signal and improve the data collection effort may include feedback to the user in certain circumstances. In a non-limiting example, the system could suggest to the user times at which to take readings to improve data collection (as with the Morning Readiness score), postures that avoid generating certain artifacts, sensor placement, etc., to reduce artifacts. Moreover, signal clean-up (e.g., filtering techniques) may be applied differentially based on detection of known issues such as incorrect posture.
  • the simple thresholding described so far is a static variant, in which the threshold values are not modified per signal.
  • a dynamic variant would be one in which the mean of the signal is calculated, and the thresholds set as mean + c1 and mean - c2 for some constants c1, c2. In fact, this system suffers the same weaknesses as the static scheme.
  • a slightly better approach to the dynamic simple thresholding described above would be to replace the mean with the median, and the c1, c2 values with c1*std, c2*std, where std is the signal's standard deviation. Adding this flexibility to the algorithm helps account for the large difference in non-artifactual variance observed across individuals. Still, this system suffers from critical flaws. Most notably, the standard deviation of a signal is quite sensitive to artifacts itself. Therefore, if the signal contains a large number of artifacts, or a few artifacts of large magnitude, this system will allow the less deviant, but still artifactual, intervals through.
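  • A minimal sketch of the dynamic median-and-standard-deviation thresholding just described follows; the constants c1 and c2 and the example series are illustrative, and the example intentionally exhibits the weakness noted above.

```python
import numpy as np

def dynamic_threshold_artifacts(ibis_ms, c1=2.0, c2=2.0):
    """Flag IBIs more than c1/c2 standard deviations above/below the median.
    The constants are placeholders; the approach has the weakness noted above."""
    ibis = np.asarray(ibis_ms, dtype=float)
    med = np.median(ibis)
    std = ibis.std(ddof=1)
    return (ibis > med + c1 * std) | (ibis < med - c2 * std)

ibis = [1000, 990, 1010, 2050, 995, 1005, 480, 1000]   # 2050 and 480 are artifacts
# Flags the 2050 ms interval but misses 480 ms: the artifacts themselves inflate
# the standard deviation, which is exactly the flaw described in the text.
print(dynamic_threshold_artifacts(ibis))
```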
  • the first innovation is to analyze not the IBI intervals themselves, but rather the differences between subsequent intervals. This strategy minimizes the negative impact of valid local variations in heart rate, while retaining the ability to capture artifact generated spikes or impulses.
  • the second innovation is quantile-based threshold determination.
  • the Berntson algorithm is an industry standard which utilizes both IBI difference analysis and quantile thresholding to good effect. This algorithm assumes a normal distribution of beat differences in order to calculate the Maximum Expected Difference (MED) for veridical beats, as well as the Minimum Artifactual Difference (MAD).
  • QD is the quartile deviation of the IBIs.
  • the artifact cutoff threshold is then taken as a mean of the two values, which given normally distributed IBI differences will cover at least 97.5% of artifact-related differences, though in practice the number is often higher.
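  • A sketch in the spirit of the Berntson-style quantile thresholding described above is shown below; the MED/MAD constants follow the published algorithm as commonly cited and should be treated as assumptions rather than the system's implementation.

```python
import numpy as np

def berntson_artifact_mask(ibis_ms):
    """Quantile-based artifact detection on successive IBI differences.
    The MED/MAD constants are assumed from the commonly cited Berntson
    formulation; the cutoff is the mean of the two, as described above."""
    ibis = np.asarray(ibis_ms, dtype=float)
    diffs = np.abs(np.diff(ibis))

    q1, q3 = np.percentile(ibis, [25, 75])
    qd = (q3 - q1) / 2.0                        # quartile deviation of the IBIs
    med = 3.32 * qd                             # Maximum Expected Difference
    mad = (np.median(ibis) - 2.9 * qd) / 3.0    # Minimum Artifactual Difference
    criterion = (med + mad) / 2.0               # artifact cutoff threshold

    # Mark the pair of IBIs around every threshold-exceeding difference.
    mask = np.zeros(len(ibis), dtype=bool)
    for i, d in enumerate(diffs):
        if d > criterion:
            mask[i] = mask[i + 1] = True
    return mask

print(berntson_artifact_mask([1000, 990, 1010, 2050, 995, 1005, 1000]))
```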
  • the first modification regards the logic which marks artifactual beats given threshold-exceeding IBI differences. In principle, the Berntson algorithm marks pairs of IBIs, not individual IBIs, which can be seen as one cost-of-difference based method.
  • the second modification is a set of heuristics for identifying contiguous runs of artifactual beats.
  • Another artifact detection technique with traction in the literature is based on impulse response detection.
  • the strategy is to calculate a series of deviations from the median in order to detect unusually large impulses, then normalize each of these differences with another median derived value specific to each RR series.
  • a windowed version of this algorithm enhances accuracy by cutting the target series into overlapping windows and calculating the median and normalization factor for each window separately. It also sets the overlap factor such that each value (except the first few) is tested at least twice.
  • X_j(h) is the normalized difference from the median of the h-th element in the j-th window. Note that the median in the denominator is calculated once for the entire window.
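  • A rough sketch of such a windowed, median-normalized impulse detector follows; the window length, overlap, and threshold are illustrative assumptions, not the values used by the HRV system.

```python
import numpy as np

def windowed_impulse_artifacts(rr_ms, window=30, overlap=0.5, threshold=5.0):
    """Flag unusually large impulses: each sample's deviation from its window's
    median, normalized by a median-derived scale computed once per window."""
    rr = np.asarray(rr_ms, dtype=float)
    n = len(rr)
    step = max(1, int(window * (1.0 - overlap)))       # ~50% overlap
    starts = list(range(0, max(n - window, 0) + 1, step))
    if starts[-1] != max(n - window, 0):               # make sure the tail is covered
        starts.append(max(n - window, 0))

    suspect = np.zeros(n, dtype=bool)
    for start in starts:
        seg = rr[start:start + window]
        med = np.median(seg)
        scale = np.median(np.abs(seg - med)) or 1.0    # per-window normalization factor
        x = np.abs(seg - med) / scale                  # normalized difference from the median
        suspect[start:start + window] |= x > threshold
    return suspect

# A single 1,800 ms impulse in an otherwise steady 1,000 ms series gets flagged.
print(np.where(windowed_impulse_artifacts([1000] * 20 + [1800] + [1000] * 20))[0])
```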
  • PWIR refers to a pattern-based windowed impulse response approach.
  • Patterns fall into three categories that determine the appropriate corrective action to perform on an artifactual RR. Possible corrective actions include interpolation and recovery of split intervals via addition. The benefit of this method is that it tests not only the magnitude of an impulse, but the shape formed by every four consecutive samples. This allows for stricter threshold values without major increase in false positives.
  • Ectopic beat: (a) a single-point negative spike followed by a single-point positive spike; (b) the ectopic beat prevents the normal beat from occurring, and then the subsequent beat comes on the original schedule.
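  • The two corrective actions mentioned above (recovering a split interval by addition and interpolating over an artifactual interval) can be illustrated as follows; the helper names and example values are hypothetical, and this is not the proprietary pattern-matching logic.

```python
import numpy as np

def merge_split_interval(ibis_ms, i):
    """A false beat splits one true interval into two short ones;
    recover the original interval by adding the pair at indices i and i+1."""
    ibis = list(ibis_ms)
    ibis[i:i + 2] = [ibis[i] + ibis[i + 1]]
    return ibis

def interpolate_interval(ibis_ms, i):
    """Replace a single artifactual interval with the mean of its neighbours."""
    ibis = list(ibis_ms)
    ibis[i] = (ibis[i - 1] + ibis[i + 1]) / 2.0
    return ibis

print(merge_split_interval([1000, 480, 520, 1000], 1))   # -> [1000, 1000, 1000]
print(interpolate_interval([1000, 1700, 1000, 990], 1))  # -> [1000, 1000.0, 1000, 990]
```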
  • the final algorithm tested is one based upon the Integral pulse frequency modulation (IPFM) model of heart rate variability.
  • the IPFM model also called the “integrate and fire” model, describes the beating of the heart in terms of sympathetic and parasympathetic inputs to the sinoatrial (SA) node via a modulating function of time: m(t).
  • ectopic beats are common even in healthy individuals. These contractions can be either ventricular or atrial in origin, and are quite distinguishable from normal beats on an ECG.
  • ectopic beats can be further classified into ventricular ectopic beats, in which the normal heart beat following the artifactual beat occurs “on schedule”, and supraventricular ectopic beats, in which the normal heart beats are effectively “reset” by the artifactual beat.
  • Known patterns such as these can be exploited by artifact detection algorithms. Additional physiological sources of artifacts include atrial fibrillations, ventricular fibrillations, and muscle contractions.
  • in addition to manually designing algorithms to reduce signal noise and improve HRV scoring from lower quality data, a large amount of historical HRV user data may be leveraged to provide more accurate HRV scores using lower quality data or less data. This additional data analysis allows HRV scoring to be completed in a shorter time span or with lower quality data than would otherwise be possible or optimal. In a non-limiting example, a user can provide less HRV data or provide HRV data of lower quality and still receive a valid HRV score, perhaps tempered with a score quality or confidence rating. The HRV system may use machine learning to associate the presently input HRV data with a particular class or category based on a model trained with previously recorded HRV data and scores.
  • a user’s input of lower quality HRV data may be insufficient to assign an HRV score using normal processing via a static algorithm.
  • the lower quality input HRV data may be input to a machine learning algorithm trained using the pre-existing HRV data currently collected and stored in the HRV system’s database.
  • the HRV system may optionally utilize contextual data or a composite of signals to boost the quality of the collected HRV data and thus provide an HRV score of higher confidence in terms of accuracy and quality.
  • the machine learning algorithm may be trained using historical HRV data from a validated sensor. This permits the software to provide an HRV score even though the data collected typically would be insufficient.
  • a reported quality score may indicate the technique used or the confidence of the score reported.
  • a novel, customized HRV score may be calculated from the analysis of the received HRV reading data.
  • the system may receive the R-R intervals directly from a chest strap heart rate monitor or other sensor device attached to a user. Obvious artifacts within the data, such as readings that are out of bounds, obviously incorrect, or corrupted, are cleaned and/or removed.
  • the raw, unaltered R-R intervals are backed up securely to an electronic database maintained within the system server. This allows for optimization and improvement of algorithms for all current and past calculations, as well as for the export of the raw, unaltered R-R intervals to a different system or storage location if desired by the medical practitioner or user.
  • an additional novel and proprietary score may be prepared by the system and transmitted to a user on a daily basis, in the morning and based upon a morning readiness HRV reading performed by the user.
  • the Morning Readiness gauge indicates a user’s state of relative balance. In other words, it is comparing the user’s HRV values to the recent past and providing a comparison for the user on whether the user’s Autonomic Nervous System (ANS) is in a similar state or if it is swinging widely outside of the norm for the user.
  • additional scores may be calculated based upon additional physiological data in addition to collected HRV data.
  • additional physiological data such as image data, environmental data, historical health history data, or other biometric data may form the basis for one or more biological health scores that include HRV data as a particular component.
  • the collection of camera image-based, physiological data may be performed utilizing visual light and/or infrared cameras pointed at the face of one or more users.
  • the collected image data may provide insight into biometric and /or biomarker data such as heart rate, blood pressure, temperature, oxygen levels, CO2 levels, glucose, ketones, as well as insight into a user’s general awareness or alertness, stress, reflex time, resilience, or a combination of any of these data categories.
  • the resultant combined score may include HRV data or may consist of collected biometric data as a corollary to a calculated HRV score.
  • the HRV system may provide users with the benefit of score and performance analysis to assist in predicting success with short-term and long-term physical goals and recommendations and suggestions on how to achieve identified user goals.
  • users of the HRV system can submit data related to their goals, plans, HRV data, and outcomes, and utilize the HRV system to identify and/or formulate optimal plans to achieve their desired goals.
  • a recommendation may take the form of a general training plan or a training plan customized for the individual.
  • Community members may vote on these plans to surface the best plans, which could be promoted to users, e.g., based on HRV data similarity to those that have completed the plans.
  • the HRV system may also provide Artificial Intelligence (AI) enhanced and implemented performance predictions and plan suggestions. These predictions and plan suggestions may take the form of a virtual coach, but specifically incorporate HRV data as an input. These AI-suggested plans or virtual coaches may take the place of user-submitted plans.
  • the HRV system may develop machine learning algorithms that take user profile data, including HRV data, and use it to predict the type or level of exercise to suggest to the user to achieve a specific goal. Similarly, this profile data, including HRV data, may be used to predict performance during an activity, such as running or biking. Additional types of program suggestions could be implemented outside of the health and fitness domain while still making use of HRV data.
  • the additional program suggestions may realize the benefit of the scoring provided by the HRV system to create a service for users.
  • the HRV system may leverage its ability to accurately analyze HRV data as a service to others.
  • the HRV system may offer a scoring service by which the HRV system receives HRV data collected by a third-party application, analyzes the third-party collected HRV data as a service to a user or third-party entity, and outputs the analysis to the third-party app for use by the third-party app.
  • This service offering may include receiving and ingesting the collected raw HRV data as a cloud service or offering an API to third parties for data ingestion, processing the raw data, and outputting proprietary score(s) to the requesting application.
  • This service may be offered by the HRV system and used to operate on a variety of different input data types and produce a variety of different HRV based scores, system and application modifications, or data displays.
  • the HRV system may also provide trend and analysis information based upon HRV data collected and scores derived from the HRV data collected.
  • the Morning Readiness score calculated by the HRV system may provide a daily baseline indication for the user.
  • the Morning Readiness score is trended and charted over time to help the user understand how acute, short-term, medium-term, and long-term choices and events impact the score over time.
  • the HRV Score and other data and parameters can be charted and analyzed longitudinally, as well as for each individual reading.
  • the large amount of existing HRV user data may allow the HRV system to provide more specific guidance to users in view of the user’s trend data.
  • the HRV system can discover, either utilizing a manual review or an automated machine learning process, that prior users exhibiting a similar trend had a positive or negative outcome by making certain adjustments.
  • These data insights can form the basis of customized feedback for the users given their data trends, desired outcomes and past user experiences.
  • the HRV system may associate a current user’s trend data and a stated goal (e.g., mental health, weight loss, etc.) with other users having similar trend data, known modifications (e.g., increased exercise, decreased sleep, etc.), and the same or similar stated goal.
  • the HRV system software can suggest changes that have been helpful for past members and provide cautionary information about modifications or continuations of the same behavior that have been historically harmful or negative for members in the past.
  • the HRV data may indicate inflammation in the body and may be analyzed to create an inflammation score for tracking adverse conditions, also forming a portion of the tracked health data.
  • the user has the option to link their data to a team, where a coach, wellness practitioner, or medical practitioner may view the data.
  • users In addition to requests to the HRV system for analysis of their own data, users have the option to link their own collected data to a team or group, where a coach or healthcare practitioner can view the data.
  • the coach or healthcare practitioner in turn may have access to team level and individual team member level HRV based feedback, such as proprietary scores, customized modifications to training plans, etc.
  • This allows the users, e.g., coaches, trainers, healthcare professionals, to access customized guidance and recommendations for clients, patients, etc., e.g., at the team or organization level, subgroups within the team or organization, or individual team or organization members. This permits group leaders to have access to HRV data of the team or group and associated HRV-based guidance with increasing specificity.
  • a CrossFit gym may obtain an HRV-based suggested modification (e.g., color coded Green/Yellow/Red indication) to the workout of the day (WOD) for individual users or groups of users.
  • modifications may be selected based on global data (e.g., other users having similar HRV readings or trends) or more specific data, e.g., coach or healthcare professional modifications matched to HRV recommendation categories.
  • a matrix display may be provided for dynamically organizing team or group members per HRV system-based suggestions or modifications.
  • a variety of user interfaces and functionalities may be provided in connection with a team-based view.
  • the HRV system may provide a capability to sort team members by HRV- based workout intensity recommendation.
  • a matrix may be displayed organizing the team or group members into columns and rows, such as one user per row, with a color coded (or otherwise indicated) HRV based modification, along with an HRV score in associated columns.
  • HRV system-based modifications may be paired with predetermined, customized guidance per user, such as that input by a coach, health practitioner, etc.
  • the matrix can be re-organized to dynamically group users via various modalities.
  • the HRV system may prepare the matrix listing users per sub-group (e.g., offense and defensive positions), based on HRV scores (or ranges), based on modifications, or based on any grouping that provides useful information to the user.
  • the HRV system may be implemented utilizing a finger sensor based on LEDs that collect PPG data.
  • the finger sensor uses three LEDs (infrared, red and green).
  • the LEDs are paired with sensors (detectors) on opposing sides.
  • the LEDs cycle to attempt to obtain a strong reading, which assists in handling user differences (skin tone, cardiac patterns, etc.).
  • the LEDs take readings at 500 Hz.
  • the current sensor can measure other data, such as pulse oximetry data, in addition to HRV data.
  • the HRV system data collection readings may be performed utilizing other sensor devices including gaming input devices, AR/VR gloves, or other physical sensors.
  • the HRV system may accept HRV data collected by any available hardware device that provides sufficient signal quality to collect the HRV data at acceptable sample rates.
  • the HRV system may collect HRV data using an integrated sensor such as a wearable device that collects HRV data natively.
  • integrated sensors may include devices such as, in a non-limiting example, an Apple Watch, or a smartphone or other computer-based camera that facilitates image-based HRV data collection, coupled with other data collection (e.g., blood pressure, pupil dilation, device data such accelerometer, etc.).
  • for use of the existing sensors in a user’s common hardware (e.g., smartphone, smartwatch, laptop, etc.), cameras and finger-based physiology detection sensors are among the few currently viable options. These have been used by others for obtaining HRV data. These sensors may be improved by reducing finger movement via reduced reading times or finger stabilization techniques, such as a magnetic accessory that attaches to the finger to stabilize it.
  • such data can be used to validate treatments, or for display or feedback to gamers or to those watching a live streaming event.
  • the data may be utilized in an office to determine when employees should take breaks, to guide meditation or breathing practices using live, real-time feedback, to create, modify or evaluate the efficacy of corporate wellness programs, and in stress level monitoring.
  • the HRV data may also be used in content recommendation systems, to enhance sports broadcasting and news broadcasting, or to modify the behavior of systems or devices, such as the behavior of automated vehicles, self- service kiosks, gaming systems, advertisement or content selection systems, smart home devices, office furniture, etc.
  • the user’s detected HRV data or a score using the collected HRV or other data may be used to influence advertisement selection (alone or in combination with other contextual data, e.g., GPS location of the user’s device) or to influence music selection systems to change music based on a user’s determined mood or goal for the day (and the current progress towards that goal).
  • the collected HRV data could in turn be fed into other device applications, e.g., virtual assistants or smart home devices to adjust their recommendations, tone, etc., or to adjust office furniture, room temperature, ergonomics, and sleep environment.
  • scores or suggested modifications may be provided as a service to various third-party applications and devices.
  • the HRV system may collect and accumulate HRV data, HRV scores, camera sensor-based image data, and/or other biometric data for a user. This data may be acquired from various sensors and sources of data and stored in an electronic storage apparatus. The HRV system may then look to the user to define a goal with regard to improving one or more HRV and/or combined HRV and composite data scores that the user wishes to achieve and provide a suggested activity to achieve that goal.
  • the activity suggested for the user may be one part of a predetermined plan based at least in part on a model trained using HRV data, HRV scores, camera sensor-based image data, and/or other biometric data.
  • the HRV system may obtain biometric data for a plurality of users, including HRV data collected from a sensor, and utilize the biometric data to train a model utilizing machine learning algorithms.
  • the machine learning algorithms may classify the biometric data into one or more predetermined classes. This collected data may then be analyzed by the HRV system to predict an HRV score, or a composite score, during or after completion of a pre-established activity.
  • the HRV and composite data may be normalized to reflect population trends and help a user understand their particular scores as compared against population averages and norms.
  • the HRV system contains a method for biometric monitoring and scoring in which the HRV system is actively collecting biometric data from a plurality of users.
  • the collected data comprises physiological data and environmental data associated with said plurality of users over time, and all collected data is stored in an electronic storage system.
  • the HRV system is then active to analyze all collected data to create a composite score that is based at least on heart rate activity data, biometric and/or biomarker data, and environmental data.
  • the composite score may be compared against historical composite scores to determine activity modifications that will impact the behavior of any of a plurality of users prior to collecting additional biometric data.
  • the HRV system may present these activity modifications to the plurality of users on a display device such as a wearable device, smartphone, or other mobile device in combination with recommended actions to accomplish said activity modifications.
  • the HRV system may utilize one or more signal cleaning algorithms to detect artifacts in the collected data and improve the collected data by removing any detected artifacts that impair the signal quality of the biometric and/or physiological data. Additionally, the signal cleaning of the collected data may be performed during a data collection action and the cleaned collected data is stored in an electronic format prior to analysis of said cleaned collected data.
  • the HRV system may track the composite score over a preconfigured time span.
  • the composite score may consist of at least HRV data, biometric data, physiological data, and environmental data associated with a single user or a group of users.
  • the HRV system may receive from a user or medical practitioner a threshold composite score or composite score range that is preferred for the user to maintain.
  • the HRV system may transmit to a user recommended actions comprising events, interventions, and/or planned steps in accordance with maintaining said user’s particular composite score.
  • the HRV system may utilize any of a finger sensor, an LED sensor, a chest-strap electrocardiogram sensor, a camera, or sensors contained within or attached to a mobile device associated with a user.
  • the composite score may be presented to a user as a numeric value and a gauge graphic to permit the user to visually understand changes in the composite score over time. Additionally, the composite score, recommendations, and guidance may be provided as a report, as part of an ongoing data display, or in real-time as live biofeedback to a user during an activity.
  • a machine learning model may be trained to learn the patterns in the sequence of video frames from finger-over-camera or face-over-camera images that are associated with the blood volume change in the tissue.
  • the training data may be pairs consisting of video frames recorded from finger-over-camera or face-over-camera images as input and R-R intervals from an HRV sensor used as the desired output, or ground-truth data. This input and output pair may need to be synchronized before training the model by using signal processing peak detection methods.
  • a longer video can be subdivided into shorter videos using a moving time window. The shorter subdivided video segments may be fed into the model to generate R-R intervals.
  • This model may be a deep learning network, such as CNN (Convolutional Neural Network).
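  • By way of a non-limiting illustration, the moving-time-window subdivision described above might be sketched as follows; the array shapes, window length, and stride are illustrative assumptions rather than part of the claimed system.

```python
import numpy as np

def sliding_windows(frames: np.ndarray, window_len: int, stride: int) -> np.ndarray:
    """Subdivide a long recording (frames shaped [T, H, W, C]) into shorter,
    overlapping segments using a moving time window. Each segment could then be
    fed to a trained model (e.g., a CNN) to generate R-R intervals."""
    starts = range(0, frames.shape[0] - window_len + 1, stride)
    return np.stack([frames[s:s + window_len] for s in starts])

# Hypothetical usage: a 30 s clip at 30 fps, split into 5 s windows with a 1 s hop.
video = np.zeros((900, 64, 64, 3), dtype=np.float32)
segments = sliding_windows(video, window_len=150, stride=30)
print(segments.shape)  # (26, 150, 64, 64, 3)
```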
  • the machine learning model may include supervised learning.
  • a supervised learning model may be used to eliminate the need to extract features or to do feature engineering.
  • a supervised learning model may be trained to learn to extract the features from the video frames such that it is possible for the video data stream to be incorporated into the machine learning model.
  • a machine learning model may be trained to utilize a series of R-R intervals (e.g., a data stream of R-R intervals) as input, and to produce as an output a time-domain or frequency-domain HRV metric (e.g., biomarker) such as an HRV Score, an RMSSD value, and/or a high-frequency power value.
  • This machine learning model may be an LSTM (Long Short Term Memory) model that can learn time-dependent patterns in a time-series signal.
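  • As a non-limiting sketch (not the claimed model), such an LSTM could be assembled as follows; the sequence length of 120 intervals, the layer sizes, and the choice of RMSSD as the target are illustrative assumptions.

```python
import tensorflow as tf

# Minimal sketch: an LSTM that maps a (zero-padded) sequence of R-R intervals,
# in milliseconds, to a single time-domain HRV metric such as RMSSD.
model = tf.keras.Sequential([
    tf.keras.Input(shape=(120, 1)),           # up to 120 R-R intervals per window
    tf.keras.layers.Masking(mask_value=0.0),  # ignore zero-padded timesteps
    tf.keras.layers.LSTM(64),                 # learns time-dependent patterns
    tf.keras.layers.Dense(1),                 # regressed HRV metric (e.g., RMSSD)
])
model.compile(optimizer="adam", loss="mse")
# model.fit(rr_windows, rmssd_targets, ...)   # targets from a reference HRV sensor
```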
  • the machine learning model may be trained to detect and remove artifacts from a data stream of R-R interval signals.
  • the artifacts to be removed may be caused by motion, arrhythmias, premature ectopic beats, atrial fibrillation, measurement and/or signal noise.
  • a hybrid machine learning model may be trained.
  • a stacking model may incorporate a CNN (convolutional neural network) model stacked with an LSTM (long short term memory) model to detect HRV values directly from the video input.
  • the stacking model may be formed by stacking the CNN model on the LSTM model, or by stacking the LSTM model on the CNN model.
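  • One possible, non-limiting arrangement of such a stacked model applies a small CNN to each frame and an LSTM across frames; all shapes and layer sizes below are assumptions.

```python
import tensorflow as tf
from tensorflow.keras import layers

frames = tf.keras.Input(shape=(150, 64, 64, 3))                     # one video segment
x = layers.TimeDistributed(layers.Conv2D(16, 3, activation="relu"))(frames)
x = layers.TimeDistributed(layers.GlobalAveragePooling2D())(x)      # per-frame features
x = layers.LSTM(64)(x)                                              # temporal dynamics across frames
hrv = layers.Dense(1, name="hrv_value")(x)                          # e.g., an HRV metric
model = tf.keras.Model(frames, hrv)
model.compile(optimizer="adam", loss="mse")
```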
  • a machine learning model may be trained to identify a subset of pixels (instead of all of them) in video frames in order to generate the R-R interval and HRV measurement.
  • a subset of the entire set of pixels in a video frame are identified, and then used by the machine learning model to determine the biomarker (e.g. metric) of interest.
  • This pixel selection of the subset of pixels may increase the signal-to-noise ratio in the extracted information.
  • the identification of the subset of pixels may reduce the amount of computation time, and/or power to generate the output of the model.
  • the frames in the video may be divided into an N x M grid.
  • Features extracted from each cell in the grid are used to train a machine learning model that picks the one or more cells with the highest information content.
  • the R-R intervals or HRV value estimates are fused together to provide for determination of the biomarker (e.g. metric) of interest.
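  • A non-limiting sketch of the grid-based pixel selection follows; using the temporal variance of the mean green-channel intensity as a stand-in for "information content" is an illustrative assumption.

```python
import numpy as np

def most_informative_cells(frames: np.ndarray, n: int, m: int, k: int = 2):
    """Divide each frame of a clip (shaped [T, H, W, C]) into an n x m grid,
    score each cell by the temporal variance of its mean green-channel intensity
    (a simple proxy for pulsatile signal strength), and return the k best cells."""
    t, h, w, _ = frames.shape
    ch, cw = h // n, w // m
    scores = np.zeros((n, m))
    for i in range(n):
        for j in range(m):
            cell = frames[:, i * ch:(i + 1) * ch, j * cw:(j + 1) * cw, 1]
            scores[i, j] = cell.mean(axis=(1, 2)).var()
    best = np.argsort(scores, axis=None)[::-1][:k]
    return [divmod(int(idx), m) for idx in best]   # (row, col) grid indices
```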
  • a machine learning model may use an attention mechanism to do spatial segmentation in the frames.
  • in spatial segmentation, an image may be subdivided into multiple segments.
  • every pixel in the image is associated with an object type.
  • all objects of the same type are marked using one class label, while in another embodiment, instance segmentation, each similar object receives its own separate label.
  • the machine learning model includes an encoder and a decoder. The encoder extracts features from the video image through filters, and the decoder is responsible for generating the final output which is usually a segmentation mask containing the outline of the object.
  • R-R intervals and HRV measured from ECG may be used as ground-truth.
  • the R-R intervals determined from ECG measurements and PPG measurement may have some differences.
  • a machine learning model may be trained to learn the difference between these values, and use the differences to improve the performance of the model.
  • the factors affecting the difference between the R-R intervals as measured from ECG and PPG include heart diseases, blood pressure, artery wall thickness and elasticity, body and environmental temperature, red cell count, hemoglobin content, etc.
  • instead of training a machine learning model on input video frames, a model may be trained on a PPG time-series extracted from video frames by using image processing methods.
  • a machine learning model may be trained by computing a spectrogram or scalogram over a sliding window of the PPG signal, represented as an image, and a machine learning model such as a CNN model may be used to learn the associated HRV.
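  • A non-limiting sketch of this spectrogram representation is shown below; the 30 Hz sampling rate and window parameters are illustrative assumptions, and the random signal stands in for a real extracted PPG time-series.

```python
import numpy as np
from scipy.signal import spectrogram

fs = 30.0                                   # frames per second of the source video
ppg = np.random.randn(int(fs) * 60)         # placeholder for 60 s of extracted PPG samples
f, t, Sxx = spectrogram(ppg, fs=fs, nperseg=256, noverlap=192)
image = np.log1p(Sxx)                       # log-scaled 2-D "image" a CNN could learn from
print(image.shape)                          # (frequency bins, time steps)
```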
  • approaches similar to those described above may be used to train machine learning models to estimate other biomarkers, such as respiration rate or other metrics of HRV.
  • supervised learning, LSTM, CNN, finding a subset of pixels, and spatial segmentation may be used to estimate other biomarkers, such as respiration rate or other metrics of HRV.
  • a machine learning model may be trained to predict various conditions of interest for a user based on an HRV trend. Such conditions may include the physical conditioning, physical performance levels, stress levels, EHRV’s Morning Readiness Score, EHRV’s HRV Score, Stress Score, Productivity Score, Emotional State, and Illness Risk, and other conditions such as having diabetes or cardiovascular disease.
  • a machine learning model may be trained using contextual data from the HRV system’s database paired with associated HRV values.
  • a machine learning model may be trained to improve sensor readings and HRV estimation when using a camera to collect data from finger-over-camera scans by detecting movement of the finger, proper covering of the camera’s lens, proper finger placement on the camera, and proper amount of finger pressure on the camera, and/or movement of the phone while taking the measurement. This information may be used to guide the user to improve how they use their phone to collect data.
  • a machine learning model may be trained to improve sensor readings and HRV estimates when using a camera to collect data from a user’s face during a camera-over-face measurement.
  • each HRV reading may be tagged with contextual information.
  • This tagged contextual information (e.g., tag data) may consist of sleep data, exercise data, mood ratings, questionnaires, custom tags/notes, blood glucose, body weight, or any other data that is useful for assisting the user in achieving their goals.
  • a machine learning model may be trained to report a quality score or the confidence of a biomarker estimation or prediction.
  • a machine learning model may be trained to be a virtual (e.g., AI) coach to perform predictions, suggestions, or recommendations to help a user achieve a specific goal.
  • This model may learn from that user’s HRV changes how the user’s body responds to various triggers, lifestyle choices, environmental conditions, etc.
  • the model may suggest specific actions, lifestyle choices, or behavioral changes.
  • an AI coach may use machine learning models that are trained to associate a current user’s biomarker, trend data, and a stated goal (e.g., mental health, weight loss, etc.) with other users having similar data and trends, known modifications (e.g., increased exercise, decreased sleep, etc.), and the same or similar stated goal. This may be known as “look alike” modeling. Having this information, the HRV system software may suggest changes that have been helpful for past members and provide cautionary information about modifications or continuations of the same behavior that have been historically harmful or negative for other members in the past.
  • transfer learning methods may be used to train two machine learning models sequentially.
  • a model trained to estimate HRV values from finger-over-camera videos may be used to train a machine learning model to estimate HRV from face-over-camera videos.
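  • As a non-limiting sketch of this transfer-learning step, a previously trained finger-over-camera model (the file name and layer indexing below are hypothetical) could serve as a frozen feature extractor while a new head is fine-tuned on face-over-camera data.

```python
import tensorflow as tf

finger_model = tf.keras.models.load_model("finger_over_camera_hrv.keras")  # hypothetical artifact
base = tf.keras.Model(finger_model.input, finger_model.layers[-2].output)  # drop the old head
base.trainable = False                         # keep the learned video features fixed

inputs = tf.keras.Input(shape=base.input_shape[1:])
features = base(inputs, training=False)
hrv = tf.keras.layers.Dense(1)(features)       # new head trained on face-over-camera pairs
face_model = tf.keras.Model(inputs, hrv)
face_model.compile(optimizer="adam", loss="mse")
# face_model.fit(face_videos, hrv_targets, ...)  # optionally unfreeze `base` afterwards
```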
  • a machine learning model may incorporate unsupervised or supervised learning methods to be trained to find sub-populations that have similar R-R dynamics or HRV characteristics. Identifying a sub-population to which a user belongs, may help to improve biomarker estimations, predictions, and/or classification tasks. Knowing a user’s sub-population can help with look-alike modeling, as described above. Other types of data such as age, gender, ethnicity, medical history, activity level, etc. may be included in this clustering.
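  • A non-limiting unsupervised sketch of such sub-population clustering follows; the feature set, the synthetic data, and the choice of four clusters are illustrative assumptions.

```python
import numpy as np
from sklearn.cluster import KMeans
from sklearn.preprocessing import StandardScaler

rng = np.random.default_rng(0)
n_users = 200
features = np.column_stack([
    rng.normal(45, 15, n_users),     # RMSSD (ms)
    rng.normal(55, 20, n_users),     # SDNN (ms)
    rng.normal(65, 8, n_users),      # resting heart rate (bpm)
    rng.integers(18, 75, n_users),   # age
    rng.integers(0, 5, n_users),     # self-reported activity level
])
labels = KMeans(n_clusters=4, n_init=10, random_state=0).fit_predict(
    StandardScaler().fit_transform(features)
)
# labels[i] identifies the sub-population used for look-alike modeling of user i.
```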
  • a machine learning model may be trained to classify and/or cluster a user’s R-R intervals dynamics.
  • each class or cluster is associated with a specific value of HRV Score and/or other HRV metrics (e.g., biomarker).
  • a machine learning model may be trained to use natural language processing (e.g., NLP) to develop unstructured tags from analyzing users' spoken words and/or written messages.
  • the app may quantify the mood, stress level, energy, etc. by analyzing the user’s tone, speed of talking, choice of words, and the like.
  • FIG. 1 A schematically illustrates a trace of heart beat data 100, such as would be produced by a pulse waveform from a PPG signal.
  • the trace includes the R portion of the heart beat data, as identified in R1, R2, and R3.
  • the R-R intervals (RR values) between successive R peaks contribute to the determination of the user’s HRV.
  • the trace of heart beat data 100 is the result of measuring raw sensor signals that are transformed into a heart beat trace.
  • the raw sensor data may be collected from optical sensors, electrical sensors, a time series of still images, video images or a combination of any of the listed sensor types.
  • Time series data, whether from a series of still images or from a video, is required for calculating and/or deriving R-R intervals and HRV.
  • the HRV system may analyze the sensor data, whether from a single source or from a combination of the sensors, using algorithms trained by ML to convert (e.g., transform) the raw sensor signal into biomarkers.
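  • As a non-limiting sketch of deriving R-R intervals from such a pulse-waveform trace, a peak detector can be run on the signal; the sampling rate and the synthetic waveform below are illustrative assumptions.

```python
import numpy as np
from scipy.signal import find_peaks

fs = 100.0                                         # samples per second
t = np.arange(0, 10, 1 / fs)
trace = np.sin(2 * np.pi * 1.1 * t) ** 21          # crude stand-in for a PPG pulse train
peaks, _ = find_peaks(trace, distance=int(0.4 * fs), height=0.5)
rr_ms = np.diff(peaks) / fs * 1000.0               # R-R intervals in milliseconds
print(rr_ms)                                       # roughly 909 ms apart for a 1.1 Hz pulse
```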
  • FIG. 2A Illustrates an exemplary embodiment 200 of using a smartphone to measure user heartbeat data.
  • a user places a finger 230 over an optical sensor 220 on the smartphone 210.
  • the optical sensor 220 is coupled to an illumination source (not shown).
  • the illumination source provides white light to the surface of the user’s finger 230 and the optical sensor 220 records variations in the color and intensity of the light reflected off of the various tissues in the user’s finger. These tissues include skin, blood vessels and capillaries, as well as the blood coursing through the blood vessels and capillaries.
  • Algorithms in the software analyze the recorded images and calculate precise electrical heart beat data from EEG and ECG measurements, and precise optical heart beat data from PPG measurements.
  • the optical sensor may be a camera sensor located on a phone, laptop, desktop, webcam, security camera, and the like. This embodiment may be referred to as finger-over-camera measurements, and the camera may be referred to as a finger-over-camera sensor.
  • the finger-over-camera sensor may be used with or without the illumination source.
  • biomarker data of a user is measured and incorporated into the Customized Insights.
  • biomarker data can include measurements such as heart rate, HRV, blood pressure, oxygen levels, CO2 levels, glucose levels, and ketone levels.
  • biomarker data may also provide insight into a user’s general awareness, awareness or alertness, stress, reflex time, resilience, training, body temperature, or related capacity or capability, or a combination of the foregoing.
  • an optical sensor may be used to measure color and intensity data detected from the face of a user through still image and/or video image analysis.
  • Still image and/or video image analysis permits a user to point any optical sensor (e.g., camera), having sufficient resolution and/or frame speed, at his or her face and detect color and intensity data which can be transformed into biomarker data such as heart rate, HRV, blood pressure, oxygen levels, CO2 levels, glucose levels, and ketone levels, and may be used to provide the user with estimates of the user’s general awareness or alertness, stress, reflex time, resilience, training, body temperature, or related capacity or capability, or a combination of the foregoing.
  • FIG. 2B illustrates an exemplary embodiment 250 of capturing an image of the face of a user to collect face-based data.
  • an optical sensor 260 records an image of the user’s face 270 which is displayed on a smartphone 210.
  • This embodiment may be referred to as face-over-camera measurements, and the camera may be referred to as a face-over-camera sensor.
  • the face-over-camera sensor may be used with or without an illumination source.
  • the camera sensor data may include a time series of still images and/or video images of the face of the user. It is important that the face-over-camera sensor data include time series data. That is, the algorithms use changes in the color and intensity of the raw sensor data, both directly and in learning through ML processes.
  • one or more sensors can be used to detect biomarkers as part of a dual sensor input system and/or a multi-sensor input system.
  • optical sensors 220 and 260 can be integrated with a camera, such as in an embodiment of a mobile phone 210, as shown in FIGS. 2A and 2B, in order to determine biomarkers of the user.
  • the sensors 220 and 260 can be used to record the physiological data of the user using a face-over-camera sensor 260 either as still photos in a time series, and/or as video images, provided the camera is of sufficient quality and/or sufficient speed, as shown in FIG. 2B.
  • the biomarkers can be determined by contacting a finger-over-camera sensor 220 with a finger of the user 230, as shown in FIG. 2A.
  • the finger-over-camera sensor 220 and the face-over-camera sensor 260 can be used in combination to interact with one or more body parts of a user to collect measurements therefrom.
  • An RR probability model, trained from the historical data of a higher-quality sensor and used in-the-loop for RR modeling from camera data, can provide input features for a supervised classification system for physiological conditions.
  • Each of the sensors 220 is compatible with a variety of camera configurations.
  • the finger-over-camera sensor 220 can be placed on a rear-facing camera of the phone 210 to facilitate user grip and allow readings or measurements to occur while the phone 210 is being held in a position that is natural for a user.
  • the finger-over-camera sensor 220 can utilize the index finger for conducting measurements, though it will be appreciated that any finger or part of the hand can be placed over the sensor to take the measurement.
  • the raw sensor data can be transformed into biomarkers to link camera data patterns in R-R intervals and other biomarkers to various morbidities.
  • morbidities to which the data patterns can be linked can include certain cancers, diabetes, kidney disease, heart disease, high blood pressure, and so forth.
  • the biomarker data captured by the sensors can be converted into biomarker values used to calculate a score or index for various conditions.
  • the presently disclosed system can utilize captured data from one or more HRV readings to calculate an HRV Score, scaled on a 1-100 basis, based on the natural log of the Root Mean Square of Successive Differences (RMSSD) for the HRV data collected.
  • Changes in the HRV Score correlate with changes in: breathing and respiratory patterns; physical stress; recovery from physical stress; physical performance; psychological stress and health; emotion and mood; cognitive performance; immune system function; inflammation; posture and structural health; injury; biological age; general health and wellbeing; resilience and adaptability; risk of disease; morbidity and mortality; motivation and willpower; and digestive stress.
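  • By way of a non-limiting worked example of the scaling described above, the HRV Score can be sketched as a rescaling of ln(RMSSD) onto a 1-100 scale; the rescaling constants below are illustrative assumptions, not the system’s proprietary calibration.

```python
import numpy as np

def hrv_score(rr_ms: np.ndarray) -> float:
    """HRV Score sketch: natural log of the RMSSD of successive R-R differences,
    mapped onto a 1-100 scale (mapping constants are assumptions)."""
    diffs = np.diff(rr_ms)
    rmssd = np.sqrt(np.mean(diffs ** 2))
    ln_rmssd = np.log(rmssd)
    # ln(RMSSD) for human data typically falls roughly between 0 and 6.5.
    return float(np.clip(ln_rmssd / 6.5 * 100.0, 1.0, 100.0))

print(round(hrv_score(np.array([812, 845, 790, 860, 805, 830.0])), 1))  # ~60
```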
  • the system can be configured to measure physiological signals based on a time series signal.
  • the camera can be used over a period of time to detect color changes in light reflected from the skin of a user/patient, e.g., the face of a user/patient.
  • every pump of the heart can cause an increased flow of blood to the face, which can cause color changes that are detected by the camera in different color scales and intensities. Taking measurements of the face over time allows a time series signal to be created of the patient’s face that can be processed to create an average profile of the face under specific conditions.
  • FIGS. 3A-3C illustrate alternative embodiments of wearable sensors 310 that can be used with the embodiments of the present disclosure.
  • a sensor can be placed over the finger to determine the biomarkers discussed above.
  • Additional examples of the wearable sensors 320 can include chest straps 330 or patches, smartwatches 340, wrist or arm wearables 350, ring wearables 360, smart apparel, or other devices that can contact a portion of the body known to one skilled in the art.
  • These wearable sensors 320 can be used alone and/or in combination with the finger-over-camera sensor 220 or the face-over-camera sensor 260 to take additional measurements, validate measurements taken by other sensors, and/or improve accuracy of measurements to build a user profile, among other advantages known to one skilled in the art. Readings performed by the wearable sensors 320 can be combined with readings from the finger-over-camera sensor 220 or the face-over-camera sensor 260 to enhance accuracy of the resultant user profile constructed from the collected data.
  • data from the additional sensors can be used to perform validation of readings from a given sensor.
  • a sensor e.g., the finger-over-camera sensor 220
  • biomarkers measured by the face-over-camera sensor 260 can be used to validate measurements taken by the finger-over-camera sensor 220, though it will be appreciated that biomarkers measured by the finger-over-camera sensor 220 may be used to validate measurements taken by the face-over-camera sensor 260.
  • substantially simultaneous operation includes the finger-over-camera sensor 220 and the face-over-camera sensor 260 being activated within approximately 0.1 seconds of one another, though in some embodiments, substantially simultaneous operation can include from approximately 0.01 seconds to approximately 2 seconds.
  • the measurements of the biomarkers can be used to train machine learning within the system.
  • the system can be configured to learn ranges of biomarkers collected from the user over time, and use these previous measurements in combination with subsequent measurements to tailor ranges which the user experiences.
  • the machine learning models can be used to suggest biomarker measurements to be taken based on previous readings performed by the user.
  • the machine learning models can be trained in phases. For example, the colors and intensities extracted by the sensors can be used to detect the heart rate, from which the RR intervals can be determined.
  • the RR interval data can be collected and graphed over a time span of at least 30 seconds, although the time span may be longer if greater accuracy is desired, and the data is collected continuously over the reading time span.
  • the time-series RR intervals can be converted to two-dimensional images, such as scalograms or spectrograms, for purposes of direct biomarker estimation or classification.
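  • As a non-limiting sketch of converting time-series RR intervals into a two-dimensional scalogram, the R-R series can be resampled to a uniform grid and passed through a continuous wavelet transform; the 4 Hz resampling, the Morlet wavelet, and the scale range are illustrative assumptions (using the PyWavelets package).

```python
import numpy as np
import pywt  # PyWavelets

rr_ms = np.array([812, 845, 790, 860, 805, 830, 798, 851], dtype=float)
beat_times = np.cumsum(rr_ms) / 1000.0                    # seconds at which beats occur
uniform_t = np.arange(beat_times[0], beat_times[-1], 0.25)
rr_uniform = np.interp(uniform_t, beat_times, rr_ms)      # evenly resampled (4 Hz) RR signal
coefs, freqs = pywt.cwt(rr_uniform, scales=np.arange(1, 32),
                        wavelet="morl", sampling_period=0.25)
scalogram = np.abs(coefs)                                 # 2-D array usable as an image input
print(scalogram.shape)                                    # (scales, time samples)
```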
  • the machine learning models can train the system to take measurements that allow phasing out of one or more of the sensors over time. For example, initial calibration and measurements of certain biomarkers can be performed using a plurality of sensors, e.g., the finger-over-camera sensor and the face-over-camera sensor. Following repeated measurements of the biomarkers of a given user, the machine learning models can construct a profile of the user that includes ranges of the values of the measured biomarkers. As more measurements are taken and the ranges of the values become smaller, the machine learning models can obtain biomarker values using a single sensor of the one or more sensors, which results in quicker, more accurate measurements. The number of measurements taken can vary, but it is commonly presumed that the system can be trained after 5-10 initial readings.
  • this range is not to be taken as a limit, but as an initial estimate of the number of initial readings. It will be appreciated that the machine learning models can continue to access the plurality of sensors as needed and/or for validation of measurements of certain biomarkers if a measured value falls outside of the expected range.
  • training of the system can occur sequentially.
  • the finger-over-camera sensor 220 can be trained using machine learning models that are exposed to validated HRV sensor data. Once the finger-over-camera sensor 220 is sufficiently trained, the finger-over-camera models can be used to train the face-over-camera sensor 260 and/or additional sensors of the presently disclosed system.
  • FIG. 4 presents a view of artifact detection accuracy in terms of the detection of false positive artifact detection consistent with certain embodiments of the present invention.
  • the mean false positive rate is the ratio of falsely annotated artifacts to the total number of veridical intervals such as, in a non-limiting example, negative artifacts.
  • the false positive artifact detection is performed utilizing the Mod-Berntson, IPFM and PWIR processes. This figure displays the relative accuracy of each method when compared directly.
  • the exceptionally high False Positive Rate (FPR) of PWIR can be mitigated by increasing its threshold parameter, but not without inducing a reduction in true-positive detections of missed beats, which hurts the HRV estimation mean error more than the false positives do.
  • FIG. 5 presents a view of artifact detection accuracy in terms of the detection of true positive artifact detection consistent with certain embodiments of the present invention.
  • the system presents a mean true positive rate as the ratio of correctly identified artifacts to the total number of artifacts 500. While not shown here, the performance of IPFM when evaluated exclusively on spurious beat-type artifacts is actually exceptional. Unfortunately, the instantaneous-derivative metric used by IPFM as a threshold metric is not nearly as sensitive to missed beats, which hurt its overall performance significantly. It was also found that the median FNR value on all artifact conditions for the Modified Berntson algorithm was 0%.
  • IPFM and PWIR algorithms were applied using threshold parameters recommended by Osman et al. Further analysis might include a complete parameter search against the present test data. It is worth noting that the modified Berntson algorithm is robust across data conditions given its standard parameterization.
  • FIG. 6 presents a view of artifact impact on the system consistent with certain embodiments of the present invention.
  • the best performance for IPFM was found with the threshold set to 4.5, and for PWIR set to 2.5. These thresholds are used in all included plots. The optimal threshold can differ significantly with data source, therefore performance of these algorithms on a data source for which there is no prior knowledge may be worse. In contrast, no parameter optimization was performed for the modified Berntson algorithm. This property of being functionally parameterless is of considerable value when no opportunities for pre-emptive tuning or source analysis are available.
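  • For illustration only (this is not the Modified Berntson, IPFM, or PWIR method evaluated above), a simplified artifact flag based on deviation from the median of the R-R series might look as follows; the 25% threshold is an assumption.

```python
import numpy as np

def flag_artifacts(rr_ms: np.ndarray, threshold: float = 0.25) -> np.ndarray:
    """Mark an R-R interval as a suspected artifact when it deviates from the
    median of the series by more than `threshold` (25% by default)."""
    med = np.median(rr_ms)
    return np.abs(rr_ms - med) / med > threshold

rr = np.array([810, 820, 1630, 805, 395, 815.0])   # contains a missed beat and an extra beat
print(flag_artifacts(rr))                          # [False False  True False  True False]
```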
  • FIG. 7 presents a view of the display of HRV statistics for a user post reading consistent with certain embodiments of the present invention.
  • the display presents a display of the time-domain and frequency-domain results for the HRV reading taken by a user or by a medical practitioner, a coach, or other professional on behalf of the user.
  • the time-domain statistics may include, but are not limited to, the mean RR interval, rMSSD, ln(rMSSD), SDNN, PNN50, NN50 and a 7-day HRV Calculated Value (CV).
  • the frequency domain statistics may include, but are not limited to, the total power, the Low Frequency (LF) power, the High Frequency (HF) power, and the ratio of LF to HF power.
  • FIG. 8 presents a view of the display of the continuation of HRV statistics for a user post reading consistent with certain embodiments of the present invention.
  • the display presents the display of frequency domain results for the HRV reading as previously described and continues with a display of the user’s heart rate results for the reading.
  • the heart rate results statistics presented include but are not limited to, the minimum heart rate, maximum heart rate, and average heart rate captured during the data reading action.
  • the sensors of the present disclosure can provide a number of readings for determining biomarkers when activating the sensors. For example, as shown in FIG. 9, the sensors can be used to take a Morning HRV reading, an Open HRV reading, an HRV snapshot, or a Research Reading, among others.
  • Each reading and/or snapshot can trigger the sensors to determine biomarkers over a specific time frame, e.g., whether to take an instantaneous measurement, or a series of measurements over an extended period or duration.
  • the readings can be selected by toggling the appropriate function on a screen of a phone 900 by a user, with each reading configured to be customized according to user preferences.
  • FIG. 10 illustrates an exemplary interface 1000 of the system when the HRV Snapshot is toggled for measurement.
  • when the HRV Snapshot is selected, users have the option to track and monitor respiration rates in addition to measuring a snapshot of their HRV.
  • one or more of several sensors may be activated to measure the respiration rates.
  • the chest strap 330 may be activated.
  • the face-over-camera sensor 260 may also be activated, with the camera being pointed towards the user’s body, e.g., face, to determine respiration rate of the user.
  • the two sensors, the face-over-camera sensor 260 and the chest strap 330 may be used individually, or in conjunction with each other.
  • FIGS. 11A-11E illustrate a system tutorial of use of the finger-over-camera sensor 220.
  • the finger-over-camera sensor 220, in some embodiments, can be used in combination with the chest strap 330 or another feature in order to facilitate accuracy of the biomarker. Once the chest strap 330 is secured, the user is instructed to place the pad of their finger over the finger-over-camera sensor 220, which in this embodiment is the rear-facing camera, though, in some embodiments, the finger-over-camera sensor 220 can include the front-facing camera.
  • the rear-facing camera is chosen due to a user’s natural grip in holding a phone that places the fingers proximate to the rear-facing camera, which allows for more straightforward measurements to be taken.
  • the application verifies that the lens is covered, appropriate pressure is exerted on the sensor, and that there is sufficient lighting prior to taking a reading.
  • the system can collect a supplementary signal to the face-over-camera sensor 260 and/or the finger-over-camera sensor 220 by measuring the force of the finger (pressure) applied on the touch screen.
  • the combination of the sensors and the supplementary signal, taken simultaneously and/or sequentially can be used to strengthen the accuracy of the data collected by the sensors overall.
  • the system can receive pulse data from the chest strap 330 and verify that the finger- over-camera sensor 220 is able to make biomarker measurements from a user’s contact therewith. Once the user’s position is verified, the user is instructed to remain substantially immobile while the reading begins automatically.
  • the duration of the reading can be substantially instantaneous, e.g., approximately one second, though, in some embodiments, the reading can occur over a period of approximately 10 seconds, approximately 20 seconds, approximately 30 seconds, approximately one minute, approximately two minutes, or approximately three minutes or more.
  • a reading may take between 1 second and 10 minutes, but a typical duration is 1-2 minutes. While the reading is taken, the system records a timestamp of the readings taken by the chest strap and the finger-over-camera sensor 220 such that the readings of each can be correlated with one another at specific moments to collect multiple data points to confirm accuracy of the readings.
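  • A non-limiting sketch of correlating the time-stamped chest-strap and finger-over-camera readings is shown below; the column names, timestamps, and 100 ms tolerance are illustrative assumptions.

```python
import pandas as pd

strap = pd.DataFrame({
    "t": pd.to_datetime(["2022-05-02 08:00:00.000", "2022-05-02 08:00:00.810"]),
    "rr_strap_ms": [812, 805],
})
camera = pd.DataFrame({
    "t": pd.to_datetime(["2022-05-02 08:00:00.020", "2022-05-02 08:00:00.830"]),
    "rr_camera_ms": [815, 801],
})
# Pair each camera reading with the nearest chest-strap reading within 100 ms.
aligned = pd.merge_asof(camera.sort_values("t"), strap.sort_values("t"),
                        on="t", tolerance=pd.Timedelta("100ms"), direction="nearest")
print(aligned)
```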
  • FIG. 12 illustrates an interface during which the reading is in progress.
  • the user interface 1200 can include a reading of the heart rate and the HRV score, as well as a graph showing one or more of these values throughout the duration of the measurement.
  • an option to stop measurement is included, along with a timer to indicate progress of the reading to inform the user how much time remains before results are provided.
  • FIG. 13 illustrates the results of the readings taken by the system in accordance with the embodiments discussed above.
  • the results can include an HRV chart and a heart rate chart showing the values measured over the duration of the measurement, as well as additional biomarkers.
  • the charts can be customized based on the biomarkers being measured and/or based on user preferences that can be customized in settings.
  • the system can provide reminders regarding use of one or more of the sensors. An exemplary embodiment of such reminders is shown in FIG. 14, which provides reminders prior to starting the face-over-camera reading.
  • Some non-limiting examples of such reminders can include instructions for proper phone positioning to obtain the most accurate measurements, sitting still and avoiding turning the face away from the sensor, maintaining good lighting, and staying in the frame, since removing the user from the frame can negatively impact determination of biomarkers and create inaccuracies in the data.
  • FIG. 15 illustrates the communication of a wearable sensor 1500 over the finger of the user to collect pulse data, though it will be appreciated that the connection can be formed with any wearable sensor, such as illustrated by 330.
  • the face-over-camera sensor 260 can be calibrated. Even though the wearable sensor is collecting pulse data for calibration, it is understood that the collection of data from the face-over-camera sensor 260 can be initiated substantially simultaneously.
  • FIGS. 16A-16F illustrate calibration of the face-over-camera sensor 260.
  • the user can position their face in the designated region with respect to the camera such that the sensor can detect the face.
  • proper face detection can be adjusted and checked on the interface of the phone to ensure that the user’s face appears in the designated region.
  • the system can perform an accuracy check to determine that the user’s face is detected and proper lighting is being used, at which point the system is ready to perform biomarker measurements, as shown in FIG. 16C.
  • FIGS. 16D-16E illustrate different views that the user may choose from (swipe between) while they are taking a face-over camera HRV snapshot reading that is in progress.
  • 16D is intended to provide no distraction and includes instructions to “relax and breathe normally.”
  • the HRV snapshot is set to be taken over the course of one minute, with changes in the user interface indicating progress of the HRV snapshot.
  • the user can be in any of the screens (as shown in FIGS. 16D-16E) throughout the measurement, depending on their preference.
  • 16E is intended to provide live biofeedback or biomarker data. As the measurement continues, values for biomarkers such as heart rate and HRV score, as well as a graph for the duration of the measurement, are displayed, as shown in FIG. 16E. 16F simply shows the user their face recording.
  • FIG. 17 illustrates an error message that can be received during the HRV snapshot.
  • if an issue is detected during the reading, an error message can be sent to the user.
  • the message can indicate that measurement of the biomarkers can continue via the other sensors, such as those taken by the finger-over-camera sensor 220 and/or the chest strap 330.
  • HRV readings can continue as the other sensors continue to collect data. Once the issues identified in the warning are fixed, tracking can resume.
  • users can add tags and/or events to inform the system about their overall well-being and log recent events. Events can be logged before or after the reading is performed. The events may be logged before the results are provided to afford the system more data for tracking when building out the user profile. As shown in FIG. 18, tagging can be enabled in the app, with tags including moods and energy levels; recent events such as sleep schedules, exercise routines, and illnesses can also be tracked, which allows the system to account for discrepancies in biomarkers.
  • the system may use ML to incorporate location tags, such as GPS, and respiratory tracking. For example, if respiration is abnormal, and the user indicated that they have been ill, the machine learning capabilities of the system can account for that when building the user profile to understand that the illness may be a contributing factor to data discrepancies.
  • the app may use ML with natural language processing (e.g., NLP) to develop unstructured tags from analyzing users' spoken words or written messages. The app may quantify the mood, stress level, energy, etc. by analyzing the user’s tone, speed of talking, and choice of words, and the like.
  • FIG. 20 illustrates an exemplary embodiment of results from the Research Reading of the face-over-camera sensor 260.
  • the results can display a respiration rate over the course of the measurement as well as respiration rate history, e.g., over the past three days, as shown.
  • the results can also display the tags that were selected, as well as values of body position, and options to save and share results to social media platforms, such as Twitter and Facebook.
  • FIG. 21 illustrates the home screen 2100 of the system.
  • the home screen 2100 can illustrate user history, previously measured biomarkers, comparisons of readings performed over the previous few uses of the system, and so forth.
  • the home screen 2100 can also be customized to display data associated with the readings that have been performed by the user during the use history of the application.
  • in addition to data associated with the readings that have been performed by the user during the use history of the application, additional data that has been extracted, as discussed above, can be combined with the associated data from a smartphone or other devices to build an overall profile of the user.
  • Some non-limiting examples of the associated data can include GPS location and elevation, local events such as pandemic news, disasters, political sentiment, metadata about the user (age, gender, other demographics), other biomarkers such as blood pressure, pulse ox, blood glucose levels, mood (obtained via integrations, or direct user input), and/or behavior data (exercise, sleep, food, supplements, medical activity).
  • the associated data is retained in the system to aid in building the overall profile of the user, but is not displayed on the home screen 2100.
  • the system can perform a live heart rate detection and breathing rate measurement.
  • the user can position their face in relation to the face-over-camera sensor 260 to scan an image of the face.
  • face-based detection can be used more naturally during certain activities, such as driving a car, where finger-over-phone-camera or other sensing is not possible or not preferable.
  • the face-over-camera sensor 260 can track user data by one or more of the following: color extraction (visible and invisible wavelengths), pixel movement, and/or eye-specific movement, including pupil dilation, as well as pixel intensity.
  • FIG. 22A is an exemplary embodiment of a calibration of a scanner function that can collect biomarker data live via the face-over-camera sensor.
  • the system can use various points of the face to extract the color profile thereof throughout the duration of the measurement to provide live heart rate and respiration rate readings.
  • FIG. 22B is an interface for performing the face-over-camera live reading
  • the face-over-camera sensor 260 can detect blood flow changes and slight variations in color in certain areas of the face to determine the heart rate as well as biomarkers to further build the profile of the user.
  • the finger-over camera sensor 220 and the face-over camera sensor 260 can communicate with a web browser, a projector, and/or a computer application installed on a laptop or desktop.
  • the face-over-camera sensor can be part of a webcam or an integrated camera in a laptop that can detect the user’s face.
  • the system can create a profile of the user within the system and take measurements thereof over a certain time period, e.g., as the user sits at the computer, to measure biomarkers over a period of time to improve the accuracy of the user profile.
  • FIG. 23 illustrates an exemplary embodiment of such a system for detecting biomarkers in users utilizing a camera or a live feed.
  • the system can interact with a webcam of a desktop or laptop to determine biomarkers over video conference such as Zoom, Facebook Messenger, Facetime, and the like.
  • a webcam of a desktop or laptop can determine biomarkers over video conference such as Zoom, Facebook Messenger, Facetime, and the like.
  • users can input such parameters as height, weight, and stress levels, and the system can use the sensor attached to the webcam in ways similar to that of the face-over-camera sensor to measure biomarkers such as heart rate, HRV, blood pressure, oxygen levels, CO2 levels, glucose levels, ketone levels, general awareness or alertness, stress, reflex time, resilience, training, body temperature, or related capacity or capability, or a combination of the foregoing, of the users in real-time.
  • the system can capture a snapshot of a user’s face over the video conference and calibrate the system for an upcoming live video conference to begin a build out of a user’s profile, or use a pre-recorded video of a user over a video conference to make subsequent measurements of the biomarkers, as shown in FIGS. 24A-24F.
  • the HRV system may use recorded still images and/or video images in a manner that is analogous to live still images and/or video images.
  • a plurality of cameras can be used to acquire a three-dimensional image of the target in order to generate a more complete data set. For example, using a plurality of cameras can provide a plurality of angles of the face of the user in multiple degrees of freedom, e.g., up to six degrees of freedom. Moreover, in some embodiments, the application can integrate cameras from various locations in order to collect more biomarker data that will result in more accurate measurements.
  • the presently disclosed system can communicate with sensors located in cameras that are located in a user’s car, at the computer at work, at the gym or wellness center, or in a hospital or wellness setting to collect biomarker data during the course of the user’s day to build out a user profile that captures a detailed history of biomarker values.
  • This combination of a plurality of cameras may provide raw data that may be transformed into new biomarkers that have heretofore been unknown.
  • the increase in sensors and/or cameras in the environment coupled with the increasing resolution and video speed (e.g., frames per second) of those sensors and/or cameras create the possibility of discovering a data set of physiological signals that have been previously unknown.
  • a data set of physiological signals may allow the HRV system to build unique user profiles that can be associated with only one user, much like a fingerprint.
  • the system can calculate heart rate, HRV score, and other biomarker values for a given user.
  • Upon completion of the HRV score calculation, a user may compare the calculated HRV score and other non-proprietary HRV parameters to general population and/or demographic-filtered population data, as shown in FIG. 24F, to provide an indication of how the user compares with the general population as a whole or with specific filtered portions of the general population. This comparison may provide a user with some indication as to changes in their HRV values with respect to their own historic values as well as historic values for a given population.
  • the HRV system may also create a daily expressed score for use in tracking a user’s HRV values over time.
  • This daily expressed score is known as a Morning Readiness score.
  • the Morning Readiness score is a scaled score (1-10) that shows the relative balance or imbalance in the user’s sympathetic and parasympathetic nervous system.
  • the Morning Readiness score correlates with day-to-day fluctuations in the nervous system for an individual, highlighting to the user when major changes may have occurred in the body, based on the user’s own unique individual patterns.
  • the HRV system may create a user profile that is unique to an individual.
  • the HRV system may integrate an individual user’s biomarker data to create a “biomarker fingerprint” that is unique to the individual user.
  • This unique “biomarker fingerprint” may be operative to identify an individual using biomarker data collected from one or more sensors, including still images and/or video images.
  • FIG. 25 presents a view of the display of the data integration connections for the user device consistent with certain embodiments of the present invention.
  • the user may specify one or more exterior cloud integrations, such as, in a non-limiting example, a Fitbit or Google Fit cloud-based data set, with which the HRV monitoring device may connect.
  • the exterior cloud based data may integrate with the HRV monitoring system to receive, transmit, and exchange heart rate, heart rate variability, exercise regimen, nutrition data, and any other data that assists in monitoring and managing the health of the user.
  • the user may configure the HRV monitoring system to transmit data to any selected exterior cloud based data integrator, or to an exterior monitoring device maintained by a medical practitioner of the user’s choice.
  • FIG. 26 presents a view of the display of the historical log for a user consistent with certain embodiments of the present invention.
  • the historical log presents data to the user to permit a visual tracking of the morning readiness score, sleep statistics, exercise statistics, and other health information.
  • the data presented may be configured by the user to provide that information that is most useful to the user in monitoring trends for these health-related statistics over time.
  • FIG. 27 presents a view of the historical trends for HRV statistics for a user consistent with certain embodiments of the present invention.
  • the displays presented to the user provide a snapshot of statistical information in chart form, permitting the user to understand changes in their health readings and health related statistics over time.
  • a portion of the display presents a chart of a coefficient of variation for the HRV data readings expressed as a percentage and a second portion of the display presents a chart of the total power readings captured over time.
  • this example should not be considered limiting as the user may configure the display to present any other statistical measure captured by the system over time. The selected statistical data may then be charted and presented to the user on this display when requested.
  • FIG. 28 presents a view of the connection capability for the sensors associated with the HRV monitoring system consistent with certain embodiments of the present invention.
  • the user would open this display to begin the process of selecting the desired Heart Rate (HR) monitoring sensor and performing the connection actions to place the HR monitoring system in readiness to perform a data reading.
  • the HR system presents the user with one or more sensors that have been discovered through an embedded short-range wireless communication protocol, such as, in a non-limiting example, Bluetooth Low-Energy (BLE). The user may then select the sensor they intend to use for the HRV data reading by selecting the sensor name on the display screen.
  • the user is also presented with a troubleshooting capability to resolve issues when a sensor does not connect or indicates errors or issues with connection. Upon an indication of connectivity and readiness, the user may proceed to use the selected HR monitoring sensor to capture the HRV data reading.
  • FIG. 29 presents a view of the historical trends for HRV statistics for a user related to morning readiness scores and HRV values consistent with certain embodiments of the present invention.
  • this display presents charts of the morning readiness and HRV data readings over a selected span of time.
  • the user may choose the length of time for the charted information from an icon on the screen indicating the desired time span.
  • the user may also choose to change the chart time span to move from one timespan to another by selecting a different time span icon on the display screen, allowing the user to compare short term and long-term trends.
  • FIG. 30 presents a view of the detailed data values for HRV statistics for a user related to morning readiness scores and HRV values consistent with certain embodiments of the present invention.
  • the user may be presented with an indicator that displays the morning readiness score as a relative measure between sympathetic and parasympathetic conditions to provide a relative balance indicator.
  • This display also provides the user with heart rate and HRV data readings and charts intra-reading values that may be interpolated from the HRV data readings.
  • This detailed information display provides the user with a view into the actual variability in the intra-beat heart rate data.
  • FIG. 31 presents a view of the informational data for a user related to morning readiness scores and HRV values expressed as autonomic balance between the sympathetic and the parasympathetic systems consistent with certain embodiments of the present invention.
  • the displayed information 3118 is educational and informative in nature, providing the user with an understanding of how the morning readiness score indicates a balance between the sympathetic and parasympathetic condition of the user’s autonomic nervous system.
  • FIG. 32 presents a view of the historical trends for HRV statistics for a population related to morning readiness scores and HRV values consistent with certain embodiments of the present invention.
  • the user is presented at 3220 with metrics associated with the user and presented as a comparison with a filtered population.
  • the user or a medical practitioner, a coach, or wellness practitioner may input metrics associated with a particular HRV score, age and gender to create a comparison between the input metrics and the filtered population as a whole.
  • FIG. 33 presents a view of the historical trends for HRV statistics for a population related to morning readiness scores and HRV values consistent with certain embodiments of the present invention.
  • the user at 3320 is informed that the filtered population is filtered based upon morning readiness readings from all system users having more than 2 measurements stored within the morning readiness score database maintained by the system server.
  • the information provided to the user on this screen is informational and is intended to educate the user on how and why age, gender, fitness level, and health can affect the user’s HRV.
  • FIG. 34 presents a view of the raw data captured for RR intervals and HRV values consistent with certain embodiments of the present invention.
  • the user is presented with the actual data captured by the selected sensor and presented to the user as a chart of values over time 3424.
  • the R-R interval data is collected over a time span of at least 1 minute, although the time span may be longer if greater accuracy is desired, and the data is collected continuously over the reading time span.
  • the time span may be shorter than 2 minutes. In particular, the time span may be as brief as 1 minute, e.g., 60 seconds.
  • the R-R intervals are reported in milliseconds and provide the basis for the determination of HRV, which is charted over the same time span and presented on a normalized scale of 1 - 100. From this raw data, the user or medical practitioner may have a more optimized view of the user’s HRV and the RR intervals that contribute to the HRV values for the time span during the data reading.
  • FIG. 35 presents a view of the relationship between R-R intervals and HRV values consistent with certain embodiments of the present invention.
  • FIG. 35 is showing live R-R intervals during a reading, as well as “real time” heart rate and HRV values.
  • the user or a medical practitioner, a coach, or other professional is presented with a comparative chart of a user’s heart rate and the associated variability in that heart rate (HRV) for a particular data reading 3526.
  • the user or the medical practitioner may derive a better understanding of the amount of variability the user is experiencing in their monitored heart beats, even though the number of beats and timing may be well within physical norms for the user’s age, gender, fitness level, and weight.
  • This particular display may present a user, or the user’s medical practitioner, some insight into whether steps should be taken to optimize the user’s HRV.
  • FIG. 36 presents a view of the display of HRV statistics and signal quality for a user post reading consistent with certain embodiments of the present invention.
  • this display presents the user with an operational view of the signal quality for received data from the sensor and the HRV and HR data values collected during a reading period 3628.
  • the signal quality presents the user with a view of whether data artifacts are appearing in the recorded data and, if so, how many such artifacts have been detected. If the user is experiencing poor results from a data reading, the user may use this display to determine if signal quality or data artifacts are causing the poor data reading. If either the signal quality or data artifacts are causing an issue with collecting data measurements, the user may take steps to correct the issue.
  • FIG. 37 presents a view of the user feedback and tagging display consistent with certain embodiments of the present invention.
  • the user may utilize this input data view to tag a collected HRV data reading with a mood the user was feeling when the HRV data reading was recorded 3730.
  • the user may also use this display to record notes as to physical feelings and edit information associated with the user’s interaction with caffeine, alcohol or other chemicals.
  • the user may also add metadata associated with sleep, energy level, exercise, and/or soreness to add to the HRV data reading when it is stored within the HRV data reading database. This information, although somewhat subjective, may also assist the user in optimizing their heart rate variability over time.
  • FIG. 38 presents a view of the display of HRV statistics and signal quality for a user post reading consistent with certain embodiments of the present invention.
  • the user may visit this display page to review insights into the user’s condition over time 3832.
  • the display presents the data as a weekly insights display; this time span should not be considered limiting, as the user may select different time spans over which to observe the data points presented on the display, once again in an effort to optimize the user’s HRV over time.
  • FIG. 39 presents a view of the composite reading and data collection process consistent with certain embodiments of the present invention.
  • a user or medical practitioner on behalf of a user initiates a data collection action to collect HRV data, physiological data, biometric data, and environmental data.
  • the initial data collection secures information about the user’s heart beat R-R values, environmental data about the area in which a user is located, and other biometric and physiological data such as heart rate, blood pressure, oxygen levels, CO2 levels, glucose, ketones, general awareness or alertness, stress, reflex time, resilience, training or related capacity or capability, and video imagery.
  • the data collected during the HRV reading is stored in the HRV system server electronic data store and combined with historical data and other information to create an initial composite score, comprising HRV data, environmental data, and other biometric data, for the user at 3902.
  • the HRV system may analyze the collected and accumulated historical data to create a planned event, intervention or planned step to assist a user in achieving one or more expressed goals with regard to the composite score and provide this guidance to the user.
  • upon completion of the planned event, intervention, or planned steps, the user at 3906 will perform another data collection action to update the initial data recordings and collect updated data on all sampled values subsequent to the user’s performance of the planned event, intervention, or planned step.
  • the HRV system may perform a calculation to update the composite score utilizing the collected data from the most recent data collection effort.
  • the HRV system analyzes the updated composite score to determine if the latest calculated composite score is above or below a threshold value or within a range that is indicated as desired for the user.
  • the HRV system updates the planned intervention for the user at 3912 by choosing or creating a modification to the previously recommended event, intervention, or planned step and returns this value to the HRV system server.
  • the HRV system server then returns to process step 3904 to provide this information to the user.
  • the HRV system provides updated feedback to the user on their composite score values and how the user is meeting their goals with regard to the established composite score.
  • the HRV system queries the user to determine if additional data collection and/or analysis is desired by the user, or by the medical practitioner associated with the user.
  • the HRV system returns to step 3904 to provide the updated modifications created for the user by the HRV system and the user performs the remaining steps in the process utilizing the updated modifications in performing those steps. If no further steps are required the HRV system at 3918 may produce a score validation for the user and create a final report for the current data collection readings and the user’s current state with regard to their expressed goals and/or the composite level goals established for the user by the HRV system.
  • FIG. 40 presents a view of the HRV system configuration consistent with certain embodiments of the present disclosure.
  • HRV data may be collected from a user through the use of any sensor or device configured to collect HRV data.
  • such sensors or devices may include an ECG sensor 4000, a PPG sensor 4002, or a smart wearable device 4003 attached to the user, or the HRV data may be captured through the use of a camera 4004 or a smartphone 4006.
  • the data captured by any sensor or device may be collected and transmitted as a stream of data in real-time, or may be collected and transmitted in a batch at a later time. Regardless of the method of collection or data transmission the collected data is transmitted to the system data processor 4008.
  • a plurality of modules are active at 4010 to perform the HRV analysis herein described and to create parameters for review as well as predictions and recommendations for the user.
  • the data, predictions, and recommendations are later transmitted to any display device associated with a user at 4012 to display the information for consumption by the user.
  • Various embodiments may be implemented at least in part in any conventional computer programming language. For example, some embodiments may be implemented in a procedural programming language (e.g., “C”), or in an object-oriented programming language (e.g., “C++”). Other embodiments may be implemented as a pre-configured, standalone hardware element and/or as pre-programmed hardware elements (e.g., application specific integrated circuits, FPGAs, and digital signal processors), or other related components.
  • the disclosed systems, devices, and methods may be implemented as a computer program product for use with a computer system, a smartphone, a smartwatch, and the like.
  • Such implementation may include a series of computer instructions fixed either on a tangible, non-transitory medium, such as a computer readable medium (e.g., a diskette, CD-ROM, ROM, or fixed disk).
  • the series of computer instructions can embody all or part of the functionality previously described herein with respect to the system.
  • Such computer instructions can be written in a number of programming languages for use with many computer architectures or operating systems. Further, such instructions may be stored in any memory device, such as semiconductor, magnetic, optical or other memory devices, and may be transmitted using any communications technology, such as optical, infrared, microwave, or other transmission technologies.
  • such a computer program product may be distributed as a removable medium with accompanying printed or electronic documentation (e.g., shrink-wrapped software), preloaded with a computer system (e.g., on system ROM or fixed disk), or distributed from a server or electronic bulletin board over the network (e.g., the Internet or World Wide Web).
  • some embodiments may be implemented in a software-as-a-service model (“SAAS”) or cloud computing model.
  • some embodiments may be implemented as a combination of both software, e.g., a computer program product, and hardware. Still other embodiments are implemented as entirely hardware, or entirely software.
  • module may refer to software, firmware and/or circuitry configured to perform any of the aforementioned operations.
  • Software may be embodied as a software package, code, instructions, instruction sets and/or data recorded on non-transitory computer readable storage medium.
  • Firmware may be embodied as code, instructions or instruction sets and/or data that are hard-coded (e.g., nonvolatile) in memory devices.
  • Circuitry as used in any embodiment herein, may comprise, for example, singly or in any combination, hardwired circuitry, programmable circuitry such as computer processors comprising one or more individual instruction processing cores, state machine circuitry, and/or firmware that stores instructions executed by programmable circuitry.
  • the modules may, collectively or individually, be embodied as circuitry that forms part of a larger system, for example, an integrated circuit (IC), system-on-chip (SoC), desktop computers, laptop computers, tablet computers, servers, smart phones, etc.
  • any of the operations described herein may be implemented in a system that includes one or more storage mediums having stored thereon, individually or in combination, instructions that when executed by one or more processors perform the methods.
  • the processor may include, for example, a server CPU, a mobile device CPU, and/or other programmable circuitry.
  • the storage medium may include any type of tangible medium, for example, any type of disk including hard disks, floppy disks, optical disks, compact disk read-only memories (CD-ROMs), compact disk rewritables (CD-RWs), and magneto-optical disks, semiconductor devices such as read-only memories (ROMs), random access memories (RAMs) such as dynamic and static RAMs, erasable programmable read-only memories (EPROMs), electrically erasable programmable read-only memories (EEPROMs), flash memories, Solid State Disks (SSDs), magnetic or optical cards, or any type of media suitable for storing electronic instructions.
  • Other embodiments may be implemented as software modules executed by a programmable control device.
  • the storage medium may be non-transitory.
  • various embodiments may be implemented using hardware elements, software elements, or any combination thereof.
  • hardware elements may include processors, microprocessors, circuits, circuit elements (e.g., transistors, resistors, memristors, quantum computing devices, capacitors, inductors, and so forth), integrated circuits, application specific integrated circuits (ASIC), programmable logic devices (PLD), digital signal processors (DSP), field programmable gate array (FPGA), logic gates, registers, semiconductor device, chips, microchips, chip sets, and so forth.
  • non-transitory is to be understood to remove only propagating transitory signals per se from the claim scope and does not relinquish rights to all standard computer-readable media that are not only propagating transitory signals per se. Stated another way, the meaning of the term “non-transitory computer-readable medium” and “non-transitory computer-readable storage medium” should be construed to exclude only those types of transitory computer-readable media which were found in In Re Nuijten to fall outside the scope of patentable subject matter under 35 U.S.C. § 101.

Abstract

The systems, devices and methods provided for in the present disclosure are directed to measuring and tracking biomarker data. Data can be acquired using one or more sensors that determine vitals such as heart rate, HRV, blood pressure, oxygen levels, CO2 levels, glucose levels, ketone levels, general awareness or alertness, stress, reflex time, resilience, training, body temperature, or related capacity or capability, or a combination of the foregoing. The one or more sensors can include a finger-over-camera sensor that can measure vital signs from contact with the finger of a user and/or a face-over-camera sensor that can measure biomarkers via the camera feature of the smartphone, and/or a standalone camera, including those in IoT devices. The sensors can communicate with a smartphone application that can be toggled to take measurements of predetermined biomarkers at desired intervals, and the data from each sensor can be combined to build a user profile.

Description

Machine Learning Models for Estimating Physiological Biomarkers
CROSS REFERENCE TO RELATED APPLICATIONS
[0001] The subject-matter of this patent application may be related to the subject matter of U.S. Patent Application No. 16/867,624, filed on May 6, 2020, and titled “Heart Rate Variability Monitoring and Analysis,” and U.S. Patent Application No. 16/867,629, filed on May 6, 2020, and titled “Heart Rate Variability Composite Scoring and Analysis,” the contents of which are hereby incorporated herein by reference in their entireties, including the drawings and appendices.
COPYRIGHT NOTICE
[0002] A portion of the disclosure of this patent document contains material which is subject to copyright protection. The copyright owner has no objection to the facsimile reproduction of the patent document or the patent disclosure, as it appears in the Patent and Trademark Office patent file or records, but otherwise reserves all copyright rights whatsoever.
FIELD
[0003] The present disclosure relates to systems, devices, and methods for detection and tracking biomarkers, and more specifically, to using one or more sensors associated with a user to measure vitals, and customized insights, including health, wellness, and fitness-related parameters over a predetermined period.
BACKGROUND
[0004] Heart Rate Variability (HRV) is determined from heart beat data and represents variability in inter-beat timing. A heart rate monitor or other sensor detects the ECG (electrocardiograph) or the PPG (photoplethysmography) signal, e.g., a data measure that varies in relation to the heart’s contraction and relaxation. From this, the peaks of the heart contraction can be derived and plotted against time. That is, the time between the ventricular and atrial contractions of the heart can be derived and plotted against time. This, in turn, allows the timing between peaks to be reported as a time (in milliseconds) between peaks.
[0005] Currently, there is a need for improved systems, devices, and methods for measuring biomarkers that are more user friendly and accurate in more conditions, and more broadly accessible to the general public.
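As a concrete, non-limiting illustration of the computation described in the preceding paragraph, the following sketch derives inter-peak times in milliseconds from an already-sampled PPG waveform. The sampling rate, peak-detection thresholds, and use of SciPy are illustrative assumptions rather than requirements of the present disclosure.

```python
# Minimal sketch: derive inter-peak (R-R style) intervals in milliseconds
# from an already-sampled PPG waveform. The sampling rate and peak-detection
# thresholds below are illustrative assumptions, not disclosed values.
import numpy as np
from scipy.signal import find_peaks

def inter_peak_intervals_ms(ppg: np.ndarray, fs_hz: float = 100.0) -> np.ndarray:
    """Return the times between successive PPG peaks, in milliseconds."""
    # Require peaks to be at least 300 ms apart (~200 bpm upper bound).
    min_distance = int(0.3 * fs_hz)
    peaks, _ = find_peaks(ppg, distance=min_distance, prominence=0.3 * np.std(ppg))
    peak_times_s = peaks / fs_hz
    return np.diff(peak_times_s) * 1000.0  # milliseconds between successive peaks
```

With a clean recording, the resulting array is the inter-beat series from which variability metrics can later be computed.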
SUMMARY
[0006] The systems, devices and methods provided for in the present disclosure are directed to measuring and tracking biomarker data. Data can be acquired using one or more sensors that determine vitals such as heart rate, HRV, blood pressure, oxygen levels, CO2 levels, glucose levels, ketone levels, general awareness or alertness, stress, reflex time, resilience, training, body temperature, or related capacity or capability, or a combination of the foregoing. The one or more sensors can include a finger-over-camera sensor that can measure vital signs from contact with the finger of a user and/or a face-over-camera sensor that can measure biomarkers via the camera feature of the smartphone, and/or a standalone camera, including those in IoT devices. The sensors can communicate with a smartphone application that can be toggled to take measurements of predetermined biomarkers at desired intervals. The data from each sensor can be combined to build a user profile where the data from each sensor is used to corroborate or correct the measurement taken by the other sensors. Biomarker data can be saved by the system to the user profile that can track a history of the biomarkers over an extended period. Throughout this document, the terms biomarker, metric, and biometric will be used interchangeably.
[0007] In an embodiment, a system for measuring biomarkers of a user includes: a communication device configured to make biomarker determinations, the device having one or more cameras thereon configured to capture an image of one or more users; a plurality of sensors coupled to a hardware component, the plurality of sensors being configured to collect biomarker data from the one or more users; a storage system configured to store the biomarker data to a profile that corresponds to a user of the one or more users; and a display unit associated with the communication device configured to display the biomarker data, wherein: the biomarker data collected by a sensor of the plurality of sensors is compared to biomarker data collected by another sensor of the plurality of sensors to validate values of the biomarker data; and repeated storage of the biomarker data to the profile trains a sensor of the plurality of sensors to stop collecting biomarker data during subsequent collections of biomarker data. [0008] The sensor of the plurality of sensors may include an optical sensor configured to detect the face of the one or more users to collect biomarker data therefrom. Another sensor of the plurality of sensors may include a pressure sensor configured to contact the one or more users to collect biomarker data therefrom. The pressure sensor may further comprise a camera configured to collect a supplementary signal by measuring a force applied to the camera by the user. The optical sensor and the pressure sensor may be configured to conduct readings substantially simultaneously. The readings may be conducted over a period of from approximately 10 seconds to approximately ten minutes. The communication device may provide a warning to adjust a position of the plurality of sensors if biomarker data is not being collected.
[0009] The optical sensor may collect biomarker data by one or more of color extraction, pixel movement, and/or eye-specific movement of a face of the user. The color extracted may include visible and/or invisible wavelengths.
[0010] The system may further include at least a third sensor configured to receive data from one or more of the first sensor, the second sensor, or the user, the third sensor having a hardware component configured to make biomarker determinations. The third sensor may include one or more of a chest strap or a patch, a smartwatch, a wrist or arm wearables, a ring wearable, or a hat.
[0011] In embodiments, the biomarker data may include one or more of heart rate, HRV, blood pressure, oxygen levels, CO2 levels, glucose levels, ketone levels, general awareness or alertness, stress, reflex time, resilience, training, body temperature, or related capacity or capability, or a combination of the foregoing.
[0012] In another embodiment, a method for determining biomarkers includes: placing a first sensor relative to a first body part of a user; placing a second sensor relative to a second body part of the user, the second sensor being configured to capture an image of a face of the user; measuring a value of a first biomarker of the first body part of the user using the first sensor to determine a first reading; measuring a value of the first biomarker of the second body part of the user using the second sensor to determine a second reading, the first biomarker being measured by the first sensor and the second sensor substantially simultaneously; analyzing the first reading and the second reading to determine the existence of a correlation between the values; and displaying a final value of the first biomarker based on the analysis of the first reading and the second reading. The first body part may be a finger of the user and the second body part may be a face of the user.
[0013] In embodiments, the method may further include storing the final value of the first biomarker in a user profile that includes a history of the final readings of biomarkers taken by the user. The second reading may be compared to the first reading to verify accuracy of the first reading.
[0014] The method may further include adjusting the first reading based on a measured output of the second reading. The method may further include using the measured values to calibrate the profile of the user. The first and second readings may be taken over a predetermined time interval. The predetermined time interval may range from approximately 10 seconds to approximately ten minutes. The method may further include filtering the first reading and the second reading to determine an average value of each of the first reading and the second reading.
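The comparison, adjustment, and averaging of two simultaneous readings described above can be illustrated with the minimal sketch below; the correlation threshold, the equal weighting, and the function name fuse_readings are illustrative assumptions, not disclosed values.

```python
# Illustrative sketch of two-sensor validation: simultaneous readings of the
# same biomarker are compared, and a final value is reported only when they
# agree. The correlation threshold and equal weighting are assumptions.
from typing import Optional
import numpy as np

def fuse_readings(first: np.ndarray, second: np.ndarray,
                  min_corr: float = 0.8) -> Optional[float]:
    """Return a fused biomarker value, or None if the readings disagree."""
    if len(first) != len(second) or len(first) < 2:
        return None
    corr = np.corrcoef(first, second)[0, 1]
    if np.isnan(corr) or corr < min_corr:
        return None  # no usable agreement; prompt the user to take another reading
    # Simple unweighted average of the per-sensor means.
    return float((first.mean() + second.mean()) / 2.0)

# Example: heart-rate samples (bpm) taken over the same interval by two sensors.
finger_hr = np.array([62.0, 63.5, 64.0, 63.0])
face_hr = np.array([61.5, 63.0, 64.5, 63.5])
print(fuse_readings(finger_hr, face_hr))  # fused value near 63 bpm
```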
[0015] The method may further include measuring a second biomarker of the user using one or more of the first sensor and the second sensor, and combining the first biomarker with the second biomarker.
[0016] In yet another embodiment, a method for biometric monitoring and scoring includes: collecting biometric data from a plurality of users; collecting data measuring physiological data and environmental data associated with said plurality of users over time; storing all collected data in an electronic storage system; analyzing said collected data to create a composite score that comprises at least heart rate activity data, biometric data, and environmental data; comparing said composite score against historical composite scores to determine activity modifications that will impact the behavior of said plurality of users prior to collecting additional biometric data; and presenting said activity modifications to said plurality of users on a display device in combination with recommended actions to accomplish said activity modifications.
[0017] The method may further include utilizing one or more signal cleaning algorithms to detect artifacts in the collected data and improve the collected data by removing any detected artifacts that impair the signal quality of the biometric and/or physiological data. The signal cleaning of the collected data may be performed during a data collection action and the cleaned collected data is stored in an electronic format prior to analysis of said cleaned collected data.
[0018] In embodiments, the method may further include tracking said composite score over a pre-configured time span. The composite score may comprise Heart Rate Variability (HRV) data, biometric data, and changes in environment data.
[0019] In embodiments, the method may further include receiving from a user or medical practitioner a threshold composite score or composite score range that is preferred for the user to maintain. The method may further include transmitting to a user recommended actions comprising events, interventions, and/or planned steps in accordance with maintaining said user’s particular composite score.
[0020] The sensors may comprise any of a finger sensor, an LED sensor, a chest-strap electrocardiogram sensor, or sensors contained within or attached to a mobile device associated with said user.
[0021] The composite score may be presented to a user as a numeric value and a gauge graphic to permit the user to visually understand changes in the composite score over time. The composite score, recommendations, and guidance may be provided as a report, as part of an ongoing data display, or in real-time as live biofeedback to a user during an activity.
[0022] In yet another embodiment, a system for biometric monitoring and scoring includes: collecting biometric data from a plurality of users; collecting data measuring physiological data and environmental data associated with said plurality of users over time; storing all collected data in an electronic storage system; analyzing said collected data to create a composite score that comprises at least heart rate activity data, biometric data, and environmental data; comparing said composite score against historical composite scores to determine activity modifications that will impact the behavior of said plurality of users prior to collecting additional biometric data; presenting said activity modifications to said plurality of users on a display device in combination with recommended actions to accomplish said activity modifications.
[0023] The system may further include utilizing one or more signal cleaning algorithms to detect artifacts in the collected data and improve the collected data by removing any detected artifacts that impair the signal quality of the biometric and/or physiological data. [0024] The signal cleaning of the collected data may be performed during a data collection action and the cleaned collected data is stored in an electronic format prior to analysis of said cleaned collected data.
[0025] The sensors may comprise any of a finger sensor, an LED sensor, a chest-strap electrocardiogram sensor, or sensors contained within or attached to a mobile device associated with said user.
[0026] In embodiments, the system may further include tracking said composite score over a preconfigured time span. The composite score may comprise Heart Rate Variability (HRV) data, biometric data, and changes in environment data.
[0027] The system may further include receiving from a user or medical practitioner a threshold composite score or composite score range that is preferred for the user to maintain.
[0028] The system may further include transmitting to a user recommended actions comprising events, interventions, and/or planned steps in accordance with maintaining said user’s particular composite score. The composite score may be presented to a user as a numeric value and a gauge graphic to permit the user to visually understand changes in the composite score over time. The composite score, recommendations, and guidance may be provided as a report, as part of an ongoing data display, or in real-time as live biofeedback to a user during an activity.
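The composite-score loop described in the preceding embodiments (score calculation, comparison against a threshold or range, and selection of a recommended action) can be illustrated with the toy sketch below; the equal weighting of the score components and the example interventions are placeholders, not the disclosed scoring algorithm.

```python
# Toy sketch of the composite-score feedback loop described above. The way
# the score is combined here (a simple equal-weight average) and the example
# interventions are placeholders, not the disclosed scoring algorithm.
from dataclasses import dataclass

@dataclass
class Reading:
    hrv_score: float          # 1-100 scaled HRV score
    biometric_score: float    # e.g., normalized resting HR / BP component
    environment_score: float  # e.g., normalized temperature / air-quality component

def composite_score(r: Reading) -> float:
    # Equal weights are an illustrative assumption.
    return (r.hrv_score + r.biometric_score + r.environment_score) / 3.0

def recommend(score: float, target_low: float, target_high: float) -> str:
    if score < target_low:
        return "recovery-oriented intervention (e.g., breathing exercise, rest)"
    if score > target_high:
        return "maintain current plan; consider a higher training load"
    return "within target range; no change to the planned steps"

reading = Reading(hrv_score=58.0, biometric_score=70.0, environment_score=65.0)
score = composite_score(reading)
print(score, recommend(score, target_low=60.0, target_high=80.0))
```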
[0029] In yet another embodiment, a method for training machine learning models to transform camera-based images into estimates of physiological biomarkers includes: providing a time series of video frames collected from a region of a body of a subject that is associated with a blood volume change within a tissue of the body; providing a time series of R-R intervals, the R-R intervals provided as ground truth data; synchronizing the time series of the video frames and the time series of the R-R intervals to provide an input pair to provide a synchronized time series of video frames and time series of R-R intervals; and training a machine learning model to estimate R-R intervals from the synchronized time series of video frames and time series of R-R intervals using the time series of R-R intervals as the ground truth data. [0030] The time series of video frames may be recorded by finger-over-camera or face-over- camera video cameras. The time series of R-R intervals may be calculated from inter-beat intervals measured by an ECG.
[0031] In embodiments, the machine learning model may be configured as a deep learning network. The deep learning network may be a convolutional neural network.
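A minimal sketch of one way such a convolutional network could be organized is shown below, assuming PyTorch as the framework; the layer sizes, window length, and loss function are arbitrary illustrative choices rather than the disclosed architecture.

```python
# Minimal sketch (assuming PyTorch) of a convolutional model that regresses an
# R-R interval estimate from a window of per-frame color features. Layer sizes,
# window length, and the loss function are arbitrary illustrative choices.
import torch
import torch.nn as nn

class RRFromFramesCNN(nn.Module):
    def __init__(self, in_channels: int = 3):
        super().__init__()
        self.features = nn.Sequential(
            nn.Conv1d(in_channels, 16, kernel_size=7, padding=3), nn.ReLU(),
            nn.Conv1d(16, 32, kernel_size=7, padding=3), nn.ReLU(),
            nn.AdaptiveAvgPool1d(1),
        )
        self.head = nn.Linear(32, 1)  # mean R-R interval (ms) for the window

    def forward(self, x: torch.Tensor) -> torch.Tensor:
        # x: (batch, channels, window_len) per-frame color means over time
        return self.head(self.features(x).squeeze(-1))

model = RRFromFramesCNN()
dummy = torch.randn(8, 3, 256)       # 8 windows of 256 frames, RGB channel means
target = torch.full((8, 1), 800.0)   # ground-truth R-R (ms) derived from ECG
loss = nn.MSELoss()(model(dummy), target)
loss.backward()
```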
[0032] In embodiments, the training may include supervised learning. The supervised learning may be configured to operate without exact known features of the regions of the body of the subject.
[0033] In embodiments, the synchronizing may be performed before training the machine learning model by using one or more signal processing peak detection methods. The synchronized time series of video frames and time series of R-R intervals may include a first synchronized segment of the synchronized time series of video frames and time series of R-R intervals having a first duration. The first synchronized segment may be subdivided into one or more subdivided synchronized segments having one or more durations.
[0034] A moveable time window may be provided to subdivide the first synchronized segment into the one or more subdivided synchronized segments. The one or more subdivided synchronized segments may be used to train the machine learning model.
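The moveable-time-window subdivision described above may be illustrated as follows; the window length and stride are illustrative assumptions.

```python
# Sketch of the moveable-time-window subdivision: a long synchronized segment
# of per-frame samples is cut into shorter, overlapping training segments.
# The window length and stride below are illustrative.
from typing import List
import numpy as np

def sliding_segments(samples: np.ndarray, window_len: int, stride: int) -> List[np.ndarray]:
    """Subdivide a time-ordered array into overlapping windows."""
    segments = []
    for start in range(0, len(samples) - window_len + 1, stride):
        segments.append(samples[start:start + window_len])
    return segments

# e.g., a 60 s segment at 30 frames/s subdivided into 10 s windows, 5 s apart
frames = np.arange(60 * 30)
windows = sliding_segments(frames, window_len=10 * 30, stride=5 * 30)
print(len(windows))  # 11 overlapping segments
```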
[0035] In yet another embodiment, a method for training machine learning models to learn time-dependent patterns in a time-series signal includes: providing a time domain biometric value; providing a time series of R-R intervals as ground truth data; and training a machine learning model to determine a time domain biometric value from the time series of R-R intervals.
[0036] The time domain biometric value may include at least one of an HRV score or a RMSSD value. The machine learning model may include a long short term memory model.
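A minimal sketch of a long short term memory regressor of the general kind described above is given below, assuming PyTorch; hidden sizes and sequence length are illustrative assumptions.

```python
# Minimal sketch (assuming PyTorch) of a long short-term memory regressor that
# maps a sequence of R-R intervals to a time-domain value such as RMSSD.
# Hidden size and sequence length are illustrative assumptions.
import torch
import torch.nn as nn

class TimeDomainLSTM(nn.Module):
    def __init__(self, hidden: int = 32):
        super().__init__()
        self.lstm = nn.LSTM(input_size=1, hidden_size=hidden, batch_first=True)
        self.head = nn.Linear(hidden, 1)

    def forward(self, rr: torch.Tensor) -> torch.Tensor:
        # rr: (batch, seq_len, 1) R-R intervals in milliseconds
        out, _ = self.lstm(rr)
        return self.head(out[:, -1, :])  # regress from the final hidden state

model = TimeDomainLSTM()
rr_batch = torch.randn(4, 120, 1) * 50 + 800  # 4 sequences of 120 intervals
print(model(rr_batch).shape)                  # torch.Size([4, 1])
```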
[0037] The method may further include training the machine learning model to produce a frequency domain biometric value. The frequency domain biometric value may include at least a high-frequency power value.
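A conventional signal-processing reference for a high-frequency power value of the kind named above is sketched below, assuming SciPy; the 4 Hz resampling rate and Welch parameters are common conventions, not disclosed requirements.

```python
# Conventional sketch of a high-frequency (HF, 0.15-0.4 Hz) power estimate from
# R-R intervals, the kind of frequency-domain value described as a training
# target. The 4 Hz resampling rate is a common assumption, not a requirement.
import numpy as np
from scipy.interpolate import interp1d
from scipy.signal import welch
from scipy.integrate import trapezoid

def hf_power(rr_ms: np.ndarray, fs: float = 4.0) -> float:
    """Integrate the PSD of the evenly resampled R-R series over 0.15-0.4 Hz."""
    t = np.cumsum(rr_ms) / 1000.0                  # beat times in seconds
    grid = np.arange(t[0], t[-1], 1.0 / fs)        # even time grid for resampling
    rr_even = interp1d(t, rr_ms, kind="cubic")(grid)
    freqs, psd = welch(rr_even - rr_even.mean(), fs=fs, nperseg=min(256, len(grid)))
    band = (freqs >= 0.15) & (freqs <= 0.4)
    return float(trapezoid(psd[band], freqs[band]))  # power in ms^2
```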
[0038] In embodiments, the machine learning model may be configured as a deep learning network. The deep learning network may be configured as a convolutional neural network. [0039] In embodiments, the machine learning model may be a hybrid machine learning model. The hybrid machine learning model may be trained by a stacking model. The stacking model may include a convolutional neural network and a long short term memory model.
[0040] In embodiments, the machine learning model may be trained to detect and remove artifacts from the time series of R-R intervals. The artifacts may be caused by one or more of motion, arrhythmias, premature ectopic beats, atrial fibrillation, measurement, or signal noise.
[0041] In yet another embodiment, a method for training machine learning models to determine a subset of pixels in video frames to generate one or more biomarker values includes: providing a time series of video frames collected from a region of a body of a subject that is associated with a blood volume change within a tissue of the body; dividing each of the time series of video frames into cells of an N x M grid; detecting one or more cells having the highest information content; calculating estimates of the one or more biomarkers from each of the detected cells; and training a machine learning model to fuse the estimates of the one or more biomarkers from each of the detected cells to generate the one or more biomarker values. The one or more biomarker values may include at least one of R-R intervals or an HRV value.
[0042] The dividing may increase the signal to noise in the detected cells. The dividing may reduce the amount of computation required to generate the one or more biomarker values.
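The grid subdivision, cell selection, and fusion described above may be illustrated with the following sketch, in which the temporal variance of a cell's mean intensity stands in for "information content"; the grid size, the variance criterion, and the number of selected cells are illustrative assumptions.

```python
# Illustrative sketch of the N x M grid step: each frame is divided into cells,
# cells are ranked by the temporal variance of their mean intensity (a simple
# stand-in for "information content"), and the best cells are fused by averaging.
import numpy as np

def cell_signals(frames: np.ndarray, n: int, m: int) -> np.ndarray:
    """frames: (T, H, W) grayscale video; returns (n*m, T) mean-intensity traces."""
    T, H, W = frames.shape
    traces = []
    for i in range(n):
        for j in range(m):
            cell = frames[:, i * H // n:(i + 1) * H // n, j * W // m:(j + 1) * W // m]
            traces.append(cell.reshape(T, -1).mean(axis=1))
    return np.stack(traces)

def fuse_best_cells(frames: np.ndarray, n: int = 4, m: int = 4, k: int = 3) -> np.ndarray:
    traces = cell_signals(frames, n, m)
    best = np.argsort(traces.var(axis=1))[-k:]  # k cells with the largest variance
    return traces[best].mean(axis=0)            # fused 1-D pulse-like signal

fused = fuse_best_cells(np.random.rand(300, 120, 160))
print(fused.shape)  # (300,)
```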
[0043] In yet another embodiment, a method for training machine learning models to determine a subset of pixels in video frames to generate one or more biomarker values includes: providing a time series of video frames collected from a region of a body of a subject that is associated with a blood volume change within a tissue of the body; using attention mechanism to do spatial segmentation in each of the time series of video frames; detecting the spatial segmentation in each of the time series of video frames; and training a machine learning model to calculate estimates of the one or more biomarkers from the spatial segmentation in each of the time series of video frames. The one or more biomarker values may include at least one of R-R intervals or an HRV value.
[0044] In embodiments, the spatial segmentation in each of the time series of video frames may increase the signal to noise in the detected cells. The spatial segmentation in each of the time series of video frames may reduce the amount of computation required to generate the one or more biomarker values.
BRIEF DESCRIPTION OF THE DRAWINGS
[0045] Certain illustrative embodiments illustrating organization and method of operation, together with objects and advantages may be best understood by reference to the detailed description that follows taken in conjunction with the accompanying drawings in which:
[0046] FIG. 1A is a schematic illustration of a pulse waveform from a PPG signal;
[0047] FIG. 1B is an example of a PPG signal;
[0048] FIG. 2A is a schematic illustration of use of a finger-over-camera sensor of a smartphone in accordance with the present embodiments;
[0049] FIG. 2B is a schematic illustration of use of a face-over-camera sensor of the smartphone of FIG. 2A;
[0050] FIG. 3A is a schematic perspective view of an exemplary embodiment of a wearable sensor attached to a finger of a user;
[0051] FIG. 3B is a schematic top view of the wearable sensor of FIG. 3A attached to the finger of the user;
[0052] FIG. 3C is a schematic perspective view of alternate embodiments of wearable devices that can be used with the embodiments of the present disclosure;
[0053] FIG. 4 is a view of artifact detection accuracy in terms of the detection of false positive artifact detection consistent with certain embodiments of the present invention;
[0054] FIG. 5 is a view of artifact detection accuracy in terms of the detection of true positive artifact detection consistent with certain embodiments of the present invention;
[0055] FIG. 6 is a view of artifact impact on the system consistent with certain embodiments of the present invention;
[0056] FIG. 7 is a view of the display of HRV statistics for a user post reading consistent with certain embodiments of the present invention; [0057] FIG. 8 is a view of the display of the continuation of HRV statistics for a user post reading consistent with certain embodiments of the present invention;
[0058] FIG. 9 is an exemplary embodiment of an interface displaying readings that can be performed with the sensors of the present embodiments;
[0059] FIG. 10 is an exemplary embodiment of an interface for taking an HRV snapshot in accordance with the present embodiments;
[0060] FIG. 11A is a tutorial on how to use the finger-over-camera sensor of FIG. 1A;
[0061] FIG. 11B is a status update of use of the finger-over-camera sensor of FIG. 1A;
[0062] FIG. 11C is an interface indicating that the system is waiting to receive pulse data from a wearable device in communication with the system of the present disclosure;
[0063] FIG. 11D is an interface for an accuracy check performed by the application;
[0064] FIG. 11E is an interface for indicating that the accuracy check is complete;
[0065] FIG. 12 is a status screen 1200 indicating that the finger-over-camera sensor is taking a reading;
[0066] FIG. 13 is an interface that displays results of the finger-over-camera sensor reading;
[0067] FIG. 14 is an exemplary embodiment of reminders prior to taking a face-over-camera sensor reading;
[0068] FIG. 15 is an interface indicating that the system is waiting to receive pulse data from a wearable device in communication with the system of the present disclosure;
[0069] FIG. 16A is an exemplary embodiment of calibration of the face-over-camera sensor to take readings from a user;
[0070] FIG. 16B is an accuracy check of the calibration of FIG. 16A;
[0071] FIG. 16C is an interface indicating that calibration of the face-over-camera sensor has been completed;
[0072] FIG. 16D is an interface for taking an HRV snapshot using the face-over-camera sensor;
[0073] FIG. 16E is an interface that displays biomarker information when using the face-over-camera sensor;
[0074] FIG. 16F is an interface showing the user their face;
[0075] FIG. 17 is an exemplary embodiment of an interface for indicating an error message during a reading;
[0076] FIG. 18 is an exemplary embodiment of an interface for personalization after a reading is taken;
[0077] FIG. 19 is an exemplary embodiment of an interface for enabling additional tagging features;
[0078] FIG. 20 is an interface showing the results of the reading of FIGS. 16A-16F;
[0079] FIG. 21 is an exemplary embodiment of a home screen of the application;
[0080] FIG. 22A is an exemplary embodiment of a calibration of a scanner function that can collect biomarker data live via the face-over-camera sensor;
[0081] FIG. 22B is an interface for performing the face-over-camera live reading of FIG. 22A;
[0082] FIG. 23 is an exemplary embodiment of a web browser version using face-over-camera for measuring biomarkers via video conference;
[0083] FIG. 24A is an exemplary embodiment of detection for a reading using the face-over-camera sensor with no live reading;
[0084] FIG. 24B is an exemplary embodiment of calibration for a reading using the face-over-camera sensor with no live reading;
[0085] FIG. 24C is an exemplary embodiment of an interface indicating that calibration is complete; [0086] FIG. 24D is an exemplary embodiment of a reading being formed;
[0087] FIG. 24E is an exemplary embodiment of results of the reading;
[0088] FIG. 24F is an exemplary embodiment of an interface comparing results of the user profile with other users;
[0089] FIG. 25 is a view of the display of the data integration connections for the user device consistent with certain embodiments of the present invention;
[0090] FIG. 26 is a view of the display of the historical log for a user consistent with certain embodiments of the present invention;
[0091] FIG. 27 is a view of the historical trends for HRV statistics for a user consistent with certain embodiments of the present invention;
[0092] FIG. 28 is a view of the connection capability for the sensors associated with the HRV monitoring system consistent with certain embodiments of the present invention;
[0093] FIG. 29 is a view of the historical trends for HRV statistics for a user related to morning readiness scores and HRV values consistent with certain embodiments of the present invention;
[0094] FIG. 30 is a view of the detailed data values for HRV statistics for a user related to morning readiness scores and HRV values consistent with certain embodiments of the present invention;
[0095] FIG. 31 is a view of the informational data for a user related to morning readiness scores and HRV values expressed as autonomic balance consistent with certain embodiments of the present invention;
[0096] FIG. 32 is a view of the historical trends for HRV statistics for a population related to morning readiness scores and HRV values consistent with certain embodiments of the present invention;
[0097] FIG. 33 is a view of the historical trends for HRV statistics for a population related to morning readiness scores and HRV values consistent with certain embodiments of the present invention; [0098] FIG. 34 is a view of the raw data captured for RR intervals and HRV values consistent with certain embodiments of the present invention;
[0099] FIG. 35 is a view of the relationship between RR intervals and HRV values consistent with certain embodiments of the present invention;
[0100] FIG. 36 is a view of the display of HRV statistics and signal quality for a user post reading consistent with certain embodiments of the present invention;
[0101] FIG. 37 is a view of the user feedback and tagging display consistent with certain embodiments of the present invention;
[0102] FIG. 38 is a view of the display of HRV statistics and signal quality for a user post reading consistent with certain embodiments of the present invention;
[0103] FIG. 39 is a view of the composite score reading and data collection process consistent with certain embodiments of the present invention;
[0104] FIG. 40 presents a view of the HRV system configuration consistent with certain embodiments of the present disclosure.
DETAILED DESCRIPTION
[0105] Certain exemplary embodiments will now be described to provide an overall understanding of the principles of the structure, function, manufacture, and use of the systems, devices, and methods disclosed herein. One or more examples of these embodiments are illustrated in the accompanying drawings. Those skilled in the art will understand that the systems, devices, and methods specifically described herein and illustrated in the accompanying drawings are non-limiting exemplary embodiments and that the scope of the present disclosure is defined solely by the claims. The features illustrated or described in connection with one exemplary embodiment may be combined with the features of other embodiments. Such modifications and variations are intended to be included within the scope of the present disclosure.
[0106] While this invention is susceptible of embodiment in many different forms, there is shown in the drawings and will herein be described in detail specific embodiments, with the understanding that the present disclosure of such embodiments is to be considered as an example of the principles and not intended to limit the invention to the specific embodiments shown and described. In the description below, like reference numerals are used to describe the same, similar or corresponding parts in the several views of the drawings.
[0107] The terms “a” or “an”, as used herein, are defined as one or more than one. The term “plurality”, as used herein, is defined as two or more than two. The term “another”, as used herein, is defined as at least a second or more. The terms “including” and/or “having”, as used herein, are defined as comprising (i.e., open language). The term “coupled”, as used herein, is defined as connected, although not necessarily directly, and not necessarily mechanically.
[0108] Reference throughout this document to "one embodiment", "certain embodiments", "an embodiment" or similar terms means that a particular feature, structure, or characteristic described in connection with the embodiment is included in at least one embodiment of the present invention. Thus, the appearances of such phrases in various places throughout this specification are not necessarily all referring to the same embodiment. Furthermore, the particular features, structures, or characteristics may be combined in any suitable manner in one or more embodiments without limitation.
[0109] Reference throughout this document to "sympathetic" refers to a part of the autonomic nervous system that serves to accelerate the heart rate, constrict blood vessels, and raise blood pressure.
[0110] Reference throughout this document to “parasympathetic” refers to the portion of the autonomic nervous system that conserves energy as it slows the heart rate, increases intestinal and gland activity, and relaxes sphincter muscles in the gastrointestinal tract.
[0111] Reference throughout this document to “HRV” refers to “Heart Rate Variability” which is a measure of the variability in inter-beat timing of a heart as it is actively beating.
[0112] Reference throughout this document to “HRV scoring” refers to the development of a score that is calculated utilizing various algorithms to present a scaled score from which comparisons over time may be made.
[0113] Reference throughout this document to “Morning Readiness” refers to a scaled score related to a user’s particular balance of parasympathetic and sympathetic nervous system activity. [0114] Reference throughout this document to “Composite Scoring” refers to composite scores that have HRV Scoring, Morning Readiness scoring, and additional scoring parameters that are not necessarily related to HRV scores.
[0115] Reference throughout this document to “Autonomic Balance” refers to changes in a user’s Autonomic Nervous System (ANS) as indicated by changes in the user’s HRV over time.
[0116] Reference throughout this document to a “Readiness Score” refers to a novel readiness score based upon ANS activity changes over time and indicates the user’s readiness to tackle life’s challenges each day.
[0117] The inter-beat intervals or R-R intervals are transmitted to the Elite HRV software wirelessly (currently via Bluetooth) from the finger sensor and used to calculate variability over time in the inter-beat or R-R intervals, i.e., the HRV data. Changes in the inter-beat or RR intervals are associated with activity and/or changes in the Parasympathetic Nervous System’s (parasympathetic) and Sympathetic Nervous System’s (sympathetic) activity (which influences and can control heart rate, blood pressure, pupil dilation, blood glucose, muscle tension, sexual function, digestion, and energy regulation), and has been used as an indicator of stress levels, inflammation levels, and post-exercise recovery status, among other conditions. As such, it has proven useful in certain areas, such as gauging if an athlete has recovered adequately from a prior workout or is in a state of overtraining (e.g., over-trained), estimating cognitive functioning, predicting risk for certain conditions, etc. In fact, HRV data has been correlated to all major causes of death.
[0118] The research literature evaluating algorithms for the detection and correction of artifacts in an Inter Beat Interval (e.g., IBI) series focuses on data sourced from specific, homogenous subpopulations rather than broad cross-sections. Additionally, source data used in the comparative literature is typically derived from low-noise, research-grade ECG sensors. While this evaluation strategy may suffice for limited clinical contexts in which the population and experimental conditions can be controlled, large-scale consumer HRV applications must satisfy broader requirements. In particular, those consumer applications supporting open compatibility with 3rd party Bluetooth sensors are liable to face significant variance in both the population parameters (e.g. age, athleticism, pathology of users), and the sensor platforms used. Thus, it is insufficient to evaluate artifact detection algorithms using the traditional, narrow source data parameters.
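For illustration only, the sketch below shows a simple median-deviation artifact detector of the general kind such evaluations compare, together with true-positive and false-positive tallies against labeled data; the 25% threshold and window length are assumptions, not parameters of the disclosed system.

```python
# Sketch of a simple inter-beat-interval artifact detector of the general kind
# compared in such evaluations, plus true/false-positive tallies against
# labeled data. The 25% median-deviation threshold and the 11-beat window are
# illustrative assumptions, not values used by the disclosed system.
from typing import Tuple
import numpy as np

def flag_artifacts(ibi_ms: np.ndarray, rel_threshold: float = 0.25) -> np.ndarray:
    """Flag intervals deviating from the local median by more than the threshold."""
    flags = np.zeros(len(ibi_ms), dtype=bool)
    for i in range(len(ibi_ms)):
        window = ibi_ms[max(0, i - 5):i + 6]
        local_median = np.median(window)
        flags[i] = abs(ibi_ms[i] - local_median) > rel_threshold * local_median
    return flags

def detection_rates(flags: np.ndarray, labels: np.ndarray) -> Tuple[float, float]:
    """Return (true-positive rate, false-positive rate) versus ground-truth labels."""
    tp = np.sum(flags & labels) / max(labels.sum(), 1)
    fp = np.sum(flags & ~labels) / max((~labels).sum(), 1)
    return float(tp), float(fp)
```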
[0119] There has been interest in leveraging HRV data (alone or in combination with other data) to make more specific recommendations. For example, competitor HRV4Training offers a suite of estimations geared towards athletes, including the following: Lactate threshold estimation (providing advice on pacing strategies for racing and workouts); Half and full marathon time estimation; VO2max estimation (cardiorespiratory fitness level); Functional Threshold Power (FTP) estimation (providing advice on pacing strategies for racing and workouts); and Aerobic Endurance (efficiency and cardiac decoupling). Others have used HRV data to generate a variety of scores, e.g., cognitive functioning scores, risk classifications for cardiac events and cardiovascular disease, etc. In general, HRV data has been and continues to be actively researched to determine if HRV data can act as an indicator or biomarker of certain conditions and used for predictions. For example, HRV is used to diagnose autonomic dysfunction, or more generally it is used with other data to assess and/or track health and performance when implementing new protocols.
[0120] Current methods leverage HRV data (alone or in combination with other data) to make specific recommendations about lactate threshold estimation (providing advice on pacing strategies for racing and workouts); half and full marathon time estimation; VO2max estimation (cardiorespiratory fitness level); Functional Threshold Power (FTP) estimation (providing advice on pacing strategies for racing and workouts); and Aerobic Endurance (efficiency and cardiac decoupling). Other current methods use HRV data to generate a variety of scores, e.g., EHRV’s Morning Readiness Score, EHRV’s HRV Score (a normalized manipulation of lnRMSSD), Stress Score, Productivity Score, Emotional State, and Illness Risk, etc. In general, HRV data has been and continues to be actively researched to determine if HRV data can act as an indicator or biomarker of certain conditions and used for predictions. For example, HRV is used to diagnose autonomic dysfunction, or more generally it is used with other data to assess and/or track health and performance when implementing new protocols.
[0121] A primary use context for HRV data remains athletic training and health predictions. Competitors, such as HRV4Training, have a suite of services geared towards this context. These include team views, calendar views, various customized scores (some calculated on the basis of third-party data in combination with HRV data), and rudimentary predictions (e.g., estimated marathon time).
Raw Sensor Data and Physiological Signals
[0122] The systems, devices, and methods provided for in the present disclosure are directed to transforming raw sensor data into personalized guidance to users. Raw sensor data detected by sensors and/or received sensor data transformed by firmware into physiological signals is processed, through a series of steps, into personalized guidance to users. The raw sensor data may be acquired using one or more sensors that measure physiological signals, like electrical waves. For example, in a finger-over-camera device, an optical sensor, such as a camera, detects colors and intensity of brightness as measured by pixels in an array. These signals may be transformed into estimates of peaks and valleys representative of blood volume changes that can be transformed into the biomarkers of heart rate (e.g., HR), heart rate variability (e.g., HRV), pulse oximetry, respiration rate, blood pressure, body temperature, and so forth. For some optical devices (for example, CorSense), the detected color data is transformed by the device firmware into peaks and valleys representative of blood flow for estimating HR and HRV biomarkers. As an example from a different type of sensor technology, similar transformations occur on electrical sensors like chest straps. Except, instead of color detection, the sensors are detecting electrical waves propagated by the beating of the heart. These electrical signals are transformed into heart beat data that can be processed into the biomarkers described above.
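The first transformation described above (reducing finger-over-camera frames to a pulse-like brightness series) can be illustrated as follows; OpenCV is an assumed dependency and the red-channel choice is a common convention rather than a disclosed requirement.

```python
# Sketch of the first transformation: finger-over-camera video frames are
# reduced to one brightness value per frame, producing a pulse-like (PPG-style)
# time series. OpenCV is an assumed dependency; the red-channel choice is a
# common convention, not a disclosed requirement.
import cv2
import numpy as np

def frames_to_pulse_signal(video_path: str) -> np.ndarray:
    """Return the mean red-channel intensity of each frame as a 1-D signal."""
    cap = cv2.VideoCapture(video_path)
    values = []
    while True:
        ok, frame = cap.read()               # frame is BGR, shape (H, W, 3)
        if not ok:
            break
        values.append(frame[:, :, 2].mean())  # index 2 = red channel in BGR order
    cap.release()
    return np.asarray(values)
```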
[0123] As disclosed herein, camera-based sensing of facial data can also be processed into estimates of heart beat data. Using a face-over-camera device, camera-based data that is based on acquired video images, e.g., time series data, of a user’s face also may be known as image-based data. That is, the use of “image” in the context of image-based videos refers to using video data to capture the time-series information associated with time-changing physiological signals. The arrays of pixels in the camera detect raw color and intensity data that can be transformed by machine learning (e.g., ML) and/or machine learning methods (e.g., ML methods) and signal processing into estimates of heart beat data. The machine learning algorithms can be trained by combining simultaneous data streams from multiple sensors to make the ML processing and training more robust than is possible with only the one data stream coming from the image-based data. This process helps “train” the algorithm to transform the camera-based images into useful estimates of physiological biomarkers. [0124] Indeed, in embodiments, combinatorial methods may be employed to determine human user biomarkers. For example, finger-over-camera sensors, face-over-camera sensors, and wider-angle cameras that capture a large portion of a user’s body may be combined as a multi-sensor input to scan various locations of a user’s body. For example, camera-based images may be used to make measurements of a user’s body. These measurements may include the length of a user’s torso, a distance between a finger and the chest of a user, a distance between the face and the chest of a user, and the like. These distance measurements may be combined with heart beat data to estimate the blood pressure of a user, and/or other biomarkers.
Clean Data and Biomarkers
[0125] Following the sensing of the raw data, and the processing of the raw data into estimated heart beat intervals, proprietary algorithms are used to “clean” the data. That is, it is necessary to remove noise from the estimated data that is caused by movements of the user during data acquisition, as well as other factors. The noise is in the form of artifacts that can be identified and removed. Once the artifacts are removed, the algorithms replace the noisy data with estimates of the actual signal. The algorithms are being constantly refined and improved using ML methods. By combining the image-based data with the other existing sources of data, the ML methods are improved making the training process more robust.
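Because the disclosed cleaning algorithms are proprietary, the sketch below shows only a generic, conventional stand-in for the step described above: flagged intervals are replaced by values interpolated from neighboring good beats.

```python
# Generic stand-in for the "cleaning" step: intervals flagged as artifacts are
# replaced by values interpolated from neighboring good beats. This is only a
# conventional illustration, not the disclosure's proprietary algorithm.
import numpy as np

def clean_ibi(ibi_ms: np.ndarray, artifact_flags: np.ndarray) -> np.ndarray:
    """Replace flagged intervals with linear interpolation over good beats."""
    cleaned = ibi_ms.astype(float).copy()
    good = ~artifact_flags
    if good.sum() < 2:
        return cleaned  # not enough clean data to interpolate from
    idx = np.arange(len(ibi_ms))
    cleaned[artifact_flags] = np.interp(idx[artifact_flags], idx[good], ibi_ms[good])
    return cleaned
```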
[0126] Once a “clean” data set is available, biomarkers, such as HRV, RMSSD (root mean square of successive differences between normal heartbeats), high frequency power, low frequency power, and the like, can be calculated. Relationships between these biomarkers have been linked to various physiological states, including inflammation, certain kinds of disease, training recovery, and the like.
[0127] Once the biomarkers linked to the various physiological states have been calculated, Customized Insights can be calculated from one or more biomarkers, and from historical trends observed in historical biomarker data. For example, a “Stress Score” reading is a Customized Insight. The “Stress Score” reading is an output value calculated using algorithms that receive trended HRV, HR, and respiration as inputs. Additional Customized Insights include EHRV’s Morning Readiness Score, EHRV’s HRV Score (a normalized manipulation of lnRMSSD), Stress Score, Productivity Score, Emotional State, and Illness Risk. These are calculated from HRV, HR, respiration, blood pressure (e.g., BP) and other input biomarkers.
[0128] Furthermore, Personalized Guidance may be calculated by algorithms that analyze the Customized Insights together with various biomarkers. The Personalized Guidance provides behavioral advice taking into account the user’s Customized Insights and the various biomarkers. For example, if the Customized Insights algorithm receives a low value for a Readiness Score, then Personalized Guidance might be to train with a lower intensity. While a Personalized Guidance is not medical advice nor is it a medical diagnosis, it may still be valuable to a user.
[0129] As described above, “HRV” refers to “Heart Rate Variability” which is a measure of the variability in inter-beat timing of a heart as it is actively beating. While heart rate (e.g., HR) is simply measured as the number of beats of the heart over a given period of time, such as 70 beats per minute, HRV refers to the variations of the precise time durations between the beats. That time variation can vary between around 10 milliseconds (ms) of variation between beats to more than 100ms of variation between beats. This amount of time between beats is referred to as inter-beat intervals, e.g., IBI’s. When the IBIs are measured by an ECG, the IBIs may also be known as R-R intervals (also used interchangeably with “RR intervals”). There are several algorithms used to calculate an HRV value from IBIs and/or RR values. Regardless of the algorithm used, determining an HRV value requires precise heart beat data. Importantly, the biomarkers are not measured, they are estimated from raw sensor data, as described above.
[0130] Reference throughout this document to “HRV scoring” refers to the development of a score that is calculated utilizing various algorithms to present a scaled score from which comparisons over time may be made. Reference throughout this document to “Morning Readiness” refers to a score related to a user's relative balance of parasympathetic and sympathetic nervous system activity based on the user's trended data and baseline status.
[0131] Further, to the extent features, sides, or steps are described as being “first” or “second,” such numerical ordering is generally arbitrary, and thus such numbering can be interchangeable. Still further, in the present disclosure, like-numbered components of various embodiments generally have similar features when those components are of a similar nature and/or serve a similar purpose. Moreover, a person skilled in the art will recognize that terms such as “measurements” and “readings” are used interchangeably herein to signify the process of data collection by the sensors disclosed herein. Lastly, the present disclosure includes some illustrations and descriptions that include prototypes or bench models. A person skilled in the art will recognize how to rely upon the present disclosure to integrate the techniques, systems, and methods provided for into a product in view of the present disclosures.
Sensor Types
[0132] The present disclosure generally relates to systems, devices, and methods to measure and track human user biomarkers. For example, the systems and devices of the present disclosure can utilize multiple sensor types, such as optical sensors, pressure sensors, electricpassive sensors, and the like. The various sensors may be disposed on parts of a user’s body, such as the upper or lower parts of a user’s arm or leg, on one or more of a user’s fingers, and/or on the chest of a user. Some sensors may be positioned separate and away from the user, such as a camera sensor located on a phone, laptop, desktop, webcam, security camera, and the like. The miniaturization of technology allows a user to measure more biomarker data than previously available.
[0133] The increased amount of sensors, and related sensor data, can be incorporated by HRV algorithms to deliver more advanced biomarker physiology insights based on the increased number and type of sensors. For example, some optical sensors worn on the arm and/or the finger of a user can measure PPG (photoplethysmography) signals that can be transformed into data that displays the variations of a heart’s contraction and relaxation. Electric-passive sensors, such as chest straps or multi-lead ECG sensors can measure electrical heart signals that are transformed into ECG traces. Pressure sensors can be integrated into smartphones and wearable devices. Sensors worn against the skin of a user may be able to measure skin temperature, and skin pH. Furthermore, continuous glucose monitors can measure blood glucose levels automatically, and nearly continuously.
[0134] In some embodiments, each sensor can be used to measure and/or track one or more parts of a user’s body to collect readings therefrom. These readings can be combined with one another to provide a detailed analysis of user physiology. Use of a plurality of sensors can exponentially increase power effectiveness of any of these methods and derive greater accuracy of user data. [0135] The collection of camera image-based physiological data may allow ML algorithms to augment, supplement, and/or validate sensor data. Collecting the image-based data may be performed utilizing visual light and/or infrared cameras pointed at the face of one or more users. The collected image data may provide insight into biomarker data such as heart rate, HRV, blood pressure, oxygen levels, CO2 levels, glucose levels, or ketone levels or a combination of the foregoing. The resultant biomarker data may be combined to provide a score that may include HRV data, or may consist of collected biomarker data as a corollary to a calculated HRV score. This data may provide insight into a user’s general awareness or alertness, stress, reflex time, resilience, training, body temperature, or related capacity or capability, or a combination of the foregoing.
HRV and Health-Related Predictions
[0136] The primary goal of Heart Rate Variability (e.g., HRV) monitoring and data analysis is to determine how a person’s heart rate, between the beats, fluctuates so as to perform analysis on the heart rate data, track heart rate variability over time, create an HRV score, create a “morning readiness score” over time, create composite scores that include additional scoring separate from the HRV scores created, and create insights and guidance to help the person achieve a relevant goal.
[0137] The HRV system and method disclosed herein describe improved techniques for obtaining usable HRV data using generally available sensors (e.g., physiological sensors, smartphone cameras, or other sensors as herein described) and techniques for improving the quality of HRV data or using lower quality HRV data, including improving signal-to-noise ratios. This will permit more confident HRV scoring when using lower quality sensors such as cameras and permit HRV scoring to be produced from smaller data samples, i.e., reducing the time needed to take useful readings.
[0138] Although extensive medical research has been conducted on the various uses for HRV data, e.g., workout recovery and performance or health-related predictions, conventionally HRV data (or various scores) have been used to provide general guidance for end users. An exception is in the clinical or research setting, where HRV data has been collected for single (not longitudinal) snapshots of the user's autonomic nervous system, usually derived from longer readings (5 minutes to hours long) with clinical-grade equipment.
[0139] The user has the opportunity to tag the HRV data collected during any reading with contextual information. This tagged information does not affect the scoring, but it can affect the Customized Insights and Personalized Guidance that are provided, and may assist the user in understanding the data and how it relates to any of a user’s goals. In a non-limiting example, the tag data types may include sleep data, exercise data, mood ratings, questionnaires, custom tags/notes, blood glucose level, body weight, as well as other relevant data to be shared with the user. The user may also link the data with third-party apps and services to automate any contextual data collection and display other types of data alongside the HRV data and Morning Readiness scores.
[0140] The HRV system utilizes captured data from one or more HRV readings to calculate an HRV Score, scaled on a 1-100 basis, based on the natural log of the Root Mean Square of Successive Differences (RMSSD) for the HRV data collected. Changes in the HRV Score correlate with changes in: breathing and respiratory patterns; physical stress; recovery from physical stress; physical performance; psychological stress and health; emotion and mood; cognitive performance; immune system function; inflammation, posture and structural health; injury; biological age; general health and wellbeing; resilience and adaptability; risk of disease; morbidity and mortality; motivation and willpower; and digestive stress.
[0141] Upon completion of the HRV score calculation, a user may compare the calculated HRV score and other non-proprietary HRV parameters to general population and/or demographic-filtered population data to provide an indication of how the user compares with the general population as a whole or with specific filtered portions of the general population. This comparison may provide a user with some indication as to changes in their HRV values with respect to their own historic values as well as historic values for a given population.
[0142] The HRV system may also create a daily expressed score for use in tracking a user’s HRV values over time. This daily expressed score is known as a Morning Readiness score. The Morning Readiness score is a scaled score (1-10) that shows the relative balance or imbalance in the user’s sympathetic and parasympathetic nervous system, as well as providing a number and color indicator of the user’s ability to handle the stress and challenges of the day. The Morning Readiness score correlates with day-to-day fluctuations in the nervous system for an individual, highlighting to the user when major changes may have occurred in the body, based on the user’s own unique individual patterns.
[0143] Currently, data must be collected from at least two HRV readings to establish a true baseline and to begin calculating the Morning Readiness score. The Morning Readiness score may be generated through automated pattern recognition applied to the user’s HRV scores over time. The pattern recognition is based on research and uses statistical methods such as standard deviation and mean over time to create the Morning Readiness score each day. This pattern recognition is further refined by research in the HRV system’s unique database of HRV data collected and stored for each user registered with the HRV system. Machine learning algorithms may be applied as testing and data analysis prove machine learning algorithms to be of equal or greater accuracy than the HRV system’s human-generated algorithms. The machine learning algorithms utilized by the HRV system may be trained using the HRV system’s database in order to produce an algorithm that automatically detects a user’s HRV trend and assigns a morning readiness score.
[0144] The HRV system may also create custom algorithms to determine customized scores. Such customized scores may include an inflammation score or other scoring parameters that may utilize HRV collected data in combination with other scoring data, or other parameters that are collected from outside data sources. The HRV system has access to large electronic data stores containing large amounts of collected HRV data from users of the system as well as metadata from HRV collected data. Much of the HRV data is labeled with contextual tags (metadata) and can be reviewed to label appropriate portions of the data as training data. For example, a user may tag that they are sick, have ingested some caffeine, or have just exercised. In this way, the retained HRV data and metadata may be used to create models that classify HRV data into contextual tag categories, proxies and/or equivalents, or new categories. This data categorization and metadata may permit the HRV system to detect conditions of interest in newly collected HRV data through the use of machine learning algorithms.
[0145] In some embodiments, the HRV system may utilize such machine learning algorithms to predict various conditions of interest to the users of the HRV system. Such conditions may include the ability to predict physical conditioning, physical performance levels, stress levels, EHRV’s Morning Readiness Score, EHRV’s HRV Score (a normalized manipulation of lnRMSSD), Stress Score, Productivity Score, Emotional State, Illness Risk, and other physical conditions affecting one or more users. The HRV system may utilize data organized into any number of statistical or visual patterns. For example, the data may be organized into one or more Poincare plots (a two-dimensional plotting of beat variability), geometric plots, DFA plots, Power Spectral Density charts, spectrograms, or any number of unique data visualizations that could potentially be run by machine learning (ML) algorithms to analyze data patterns and derive insights, or to identify conditions based on patterns present in the data. The plotted data may be derived from single-reading statistical and visual analysis, and/or from long-term trending patterns. Utilizing machine learning algorithms to perform an analysis of such data plots may permit the HRV system to discover new conditions, such as, in a non-limiting example, conditions associated with stress, athletic performance, fatigue, and other conditions affecting the users of the HRV system.
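As an illustration of one such visualization, the brief sketch below computes the points of a Poincare plot and the standard SD1/SD2 descriptors from a series of inter-beat intervals. It is a minimal example in Python; the function and variable names are illustrative and are not drawn from the disclosure, and the proprietary ML analyses described above are not reproduced here.

```python
import numpy as np

def poincare_descriptors(ibis_ms):
    """Compute Poincare plot points and SD1/SD2 from inter-beat intervals (ms)."""
    ibis = np.asarray(ibis_ms, dtype=float)
    x = ibis[:-1]   # RR(n)
    y = ibis[1:]    # RR(n+1)
    successive_diffs = y - x
    # SD1: dispersion perpendicular to the identity line (short-term variability).
    sd1 = np.std(successive_diffs, ddof=1) / np.sqrt(2.0)
    # SD2: dispersion along the identity line (longer-term variability).
    sd2 = np.sqrt(max(2.0 * np.var(ibis, ddof=1) - sd1 ** 2, 0.0))
    return x, y, sd1, sd2

# Example with a short, artificial IBI series (milliseconds).
x, y, sd1, sd2 = poincare_descriptors([812, 845, 790, 860, 825, 805, 840])
print(f"SD1={sd1:.1f} ms, SD2={sd2:.1f} ms")
```

The (x, y) pairs are the plotted Poincare points; SD1 and SD2 are the kind of derived features that could be fed to a downstream ML model.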
[0146] As may be appreciated, many such classifications are possible for various types of athletes, gaming competitors, users interested in general health, etc. These classifications may in turn be reported, for example, as a score associated with the category to which the HRV data is classified, or as a predicted risk or performance indicator. Implementation would largely depend on having access to sufficient training data, which may need to be annotated or categorized for use as training, validation, and/or testing data. However, the HRV system currently collects such data and may continue to collect these types of data in a directed manner. The data collection effort may take the form of prompting users to report goals, race times or event performances, training logs, stress levels, etc., in combination with their HRV data readings. This information, in combination with the collected HRV data and HRV metadata, provides the basis for the calculation and creation of composite scores consisting of HRV information and additional data supplied by users.
[0147] There have not been any solutions that offer specific guidance to end users based on HRV data that has been collected. This lack of guidance stems from a lack of certainty in the reliability of input HRV data itself and the inability to leverage reliable outcome data that correlates HRV data with specific plans, courses of action, and outcomes. This results in HRV data being used to generally estimate the user’s nervous system state and provide equally general feedback. In a non-limiting example, HRV data has been used to provide daily feedback regarding the body’s apparent ability to handle a stressful workout, a binary categorization of a user’s risk for a condition, etc. In this non-limiting example, an application for a training plan, such as a triathlon training plan, may guide a user through a series of HRV measurements, scores, and plan steps to customize the training for the user based on his or her actual HRV data. An initial HRV reading is taken, followed by a programmed event (e.g., a workout). Thereafter, the application guides the user to take a subsequent, updated HRV reading. Depending on the change, if any, in HRV data collected during the reading and the HRV score, the plan may be modified. The decision as to how the plan is to be modified may be programmed into the application, e.g., based on HRV data research, learning from the community, etc. At the various points in the plan, the application may provide data feedback related to the HRV score, the HRV scoring trend, contextual feedback, the Morning Readiness scores, or a combination of the foregoing. This allows the user(s) to understand, based on HRV data and other contextual data, the effectiveness of the plan, why it has been modified, etc.
[0148] While this is quite useful to athletes wishing to determine whether they are over-trained and should rest, or whether they have reached a specific performance level (e.g., an estimated half marathon time), such uses limit the usefulness of current applications and overlook many potential uses for HRV data, particularly as applied while the user is actively engaged in a training plan or a health or lifestyle improvement application and should have specific, guided adjustments made to that plan.
[0149] In some embodiments, with some modification (e.g., continuous reading for live biofeedback), this general technique is applicable to a wide variety of possible applications, ranging from near-term or planned applications for guided breathing and meditation, to exercise and fitness plan modification and food sensitivity validation based on HRV scores combined with other data, and even long term plans to use HRV data in novel contexts. Such additional applications may include modifying the behavior of systems like gaming systems, content recommendation systems, or vehicles using HRV data.
[0150] Additionally, HRV data tends to be somewhat difficult to understand. This lack of understanding of what HRV data specifically indicates regarding a user’s physical condition has resulted in the use of various scores. While these various scores are quite useful in driving home the meaning of a user’s current HRV readings, existing scores may also serve as a defined endpoint to the guidance or advice that could otherwise flow from the HRV data.
[0151] In some embodiments, the system and method described herein plans to provide an improved set of recommendations that include specific guidance based on HRV data and other physiological, behavioral, and outcome-based data. The improved recommendations may be provided periodically, as part of an ongoing plan, or provided in real-time for live biofeedback. This will allow users to more confidently approach a myriad of tasks that could be improved by monitoring HRV data as well as other aforementioned data and tailoring specific feedback on the basis thereof. This may include provision of various custom scores and directed, goal-oriented applications for individuals or groups.
[0152] In some embodiments, the HRV system may expand on the scores or indices that are provided to users by leveraging the proprietary database of collected HRV data. Scores of interest to the HRV system, and by extension to the HRV system users, include a recovery score, an inflammation score, a cognitive function score, a readiness score for specific goals (e.g., triathlon readiness), a health score, a fitness score, a stress index, a “tilt” score (gaming/poker term for being stressed), a self-awareness score, and a glucose/HRV/ketones index. For example, a cognitive function score may have sub-categories that may include: chronic/acute stress; attention/focus; productivity; and so on. Examples of health score sub-categories may include: inflammation; resilience; anxiety; and so on. Examples of fitness score sub-categories may include: recovery; readiness; VO2Max; and so on. Existing research may be useful in designing algorithms to make these predictions. In a non-limiting example, a cognitive function score may be created based on research indicating HRV scores are related to cognitive capability.
[0153] In some embodiments, HRV data quality remains a concern, particularly when attempting to offer anything but general guidance. While high quality HRV data can be obtained using a biosensor specifically designed for the task, such as a finger sensor or chest ECG strap, collection of high quality HRV data remains cumbersome due to the need for biosensors and somewhat extensive collection times.
[0154] In some embodiments, the system and method herein described may provide improved techniques for obtaining usable HRV data, as well as additional data input by a user or a medical service provider, using generally available sensors and input methods (e.g., smartphone cameras). This will permit more confident HRV scoring and combination scoring when using lower quality sensors such as cameras and permit HRV scoring to be produced from smaller data samples, thus reducing the time needed to take useful readings.
[0155] The efficacy of any heart rate variability metric critically depends upon the signal to noise ratio of its source data. In particular, series of so-called IBIs (inter-beat intervals, i.e., the time between consecutive R waves in the QRS complex, or the time between detected beats in a PPG pulse waveform) are susceptible to contamination by artifacts which, if ignored or improperly treated, demonstrably deteriorate the accuracy of estimated HRV metrics. The QRS complex is the combination of three of the graphical deflections seen on a typical electrocardiogram (ECG or EKG). It is usually the central and most visually obvious part of the tracing. It corresponds to the depolarization of the right and left ventricles of the heart and contraction of the large ventricular muscles. The Q, R, and S waves occur in rapid succession, do not all appear in all leads, and reflect a single event and thus are usually considered together.
[0156] Most sensors which detect heart beats for digital signal processing are one of two types: electrocardiogram (ECG) and photoplethysmography (PPG). ECG functions by placing electrodes on or near the user's chest. With each beat, the human heart generates variations in skin-surface voltage roughly on the order of 1 millivolt. These variations induce electron movement in the ECG electrodes which are captured in computer memory by analog-to-digital conversion. In some embodiments, PPG functions by emitting light composed of one or more wavelengths and intensities onto the user’s skin (usually finger or earlobe) and measuring the light reflected back or transmitted across. In some embodiments, PPG functions by using a flash on the camera to emit light, by using the ambient light behind a finger or earlobe as the light source, or by a combination of both light sources. Because the arterioles and arteries distend when blood is pumped by each heartbeat, the opacity of the tissue varies with the cardiac cycle.
[0157] HRV monitoring and related analytics may be provided through a proprietary finger sensor attached to a user and used for collecting HRV data (photoplethysmogram (PPG) data collected using LEDs), although the mobile app allows users to input HRV data using third-party sensors (e.g., a chest strap that collects electrocardiogram (ECG) data and provides inter-beat intervals for calculating HRV).
[0158] In some embodiments, the system may utilize the physiological sensor, either an electrocardiogram (ECG) or a photoplethysmogram (PPG), to detect heart beats of a user of the system. Upon collection of the measurements from the physiological sensor, the sensor measurements derive the peak of the heart contraction and report the time, in milliseconds, between peaks. This derived set of measurements defines the inter-beat intervals or, as commonly known, the R-R intervals. In a non-limiting example, the physiological sensor may be specified as a finger mounted sensor, although other sensors applied to different parts of a user’s body may be equally effective in capturing the sensor measurements. For example, in embodiments the physiological sensors may be wearable, and may include chest straps or patches, smartwatches, wrist or arm wearables, ring wearables, hats, clothing, or other devices that can contact a portion of the body known to one skilled in the art. The wearable sensors may be based on ECG or PPG physiological sensors.
[0159] There is an interest in removing from the HRV system the need to use only a dedicated sensor, or just a physiological sensor, to collect HRV data. There is interest in using an integrated sensor, such as a wearable device that collects HRV data natively (e.g., Apple Watch) or a smartphone or other computer-based camera that facilitates image-based HRV data collection. The HRV system may couple the integrated sensor data, such as pupil dilation information, with other collected data, such as user-supplied blood pressure, and with device data, such as readings from an accelerometer or other sensors installed within an electronic device such as a smartwatch, smartphone, tablet, or other computer-based sensor. Use of existing sensors of the user’s common hardware (e.g., smartphone, smartwatch, laptop, etc.) will provide an expanded access to users and data. Of these sensors, camera finger-based physiology detection sensors are currently in use for collecting HRV data. These sensor readings may be improved by reducing finger movement via reduced reading times or finger stabilization techniques, such as a magnetic accessory that attaches to the finger to stabilize it, among other methods for finger stabilization. Sensor readings may also be improved with the use of machine learning algorithms.
[0160] In some embodiments, camera-based, face-based (e.g., image-based) physiology detection is another mechanism to collect image data that can be used to collect HRV data, and derive HRV scores and other biometric data. The image-based, HRV-derived biomarker data may be combined with other biomarker data, such as heart rate, HRV, blood pressure, oxygen levels, CO2 levels, glucose levels, or ketone levels. Furthermore, the image-based, HRV-derived biomarker data may also provide insight into a user’s general awareness or alertness, stress, reflex time, resilience, training, body temperature, or related capacity or capability, or a combination of the foregoing. The image-based detection may analyze still images, video images, or a combination of both. Use of image/video analysis permits a user to point any camera with high-enough resolution and/or sufficient speed (e.g., frames per second) at his or her face and to detect signals that may be used to estimate heart rate and HRV, as well as other data such as blood pressure, pupil dilation, etc., some or all of which may be combined into a composite score of which HRV data is only a portion. The image/video analysis may operate in situations wherein the image/video data was collected with environmental lighting (e.g., ambient lighting). In use cases where the environmental lighting was low, the image/video analysis may use image-correcting algorithms to adjust the intensity and brightness of the pixels in the frames.
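A minimal sketch of the raw-signal-extraction step described above is shown below, assuming OpenCV is available, using a fixed facial region of interest and a hypothetical video file name. A production pipeline would locate the face automatically and apply the denoising, beat-detection, and artifact-correction stages described elsewhere in this disclosure.

```python
import cv2
import numpy as np

def extract_raw_pulse_signal(video_path, roi=(100, 100, 200, 200)):
    """Return the per-frame mean green-channel intensity over a fixed region of interest.

    roi is (x, y, width, height); in practice a face detector would locate the region.
    """
    cap = cv2.VideoCapture(video_path)
    x, y, w, h = roi
    samples = []
    while True:
        ok, frame = cap.read()
        if not ok:
            break
        patch = frame[y:y + h, x:x + w]
        # OpenCV frames are BGR; channel index 1 is green, which typically carries
        # the strongest pulsatile component in reflective camera-based PPG.
        samples.append(float(np.mean(patch[:, :, 1])))
    cap.release()
    signal = np.asarray(samples)
    # Subtract a moving average so slow lighting drift is removed and the
    # cardiac pulsation dominates the remaining signal.
    baseline = np.convolve(signal, np.ones(31) / 31.0, mode="same")
    return signal - baseline

# pulse = extract_raw_pulse_signal("face_recording.mp4")  # hypothetical file name
```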
[0161] Likewise, remote cameras can be used to determine this data, such as at a sporting event. In almost real-time, it is possible to detect color and intensity signals and other data and use this to provide an HRV score. The image-based data may be used to derive the various composite or customized scores that use HRV data, breathing rates, oxygen levels, blood pressure, pupil dilation, eye movement/blinking rate, emotional state and additional such parameters.
[0162] Face-based detection is important because it can be used more naturally during certain activities, such as driving a car, where finger-over-phone-camera or other contact-based sensing is not possible or not preferable.
[0163] In addition to any existing method of using image data to determine HRV and related biometric data of interest, the HRV system has the ability to use one or more existing databases of HRV data and HRV metadata to improve the accuracy of image-based HRV scoring and other determinations. In a non-limiting example, the HRV system can utilize high quality HRV data and related HRV scores to learn which images or features are associated with the HRV scores. That is, with machine learning, the HRV system can learn which features in an image/video are most closely correlated with higher HRV scores. This may involve the use of one or more machine learning algorithms, for example, to categorize images or image features and associate these categories with particular HRV data and scores.
[0164] The R-R intervals may be transmitted to a system server for storage and analysis using a wireless protocol such as Bluetooth, although this should not be considered limiting, as alternative wireless protocols may be used, such as BLE, Wi-Fi, NFC, ZigBee, or other such protocols developed in the future. The system may have a plurality of software modules that analyze the data to determine hourly measurements, daily measurements, and/or measurements throughout the day for heart rate variability (HRV) in an individual.
[0165] After being digitized, the raw waveform is processed by a beat detection algorithm to determine where true heart beats occurred. For finger-over-camera or face-over-camera PPG measurements, there are two levels of beat detection and artifact detection. First, the raw color signal is de-noised, and then cleaned up prior to generating the pulse waveform and estimating the beats from the signals. Second, after the beat detection, the IBIs are run through another artifact detection algorithm that detects artifacts such as arrhythmias or ectopic beats. For PPG devices that are not the finger-over-camera or face-over-camera, the artifact detection/correction algorithms may be performed directly on estimated IBIs. In other PPG devices, the denoising step may be performed directly on the pulse waveform, followed by estimating the beats and IBIs, and then finally performing the artifact detection step. For ECG signals, the R-R intervals are calculated on the hardware device, and then artifact detection and correction is performed on the R-R interval signal.
[0166] For ECG signals, beat detection algorithms take the form of QRS complex detection algorithms. QRS detection may utilize wavelet analysis or some other pattern-matching system. Algorithms for beat detection for both ECG and PPG vary across devices and software applications. After being processed at this stage, what remains is a series of inter-beat intervals (IBIs). An IBI is simply the amount of time (usually milliseconds) between two subsequent beats. A typical value might be 1,000 ms, which would in turn correspond to an instantaneous heart rate of 60 bpm. Given a noise-free recording under perfect conditions, the IBIs could be used to directly calculate HRV as they are. As such is rarely the case, it is at this point that artifact detection algorithms should be applied to the IBIs to test for any errors or artifacts that may have entered the signal thus far.
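The following sketch illustrates the general beat-to-IBI flow described above for a PPG-type pulse waveform: smooth the waveform, detect beats, derive IBIs, and convert an IBI to an instantaneous heart rate (e.g., 1,000 ms corresponds to 60 bpm). The smoothing kernel and minimum beat spacing are illustrative assumptions, not the proprietary beat detection algorithms.

```python
import numpy as np
from scipy.signal import find_peaks

def ppg_to_ibis(pulse_signal, fs_hz):
    """Detect beats in a de-noised PPG waveform and return inter-beat intervals in ms."""
    # Light smoothing to suppress residual high-frequency noise before peak picking.
    smoothed = np.convolve(pulse_signal, np.ones(5) / 5.0, mode="same")
    # Require peaks to be at least 0.33 s apart (i.e., heart rates below ~180 bpm).
    peaks, _ = find_peaks(smoothed, distance=int(0.33 * fs_hz))
    return np.diff(peaks) / fs_hz * 1000.0

def instantaneous_heart_rate(ibis_ms):
    """Convert IBIs in milliseconds to instantaneous heart rate in bpm (1,000 ms -> 60 bpm)."""
    return 60000.0 / np.asarray(ibis_ms, dtype=float)

# Example: a synthetic 60 bpm pulsation sampled at 30 Hz (a typical camera frame rate).
fs = 30.0
t = np.arange(0, 10, 1 / fs)
synthetic_pulse = np.sin(2 * np.pi * 1.0 * t)  # 1 Hz pulsation corresponds to 60 bpm
print(instantaneous_heart_rate(ppg_to_ibis(synthetic_pulse, fs)))
```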
[0167] In an embodiment, the HRV system software manages connections to multiple sensors, assisting the user in selecting the appropriate sensor for the current measurement. Upon receipt of R-R intervals from the hardware, the Elite HRV software, in real-time (within a second or two), displays the beat patterns and/or pulse waveforms and received data visually to the user for live or real-time biofeedback in the form of calculated heart rate, calculated HRV values, visual charts of heart rate patterns and R-R interval patterns. The Elite HRV software also checks, again in real-time, the received data for accuracy and quality. The data quality checks are based on published research standards (typically done manually by physiologists or research teams), historical population data, and patterns in prior data received within the same reading or session, i.e., beats are analyzed recursively throughout the reading as new beat intervals are received. The HRV system also assists the user visually and algorithmically in identifying when the user's heart rate has stabilized at the beginning of a reading.
[0168] Upon completion of the storage of all R-R interval data, the system is operative to apply the Root Mean Square of Successive Differences (RMSSD) calculation to the R-R intervals. The RMSSD analytical method is the industry standard time-domain measurement for detecting Autonomic Nervous System (specifically Parasympathetic) activity in short-term measurements, where short-term is defined as approximately 5 minutes or less. A natural log (ln) is applied to the RMSSD calculation. RMSSD does not chart in a linear fashion, so it can be difficult to conceptualize the magnitude of changes as it rises and falls. Therefore, it is common practice in the application of RMSSD calculations to apply a natural log to produce a number that behaves in a more linearly distributed fashion.
[0169] The ln(RMSSD) is expanded to generate a useful 0 to 100 score. The ln(RMSSD) value typically ranges from 0 to 6.5. Using over 6,000,000 readings from an existing proprietary database, the system may be able to sift out anomalous readings and create a much more accurate scale where everyone fits in a 0 to 100 range - even Olympians and elite endurance athletes.
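A minimal sketch of the RMSSD to ln(RMSSD) to scaled-score computation follows. The simple linear mapping of the roughly 0 to 6.5 ln(RMSSD) range onto 0 to 100 is an assumption made here for illustration; the actual scale described above is derived from the proprietary database of readings.

```python
import numpy as np

def rmssd(ibis_ms):
    """Root Mean Square of Successive Differences of the R-R intervals, in ms."""
    diffs = np.diff(np.asarray(ibis_ms, dtype=float))
    return float(np.sqrt(np.mean(diffs ** 2)))

def hrv_score(ibis_ms, ln_rmssd_max=6.5):
    """Map ln(RMSSD) onto a 0-100 scale using a simple linear mapping (assumed here)."""
    ln_rmssd = np.log(rmssd(ibis_ms))
    return float(np.clip(ln_rmssd / ln_rmssd_max * 100.0, 0.0, 100.0))

# Example: a short R-R interval series in milliseconds.
print(hrv_score([812, 845, 790, 860, 825, 805, 840]))
```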
[0170] The HRV score may correlate with changes in breathing and respiratory patterns, physical stress, recovery from physical stress, physical performance, psychological stress and health, emotion and mood, cognitive performance, immune system function, inflammation, posture and structural health, injury, biological age, general health and wellbeing, resilience and adaptability, risk of disease, morbidity and mortality, motivation and willpower, and/or digestive stress. The customized HRV score may be transmitted to a medical practitioner or directly to a user, where the medical practitioner or user may compare the customized HRV score, and other non-proprietary HRV parameters, to population data and/or demographic-filtered population data to provide a basis in comparison to a selected population.
[0171] In a non-limiting example, when measuring HRV changes before or after specific events, it is recommended that HRV readings be taken for at least 60 seconds immediately pre- and post- any activity or event. For better accuracy in the HRV readings, it is recommended that the user keep the same body position between readings that the user wishes to compare to past or future readings.
[0172] In an alternative non-limiting example, HRV readings can gather relevant HRV data in as little as 30 to 60 seconds or over durations as long as 24 hours. However, for Morning Readiness type readings, it is recommended by the HRV system that the user take at least a two-minute reading to collect HRV data for that time. Morning Readiness may be estimated from readings of 60 seconds (e.g., 1 minute) to 300 seconds (e.g., 5 minutes), but certain HRV indices (such as frequency domain metrics) are only available for readings over 120 seconds (e.g., 2 minutes). This data collection effort should be performed after the user’s period of longest sleep. Using guidelines transmitted to users from the HRV system, most users will perform a data collection HRV reading of between 60 and 180 seconds in duration. For HRV data collection during meditations or live biofeedback, the HRV system recommends taking data collection readings of between 4 and 20 minutes in duration and repeating as often as required by the user. During this data collection period the user has the option to turn on audio and/or visual cues for guided breathing patterns, mindfulness, and meditations.
[0173] Additionally, the system has a mobile application (app) for use in capturing and transmitting information between the user and the HRV monitoring system. The mobile app currently focuses on providing general data (e.g., heart rate), scaled HRV score, and Morning Readiness score coupled to high-level or general feedback based on the HRV data. Furthermore, the mobile app also provides more detailed HRV metrics for every reading, such as different time domain and frequency domain indices. For example, a Morning Readiness score may be presented as a numeric value and a gauge graphic.
[0174] In an embodiment, the software modules in the system server may be active to manage connections to multiple sensors, assisting the user in selecting the appropriate sensor for any desired measurement. Upon receipt of the R-R intervals from the sensor(s), regardless of the sensor utilized, the system software immediately performs a set of functions in real-time, where real-time is specified as an interval of less than two seconds from the receipt of the R-R interval information.
[0175] Initially, the system software displays the beat patterns and received measurement data visually to the user for live feedback to the user in the form of calculated heart rate, calculated HRV values, visual charts of heart rate patterns and R-R interval patterns. This feedback to the user is also known as biofeedback. Next, the system software is operative to check the received measurement data for accuracy and quality. In a non-limiting example, data quality checks are based upon published research standards, historical population data, and/or patterns in prior data received within the same “reading” or measurement collection activity. Heartbeats are analyzed recursively throughout the reading as new beat intervals are received.
[0176] In an embodiment, readings can be as little as 30 seconds to 60 seconds in duration or as long as 24 hours. For Morning Readiness type readings, recommendations to the user are to take a reading between 60 and 180 seconds in duration, with the average being approximately 120 seconds (2 minutes), after the period of longest sleep, which is typically a morning reading. For meditation actions or live biofeedback for a user, the recommendation is to take a reading of between 4 and 20 minutes in duration, repeating as often as the user desires to foster user actions. The Morning Readiness Score may correlate with day-to-day fluctuations in the nervous system for an individual, highlighting to the user when major changes may have occurred in the body, based on their own unique individual patterns.
[0177] In a non-limiting example, the ranges of the Morning Readiness score provide information to the user on whether the user is in a Sympathetic or a Parasympathetic status on that given day. In this example, values in the 1-3 portion of the range are in the red zone of a gauge as represented on a gauge score graphic. This indicates a wide swing in balance either towards the Sympathetic or Parasympathetic side. A wide acute swing in either direction is usually in reaction to a strong acute stressor or reaching a threshold of accumulated stress. Values in the 4-6 range are in the yellow zone. Yellow indicates a similar, but not as drastic, change in relative balance as a red indication. Yellow days are often nothing to worry about in isolation. Values in the 7-10 range are in the green zone. Green indicates that the relative balance is very close to the user’s norm. The readiness scores use standard deviation thresholds for an individual’s historical (baseline) HRV data to determine the readiness for the day. A perfect 10 score is achieved when the relative balance is slightly Parasympathetic leaning. This means that if the user normally scores around a 45 on the HRV score, then an HRV score of 46 may produce a relative balance score of 10. For example, one user could produce a score of 10 with an HRV of 46 (and a baseline around 45) while another user might produce a 10 with an HRV of 47 (and a baseline around 45), because their historical data produces different thresholds (i.e., the thresholds are personalized to each user).
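The sketch below illustrates the general idea of mapping the day's HRV score onto a readiness value and color zone using standard-deviation thresholds computed from the user's own baseline. The specific thresholds, score values, and the rule that a slight parasympathetic lean yields a 10 are illustrative assumptions rather than the system's actual algorithm.

```python
import numpy as np

def morning_readiness(today_hrv, baseline_hrv_scores):
    """Illustrative readiness estimate comparing today's HRV score to the user's baseline.

    Returns (score 1-10, zone), following the red/yellow/green convention described
    above: small deviations from the baseline are green, moderate deviations yellow,
    and wide swings red. Thresholds and score values are illustrative assumptions.
    """
    baseline = np.asarray(baseline_hrv_scores, dtype=float)
    mean = baseline.mean()
    std = max(baseline.std(ddof=1), 1e-6)
    deviation = abs(today_hrv - mean) / std  # deviation in baseline standard deviations

    if deviation < 0.5:
        # Very close to the norm; a slight parasympathetic lean scores highest here.
        score, zone = (10 if today_hrv >= mean else 8), "green"
    elif deviation < 1.0:
        score, zone = 7, "green"
    elif deviation < 2.0:
        score, zone = 5, "yellow"   # moderate swing in relative balance
    else:
        score, zone = 2, "red"      # wide swing toward either branch
    return score, zone

# A user with a baseline around 45 scoring 46 today lands in the green zone.
print(morning_readiness(46, [45, 42, 48, 44, 47, 43, 46]))
```

Because the thresholds are derived from each user's own baseline variability, the same raw HRV change can yield different readiness values for different users, consistent with the personalized behavior described above.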
[0178] In an embodiment, the sensitivity of the 1-10 relative balance score depends on a user’s individual patterns. If the user often fluctuates widely day-to-day, then the user’s relative balance gauge will become less sensitive to change. If the user’s HRV scores hardly fluctuate at all during the baseline period, the relative balance gauge will become more sensitive to small changes. Additionally, utilizing proprietary data analysis algorithms and machine learning systems the sensitivity to small changes may be increased further permitting greater accuracy for the relative balance score and the reporting of any fluctuations in a user's relative balance gauge.
[0179] In an embodiment, the data analysis results in an instant HRV score and a morning readiness score that can be used for spot checks, and can be used as a parameter to be analyzed over time to determine long-term HRV measurements for an individual. The instant HRV scores are also accumulated and analyzed over time to help physicians and users in tracking HRV, forming a part of the health tracking data for the user. This instant HRV score is also used by professional and elite athletes to analyze their heart rate variability to optimize performance, and may be used by a coach or an automated software algorithm to also create training plans based upon the athlete’s performance as shown by the HRV score. The morning readiness score provides a daily baseline indication for the user. This score is trended and charted over time, to help the user understand how acute, short-term, medium-term, and long-term choices and events impact their score over time.
[0180] In an embodiment, HRV data readings may currently be taken utilizing various devices, where such devices may include a mobile device having network connection capability, such as a smartphone, iPad, tablet, wearable mobile device, laptop, or other mobile device, as well as cameras and sensors incorporated directly into fitness equipment. Data may also be generated by sensors built directly into a mobile device and may not require a connection, wired or wireless, to external sensors. During the reading, the user may also have access to audio and/or visual cues to present guidance on breathing patterns, mindfulness, and meditations.
[0181] In an embodiment, after an HRV data reading is completed, the user, whether a medical professional taking a reading from a patient or a user taking a reading from their own body, may have an opportunity to tag the HRV reading with contextual information. The tag information may be attached to the completed score derived from the HRV reading but does not affect the calculation, analysis, or creation of the completed score. An optimized future system may utilize the tagged HRV reading with contextual information to discover meaningful patterns identified in data analysis or by machine learning algorithms to generate composite metrics that utilize the contextual information in the metric generation. The tag information may be attached to the score record to assist the user in understanding the HRV reading and score data and how the data relates to any goals that have been expressed by the user. In a non-limiting example, a user may add tag information consisting of sleep data, exercise data, mood ratings, questionnaires, custom tags/notes, blood glucose, body weight, or any other data that is useful for assisting the user in achieving their goals.
[0182] Additionally, the user may link an established account maintained on the system server with 3rd party applications and services to automate the collection and display of other types of data that may be associated with the collected HRV reading data and scores, including an established morning readiness score and composite scores reflecting HRV data and ancillary data collected from one or more users.
[0183] In an embodiment, the signal quality of the sensor is analyzed in conjunction with the full data captured by the sensor during the HRV reading action. The system server is operative to create a novel, customized signal quality rating. This signal quality rating may be provided to inform and educate the user on the validity and quality of the reading; that is, the signal quality rating relates to the quality of the data, and therefore to the quality of the calculated values and insights derived from it.
Artifact Identification
[0184] In an embodiment, the signal quality of the HRV measurement apparatus is currently analyzed initially and again with the full collected data from an HRV reading. Currently, a proprietary signal quality rating is provided to the user to educate them on the validity and quality of the reading. This signal quality rating is based on internal research determining the degree of confidence in a result given a certain frequency, total amount, and magnitude of signal artifacts from all sources, as compared to the total duration of the reading and the detected patterns present within the reading. The signal quality score is based on published research standards that have been previously created by physiologists and/or research teams, historical population data that has been collected over time, and patterns in prior data received within the same data collection reading or session.
[0185] In an embodiment, customized scoring may be generated from the analysis of the received signal data upon determination that the received HRV reading data is in a form that is ready for analysis by the receiving device, where the receiving device may consist of a system server, a smartphone with or without an internet connection, and/or fitness equipment having an internet connection or having an embedded analysis software module. This may occur when the received signal data is free of artifacts and signal corruption. While there are many potential sources of signal corruption, the net effect, regardless of corruption source or sensor type, can be classified as one of two fundamental types. Either:
[0186] The beat detection algorithm missed one or more beats that actually occurred (type 1), or
[0187] The beat detection algorithm detected one or more false beats that did not actually occur (type 2).
[0188] Type 1 is sometimes referred to as a false negative, type 2 as a false positive. As will be discussed below, these two types of artifacts have their own distinct waveform patterns and properties, such that the corrupted signal can be analyzed, and oftentimes the impact of corruption can be mitigated or eliminated entirely. It should be noted that ectopic beats can exhibit properties of both false positives and false negatives.
[0189] When artifacts that may detract from or compromise signal quality are detected, there are numerous ways to handle the artifacts to clean or correct them programmatically. The HRV system uses a proprietary blend of algorithms for different scenarios to analyze and clean the signal from the data collection sensor or device, as needed. Attempts to clean the signal and improve the data collection effort may include feedback to the user in certain circumstances. In a non-limiting example, the system could suggest data collection times at which to take readings (as with the Morning Readiness score), postures that avoid generating certain artifacts, sensor placement adjustments, and other measures to reduce artifacts. Moreover, signal clean-up (e.g., filtering techniques) may be applied differentially based on detection of known issues such as incorrect posture.
All artifact detection algorithms presented herein function with the same general logic:
1) Quantify what it means to be “normal”.
2) Define a threshold difference from “normal” at which the value is to be declared artifactual.
[0190] The variation in sophistication between artifact detection algorithms is hidden within the definition of “normal”. For example, the naive approach to artifact detection, known as “simple thresholding”, follows the same overall approach as more effective algorithms, but fails to quantify normalcy in an intelligent way. As suggested by its name, simple thresholding involves selecting some high heart rate value, such as 240 bpm, and some low value, perhaps 20 bpm, and marking any observed value outside of these ranges as artifactual. As one might expect, this system suffers from extremely high false negatives (declares an interval not an artifact when it actually was). One may narrow the values closer to an estimated average, but this only serves to trade false negatives for false positives. The simple thresholding described so far is a static variant, in which the threshold values are not modified per signal. A dynamic variant would be one in which the mean of the signal is calculated, and the thresholds set as mean + c1 and mean - c2 for some constants c1, c2. In fact, this system suffers the same weaknesses as the static scheme.
[0191] In an embodiment, a slightly better approach to the dynamic simple thresholding described above would be to replace the mean with the median, and the c1, c2 values with c1*std, c2*std, where std is the signal's standard deviation. Adding this flexibility to the algorithm helps account for the large difference in non-artifactual variance observed across individuals. Still, this system suffers from critical flaws. Most notably, the standard deviation of a signal is quite sensitive to artifacts itself. Therefore, if the signal contains a large number of artifacts, or a few artifacts of large magnitude, this system will allow the less deviant, but still artifactual, intervals through.
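The two thresholding variants discussed above can be sketched as follows; the cutoff values and constants are arbitrary examples. In the sample series below, which contains both a missed and a spurious beat, neither variant flags anything, which illustrates the weaknesses described above (the artifacts themselves inflate the standard deviation and sit inside the fixed heart-rate bounds).

```python
import numpy as np

def static_threshold_artifacts(ibis_ms, low_bpm=20.0, high_bpm=240.0):
    """Naive 'simple thresholding': flag IBIs outside a fixed heart-rate range."""
    bpm = 60000.0 / np.asarray(ibis_ms, dtype=float)
    return (bpm < low_bpm) | (bpm > high_bpm)

def dynamic_threshold_artifacts(ibis_ms, c1=3.0, c2=3.0):
    """Median +/- c*std variant: flag IBIs far from the signal's own median."""
    ibis = np.asarray(ibis_ms, dtype=float)
    med = np.median(ibis)
    std = np.std(ibis, ddof=1)
    return (ibis > med + c1 * std) | (ibis < med - c2 * std)

# A series containing a missed beat (1620 ms) and a spurious beat (400 ms).
ibis = [810, 805, 1620, 800, 795, 400, 812]
print(static_threshold_artifacts(ibis))   # flags nothing: 37 bpm and 150 bpm pass
print(dynamic_threshold_artifacts(ibis))  # flags nothing: the artifacts inflate the std
```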
[0192] Two innovations in the “normalize and threshold” schema produce significant improvements in detection accuracy. The first innovation is to analyze not the IBI intervals themselves, but rather the differences between subsequent intervals. This strategy minimizes the negative impact of valid local variations in heart rate, while retaining the ability to capture artifact-generated spikes or impulses. The second innovation is quantile-based threshold determination. The Berntson algorithm is an industry standard which utilizes both IBI difference analysis and quantile thresholding to good effect. This algorithm assumes a normal distribution of beat differences in order to calculate the Maximum Expected Difference (MED) for veridical beats, as well as the Minimum Artifactual Difference (MAD):
IQR = Q3 - Q1
QD = IQR / 2 = SD / 1.48 (assuming a Gaussian distribution)
MED = 3.32 * QD
MAD = (Median - 2.9 * QD) / 3
[0193] Where QD is the quartile deviation of the IBIs. The artifact cutoff threshold is then taken as a mean of the two values, which, given normally distributed IBI differences, will cover at least 97.5% of artifact-related differences, though in practice the number is often higher. Additionally, we have made two modifications to the Berntson algorithm in response to empirical testing on data from our user base. The first modification is regarding the logic which marks artifactual beats given threshold-exceeding IBI differences. In principle, the Berntson algorithm marks pairs of IBIs, not individual IBIs, which can be seen as one cost-of-difference-based method. The second modification is a set of heuristics for identifying contiguous runs of artifactual beats. While uncommon in most test data sets, the reality of consumer heartbeat data is that it frequently contains sequences of spurious beats due to motion artifacts. For any artifact detection method based on IBI difference this presents a problem, since for a run of 3 or more artifactual IBIs, the outermost ones may have threshold-exceeding differences, while the inner ones may not.
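A simplified sketch of the quantile-based threshold follows, computing QD, MED, and MAD exactly as in the equations above (with QD taken over the IBIs) and flagging the pair of intervals around each threshold-exceeding difference. The modified marking logic and run-detection heuristics described above are intentionally omitted.

```python
import numpy as np

def berntson_artifact_flags(ibis_ms):
    """Flag IBIs around successive differences that exceed the quantile-derived criterion."""
    ibis = np.asarray(ibis_ms, dtype=float)
    diffs = np.abs(np.diff(ibis))

    q1, q3 = np.percentile(ibis, [25, 75])
    qd = (q3 - q1) / 2.0                          # quartile deviation of the IBIs
    med = 3.32 * qd                               # Maximum Expected Difference
    mad = (np.median(ibis) - 2.9 * qd) / 3.0      # Minimum Artifactual Difference
    criterion = (med + mad) / 2.0                 # mean of the two values

    flags = np.zeros(len(ibis), dtype=bool)
    exceeds = diffs > criterion
    flags[:-1] |= exceeds   # interval before the offending difference
    flags[1:] |= exceeds    # interval after the offending difference
    return flags

# The 1620 ms interval (a likely missed beat) and its neighbors are flagged.
print(berntson_artifact_flags([810, 805, 1620, 800, 795, 812, 808]))
```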
[0194] Another artifact detection technique with traction in the literature is based on impulse response detection. The strategy is to calculate a series of deviations from the median in order to detect unusually large impulses, then normalize each of these differences with another median derived value specific to each RR series.
[0195] A windowed version of this algorithm enhances accuracy by cutting the target series into overlapping windows and calculating the median and normalization factor for each window separately. It also sets the overlap factor such that each value (except the first few) is tested at least twice. The series of normalized differences is calculated as follows:
Xj(h) = (Wj(h) - Wmj) / med{ |Wj(h) - Wmj| }
[0196] Where Xj(h) is the normalized difference from the median of the hth element in the jth window, and Wmj is the median of the jth window. Note that the median in the denominator is calculated once for the entire window.
[0197] In an embodiment, a pattern-based windowed impulse response (PWIR) algorithm was tested for which good performance on non-pathological R-R datasets was reported. PWIR functions similarly to WIR except that the sign of differences from the median is preserved in order to be able to match specific artifact shapes/patterns. Patterns fall into three categories that determine the appropriate corrective action to perform on an artifactual RR. Possible corrective actions include interpolation and recovery of split intervals via addition. The benefit of this method is that it tests not only the magnitude of an impulse, but the shape formed by every four consecutive samples. This allows for stricter threshold values without a major increase in false positives.
Categories:
1. Missed beat
   a. Shows as a positive single spike in the RR series.
   b. i.e., a false negative.
2. Spurious beat
   a. Negative spike with 1+ points (generally 2 points).
   b. i.e., a false positive.
3. Ectopic beat
   a. Single-point negative spike followed by a single-point positive spike.
   b. The ectopic beat prevents the normal beat from occurring, and then the subsequent beat comes on the original schedule.
[0198] Note that PWIR seems not to be designed for data contaminated with significant motion artifacts, as the case of evenly split spurious beats is unhandled. While the authors clearly state that it is intended for certain usage, that should not be much consolation to developers whose applications consume data from disparate and unpredictable sources.
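A sketch of the windowed, median-normalized difference test described above follows, computing Xj(h) over overlapping windows with 50% overlap. The window length and threshold are illustrative choices, and the PWIR pattern matching and corrective actions are not reproduced.

```python
import numpy as np

def windowed_impulse_flags(ibis_ms, window=30, threshold=5.0):
    """Flag samples whose median-normalized deviation within any window exceeds threshold.

    For each overlapping window W_j this computes
        X_j(h) = (W_j(h) - median(W_j)) / med{ |W_j(h) - median(W_j)| }
    with a 50% overlap so that most samples are tested in at least two windows.
    """
    ibis = np.asarray(ibis_ms, dtype=float)
    flags = np.zeros(len(ibis), dtype=bool)
    step = max(window // 2, 1)

    for start in range(0, max(len(ibis) - window + 1, 1), step):
        w = ibis[start:start + window]
        deviations = w - np.median(w)
        norm = np.median(np.abs(deviations))
        if norm == 0:
            continue
        flags[start:start + window] |= np.abs(deviations / norm) > threshold
    return flags

# Example: a fairly steady series with one spurious short interval injected at index 25.
rng = np.random.default_rng(0)
series = 800 + rng.normal(0, 5, 60)
series[25] = 400
print(np.where(windowed_impulse_flags(series))[0])
```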
[0199] The final algorithm tested is one based upon the Integral pulse frequency modulation (IPFM) model of heart rate variability. The IPFM model, also called the “integrate and fire” model, describes the beating of the heart in terms of sympathetic and parasympathetic inputs to the sinoatrial (SA) node via a modulating function of time: m(t). The model states that the SA node accumulates these inputs until reaching a critical threshold, at which point a heartbeat is triggered and the integrator resets.
∫[t_k, t_(k+1)] (1 + m(t)) dt = Psi
[0200] Where Psi is the critical threshold. According to IPFM, the heart beat timing differences are band-limited by the modulating signal, which is itself band-limited. The present algorithm exploits this limitation by estimating the derivative of the instantaneous heart rate signal, and determining where it exceeds a threshold derived from the IPFM model.
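A hedged sketch of the derivative-thresholding idea follows: estimate the instantaneous heart rate at each beat, differentiate it numerically with respect to time, and flag beats whose rate of change exceeds a band limit. The fixed limit used here is an assumed placeholder; the approach described above derives its threshold from the IPFM model itself.

```python
import numpy as np

def derivative_artifact_flags(ibis_ms, max_dhr_bpm_per_s=15.0):
    """Flag beats where the instantaneous heart rate changes faster than a band limit.

    Under the IPFM model the modulating input m(t) is band-limited, so the true
    heart rate cannot change arbitrarily fast. The fixed limit used here is an
    assumed placeholder rather than a threshold derived from the model.
    """
    ibis_s = np.asarray(ibis_ms, dtype=float) / 1000.0
    beat_times = np.cumsum(ibis_s)          # time of each detected beat, in seconds
    hr_bpm = 60.0 / ibis_s                  # instantaneous heart rate at each beat
    dhr = np.gradient(hr_bpm, beat_times)   # numerical derivative with respect to time
    return np.abs(dhr) > max_dhr_bpm_per_s

# The 1620 ms interval (heart rate dropping to ~37 bpm) produces a flagged derivative.
print(derivative_artifact_flags([810, 805, 1620, 800, 795, 812]))
```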
[0201] The premature contractions known as ectopic beats are common even in healthy individuals. These contractions can be either ventricular or atrial in origin, and are quite distinguishable from normal beats on an ECG. Ectopic beats can be further classified into ventricular ectopic beats, in which the normal heart beat following the artifactual beat occurs “on schedule”, and supraventricular ectopic beats, in which the normal heart beats are effectively “reset” by the artifactual beat. Known patterns such as these can be exploited by artifact detection algorithms. Additional physiological sources of artifacts include atrial fibrillations, ventricular fibrillations, and muscle contractions.
[0202] Other artifact sources can be classified as “technical” in that they arise from shortcomings or improper application of the sensor technology. Among these, “movement artifacts” are the most problematic. Because both electrical (ECG) and optical (PPG) sensing modalities work with minute signals, any variations in the distance from the sensor to the surface of the user’s skin can induce variations in the signal which are indistinguishable from heart beats, or otherwise reduce the signal-to-noise ratio such that the true heart period information is not recoverable. Poorly fastened electrodes or poorly placed LED PPG sensors can exacerbate this problem. Various algorithms exist which attempt to filter out movement artifacts via correlation with a concomitant accelerometer signal, though these algorithms vary between sensor platforms and are not within the scope of this disclosure. Additionally, there is no accepted standard for QRS complex detection, thus introducing the risk of poorly designed algorithms for an unvetted sensor platform. Beyond the detection of peaks, many sensor platforms attempt to clean up the signal by applying smoothing filters to the IBIs. While this practice can improve heart rate measurement stability, it can highly distort HRV metrics.
Artifact Correction
[0203] In order to further evaluate the efficacy of various artifact detection algorithms, the system continues with the heart rate data analysis towards the ultimate end of extracting a useful time, frequency, or nonlinear parameter. Before extracting the parameter of interest, though, artifact annotations supplied by the previously discussed algorithms are utilized. The naive approach to artifact correction is to delete the artifactual intervals. While this strategy is acceptable for the evaluation of certain time-domain parameters, in particular SDNN and SDANN, it induces significant error in other cases. Frequency-domain parameters are particularly sensitive to interruptions in the signal.
[0204] Generally speaking, artifactual R-Rs are interpolated rather than deleted. Performant interpolation methods described in the literature include i) linear interpolation, ii) non-linear predictive interpolation, and iii) cubic spline interpolation. While different effects have been reported for varying interpolation methods depending upon the parameter of interest and data source, the difference between non-deletion interpolation methods may not be significant at artifact rates below 5%. In the system signal processing pipeline, multiple interpolation methods are implemented, with the specific choice determined by which HRV parameter is being calculated. To fairly compare the deleterious impact of artifacts on an HRV metric, a consistent method is used, such as, in a non-limiting example, cubic spline interpolation, for the annotations produced by all three detection algorithms (e.g., i) - iii) above) under review.
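A minimal sketch of interpolation-based correction follows: intervals flagged as artifactual are replaced with cubic-spline estimates computed from the surrounding clean intervals rather than deleted. The helper names are illustrative.

```python
import numpy as np
from scipy.interpolate import CubicSpline

def correct_artifacts(ibis_ms, artifact_flags):
    """Replace flagged IBIs with cubic-spline estimates from the surrounding clean IBIs."""
    ibis = np.asarray(ibis_ms, dtype=float)
    flags = np.asarray(artifact_flags, dtype=bool)
    idx = np.arange(len(ibis))
    if np.count_nonzero(~flags) < 4:   # need enough clean points to fit a cubic spline
        return ibis.copy()
    spline = CubicSpline(idx[~flags], ibis[~flags])
    corrected = ibis.copy()
    corrected[flags] = spline(idx[flags])
    return corrected

# The flagged 1620 ms interval is replaced with an estimate near its clean neighbors.
ibis = [810.0, 805.0, 1620.0, 800.0, 795.0, 812.0, 808.0]
flags = [False, False, True, False, False, False, False]
print(correct_artifacts(ibis, flags))
```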
[0205] In addition to manually designing algorithms to reduce signal noise and improve HRV scoring from lower quality data, a large amount of historical HRV user data may be leveraged to provide more accurate HRV scores using lower quality data or less data. This additional data analysis allows HRV scoring to be completed in a shorter time span, or to a higher quality, than would otherwise be possible or optimal. In a non-limiting example, a user can provide less HRV data or provide HRV data of lower quality and receive a valid HRV score, perhaps tempered with a score quality or confidence rating.
[0206] The HRV system may use machine learning to associate the presently input HRV data with a particular class or category based on a model trained with previously recorded HRV data and scores. A user’s input of lower quality HRV data (whether due to sensor type use or amount of HRV data collected) may be insufficient to assign an HRV score using normal processing via a static algorithm. However, the lower quality input HRV data may be input to a machine learning algorithm trained using the pre-existing HRV data currently collected and stored in the HRV system’s database. The HRV system may optionally utilize contextual data or a composite of signals to boost the quality of the collected HRV data and thus provide an HRV score of higher confidence in terms of accuracy and quality. In a non-limiting example, the machine learning algorithm may be trained using historical HRV data from a validated sensor. This permits the software to provide an HRV score even though the data collected typically would be insufficient. A reported quality score may indicate the technique used or the confidence of the score reported.
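The sketch below illustrates the general approach in simplified form: features are extracted from short or noisy readings, a model is trained against reference scores derived from longer, validated-sensor readings, and the spread of predictions is reported as a rough confidence proxy. The features, synthetic training data, and model choice are illustrative assumptions and do not represent the system's trained models or database.

```python
import numpy as np
from sklearn.ensemble import RandomForestRegressor

def hrv_features(ibis_ms):
    """Simple illustrative features computed from a (possibly short or noisy) IBI series."""
    ibis = np.asarray(ibis_ms, dtype=float)
    diffs = np.diff(ibis)
    return [
        np.mean(ibis),                   # mean IBI
        np.std(ibis, ddof=1),            # SDNN-like dispersion
        np.sqrt(np.mean(diffs ** 2)),    # RMSSD on the available segment
        np.mean(np.abs(diffs) > 50.0),   # fraction of large successive differences
    ]

# Synthetic stand-in for the historical database: each training example pairs features
# from a short, lower-quality segment with a reference score derived from a longer
# "validated sensor" reading of the same underlying variability.
rng = np.random.default_rng(1)
X_train, y_train = [], []
for _ in range(300):
    sigma = rng.uniform(5, 60)                        # underlying beat-to-beat variability
    long_reading = 800 + rng.normal(0, sigma, 300)    # long, validated reference reading
    short_reading = long_reading[:30]                 # short / lower-quality sample
    reference_rmssd = np.sqrt(np.mean(np.diff(long_reading) ** 2))
    reference_score = float(np.clip(np.log(reference_rmssd) / 6.5 * 100, 0, 100))
    X_train.append(hrv_features(short_reading))
    y_train.append(reference_score)

model = RandomForestRegressor(n_estimators=100, random_state=0).fit(X_train, y_train)

# Estimate a score for a new short reading, using the spread across trees as a
# rough confidence proxy to accompany the reported value.
new_short = 800 + rng.normal(0, 25, 30)
per_tree = np.array([tree.predict([hrv_features(new_short)])[0] for tree in model.estimators_])
print(f"estimated HRV score: {per_tree.mean():.1f} (spread across trees: {per_tree.std():.1f})")
```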
[0207] It is worth reviewing the causal sources of artifacts in order to best anticipate and handle them. Artifact sources can be thought of as being either physiological or technical in origin. Physiological artifacts occur when an electrical impulse is generated by some mechanism other than the depolarization of the heart’s sinoatrial node.
[0208] In a non-limiting example, a novel, customized HRV score may be calculated from the analysis of the received HRV reading data. The system may receive the R-R intervals directly from a chest strap heart rate monitor or other sensor device attached to a user. Obvious artifacts within the data, such as readings that are out of bounds, obviously incorrect, or corrupted, are cleaned and/or removed. The raw, unaltered R-R intervals are backed up securely to an electronic database maintained within the system server. This allows for optimization and improvement of algorithms for all current and past calculations, as well as for the export of the raw, unaltered R-R intervals to a different system or storage location if desired by the medical practitioner or user.
[0209] In embodiments, an additional novel and proprietary score, the morning readiness score, may be prepared by the system and transmitted to a user on a daily basis, in the morning and based upon a morning readiness HRV reading performed by the user. The Morning Readiness gauge indicates a user’s state of relative balance. In other words, it is comparing the user’s HRV values to the recent past and providing a comparison for the user on whether the user’s Autonomic Nervous System (ANS) is in a similar state or if it is swinging widely outside of the norm for the user.
[0210] In embodiments, additional scores may be calculated based upon additional physiological data in addition to collected HRV data. Such additional physiological data, such as image data, environmental data, historical health data, or other biometric data, may form the basis for one or more biological health scores that include HRV data as a particular component. The collection of camera image-based physiological data may be performed utilizing visual light and/or infrared cameras pointed at the face of one or more users. The collected image data may provide insight into biometric and/or biomarker data such as heart rate, blood pressure, temperature, oxygen levels, CO2 levels, glucose, and ketones, as well as insight into a user’s general awareness or alertness, stress, reflex time, resilience, or a combination of any of these data categories. The resultant combined score may include HRV data or may consist of collected biometric data as a corollary to a calculated HRV score.
[0211] In embodiments, the HRV system may provide users with the benefit of score and performance analysis to assist in predicting success with short-term and long-term physical goals and recommendations and suggestions on how to achieve identified user goals.
[0212] To obtain such predictions and recommendations from the HRV system, users can submit data related to their goals, plans, HRV readings, and outcomes, and utilize the HRV system to identify and/or formulate optimal plans to achieve their desired goals. In a non-limiting example, such a recommendation may take the form of a general training plan or a training plan customized for the individual. Community members may vote on these plans to surface the best plans, which could be promoted to users, e.g., based on HRV data similarity to those that have completed the plans.
[0213] In embodiments, the HRV system may also provide Artificial Intelligence (AI) enhanced and implemented performance predictions and plan suggestions. These predictions and plan suggestions may take the form of a virtual coach, specifically incorporating HRV data as an input. These AI-suggested plans or virtual coaches may take the place of user-submitted plans. To implement AI-suggested plans, the HRV system may develop machine learning algorithms that take user profile data, including HRV data, and use it to predict the type or level of exercise to suggest to the user to achieve a specific goal. Similarly, this profile data, including HRV data, may be used to predict performance during an activity, such as running or biking. Additional types of program suggestions could be implemented outside of the health and fitness domain while still making use of HRV data. The additional program suggestions may realize the benefit of the scoring provided by the HRV system to create a service for users.

[0214] The HRV system may leverage its ability to accurately analyze HRV data as a service to others. In a non-limiting example, the HRV system may offer a scoring service by which the HRV system receives HRV data collected by a third-party application, analyzes the third-party collected HRV data as a service to a user or third-party entity, and outputs the analysis to the third-party app for use by the third-party app. This service offering may include receiving and ingesting the collected raw HRV data as a cloud service or offering an API to third parties for data ingestion, processing the raw data, and outputting proprietary score(s) to the requesting application. This service may be offered by the HRV system and used to operate on a variety of different input data types and produce a variety of different HRV based scores, system and application modifications, or data displays.
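A minimal sketch of such a third-party scoring endpoint is shown below, using Flask; the route name, payload fields, and the simple RMSSD response are illustrative assumptions and do not reproduce the proprietary scores or the actual service contract.

```python
from flask import Flask, request, jsonify
import numpy as np

app = Flask(__name__)

@app.route("/v1/hrv/score", methods=["POST"])   # endpoint path is hypothetical
def score_third_party_reading():
    payload = request.get_json(force=True)
    rr = np.asarray(payload.get("rr_intervals_ms", []), dtype=float)
    rr = rr[(rr > 300) & (rr < 2000)]                    # simple out-of-bounds cleaning
    if rr.size < 2:
        return jsonify({"error": "insufficient data"}), 400
    rmssd = float(np.sqrt(np.mean(np.diff(rr) ** 2)))    # time-domain HRV metric
    return jsonify({"rmssd_ms": round(rmssd, 1),
                    "quality": "ok" if rr.size >= 30 else "low"})
```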
[0215] In an embodiment, the HRV system may also provide trend and analysis information based upon HRV data collected and scores derived from the HRV data collected. In a nonlimiting example, the Morning Readiness score calculated by the HRV system may provide a daily baseline indication for the user. The Morning Readiness score is trended and charted over time to help the user understand how acute, short-term, medium-term, and long-term choices and events impact the score over time. In another example, the HRV Score and other data and parameters can be charted and analyzed longitudinally, as well as for each individual reading.
[0216] The large amount of existing HRV user data may allow the HRV system to provide more specific guidance to users in view of the user's trend data. In a non-limiting example, the HRV system can discover, either utilizing a manual review or an automated machine learning process, that prior users exhibiting a similar trend had a positive or negative outcome after making certain adjustments. These data insights can form the basis of customized feedback for users given their data trends, desired outcomes, and past user experiences. In this non-limiting example, the HRV system may associate a current user's trend data and a stated goal (e.g., mental health, weight loss, etc.) with other users having similar trend data, known modifications (e.g., increased exercise, decreased sleep, etc.), and the same or similar stated goal. Having this information, the HRV system software can suggest changes that have been helpful for past members and provide cautionary information about modifications or continuations of the same behavior that have historically been harmful or negative for members.

[0217] Additionally, the HRV data may indicate inflammation in the body and may be analyzed to create an inflammation score for tracking adverse conditions, also forming a portion of the tracked health data. Also, in addition to analyzing a user's own data, the user has the option to link their data to a team, where a coach, wellness practitioner, or medical practitioner may view the data.
[0218] In addition to requests to the HRV system for analysis of their own data, users have the option to link their own collected data to a team or group, where a coach or healthcare practitioner can view the data. The coach or healthcare practitioner in turn may have access to team level and individual team member level HRV based feedback, such as proprietary scores, customized modifications to training plans, etc. This allows the users, e.g., coaches, trainers, healthcare professionals, to access customized guidance and recommendations for clients, patients, etc., e.g., at the team or organization level, subgroups within the team or organization, or individual team or organization members. This permits group leaders to have access to HRV data of the team or group and associated HRV-based guidance with increasing specificity. In a non-limiting example, a CrossFit gym may obtain an HRV-based suggested modification (e.g., color coded Green/Yellow/Red indication) to the workout of the day (WOD) for individual users or groups of users. This would allow a personal trainer to understand which users are capable of strenuous, moderate, or light exercise that day and have access to suggested modifications to the workouts. These modifications may be selected based on global data (e.g., other users having similar HRV readings or trends) or more specific data, e.g., coach or healthcare professional modifications matched to HRV recommendation categories.
[0219] In this example, a matrix display may be provided for dynamically organizing team or group members per HRV system-based suggestions or modifications. A variety of user interfaces and functionalities may be provided in connection with a team-based view. To support this view the HRV system may provide a capability to sort team members by HRV-based workout intensity recommendation. In an exemplary embodiment, a matrix may be displayed organizing the team or group members into columns and rows, such as one user per row, with a color coded (or otherwise indicated) HRV based modification, along with an HRV score in associated columns. These HRV system-based modifications may be paired with predetermined, customized guidance per user, such as that input by a coach, health practitioner, etc. As above, the matrix can be re-organized to dynamically group users via various modalities. The HRV system may prepare the matrix listing users per sub-group (e.g., offensive and defensive positions), based on HRV scores (or ranges), based on modifications, or based on any grouping that provides useful information to the user.
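An illustrative sketch of such a sortable team matrix follows; the column names, example rows, and Green/Yellow/Red ordering are assumptions made for the example.

```python
import pandas as pd

# Categories and example rows are illustrative; real data comes from the HRV system.
intensity_order = {"Green": 0, "Yellow": 1, "Red": 2}

def team_matrix(members):
    """Build a team view (one user per row) sorted by HRV-based workout
    recommendation, then by HRV score within each recommendation."""
    df = pd.DataFrame(members, columns=["user", "subgroup", "hrv_score", "recommendation"])
    df["rank"] = df["recommendation"].map(intensity_order)
    return df.sort_values(["rank", "hrv_score"], ascending=[True, False]).drop(columns="rank")

# Example: team_matrix([("A. Smith", "offense", 72, "Green"), ("B. Jones", "defense", 41, "Red")])
```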
[0220] In embodiments, the HRV system may be implemented utilizing a finger sensor based on LEDs that collect PPG data. The finger sensor uses three LEDs (infrared, red and green). The LEDs are paired with sensors (detectors) on opposing sides. The LEDs cycle to attempt to obtain a strong reading, which assists in handling user differences (skin tone, cardiac patterns, etc.). The LEDs take readings at 500 MHz. The current sensor can measure other data, such as pulse oximetry data, in addition to HRV data. However, the HRV system data collection readings may be performed utilizing other sensor devices including gaming input devices, AR/VR gloves, or other physical sensors. The HRV system may accept HRV data collected by any available hardware device that provides sufficient signal quality to collect the HRV data at acceptable sample rates.
[0221] In embodiments, instead of utilizing an external sensor the HRV system may collect HRV data using an integrated sensor such as a wearable device that collects HRV data natively. Examples of such integrated sensors may include devices such as, in a non-limiting example, an Apple Watch, or a smartphone or other computer-based camera that facilitates image-based HRV data collection, coupled with other data collection (e.g., blood pressure, pupil dilation, device data such as accelerometer data, etc.). Use of existing sensors of the user's common hardware (e.g., smartphone, smartwatch, laptop, etc.) may extend the ability to collect HRV data more conveniently and provide more users and data. Of these sensors, cameras and finger-based physiology detection sensors are among the few currently viable options. These have been used by others for obtaining HRV data. These sensors may be improved by reducing finger movement via reduced reading times or finger stabilization techniques, e.g., utilizing a magnetic accessory that attaches to the finger to stabilize it, etc.
[0222] In embodiments, in addition to suggested modifications to work out plans or health or wellness treatments, such data can be used to validate treatments, for display or feedback by gamers or those watching a live streaming event. The data may be utilized in an office to determine when employees should take breaks, to guide meditation or breathing practices using live, real-time feedback, to create, modify or evaluate the efficacy of corporate wellness programs, and in stress level monitoring. The HRV data may also be used in content recommendation systems, to enhance sports broadcasting and news broadcasting, or to modify the behavior of systems or devices, such as the behavior of automated vehicles, self- service kiosks, gaming systems, advertisement or content selection systems, smart home devices, office furniture, etc. In a non-limiting example, the user’s detected HRV data or a score using the collected HRV or other data may be used to influence advertisement selection (alone or in combination with other contextual data, e.g., GPS location of the user’s device) or to influence music selection systems to change music based on a user’s determined mood or goal for the day (and the current progress towards that goal). The collected HRV data could in turn be fed into other device applications, e.g., virtual assistants or smart home devices to adjust their recommendations, tone, etc., or to adjust office furniture, room temperature, ergonomics, and sleep environment. As noted above, such scores or suggested modifications may be provided as a service to various third-party applications and devices.
[0223] In embodiments, the HRV system may collect and accumulate HRV data, HRV scores, camera sensor-based image data, and/or other biometric data for a user. This data may be acquired from various sensors and sources of data and stored in an electronic storage apparatus. The HRV system may then look to the user to define a goal with regard to improving one or more HRV and/or combined HRV and composite data scores that the user wishes to achieve and provide a suggested activity to achieve that goal. The activity suggested for the user may be one part of a predetermined plan based at least in part on a model trained using HRV data, HRV scores, camera sensor-based image data, and/or other biometric data.
[0224] In embodiments, the HRV system may obtain biometric data for a plurality of users, including HRV data collected from a sensor, and utilize the biometric data to train a model utilizing machine learning algorithms. The machine learning algorithms may classify the biometric data into one or more predetermined classes. This collected data may then be analyzed by the HRV system to predict an HRV score, or a composite score, during or after completion of a pre-established activity. The HRV and composite data may be normalized to reflect population trends and help a user understand their particular scores as compared against population averages and norms.
[0225] In embodiments, the HRV system provides a method for biometric monitoring and scoring in which the HRV system actively collects biometric data from a plurality of users. The collected data comprises physiological data and environmental data associated with said plurality of users over time, and all collected data is stored in an electronic storage system. The HRV system then analyzes all collected data to create a composite score that is based at least on heart rate activity data, biometric and/or biomarker data, and environmental data. The composite score may be compared against historical composite scores to determine activity modifications that will impact the behavior of any of a plurality of users prior to collecting additional biometric data. The HRV system may present these activity modifications to the plurality of users on a display device such as a wearable device, smartphone, or other mobile device in combination with recommended actions to accomplish said activity modifications.
[0226] In embodiments, the HRV system may utilize one or more signal cleaning algorithms to detect artifacts in the collected data and improve the collected data by removing any detected artifacts that impair the signal quality of the biometric and/or physiological data. Additionally, the signal cleaning of the collected data may be performed during a data collection action and the cleaned collected data is stored in an electronic format prior to analysis of said cleaned collected data.
[0227] In embodiments, the HRV system may track the composite score over a preconfigured time span. The composite score may comprise at least HRV data, biometric data, physiological data, and environmental data associated with a single user or a group of users. The HRV system may receive from a user or medical practitioner a threshold composite score or composite score range that is preferred for the user to maintain. The HRV system may transmit to a user recommended actions comprising events, interventions, and/or planned steps to help the user maintain said user's particular composite score.
[0228] In an embodiment, the HRV system may utilize any of a finger sensor, an LED sensor, a chest-strap electrocardiogram sensor, a camera, or sensors contained within or attached to a mobile device associated with a user. Once calculated, the composite score may be presented to a user as a numeric value and a gauge graphic to permit the user to visually understand changes in the composite score over time. Additionally, the composite score, recommendations, and guidance may be provided as a report, as part of an ongoing data display, or in real-time as live biofeedback to a user during an activity.

Machine Learning
[0229] In embodiments, a machine learning model may be trained to learn the patterns in the sequence of video frames from finger-over-camera or face-over-camera images that are associated with the blood volume change in the tissue. The training data may be pairs consisting of video frames recorded from finger-over-camera or face-over-camera images as input and R-R intervals from an HRV sensor used as the desired output, or ground-truth data. This input and output pair may need to be synchronized before training the model by using signal processing peak detection methods. A longer video can be subdivided into shorter videos using a moving time window. The shorter subdivided video segments may be fed into the model to generate R-R intervals. This model may be a deep learning network, such as a CNN (Convolutional Neural Network).
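A minimal Keras sketch of such a frames-to-R-R model is shown below; the window length, layer widths, and number of target intervals per window are illustrative assumptions, not the trained production architecture.

```python
from tensorflow.keras import layers, models

FRAMES, H, W, C = 150, 64, 64, 3     # e.g., ~5 s of 30 fps finger-over-camera video (assumption)
N_INTERVALS = 6                      # target R-R intervals per window (assumption)

def build_frame_to_rr_cnn():
    inputs = layers.Input(shape=(FRAMES, H, W, C))
    x = layers.TimeDistributed(layers.Conv2D(16, 3, activation="relu"))(inputs)
    x = layers.TimeDistributed(layers.GlobalAveragePooling2D())(x)   # one feature vector per frame
    x = layers.Conv1D(32, 5, activation="relu")(x)                   # temporal convolution across frames
    x = layers.GlobalAveragePooling1D()(x)
    outputs = layers.Dense(N_INTERVALS)(x)                           # predicted R-R intervals in ms
    model = models.Model(inputs, outputs)
    model.compile(optimizer="adam", loss="mae")
    return model
```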
[0230] In embodiments, the machine learning model may include supervised learning. A supervised learning model may be used to eliminate the need to manually extract features or to do feature engineering. For example, a supervised learning model may be trained to learn to extract the features from the video frames such that it is possible for the video data stream to be incorporated into the machine learning model.
[0231] In embodiments, a machine learning model may be trained to utilize a series of R-R intervals (e.g., a data stream of R-R intervals) as input into the machine learning model, and to produce as an output a time-domain or frequency-domain HRV metric (e.g., biomarker) such as an HRV Score, an RMSSD value, and/or a high-frequency power value. This machine learning model may be an LSTM (Long Short Term Memory) model that can learn time-dependent patterns in a time-series signal.
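An illustrative Keras sketch of an LSTM that maps a fixed-length R-R sequence to a time-domain HRV metric follows; the sequence length and layer sizes are assumptions.

```python
from tensorflow.keras import layers, models

SEQ_LEN = 60    # number of consecutive R-R intervals per training example (assumption)

def build_rr_to_hrv_lstm():
    inputs = layers.Input(shape=(SEQ_LEN, 1))        # each timestep is one R-R interval (ms)
    x = layers.LSTM(64, return_sequences=False)(inputs)
    x = layers.Dense(32, activation="relu")(x)
    outputs = layers.Dense(1)(x)                     # predicted RMSSD (or another HRV metric)
    model = models.Model(inputs, outputs)
    model.compile(optimizer="adam", loss="mse")
    return model
```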
[0232] In embodiments, the machine learning model may be trained to detect and remove artifacts from a data stream of R-R interval signals. The artifacts to be removed may be caused by motion, arrhythmias, premature ectopic beats, atrial fibrillation, measurement and/or signal noise.
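A simple stand-in for such artifact flagging is sketched below; it is not the Modified Berntson, IPFM, or PWIR algorithm discussed later, but a basic local-median heuristic with assumed window and threshold values.

```python
import numpy as np

def flag_rr_artifacts(rr_ms, window=11, rel_threshold=0.25):
    """Flag R-R intervals that deviate from a local median by more than a
    relative threshold; window and threshold are illustrative choices."""
    rr = np.asarray(rr_ms, dtype=float)
    flags = np.zeros(rr.size, dtype=bool)
    half = window // 2
    for i in range(rr.size):
        local = rr[max(0, i - half): i + half + 1]
        med = np.median(local)
        flags[i] = abs(rr[i] - med) > rel_threshold * med    # likely ectopic or missed-beat artifact
    return flags
```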
[0233] In embodiments, a hybrid machine learning model may be trained. For example, a stacking model may incorporate a CNN (convolutional neural network) model stacked with an LSTM (e.g., long short term memory) model to detect an HRV value directly from the video input. The stacking model may be formed by stacking the CNN model on the LSTM model, or by stacking the LSTM model on the CNN model.
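A minimal sketch of a CNN stacked with an LSTM (per-frame CNN features feeding a recurrent layer) is shown below; shapes and layer widths are illustrative assumptions.

```python
from tensorflow.keras import layers, models

FRAMES, H, W, C = 300, 64, 64, 3     # e.g., ~10 s of 30 fps video (assumption)

def build_cnn_lstm_hrv():
    inputs = layers.Input(shape=(FRAMES, H, W, C))
    x = layers.TimeDistributed(layers.Conv2D(16, 3, activation="relu"))(inputs)
    x = layers.TimeDistributed(layers.MaxPooling2D())(x)
    x = layers.TimeDistributed(layers.Flatten())(x)      # per-frame feature vector from the CNN stage
    x = layers.LSTM(64)(x)                               # temporal dynamics across frames
    outputs = layers.Dense(1)(x)                         # HRV value for the window
    model = models.Model(inputs, outputs)
    model.compile(optimizer="adam", loss="mse")
    return model
```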
[0234] In embodiments, a machine learning model may be trained to identify a subset of pixels (instead of all of them) in video frames in order to generate the R-R interval and HRV measurement. In this machine learning model, a subset of the entire set of pixels in a video frame is identified and then used by the machine learning model to determine the biomarker (e.g., metric) of interest. This selection of the subset of pixels may increase the signal-to-noise ratio in the extracted information. Furthermore, the identification of the subset of pixels may reduce the computation time and/or power required to generate the output of the model. There are at least two methodologies to extract a subset of pixels from a video image.
[0235] In one methodology, in embodiments, the frames in the video may be divided into an N x M grid. Features extracted from each cell in the grid are used to train a machine learning model that picks the one or more cells with the highest information content. Then, the R-R intervals or HRV value estimates are fused together to provide for determination of the biomarker (e.g. metric) of interest.
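As a simplified, non-learned stand-in for the grid-based selection described above, the sketch below scores each N x M cell by the fraction of its signal power falling in a plausible heart-rate band and keeps the best cells; the band, grid size, and scoring rule are assumptions.

```python
import numpy as np

def best_grid_cells(frames, fps, n=4, m=4, top_k=2, hr_band=(0.7, 3.0)):
    """frames: array (T, H, W, 3). Returns indices of the top-k grid cells."""
    t, h, w, _ = frames.shape
    cells = frames[:, :h // n * n, :w // m * m, 1]                     # green channel, crop to grid
    cells = cells.reshape(t, n, h // n, m, w // m).mean(axis=(2, 4))   # (T, n, m) mean per cell
    scores = np.zeros((n, m))
    freqs = np.fft.rfftfreq(t, d=1.0 / fps)
    band = (freqs >= hr_band[0]) & (freqs <= hr_band[1])               # ~42-180 bpm
    for i in range(n):
        for j in range(m):
            spectrum = np.abs(np.fft.rfft(cells[:, i, j] - cells[:, i, j].mean())) ** 2
            scores[i, j] = spectrum[band].sum() / (spectrum.sum() + 1e-9)
    flat = np.argsort(scores, axis=None)[::-1][:top_k]
    return [np.unravel_index(k, scores.shape) for k in flat]
```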
[0236] In another methodology, in embodiments, a machine learning model may use an attention mechanism to perform spatial segmentation of the frames. In spatial segmentation, an image may be subdivided into multiple segments. In this process, every pixel in the image is associated with an object type. In one embodiment, all objects of the same type are marked using one class label, while in another embodiment (instance segmentation), similar objects receive their own separate labels. The machine learning model includes an encoder and a decoder. The encoder extracts features from the video image through filters, and the decoder is responsible for generating the final output, which is usually a segmentation mask containing the outline of the object.
[0237] In embodiments, to train a machine learning model, R-R intervals and HRV measured from ECG may be used as ground-truth. The R-R intervals determined from ECG measurements and PPG measurement may have some differences. A machine learning model may be trained to learn the difference between these values, and use the differences to improve the performance of the model. The factors affecting the difference between the R-R intervals as measured from ECG and PPG include heart diseases, blood pressure, artery wall thickness and elasticity, body and environmental temperature, red cell count, hemoglobin content, etc.
[0238] In embodiments, instead of training a machine learning model on input video frames, a model may be trained on PPG time-series extracted from video frames by using image processing methods.
[0239] In embodiments, a machine learning model may be trained by computing the spectrogram or scalogram over a sliding window of the PPG signal, represented as an image, and a machine learning model such as a CNN model may be used to learn the associated HRV.
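An illustrative sketch of turning a PPG window into a spectrogram image suitable for a CNN follows; the window, overlap, and frequency cutoff are assumed values.

```python
import numpy as np
from scipy.signal import spectrogram

def ppg_to_spectrogram_image(ppg, fs, nperseg=256, noverlap=192):
    """ppg: 1-D PPG samples at fs Hz. Returns a (freq, time) log-power array
    suitable as single-channel input to a CNN such as the ones sketched above."""
    f, t, sxx = spectrogram(ppg, fs=fs, nperseg=nperseg, noverlap=noverlap)
    keep = f <= 5.0                        # cardiac and respiratory content lies well below 5 Hz
    image = np.log(sxx[keep] + 1e-9)
    return (image - image.mean()) / (image.std() + 1e-9)   # simple per-image normalization
```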
[0240] In embodiments, approaches similar to those described above may be used to train machine learning models to estimate other biomarkers, such as respiration rate or other metrics of HRV. For example, supervised learning, LSTM, CNN, finding a subset of pixels, and spatial segmentation may be used to estimate other biomarkers, such as respiration rate or other metrics of HRV.
[0241] In embodiments, a machine learning model may be trained to predict various conditions of interest for a user based on an HRV trend. Such conditions may include the physical conditioning, physical performance levels, stress levels, EHRV’s Morning Readiness Score, EHRV’s HRV Score, Stress Score, Productivity Score, Emotional State, and Illness Risk, and other conditions such as having diabetes or cardiovascular disease. A machine learning model may be trained using contextual data from the HRV system’s database paired with associated HRV values.
[0242] In embodiments, a machine learning model may be trained to improve sensor readings and HRV estimation when using a camera to collect data from finger-over-camera scans by detecting movement of the finger, proper covering of the camera's lens, proper finger placement on the camera, proper amount of finger pressure on the camera, and/or movement of the phone while taking the measurement. This information may be used to guide the user to improve how they use their phone to collect data.
[0243] Likewise, in embodiments, a machine learning model may be trained to improve sensor readings and HRV estimates when using a camera to collect data from a user's face during a camera-over-face measurement.

[0244] In embodiments, each HRV reading may be tagged with contextual information. This tagged contextual information (e.g., data) may be used to train a machine learning model to generate composite metrics. The tag information may consist of sleep data, exercise data, mood ratings, questionnaires, custom tags/notes, blood glucose, body weight, or any other data that is useful for assisting the user in achieving their goals.
[0245] In embodiments, a machine learning model may be trained to report a quality score or the confidence of a biomarker estimation or prediction.
[0246] In embodiments, a machine learning model may be trained to be a virtual (e.g., AI) coach that performs predictions, suggestions, or recommendations to help a user achieve a specific goal. This model may learn from HRV changes for that user how the user's body responds to various triggers, lifestyle choices, environmental conditions, etc. The model may suggest specific actions, lifestyle choices, or behavioral changes.
[0247] In embodiments, in addition to being trained to provide recommendations to a user based on the user’s specific biomarker data, an Al coach may use machine learning models that are trained to associate a current user’s biomarker, trend data, and a stated goal (e.g., mental health, weight loss, etc.) with other users having similar data and trend, known modifications (e.g., increased exercise, decreased sleep, etc.), and the same or similar stated goal. This may be known as “look alike” modeling. Having this information, the HRV system software may suggest changes that have been helpful for past members and provide cautionary information about modifications or continuations of the same behavior that have been historically harmful or negative for other members in the past.
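A minimal sketch of such look-alike matching is shown below, using a nearest-neighbor search over fixed-length trend vectors; the feature layout, neighbor count, and outcome representation are assumptions.

```python
import numpy as np
from sklearn.neighbors import NearestNeighbors

def find_lookalike_users(current_trend, historical_trends, historical_outcomes, k=5):
    """Match a user's recent HRV trend (a fixed-length vector of daily scores)
    to past users with similar trends and return their recorded outcomes or
    modifications. Feature layout and k are illustrative."""
    nn = NearestNeighbors(n_neighbors=k).fit(np.asarray(historical_trends))
    _, idx = nn.kneighbors(np.asarray(current_trend).reshape(1, -1))
    return [historical_outcomes[i] for i in idx[0]]
```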
[0248] In embodiments, transfer learning methods may be used to train two machine learning models sequentially. For example, a model trained to estimate HRV values from finger-over-camera videos may be used to train a machine learning model to estimate HRV from face-over-camera videos.
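An illustrative sketch of this transfer-learning step follows; which layers are frozen, the learning rate, and the reuse of the earlier CNN+LSTM sketch are assumptions.

```python
import tensorflow as tf

def adapt_finger_model_to_face(finger_model, freeze_up_to=-3):
    """Reuse a model trained on finger-over-camera data: freeze the earlier
    layers and fine-tune only the later layers on face-over-camera data."""
    for layer in finger_model.layers[:freeze_up_to]:
        layer.trainable = False               # keep the low-level features learned from finger videos
    finger_model.compile(optimizer=tf.keras.optimizers.Adam(1e-4), loss="mse")
    return finger_model

# face_model = adapt_finger_model_to_face(build_cnn_lstm_hrv())      # earlier sketch
# face_model.fit(face_video_windows, face_hrv_targets, epochs=5)     # hypothetical face dataset
```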
[0249] In embodiments, a machine learning model may incorporate unsupervised or supervised learning methods and be trained to find sub-populations that have similar R-R dynamics or HRV characteristics. Identifying a sub-population to which a user belongs may help to improve biomarker estimations, predictions, and/or classification tasks. Knowing a user's sub-population can help with look-alike modeling, as described above. Other types of data such as age, gender, ethnicity, medical history, activity level, etc. may be included in this clustering.
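A minimal clustering sketch follows; the feature list, standardization step, and cluster count are illustrative assumptions.

```python
import numpy as np
from sklearn.preprocessing import StandardScaler
from sklearn.cluster import KMeans

def cluster_users(feature_rows, n_clusters=8):
    """feature_rows: array (users, features), e.g. [mean RR, RMSSD, SDNN, LF/HF, age, activity]."""
    scaled = StandardScaler().fit_transform(np.asarray(feature_rows, dtype=float))
    model = KMeans(n_clusters=n_clusters, n_init=10, random_state=0).fit(scaled)
    return model.labels_, model     # labels identify each user's sub-population
```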
[0250] In embodiments, a machine learning model may be trained to classify and/or cluster a user’s R-R intervals dynamics. In this model, each class or cluster is associated with a specific value of HRV Score and/or other HRV metrics (e.g., biomarker). Once this model is trained it may be applied on shorter R-R intervals (e.g., shorter duration measurement periods) to estimate HRV Score.
[0251] In embodiments, a machine learning model may be trained to use natural language processing (e.g., NLP) to develop unstructured tags from analyzing users' spoken words and/or written messages. The app may quantify the mood, stress level, energy, etc. by analyzing the user’s tone, speed of talking, choice of words, and the like.
[0252] FIG. 1A schematically illustrates a trace of heart beat data 100, such as would be produced by a pulse waveform from a PPG signal. The trace includes the R portion of the heart beat data, identified as R1, R2, and R3. The R-R interval (e.g., RR), that is, the time between consecutive R values, is indicated as 973 ms between R1 and R2, and 1102 ms between R2 and R3. The RR values contribute to the determination of the user's HRV. The trace of heart beat data 100 is the result of measuring raw sensor signals that are transformed into a heart beat trace. The raw sensor data may be collected from optical sensors, electrical sensors, a time series of still images, video images, or a combination of any of the listed sensor types. An example of a PPG signal is shown in FIG. 1B. Time series data, whether from a series of still images or from a video, is required for calculating and/or deriving R-R intervals and HRV. The HRV system may analyze the sensor data, whether from a single source or from a combination of the sensors, using algorithms trained by ML to convert (e.g., transform) the raw sensor signal into biomarkers.
[0253] FIG. 2A illustrates an exemplary embodiment 200 of using a smartphone to measure user heartbeat data. In the embodiment illustrated in 200, a user places a finger 230 over an optical sensor 220 on the smartphone 210. During measurement, the optical sensor 220 is coupled to an illumination source (not shown). The illumination source provides white light to the surface of the user's finger 230, and the optical sensor 220 records variations in the color and intensity of the light reflected off of the various tissues in the user's finger. These tissues include skin, blood vessels and capillaries, as well as the blood coursing through the blood vessels and capillaries. Algorithms in the software analyze the recorded images and calculate precise electrical heart beat data from EEG and ECG measurements, and precise optical heart beat data from PPG measurements. The precise heart rate data is used by the software to derive HRV scores and provide customized insights, including EHRV's Morning Readiness Score, EHRV's HRV Score (a normalized manipulation of lnRMSSD), Stress Score, Productivity Score, Emotional State, and Illness Risk. In various embodiments, the optical sensor may be a camera sensor located on a phone, laptop, desktop, webcam, security camera, and the like. This embodiment may be referred to as finger-over-camera measurements, and the camera may be referred to as a finger-over-camera sensor. The finger-over-camera sensor may be used with or without the illumination source.
[0254] In some embodiments, other biomarker data of a user is measured and incorporated into the Customized Insights. Some non-limiting examples of such biomarker data can include measurements such as heart rate, HRV, blood pressure, oxygen levels, CO2 levels, glucose levels, and ketone levels. Furthermore, biomarker data may also provide insight into a user's general awareness or alertness, stress, reflex time, resilience, training, body temperature, or related capacity or capability, or a combination of the foregoing.
[0255] In some embodiments, an optical sensor (e.g., a camera) may be used to measure color and intensity data detected from the face of a user through still image and/or video image analysis. Use of still image and/or video image analysis permits a user to point any optical sensor (e.g., camera), having sufficient resolution and/or frame speed, at his or her face and detect color and intensity data which can be transformed into biomarker data such as heart rate, HRV, blood pressure, oxygen levels, CO2 levels, glucose levels, and ketone levels, and may be used to provide the user with estimates of the user’s general awareness or alertness, stress, reflex time, resilience, training, body temperature, or related capacity or capability, or a combination of the foregoing.
[0256] FIG. 2B illustrates an exemplary embodiment 250 of capturing an image of the face of a user to collect face-based data. In the embodiment illustrated in 250, an optical sensor 260 records an image of the user's face 270 which is displayed on a smartphone 210. This embodiment may be referred to as face-over-camera measurements, and the camera may be referred to as a face-over-camera sensor. The face-over-camera sensor may be used with or without an illumination source. In some embodiments, the camera sensor data may include a time series of still images and/or video images of the face of the user. It is important that the face-over-camera sensor data include time series data. That is, the algorithms use changes in the color and intensity of the raw sensor data, both in the calculations and in learning through ML processes.
[0257] In some embodiments, one or more sensors can be used to detect biomarkers as part of a dual sensor input system and/or a multi-sensor input system. For example, in some embodiments, optical sensors 220 and 260 can be integrated with a camera, such as in an embodiment of a mobile phone 210, as shown in FIGS. 2A and 2B, in order to determine biomarkers of the user. The sensors 220 and 260 can be used to record the physiological data of the user using a face-over-camera sensor 260 either as still photos in a time series and/or as video images, provided the camera is of sufficient quality and/or has sufficient speed, as shown in FIG. 2B. In some embodiments, the biomarkers can be determined by contacting a finger-over-camera sensor 220 with a finger of the user 230, as shown in FIG. 2A.
[0258] The finger-over-camera sensor 220 and the face-over-camera sensor 260 can be used in combination to interact with one or more body parts of a user to collect measurements therefrom. An RR probability model trained from the historical data of a higher quality sensor, which is used in-the-loop for RR modeling from camera data, can be used as input features for a supervised classification system for physiological conditions. Each of the sensors 220 and 260 is compatible with a variety of camera configurations. For example, the finger-over-camera sensor 220 can be placed on a rear-facing camera of the phone 210 to facilitate user grip and allow readings or measurements to occur while the phone 210 is being held in a position that is natural for a user. The finger-over-camera sensor 220 can utilize the index finger for conducting measurements, though it will be appreciated that any finger or part of the hand can be placed over the sensor to take the measurement.
[0259] In embodiments, the raw sensor data can be transformed into biomarkers to link camera data patterns in R-R intervals and other biomarkers to various morbidities. Some non-limiting examples of morbidities to which the data patterns can be linked can include certain cancers, diabetes, kidney disease, heart disease, high blood pressure, and so forth.
[0260] In some embodiments, the biomarker data captured by the sensors can be converted into biomarker values used to calculate a score or index for various conditions. For example, the presently disclosed system can utilize captured data from one or more HRV readings to calculate an HRV Score, scaled on a 1-100 basis, based on the natural log of the Root Mean Square of Successive Differences (RMSSD) for the HRV data collected. Changes in the HRV Score correlate with changes in: breathing and respiratory patterns; physical stress; recovery from physical stress; physical performance; psychological stress and health; emotion and mood; cognitive performance; immune system function; inflammation, posture and structural health; injury; biological age; general health and wellbeing; resilience and adaptability; risk of disease; morbidity and mortality; motivation and willpower; and digestive stress.
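For illustration, the time-domain calculation underlying such a score could be sketched as follows; the RMSSD and ln(RMSSD) steps are standard, while the linear 1-100 mapping and its bounds are assumptions, not the proprietary scaling.

```python
import numpy as np

def rmssd(rr_ms):
    """Root Mean Square of Successive Differences of R-R intervals (ms)."""
    rr = np.asarray(rr_ms, dtype=float)
    return float(np.sqrt(np.mean(np.diff(rr) ** 2)))

def hrv_score(rr_ms, ln_lo=0.0, ln_hi=6.5):
    """Map ln(RMSSD) linearly onto a 1-100 scale; the bounds here are assumptions."""
    ln_rmssd = np.log(rmssd(rr_ms) + 1e-9)
    scaled = (ln_rmssd - ln_lo) / (ln_hi - ln_lo) * 99 + 1
    return float(np.clip(scaled, 1, 100))
```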
[0261] In some embodiments, the system can be configured to measure physiological signals based on a time series signal. For example, in some embodiments, the camera can be used over a period of time to detect color changes in light reflected from the skin of a user/patient, e.g., the face of a user/patient. A person skilled in the art will recognize that every pump of the heart can cause an increased flow of blood to the face, which can cause color changes that are detected by the camera in different color scales and intensities. Taking measurements of the face over time allows a time series signal to be created of the patient’s face that can be processed to create an average profile of the face under specific conditions.
[0262] FIGS. 3A-3C illustrate alternative embodiments of wearable sensors 310 that can be used with the embodiments of the present disclosure. A sensor can be placed over the finger to determine the biomarkers discussed above. Additional examples of the wearable sensors 320 can include chest straps 330 or patches, smartwatches 340, wrist or arm wearables 350, ring wearables 360, smart apparel, or other devices that can contact a portion of the body known to one skilled in the art. These wearable sensors 320 can be used alone and/or in combination with the finger-over-camera sensor 220 or the face-over-camera sensor 260 to take additional measurements, validate measurements taken by other sensors, and/or improve accuracy of measurements to build a user profile, among other advantages known to one skilled in the art. Readings performed by the wearable sensors 320 can be combined with readings from the finger-over-camera sensor 220 or the face-over-camera sensor 260 to enhance accuracy of the resultant user profile constructed from the collected data.
[0263] As discussed above, in some embodiments, data from the additional sensors can be used to perform validation of readings from a given sensor. For example, a sensor, e.g., the finger-over-camera sensor 220, can measure one or more biomarkers with simultaneous data collected from the face-over-camera sensor 260 being used to verify the data collected therefrom. If the data collected from the two sensors corresponds to one another, the data can be saved to the system for storing in a user profile. Otherwise, in the event of a discrepancy, a signal can be sent to the sensors to repeat the measurement or to use a third sensor to provide distinguishing validation between the two sensors. For example, biomarkers measured by the face-over-camera sensor 260 can be used to validate measurements taken by the finger-over-camera sensor 220, though it will be appreciated that biomarkers measured by the finger-over-camera sensor 220 may be used to validate measurements taken by the face-over-camera sensor 260.
[0264] It will be appreciated that the one or more sensors measure the biomarkers substantially simultaneously such that multiple measurements that occur at the same time can be accurately compared to one another. A person skilled in the art will recognize that substantially simultaneous operation includes the finger-over-camera sensor 220 and the face-over-camera sensor 260 being activated within approximately 0.1 seconds of one another, though in some embodiments, substantially simultaneous operation can include from approximately 0.01 seconds to approximately 2 seconds.
[0265] In some embodiments, the measurements of the biomarkers can be used to train machine learning within the system. For example, the system can be configured to learn ranges of biomarkers collected from the user over time, and use these previous measurements in combination with subsequent measurements to tailor ranges which the user experiences. In some embodiments, the machine learning models can be used to suggest biomarker measurements to be taken based on previous readings performed by the user.
[0266] The machine learning models can be trained in phases. For example, the color extraction and intensities detected by the sensors can detect the heart rate, from which the RR interval can be determined. The RR interval data can be collected and graphed over a time span of at least 30 seconds, although the time span may be longer if greater accuracy is desired, and the data is collected continuously over the reading time span. In some embodiments, the time-series RR intervals can be converted to two-dimensional images, such as scalograms or spectrograms, for purposes of direct biomarker estimation or classification.
[0267] The machine learning models can train the system to take measurements that allow phasing out of one or more of the sensors over time. For example, initial calibration and measurements of certain biomarkers can be performed using a plurality of sensors, e.g., the finger-over-camera sensor and the face-over-camera sensor. Following repeated measurements of the biomarkers of a given user, the machine learning models can construct a profile of the user that includes ranges of the values of the measured biomarkers. As more measurements are taken and the ranges of the values become smaller, the machine learning models can obtain biomarker values using a single sensor of the one or more sensors, which results in quicker, more accurate measurements. The number of measurements taken can vary, but it is commonly presumed that the system can be trained after 5-10 initial readings. However, this range is not to be taken as a limit, but as an initial estimate of the number of initial readings. It will be appreciated that the machine learning models can continue to access the plurality of sensors as needed and/or for validation of measurements of certain biomarkers if a measured value falls outside of the expected range.
[0268] In some embodiments, training of the system can occur sequentially. For example, the finger-over-camera sensor 220 can be trained using machine learning models that are exposed to validated HRV sensor data. Once the finger-over-camera sensor 220 is sufficiently trained, the finger-over-camera models can be used to train the face-over-camera sensor 260 and/or additional sensors of the presently disclosed system.
[0269] FIG. 4 presents a view of artifact detection accuracy in terms of false positive artifact detection consistent with certain embodiments of the present invention. In an exemplary embodiment, at 400 the mean false positive rate is the ratio of falsely annotated artifacts to the total number of veridical intervals such as, in a non-limiting example, negative artifacts. The false positive artifact detection is performed utilizing the Mod-Berntson, IPFM and PWIR processes. This figure displays the relative accuracy of each method when compared directly. The exceptionally high False Positive Rate (FPR) of PWIR can be mitigated by increasing its threshold parameter, but not without inducing a reduction in the detection of missed beats, which hurts the HRV estimation mean error more than the false positives do.
[0270] FIG. 5 presents a view of artifact detection accuracy in terms of true positive artifact detection consistent with certain embodiments of the present invention. In an exemplary embodiment, the system presents a mean true positive rate as the ratio of correctly identified artifacts to the total number of artifacts 500. While not shown here, the performance of IPFM when evaluated exclusively on spurious beat-type artifacts is actually exceptional. Unfortunately, the instantaneous-derivative metric used by IPFM as a threshold metric is not nearly as sensitive to missed beats, which hurts its overall performance significantly. It was also found that the median FNR value on all artifact conditions for the Modified Berntson algorithm was 0%.
[0271] The IPFM and PWIR algorithms were applied using threshold parameters recommended by Osman et al. Further analysis might include a complete parameter search against the present test data. It is worth noting that the modified Berntson algorithm is robust across data conditions given its standard parameterization.
[0272] FIG. 6 presents a view of artifact impact on the system consistent with certain embodiments of the present invention. In an exemplary embodiment, at 600, when calculating the impact of artifacts on the system, the best performance for IPFM was found with the threshold set to 4.5, and for PWIR set to 2.5. These thresholds are used in all included plots. The optimal threshold can differ significantly with data source, therefore performance of these algorithms on a data source for which there is no prior knowledge may be worse. In contrast, no parameter optimization was performed for the modified Berntson algorithm. This property of being functionally parameterless is of considerable value when no opportunities for pre-emptive tuning or source analysis are available.
[0273] FIG. 7 presents a view of the display of HRV statistics for a user post reading consistent with certain embodiments of the present invention. In an exemplary embodiment, at 702 the display presents the time-domain and frequency-domain results for the HRV reading taken by a user or by a medical practitioner, a coach, or other professional on behalf of the user. The time-domain statistics may include, but are not limited to, the mean RR interval, rMSSD, ln(rMSSD), SDNN, PNN50, NN50 and a 7-day HRV Calculated Value (CV). The frequency-domain statistics may include, but are not limited to, the total power, the Low Frequency (LF) power, the High Frequency (HF) power, and the ratio of LF to HF power.
[0274] FIG. 8 presents a view of the display of the continuation of HRV statistics for a user post reading consistent with certain embodiments of the present invention. In this embodiment, at 804 the display presents the frequency-domain results for the HRV reading as previously described and continues with a display of the user's heart rate results for the reading. The heart rate results statistics presented include, but are not limited to, the minimum heart rate, maximum heart rate, and average heart rate captured during the data reading action.

[0275] The sensors of the present disclosure can provide a number of readings for determining biomarkers when activating the sensors. For example, as shown in FIG. 9, the sensors can be used to take a Morning HRV reading, an Open HRV reading, an HRV snapshot, or a Research Reading, among others. Each reading and/or snapshot can trigger the sensors to determine biomarkers over a specific time frame, e.g., whether to take an instantaneous measurement, or a series of measurements over an extended period or duration. The readings can be selected by a user toggling the appropriate function on a screen of a phone 900, with each reading configured to be customized according to user preferences.
[0276] FIG. 10 illustrates an exemplary interface 1000 of the system when the HRV Snapshot is toggled for measurement. For example, when the HRV Snapshot is selected, users have the option to Track and monitor respiration rates in addition to measuring a snapshot of their HRV. Once this feature is enabled and “Take Test” is pressed, one or more of several sensors may be activated to measure the respiration rates. For example, the chest strap 330 may be activated. The face-over-camera sensor 260 may also be activated, with the camera being pointed towards the user’s body, e.g., face, to determine respiration rate of the user. The two sensors, the face-over-camera sensor 260 and the chest strap 330 may be used individually, or in conjunction with each other.
[0277] FIGS. 11A-11E illustrate a system tutorial of use of the finger-over-camera sensor 220. It will be appreciated that this embodiment and accompanying description is merely exemplary and the combination of techniques for engaging with the sensor can be changed in a number of ways known to one skilled in the art. As discussed above, the finger-over-camera sensor 220, in some embodiments, can be used in combination with the chest strap 330 or another feature in order to facilitate accuracy of the biomarker. Once the chest strap 330 is secured, the user is instructed to place the pad of their finger over the finger-over-camera sensor 220, which in this embodiment is the rear-facing camera, though, in some embodiments, the finger-over-camera sensor 220 can include the front-facing camera. It will be appreciated that the rear-facing camera is chosen due to a user's natural grip in holding a phone that places the fingers proximate to the rear-facing camera, which allows for more straightforward measurements to be taken. Once positioned, such that the camera and flash are covered, as instructed by the tutorial, the application verifies that the lens is covered, appropriate pressure is exerted on the sensor, and that there is sufficient lighting prior to taking a reading. In some embodiments, the system can collect a supplementary signal to the face-over-camera sensor 260 and/or the finger-over-camera sensor 220 by measuring the force of the finger (pressure) applied on the touch screen. The combination of the sensors and the supplementary signal, taken simultaneously and/or sequentially, can be used to strengthen the accuracy of the data collected by the sensors overall.
[0278] The reading of the chest strap 330 and the finger-over-camera sensor 220 may be taken substantially simultaneously to measure various biomarkers at a substantially similar point in time. The system can receive pulse data from the chest strap 330 and verify that the finger-over-camera sensor 220 is able to make biomarker measurements from a user's contact therewith. Once the user's position is verified, the user is instructed to remain substantially immobile while the reading begins automatically. The duration of the reading can be substantially instantaneous, e.g., approximately one second, though, in some embodiments, the reading can occur over a period of approximately 10 seconds, approximately 20 seconds, approximately 30 seconds, approximately one minute, approximately two minutes, or approximately three minutes or more. Furthermore, a reading may take between 1 second and 10 minutes, but a typical duration is 1-2 minutes. While the reading is taken, the system records a timestamp of the readings taken by the chest strap and the finger-over-camera sensor 220 such that the readings of each can be correlated with one another at specific moments to collect multiple data points to confirm accuracy of the readings.
[0279] FIG. 12 illustrates an interface during which the reading is in progress. As shown, the user interface 1200 can include a reading of the heart rate and the HRV score, as well as a graph showing one or more of these values throughout the duration of the measurement. In some embodiments, an option to stop measurement is included, along with a timer to indicate progress of the reading to inform the user how much time remains before results are provided.
[0280] FIG. 13 illustrates the results of the readings taken by the system in accordance with the embodiments discussed above. As shown, the results can include an HRV chart and a heart rate chart showing the values measured over the duration of the measurement, as well as additional biomarkers. A person skilled in the art will recognize that the charts can be customized based on the biomarkers being measured and/or based on user preferences that can be customized in settings.

[0281] In some embodiments, the system can provide reminders regarding use of one or more of the sensors. An exemplary embodiment of such reminders is shown in FIG. 14, which provides reminders prior to starting the face-over-camera reading. Some non-limiting examples of such reminders can include instructions for proper phone positioning to obtain the most accurate measurements, sitting still and avoiding turning the face away from the sensor, maintaining good lighting, and staying in the frame, since removing the user from the frame can negatively impact determination of biomarkers and create inaccuracies in the data.
[0282] Once the face-over-camera option is toggled, calibration of the wearable sensors and the face-over-camera sensor 260 can begin. For example, once "Got It" is selected on the interface of FIG. 14, an instruction for connecting a wearable sensor is displayed to the user, as shown in FIG. 15. FIG. 15 illustrates the connection of a wearable sensor 1500 over the finger of the user to collect pulse data, though it will be appreciated that the connection can be formed with any wearable sensor, such as illustrated by 330. Once the system receives pulse data from the wearable sensor, the face-over-camera sensor 260 can be calibrated. Even though the wearable sensor is collecting pulse data for calibration, it is understood that the collection of data from the face-over-camera sensor 260 can be initiated substantially simultaneously.
[0283] FIGS. 16A-16F illustrate calibration of the face-over-camera sensor 260. As shown in FIG. 16A, the user can position their face in the designated region with respect to the camera such that the sensor can detect the face. As shown in FIG. 16B, proper face detection can be adjusted and checked on the interface of the phone to ensure that the user's face appears in the designated region. The system can perform an accuracy check to determine that the user's face is detected and proper lighting is being used, at which point the system is ready to perform biomarker measurements, as shown in FIG. 16C.
[0284] FIGS. 16D-16E illustrate different views that the user may choose from (swipe between) while a face-over-camera HRV snapshot reading is in progress. FIG. 16D is intended to provide no distraction and includes instructions to "relax and breathe normally." As shown, the HRV snapshot is set to be taken over the course of one minute, with changes in the user interface indicating progress of the HRV snapshot. The user can be in any of the screens (as shown in FIGS. 16D-16E) throughout the measurement, depending on their preference. FIG. 16E is intended to provide live biofeedback or biomarker data. As the measurement continues, values for biomarkers such as heart rate and HRV score, as well as a graph for the duration of the measurement, are displayed, as shown in FIG. 16E. FIG. 16F simply shows the user their face recording.
[0285] FIG. 17 illustrates an error message that can be received during the HRV snapshot. In the event that the sensor fails to get a reading of the face, e.g., in the event that lighting changes or the user’s face moves outside of the designated region, an error message can be sent to the user. As shown, the message can indicate that measurement of the biomarkers can continue via the other sensors, such as those taken by the finger-over-camera sensor 220 and/or the chest strap 330. For example, as shown, while respiration rate may not be able to be tracked due to poor video quality, HRV readings can continue as the other sensors continue to collect data. Once the issues identified in the warning are fixed, tracking can resume.
[0286] In some embodiments, users can add tags and/or events to inform the system about their overall well-being and log recent events. Events can be logged before or after the reading is performed. The events may be logged before the results are provided to afford the system more data for tracking when building out the user profile. As shown in FIG. 18, tagging can be enabled in the app, with tags including moods and energy levels, as well as recent events such as sleep schedules, exercise routines, and illnesses, which allows the system to account for discrepancies in biomarkers.
[0287] As shown in FIG. 19, additional tagging can be enabled in the app. In some embodiments, the system may use ML to incorporate location tags, such as GPS, and respiratory tracking. For example, if respiration is abnormal, and the user indicated that they have been ill, the machine learning capabilities of the system can account for that when building the user profile to understand that the illness may be a contributing factor to data discrepancies. Furthermore, the app may use ML with natural language processing (e.g., NLP) to develop unstructured tags from analyzing users' spoken words or written messages. The app may quantify the mood, stress level, energy, etc. by analyzing the user’s tone, speed of talking, and choice of words, and the like.
[0288] FIG. 20 illustrates an exemplary embodiment of results from the Research Reading of the face-over-camera sensor 260. As shown, in addition to the HRV score and heart rate values, the results can display a respiration rate over the course of the measurement as well as respiration rate history, e.g., over the past three days, as shown. The results can also display the tags that were selected, as well as values of body position, and options to save and share results to social media platforms, such as Twitter and Facebook.
[0289] FIG. 21 illustrates the home screen 2100 of the system. The home screen 2100 can illustrate user history, previously measured biomarkers, comparisons of readings performed over the previous few uses of the system, and so forth. The home screen 2100 can also be customized to display data associated with the readings that have been performed by the user during the use history of the application.
[0290] While the home screen 2100 may be customized to display data associated with the readings that have been performed by the user during the use history of the application, additional data that has been extracted, as discussed above, can be combined with the associated data from a smartphone or other devices to build an overall profile of the user. Some non-limiting examples of the associated data can include GPS location and elevation, local events such as pandemic news, disasters, political sentiment, metadata about the user (age, gender, other demographics), other biomarkers such as blood pressure, pulse ox, blood glucose levels, mood (obtained via integrations, or direct user input), and/or behavior data (exercise, sleep, food, supplements, medical activity). In some embodiments, the associated data is retained in the system to aid in building the overall profile of the user, but is not displayed on the home screen 2100.
[0291] In some embodiments, the system can perform a live heart rate detection and breathing rate measurement. As shown in FIGS. 22A-22B, the user can position their face in relation to the face-over-camera sensor 260 to scan an image of the face. For example, face-based detection can be used more naturally during certain activities, such as driving a car, where finger-over-phone camera or other sensing is not possible or not preferable. The face-over-camera sensor 260 can track user data by one or more of the following: color extraction (visible and invisible wavelengths), pixel movement, and/or eye-specific movement, including pupil dilation, as well as pixel intensity. FIG. 22A is an exemplary embodiment of a calibration of a scanner function that can collect biomarker data live via the face-over-camera sensor. The system can use various points of the face to extract the color profile thereof throughout the duration of the measurement to provide live heart rate and respiration rate readings. FIG. 22B is an interface for performing the face-over-camera live reading. The face-over-camera sensor 260 can detect blood flow changes and slight variations in color in certain areas of the face to determine the heart rate as well as biomarkers to further build the profile of the user.
[0292] In some embodiments, the finger-over-camera sensor 220 and the face-over-camera sensor 260 can communicate with a web browser, a projector, and/or a computer application installed on a laptop or desktop. In such embodiments, the face-over-camera sensor can be part of a webcam or an integrated camera in a laptop that can detect the user's face. The system can create a profile of the user within the system and take measurements thereof over a certain time period, e.g., as the user sits at the computer, to measure biomarkers over a period of time to improve the accuracy of the user profile. FIG. 23 illustrates an exemplary embodiment of such a system for detecting biomarkers in users utilizing a camera or a live feed. For example, the system can interact with a webcam of a desktop or laptop to determine biomarkers over a video conference such as Zoom, Facebook Messenger, Facetime, and the like. A person having ordinary skill in the art will recognize that use of the sensors of the present embodiment during video conferencing applications can provide benefits for remote diagnosing of patients and virtual doctor visits in emergency medical situations or if the patient does not have immediate access to medical care.
[0293] As shown in FIG. 23, users can input such parameters as height, weight, and stress levels, and the system can use the sensor attached to the webcam in ways similar to that of the face-over-camera sensor to measure biomarkers such as heart rate, HRV, blood pressure, oxygen levels, CO2 levels, glucose levels, ketone levels, general awareness or alertness, stress, reflex time, resilience, training, body temperature, or related capacity or capability, or a combination of the foregoing, of the users in real-time.
[0294] In some embodiments, the system can capture a snapshot of a user’s face over the video conference and calibrate the system for an upcoming live video conference to begin a build out of a user’s profile, or use a pre-recorded video of a user over a video conference to make subsequent measurements of the biomarkers, as shown in FIGS. 24A-24F. In this way, the HRV system may use recorded still images and/or video images in a manner that is analogous to live still images and/or video images.
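As a non-limiting sketch of the recorded-video case, the same estimator can be fed frames decoded from a stored clip rather than a live feed; this assumes the illustrative estimate_heart_rate_bpm() helper above and a hypothetical file name:

    import cv2

    def heart_rate_from_recording(path="recorded_call.mp4"):
        cap = cv2.VideoCapture(path)
        fps = cap.get(cv2.CAP_PROP_FPS) or 30.0   # fall back if metadata is missing
        frames = []
        while True:
            ok, frame = cap.read()                # frames are decoded in BGR order
            if not ok:
                break
            frames.append(frame)
        cap.release()
        return estimate_heart_rate_bpm(frames, fps)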
[0295] In some embodiments, a plurality of cameras can be used to acquire a three-dimensional image of the target in order to generate a more complete data set. For example, using a plurality of cameras can provide a plurality of angles of the face of the user in multiple degrees of freedom, e.g., up to six degrees of freedom. Moreover, in some embodiments, the application can integrate cameras from various locations in order to collect more biomarker data that will result in more accurate measurements. For example, the presently disclosed system can communicate with sensors located in cameras in a user’s car, at the computer at work, at the gym or wellness center, or in a hospital or wellness setting to collect biomarker data during the course of the user’s day to build out a user profile that captures a detailed history of biomarker values. This combination of a plurality of cameras (e.g., optical sensors) may provide raw data that may be transformed into new biomarkers that have heretofore been unknown. Put another way, the increase in sensors and/or cameras in the environment, coupled with the increasing resolution and video speed (e.g., frames per second) of those sensors and/or cameras, creates the possibility of discovering a data set of physiological signals that has been previously unknown. For example, such a data set of physiological signals may allow the HRV system to build unique user profiles that, like a fingerprint, can be associated with only one user.
[0296] Using the data collected after analyzing the snapshot, the system can calculate heart rate, HRV score, and other biomarker values for a given user. Upon completion of the HRV score calculation, a user may compare the calculated HRV score and other non-proprietary HRV parameters to general population and/or demographic-filtered population data, as shown in FIG. 24F, to provide an indication of how the user compares with the general population as a whole or with specific filtered portions of the general population. This comparison may provide a user with some indication as to changes in their HRV values with respect to their own historic values as well as historic values for a given population.
[0297] In some embodiments, the HRV system may also create a daily expressed score for use in tracking a user’s HRV values over time. This daily expressed score is known as a Morning Readiness score. The Morning Readiness score is a scaled score (1-10) that shows the relative balance or imbalance in the user’s sympathetic and parasympathetic nervous system. The Morning Readiness score correlates with day-to-day fluctuations in the nervous system for an individual, highlighting to the user when major changes may have occurred in the body, based on the user’s own unique individual patterns.
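The formula behind the Morning Readiness score is not disclosed, so the following is an illustrative assumption only: a 1-10 score derived from how far today's log-transformed RMSSD deviates from the user's own recent baseline, with both unusually low and unusually high readings pulling the score down:

    import numpy as np

    def morning_readiness(history_rmssd_ms, today_rmssd_ms, window=7):
        """history_rmssd_ms: recent daily RMSSD values (ms); returns a 1-10 score."""
        baseline = np.log(np.asarray(history_rmssd_ms[-window:], dtype=float))
        mu, sigma = baseline.mean(), max(float(baseline.std()), 1e-3)
        z = (np.log(today_rmssd_ms) - mu) / sigma     # deviation from the personal norm
        score = 10.0 - 3.0 * abs(z)                   # symmetric penalty for imbalance
        return int(round(min(10.0, max(1.0, score))))

Under this assumption, a reading close to the personal baseline scores near 10, while a large day-to-day swing in either direction scores low, which is consistent with describing the score as highlighting major changes relative to the user's own patterns.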
[0298] In some embodiments, the HRV system may create a user profile that is unique to an individual. Using ML, the HRV system may integrate an individual user’s biomarker data to create a “biomarker fingerprint” that is unique to the individual user. This unique “biomarker fingerprint” may be operative to identify an individual using biomarker data collected from one or more sensors, including still images and/or video images.
[0299] FIG. 25 presents a view of the display of the data integration connections for the user device consistent with certain embodiments of the present invention. In this embodiment, at 2506 the user may specify one or more exterior cloud integrations, such as, in a non-limiting example, a Fitbit or Google Fit cloud-based data set, with which the HRV monitoring device may connect. The exterior cloud based data may integrate with the HRV monitoring system to receive, transmit, and exchange heart rate, heart rate variability, exercise regimen, nutrition data, and any other data that assists in monitoring and managing the health of the user. In a non-limiting example, the user may configure the HRV monitoring system to transmit data to any selected exterior cloud based data integrator, or to an exterior monitoring device maintained by a medical practitioner of the user’s choice.
[0300] FIG. 26 presents a view of the display of the historical log for a user consistent with certain embodiments of the present invention. In this embodiment, at 2608 the historical log presents data to the user to permit a visual tracking of the morning readiness score, sleep statistics, exercise statistics, and other health information. The data presented may be configured by the user to provide that information that is most useful to the user in monitoring trends for these health-related statistics over time.
[0301] FIG. 27 presents a view of the historical trends for HRV statistics for a user consistent with certain embodiments of the present invention. In this embodiment, at 2710 the displays presented to the user provide a snapshot of statistical information in chart form, permitting the user to understand changes in their health readings and health related statistics over time. In a non-limiting example, a portion of the display presents a chart of a coefficient of variation for the HRV data readings expressed as a percentage and a second portion of the display presents a chart of the total power readings captured over time. However, this example should not be considered limiting as the user may configure the display to present any other statistical measure captured by the system over time. The selected statistical data may then be charted and presented to the user on this display when requested.
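For reference, the coefficient of variation charted in this view is a standard statistic; a minimal computation, offered only as an illustration and not as the system's internal implementation, is:

    import numpy as np

    def coefficient_of_variation_pct(readings):
        readings = np.asarray(readings, dtype=float)
        return 100.0 * readings.std(ddof=1) / readings.mean()

    # e.g., daily HRV scores of [62, 58, 65, 60, 59] give a CV of roughly 4.6 %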
[0302] FIG. 28 presents a view of the connection capability for the sensors associated with the HRV monitoring system consistent with certain embodiments of the present invention. In this embodiment, at 2812 the user would open this display to begin the process of selecting the desired Heart Rate (HR) monitoring sensor and performing the connection actions to place the HR monitoring system in readiness to perform a data reading. The HR system presents the user with one or more sensors that have been discovered through an embedded short-range wireless communication protocol, such as, in a non-limiting example, Bluetooth Low-Energy (BLE). The user may then select the sensor they intend to use for the HRV data reading by selecting the sensor name on the display screen. The user is also presented with a troubleshooting capability to resolve issues when a sensor does not connect or indicates errors or issues with connection. Upon an indication of connectivity and readiness, the user may proceed to use the selected HR monitoring sensor to capture the HRV data reading.
[0303] FIG. 29 presents a view of the historical trends for HRV statistics for a user related to morning readiness scores and HRV values consistent with certain embodiments of the present invention. In this embodiment, at 2914 this display presents charts of the morning readiness and HRV data readings over a selected span of time. The user may choose the length of time for the charted information from an icon on the screen indicating the desired time span. The user may also choose to change the chart time span to move from one timespan to another by selecting a different time span icon on the display screen, allowing the user to compare short-term and long-term trends.
[0304] FIG. 30 presents a view of the detailed data values for HRV statistics for a user related to morning readiness scores and HRV values consistent with certain embodiments of the present invention. In this embodiment, at 3016 the user may be presented with an indicator that displays the morning readiness score as a relative measure between sympathetic and parasympathetic conditions to provide a relative balance indicator. This display also provides the user with heart rate and HRV data readings and charts intra-reading values that may be interpolated from the HRV data readings. This detailed information display provides the user with a view into the actual variability in the intra-beat heart rate data.
[0305] FIG. 31 presents a view of the informational data for a user related to morning readiness scores and HRV values expressed as autonomic balance between the sympathetic and the parasympathetic systems consistent with certain embodiments of the present invention. In this embodiment, the displayed information 3118 is educational and informative in nature, providing the user with an understanding of how the morning readiness score indicates a balance between the sympathetic and parasympathetic condition of the user’s autonomic nervous system.
[0306] FIG. 32 presents a view of the historical trends for HRV statistics for a population related to morning readiness scores and HRV values consistent with certain embodiments of the present invention. In this embodiment, the user is presented at 3220 with metrics associated with the user and presented as a comparison with a filtered population. The user or a medical practitioner, a coach, or wellness practitioner may input metrics associated with a particular HRV score, age and gender to create a comparison between the input metrics and the filtered population as a whole.
[0307] FIG. 33 presents a view of the historical trends for HRV statistics for a population related to morning readiness scores and HRV values consistent with certain embodiments of the present invention. In this embodiment, shown as a continuation of the comparison view of FIG. 32, the user at 3320 is informed that the filtered population is filtered based upon morning readiness readings from all system users having more than 2 measurements stored within the morning readiness score database maintained by the system server. Once again, the information provided to the user on this screen is informational and is intended to educate the user on how and why age, gender, fitness level, and health can affect the user’s HRV.
[0308] FIG. 34 presents a view of the raw data captured for R-R intervals and HRV values consistent with certain embodiments of the present invention. In this embodiment, the user is presented with the actual data captured by the selected sensor, presented as a chart of values over time 3424. The R-R interval data is collected continuously over a reading time span of at least 1 minute, e.g., 60 seconds; the time span may be shorter than 2 minutes, or may be longer if greater accuracy is desired. The R-R intervals are reported in milliseconds and provide the basis for the determination of HRV, which is charted over the same time span and presented on a normalized scale of 1-100. From this raw data, the user or medical practitioner may gain a clearer view of the user’s HRV and the R-R intervals that contribute to the HRV values for the time span of the data reading.
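As a non-limiting sketch of how the charted values relate to the raw R-R intervals, RMSSD can be computed directly from the intervals and then mapped onto a 1-100 scale; the ln(RMSSD) scaling shown here is a common convention assumed for illustration, not the system's disclosed formula:

    import numpy as np

    def rmssd_ms(rr_intervals_ms):
        diffs = np.diff(np.asarray(rr_intervals_ms, dtype=float))
        return float(np.sqrt(np.mean(diffs ** 2)))

    def hrv_score_1_to_100(rr_intervals_ms):
        # ln(RMSSD) typically falls between about 0 and 6.5; stretch that range to 1-100.
        ln_rmssd = np.log(max(rmssd_ms(rr_intervals_ms), 1e-6))
        return int(round(min(100.0, max(1.0, ln_rmssd * 100.0 / 6.5))))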
[0309] FIG. 35 presents a view of the relationship between R-R intervals and HRV values consistent with certain embodiments of the present invention. FIG. 35 shows live R-R intervals during a reading, as well as “real time” heart rate and HRV values. In this embodiment, the user or a medical practitioner, a coach, or other professional is presented with a comparative chart of a user’s heart rate and the associated variability in that heart rate (HRV) for a particular data reading 3526. From this view, the user or the medical practitioner may derive a better understanding of the amount of variability the user is experiencing in their monitored heart beats, even though the number of beats and timing may be well within physical norms for the user’s age, gender, fitness level, and weight. This particular display may offer the user, or the user’s medical practitioner, some insight into whether steps should be taken to optimize the user’s HRV.
[0310] FIG. 36 presents a view of the display of HRV statistics and signal quality for a user post reading consistent with certain embodiments of the present invention. In an exemplary embodiment, this display presents the user with an operational view of the signal quality for received data from the sensor and the HRV and HR data values collected during a reading period 3628. The signal quality presents the user with a view of whether data artifacts are appearing in the recorded data and, if so, how many such artifacts have been detected. If the user is experiencing poor results from a data reading, the user may use this display to determine if signal quality or data artifacts are causing the poor data reading. If either the signal quality or data artifacts are causing an issue with collecting data measurements, the user may take steps to correct the issue.
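A minimal, illustrative way to quantify what this view reports, offered only as a sketch and not as the system's actual artifact detector, is to flag beat-to-beat jumps that exceed a fixed fraction of the preceding interval; the 25% threshold is an assumption:

    import numpy as np

    def count_artifacts(rr_intervals_ms, threshold=0.25):
        rr = np.asarray(rr_intervals_ms, dtype=float)
        jumps = np.abs(np.diff(rr)) / rr[:-1]        # relative beat-to-beat change
        return int(np.sum(jumps > threshold))

    def signal_quality(rr_intervals_ms):
        n = max(len(rr_intervals_ms) - 1, 1)
        return 1.0 - count_artifacts(rr_intervals_ms) / n   # 1.0 means no flagged beats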
[0311] FIG. 37 presents a view of the user feedback and tagging display consistent with certain embodiments of the present invention. In an exemplary embodiment, the user may utilize this input data view to tag a collected HRV data reading with a mood the user was feeling when the HRV data reading was recorded 3730. The user may also use this display to record notes as to physical feelings and edit information associated with the user’s interaction with caffeine, alcohol or other chemicals. The user may also add metadata associated with sleep, energy level, exercise, and/or soreness to add to the HRV data reading when it is stored within the HRV data reading database. This information, although somewhat subjective, may also assist the user in optimizing their heart rate variability over time.
[0312] FIG. 38 presents a view of the display of insights into a user’s HRV statistics over time consistent with certain embodiments of the present invention. In an exemplary embodiment, the user may visit this display page to review insights into the user’s condition over time 3832. Although the display presents the data as a weekly insights display, this time span should not be considered limiting, as the user may select different time spans over which to observe the data points presented on the display, once again in an effort to optimize the user’s HRV over time.
[0313] FIG. 39 presents a view of the composite reading and data collection process consistent with certain embodiments of the present invention. At 3900 a user, or a medical practitioner on behalf of a user, initiates a data collection action to collect HRV data, physiological data, biometric data, and environmental data. The initial data collection secures information about the user’s heartbeat R-R values, environmental data about the area in which the user is located, and other biometric and physiological data such as heart rate, blood pressure, oxygen levels, CO2 levels, glucose, ketones, general awareness or alertness, stress, reflex time, resilience, training or related capacity or capability, and video imagery. The data collected during the HRV reading is stored in the HRV system server electronic data store and combined with historical data and other information to create an initial composite score, comprising HRV data, environmental data, and other biometric data, for the user at 3902. At 3904 the HRV system may analyze the collected and accumulated historical data to create a planned event, intervention, or planned step to assist the user in achieving one or more expressed goals with regard to the composite score, and provide this guidance to the user. After the user has performed the communicated planned event or events, intervention, or planned steps, the user at 3906 performs another data collection action to update the initial data recordings and collect updated data on all sampled values subsequent to the user’s performance of the planned event, intervention, or planned step. At 3908 the HRV system may perform a calculation to update the composite score utilizing the collected data from the most recent data collection effort. At 3910 the HRV system analyzes the updated composite score to determine if the latest calculated composite score is above or below a threshold value or within a range that is indicated as desired for the user.
[0314] If the updated composite score does not meet the threshold value or fall within the range desired for the user, the HRV system updates the planned intervention for the user at 3912 by choosing or creating a modification to the previously recommended event, intervention, or planned step and returns this value to the HRV system server. The HRV system server then returns to process step 3904 to provide this information to the user. If the updated composite score meets or exceeds the threshold value or is within the desired range for the user, the HRV system provides updated feedback to the user on their composite score values and how the user is meeting their goals with regard to the established composite score. At 3916 the HRV system queries the user to determine if additional data collection and/or analysis is desired by the user, or by the medical practitioner associated with the user. If additional readings or analysis are desired, the HRV system returns to step 3904 to provide the updated modifications created for the user by the HRV system, and the user performs the remaining steps in the process utilizing the updated modifications. If no further steps are required, the HRV system at 3918 may produce a score validation for the user and create a final report for the current data collection readings and the user’s current state with regard to their expressed goals and/or the composite level goals established for the user by the HRV system.
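The control flow of FIG. 39 can be summarized as a loop. The sketch below is structural only; the reading, composite-score, and intervention-planning functions are trivial placeholders for steps the disclosure describes at a higher level, and the target range and values are illustrative assumptions:

    import random

    def collect_reading(user):                 # stand-in for steps 3900 / 3906
        return {"hrv": random.uniform(40, 90), "hr": random.uniform(55, 75)}

    def composite_score(history):              # stand-in for steps 3902 / 3908
        recent = history[-3:]
        return sum(r["hrv"] for r in recent) / len(recent)

    def plan_intervention(score, target):      # stand-in for steps 3904 / 3912
        low, _high = target
        return "breathing exercise" if score < low else "maintain current routine"

    def guidance_loop(user, target=(60.0, 100.0), max_rounds=5):
        history = [collect_reading(user)]                  # 3900: initial reading
        score = composite_score(history)                   # 3902: initial composite score
        plan = plan_intervention(score, target)            # 3904: planned event / step
        for _ in range(max_rounds):
            history.append(collect_reading(user))          # 3906: post-intervention reading
            score = composite_score(history)               # 3908: updated composite score
            if target[0] <= score <= target[1]:            # 3910: within desired range?
                return {"user": user, "score": score, "status": "goal met"}   # 3918
            plan = plan_intervention(score, target)        # 3912: revised intervention
        return {"user": user, "score": score, "status": "continue", "plan": plan}

    print(guidance_loop("demo-user"))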
[0315] FIG. 40 presents a view of the HRV system configuration consistent with certain embodiments of the present disclosure. In this embodiment, HRV data may be collected from a user through the use of any sensor or device configured to collect HRV data. The HRV data may be captured by attaching an ECG sensor 4000, a PPG sensor 4002, or a smart wearable device 403 to the user, or through the use of a camera 4004 or a smartphone 4006. The data captured by any sensor or device may be collected and transmitted as a stream of data in real-time, or may be collected and transmitted in a batch at a later time. Regardless of the method of collection or data transmission, the collected data is transmitted to the system data processor 4008. Within the system data processor 4008, a plurality of modules are active at 4010 to perform the HRV analysis described herein and to create parameters for review, as well as predictions and recommendations for the user. The data, predictions, and recommendations are then transmitted to any display device associated with the user at 4012 to display the information for consumption by the user.
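As a non-limiting sketch of this data flow only, a processor object can hold interchangeable analysis modules and publish their outputs toward a display; the class and module names below are illustrative assumptions, not elements disclosed in the figure:

    from dataclasses import dataclass, field
    from typing import Callable, Dict, List

    @dataclass
    class HrvProcessor:
        # Each module maps a list of R-R intervals (ms) to a single parameter value.
        modules: Dict[str, Callable[[List[float]], float]] = field(default_factory=dict)

        def process(self, rr_ms: List[float]) -> Dict[str, float]:
            return {name: fn(rr_ms) for name, fn in self.modules.items()}

    processor = HrvProcessor(modules={
        "mean_hr_bpm": lambda rr: 60000.0 / (sum(rr) / len(rr)),
        "rmssd_ms": lambda rr: (sum((b - a) ** 2 for a, b in zip(rr, rr[1:]))
                                / (len(rr) - 1)) ** 0.5,
    })
    # The print call stands in for pushing results to a display device.
    print(processor.process([812.0, 798.0, 845.0, 820.0, 805.0]))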
[0316] While certain illustrative embodiments have been described, it is evident that many alternatives, modifications, permutations and variations will become apparent to those skilled in the art in light of the foregoing description.
[0317] Various embodiments may be implemented at least in part in any conventional computer programming language. For example, some embodiments may be implemented in a procedural programming language (e.g., “C”), or in an object-oriented programming language (e.g., “C++”). Other embodiments may be implemented as a pre-configured, standalone hardware element and/or as pre-programmed hardware elements (e.g., application specific integrated circuits, FPGAs, and digital signal processors), or other related components.
[0318] In an alternative embodiment, the disclosed systems, devices, and methods may be implemented as a computer program product for use with a computer system, a smartphone, a smartwatch, and the like. Such implementation may include a series of computer instructions fixed either on a tangible, non-transitory medium, such as a computer readable medium (e.g., a diskette, CD-ROM, ROM, or fixed disk). The series of computer instructions can embody all or part of the functionality previously described herein with respect to the system.
[0319] Those skilled in the art should appreciate that such computer instructions can be written in a number of programming languages for use with many computer architectures or operating systems. Further, such instructions may be stored in any memory device, such as semiconductor, magnetic, optical or other memory devices, and may be transmitted using any communications technology, such as optical, infrared, microwave, or other transmission technologies.
[0320] Among other ways, such a computer program product may be distributed as a removable medium with accompanying printed or electronic documentation (e.g., shrink-wrapped software), preloaded with a computer system (e.g., on system ROM or fixed disk), or distributed from a server or electronic bulletin board over the network (e.g., the Internet or World Wide Web). In fact, some embodiments may be implemented in a software-as-a-service model (“SAAS”) or cloud computing model. Of course, some embodiments may be implemented as a combination of both software, e.g., a computer program product, and hardware. Still other embodiments are implemented as entirely hardware, or entirely software.
[0321] As used in any embodiment herein, the term “module” may refer to software, firmware and/or circuitry configured to perform any of the aforementioned operations. Software may be embodied as a software package, code, instructions, instruction sets and/or data recorded on non-transitory computer readable storage medium. Firmware may be embodied as code, instructions or instruction sets and/or data that are hard-coded (e.g., nonvolatile) in memory devices. “Circuitry”, as used in any embodiment herein, may comprise, for example, singly or in any combination, hardwired circuitry, programmable circuitry such as computer processors comprising one or more individual instruction processing cores, state machine circuitry, and/or firmware that stores instructions executed by programmable circuitry. The modules may, collectively or individually, be embodied as circuitry that forms part of a larger system, for example, an integrated circuit (IC), system-on-chip (SoC), desktop computers, laptop computers, tablet computers, servers, smart phones, etc.
[0322] Any of the operations described herein may be implemented in a system that includes one or more storage mediums having stored thereon, individually or in combination, instructions that when executed by one or more processors perform the methods. Here, the processor may include, for example, a server CPU, a mobile device CPU, and/or other programmable circuitry.
[0323] Also, it is intended that operations described herein may be distributed across a plurality of physical devices, such as processing structures at more than one different physical location. The storage medium may include any type of tangible medium, for example, any type of disk including hard disks, floppy disks, optical disks, compact disk read-only memories (CD-ROMs), compact disk rewritables (CD-RWs), and magneto-optical disks, semiconductor devices such as read-only memories (ROMs), random access memories (RAMs) such as dynamic and static RAMs, erasable programmable read-only memories (EPROMs), electrically erasable programmable read-only memories (EEPROMs), flash memories, Solid State Disks (SSDs), magnetic or optical cards, or any type of media suitable for storing electronic instructions. Other embodiments may be implemented as software modules executed by a programmable control device. The storage medium may be non- transitory.
[0324] As described herein, various embodiments may be implemented using hardware elements, software elements, or any combination thereof. Examples of hardware elements may include processors, microprocessors, circuits, circuit elements (e.g., transistors, resistors, memristors, quantum computing devices, capacitors, inductors, and so forth), integrated circuits, application specific integrated circuits (ASIC), programmable logic devices (PLD), digital signal processors (DSP), field programmable gate array (FPGA), logic gates, registers, semiconductor device, chips, microchips, chip sets, and so forth.
[0325] Reference throughout this specification to “one embodiment” or “an embodiment” means that a particular feature, structure, or characteristic described in connection with the embodiment is included in at least one embodiment. Thus, appearances of the phrases “in one embodiment” or “in an embodiment” in various places throughout this specification are not necessarily all referring to the same embodiment. Furthermore, the particular features, structures, or characteristics may be combined in any suitable manner in one or more embodiments.
[0326] The term "non-transitory" is to be understood to remove only propagating transitory signals per se from the claim scope and does not relinquish rights to all standard computer-readable media that are not only propagating transitory signals per se. Stated another way, the meaning of the term "non-transitory computer-readable medium" and "non-transitory computer-readable storage medium" should be construed to exclude only those types of transitory computer-readable media which were found in In Re Nuijten to fall outside the scope of patentable subject matter under 35 U.S.C. § 101.
[0327] The terms and expressions which have been employed herein are used as terms of description and not of limitation, and there is no intention, in the use of such terms and expressions, of excluding any equivalents of the features shown and described (or portions thereof), and it is recognized that various modifications are possible within the scope of the claims. Accordingly, the claims are intended to cover all such equivalents.
[0328] The illustrated and described systems and methods are in no way limiting. A person skilled in the art, in view of the present disclosures, will understand how to apply the teachings of one embodiment to other embodiments either explicitly or implicitly provided for in the present disclosures. Further, a person skilled in the art will appreciate further features and advantages of the present disclosure based on the above-described embodiments. Accordingly, the disclosure is not to be limited by what has been particularly shown and described, except as indicated by the appended claims.

Claims

CLAIMS
What is claimed is:
1. A method for training machine learning models to transform camera-based images into estimates of physiological biomarkers, comprising:
providing a time series of video frames collected from a region of a body of a subject that is associated with a blood volume change within a tissue of the body;
providing a time series of R-R intervals, the R-R intervals provided as ground truth data;
synchronizing the time series of the video frames and the time series of the R-R intervals to provide, as an input pair, a synchronized time series of video frames and time series of R-R intervals; and
training a machine learning model to estimate R-R intervals from the synchronized time series of video frames and time series of R-R intervals using the time series of R-R intervals as the ground truth data.
2. The method of claim 1, wherein:
the time series of video frames are recorded by finger-over-camera or face-over-camera video cameras; and
the time series of R-R intervals are calculated from inter-beat intervals measured by an ECG.
3. The method of claim 1, wherein the machine learning model is configured as a deep learning network.
4. The method of claim 3, wherein the deep learning network is a convolutional neural network.
5. The method of claim 1, wherein the training comprises supervised learning.
6. The method of claim 5, wherein the supervised learning is configured to operate without exact known features of the regions of the body of the subject.
7. The method of claim 1, wherein the synchronizing is performed before training the machine learning model by using one or more signal processing peak detection methods.
8. The method of claim 1, wherein: the synchronized time series of video frames and time series of R-R intervals comprises a first synchronized segment of the synchronized time series of video frames and time series of R-R intervals having a first duration; and the first synchronized segment may be subdivided into one or more subdivided synchronized segments having one or more durations.
9. The method of claim 8, wherein: a moveable time window is provided to subdivide the first synchronized segment into the one or more subdivided synchronized segments; and the one or more subdivided synchronized segments are used to train the machine learning model.
10. A method for training machine learning models to learn time-dependent patterns in a time-series signal, comprising:
providing a time domain metric value;
providing a time series of R-R intervals as ground truth data; and
training a machine learning model to determine a time domain metric value from the time series of R-R intervals.
11. The method of claim 10, wherein the time domain metric value comprises at least one of an HRV score or a RMSSD value.
12. The method of claim 10, wherein the machine learning model comprises a long short term memory model.
13. The method of claim 10, further comprising training the machine learning model to produce a frequency domain metric value.
14. The method of claim 10, further comprising training the machine learning model to produce a non-linear metric value.
15. The method of claim 10, wherein the machine learning model is configured as a deep learning network.
16. The method of claim 15, wherein the deep learning network is configured as a convolutional neural network.
17. The method of claim 10, wherein the machine learning model is a hybrid machine learning model.
18. The method of claim 17, wherein the hybrid machine learning model is trained by a stacking model.
19. The method of claim 18, wherein the stacking model comprises a convolutional neural network and a long short term memory model.
20. The method of claim 10, wherein the machine learning model is trained to detect and remove artifacts from the time series of R-R intervals.
21. The method of claim 20, wherein the artifacts are caused by one or more of motion, arrhythmias, premature ectopic beats, atrial fibrillation, measurement, or signal noise.
22. A method for training machine learning models to determine a subset of pixels in video frames to generate one or more biomarker values, comprising:
providing a time series of video frames collected from a region of a body of a subject that is associated with a blood volume change within a tissue of the body;
dividing each of the time series of video frames into cells of an N x M grid;
detecting one or more cells having the highest information content;
calculating estimates of the one or more biomarkers from each of the detected cells; and
training a machine learning model to fuse the estimates of the one or more biomarkers from each of the detected cells to generate the one or more biomarker values.
23. The method of claim 22, wherein the one or more biomarker values comprise at least one of R-R intervals or an HRV value.
24. The method of claim 22, wherein the dividing increases the signal to noise in the detected cells.
25. The method of claim 22, wherein the dividing reduces the amount of computation required to generate the one or more biomarker values.
26. A method for training machine learning models to determine a subset of pixels in video frames to generate one or more biomarker values, comprising:
providing a time series of video frames collected from a region of a body of a subject that is associated with a blood volume change within a tissue of the body;
using an attention mechanism to perform spatial segmentation in each of the time series of video frames;
detecting the spatial segmentation in each of the time series of video frames; and
training a machine learning model to calculate estimates of the one or more biomarkers from the spatial segmentation in each of the time series of video frames.
27. The method of claim 26, wherein the one or more biomarker values comprise at least one of R-R intervals or an HRV value.
28. The method of claim 26, wherein the spatial segmentation in each of the time series of video frames increases the signal to noise in the detected cells.
29. The method of claim 26, wherein the spatial segmentation in each of the time series of video frames reduces the amount of computation required to generate the one or more biomarker values.

