WO2018049531A1 - Systems, devices, and methods for biometric assessment


Info

Publication number
WO2018049531A1
Authority
WO
WIPO (PCT)
Application number
PCT/CA2017/051091
Other languages
French (fr)
Inventor
Eric Reiher
Frederic Benard
Nicolas HENIN
Dominik POGORZELSKI
Original Assignee
Omsignal Inc.
Application filed by Omsignal Inc. filed Critical Omsignal Inc.
Priority to EP17849972.9A priority Critical patent/EP3515301A4/en
Priority to CA3046375A priority patent/CA3046375A1/en
Publication of WO2018049531A1 publication Critical patent/WO2018049531A1/en

Classifications

    • A: HUMAN NECESSITIES
    • A61: MEDICAL OR VETERINARY SCIENCE; HYGIENE
    • A61B: DIAGNOSIS; SURGERY; IDENTIFICATION
    • A61B 5/00: Measuring for diagnostic purposes; Identification of persons
    • A61B 5/02: Detecting, measuring or recording pulse, heart rate, blood pressure or blood flow; Combined pulse/heart-rate/blood pressure determination; Evaluating a cardiovascular condition not otherwise provided for, e.g. using combinations of techniques provided for in this group with electrocardiography or electroauscultation; Heart catheters for measuring blood pressure
    • A61B 5/0205: Simultaneously evaluating both cardiovascular conditions and different types of body conditions, e.g. heart and respiratory condition
    • A61B 5/08: Detecting, measuring or recording devices for evaluating the respiratory organs
    • A61B 5/083: Measuring rate of metabolism by using breath test, e.g. measuring rate of oxygen consumption
    • A61B 5/103: Detecting, measuring or recording devices for testing the shape, pattern, colour, size or movement of the body or parts thereof, for diagnostic purposes
    • A61B 5/11: Measuring movement of the entire body or parts thereof, e.g. head or hand tremor, mobility of a limb
    • A61B 5/1118: Determining activity level

Definitions

  • users can use the device 200 to monitor one or more exercises, including but not limited to walking, running, jogging, dancing, biking, stair climbing, tennis, basketball, soccer, and racquetball.
  • the device 200 can compute one or more ventilatory thresholds, and/or determine one or more metabolic zones/training zones, of the user.
  • the user can learn his metabolic zones/training zones without performing any dedicated test, as long as he trains in all the zones.
  • a graph and/or other indication of ventilatory thresholds can be produced for each individual user to help him understand and train in the right zone.
  • these zones can also change over time, such as due to the user's improved conditioning, so the device 200 can keep track of trends to make sure the user stays on the right track for optimal workouts and improvement.
  • the device 200 can provide biometric assessment for users to monitor their everyday activities.
  • the device 200 can detect abnormal heart beat using the cardiac sensor 220.
  • the device 200 can detect abnormal respiratory activities using the breathing sensor 210.
  • the device 200 can calculate metrics that indicate the fatigue level of the user, thereby alerting the user to take a break from work if needed.
  • FIG. 3 shows a schematic of a device 300 with wireless capabilities for biometric assessment.
  • the device 300 includes a breathing sensor 310, a cardiac sensor 320, and (optionally, as indicated by dotted lines) an accelerometer 330.
  • a processor 340 is operably coupled to the sensors 310-330 to process the data acquired by the sensors 310-330.
  • non-transitory computer-readable media include, but are not limited to: magnetic storage media such as hard disks, floppy disks, and magnetic tape; optical storage media such as Compact Disc/Digital Video Discs (CD/DVDs), Compact Disc-Read Only Memories (CD-ROMs), and holographic devices; magneto-optical storage media such as optical disks; carrier wave signal processing modules; and hardware devices that are specially configured to store and execute program code, such as Application-Specific Integrated Circuits (ASICs), Programmable Logic Devices (PLDs), Read-Only Memory (ROM) and Random-Access Memory (RAM) devices.
  • Other embodiments described herein relate to a computer program product, which can include, for example, the instructions and/or computer code discussed herein.
  • the external device 450 can include a smartphone.
  • the three sensors 410-430 can transmit the acquired data to the external device 450 for biometric assessment continuously, periodically, or on demand (e.g., in response to a request from the external device 450).
  • the external device 450 can also control the operation of the sensors 410-430.
  • the smartphone can control the breathing sensor 410 to increase the data acquisition rate (also referred to as sampling rate) when the user's breathing rate increases (e.g., during high intensity training).
  • the smartphone can display the biometric assessment for the user so as to allow convenient monitoring of the user's activities.
  • the user is presented in real time with a plot of HR versus time. In some embodiments, the user is presented in real time with a plot of the metabolic zone versus time. These plots can make use of the personalized HR1 and HR2 thresholds (e.g., for a single exercise session). In some embodiments, the HR can be averaged per zone. In some embodiments, the evolution of these metrics over time (e.g., for multiple exercise sessions) can also be assessed.
  • the effort index generated using the model constructed or trained by the method 1200 can have numerous applications.
  • the user can conveniently monitor his effort during training and can adjust his training based on the predicted effort index.
  • the effort index can be used to more objectively characterize the level of fatigue of the user, as discussed in detail below.
  • FIG. 13 illustrates a method 1300 of providing training information for the user.
  • fatigue data from previous training/exercise session(s) is retrieved.
  • instructions for the next training/exercise session are provided based at least in part on the fatigue data retrieved at step 1310.
  • the fatigue data can be estimated using the method 1200 illustrated at FIG. 12.
  • a user can make multiple steps per breath.
  • 100% breathing rhythm can refer to substantially complete synchronization between steps and breathing.
  • each of the user's inhalations and exhalations can occur every three steps.
  • the breathing pattern in this example is 3:3.
  • a breathing rhythm of 40% can be assessed if, 40% of the time, the user's steps and breathing are aligned.
  • the synchronization can be quantified by the average phase shift (e.g., between 0 and 2π) between the steps and the breath.
  • the synchronization can be quantified by the average time shift (e.g., in the unit of seconds) between the steps and the breath.
  • the method 1400 can also include the optional step of acquiring global positioning system (GPS) data during the training, at step 1460.
  • the GPS data can be used to provide distance and speed information for the user.
  • the user can be presented with the breath/step pattern, along with a suggestion to modify the breath/step pattern based on one or more factors such as, but not limited to, optimization (e.g., recommend that for every 5 steps, the user inhales on the first 3 steps and exhales on the last 2 steps, or in a 3/2 pattern), the zone the runner is in (e.g., recommend that a runner in the anaerobic zone inhale/exhale in a 1/1 pattern), and/or the like.
  • a computer may have one or more input and output devices. These devices can be used, among other things, to present a user interface. Examples of output devices that can be used to provide a user interface include printers or display screens for visual presentation of output and speakers or other sound generating devices for audible presentation of output. Examples of input devices that can be used for a user interface include keyboards, and pointing devices, such as mice, touch pads, and digitizing tablets. As another example, a computer may receive input information through speech recognition or in other audible format.
  • such computers may be interconnected by one or more networks in any suitable form, including a local area network or a wide area network, such as an enterprise network, an intelligent network (IN) or the Internet. Such networks may be based on any suitable technology and may operate according to any suitable protocol and may include wireless networks, wired networks or fiber optic networks.
  • the terms "program" or "software" are used herein in a generic sense to refer to any type of computer code or set of computer-executable instructions that can be employed to program a computer or other processor to implement various aspects of embodiments as discussed above. Additionally, it should be appreciated that according to one aspect, one or more computer programs that when executed perform methods of the present invention need not reside on a single computer or processor, but may be distributed in a modular fashion amongst a number of different computers or processors to implement various aspects of the present invention.
  • Computer-executable instructions may be in many forms, such as program modules, executed by one or more computers or other devices. Generally, program modules include routines, programs, objects, components, data structures, etc. that perform particular tasks or implement particular abstract data types. Typically the functionality of the program modules may be combined or distributed as desired in various embodiments.
  • the phrase "at least one," in reference to a list of one or more elements, should be understood to mean at least one element selected from any one or more of the elements in the list of elements, but not necessarily including at least one of each and every element specifically listed within the list of elements and not excluding any combinations of elements in the list of elements.
  • This definition also allows that elements may optionally be present other than the elements specifically identified within the list of elements to which the phrase "at least one" refers, whether related or unrelated to those elements specifically identified.


Abstract

A device includes a breathing sensor that is configured to obtain breathing data of a user during use. The device also includes a cardiac sensor that is configured to obtain heart rate data of the user during use. The device also includes a processor that is communicatively coupled to the breathing sensor and the cardiac sensor. The processor is configured to calculate a biological metric based at least in part on the breathing data and the heart rate data. The processor can estimate a metabolic parameter of the user based on the biological metric. In some embodiments, the device can also include a motion sensor that is communicatively coupled to the processor to obtain acceleration data of the user during use. In such embodiments, the processor is configured to calculate the biological metric based at least in part on the breathing data, the heart rate data, and the acceleration data.

Description

SYSTEMS, DEVICES, AND METHODS FOR BIOMETRIC ASSESSMENT
CROSS-REFERENCE TO RELATED APPLICATIONS
[001] This application claims priority to and benefit of U.S. Provisional Application No. 62/395,643 entitled "Systems, Apparatuses, and Methods for Biometric Assessment," filed September 16, 2016, the disclosure of which is incorporated herein by reference in its entirety.
BACKGROUND
[002] It is generally understood that, during exercise/training, human beings can be in one of three zones: an endurance zone, a race zone, and a sprint zone. FIG. 1 illustrates these three zones using ventilation (middle curve) as a function of exercise intensity. Ventilation can be characterized as the product of breathing rate and breathing depth (i.e., BR×BD). Breathing rate (BR) can be characterized as the number of breaths per unit time (e.g., per minute) and breathing depth (BD, sometimes also referred to as tidal volume) can be characterized as the amount of air that moves in and out of the lungs, and/or is exchanged by the lungs, with each breath.
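The product above can be expressed as a one-line calculation; the function name and units below are illustrative assumptions, not part of the disclosure:

```python
def ventilation(breathing_rate_bpm: float, breathing_depth_l: float) -> float:
    """Minute ventilation (L/min) as breathing rate (breaths/min) times
    breathing depth, i.e. tidal volume (L/breath)."""
    return breathing_rate_bpm * breathing_depth_l
```

For example, 15 breaths per minute at a tidal volume of 0.5 L gives a minute ventilation of 7.5 L/min.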
[003] FIG. 1 shows that in general, as exercise intensity increases, ventilation also increases. The increase in ventilation can be divided into the three zones broadly demarcated by two inflection points, also referred to as ventilatory thresholds (VTs): VT1 and VT2. VT1 is the point where ventilation starts to increase at a faster rate than that of the volume of oxygen consumed by the person (VO2). The zone at low exercise intensity before VT1 is usually referred to as the endurance zone (sometimes also referred to as the aerobic zone). Beyond VT1, ventilation and VO2 increase along a straight line at a different rate compared to the straight line prior to VT1. This zone is usually referred to as the race zone. As exercise intensity continues to increase, a second inflection point VT2 occurs, where ventilation again increases sharply. The zone after VT2 is usually referred to as the sprint zone (sometimes also referred to as the anaerobic zone). The second inflection point VT2 is sometimes also referred to as the anaerobic threshold (AT).
[004] VTs can be informative critical points in exercise physiology because they are considered to be linked to increasing levels of anaerobic metabolism. Below VT1, aerobic metabolism can be the primary source of energy generation. Between VT1 and VT2, aerobic metabolism alone may not provide enough energy and some amount of energy usually comes from anaerobic metabolism. There is usually an eventual build-up of anaerobic metabolites in this zone. Above VT2, the primary source of energy is anaerobic metabolism. As a result, fast build-up of anaerobic metabolites such as lactate can occur, inducing a relatively quick onset of fatigue.
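As a rough sketch of how the two inflection points might be located in sampled data, the following picks the two largest increases in the slope of ventilation versus intensity. This is only a discrete stand-in for the behavior described above; a real detector would smooth the data and fit a piecewise-linear model, so the function and its parameters are illustrative assumptions:

```python
def find_ventilatory_thresholds(intensity, ventilation, min_gap=3):
    """Locate VT1 and VT2 as the two intensities where the slope of
    ventilation vs. intensity increases the most (a simple proxy for
    the two inflection points)."""
    slopes = [(ventilation[i + 1] - ventilation[i]) / (intensity[i + 1] - intensity[i])
              for i in range(len(intensity) - 1)]
    jumps = [slopes[i + 1] - slopes[i] for i in range(len(slopes) - 1)]
    # segment indices ranked by how much the slope increases, largest first
    order = sorted(range(len(jumps)), key=lambda i: jumps[i], reverse=True)
    first = order[0]
    # keep the second inflection at least min_gap segments away from the first
    second = next(i for i in order[1:] if abs(i - first) >= min_gap)
    i1, i2 = sorted((first, second))
    return intensity[i1 + 1], intensity[i2 + 1]
```

On synthetic data whose slope changes at two known intensities, the function returns those two intensities as (VT1, VT2).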
[005] Training at the exact VT point can optimize the development of the endurance of an athlete. It has also been shown that too much training between VT1 and VT2 can result in over-training and chronic injuries. As a result, it can be helpful for a person to be guided to perform the majority of his or her training below VT1, with only a minor percentage above VT2, in order to induce significant training adaptations.
[006] Conventionally, VTs are calculated during an incremental exercise test, in which a person performs exercise of monotonically increasing intensity. In addition, talking is typically not allowed, since talking may artificially increase the breathing rate, thereby influencing the accuracy of the ventilation measurement. Such calculations in a controlled environment (e.g., in a laboratory) can be impractical to implement during everyday training of users.
[007] Alternatively, heart rate can also be used to define the VTs. For example, exercises at heart rates below a first threshold HR1 can be regarded as in the endurance zone. Exercises at heart rates between the first threshold HR1 and a second threshold HR2 (HR2 > HR1) can be regarded as in the race zone, and exercises at heart rates above HR2 can be regarded as in the sprint zone.
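The heart-rate-based zone definition above amounts to a two-threshold classifier; the function name and the example threshold values used below are illustrative assumptions:

```python
def metabolic_zone(hr: float, hr1: float, hr2: float) -> str:
    """Classify a heart-rate sample (bpm) into one of the three zones
    described above, given user-specific thresholds hr1 < hr2."""
    if hr < hr1:
        return "endurance"
    if hr < hr2:
        return "race"
    return "sprint"
```

For a user with HR1 = 140 bpm and HR2 = 165 bpm, a reading of 150 bpm falls in the race zone.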
SUMMARY
[008] Systems, devices and methods for biometric assessment are described herein. In some embodiments, a device includes a breathing sensor that is configured to obtain breathing data of a user during use. The device also includes a cardiac sensor that is configured to obtain heart rate data of the user during use. The device also includes a processor that is communicatively coupled to the breathing sensor and the cardiac sensor. The processor is configured to calculate a biological metric based at least in part on the breathing data and the heart rate data. The processor can estimate a metabolic parameter of the user based on the biological metric. In some embodiments, the device can also include a motion sensor that is communicatively coupled to the processor to obtain acceleration data of the user during use. In such embodiments, the processor is configured to calculate the biological metric based at least in part on the breathing data, the heart rate data, and the acceleration data.
BRIEF DESCRIPTION OF DRAWINGS
[009] FIG. 1 is a plot depicting the relationship between exercise intensity and ventilation for a user, in accordance with some embodiments.
[0010] FIG. 2 illustrates a device for biometric assessment, in accordance with some embodiments.
[0011] FIG. 3 illustrates a device for biometric assessment with wireless capabilities, in accordance with some embodiments.
[0012] FIG. 4 illustrates an example wearable system for biometric assessment, in accordance with some embodiments.
[0013] FIG. 5 is a graph illustrating example ventilatory thresholds and metabolic zones, in accordance with some embodiments.
[0014] FIG. 6 illustrates a method for biometric assessment, according to some embodiments.
[0015] FIG. 7 illustrates a method of estimating ventilatory thresholds, according to some embodiments.
[0016] FIG. 8 illustrates a method of calculating ventilatory thresholds using different exercises, according to some embodiments.
[0017] FIG. 9 illustrates a method of calculating ventilatory thresholds with detection and filtering of noise data while the user is talking, according to some embodiments.
[0018] FIG. 10 illustrates a method of calculating effort index of exercises, according to some embodiments.
[0019] FIG. 11 illustrates a method of constructing and/or training a model to calculate effort index, according to some embodiments.
[0020] FIG. 12 illustrates a method of estimating fatigue using calculated effort indices, according to some embodiments.
[0021] FIG. 13 illustrates a method of providing training information for the user, according to some embodiments.
[0022] FIG. 14 illustrates a method of estimating synchronization between breathing and step patterns, according to some embodiments.
[0023] FIG. 15 illustrates a method of estimating breath/step patterns, according to some embodiments.
DETAILED DESCRIPTION
[0024] In some embodiments disclosed herein, a user's heart rate (HR) can be measured versus ventilation and two inflection points can be identified corresponding to the VT1 and VT2 thresholds specific to the user. Then the HR1 and HR2 can be identified that characterize the metabolic zones for that user. Each user can have different VT1 and VT2, and accordingly, different HR1 and HR2. The VT1 and/or VT2 thresholds can change over time, such as due to training, aging, and/or the like.
[0025] Aspects disclosed herein include systems, devices, and methods for biometric assessment. The biometric assessment takes into account cardiac information such as heart rate (HR), and breathing information (B) such as breathing rate, to estimate one or more ventilatory thresholds (THs) that define the different metabolic zones, i.e., TH = f(HR, B). In some embodiments, TH estimation can be carried out before training. In some embodiments, TH estimation can be carried out during training.
[0026] In some embodiments, movement information, such as acceleration information (A), can be used to filter out/exclude data that is taken when the user is at rest. In such embodiments, the estimated thresholds are a function of the breathing information, heart rate, and movement, e.g., TH=f(HR, B, A). It is understood that while embodiments described herein can refer to the use of sensors for measuring heart rate, breathing, and movement/acceleration, unless explicitly indicated otherwise, the use of a sensor for movement/acceleration can be optional.
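The at-rest filtering described above can be sketched by dropping samples whose acceleration magnitude falls below a threshold. The sample layout `(hr, br, (ax, ay, az))`, the units (gravity-removed acceleration in g), and the threshold value are illustrative assumptions:

```python
import math

def filter_resting_samples(samples, rest_threshold=0.1):
    """Keep only (hr, br) pairs recorded while the user is moving, i.e.
    while the acceleration magnitude is at or above rest_threshold."""
    active = []
    for hr, br, (ax, ay, az) in samples:
        magnitude = math.sqrt(ax * ax + ay * ay + az * az)
        if magnitude >= rest_threshold:
            active.append((hr, br))
    return active
```

With this sketch, a sample taken with near-zero acceleration is excluded from the threshold estimation, as the paragraph above suggests.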
[0027] After estimation of the threshold(s), real-time calculation of metrics (M) can be performed during training (i.e., M = f(HR, B)) so as to allow a user to know precisely the metabolic zone of his training. Some non-limiting examples of the metric (M) include time spent in each metabolic zone (also referred to as a "training zone" herein) (e.g., endurance zone, sprint zone, and race zone) during a training session, cumulative time spent in each metabolic zone per week and/or per month, average speed and/or pace when a user is in a particular metabolic zone, average speed and/or pace when a user is just under the VT1 threshold (e.g., within 10 beats per minute of it), and/or combinations thereof. In some embodiments, the metrics M can be visually presented, as graphs for example, for the user to visually review his metabolic zones. In some embodiments, aspects disclosed herein permit a user to precisely track his metrics M, and hence his training, over an extended period of time (e.g., more than half a year, more than a year, or even longer). The measurements of heart rate, breathing, and (optionally) acceleration can be further combined to increase the insights into the user's physical condition in several different ways, as discussed in more detail below. In some embodiments, the metric M can be any suitable mathematical formulation based on HR and B such as, for example, a sum, a weighted sum, a product, a weighted product, an exponentially weighted moving average of HR, B, A, or a combination thereof, and/or the like. In some embodiments, the metric M can account and/or normalize for any suitable additional information such as, for example, movement (e.g., acceleration, as already noted herein), physiological parameters such as weight and/or height, and/or the like.
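Two of the metrics named above, time spent in each metabolic zone and an exponentially weighted moving average, can be sketched as follows; the function names, the fixed sampling interval, and the smoothing factor are illustrative assumptions:

```python
def time_in_zones(hr_series, hr1, hr2, dt_s=1.0):
    """Seconds spent in each metabolic zone during a session, given a
    heart-rate series sampled every dt_s seconds and thresholds hr1 < hr2."""
    totals = {"endurance": 0.0, "race": 0.0, "sprint": 0.0}
    for hr in hr_series:
        zone = "endurance" if hr < hr1 else ("race" if hr < hr2 else "sprint")
        totals[zone] += dt_s
    return totals

def ewma(series, alpha=0.2):
    """Exponentially weighted moving average, one of the smoothing
    formulations mentioned above for HR/B/A signals."""
    out, prev = [], None
    for x in series:
        prev = x if prev is None else alpha * x + (1 - alpha) * prev
        out.append(prev)
    return out
```

A per-week cumulative version would simply sum the per-session dictionaries returned by `time_in_zones`.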
[0028] FIG. 2 shows a schematic of a device 200 for biometric assessment, according to some embodiments. The device 200 includes a breathing sensor 210 configured to acquire breathing data (sometimes also referred to as respiratory data and/or ventilatory data) of a user, a cardiac sensor 220 to acquire heart rate data, and (optionally, as indicated by dotted lines) a motion sensor 230, such as an accelerometer for example (referred to herein as the accelerometer 230, for simplicity), to measure movement of a user during exercise. The breathing data can include, for example, breathing rate (BR), breathing depth (BD), and/or ventilation (BR×BD). The heart rate data can include heart rate (HR) and/or heart rate variation (HRV). The sensors 210-230 are operably coupled and/or communicatively coupled to a processor 240, which is configured to receive and process data provided by the sensors 210-230 so as to calculate various metrics for the user.
[0029] The breathing sensor 210 can use any suitable technique to sense user breath/acquire the breathing data. In some embodiments, the breathing sensor 210 can use respiratory inductance plethysmography (RIP) to acquire the breathing data. For example, the breathing sensor 210 can include an electric conductor, such as a wire for example, that surrounds at least part of the torso of the user to monitor ventilation by measuring the cross sectional area of the chest and/or abdomen. Respiratory movement of the thorax can change the size of the conductor, thereby changing the inductance of the conductor. A detector can be used to detect the change of the inductance so as to acquire breathing data, including breathing rate, breathing depth, and ventilation, among others. In some embodiments, the breathing sensor 210 is configurable for acquisition rates up to 25 times/second, or about every 0.04 seconds. More information on breathing sensors using RIP technology can be found in U.S. Patent Application Publication No. 20100286546 A1, the disclosure of which is incorporated herein by reference in its entirety.
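As a rough illustration of how a detector might turn a sampled RIP signal into a breathing rate, the sketch below counts inhalation peaks (local maxima above the signal mean). The peak criterion and sampling-rate handling are simplifying assumptions, not the patent's method; a real detector would filter noise and reject motion artifacts:

```python
def breathing_rate_from_rip(signal, fs_hz=25.0):
    """Estimate breathing rate (breaths/min) from a RIP inductance signal
    sampled at fs_hz by counting local maxima above the signal mean."""
    mean = sum(signal) / len(signal)
    peaks = 0
    for i in range(1, len(signal) - 1):
        if signal[i] > mean and signal[i] > signal[i - 1] and signal[i] >= signal[i + 1]:
            peaks += 1
    duration_min = len(signal) / fs_hz / 60.0
    return peaks / duration_min
```

Breathing depth could be estimated similarly from the peak-to-trough amplitude of the same signal, once the inductance change is calibrated to volume.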
[0030] In some embodiments, the breathing sensor 210 using RIP technology includes a wire surrounding the torso of the user to form a loop. In some embodiments, the breathing sensor 210 using RIP technology includes a wire surrounding the torso of the user to form a partial loop.
[0031] In some embodiments, the breathing sensor 210 can use magnetic induction monitoring, a non-contact technique to assess cardiorespiratory activity by measuring the impedance distribution within the thorax. In such embodiments, the breathing sensor 210 can include a sensor coil attached to the body of the user. The sensor coil can be driven by an alternating current to excite an alternating magnetic field, B1. B1 induces eddy currents within the thorax. These eddy currents, in turn, can excite another alternating magnetic field, B2, whose size and orientation depends on the thoracic impedance distribution. Since the distribution of the thoracic impedance varies with physiological activity, the B2-field also varies. Therefore, information on cardiorespiratory activity can be obtained by measuring the variation of B2.
[0032] In some embodiments, the secondary magnetic field B2 can be measured by a secondary coil using a traditional transformer model with mutual inductance M12. In some embodiments, the sensor coil can be attached on the back of the user. In some embodiments, the sensor coil can be attached on the chest of the user. In some embodiments, the sensor coil can be attached on the abdomen of the user.
[0033] In some embodiments, the breathing data includes an indication of how many times the user took a "big breath", generally characterized as a breath that is longer in duration than others, that results in greater air exchange than others, and/or the like.
[0034] The cardiac sensor 220 in the device 200 can be configured to acquire heart rate information and/or heart rate variation information of the user. In some embodiments, the cardiac sensor 220 can include an electrocardiogram sensor (sometimes also referred to as an EKG sensor or ECG sensor) to monitor electrical activities of the user's heart. The ECG sensor can include electrodes placed on the skin of the user to measure the rate/rhythm of the heartbeat. In some embodiments, the electrodes are placed on the chest of the user. In some embodiments, the electrodes are placed on the abdomen of the user. In some embodiments, the electrodes are placed on the shoulder(s) of the user.
[0035] In some embodiments, the cardiac sensor 220 can use optical methods to measure the heart rate and/or the heart rate variation. In some embodiments, the cardiac sensor 220 can include a light source to provide a light beam incident on the user's skin. The light beam then gets reflected from components like tissue, bones, veins, and arteries, among others. The cardiac sensor 220 can further include a detector to detect the reflected light. When the blood volume at the position where the light beam is illuminating changes due to heartbeat, the amount of light reflected back also changes. Therefore, the change in the reflected light can translate the heartbeat into the electrical domain and the profile of the detected signal is usually referred to as a photoplethysmography or PPG signal.
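Once individual heartbeats have been located in the PPG (or ECG) signal, heart rate and a simple heart rate variation measure can be derived from the beat-to-beat intervals. The sketch below assumes beat timestamps are already available from an upstream detector; the function name and the choice of standard deviation as the variation measure are illustrative assumptions:

```python
def hr_and_hrv_from_beats(beat_times_s):
    """Derive heart rate (bpm) and a simple heart rate variation measure
    (standard deviation of beat-to-beat intervals, in seconds) from a list
    of beat timestamps in seconds."""
    intervals = [b - a for a, b in zip(beat_times_s, beat_times_s[1:])]
    mean_rr = sum(intervals) / len(intervals)
    hr_bpm = 60.0 / mean_rr
    variance = sum((rr - mean_rr) ** 2 for rr in intervals) / len(intervals)
    return hr_bpm, variance ** 0.5
```

Perfectly regular beats one second apart, for instance, yield 60 bpm with zero variation.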
[0036] In some embodiments, the light beam can include a visible light beam. In some embodiments, the light beam can include a near infrared (IR) beam. In some embodiments, the light beam can include an IR beam. In some embodiments, the light beam can be provided by one or more light emitting diodes (LEDs). In some embodiments, the light beam can be provided by a laser.
[0037] In some embodiments, the light beam is incident on the wrist of the user. In some embodiments, the light beam is incident on the user's finger. In some embodiments, the light beam is incident on any other appropriate area of the user.
[0038] In some embodiments, the cardiac sensor 220 can include a transmission-type optical heart rate monitor. In this case, the cardiac sensor 220 can include a light source to provide a light beam illuminating the skin of the user. The cardiac sensor 220 further includes a detector measuring light that transmits through the skin. When the blood volume increases or decreases due to heartbeat, the amount of light received by the detector decreases or increases accordingly. In some embodiments, this transmission-type optical heart rate monitor can be placed on a finger tip of the user.
[0039] In some embodiments, the cardiac sensor 220 includes its own processor (not shown in FIG. 2) to process the raw data (e.g., optical signals acquired by the detectors in optical cardiac sensors, or electric signal picked up by electrodes in strap-type heart rate monitors). The cardiac sensor 220 then transmits the processed data or an indication thereof, such as heart rate or heart rate variation to the processor 240 for biometric assessment.
[0040] In some embodiments, the cardiac sensor 220 transmits the raw data directly to the processor 240, which then estimates heart rate and/or heart rate variation from the raw data. The estimated heart rate and heart rate variation are then used for biometric assessment. In this case, the same processor 240 can be used to process raw data from all three sensors 210 to 230. This centralized processing can reduce the size and complexity of the device 200.
[0041] The accelerometer 230 in the system is configured to determine movement information for the user, such as acceleration information. In some embodiments, the measured acceleration can be two-dimensional (2D) acceleration. For example, the 2D acceleration can include acceleration in horizontal directions (e.g., directions parallel to the surface of the ground). In some embodiments, the measured acceleration can be three-dimensional (3D), which includes acceleration in two horizontal directions and in the vertical direction (e.g., perpendicular to the ground).
[0042] In some embodiments, the accelerometer 230 includes a piezoelectric accelerometer using piezoelectric materials to measure acceleration. A piezoelectric material can be defined as a material that develops a distributed electric charge when pressed or subjected to a force. Therefore, piezoelectric materials can transform mechanical work input into electrical output and vice versa. In some embodiments, the piezoelectric accelerometer can include a disk-like base of piezoelectric material connected to a proof mass. The base can be secured to the moving body and electrodes are connected on either side of the disk. When the body accelerates, the proof mass exerts a force on the piezoelectric disk and a charge builds up across the electrodes. Acceleration can be derived by measuring the charge.
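The charge-to-acceleration relationship for the piezoelectric pickup described above can be illustrated with a short calculation. The sensitivity and proof-mass values below are hypothetical figures chosen for illustration, not values from the disclosure.

```python
# Illustrative conversion for a piezoelectric accelerometer: the measured
# charge q relates to the applied force via q = S_q * F, so a = F / m.
# Both constants are assumed values, not part of the disclosure.
S_Q = 2.0e-12       # charge sensitivity, coulombs per newton (assumed)
PROOF_MASS = 0.005  # proof mass, kg (assumed)

def acceleration_from_charge(charge_c):
    """Derive acceleration (m/s^2) from the charge across the electrodes."""
    force_n = charge_c / S_Q   # invert q = S_q * F
    return force_n / PROOF_MASS

print(acceleration_from_charge(1.0e-13))  # → 10.0 m/s^2, roughly 1 g
```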
[0043] In some embodiments, the accelerometer 230 can include a piezoresistive accelerometer, which can act as both AC- and DC-response sensors. Piezoresistive materials have the property of changing their resistance under physical pressure or mechanical work. As a result, if a piezoresistive material is strained or deflected, its internal resistance can change and can stay changed until the material's original position is restored. This change of resistance therefore can be used to derive the acceleration.
[0044] In some embodiments, the accelerometer 230 includes a Micro-Electro-Mechanical Systems (MEMS) accelerometer that measures the electric charge on a capacitor to detect small movements of a proof mass attached to a spring. The small movements, induced by acceleration, can be used to derive the acceleration of the proof mass.
[0045] In some embodiments, the accelerometer 230 can include an optically-enabled accelerometer (also referred to as an optical accelerometer), where the capacitive pickoffs are replaced by an optical transducer. The optical transducer can measure small displacements of a mechanical proof mass and translate these displacements into acceleration. More specifically, an optical accelerometer can include a proof mass suspended by a pair of tethers from two anchors. Each side of the proof mass also has a respective measuring element to measure the displacement of the proof mass with respect to the anchors.

[0046] In some embodiments, the accelerometer 230 includes a resonant accelerometer (also referred to as a frequency-modulated accelerometer), which measures acceleration based on detection of the resonant frequency of tethers that suspend a proof mass. Acceleration of the proof mass can cause opposing changes in the effective stiffness of the tethers, resulting in equal but opposite shifts in their resonant frequencies. Detection of this opposing shift can be utilized to calculate the acceleration of the proof mass, while any mutual shift of the tethers caused by unwanted orthogonal acceleration or temperature drift is cancelled out.
[0047] In some embodiments, the accelerometer 230 can include a thermal accelerometer, which uses a centrally located resistive heating element to heat the gas molecules and temperature sensors such as thermocouples to measure the temperature difference between the time when there is no acceleration and when acceleration is applied. When subjected to acceleration, the less dense air molecules in the heated gas move in the direction of acceleration and the cool and denser molecules move in the opposite direction, creating a temperature difference. The temperature difference is proportional to acceleration and therefore can be used to derive the acceleration.
[0048] The processor 240 in the device 200 can be configured to process data acquired by the breathing sensor 210, the cardiac sensor 220, and the accelerometer 230. The processors 240 described herein can include one or more suitable processors (e.g., a central processing unit (CPU), an application-specific integrated circuit (ASIC), and/or a field programmable gate array (FPGA)) configured to execute one or more instructions received from, for example, a memory of the device 200 (not shown). In some embodiments, the processor 240 can be or include a Reduced Instruction Set Computing (RISC) processor. The processor(s) 240 can be in communication with a memory and/or a network card. In some embodiments, the processor 240 can be configured to transmit information (e.g., data, instructions and/or network data packets) to and/or receive information from a memory and/or a network card.
[0049] Various example, non-limiting usage scenarios for the device 200 are described herein. In some embodiments, users can use the device 200 to monitor one or more exercises, including but not limited to walking, running, jogging, dancing, biking, stair climbing, tennis, basketball, soccer, and racquetball. By combining heart rate, breathing, and acceleration information, the device 200 can compute one or more ventilatory thresholds, and/or determine one or more metabolic zones/training zones, of the user. The user can learn his metabolic zones/training zones without doing any specific training, as long as he does train in all the zones. In some embodiments, a graph and/or other indication of ventilatory thresholds can be produced for each individual user to help him understand and train in the right zone. In some embodiments, these zones can also change over time, such as due to the user's improved conditioning, so the device 200 can keep track of trends to make sure the user stays on the right track for optimal workouts and improvement.
[0050] In some embodiments, for example, the processor 240 can be configured to estimate economy associated with the activity being undertaken by the user. For example, the processor 240 can be configured to determine running economy based on the breathing signal from the breathing sensor. In some embodiments, the processor 240 is configured to estimate running economy based on the estimated amount of oxygen exchanged by the user per unit time, per unit body weight, or both. The amount of oxygen can be estimated based on volume of air exchange, and an assumption that air contains about 21% oxygen. In some embodiments, the device 200 can include an oxygen sensor (not shown) to provide an exact level of oxygen that the user is exposed to. For example, consider two athletes running at the same speed of 15 km/hour. It is estimated, based on embodiments disclosed herein, that the first athlete has an oxygen uptake (VO2) of 42 mL/kg/min and the second athlete has an oxygen uptake of 54 mL/kg/min. In this case, the first athlete has a good economy while the second has a relatively poorer economy.
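The economy estimate above can be illustrated with a rough calculation. Only the 21% oxygen figure comes from the text; the extraction fraction (the share of inhaled oxygen actually absorbed) and the ventilation figures below are hypothetical parameters added for illustration.

```python
# Hedged sketch: estimating oxygen uptake (VO2, mL/kg/min) from minute
# ventilation, using the text's ~21% oxygen assumption. The extraction
# fraction is an assumed parameter, not specified in the disclosure.
O2_FRACTION = 0.21  # ambient air is about 21% oxygen

def vo2_ml_per_kg_min(ventilation_l_per_min, body_weight_kg,
                      extraction_fraction=0.25):
    """Oxygen uptake per unit time, per unit body weight."""
    o2_inhaled_ml = ventilation_l_per_min * 1000 * O2_FRACTION
    return o2_inhaled_ml * extraction_fraction / body_weight_kg

# Two athletes at the same running speed: the lower VO2 per kg per minute
# indicates the better running economy (hypothetical ventilation values).
a1 = vo2_ml_per_kg_min(56.0, 70.0)   # → 42.0 mL/kg/min
a2 = vo2_ml_per_kg_min(72.0, 70.0)   # → 54.0 mL/kg/min
print(a1, a2)
```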
[0051] In some embodiments, the device 200 can provide biometric assessment for users to monitor their everyday activities. In some embodiments, the device 200 can detect abnormal heart beat using the cardiac sensor 220. In some embodiments, the device 200 can detect abnormal respiratory activities using the breathing sensor 210. In some embodiments, the device 200 can calculate metrics that indicate the fatigue level of the user, thereby alerting the user to take a break from work if needed.

[0052] FIG. 3 shows a schematic of a device 300 with wireless capabilities for biometric assessment. The device 300 includes a breathing sensor 310, a cardiac sensor 320, and (optionally, as indicated by dotted lines) an accelerometer 330. A processor 340 is operably coupled to the sensors 310-330 to process the data acquired by the sensors 310-330. The device 300 further includes a memory 360 to store the acquired data, though in some embodiments, the device 300 includes a database (not shown) to store the acquired data. In some embodiments, the memory 360 stores the raw data acquired by the sensors 310 to 330. In some embodiments, the memory 360 stores the data processed by the processor 340. In some embodiments, the memory 360 stores processor-executable instructions so as to instruct the processor 340 to process the data acquired by the sensors 310 to 330.
[0053] In some embodiments, the memory 360 can include a computer storage product with a non-transitory computer-readable medium (also can be referred to as a non-transitory processor-readable medium) having instructions or computer code thereon for performing various computer-implemented operations. The computer-readable medium (or processor-readable medium) is non-transitory in the sense that it does not include transitory propagating signals per se (e.g., a propagating electromagnetic wave carrying information on a transmission medium such as space or a cable). The media and computer code (also can be referred to as code) may be those designed and constructed for the specific purpose or purposes. Examples of non-transitory computer-readable media include, but are not limited to: magnetic storage media such as hard disks, floppy disks, and magnetic tape; optical storage media such as Compact Disc/Digital Video Discs (CD/DVDs), Compact Disc-Read Only Memories (CD-ROMs), and holographic devices; magneto-optical storage media such as optical disks; carrier wave signal processing modules; and hardware devices that are specially configured to store and execute program code, such as Application-Specific Integrated Circuits (ASICs), Programmable Logic Devices (PLDs), Read-Only Memory (ROM) and Random-Access Memory (RAM) devices. Other embodiments described herein relate to a computer program product, which can include, for example, the instructions and/or computer code discussed herein.
[0054] Examples of computer code include, but are not limited to, micro-code or microinstructions, machine instructions, such as produced by a compiler, and/or files containing higher-level instructions that are executed by a computer using an interpreter. For example, embodiments may be implemented using C, Java, C++, MATLAB or other programming languages and/or other development tools.
[0055] The memory 360 can be any memory (e.g., a RAM, a ROM, a hard disk drive, an optical drive, other removable media) configured to store information (e.g., one or more software applications, user account information, media, text, etc.). The memory 360 can include one or more modules performing the functions described herein. In some embodiments, the functions described herein can be performed by any number of modules. For example, in some embodiments, the functions described herein can be performed by a single module.
[0056] The device 300 also includes a wireless communication interface 370 to transmit and/or receive data between the device 300 and any other device or system, such as a smartphone, for example. In some embodiments, the wireless communication interface 370 transmits data (e.g., ventilatory threshold, instructions, and/or metabolic zone information) to a user (e.g., a smartphone associated with the user) and receives data (e.g., feedback and/or other information) from the user. In some embodiments, the wireless communication interface 370 communicates with other systems using radio frequency signals. In some embodiments, the frequencies of the signal can be in the amplitude modulation (AM) region (e.g., about 0.6 MHz, about 0.8 MHz, about 1.0 MHz, about 1.2 MHz, about 1.4 MHz, or about 1.6 MHz, including any values and sub ranges in between). In some embodiments, the frequencies of the signal can be in the frequency modulation (FM) region (e.g., about 88 MHz, about 90 MHz, about 92 MHz, about 94 MHz, about 96 MHz, about 98 MHz, about 100 MHz, about 102 MHz, about 104 MHz, about 106 MHz, or about 108 MHz, including any values and sub ranges in between).
[0057] In some embodiments, the wireless communication interface 370 communicates with other systems using a third Generation (3G) network. In some embodiments, the wireless communication interface 370 communicates with other systems using a Fourth Generation (4G) network. In some embodiments, the wireless communication interface 370 communicates with other systems using an LTE network.

[0058] In some embodiments, the wireless communication interface 370 communicates with other systems using Wi-Fi signals. In some embodiments, the wireless communication interface 370 communicates with other systems using Bluetooth signals.
[0059] The device 300 further includes a power source 350 to power one or more of the sensors 310-330, the processor 340, and the wireless communication interface 370. In some embodiments, the power source 350 can include a battery or a DC power adapter. In some embodiments, the power source 350 can include an AC power supply through a transformer to yield the appropriate voltage and current to operate the device 300. In some embodiments, the power source 350 can include other energy storage devices, including but not limited to, a capacitor, a super-capacitor, a fuel cell, a superconducting magnetic energy storage (SMES), a flywheel energy storage, a hydraulic accumulator, or any other energy storage devices known in the art.
[0060] In some embodiments, the power source 350 can be rechargeable. In some embodiments, the power source 350 can be recharged wirelessly via inductive charging. In some embodiments, the power source 350 can be recharged wirelessly via resonant charging. In some embodiments, the power source 350 can be recharged via infrared charging. In some embodiments, the power source 350 can include solar cells that can be recharged before, during, or after training.
[0061] FIG. 4 shows an example wearable system 400 for biometric assessment, according to embodiments. The system 400 includes a breathing sensor 410 configured to acquire breathing data, a cardiac sensor 420 to monitor heart activities, and an accelerometer 430 to measure acceleration of the user. One or more of the sensors 410-430 can be integrated into a garment 440 of the system 400. For example, the wire of the breathing sensor 410 can be stitched and/or woven into the garment 440, as described in more detail herein. In some embodiments, the garment 440 can be a shirt. In some embodiments, the garment 440 can be a tank top, such as, for example, a running tank, a compression tank, and/or the like. In some embodiments, the garment 440 can be a bra. In some embodiments, the garment 440 can include any other active wear that is appropriate for training.

[0062] In some embodiments, one or more of the sensors 410-430 can be integrated into the fabric or textile of the garment 440. In some embodiments, one or more of the sensors 410-430 can be disposed on the garment 440, such as adhered to the surface of the garment 440 via adhesive means. In some embodiments, one or more of the sensors 410-430 can be disposed on a belt, which in turn is placed on or into the garment 440. In some embodiments, the belt can be removable from/reattachable to the garment 440. In such embodiments, a user can fit the belt including the sensors 410-430 to different garments at different times. In some embodiments, one or more of the sensors 410-430 can be in a removable/reattachable component relative to the garment 440. For example, in some embodiments, the breathing sensor 410 can be woven into the garment 440, and the sensors 420, 430 can be in a box-like component that can be attached to the garment 440, such as via, for example, a clip, a hook and loop strip, and/or the like.
[0063] The sensors 410-430 can additionally or alternatively be in communication with an external device 450. In some embodiments, the external device 450 includes a data storage unit to store data acquired by the sensors 410-430, such as a memory and/or a database. In some embodiments, the external device 450 includes a processing unit to process data acquired by the sensors 410-430.
[0064] In some embodiments, the external device 450 can include a smartphone. In some instances, the three sensors 410-430 can transmit the acquired data to the external device 450 for biometric assessment continuously, periodically, or on demand (e.g., in response to a request from the external device 450). In some embodiments, the external device 450 can also control the operation of the sensors 410-430. For example, the smartphone can control the breathing sensor 410 to increase the data acquisition rate (also referred to as sampling rate) when the user's breathing rate increases (e.g., during high intensity training). In some embodiments, the smartphone can display the biometric assessment for the user so as to allow convenient monitoring of the user's activities.
[0065] In some embodiments, the system 400 can include a device 460 that is configured to process and digitize the data (including breathing data, heart rate data, and acceleration data) from the sensors 410-430, and/or calculate one or more thresholds and/or metrics based on the sensor data. The device 460 can transmit the sensor data, the calculated thresholds, and/or the calculated metrics to the external device 450, such as a smartphone, that in turn transmits the data to a backend server (not shown). In some embodiments, the device 460 can include one or more of the sensors 410-430. In some embodiments, the device 460 is removable from/reattachable to the garment 440. In some embodiments, the garment 440 includes one or more conductive traces formed therein such that attachment of the device 460 to any portion of the conductive trace(s) permits communicative coupling of the device 460 to the sensors 410-430.
[0066] In some embodiments, the calculation of thresholds (e.g., AT and VT) and/or one or more metrics can be carried out by the device 460. In some embodiments, the calculation of thresholds (e.g., AT and VT) and/or one or more metrics can be carried out by the external device 450. In some embodiments, the calculation of thresholds (e.g., AT and VT) and/or one or more metrics can be carried out by the backend server. In some embodiments, the device 460 includes at least a processor and a memory.
[0067] In some embodiments, the VT1 and VT2 can be assessed for a user by plotting HR as a function of breathing ventilation and identifying the inflection points HR1 and HR2. This can be performed offline after filtering the data of the user (e.g., filtering the data that is taken when the user is talking or at rest) and aggregating the data from multiple exercise sessions. Once the heart rates HR1 and HR2 matching the VT1 and VT2 (or VT and AT) thresholds are identified, the amount of time the user spends or has spent in each zone (e.g., endurance zone, race zone, or sprint zone) can be assessed based on the HR during an exercise session.
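The offline inflection-point search described above can be sketched as follows. The piecewise-linear model and brute-force breakpoint search are illustrative choices, not the claimed method, and the synthetic session data is invented for the example.

```python
# Hedged sketch: fit a three-segment piecewise-linear model of HR versus
# ventilation and take the two breakpoints as HR1/HR2 candidates. The
# segment-wise least squares and exhaustive search are assumptions.
import numpy as np

def find_inflections(ventilation, hr, min_seg=4):
    v, h = np.asarray(ventilation, float), np.asarray(hr, float)
    def sse(i, j):  # residual of a line fit on points i..j-1
        if j - i < 2:
            return 0.0
        coef = np.polyfit(v[i:j], h[i:j], 1)
        return float(np.sum((np.polyval(coef, v[i:j]) - h[i:j]) ** 2))
    best, n = None, len(v)
    for b1 in range(min_seg, n - 2 * min_seg):
        for b2 in range(b1 + min_seg, n - min_seg):
            err = sse(0, b1) + sse(b1, b2) + sse(b2, n)
            if best is None or err < best[0]:
                best = (err, b1, b2)
    _, b1, b2 = best
    return h[b1], h[b2]  # candidate HR1, HR2

# Synthetic session: HR-vs-ventilation slope changes at VE=40 and VE=80
ve = np.arange(10, 121, 5.0)
hr = np.where(ve < 40, 80 + 1.5 * ve,
      np.where(ve < 80, 140 + 0.8 * (ve - 40), 172 + 0.3 * (ve - 80)))
hr1, hr2 = find_inflections(ve, hr)
print(hr1, hr2)  # → 140.0 172.0
```

On real data, the filtering and multi-session aggregation described in the text would be applied before such a fit.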
[0068] In some embodiments, the user is presented in real time with a plot of HR versus time. In some embodiments, the user is presented in real time with a plot of the metabolic zone versus time. These plots can make use of the personalized HR1 and HR2 thresholds (e.g., for a single exercise session). In some embodiments, the HR can be averaged per zone. In some embodiments, the evolution of these metrics over time (e.g., for multiple exercise sessions) can also be assessed.
[0069] FIG. 5 shows a graph 500 as an example visual illustration of ventilatory thresholds and metabolic zones that can be displayed on the external device 450 in the system 400. The graph 500 includes a biometric curve 540 showing a metric (M) as a function of breathing. The metric is calculated taking into account the breathing data, the heart rate data, and the acceleration data acquired by the sensors 410-430 in the system 400. The biometric curve 540 has two threshold points VT and AT that divide the graph into three metabolic zones: the endurance zone 510, the race zone 520, and the sprint zone 530. The graph 500 also shows a status indicator 545 on the biometric curve 540 to illustrate to the user his current activity status, and provides an indication of which metabolic zone the user is in, and whether he is above/below the VT and/or AT. The status indicator 545 also tells the user how much margin he has in order to stay in the same zone (e.g., how much faster or slower he can run without going beyond the current zone).
[0070] In some embodiments, the biometric curve 540 can be individualized for each user to accurately characterize the metabolic zones of the user. In some embodiments, the biometric curve 540 can be updated periodically taking into account training data (e.g., breathing data, heart rate data, and acceleration data) acquired during a certain time interval, such as during a single exercise session, over a week of exercise sessions, and/or the like. The update can also illustrate the change of the user's athletic level so as to allow the user to adjust his training protocols. In some embodiments, the biometric curve 540 can be updated in real-time. In some embodiments, the biometric curve can be updated after each exercise session. In some embodiments, the biometric curve 540 can be updated every day, every week, every two weeks, every month, and/or the like. In some embodiments, the biometric curve 540 can be updated at any other time interval selected by the user, or updated per the user's request.
[0071] In some embodiments, the graph 500 can be displayed in a real-time manner during training, in which case the user can monitor the training at any moment during the training. In some embodiments, the graph 500 can be displayed after the training. In this manner, for some sports such as swimming or soccer, where it may not be convenient or safe for the user to watch the display during training, the post-training display can be made available.
[0072] FIG. 6 illustrates a method 600 of biometric assessment, according to embodiments. The method 600 includes acquiring and/or obtaining acceleration data at step 610a, acquiring and/or obtaining breathing data at step 620a, and acquiring and/or obtaining heart rate data at step 630a. Steps 610a, 620a, and 630a can be performed in any order. In some embodiments, they can be performed substantially simultaneously.
[0073] In some embodiments, the acquisition of the data, including the breathing data, the heart rate data, and the acceleration data, can be carried out periodically. The acquisition rate (sometimes also referred to as the sampling rate or sampling frequency) can be from about 1 Hz to about 1 kHz or more (e.g., about 1 Hz, about 2 Hz, about 5 Hz, about 10 Hz, about 20 Hz, about 50 Hz, about 100 Hz, about 200 Hz, about 500 Hz, or about 1 kHz or more, including any values and sub ranges in between). In some embodiments, the acquisition of the acceleration data, breathing data, and heart rate data can have substantially the same sampling frequency. In some embodiments, at least one of the acceleration data, breathing data, and heart rate data can be acquired at a different sampling frequency than the others. For example, in some embodiments, acceleration data can be acquired at a higher sampling frequency than the frequency for acquiring the breathing data, considering that the number of steps taken per minute is usually greater than the number of breaths per minute.
[0074] At step 620 of the method 600, the acquired breathing data (B), heart rate data (HR), and acceleration data (A) are used to calculate a metric M, i.e., M = f(B, HR, A). Some non-limiting examples of the metric M include time spent in each metabolic zone (e.g., endurance zone, sprint zone, and race zone) during a training session, cumulative time spent in each metabolic zone per week and/or per month, average speed and/or pace when a user is in a particular metabolic zone, average speed and/or pace when a user is just under (e.g., within 10 beats per minute of) the VT1 threshold, and/or combinations thereof. The metric M is used to estimate a metabolic parameter, such as the metabolic zone of the user's activity, at step 630 of the method 600. The metabolic zones can include the endurance zone, race zone, and sprint zone as described above.
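One of the metrics named above, time spent in each metabolic zone, can be sketched as a simple classification of a heart rate series against the personalized thresholds. The threshold values and session data below are hypothetical.

```python
# Minimal sketch of one metric M: seconds spent in each metabolic zone
# during a session, from a heart rate series and thresholds HR1/HR2.
# Threshold and sample values are assumed, not from the disclosure.
def time_in_zones(hr_series, hr1, hr2, dt_s=1.0):
    """Return seconds spent in the endurance, race, and sprint zones."""
    zones = {"endurance": 0.0, "race": 0.0, "sprint": 0.0}
    for hr in hr_series:
        if hr < hr1:
            zones["endurance"] += dt_s
        elif hr < hr2:
            zones["race"] += dt_s
        else:
            zones["sprint"] += dt_s
    return zones

# One sample per second: 5 min easy, 3 min hard, 2 min sprinting
session = [120] * 300 + [155] * 180 + [180] * 120
print(time_in_zones(session, hr1=140, hr2=170))
# → {'endurance': 300.0, 'race': 180.0, 'sprint': 120.0}
```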
[0075] FIG. 7 illustrates a method 700 of estimating ventilatory thresholds (VTs), according to embodiments. In some embodiments, the estimated VTs can be used to facilitate the estimation of metabolic zones in the method 600.
[0076] At step 710 of the method 700, a user exercises at a low intensity, such that the user is highly likely to be exercising in zone 1 (e.g., an endurance zone), such as based on an initial assessment that accounts for one or more of the user's age, height, weight, a subjective health evaluation, and/or the like. During the exercises in zone 1, breathing data, heart rate data, and optionally acceleration data is acquired for subsequent use. At step 720 of the method 700, the user increases the intensity of exercise, such that the user is exercising in zone 2 (e.g., a race zone). During the exercises in zone 2, breathing data, heart rate data, and optionally acceleration data is also acquired for subsequent use. At step 730 of the method 700, the user again increases the intensity to exercise in zone 3 (e.g., a sprint zone). Breathing data, heart rate data, and optionally acceleration data is then acquired in zone 3 for subsequent use.
[0077] At step 740 of the method 700, the acquired breathing data, heart rate data, and acceleration data in each zone (i.e., zone 1, zone 2, zone 3) is used to calculate one or more VTs that can distinguish at least two of these three different zones. In this manner, VT is calculated as a function of the breathing data, the heart rate data, and optionally the acceleration data, i.e., VT = f(B, HR) or VT = f(B, HR, A).
[0078] Steps 710 to 730 are shown in FIG. 7 for illustrative purposes. In practice, the data acquisition can be performed in more than three steps. For example, the user can gradually increase the exercise intensity from low intensity to high intensity. The number of steps can be greater than 3 (e.g., about 4, about 5, about 8, about 10, about 15, about 20, or greater, including any values and sub ranges in between). At each step, data is taken for subsequent calculation of VTs. In some embodiments, a device (e.g., device 200 in FIG. 2 and/or device 300 in FIG. 3) can be configured to send instructions to the user to exercise at different intensities in order to acquire the breathing data, heart rate data, and acceleration data at each of those intensities.
[0079] FIG. 8 illustrates a method 800 of calculating VTs using independent exercises, according to embodiments. Three sets of exercises are performed at steps 810, 820, and 830, respectively. The exercise at one step is independent of the exercise at another step in the sense that the relative intensity of the two exercises can be arbitrary. For example, the exercise in the first step 810 can be more intense than the exercise in the second step 820. Alternatively, the exercise in the first step 810 can be less intense than the exercise in the second step 820. As long as the user covers all the zones of interest, VTs can be calculated. At each step (810, 820, or 830), breathing data, heart rate data, and optionally acceleration data is acquired. At step 840, the acquired data is used to calculate the VTs. As described herein for FIG. 4, in some embodiments, the evaluation of thresholds and/or any metrics at step 840 can be performed at a backend server, and/or any other remote device, which receives the training/sensor data from various exercise sessions from the user. In some embodiments, the step 840 can further include filtering the data to remove data points that are acquired when the user is at rest or talking. In some embodiments, the step 840 can further include various machine learning models/approaches to calculate and/or update the thresholds such as, but not limited to, supervised learning (e.g., artificial neural networks, support vector machines, linear classifiers, decision trees, Bayesian networks, and/or the like), unsupervised learning (e.g., artificial neural networks, cluster analysis, and/or the like), deep learning (e.g., deep belief networks, deep Boltzmann machines, and/or the like), and/or combinations thereof.
[0080] In the method 800, the independent exercises can be supervised and/or unsupervised and the user can do anything he wants, including simply talking, such as to establish a base heart rate. Talking can be detected and removed from the data used to calculate the thresholds, as discussed in more detail below with reference to FIG. 9. In some embodiments, a device (e.g., device 200 in FIG. 2 and/or device 300 in FIG. 3) can be configured to send instructions to the user to perform each set of exercises. In some embodiments, the accelerometer can also be employed to find areas in the exercise where the user is in a steady state. In some embodiments, the accelerometer can also be employed to find areas in the exercise where there are no major pace or movement variations that can cause large, unnatural variations in heart rate or breathing.
[0081] FIG. 9 illustrates a method 900 of calculating VTs with detection and filtering of noise data in the presence of talking. The method 900 includes three steps of independent exercises 910, 920, and 930. At each step, the user performs exercises and training data (including breathing data, heart rate data, and acceleration data) is taken. The acquired data is then sent, at step 940, for detection and filtering of data points at which the user talks. As described above, talking can introduce artifacts into the measurement of breathing data, thereby reducing the accuracy of the resulting metric or VTs. At step 940, those data points that may be distorted by talking are detected and removed. The filtered data is then used for calculating VTs at step 950. In some embodiments, a device (e.g., device 200 in FIG. 2 and/or device 300 in FIG. 3) can be configured to send instructions to the user to perform each independent exercise.
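The talking-detection step can be sketched with a simple heuristic: speech tends to make breath-to-breath intervals highly irregular, so windows with an unusually large coefficient of variation can be flagged and dropped. This rule and its cutoff are illustrative assumptions only, not the detector claimed in the disclosure.

```python
# Hedged sketch of step 940: drop breathing samples whose surrounding
# window of breath-to-breath intervals is irregular (speech-like). The
# coefficient-of-variation rule and cutoff are assumed choices.
def filter_talking(breath_intervals_s, window=5, cv_cutoff=0.4):
    """Keep intervals whose local window has a coefficient of variation
    (std/mean) below the cutoff, i.e., regular breathing."""
    kept = []
    n = len(breath_intervals_s)
    for i in range(n):
        w = breath_intervals_s[max(0, i - window // 2):i + window // 2 + 1]
        mean = sum(w) / len(w)
        var = sum((x - mean) ** 2 for x in w) / len(w)
        cv = (var ** 0.5) / mean
        if cv < cv_cutoff:
            kept.append(breath_intervals_s[i])
    return kept

steady = [3.0, 3.1, 2.9, 3.0, 3.1]   # regular breathing: kept
speech = [1.0, 4.5, 0.8, 5.0, 1.2]   # irregular, speech-like: dropped
print(len(filter_talking(steady)), len(filter_talking(speech)))  # → 5 0
```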
[0082] FIG. 10 illustrates a method 1000 of calculating an effort index of exercises. The method 1000 includes acquiring acceleration data at step 1010a, acquiring breathing data at step 1010b, and acquiring heart rate data at step 1010c. The acquisition of these three types of data can be performed in any order (e.g., concurrently). The acquisition rate can be from about 1 Hz to about 1 kHz or more (e.g., about 1 Hz, about 2 Hz, about 5 Hz, about 10 Hz, about 20 Hz, about 50 Hz, about 100 Hz, about 200 Hz, about 500 Hz, or about 1 kHz or more, including any values and subranges in between). At step 1020, the acquired data, including the breathing data, the acceleration data, and the heart rate data, is used to calculate an effort index, which can notify the user how much effort he is using during the training. In some embodiments, the training session can include a set of exercises and the effort index can be calculated for each exercise in the set of exercises. Each of these effort indices can be transmitted to the user. In some embodiments, a system (e.g., device 200 in FIG. 2, device 300 in FIG. 3, and/or system 400 in FIG. 4) can be configured to send instructions to the user to perform each exercise during the training session.
[0083] FIG. 11 illustrates a method 1100 of constructing and/or training a model to calculate an effort index using breathing data, heart rate data, and acceleration data. At step 1110 in the method 1100, the user does some exercises. In some embodiments, a system (e.g., device 200 in FIG. 2, device 300 in FIG. 3, and/or system 400 in FIG. 4) can be configured to send instructions to the user to perform each exercise. At step 1120, which can be executed at substantially the same time as step 1110, breathing data, acceleration data, and heart rate data are acquired for the set of exercises performed at step 1110. At step 1130, the user provides a perceived effort index for each exercise. The perceived effort index is the respective effort index that is perceived by the user while performing each exercise. At step 1140, the perceived effort index provided by the user is used to generate and/or train a model along with the breathing data, the heart rate data, and the acceleration data. Said another way, the step 1140 is to construct or fine-tune a mathematical model that can convert subsequently acquired (e.g., during a subsequent training session) breathing data, heart rate data, and optionally acceleration data into an automatically generated effort index for the user.
[0084] In some embodiments, the perceived effort index can include ten levels as described in Table 1 below.
Table 1 : Perceived Effort Indices and the corresponding description of the effort
[0085] In embodiments disclosed herein, using heart rate data (e.g., heart rate variation, or HRV), breathing data (e.g., breathing rate, or BR, and ventilation), and (optionally) acceleration data, the effort of a training session can be accurately modeled with an average score error of less than 1 based on, for example, comparison between a metric computed with the sensor data and metrics associated with perceived effort. The lower the HRV, and the higher the BR and the ventilation, the higher the effort. For a constant HR sustained over a long period, the modeled effort can slowly increase, reflecting the user's actual perception of his effort.
[0086] In some embodiments, various features of the training data (including the heart rate data, breathing data, and acceleration data) can be calculated. Features that most correlate with the perceived effort from the user are then used to build a linear fit model to combine the features to predict effort.
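The correlate-then-fit approach in the preceding paragraph can be sketched as follows. For brevity the fit below is univariate on the single most correlated feature, whereas the text describes combining several features into a linear model; the function and feature names are illustrative only:

```python
def pearson(xs, ys):
    """Pearson correlation coefficient between two equal-length lists."""
    n = len(xs)
    mx, my = sum(xs) / n, sum(ys) / n
    cov = sum((x - mx) * (y - my) for x, y in zip(xs, ys))
    vx = sum((x - mx) ** 2 for x in xs)
    vy = sum((y - my) ** 2 for y in ys)
    return cov / (vx * vy) ** 0.5 if vx and vy else 0.0

def fit_effort_model(features, perceived):
    """features: dict of feature name -> per-session values; perceived:
    per-session perceived effort indices. Keeps the feature most
    correlated with perceived effort and fits effort = a*x + b to it
    by least squares. Returns (feature_name, predictor)."""
    name = max(features, key=lambda f: abs(pearson(features[f], perceived)))
    xs, ys = features[name], perceived
    n = len(xs)
    mx, my = sum(xs) / n, sum(ys) / n
    a = (sum((x - mx) * (y - my) for x, y in zip(xs, ys))
         / sum((x - mx) ** 2 for x in xs))
    b = my - a * mx
    return name, (lambda x, a=a, b=b: a * x + b)
```

A multi-feature version would fit the top-correlated features jointly, but the selection-by-correlation step is the same.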
[0087] In some embodiments, the steps 1110 to 1140 in the method 1100 are repeated for multiple exercise sessions and/or multiple training sessions, each including a set of exercises. For at least some sessions, the user can exercise with a slightly different perceived effort so as to construct a more accurate model. In some embodiments, at roughly the same level of effort, the user can perform multiple sets of exercises. This can train the model to more accurately relate the training data (including the breathing data, the heart rate data, and the acceleration data) to the perceived effort index.
[0088] As an example model, the collected data and perceived effort index information can be used as data points in an n-dimensional space. For example, the breathing data can be plotted along an x1 axis, the heart rate data can be plotted along an x2 axis, the acceleration data can be plotted along an x3 axis, and the perceived effort index can be plotted along a y-axis. For example, an effort level of 6 can be identified within the n-dimensional space as a volume cloud of data ranges for each of the heart rate data, the breathing data, and the acceleration data. During a subsequent exercise session and/or a subsequent training session, if the user's metrics lie within this volume cloud, the user can be predicted to have achieved an effort index of 6. In some embodiments, the user can provide feedback on the predicted effort index, which can be used to modify the volume cloud mapping to that effort index. In this manner, the model can change over time to reflect improvements or setbacks in the user's training.
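One minimal way to realize the "volume cloud" idea is a nearest-neighbour lookup in the (breathing, heart rate, acceleration) space: a new point is assigned the effort index most common among its k closest historical points, and user feedback simply appends a corrected point to the history. This is an illustrative stand-in, not necessarily the clustering the disclosure contemplates:

```python
def predict_effort(history, query, k=5):
    """history: list of ((breathing, heart_rate, acceleration), effort_index)
    tuples from past sessions. Predicts the effort index whose 'volume
    cloud' the query point falls in, approximated here by a majority vote
    over the k nearest neighbours (Euclidean distance)."""
    def dist(p, q):
        return sum((a - b) ** 2 for a, b in zip(p, q)) ** 0.5
    nearest = sorted(history, key=lambda h: dist(h[0], query))[:k]
    votes = {}
    for _, effort in nearest:
        votes[effort] = votes.get(effort, 0) + 1
    return max(votes, key=votes.get)
```

Feedback-driven adaptation is then just `history.append((query, corrected_effort))`, which reshapes the cloud over time as the paragraph describes. In practice the three axes would be normalized first, since heart rate in bpm dominates raw Euclidean distance.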
[0089] The effort index generated using the model constructed or trained by the method 1100 can have numerous applications. In some embodiments, the user can conveniently monitor his effort during training and can adjust his training based on the predicted effort index. In some embodiments, the effort index can be used to more objectively characterize the level of fatigue of the user, as discussed in detail below.
[0090] Fatigue can be used to adjust how much training a user should be doing in a subsequent training session. If he is rested and recuperates fast from his training, then the amount and intensity of training can be increased. Otherwise, the amount and/or the intensity of his next training should be substantially equal to or less than the amount and/or intensity in the previous session.
[0091] Traditionally, heart rate variation (HRV) measured every morning is used to characterize how fatigued a user is. However, this HRV measure is relative. If a user is in an over-trained state, it can be hard to detect fatigue. This is because if the user continues training, his HRV can remain stable and thus the HRV does not change. In addition, HRV alone may not accurately reflect fatigue because fatigue can vary significantly from one user to another. The same HRV measure might be very good for one user (rested) but very bad for another (tired).
[0092] FIG. 12 illustrates a method 1200 of estimating fatigue (e.g., an index from 0 to 1 or any other suitable scale, a "yes"/"no" indication of fatigue, and/or the like) using calculated effort indices, according to embodiments. At step 1210, a first effort index is calculated before the user does any training, as described herein. In some embodiments, the first effort index can be calculated less than 10 hours before training (e.g., less than 10 hours, less than 8 hours, less than 6 hours, less than 4 hours, less than 3 hours, less than 2 hours, less than 1 hour, less than 30 minutes, less than 20 minutes, less than 10 minutes, or less than 5 minutes, including any values and subranges in between).
[0093] At step 1220, the user exercises and a second effort index is calculated. At step 1230, the user finishes the exercises and a third effort index is calculated after the completion of the exercise. In some embodiments, the third effort index can be calculated after the user feels he is rested. In some embodiments, the third effort index can be calculated 1 hour or more after the exercise (e.g., after 1 hour, after 2 hours, after 3 hours, after 5 hours, after 7 hours, or after 10 hours).

[0094] In some embodiments, the measure(s) of "rest" can be defined and/or otherwise characterized prior to the exercise session. The measures of "effort" can be defined and/or otherwise characterized during the exercise session, and the measures of "recovery" can be defined and/or otherwise characterized after the exercise session. Rest, effort, and recovery can all be based on heart rate, breathing, and (optionally) acceleration data, but make use of different metrics and models.
[0095] At step 1240, the three effort indices acquired at steps 1210 to 1230 are used to estimate the fatigue level. For example, the fatigue level can be constructed by combining the rest, effort and recovery scores of recent exercise sessions as well as considering the time spent in the different metabolic zones. Compared to heart rate variation alone as used in conventional methods of estimating fatigue, effort before, during and after training gives more information. During low effort training, there is typically not too much difference between the effort before and after the training. In running, the user's speed can slowly increase over time at low efforts. During high effort training, the user can raise his heart rate near his maximum. Using effort, the method 1200 follows the performance of athletes not only at high intensities (e.g., race pace) but also at low intensity (e.g., recovery runs). In some embodiments, low intensity workouts can be employed to analyze fatigue, as athletes are not racing every week.
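A minimal sketch of step 1240, combining rest, effort, and recovery scores with time spent in each metabolic zone, is shown below. The 0-to-1 score conventions, the weights, and the normalization are assumptions made for illustration and are not values from the disclosure:

```python
def fatigue_level(rest, effort, recovery, zone_minutes,
                  zone_weights=(0.2, 0.6, 1.0)):
    """Combine session scores into a 0-1 fatigue index.

    rest:     0-1, higher = more rested before the session
    effort:   0-1, higher = harder session
    recovery: 0-1, higher = faster recovery after the session
    zone_minutes: minutes spent in each metabolic zone (easy -> hard),
                  weighted so time in harder zones counts more.
    All weights are hypothetical tuning constants."""
    load = sum(w * m for w, m in zip(zone_weights, zone_minutes))
    load_norm = min(load / 60.0, 1.0)          # cap at one 'hard hour'
    raw = (0.3 * (1 - rest) + 0.3 * effort
           + 0.2 * (1 - recovery) + 0.2 * load_norm)
    return max(0.0, min(1.0, raw))
```

A production version would average these per-session indices over the recent sessions mentioned in the text, rather than scoring a single session in isolation.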
[0096] The method 1200 can also include an optional step 1250 to monitor the fatigue level of the user for an extended period of time. In some embodiments, the extended period of time can be about 2 weeks. In some embodiments, the extended period of time can be about 3 weeks. In some embodiments, the extended period of time can be about 4 weeks. In some embodiments, the extended period of time can be about 6 weeks or more. The continuous monitoring of the fatigue level can provide the user with more insight into his training and allow the user to adjust his training protocol if needed.
[0097] In some embodiments, the estimation and/or monitoring of fatigue accounts for the number of "big breaths" that the user has taken in one or more exercise sessions. Generally, the more rested/less fatigued the user is, the fewer the number of big breaths taken by the user.

[0098] FIG. 13 illustrates a method 1300 of providing training information for the user. At step 1310 of the method 1300, fatigue data from previous training/exercise session(s) is retrieved. At step 1320, instructions for the next training session are provided based at least in part on the fatigue data retrieved at step 1310. In some embodiments, the fatigue data can be estimated using the method 1200 illustrated in FIG. 12.
[0099] In some embodiments, the instructions can include a target heart rate for the user. In some embodiments, the instructions can include a target pace (e.g., in running exercises) for the user. In some embodiments, the instructions can include a suggestion of more rest before the next training session (e.g., when the fatigue level from the previous training is higher than a threshold value).
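The mapping from fatigue to instructions at step 1320 could be as simple as the rule-based sketch below; the threshold, base targets, and scaling factor are hypothetical, not values from the disclosure:

```python
def next_session_plan(fatigue, base_target_hr=150,
                      base_pace_min_per_mile=9.0, fatigue_threshold=0.7):
    """Map the latest fatigue index (0-1) to next-session instructions.
    Above the threshold, suggest rest; otherwise scale the assumed
    target heart rate down (and the pace up, i.e. slower) as fatigue
    rises, by up to 30% at maximum fatigue."""
    if fatigue > fatigue_threshold:
        return {"suggestion": "rest", "target_hr": None, "target_pace": None}
    scale = 1.0 - 0.3 * fatigue
    return {"suggestion": "train",
            "target_hr": round(base_target_hr * scale),
            "target_pace": round(base_pace_min_per_mile / scale, 2)}
```

The same structure accommodates richer outputs (zone-specific time budgets, workout type) without changing the fatigue-gated shape of the rule.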
[00100] FIG. 14 illustrates a method 1400 of estimating synchronization (also sometimes referred to as "rhythm") between breathing and steps during training, according to embodiments. In other words, the synchronization/rhythm characterizes how the user's steps (i.e., when the foot hits the ground) are timed and/or in pace with respect to the user's inhalation and exhalation. At step 1410 of the method, acceleration data is acquired (e.g., using accelerometer 230 as shown in FIG. 2). At step 1430, the acceleration data is then used to estimate the step pattern, including the time points at which each of the user's feet hits the ground. For example, the vertical component of acceleration can be analyzed to ascertain minima and maxima consistent with steps so as to estimate the step pattern.
[00101] At step 1420, breathing data is acquired (e.g., using the breathing sensor 210 as shown in FIG. 2). At step 1440, the breathing data is used to estimate the breath pattern (also referred to as the breathing rhythm) including the time points at which the user inhales and exhales. In some embodiments, the breathing rhythm can be calculated by comparing specific steps and breaths. In some embodiments, the breathing rhythm can be calculated by monitoring the average number of steps and breaths in historical information, such as in a predetermined, past time window.
[00102] At step 1450 of the method 1400, the time points of the steps estimated at step 1430 are compared with the time points of the breaths estimated at step 1440 so as to estimate the synchronization between breathing and steps. In some embodiments, the synchronization can be quantified by the percentage of pairs of timing points that are synchronized.
[00103] Typically, a user can take multiple steps per breath. A 100% breathing rhythm can refer to substantially complete synchronization between steps and breathing. For example, each of the user's inhalations and exhalations can occur every three steps. The breathing pattern in this example is 3:3. A breathing rhythm of 40% can be assessed if, 40% of the time, the user's steps and breathing are aligned. In some embodiments, the synchronization can be quantified by the average phase shift (e.g., between 0 and 2π) between the steps and the breath. In some embodiments, the synchronization can be quantified by the average time shift (e.g., in units of seconds) between the steps and the breath.
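The percentage formulation above can be sketched as a simple tolerance match between breath events and footfalls; the tolerance value is a hypothetical tuning parameter, and phase- or time-shift variants would replace the hit test with a phase computation:

```python
def breath_step_sync(step_times, breath_times, tolerance=0.15):
    """Percentage of breath events (inhale/exhale onsets, in seconds)
    that land within `tolerance` seconds of some footfall -- one simple
    reading of the 'percentage of timing points that are synchronized'."""
    if not breath_times:
        return 0.0
    hits = sum(1 for b in breath_times
               if any(abs(b - s) <= tolerance for s in step_times))
    return 100.0 * hits / len(breath_times)
```

Computed per mile or per minute over sliding windows, this yields the distance- or time-resolved synchronization profiles described in the following paragraphs.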
[00104] In some embodiments, the synchronization can be estimated over a predetermined distance. For example, for a 10-mile distance run, the synchronization information can be provided at each mile. In this manner, the user can learn at which point the synchronization becomes better or worse. For instance, the user may learn that he loses synchronization after 3 miles, in which case he may adjust his pace during the first 3 miles so as to keep the synchronization for a longer time.
[00105] In some embodiments, the synchronization can be estimated for a predetermined amount of time. For example, for an endurance training session of 1 hour, the synchronization can provide synchronization information at each minute.
[00106] The method 1400 can also include the optional step of acquiring global positioning system (GPS) data during the training, at step 1460. The GPS data can be used to provide distance and speed information for the user.
[00107] FIG. 15 illustrates a method 1500 of estimating breath/step patterns, according to embodiments. At step 1510 of the method, acceleration data is acquired (e.g., using the accelerometer 230 as shown in FIG. 2). At step 1530, the acceleration data is then used to estimate the step pattern, including the time points at which the user's foot hits the ground. At step 1520, breathing data is acquired (e.g., using the breathing sensor 210 as shown in FIG. 2). At step 1540, the breathing data is used to estimate the breathing pattern of the user, including the time points at which the user inhales and exhales.

[00108] At step 1550, the breath/step pattern can be estimated using the time points of the steps estimated at step 1530 and the time points of the breaths estimated at step 1540. In some embodiments, the breath/step pattern can include information about the number of steps that are taken during each cycle of breath (i.e., one inhale and one exhale). In some embodiments, the breath/step pattern can include information about which leg the user prefers to use to make a stride when the user inhales.
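The "steps per breath cycle" statistic from step 1550 can be computed directly from the two sets of time points; the helper below is an illustrative sketch (it counts footfalls between consecutive inhale onsets, and all names are assumed):

```python
def steps_per_breath_cycle(step_times, inhale_times):
    """Average number of footfalls per breath cycle, where a cycle runs
    from one inhale onset to the next (covering one inhale and one
    exhale). For a 3/2 inhale/exhale pattern this is ~5.0."""
    counts = []
    for start, end in zip(inhale_times, inhale_times[1:]):
        counts.append(sum(1 for s in step_times if start <= s < end))
    return sum(counts) / len(counts) if counts else 0.0
```

Leg preference at inhale could be derived similarly by checking whether each inhale onset falls closer to a left- or right-foot strike, given a footfall stream labeled by side.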
[00109] The method 1500 can also include an optional step 1560 of acquiring GPS data during the training. The GPS data can be used to provide distance and speed information for the user.
[00110] In some embodiments, the user can be presented with the breath/step pattern, along with a suggestion to modify the breath/step pattern based on one or more factors such as, but not limited to, optimization (e.g., recommending that for every 5 steps, the user inhales on the first 3 steps and exhales on the last 2 steps, or in a 3/2 pattern), the zone the runner is in (e.g., recommending that a runner in the anaerobic zone inhale/exhale in a 1/1 pattern), and/or the like.
[00111] While various embodiments have been described above, it should be understood that they have been presented by way of example, and not limitation. Where methods described above indicate certain events occurring in certain order, the ordering of certain events can be modified. Additionally, certain of the events may be performed concurrently in a parallel process when possible, as well as performed sequentially as described above.
[00112] While various inventive embodiments have been described and illustrated herein, those of ordinary skill in the art will readily envision a variety of other means and/or structures for performing the function and/or obtaining the results and/or one or more of the advantages described herein, and each of such variations and/or modifications is deemed to be within the scope of the inventive embodiments described herein. More generally, those skilled in the art will readily appreciate that all parameters, dimensions, materials, and configurations described herein are meant to be exemplary and that the actual parameters, dimensions, materials, and/or configurations will depend upon the specific application or applications for which the inventive teachings is/are used. Those skilled in the art will recognize, or be able to ascertain using no more than routine experimentation, many equivalents to the specific inventive embodiments described herein. It is, therefore, to be understood that the foregoing embodiments are presented by way of example only and that, within the scope of the appended claims and equivalents thereto, inventive embodiments may be practiced otherwise than as specifically described and claimed. Inventive embodiments of the present disclosure are directed to each individual feature, system, article, material, kit, and/or method described herein. In addition, any combination of two or more such features, systems, articles, materials, kits, and/or methods, if such features, systems, articles, materials, kits, and/or methods are not mutually inconsistent, is included within the inventive scope of the present disclosure.
[00113] The above-described embodiments can be implemented in any of numerous ways. For example, embodiments of designing and making the technology disclosed herein may be implemented using hardware, software or a combination thereof. When implemented in software, the software code can be executed on any suitable processor or collection of processors, whether provided in a single computer or distributed among multiple computers.
[00114] Further, it should be appreciated that a computer may be embodied in any of a number of forms, such as a rack-mounted computer, a desktop computer, a laptop computer, or a tablet computer. Additionally, a computer may be embedded in a device not generally regarded as a computer but with suitable processing capabilities, including a Personal Digital Assistant (PDA), a smart phone or any other suitable portable or fixed electronic device.
[00115] Also, a computer may have one or more input and output devices. These devices can be used, among other things, to present a user interface. Examples of output devices that can be used to provide a user interface include printers or display screens for visual presentation of output and speakers or other sound generating devices for audible presentation of output. Examples of input devices that can be used for a user interface include keyboards and pointing devices, such as mice, touch pads, and digitizing tablets. As another example, a computer may receive input information through speech recognition or in other audible format.

[00116] Such computers may be interconnected by one or more networks in any suitable form, including a local area network or a wide area network, such as an enterprise network, an intelligent network (IN), or the Internet. Such networks may be based on any suitable technology and may operate according to any suitable protocol and may include wireless networks, wired networks or fiber optic networks.
[00117] The various methods or processes outlined herein may be coded as software that is executable on one or more processors that employ any one of a variety of operating systems or platforms. Additionally, such software may be written using any of a number of suitable programming languages and/or programming or scripting tools, and also may be compiled as executable machine language code or intermediate code that is executed on a framework or virtual machine.
[00118] In this respect, various inventive concepts may be embodied as a computer readable storage medium (or multiple computer readable storage media) (e.g., a computer memory, one or more floppy discs, compact discs, optical discs, magnetic tapes, flash memories, circuit configurations in Field Programmable Gate Arrays or other semiconductor devices, or other non-transitory medium or tangible computer storage medium) encoded with one or more programs that, when executed on one or more computers or other processors, perform methods that implement the various embodiments of the invention discussed above. The computer readable medium or media can be transportable, such that the program or programs stored thereon can be loaded onto one or more different computers or other processors to implement various aspects of the present invention as discussed above.
[00119] The terms "program" or "software" are used herein in a generic sense to refer to any type of computer code or set of computer-executable instructions that can be employed to program a computer or other processor to implement various aspects of embodiments as discussed above. Additionally, it should be appreciated that according to one aspect, one or more computer programs that when executed perform methods of the present invention need not reside on a single computer or processor, but may be distributed in a modular fashion amongst a number of different computers or processors to implement various aspects of the present invention.

[00120] Computer-executable instructions may be in many forms, such as program modules, executed by one or more computers or other devices. Generally, program modules include routines, programs, objects, components, data structures, etc. that perform particular tasks or implement particular abstract data types. Typically the functionality of the program modules may be combined or distributed as desired in various embodiments.
[00121] Also, data structures may be stored in computer-readable media in any suitable form. For simplicity of illustration, data structures may be shown to have fields that are related through location in the data structure. Such relationships may likewise be achieved by assigning storage for the fields with locations in a computer-readable medium that convey relationship between the fields. However, any suitable mechanism may be used to establish a relationship between information in fields of a data structure, including through the use of pointers, tags or other mechanisms that establish relationship between data elements.
[00122] Also, various inventive concepts may be embodied as one or more methods, of which an example has been provided. The acts performed as part of the method may be ordered in any suitable way. Accordingly, embodiments may be constructed in which acts are performed in an order different than illustrated, which may include performing some acts simultaneously, even though shown as sequential acts in illustrative embodiments.
[00123] All definitions, as defined and used herein, should be understood to control over dictionary definitions, definitions in documents incorporated by reference, and/or ordinary meanings of the defined terms.
[00124] The indefinite articles "a" and "an," as used herein in the specification and in the claims, unless clearly indicated to the contrary, should be understood to mean "at least one."
[00125] The phrase "and/or," as used herein in the specification and in the claims, should be understood to mean "either or both" of the elements so conjoined, i.e., elements that are conjunctively present in some cases and disjunctively present in other cases. Multiple elements listed with "and/or" should be construed in the same fashion, i.e., "one or more" of the elements so conjoined. Other elements may optionally be present other than the elements specifically identified by the "and/or" clause, whether related or unrelated to those elements specifically identified. Thus, as a non-limiting example, a reference to "A and/or B", when used in conjunction with open-ended language such as "comprising" can refer, in one embodiment, to A only (optionally including elements other than B); in another embodiment, to B only (optionally including elements other than A); in yet another embodiment, to both A and B (optionally including other elements); etc.
[00126] As used herein in the specification and in the claims, "or" should be understood to have the same meaning as "and/or" as defined above. For example, when separating items in a list, "or" or "and/or" shall be interpreted as being inclusive, i.e., the inclusion of at least one, but also including more than one, of a number or list of elements, and, optionally, additional unlisted items. Only terms clearly indicated to the contrary, such as "only one of" or "exactly one of," or, when used in the claims, "consisting of," will refer to the inclusion of exactly one element of a number or list of elements. In general, the term "or" as used herein shall only be interpreted as indicating exclusive alternatives (i.e., "one or the other but not both") when preceded by terms of exclusivity, such as "either," "one of," "only one of," or "exactly one of." "Consisting essentially of," when used in the claims, shall have its ordinary meaning as used in the field of patent law.
[00127] As used herein in the specification and in the claims, the phrase "at least one," in reference to a list of one or more elements, should be understood to mean at least one element selected from any one or more of the elements in the list of elements, but not necessarily including at least one of each and every element specifically listed within the list of elements and not excluding any combinations of elements in the list of elements. This definition also allows that elements may optionally be present other than the elements specifically identified within the list of elements to which the phrase "at least one" refers, whether related or unrelated to those elements specifically identified. Thus, as a non-limiting example, "at least one of A and B" (or, equivalently, "at least one of A or B," or, equivalently "at least one of A and/or B") can refer, in one embodiment, to at least one, optionally including more than one, A, with no B present (and optionally including elements other than B); in another embodiment, to at least one, optionally including more than one, B, with no A present (and optionally including elements other than A); in yet another embodiment, to at least one, optionally including more than one, A, and at least one, optionally including more than one, B (and optionally including other elements); etc.
[00128] In the claims, as well as in the specification above, all transitional phrases such as "comprising," "including," "carrying," "having," "containing," "involving," "holding," "composed of," and the like are to be understood to be open-ended, i.e., to mean including but not limited to. Only the transitional phrases "consisting of" and "consisting essentially of" shall be closed or semi-closed transitional phrases, respectively, as set forth in the United States Patent Office Manual of Patent Examining Procedures, Section 2111.03.

Claims

1. A device, comprising:
a breathing sensor configured to obtain breathing data of a user during use;
a cardiac sensor configured to obtain heart rate data of the user during use; and
a processor communicatively coupled to the breathing sensor and the cardiac sensor, the processor configured to:
calculate a biological metric based at least in part on the breathing data and the heart rate data; and
estimate a metabolic parameter of the user based on the biological metric.
2. The device of claim 1, the device further comprising:
a motion sensor communicatively coupled to the processor, the motion sensor configured to obtain acceleration data of the user during use,
wherein the processor is configured to calculate the biological metric based at least in part on the breathing data, the heart rate data, and the acceleration data.
3. The device of claim 1 or 2, wherein the breathing sensor uses Respiratory
Inductance Plethysmography (RIP) to obtain the breathing data.
4. The device of claim 1 or 2, wherein the breathing sensor obtains the breathing data at a rate up to about 25 times/second.
5. The device of claim 1 or 2, wherein the cardiac sensor includes an
electrocardiogram sensor configured to monitor electrical activity of the user's heart.
6. The device of claim 1 or 2, wherein the cardiac sensor includes an optical sensor configured to obtain the heart rate data.
7. The device of claim 6, wherein the cardiac sensor includes a second processor to process optical signals to obtain the heart rate data.
8. The device of claim 2, wherein the motion sensor is at least one of a piezoelectric accelerometer, a piezoresistive accelerometer, a micro-electro-mechanical system accelerometer, an optically-enabled accelerometer, a resonant accelerometer, or a thermal accelerometer.
9. The device of claim 2, wherein the breathing sensor, the cardiac sensor, and the motion sensor obtain the breathing data, the heart rate data, and the acceleration data periodically.
10. The device of claim 9, wherein the breathing data, the heart rate data, and the acceleration data are obtained at a sampling frequency of from about 1 Hz to about 1 kHz.
11. The device of claim 1 or 2, further comprising:
a communications interface communicatively coupled to the processor, the communications interface configured to:
transmit the biological metric or the metabolic parameter to the user.
12. The device of claim 11, wherein the metabolic parameter is an indication of a metabolic zone of a set of metabolic zones for the user.
13. The device of claim 12, wherein the communications interface is further configured to:
transmit, to the user, a first instruction to perform a physical exercise at a first intensity;
transmit, to the user, a second instruction to perform the physical exercise at a second intensity; and
transmit, to the user, a third instruction to perform the physical exercise at a third intensity,
wherein the breathing sensor is further configured to:
obtain first breathing data of the user when the user performs the physical exercise at the first intensity, the user deemed to be in a first metabolic zone of the set of metabolic zones at the first intensity;
obtain second breathing data of the user when the user performs the physical exercise at the second intensity, the user deemed to be in a second metabolic zone of the set of metabolic zones at the second intensity; and
obtain third breathing data of the user when the user performs the physical exercise at the third intensity, the user deemed to be in a third metabolic zone of the set of metabolic zones at the third intensity, and
wherein the cardiac sensor is further configured to:
obtain first heart rate data of the user when the user performs the physical exercise at the first intensity;
obtain second heart rate data of the user when the user performs the physical exercise at the second intensity; and
obtain third heart rate data of the user when the user performs the physical exercise at the third intensity.
14. The device of claim 13, wherein the motion sensor is further configured to:
obtain first acceleration data of the user when the user performs the physical exercise at the first intensity;
obtain second acceleration data of the user when the user performs the physical exercise at the second intensity; and
obtain third acceleration data of the user when the user performs the physical exercise at the third intensity.
15. The device of claim 13 or 14, wherein the processor is further configured to:
estimate at least one ventilatory threshold based on a first data, a second data, and a third data, the ventilatory threshold distinguishing at least two of the first metabolic zone, the second metabolic zone, and the third metabolic zone,
wherein:
the first data includes at least the first breathing data and the first heart rate data,
the second data includes at least the second breathing data and the second heart rate data, and
the third data includes at least the third breathing data and the third heart rate data.
16. The device of claim 14, wherein the processor is further configured to:
estimate at least one ventilatory threshold based on a first data, a second data, and a third data, the ventilatory threshold distinguishing at least two of the first metabolic zone, the second metabolic zone, and the third metabolic zone,
wherein:
the first data includes the first breathing data, the first heart rate data, and the first acceleration data,
the second data includes the second breathing data, the second heart rate data, and the second acceleration data, and
the third data includes the third breathing data, the third heart rate data, and the third acceleration data.
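Claims 15 and 16 leave the estimation procedure itself open. One common approach, offered here purely as an illustrative sketch and not as the claimed method, treats a ventilatory threshold as the breakpoint where ventilation departs from a linear relationship with heart rate: a two-segment least-squares search over paired (heart rate, ventilation) samples pooled from the three intensities. All function names are hypothetical.

```python
def fit_line(xs, ys):
    """Ordinary least-squares fit; returns (intercept, slope, SSE)."""
    n = len(xs)
    mx, my = sum(xs) / n, sum(ys) / n
    sxx = sum((x - mx) ** 2 for x in xs)
    sxy = sum((x - mx) * (y - my) for x, y in zip(xs, ys))
    b = sxy / sxx
    a = my - b * mx
    sse = sum((y - (a + b * x)) ** 2 for x, y in zip(xs, ys))
    return a, b, sse

def estimate_ventilatory_threshold(heart_rate, ventilation):
    """Heart rate at which ventilation departs from linearity: the split
    point minimizing the summed SSE of two separate line fits."""
    pts = sorted(zip(heart_rate, ventilation))
    xs = [p[0] for p in pts]
    ys = [p[1] for p in pts]
    best_err, best_hr = None, None
    for k in range(2, len(pts) - 1):  # keep at least two points per segment
        _, _, e1 = fit_line(xs[:k], ys[:k])
        _, _, e2 = fit_line(xs[k:], ys[k:])
        if best_err is None or e1 + e2 < best_err:
            best_err, best_hr = e1 + e2, xs[k]
    return best_hr
```

On synthetic data whose ventilation rises more steeply above a known heart rate, the search recovers that breakpoint; real breathing and heart rate data would first need smoothing and noise filtering (see claim 21).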
17. The device of claim 12, wherein the communications interface is configured to:
transmit, to the user, a first instruction to perform a first physical exercise;
transmit, to the user, a second instruction to perform a second physical exercise; and
transmit, to the user, a third instruction to perform a third physical exercise,
wherein the breathing sensor is further configured to:
obtain a first breathing data of the user when the user performs the first physical exercise, the user deemed to be in a first metabolic zone of the set of metabolic zones while performing the first physical exercise;
obtain a second breathing data of the user when the user performs the second physical exercise, the user deemed to be in a second metabolic zone of the set of metabolic zones while performing the second physical exercise; and
obtain a third breathing data of the user when the user performs the third physical exercise, the user deemed to be in a third metabolic zone of the set of metabolic zones while performing the third physical exercise; and
wherein the cardiac sensor is further configured to:
obtain a first heart rate data of the user when the user performs the first physical exercise;
obtain a second heart rate data of the user when the user performs the second physical exercise; and
obtain a third heart rate data of the user when the user performs the third physical exercise.
18. The device of claim 17, wherein the motion sensor is further configured to:
obtain a first acceleration data of the user when the user performs the first physical exercise;
obtain a second acceleration data of the user when the user performs the second physical exercise; and
obtain a third acceleration data of the user when the user performs the third physical exercise.
19. The device of claim 17 or 18, wherein the processor is further configured to:
estimate at least one ventilatory threshold based on a first data, a second data, and a third data, the ventilatory threshold distinguishing at least two of the first metabolic zone, the second metabolic zone, and the third metabolic zone,
wherein:
the first data includes at least the first breathing data and the first heart rate data,
the second data includes at least the second breathing data and the second heart rate data, and
the third data includes at least the third breathing data and the third heart rate data.
20. The device of claim 18, wherein the processor is further configured to:
estimate at least one ventilatory threshold based on a first data, a second data, and a third data, the ventilatory threshold distinguishing at least two of the first metabolic zone, the second metabolic zone, and the third metabolic zone,
wherein:
the first data includes the first breathing data, the first heart rate data, and the first acceleration data,
the second data includes the second breathing data, the second heart rate data, and the second acceleration data, and
the third data includes the third breathing data, the third heart rate data, and the third acceleration data.
21. The device of claim 19 or 20, wherein the processor is further configured to:
detect noise data from the first data, the second data, and the third data; and
filter the noise data to calculate the at least one ventilatory threshold.
22. The device of claim 19 or 20, wherein the biological metric includes at least one of:
a time spent by the user in the first metabolic zone, the second metabolic zone, or the third metabolic zone,
an average speed of the user in the first metabolic zone, the second metabolic zone, or the third metabolic zone, and
an average speed of the user when the user is under one of the at least one ventilatory threshold.
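The metrics of claim 22 follow directly once the thresholds are known. The helper below is a hypothetical sketch, not the claimed implementation; it assumes 1 Hz samples of heart rate and speed and two ventilatory-threshold heart rates separating the three zones.

```python
def zone_metrics(samples, vt1, vt2):
    """samples: iterable of (heart_rate_bpm, speed_m_per_s) taken at 1 Hz.
    vt1, vt2: heart rates at the two ventilatory thresholds (vt1 < vt2).
    Returns seconds spent in each zone and average speed below vt1."""
    time_in = {"zone1": 0, "zone2": 0, "zone3": 0}
    speeds_below_vt1 = []
    for hr, speed in samples:
        if hr < vt1:
            time_in["zone1"] += 1          # one sample == one second at 1 Hz
            speeds_below_vt1.append(speed)
        elif hr < vt2:
            time_in["zone2"] += 1
        else:
            time_in["zone3"] += 1
    avg_speed = (sum(speeds_below_vt1) / len(speeds_below_vt1)
                 if speeds_below_vt1 else 0.0)
    return time_in, avg_speed
```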
23. A method, comprising:
sensing, via a breathing sensor of a device associated with a user, breath of the user to generate breathing data of the user;
sensing, via a cardiac sensor of the device, a heart beat of the user to generate heart rate data of the user;
storing, in a memory of the device, the breathing data and the heart rate data;
calculating, via a processor of the device, a biological metric based at least in part on the breathing data and the heart rate data; and
estimating, via the processor, a metabolic parameter of the user based on the biological metric.
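Claim 23 leaves both the biological metric and the metabolic parameter unspecified. Purely as an illustration of one composite derivable from the two sensed signals, the sketch below computes a heart-beats-per-breath ratio from breath-onset timestamps and a heart rate; the function names and the choice of metric are assumptions, not the claimed method.

```python
def breathing_rate_bpm(breath_onsets):
    """Breaths per minute from successive breath-onset timestamps (seconds)."""
    if len(breath_onsets) < 2:
        return 0.0
    span = breath_onsets[-1] - breath_onsets[0]
    return 60.0 * (len(breath_onsets) - 1) / span

def beats_per_breath(breath_onsets, heart_rate_bpm):
    """Composite metric: heart beats per breath cycle."""
    br = breathing_rate_bpm(breath_onsets)
    return heart_rate_bpm / br if br else 0.0
```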
24. The method of claim 23, further comprising:
sensing, via a motion sensor of the device, motion of the user to generate acceleration data of the user; and
calculating, via the processor, the biological metric based on the breathing data, the heart rate data, and the acceleration data.
25. The method of claim 23 or 24, wherein sensing the breathing data includes using a Respiratory Inductance Plethysmography (RIP) sensor to sense the breathing of the user and to generate the breathing data.
26. The method of claim 23 or 24, wherein sensing the heart rate data includes using an electrocardiogram sensor to sense electrical activity of the user's heart, the heart rate data including the electrical activity.
27. The method of claim 23 or 24, wherein the breathing of the user is sensed via the breathing sensor at a rate up to about 25 times/second.
28. The method of claim 23 or 24, wherein sensing the heart rate data includes using an optical sensor to sense the heart beat of the user and to generate the heart rate data.
29. The method of claim 23 or 24, further including:
transmitting, via a communications interface, the biological metric or the metabolic parameter to the user.
30. The method of claim 29, wherein the metabolic parameter is an indication of a metabolic zone of a set of metabolic zones for the user.
31. The method of claim 30, the method further comprising:
transmitting, via the communications interface, a first instruction to the user to perform a physical exercise at a first intensity;
obtaining a first data of the user when the user performs the physical exercise at the first intensity, the user deemed to be in a first metabolic zone of the set of metabolic zones at the first intensity;
transmitting, via the communications interface, a second instruction to the user to perform a physical exercise at a second intensity;
obtaining a second data of the user when the user performs the physical exercise at the second intensity, the user deemed to be in a second metabolic zone of the set of metabolic zones at the second intensity;
transmitting, via the communications interface, a third instruction to the user to perform a physical exercise at a third intensity;
obtaining a third data of the user when the user performs the physical exercise at the third intensity, the user deemed to be in a third metabolic zone of the set of metabolic zones at the third intensity; and
calculating, via the processor, at least one ventilatory threshold based on the first data, the second data, and the third data, the at least one ventilatory threshold
distinguishing at least two of the first metabolic zone, the second metabolic zone, and the third metabolic zone.
32. The method of claim 31, wherein:
the first data includes at least a first breathing data and a first heart rate data,
the second data includes at least a second breathing data and a second heart rate data, and
the third data includes at least a third breathing data and a third heart rate data.
33. The method of claim 31, wherein:
the first data includes a first breathing data, a first heart rate data, and a first acceleration data,
the second data includes a second breathing data, a second heart rate data, and a second acceleration data, and
the third data includes a third breathing data, a third heart rate data, and a third acceleration data.
34. The method of claim 30, further comprising:
transmitting, via the communications interface, a first instruction to the user to perform a first physical exercise;
obtaining a first data of the user when the user performs the first physical exercise, the user deemed to be in a first metabolic zone of the set of metabolic zones while performing the first physical exercise;
transmitting, via the communications interface, a second instruction to the user to perform a second physical exercise;
obtaining a second data of the user when the user performs the second physical exercise, the user deemed to be in a second metabolic zone of the set of metabolic zones while performing the second physical exercise;
transmitting, via the communications interface, a third instruction to the user to perform a third physical exercise;
obtaining a third data of the user when the user performs the third physical exercise, the user deemed to be in a third metabolic zone of the set of metabolic zones while performing the third physical exercise; and
calculating at least one ventilatory threshold based on the first data, the second data, and the third data, the at least one ventilatory threshold distinguishing at least two of the first metabolic zone, the second metabolic zone, and the third metabolic zone.
35. The method of claim 34, wherein:
the first data includes at least a first breathing data and a first heart rate data,
the second data includes at least a second breathing data and a second heart rate data, and
the third data includes at least a third breathing data and a third heart rate data.
36. The method of claim 34, wherein:
the first data includes a first breathing data, a first heart rate data, and a first acceleration data,
the second data includes a second breathing data, a second heart rate data, and a second acceleration data,
the third data includes a third breathing data, a third heart rate data, and a third acceleration data.
37. The method of claim 35 or 36, further including:
detecting noise data from the first data, the second data, and the third data; and
filtering the noise data to calculate the at least one ventilatory threshold.
38. The method of claim 37, wherein the noise data includes a plurality of data points, each data point indicating data being obtained when the user talks while performing the first physical exercise, the second physical exercise, or the third physical exercise.
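Claim 38 identifies talking during exercise as a noise source. One crude proxy, offered only as an illustration: speech disrupts the regular breathing rhythm of steady exercise, so breath-to-breath intervals that deviate sharply from a running median can be flagged and dropped before the threshold calculation. The tolerance `tol` and the three-interval window are arbitrary choices for the sketch.

```python
import statistics

def flag_talking_noise(breath_intervals, tol=0.4):
    """Flag breath-to-breath intervals (seconds) deviating from the median
    of the preceding three intervals by more than tol * median."""
    flags = []
    for i, interval in enumerate(breath_intervals):
        window = breath_intervals[max(0, i - 3):i] or [interval]
        med = statistics.median(window)
        flags.append(abs(interval - med) > tol * med)
    return flags

def drop_flagged(values, flags):
    """Remove the samples marked as noise before threshold calculation."""
    return [v for v, noisy in zip(values, flags) if not noisy]
```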
39. The method of claim 23 or 24, further comprising:
transmitting to the user at least one of the metabolic parameter and the biological metric of the user in a periodic manner.
40. A method, comprising:
transmitting, via a communications interface, a first instruction to the user to perform an exercise from a set of exercises during a first training session;
obtaining, via a breathing sensor, a first breathing data of the user when the user performs the exercise during the first training session;
obtaining, via a cardiac sensor, a first heart rate data of the user when the user performs the exercise during the first training session;
obtaining, via the communications interface, a first effort index from the user, the first effort index being an effort index perceived by the user while performing the exercise during the first training session; and
generating, via a processor, a model to determine a second effort index of the user, the model being generated based at least in part on the first breathing data, the first heart rate data, and the first effort index.
41. The method of claim 40, further comprising:
obtaining, via the communications interface, a first information from the user, the first information indicating that the user is performing the exercise during a second training session;
obtaining, via the breathing sensor, a second breathing data of the user when the user performs the exercise during the second training session;
obtaining, via the cardiac sensor, a second heart rate data of the user when the user performs the exercise during the second training session;
determining, via the processor, the second effort index of the user based on the second breathing data, the second heart rate data, and the generated model; and
transmitting, via the communications interface, the second effort index to the user.
42. The method of claim 41, further comprising:
obtaining, via the communications interface, a feedback on the second effort index from the user; and
updating, via the processor, the model based on the feedback.
43. The method of claim 41, further comprising:
estimating, via the processor, a level of fatigue of the user based on the second effort index; and
transmitting, via the communications interface, the level of fatigue to the user.
44. The method of claim 43, wherein estimating the level of fatigue includes:
determining a third effort index before the user performs the exercise during the second training session;
determining the second effort index when the user performs the exercise during the second training session;
determining a fourth effort index after the user performs the exercise during the second training session; and
estimating the level of fatigue based on the second effort index, the third effort index, and the fourth effort index.
45. The method of claim 44, further comprising:
transmitting to the user an instruction to perform the exercise during a third training session based at least in part on the estimated level of fatigue.
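Claims 40–45 describe a model relating breathing, heart rate, and perceived effort without fixing its form. As one plausible instantiation (an assumption, not the claimed model), a linear model over breathing rate and heart rate can be fitted to the user's reported effort indices by least squares via the normal equations, then used to predict a second effort index from later sensor data.

```python
def solve3(A, b):
    """Solve a 3x3 linear system by Gaussian elimination with pivoting."""
    M = [row[:] + [bi] for row, bi in zip(A, b)]
    for i in range(3):
        p = max(range(i, 3), key=lambda r: abs(M[r][i]))
        M[i], M[p] = M[p], M[i]
        for r in range(i + 1, 3):
            f = M[r][i] / M[i][i]
            for c in range(i, 4):
                M[r][c] -= f * M[i][c]
    x = [0.0, 0.0, 0.0]
    for i in range(2, -1, -1):
        x[i] = (M[i][3] - sum(M[i][c] * x[c] for c in range(i + 1, 3))) / M[i][i]
    return x

def fit_effort_model(breath_rates, heart_rates, effort):
    """Least-squares fit: effort ~ w0 + w1*breath_rate + w2*heart_rate,
    via the normal equations (X^T X) w = X^T y."""
    X = [[1.0, br, hr] for br, hr in zip(breath_rates, heart_rates)]
    XtX = [[sum(row[i] * row[j] for row in X) for j in range(3)] for i in range(3)]
    Xty = [sum(row[i] * y for row, y in zip(X, effort)) for i in range(3)]
    return solve3(XtX, Xty)

def predict_effort(w, breath_rate, heart_rate):
    return w[0] + w[1] * breath_rate + w[2] * heart_rate
```

The feedback loop of claim 42 would amount to refitting (or incrementally updating) the weights as new (sensor data, reported effort) pairs arrive.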
46. A method, comprising:
acquiring, via a motion sensor, acceleration data of the user when the user performs an exercise;
estimating, via a processor, a step pattern for the user based on the acceleration data, the step pattern indicating a first set of time points at which the user's foot hits the ground while performing the exercise;
acquiring, via a breathing sensor, breathing data of the user when the user performs the exercise; and
estimating, via the processor, a breath pattern for the user based on the breathing data, the breath pattern indicating a second set of time points at which the user inhales and exhales while performing the exercise.
47. The method of claim 46, further comprising:
estimating, via the processor, synchronization of steps and breath of the user based on the step pattern and the breath pattern.
48. The method of claim 46, further comprising:
estimating, via the processor, the number of steps taken by the user during each cycle of breath based on the step pattern and the breath pattern.
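Given the two sets of time points from claims 46–48, the steps-per-breath-cycle estimate reduces to counting foot strikes inside each breath interval. The sketch below (illustrative function names; sorted timestamp lists assumed) uses a binary search over the step times.

```python
from bisect import bisect_left

def steps_per_breath_cycle(step_times, breath_starts):
    """Count foot strikes in each full breath cycle
    [breath_starts[i], breath_starts[i+1]).
    Both arguments are sorted lists of timestamps in seconds."""
    counts = []
    for start, end in zip(breath_starts, breath_starts[1:]):
        counts.append(bisect_left(step_times, end) - bisect_left(step_times, start))
    return counts
```

A constant count across cycles would indicate steady step-breath synchronization (claim 47); drift in the count flags desynchronization.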
PCT/CA2017/051091 2016-09-16 2017-09-15 Systems, devices, and methods for biometric assessment WO2018049531A1 (en)

Priority Applications (2)

Application Number Priority Date Filing Date Title
EP17849972.9A EP3515301A4 (en) 2016-09-16 2017-09-15 Systems, devices, and methods for biometric assessment
CA3046375A CA3046375A1 (en) 2016-09-16 2017-09-15 Systems, devices, and methods for biometric assessment

Applications Claiming Priority (2)

Application Number Priority Date Filing Date Title
US201662395643P 2016-09-16 2016-09-16
US62/395,643 2016-09-16

Publications (1)

Publication Number Publication Date
WO2018049531A1 true WO2018049531A1 (en) 2018-03-22

Family

ID=61619039

Family Applications (1)

Application Number Title Priority Date Filing Date
PCT/CA2017/051091 WO2018049531A1 (en) 2016-09-16 2017-09-15 Systems, devices, and methods for biometric assessment

Country Status (3)

Country Link
EP (1) EP3515301A4 (en)
CA (1) CA3046375A1 (en)
WO (1) WO2018049531A1 (en)


Citations (7)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US20100286534A1 (en) * 2006-06-30 2010-11-11 Alterna, Limited Methods and systems for assessing metabolic transition points
US20130110265A1 (en) * 2011-11-01 2013-05-02 Polar Electro Oy Performance intensity zones
US8855756B2 (en) * 2009-05-18 2014-10-07 Adidas Ag Methods and program products for providing heart rate information
US20140350361A1 (en) * 2006-11-01 2014-11-27 Resmed Sensor Technologies Limited System and method for monitoring cardiorespiratory parameters
US20150209615A1 (en) * 2014-01-27 2015-07-30 Sally Edwards Zoning Method of Processing Threshold and Metabolic and Heart Rate Training Data and Sensors and Apparatus for Displaying the Same
US20150297133A1 (en) * 2012-11-28 2015-10-22 Iee International Electronics & Engineering S.A. Method and system for determining a ventilatory threshold
US20160058313A1 (en) * 2014-08-27 2016-03-03 Seiko Epson Corporation Biological information measuring device


Non-Patent Citations (1)

* Cited by examiner, † Cited by third party
Title
See also references of EP3515301A4 *

Cited By (6)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN112040859A (en) * 2018-04-12 2020-12-04 日本电信电话株式会社 Anaerobic metabolism threshold estimation method and device
CN112040859B (en) * 2018-04-12 2024-06-04 日本电信电话株式会社 Anaerobic metabolism threshold estimation method and device
RU2760994C2 (en) * 2018-08-01 2021-12-02 Юрий Викторович Бабченко Device for measuring heart work parameters
WO2022243235A1 (en) * 2021-05-18 2022-11-24 Age Impulse Portable device for accurately and concisely characterising the fitness state of active individuals as well as for accurately calculating and detecting in real-time their ventilatory thresholds
FR3122983A1 (en) * 2021-05-18 2022-11-25 Age Impulse Portable device allowing to characterize with precision and in a synthetic way the state of physical form of individuals in activity as well as to calculate and detect in real time and with precision their ventilatory thresholds
CN115316985A (en) * 2022-10-13 2022-11-11 华南师范大学 Heart information detection method, device and equipment based on physiological signals

Also Published As

Publication number Publication date
EP3515301A4 (en) 2021-07-07
CA3046375A1 (en) 2018-03-22
EP3515301A1 (en) 2019-07-31

Similar Documents

Publication Publication Date Title
US11298036B2 (en) Wearable device including PPG and inertial sensors for assessing physical activity and biometric parameters
US20220256255A1 (en) System and method communicating biofeedback to a user through a wearable device
JP6502361B2 (en) System and method for estimating cardiovascular fitness of a person
US20190254590A1 (en) Method and apparatus for providing biofeedback during meditation exercise
US11679300B2 (en) Systems and methods for real-time data quantification, acquisition, analysis, and feedback
US20150342518A1 (en) System and method to monitor, guide, and evaluate breathing, utilizing posture and diaphragm sensor signals
JP2019522534A (en) Determination system and method for determining the sleep stage of a subject
WO2016157217A2 (en) Technological device to assist user in workouts and healthy living
WO2018049531A1 (en) Systems, devices, and methods for biometric assessment
Angelucci et al. A wearable system for respiratory signal filtering based on activity: a preliminary validation
KR20180043517A (en) Method and device for the measurement of energy consumption based on vital/motion signals
US20220395181A1 (en) System and methods for sensor-based detection of sleep characteristics and generating animation depiction of the same
Tong et al. The influence of treadmill on postural control
JP2020010772A (en) Pulse detection method and pulse detection system
Jatobá et al. Obtaining energy expenditure and physical activity from acceleration signals for context-aware evaluation of cardiovascular parameters
Delgado-Gonzalo et al. Physical activity
JP5807700B2 (en) Calorie consumption calculation device and calorie consumption calculation method
US20230084864A1 (en) Method And Device That Generates A Respiration Signal
Cardoso Sensors fusion and movement analysis for sports performance optimization
JP2024018876A (en) Information processing system, server, information processing method, program, and learning model
JP2024018807A (en) Information processing system, server, information processing method, program, and learning model
GB2621222A (en) Method and system for estimating cardiovascular fitness and maximum heart rate for a user
CN116236187A (en) Biomechanics evaluation system, biomechanics sensing device and biomechanics evaluation platform
Mitchell A machine learning framework for automatic human activity classification from wearable sensors

Legal Events

Date Code Title Description
121 Ep: the epo has been informed by wipo that ep was designated in this application
Ref document number: 17849972; Country of ref document: EP; Kind code of ref document: A1
NENP Non-entry into the national phase
Ref country code: DE
ENP Entry into the national phase
Ref document number: 2017849972; Country of ref document: EP; Effective date: 20190416
Ref document number: 3046375; Country of ref document: CA