US20240122486A1 - Physiological monitoring soundbar - Google Patents
- Publication number
- US20240122486A1 (application Ser. No. 18/486,730)
- Authority
- US
- United States
- Prior art keywords
- user
- soundbar
- sensor
- image data
- data
- Prior art date
- Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
- Pending
Classifications
-
- G—PHYSICS
- G01—MEASURING; TESTING
- G01G—WEIGHING
- G01G19/00—Weighing apparatus or methods adapted for special purposes not provided for in the preceding groups
- G01G19/44—Weighing apparatus or methods adapted for special purposes not provided for in the preceding groups for weighing persons
- G01G19/50—Weighing apparatus or methods adapted for special purposes not provided for in the preceding groups for weighing persons having additional measuring devices, e.g. for height
-
- A—HUMAN NECESSITIES
- A61—MEDICAL OR VETERINARY SCIENCE; HYGIENE
- A61B—DIAGNOSIS; SURGERY; IDENTIFICATION
- A61B5/00—Measuring for diagnostic purposes; Identification of persons
- A61B5/0002—Remote monitoring of patients using telemetry, e.g. transmission of vital signals via a communication network
-
- A—HUMAN NECESSITIES
- A61—MEDICAL OR VETERINARY SCIENCE; HYGIENE
- A61B—DIAGNOSIS; SURGERY; IDENTIFICATION
- A61B5/00—Measuring for diagnostic purposes; Identification of persons
- A61B5/01—Measuring temperature of body parts ; Diagnostic temperature sensing, e.g. for malignant or inflamed tissue
- A61B5/015—By temperature mapping of body part
-
- A—HUMAN NECESSITIES
- A61—MEDICAL OR VETERINARY SCIENCE; HYGIENE
- A61B—DIAGNOSIS; SURGERY; IDENTIFICATION
- A61B5/00—Measuring for diagnostic purposes; Identification of persons
- A61B5/103—Detecting, measuring or recording devices for testing the shape, pattern, colour, size or movement of the body or parts thereof, for diagnostic purposes
- A61B5/107—Measuring physical dimensions, e.g. size of the entire body or parts thereof
- A61B5/1079—Measuring physical dimensions, e.g. size of the entire body or parts thereof using optical or photographic means
-
- A—HUMAN NECESSITIES
- A61—MEDICAL OR VETERINARY SCIENCE; HYGIENE
- A61B—DIAGNOSIS; SURGERY; IDENTIFICATION
- A61B5/00—Measuring for diagnostic purposes; Identification of persons
- A61B5/48—Other medical applications
- A61B5/4869—Determining body composition
-
- A—HUMAN NECESSITIES
- A61—MEDICAL OR VETERINARY SCIENCE; HYGIENE
- A61B—DIAGNOSIS; SURGERY; IDENTIFICATION
- A61B5/00—Measuring for diagnostic purposes; Identification of persons
- A61B5/74—Details of notification to user or communication with user or patient ; user input means
- A61B5/7405—Details of notification to user or communication with user or patient ; user input means using sound
-
- A—HUMAN NECESSITIES
- A61—MEDICAL OR VETERINARY SCIENCE; HYGIENE
- A61B—DIAGNOSIS; SURGERY; IDENTIFICATION
- A61B5/00—Measuring for diagnostic purposes; Identification of persons
- A61B5/74—Details of notification to user or communication with user or patient ; user input means
- A61B5/742—Details of notification to user or communication with user or patient ; user input means using visual displays
-
- A—HUMAN NECESSITIES
- A61—MEDICAL OR VETERINARY SCIENCE; HYGIENE
- A61B—DIAGNOSIS; SURGERY; IDENTIFICATION
- A61B5/00—Measuring for diagnostic purposes; Identification of persons
- A61B5/74—Details of notification to user or communication with user or patient ; user input means
- A61B5/742—Details of notification to user or communication with user or patient ; user input means using visual displays
- A61B5/744—Displaying an avatar, e.g. an animated cartoon character
-
- A—HUMAN NECESSITIES
- A61—MEDICAL OR VETERINARY SCIENCE; HYGIENE
- A61B—DIAGNOSIS; SURGERY; IDENTIFICATION
- A61B5/00—Measuring for diagnostic purposes; Identification of persons
- A61B5/74—Details of notification to user or communication with user or patient ; user input means
- A61B5/7475—User input or interface means, e.g. keyboard, pointing device, joystick
-
- G—PHYSICS
- G01—MEASURING; TESTING
- G01G—WEIGHING
- G01G19/00—Weighing apparatus or methods adapted for special purposes not provided for in the preceding groups
- G01G19/44—Weighing apparatus or methods adapted for special purposes not provided for in the preceding groups for weighing persons
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06T—IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
- G06T7/00—Image analysis
- G06T7/0002—Inspection of images, e.g. flaw detection
- G06T7/0012—Biomedical image inspection
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06T—IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
- G06T7/00—Image analysis
- G06T7/60—Analysis of geometric attributes
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06T—IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
- G06T7/00—Image analysis
- G06T7/70—Determining position or orientation of objects or cameras
-
- G—PHYSICS
- G16—INFORMATION AND COMMUNICATION TECHNOLOGY [ICT] SPECIALLY ADAPTED FOR SPECIFIC APPLICATION FIELDS
- G16H—HEALTHCARE INFORMATICS, i.e. INFORMATION AND COMMUNICATION TECHNOLOGY [ICT] SPECIALLY ADAPTED FOR THE HANDLING OR PROCESSING OF MEDICAL OR HEALTHCARE DATA
- G16H50/00—ICT specially adapted for medical diagnosis, medical simulation or medical data mining; ICT specially adapted for detecting, monitoring or modelling epidemics or pandemics
- G16H50/30—ICT specially adapted for medical diagnosis, medical simulation or medical data mining; ICT specially adapted for detecting, monitoring or modelling epidemics or pandemics for calculating health indices; for individual health risk assessment
-
- H—ELECTRICITY
- H04—ELECTRIC COMMUNICATION TECHNIQUE
- H04R—LOUDSPEAKERS, MICROPHONES, GRAMOPHONE PICK-UPS OR LIKE ACOUSTIC ELECTROMECHANICAL TRANSDUCERS; DEAF-AID SETS; PUBLIC ADDRESS SYSTEMS
- H04R5/00—Stereophonic arrangements
- H04R5/02—Spatial or constructional arrangements of loudspeakers
-
- A—HUMAN NECESSITIES
- A61—MEDICAL OR VETERINARY SCIENCE; HYGIENE
- A61B—DIAGNOSIS; SURGERY; IDENTIFICATION
- A61B5/00—Measuring for diagnostic purposes; Identification of persons
- A61B5/0002—Remote monitoring of patients using telemetry, e.g. transmission of vital signals via a communication network
- A61B5/0015—Remote monitoring of patients using telemetry, e.g. transmission of vital signals via a communication network characterised by features of the telemetry system
- A61B5/0022—Monitoring a patient using a global network, e.g. telephone networks, internet
-
- A—HUMAN NECESSITIES
- A61—MEDICAL OR VETERINARY SCIENCE; HYGIENE
- A61B—DIAGNOSIS; SURGERY; IDENTIFICATION
- A61B5/00—Measuring for diagnostic purposes; Identification of persons
- A61B5/02—Detecting, measuring or recording pulse, heart rate, blood pressure or blood flow; Combined pulse/heart-rate/blood pressure determination; Evaluating a cardiovascular condition not otherwise provided for, e.g. using combinations of techniques provided for in this group with electrocardiography or electroauscultation; Heart catheters for measuring blood pressure
- A61B5/0205—Simultaneously evaluating both cardiovascular conditions and different types of body conditions, e.g. heart and respiratory condition
- A61B5/02055—Simultaneously evaluating both cardiovascular condition and temperature
-
- A—HUMAN NECESSITIES
- A61—MEDICAL OR VETERINARY SCIENCE; HYGIENE
- A61B—DIAGNOSIS; SURGERY; IDENTIFICATION
- A61B5/00—Measuring for diagnostic purposes; Identification of persons
- A61B5/02—Detecting, measuring or recording pulse, heart rate, blood pressure or blood flow; Combined pulse/heart-rate/blood pressure determination; Evaluating a cardiovascular condition not otherwise provided for, e.g. using combinations of techniques provided for in this group with electrocardiography or electroauscultation; Heart catheters for measuring blood pressure
- A61B5/021—Measuring pressure in heart or blood vessels
- A61B5/02108—Measuring pressure in heart or blood vessels from analysis of pulse wave characteristics
-
- A—HUMAN NECESSITIES
- A61—MEDICAL OR VETERINARY SCIENCE; HYGIENE
- A61B—DIAGNOSIS; SURGERY; IDENTIFICATION
- A61B5/00—Measuring for diagnostic purposes; Identification of persons
- A61B5/02—Detecting, measuring or recording pulse, heart rate, blood pressure or blood flow; Combined pulse/heart-rate/blood pressure determination; Evaluating a cardiovascular condition not otherwise provided for, e.g. using combinations of techniques provided for in this group with electrocardiography or electroauscultation; Heart catheters for measuring blood pressure
- A61B5/024—Detecting, measuring or recording pulse rate or heart rate
- A61B5/02416—Detecting, measuring or recording pulse rate or heart rate using photoplethysmograph signals, e.g. generated by infrared radiation
-
- A—HUMAN NECESSITIES
- A61—MEDICAL OR VETERINARY SCIENCE; HYGIENE
- A61B—DIAGNOSIS; SURGERY; IDENTIFICATION
- A61B5/00—Measuring for diagnostic purposes; Identification of persons
- A61B5/02—Detecting, measuring or recording pulse, heart rate, blood pressure or blood flow; Combined pulse/heart-rate/blood pressure determination; Evaluating a cardiovascular condition not otherwise provided for, e.g. using combinations of techniques provided for in this group with electrocardiography or electroauscultation; Heart catheters for measuring blood pressure
- A61B5/026—Measuring blood flow
- A61B5/029—Measuring or recording blood output from the heart, e.g. minute volume
-
- A—HUMAN NECESSITIES
- A61—MEDICAL OR VETERINARY SCIENCE; HYGIENE
- A61B—DIAGNOSIS; SURGERY; IDENTIFICATION
- A61B5/00—Measuring for diagnostic purposes; Identification of persons
- A61B5/08—Detecting, measuring or recording devices for evaluating the respiratory organs
- A61B5/0816—Measuring devices for examining respiratory frequency
-
- A—HUMAN NECESSITIES
- A61—MEDICAL OR VETERINARY SCIENCE; HYGIENE
- A61B—DIAGNOSIS; SURGERY; IDENTIFICATION
- A61B5/00—Measuring for diagnostic purposes; Identification of persons
- A61B5/08—Detecting, measuring or recording devices for evaluating the respiratory organs
- A61B5/091—Measuring volume of inspired or expired gases, e.g. to determine lung capacity
-
- A—HUMAN NECESSITIES
- A61—MEDICAL OR VETERINARY SCIENCE; HYGIENE
- A61B—DIAGNOSIS; SURGERY; IDENTIFICATION
- A61B5/00—Measuring for diagnostic purposes; Identification of persons
- A61B5/145—Measuring characteristics of blood in vivo, e.g. gas concentration, pH value; Measuring characteristics of body fluids or tissues, e.g. interstitial fluid, cerebral tissue
- A61B5/1455—Measuring characteristics of blood in vivo, e.g. gas concentration, pH value; Measuring characteristics of body fluids or tissues, e.g. interstitial fluid, cerebral tissue using optical sensors, e.g. spectral photometrical oximeters
- A61B5/14551—Measuring characteristics of blood in vivo, e.g. gas concentration, pH value; Measuring characteristics of body fluids or tissues, e.g. interstitial fluid, cerebral tissue using optical sensors, e.g. spectral photometrical oximeters for measuring blood gases
-
- A—HUMAN NECESSITIES
- A61—MEDICAL OR VETERINARY SCIENCE; HYGIENE
- A61B—DIAGNOSIS; SURGERY; IDENTIFICATION
- A61B5/00—Measuring for diagnostic purposes; Identification of persons
- A61B5/48—Other medical applications
- A61B5/4869—Determining body composition
- A61B5/4872—Body fat
-
- A—HUMAN NECESSITIES
- A61—MEDICAL OR VETERINARY SCIENCE; HYGIENE
- A61B—DIAGNOSIS; SURGERY; IDENTIFICATION
- A61B5/00—Measuring for diagnostic purposes; Identification of persons
- A61B5/48—Other medical applications
- A61B5/4869—Determining body composition
- A61B5/4875—Hydration status, fluid retention of the body
-
- A—HUMAN NECESSITIES
- A61—MEDICAL OR VETERINARY SCIENCE; HYGIENE
- A61B—DIAGNOSIS; SURGERY; IDENTIFICATION
- A61B5/00—Measuring for diagnostic purposes; Identification of persons
- A61B5/74—Details of notification to user or communication with user or patient ; user input means
- A61B5/7465—Arrangements for interactive communication between patient and care services, e.g. by using a telephone network
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06T—IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
- G06T2200/00—Indexing scheme for image data processing or generation, in general
- G06T2200/24—Indexing scheme for image data processing or generation, in general involving graphical user interfaces [GUIs]
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06T—IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
- G06T2207/00—Indexing scheme for image analysis or image enhancement
- G06T2207/10—Image acquisition modality
- G06T2207/10048—Infrared image
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06T—IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
- G06T2207/00—Indexing scheme for image analysis or image enhancement
- G06T2207/30—Subject of image; Context of image processing
- G06T2207/30004—Biomedical image processing
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06T—IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
- G06T2207/00—Indexing scheme for image analysis or image enhancement
- G06T2207/30—Subject of image; Context of image processing
- G06T2207/30196—Human being; Person
-
- H—ELECTRICITY
- H04—ELECTRIC COMMUNICATION TECHNIQUE
- H04R—LOUDSPEAKERS, MICROPHONES, GRAMOPHONE PICK-UPS OR LIKE ACOUSTIC ELECTROMECHANICAL TRANSDUCERS; DEAF-AID SETS; PUBLIC ADDRESS SYSTEMS
- H04R1/00—Details of transducers, loudspeakers or microphones
- H04R1/02—Casings; Cabinets ; Supports therefor; Mountings therein
- H04R1/028—Casings; Cabinets ; Supports therefor; Mountings therein associated with devices performing functions other than acoustics, e.g. electric candles
Definitions
- the present disclosure relates to medical monitoring. Specifically, the disclosure describes, among other things, devices, systems, and methods for monitoring and/or displaying information regarding a user's physiological information.
- Soundbars are used to play audio signals such as music.
- Devices, such as scales, can determine physiology-related data of a subject.
- Scales can determine a subject's weight and/or body mass index (BMI).
- Current determinations of BMI are based on limited information such as a subject's weight and percent body fat.
- a soundbar for monitoring a physiological health of a user can comprise a speaker, a sensor, and one or more hardware processors.
- the speaker can emit audio which can comprise one or more of music, an alert, information relating to a physiology of the user, or instructions to the user.
- the sensor can obtain sensor data as the user is within a proximity of the soundbar.
- the sensor can comprise one or more cameras configured to capture electromagnetic radiation including infrared radiation and visible light radiation.
- the sensor data can comprise image data of the user and can relate to a physiology of the user.
- the one or more hardware processors can determine a distribution of body heat of the user based on at least the image data of the user.
- the image data can comprise infrared image data indicating thermal energy.
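The heat-distribution step above can be sketched as follows: convert a raw infrared frame to surface temperatures and average per body region. The linear calibration constants, function name, and region layout here are illustrative assumptions, not taken from the patent; real thermal cameras supply their own radiometric calibration.

```python
import numpy as np

# Hypothetical linear calibration from raw IR counts to degrees Celsius.
IR_GAIN = 0.04
IR_OFFSET = -10.0

def body_heat_distribution(ir_frame, region_masks):
    """Return the mean surface temperature per named body region.

    ir_frame: 2D array of raw infrared counts.
    region_masks: dict mapping region name -> boolean mask of the same shape.
    """
    temps = ir_frame * IR_GAIN + IR_OFFSET
    return {name: float(temps[mask].mean()) for name, mask in region_masks.items()}

# Synthetic uniform frame: 1150 counts -> 1150 * 0.04 - 10 = 36.0 deg C
frame = np.full((4, 4), 1150.0)
masks = {"face": np.zeros((4, 4), bool)}
masks["face"][:2, :] = True
dist = body_heat_distribution(frame, masks)
```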
- the one or more hardware processors can determine a distribution of body weight of the user based on at least the image data of the user.
- the image data can comprise visible light image data.
- the one or more hardware processors can determine a health index of the user from at least the distribution of body weight of the user or the distribution of body heat of the user.
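The patent claims a health index derived from the weight and heat distributions but does not disclose a formula. As a toy illustration only, one could start from a perfect score and subtract penalties for deviation from expected values; every constant below is an assumption.

```python
def health_index(weight_dist, heat_dist, expected_weight, expected_temp_c=36.5):
    """Toy 0-100 health index: penalize deviation from an expected per-region
    weight distribution and from normal skin temperature (illustrative only).

    weight_dist / expected_weight: fraction of body weight per region.
    heat_dist: mean surface temperature (deg C) per region.
    """
    weight_penalty = 100.0 * sum(
        abs(weight_dist[r] - expected_weight[r]) for r in weight_dist
    )
    temp_penalty = sum(abs(t - expected_temp_c) for t in heat_dist.values())
    return max(0.0, 100.0 - weight_penalty - 2.0 * temp_penalty)

score = health_index(
    weight_dist={"torso": 0.50, "legs": 0.50},
    heat_dist={"face": 37.0},
    expected_weight={"torso": 0.45, "legs": 0.55},
)
```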
- the sensor can obtain the sensor data comprising the image data of the user as the user rotates in front of the soundbar, the image data of the user corresponding to a plurality of portions of the user's body.
- the soundbar can further comprise a communication component configured to communicate with one or more computing devices.
- the one or more computing devices can include a scale configured to measure a weight of the user.
- the one or more hardware processors can generate one or more instructions to the scale to cause the scale to rotate as the user stands on the scale and cause the sensor to obtain the sensor data as the scale rotates.
- the soundbar can further comprise a communication component configured to communicate with one or more computing devices.
- the one or more computing devices can include a scale configured to measure a weight of the user.
- the one or more hardware processors can access scale data obtained from the scale by the communication component, the scale data including at least a weight of the user; and determine the health index of the user based on at least the scale data.
- the scale data further includes at least one or more of a percent body fat of the user, a percent lean muscle mass of the user, a percent water of the user, a BMI of the user, a change in weight of the user, or ECG data of the user.
- the one or more hardware processors is further configured to determine a likelihood the user has an infectious disease based on at least the distribution of body heat of the user, the distribution of body heat of the user indicating a body temperature of the user, the infectious disease comprising a virus.
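The claimed likelihood of infectious disease can be illustrated with a minimal sketch: map the measured facial surface temperature to a 0-1 value with a logistic curve. The threshold and scale are illustrative assumptions; the patent claims only "a likelihood," not this formula.

```python
import math

def fever_likelihood(face_temp_c, threshold_c=38.0, scale=0.5):
    """Rough probability of fever from facial surface temperature, via a
    logistic curve centered on an assumed 38 deg C threshold."""
    return 1.0 / (1.0 + math.exp(-(face_temp_c - threshold_c) / scale))
```

At the assumed threshold the likelihood is exactly 0.5, rising steeply as the measured temperature exceeds it.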
- the one or more hardware processors is further configured to process the image data to generate PPG data and determine one or more physiological parameters based on the PPG data.
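Generating PPG data from image data is commonly done with remote photoplethysmography. A minimal sketch (not the patented algorithm): average the green channel of a skin region per frame, remove the DC component, and take the dominant frequency in a plausible cardiac band.

```python
import numpy as np

def heart_rate_bpm(frames, fps):
    """Estimate pulse rate in beats per minute from video frames.

    frames: float array of shape (n_frames, height, width, 3), RGB order.
    fps: frame rate of the capture.
    """
    signal = frames[..., 1].mean(axis=(1, 2))   # per-frame green average
    signal = signal - signal.mean()             # drop the DC offset
    spectrum = np.abs(np.fft.rfft(signal))
    freqs = np.fft.rfftfreq(len(signal), d=1.0 / fps)
    band = (freqs >= 0.7) & (freqs <= 4.0)      # ~42-240 bpm plausible band
    return 60.0 * freqs[band][np.argmax(spectrum[band])]

# Synthetic check: a 1.2 Hz pulse captured at 30 fps should read as 72 bpm.
t = np.arange(300) / 30.0
frames = np.zeros((300, 2, 2, 3))
frames[..., 1] = (100.0 + 5.0 * np.sin(2 * np.pi * 1.2 * t))[:, None, None]
rate = heart_rate_bpm(frames, fps=30.0)
```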
- the one or more hardware processors is further configured to cause the speaker to emit one or more of an instruction to the user or information relating to the health index of the user.
- the soundbar further comprises a communication component configured to communicate with one or more computing devices, the one or more computing devices including a display, and the one or more hardware processors is further configured to: generate user interface data for rendering one or more user interfaces comprising indicia of the health index of the user; and cause the communication component to communicate the user interface data to the display to render the one or more user interfaces.
- the one or more user interfaces further comprises one or more images corresponding to the image data of the user, the one or more images comprising one or more historical images or one or more real-time images.
- the soundbar further comprises a communication component configured to communicate with one or more computing devices, the one or more computing devices including a scale configured to measure a weight of the user, and the one or more hardware processors is further configured to: cause the sensor to turn on or to obtain the sensor data responsive to one or more signals received from the scale by the communication component, the one or more signals generated by the scale responsive to a user standing on the scale.
- the one or more hardware processors is further configured to: determine the distribution of body heat of the user based on at least the image data of the user, the image data comprising historical image data or real-time image data; and determine the distribution of body weight of the user based on at least the image data of the user, the image data comprising historical image data or real-time image data.
- the one or more cameras comprise one or more of a plurality of cameras, a 3D camera, a depth camera, a stereovision camera, an infrared camera, or a light detection and ranging (LIDAR) sensor.
- the one or more hardware processors is further configured to cause the one or more cameras to adjust a view of the one or more cameras, wherein adjusting the view of the one or more cameras comprises one or more of rotating the one or more cameras, adjusting a zoom of the one or more cameras, pivoting the one or more cameras, tilting the one or more cameras, or panning the one or more cameras.
- the one or more hardware processors is further configured to cause a communication component of the soundbar to implement wireless communication with a remote computing device to establish a video call; cause the communication component to transmit the image data of the user to the remote computing device; and generate user interface data for rendering one or more user interfaces comprising one or more images received from the remote computing device by the communication component.
- a method for monitoring the health of a user can comprise: accessing sensor data originating from a sensor of a soundbar, the sensor comprising one or more cameras configured to capture electromagnetic radiation including infrared radiation and visible light radiation, the sensor data comprising image data of the user and relating to a physiology of the user; determining a distribution of body heat of the user based on at least the image data of the user, the image data comprising infrared image data indicating thermal energy; determining a distribution of body weight of the user based on at least the image data of the user, the image data comprising visible light image data; determining a health index of the user from at least the distribution of body weight of the user or the distribution of body heat of the user; and causing a speaker of the soundbar to emit audio comprising one or more of music, an alert, information relating to the health index, or instructions to the user.
- the method can further comprise: generating one or more instructions to a scale to cause the scale to rotate as the user stands on the scale; and causing the sensor to obtain the sensor data as the scale rotates.
- Non-transitory computer-readable media including computer-executable instructions that, when executed by a computing system, can cause the computing system to perform operations comprising: accessing sensor data originating from a sensor of a soundbar, the sensor comprising one or more cameras configured to capture electromagnetic radiation including infrared radiation and visible light radiation, the sensor data comprising image data of the user and relating to a physiology of the user; determining a distribution of body heat of the user based on at least the image data of the user, the image data comprising infrared image data indicating thermal energy; determining a distribution of body weight of the user based on at least the image data of the user, the image data comprising visible light image data; determining a health index of the user from at least the distribution of body weight of the user or the distribution of body heat of the user; and causing a speaker of the soundbar to emit audio comprising one or more of music, an alert, information relating to the health index, or instructions to the user.
- the computer-executable instructions when executed by the computing system, further cause the computing system to perform operations comprising: generating one or more instructions to a scale to cause the scale to rotate as the user stands on the scale; and causing the sensor to obtain the sensor data as the scale rotates.
- the present disclosure provides a soundbar for performing physiological measurements.
- the soundbar may comprise a speaker, a sensor, a memory, and a hardware processor.
- the speaker can be configured to emit audio signals.
- the sensor can be configured to obtain sensor data relating to a physiology of a user.
- the sensor can include a camera.
- the sensor data can include one or more images.
- the memory can be configured to store the sensor data.
- the hardware processor can be configured to access the sensor data.
- the hardware processor can be configured to determine a distribution of body weight of the user based on at least the one or more images.
- the hardware processor can be configured to determine a health index of the user based on at least the distribution of body weight of the user.
- the sensor includes a 3D camera.
- the sensor includes an infrared camera configured to capture radiation in the infrared portion of the electromagnetic spectrum.
- the hardware processor is further configured to determine a distribution of heat of the user's body based on at least the sensor data obtained from the infrared camera.
- the hardware processor is further configured to compare multiple images of the user.
- the multiple images include one or more historical images and one or more current images.
- the hardware processor is further configured to perform image processing and/or pattern recognition on images obtained by the sensor.
- the hardware processor is further configured to determine an orientation of the user based on at least the image processing and/or pattern recognition of the images.
- the hardware processor is further configured to generate instructions to the user for the user to change orientation, wherein the instructions are based on at least the determined orientation.
- the soundbar is further in communication with a scale configured to determine a weight of the user.
- the hardware processor is further configured to generate instructions to cause the scale to rotate.
- the sensor includes a light detection and ranging (LIDAR) sensor.
- the hardware processor is further configured to determine one or more dimensions of one or more portions of the user's body based on at least the sensor data.
- the hardware processor is further configured to generate user interface data for rendering a display; and communicate the user interface data to a computing device to be displayed.
- the present disclosure provides a soundbar for medical monitoring.
- the soundbar may comprise a speaker, a sensor, and a hardware processor.
- the speaker can be configured to emit audio signals.
- the sensor can be configured to obtain sensor data relating to a physiology of a subject.
- the sensor can include a camera.
- the sensor data can include one or more images.
- the hardware processor can be configured to access the sensor data.
- the hardware processor can be configured to determine an orientation of the subject based on at least the one or more images.
- the hardware processor can be configured to determine an amount of time the subject has been oriented in the determined orientation.
- the hardware processor can be configured to generate an alarm based on at least the determined orientation and/or the determined amount of time.
- the subject is an infant.
- the subject is a hospital patient.
- the determined orientation includes one or more of an upright orientation, a supine orientation, a prone orientation, a side orientation, or a fall orientation.
- the hardware processor is further configured to generate the alarm in response to determining that the determined amount of time exceeds a threshold.
- the present disclosure provides a soundbar for medical monitoring.
- the soundbar may comprise a speaker, a sensor, and a hardware processor.
- the speaker can be configured to emit audio signals.
- the sensor can be configured to obtain sensor data relating to a physiology of a subject.
- the sensor can include an infrared sensor.
- the sensor data can include infrared energy data indicating thermal energy.
- the hardware processor can be configured to access the sensor data.
- the hardware processor can be configured to determine a distribution of body heat of the subject based on at least the infrared energy data.
- the hardware processor can be configured to generate an alarm based on at least the distribution of body heat.
- the subject is a baby.
- the subject is a hospital patient.
- the present disclosure provides a soundbar for medical monitoring.
- the soundbar may comprise a speaker, a sensor, and a hardware processor.
- the speaker can be configured to emit audio signals.
- the sensor can be configured to obtain sensor data relating to a physiology of a subject.
- the sensor can include a camera.
- the sensor data can include one or more images.
- the hardware processor can be configured to generate instructions for a user to perform a health-related activity.
- the hardware processor can be configured to access the sensor data including sensor data obtained while the user performs the health-related activity.
- the hardware processor can be configured to determine a health status of the user based on at least the sensor data obtained while the user performs the health-related activity.
- the instructions include one or more audio signals emitted from the speaker and/or user interface data rendered on a display in communication with the soundbar.
- the present disclosure provides a soundbar for medical monitoring.
- the soundbar may comprise a speaker, a sensor, and a hardware processor.
- the speaker can be configured to emit audio signals.
- the sensor can be configured to obtain sensor data relating to a physiology of a subject.
- the sensor can include a camera.
- the sensor data can include one or more images.
- the hardware processor can be configured to access the one or more images including at least images of the user's face.
- the hardware processor can be configured to analyze one or more facial features of the user based on at least the images of the user's face.
- the hardware processor can be configured to determine a health status of the user based on at least the images of the user's face.
- the health status includes a stroke.
- the present disclosure provides a soundbar for medical monitoring.
- the soundbar may comprise a speaker, a sensor, and a hardware processor.
- the speaker can be configured to emit audio signals.
- the sensor can be configured to obtain sensor data relating to a physiology of a subject.
- the sensor can include a camera.
- the sensor data can include one or more images.
- the hardware processor can be configured to access the sensor data.
- the hardware processor can be configured to access statistical information including physiological data relating to a group of people.
- the hardware processor can be configured to determine a health status of the user based on at least the sensor data and the statistical information.
- the present disclosure provides a soundbar for medical monitoring.
- the soundbar may comprise a speaker, a sensor, and a hardware processor.
- the speaker can be configured to emit audio signals.
- the sensor can be configured to obtain sensor data relating to a physiology of a subject.
- the sensor can include a camera.
- the sensor data can include one or more images of a user while ambulating.
- the hardware processor can be configured to access the sensor data.
- the hardware processor can be configured to determine an ambulatory condition of the user based on at least the one or more images of a user while ambulating.
- the present disclosure provides a soundbar for medical monitoring.
- the soundbar may comprise a speaker, a sensor, and a hardware processor.
- the speaker can be configured to emit audio signals.
- the sensor can be configured to obtain sensor data relating to a physiology of a subject.
- the sensor can include a camera.
- the sensor data can include one or more images of a user.
- the hardware processor can be configured to access the sensor data.
- the hardware processor can be configured to determine a skin condition of the user based on at least one or more images of the user.
- the present disclosure provides a soundbar for medical monitoring.
- the soundbar may comprise a speaker, a sensor, and a hardware processor.
- the speaker can be configured to emit audio signals.
- the sensor can be configured to obtain sensor data of a location.
- the sensor can include a camera.
- the sensor data can include one or more images.
- the hardware processor can be configured to access the sensor data.
- the hardware processor can be configured to detect a subject in the one or more images.
- the hardware processor can be configured to determine whether the subject is authorized to be in the location.
- the subject is a health care provider.
- the location is a hospital room.
- the location is a home.
- the present disclosure provides a soundbar for medical monitoring.
- the soundbar may comprise a speaker, a sensor, and a hardware processor.
- the speaker can be configured to emit audio signals.
- the sensor can be configured to obtain sensor data relating to a physiology of a subject.
- the sensor can include a camera.
- the sensor data can include one or more images.
- the hardware processor can be configured to receive audio related data from a computing device remote to the soundbar.
- the hardware processor can be configured to modify an audio playback signal based on at least the audio related data.
- the hardware processor can be configured to transmit the modified audio playback signal to the speaker to be emitted by the soundbar.
- the computing device is an earbud, headphone, and/or earphone.
- the audio related data includes frequency dependent gains.
- the audio related data includes a hearing transfer function.
- the audio related data includes adjustments to one or more of a latency or phase.
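As an illustrative sketch (not the disclosed implementation), frequency-dependent gains received from an earbud or headphone could be applied to per-band playback levels before the modified signal is sent to the speaker. The band center frequencies and dB values below are hypothetical:

```python
def apply_band_gains(spectrum_db, gains_db):
    """Apply frequency-dependent gains (dB) to per-band playback levels (dB).

    spectrum_db and gains_db map a band center frequency (Hz) to a dB value.
    Bands missing from gains_db are passed through unchanged.
    """
    return {f: level + gains_db.get(f, 0.0) for f, level in spectrum_db.items()}

# Hypothetical hearing profile: boost a high band where hearing is weaker.
playback = {250: -12.0, 1000: -10.0, 4000: -14.0}
profile = {4000: 6.0}
adjusted = apply_band_gains(playback, profile)  # 4000 Hz band becomes -8.0 dB
```

A hearing transfer function could be represented the same way, with one gain entry per measured frequency.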
- systems and/or computer systems comprise a computer-readable storage medium having program instructions embodied therewith, and one or more processors configured to execute the program instructions to cause the systems and/or computer systems to perform operations comprising one or more aspects of the above- and/or below-described implementations (including one or more aspects of the appended claims).
- computer program products comprising a computer-readable storage medium
- the computer-readable storage medium has program instructions embodied therewith, the program instructions executable by one or more processors to cause the one or more processors to perform operations comprising one or more aspects of the above- and/or below-described implementations (including one or more aspects of the appended claims).
- FIG. 1 illustrates an example system for monitoring a user's physiological information.
- FIG. 2 is a schematic block diagram illustrating an example implementation of a soundbar.
- FIG. 1 illustrates an example implementation of a system 100 for monitoring physiological information of a user 103 .
- the system 100 can include a soundbar 101 .
- the system 100 may optionally include one or more of a display 109 , a mobile device 105 , and a scale 107 .
- Various components or devices of the system 100 may be in communication, such as wireless communication.
- the soundbar 101 can perform one or more measurements relating to a physiology of the user 103.
- the soundbar 101 can monitor a user 103 .
- the user may be a baby or a patient in a hospital or a user in a home.
- the soundbar 101 can include one or more sensor(s) 111 .
- the sensor(s) 111 can include one or more image sensors.
- the sensor(s) 111 can include one or more cameras.
- the one or more cameras may include CCD cameras and/or CMOS cameras.
- the sensor(s) 111 can include one or more types of cameras.
- the sensor(s) 111 can include a camera configured to capture optical radiation in the visible portion of the electromagnetic spectrum.
- the sensor(s) 111 can include a camera or sensor configured to capture optical radiation in the infrared portion of the electromagnetic spectrum.
- the sensor(s) 111 can include a 3D camera.
- the sensor(s) 111 can include a high-resolution camera.
- the sensor(s) 111 can include a plurality of cameras.
- the sensor(s) 111 can include a stereovision camera.
- the sensor(s) 111 can include a depth camera.
- the sensor(s) 111 can include a light detection and ranging (LIDAR) sensor.
- the sensor(s) 111 can include a millimeter wave (mmWave) sensor.
- the sensor(s) 111 can include an ultrawide band sensor.
- the sensor(s) 111 can be integrated into the soundbar 101 .
- the sensor(s) 111 may be disposed within a housing of the soundbar 101 .
- the sensor(s) 111 and the soundbar 101 can form a single integrated unit.
- the soundbar 101 can include one or more speakers configured to emit audio, such as music, voice audio, etc.
- the soundbar 101 can be configured to capture images of the user 103 to generate image data of the user 103 .
- the soundbar 101 such as one or more hardware processors of the soundbar 101 , can process the image data.
- the soundbar 101 can determine a health or wellness index of the user 103 based on at least the image data.
- the health index may indicate one or more physiological parameters of the user.
- the health index can indicate a body mass index (BMI) of the user 103 based on at least the information obtained from the sensor(s) 111 .
- the health index can indicate a thermal distribution of the user 103 .
- the health index can indicate a likelihood that the user 103 is infected with a pathogen, such as a bacterium or virus.
- the soundbar 101 can determine how a weight of the user 103 is distributed on the user 103 .
- the soundbar 101 can determine dimensions of the user's 103 body.
- the soundbar 101 can determine how heat is distributed on the user's 103 body.
- the soundbar 101 may be in communication with a display 109 .
- the soundbar 101 may generate user interface data and may communicate the user interface data to the display 109 .
- the display 109 may render one or more user interfaces based on at least user interface data received from the soundbar 101 .
- the user interfaces can include indicia of a health index determined by the soundbar 101.
- the user interfaces can include physiological data of the user 103 , such as physiological parameters.
- the user interfaces can include information received from one or more remote devices, such as the scale 107 or mobile device 105 .
- the user interfaces can include images corresponding to image data generated by a camera of the soundbar 101 .
- the images can correspond to the user 103 .
- the images can include real-time images of the user 103 .
- the display 109 may render one or more user interfaces including images of the user 103 as the sensor(s) 111 generate the image data.
- the images can include historical images, such as images of the user 103 corresponding to image data previously generated by the sensor(s) 111 .
- the display 109 displays image 122 of the user 103 .
- the image 122 can be a historical and/or real-time image of the user 103 .
- the image 122 may be an image that was captured by a camera sensor 111 of the soundbar in real-time as the user 103 stands in front of the soundbar 101 and looks at the display 109 .
- the image 122 may be an image that was previously captured by a camera sensor 111 of the soundbar such as one week prior, one month prior, one year prior, etc.
- the user 103 may view such a historical image to view differences between the user's 103 body then and now.
- the image 122 may be superimposed with dimensions such as captured by sensor(s) 111 such as a LIDAR sensor.
- LIDAR may image objects using one or more of ultraviolet, visible, or near-infrared light.
- the user interface can include thermal images.
- the image 124 can correspond to image data generated by an infrared sensor or camera of the soundbar 101 .
- the image 124 can include information relating to a body heat of the user 103 .
- the image 124 can be a thermal image of the user 103 with various shading or coloring indicating various temperatures.
- the image 124 can be a historical and/or real-time image of the user 103 .
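As a minimal sketch of how the shading described above might be produced, the function below maps a surface temperature to a blue-to-red color ramp. The temperature range and the linear ramp are assumed examples for illustration, not values from the disclosure:

```python
def temp_to_rgb(t_c, t_min=30.0, t_max=40.0):
    """Map a surface temperature (deg C) to an RGB triple on a blue-to-red ramp.

    Temperatures at or below t_min render pure blue, at or above t_max pure
    red; values in between blend linearly. Real thermal imagers use richer
    colormaps, so this is only a toy rendering.
    """
    x = min(max((t_c - t_min) / (t_max - t_min), 0.0), 1.0)
    return (int(255 * x), 0, int(255 * (1 - x)))
```

Applying this per pixel to infrared sensor data would yield a false-color image like image 124.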
- the user 103 may be able to control the display 109 .
- the user 103 can control the images that are displayed and how they are displayed.
- the user 103 can add and/or remove filters, such as infrared filters, to the image 124 to provide various effects.
- the images 122 and/or 124 may be an image captured by a camera and may accurately portray the user 103 in great detail.
- the images 122 and/or 124 may be a representation of the user 103 which may omit or alter visual details of the user to provide security or privacy.
- the images 122 and/or 124 may include an avatar representation of the user 103 , such as a stick figure, a cartoon, an animation, or the like.
- the sensor(s) 111 can include a privacy shutter.
- the privacy shutter can prevent the sensor(s) 111 from generating data relating to the user 103 .
- the user 103 may actuate the privacy shutter.
- the soundbar 101 may automatically implement the privacy shutter.
- the privacy shutter can, for example, cover a lens of a camera.
- the user interfaces rendered by the display 109 can also include instructions to the user 103 .
- the instructions can relate to performing a physiological measurement.
- the display 109 may display instructions to the user 103 to orient themselves in a certain direction for optimal detection by the sensor(s) 111 , or to stand still, or to rotate in a circle, or the like.
- the instructions can include text.
- the instructions can include one or more images, visual indicators, avatars, or the like.
- the soundbar 101 may generate the instructions based on real-time data generated by the sensor(s) 111 . In some implementations, the soundbar 101 may emit audio indicating instructions to the user 103 .
- the user interfaces rendered by the display 109 may include information or images that may not be related to the physiology of the user 103 and/or that may not be received from the soundbar 101.
- the display 109 may display information received via satellite, broadcast, a network, internet, and such information can include broadcast media.
- the display 109 may display information received from the soundbar 101 (e.g., physiological data relating to the user 103) in one portion and may display information received from other sources in another portion.
- the display 109 may overlay or superimpose the information received from the soundbar 101 onto images received from other sources.
- the user 103 may watch a television broadcast on the display 109 and may simultaneously view their health information on the display 109 .
- the display 109 may display one or more instructions or requests to the user 103 .
- the display 109 may display one or more exercises for the user 103 to perform.
- the user 103 may follow along with visual cues provided on the display 109 of how to perform the exercise or what tasks to perform.
- the soundbar 101 may monitor the user 103 as they perform the exercise and may determine a health status or physiological condition of the user 103, such as by processing image data of the user obtained by the sensor(s) 111 while the user performs the exercise.
- the soundbar 101 may be in communication with mobile device 105 .
- the mobile device 105 can include a smartphone.
- the mobile device 105 can receive information from the soundbar 101 such as user interface data, images obtained from a camera of the soundbar 101, and/or physiological data of the user 103.
- the mobile device 105 may be configured to render user interfaces via a display which may be similar to user interfaces shown and/or described with reference to display 109 .
- the mobile device 105 may communicate information to the soundbar 101 .
- the mobile device 105 can communicate instructions to the soundbar 101 such as to control operation of the soundbar 101 .
- the user 103 may control operation of the soundbar 101 via the mobile device 105 .
- the user can control the speakers of the soundbar 101 , the sensor(s) 111 of the soundbar 101 , and/or measurements or monitoring operations performed by the soundbar 101 .
- the mobile device 105 can communicate physiological information to the soundbar 101 , such as physiological data generated by the mobile device 105 and/or one or more wearable devices associated with the mobile device 105 .
- the soundbar 101 may be in communication with the scale 107 .
- the scale may be configured to measure a weight of the user 103 .
- the scale 107 may be configured to determine other physiological information of the user such as BMI, lean muscle mass, percent body fat, percent water weight of the user, changes in physiological information, etc.
- the scale 107 can include one or more electrodes. A user may contact the electrodes as the user 103 stands on the scale 107 (e.g., with their bare feet).
- the scale 107 can generate ECG data indicating a cardiac condition of the user 103 .
- the scale 107 can communicate physiological information to the soundbar 101 , such as physiological information generated and/or determined by the scale 107 (e.g., weight, BMI, etc.).
- the soundbar 101 may further refine physiological information received from the scale 107 , such as by using sensor data generated by the sensor(s) 111 .
- the soundbar 101 may further refine a BMI received from the scale based on at least a height of the user 103 and/or a weight distribution of the user 103 determined based on image data generated by the sensor(s) 111 .
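The refinement described above relies on the standard BMI formula, weight in kilograms divided by the square of height in meters. A minimal sketch, assuming the scale supplies weight and the sensor(s) 111 supply an estimated height:

```python
def bmi(weight_kg, height_m):
    """Body mass index from scale-reported weight and a camera- or
    LIDAR-estimated height. Both inputs are assumed already validated."""
    return weight_kg / (height_m ** 2)

# Example: 70 kg at 1.75 m gives a BMI of about 22.9.
value = bmi(70.0, 1.75)
```

The disclosure's refinement would then adjust or contextualize this value using the weight-distribution data derived from image processing.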
- the scale 107 can be configured to rotate. In some implementations, the scale 107 may rotate in response to a motor of the scale 107 generating a force to cause the scale 107 to rotate. In some implementations, the scale 107 may rotate freely such as in response to an external force applied to the scale 107 .
- the scale 107 can receive information from the soundbar 101 . The information can include instructions. For example, the soundbar 101 may communicate instructions to the scale 107 to cause the scale to rotate as the user stands on the scale such that the sensor(s) 111 may perform sensing of a plurality of portions of the body of the user 103 . In some implementations, the scale 107 may rotate automatically. For example, the scale 107 may rotate independent of any instructions received from the soundbar 101 .
- the scale 107 may rotate in response to a user 103 standing on the scale 107 .
- the scale 107 may communicate information to the soundbar 101 to control one or more operations of the soundbar 101 .
- the scale 107 may communicate instructions to cause the soundbar 101 (and/or sensors 111 ) to activate, to turn on, to begin generating sensor data, etc.
- the scale 107 may communicate instructions to the soundbar 101 to cause the sensor(s) 111 to generate sensor data (e.g., collecting images) in response to a user 103 standing on the scale 107 .
- the soundbar 101 may perform measurement operations of the user 103 without the scale.
- the user 103 may stand in front of the soundbar 101 (without the scale 107 ) as the sensor(s) 111 sense the user 103 .
- the user 103 can rotate in front of the soundbar 101 (with or without the scale 107 ).
- FIG. 2 is a schematic block diagram illustrating an example implementation of a soundbar 201 .
- the soundbar 201 can include a hardware processor 203 , a storage component 205 , a communication component 207 , a power source 209 , one or more sensors 211 , and/or one or more speakers 213 .
- the hardware processor 203 can include one or more processors configured to execute program instructions to cause the soundbar 201 , or components thereof, or other systems or devices, to perform operations.
- the processor 203 can be configured, among other things, to process data, execute instructions to perform one or more functions, and/or control the operation of the soundbar 201 .
- the processor 203 can process data such as image data obtained from the sensor(s) 211 and can execute instructions to perform functions related to processing, storing, and/or transmitting such data.
- the processor 203 can calculate physiological information of a user, such as physiological parameters.
- the processor 203 can calculate physiological information based on at least data obtained from the sensor(s) 211 .
- the processor 203 can calculate a body mass index (BMI) of a user 223 based on image data obtained from an image sensor such as a camera and/or information from a LIDAR sensor.
- the processor 203 can determine a weight distribution of the user 223 such as where on the user's body their weight is distributed.
- a weight distribution determination may improve physiological determinations such as BMI determination.
- the processor 203 may determine, based on image data, that a high percentage of the user's weight is distributed around the torso or mid-section of the user, which may indicate an unhealthy BMI.
- the processor 203 may determine that a user has their weight distributed in the legs, chest, and arms, which may indicate a healthy BMI.
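One plausible way to quantify such a weight distribution from image data is a waist-to-hip ratio computed from silhouette widths. This sketch assumes upstream segmentation has already measured the widths at the relevant rows; the function name and interpretation are illustrative, not from the disclosure:

```python
def waist_to_hip_ratio(widths_px):
    """Waist-to-hip ratio from silhouette widths measured in pixels.

    widths_px maps a region name ('waist', 'hip') to the silhouette width
    at that row. Higher ratios are commonly associated with weight
    concentrated at the mid-section; lower ratios with weight distributed
    toward the hips and legs.
    """
    return widths_px["waist"] / widths_px["hip"]

ratio = waist_to_hip_ratio({"waist": 80, "hip": 100})  # 0.8
```

Because both widths come from the same image, the pixel units cancel, so no camera calibration is needed for the ratio itself.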
- the processor 203 may determine physiological information of a user based on at least real-time and/or historical physiological information of the user.
- the processor 203 may determine physiological information of the user based on at least statistical information such as physiological information relating to a plurality of other people.
- the processor 203 may determine physiological information of the user based on comparing image data or other data of the user with historical information of the user and/or with statistical information.
- the processor 203 may determine physiological information of a user based on at least data obtained from the scale 225.
- the processor 203 may determine a health index of the user based on at least one or more of weight distribution of a user, such as determined by processing sensor data, heat distribution of a user, such as determined by processing sensor data, and/or scale data received from the scale 225.
- the processor 203 may measure a user's body or body parts. For example, the processor 203 can determine the user's height. As another example, the processor 203 may determine dimensions of certain body parts of the user. For example, the processor 203 may determine the circumference of a user's stomach, chest, thighs, arms, or the like. The processor 203 can determine a length of a user's arms or legs or torso. The processor 203 can determine ratios of various portions of the user's body. The processor 203 can determine an alignment of various portions of the user's body. The processor 203 can determine a symmetry of various portions of the user's body.
- the processor 203 can determine the user's dimensions based on at least information obtained from the sensor(s) 211 such as a camera and/or LIDAR sensor. As an example, as a user rotates in front of the soundbar 201 , the sensor(s) 211 may determine distances to determine the user's body measurements. The processor 203 may use the user's body dimensions to further improve physiological determinations such as determining a health index, BMI, or the like.
- the processor 203 can determine a heat distribution of the user 223 .
- the processor 203 can determine a heat distribution based at least on information obtained from the sensor(s) 211, which can include an IR camera.
- the heat distribution may improve physiological determinations.
- the processor 203 may analyze the user's circulation or blood flow based on the user's heat distribution. For example, low thermal energy at the user's peripheries such as arms and legs may indicate poor blood flow.
- the processor 203 can determine a temperature of a user, such as based on information obtained from the sensor(s) 211 , such as an infrared sensor.
- the temperature may be a core body temperature.
- the temperature may be a surface body temperature.
- the processor 203 may be configured to determine one or more physiological states of the user based on at least the user's temperature, such as whether the user has a fever, is infected with a pathogen, is hypothermic, etc.
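A coarse sketch of such a temperature-based state determination follows. The 35 °C and 38 °C cutoffs are common illustrative thresholds, not values from the disclosure, and clinical cutoffs vary by measurement site and method:

```python
def classify_temperature(core_temp_c):
    """Coarsely classify a core body temperature (deg C).

    Thresholds are illustrative: below 35.0 suggests hypothermia,
    38.0 and above suggests fever, otherwise normal.
    """
    if core_temp_c < 35.0:
        return "hypothermia"
    if core_temp_c >= 38.0:
        return "fever"
    return "normal"
```

A real system would also account for the offset between IR-measured surface temperature and core temperature before classifying.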
- the processor 203 can determine one or more physiological states of the user, such as weight distribution or heat distribution, based on at least processing sensor data, such as image data.
- the processor 203 may implement one or more image processing techniques to analyze image data obtained from the sensor(s) 211 .
- the processor 203 may implement one or more pattern recognition techniques to analyze image data obtained from the sensor(s) 211 .
- the processor 203 can compare one or more images obtained from sensor(s) 211 . For example, the processor 203 may compare a current image of the user 223 with a historical image of the user 223 .
- the processor 203 may perform image processing to determine health trends of the user 223 such as whether the user 223 is changing weight, how the distribution of weight on the user 223 has changed over time, how thermal distribution (which may indicate perfusion) of the user 223 has changed over time, or the like.
- the processor 203 can perform photoplethysmography (PPG) with sensor data originating from the sensor(s) 211 .
- the processor 203 can perform remote photoplethysmography (PPG).
- the sensor data can include optical data which may comprise image data.
- the sensor(s) 211 may detect ambient light reflected from the skin of a user and in response generate sensor data from the captured reflected light.
- the processor 203 can process the sensor data to generate PPG data.
- PPG data can relate to volumetric changes, such as volumetric changes of blood vessels resulting from cardiac activity and/or volumetric changes of a thoracic cavity resulting from breathing.
- the processor 203 can implement one or more PPG techniques (which can include remote PPG techniques) to analyze the sensor data to determine one or more physiological parameters of the user.
- Sensor data may comprise one or more pixels. Pixels may correspond to red, blue, or green wavelengths.
- the processor 203 may analyze pixel color and/or variation to determine physiological parameters.
- the processor 203 can determine one or more of pulse rate, blood pressure, respiration rate, respiration volume, cardiac output, perfusion index, pleth variability, blood oxygen saturation (SpO2), and/or a PPG waveform. Accordingly, the present disclosure provides for a contactless method to determine a user's physiological parameters.
- the processor 203 can determine a user's physiological parameters without a device, such as a sensor, contacting the user.
- the processor 203 can implement one or more image processing techniques to identify a portion of an image for processing for analyzing PPG data such as to generate physiological parameters.
- the processor 203 can identify a portion of an image corresponding to a specific body part of a user for analyzing PPG data.
- the body part may be referred to as a region of interest.
- the body part may be a body part containing a high density of blood vessels near a surface of the skin.
- the body part can be a face of a user, a forehead of a user, arms of a user, hands of a user, feet of a user, a throat of a user, a neck of a user, etc.
- the processor 203 may analyze sensor data corresponding to multiple regions of a user's body, such as a user's face and a user's hands. Analyzing sensor data corresponding to multiple regions of a user's body may improve the accuracy of determining physiological parameters of the user.
- the processor 203 can perform image processing on images obtained from a camera such as sensor(s) 211 .
- the processor 203 can perform pattern recognition of images.
- the processor 203 can determine an orientation of a user based on images received from the sensor(s) 211 .
- the processor 203 can determine which direction a user is facing, such as towards the soundbar 201 , away from the soundbar 201 , etc.
- the processor 203 can generate instructions to cause the user to change their orientation so that the soundbar 201 may perform an accurate measurement of the user.
- the soundbar 201 may cause the scale 225 to rotate and/or may cause display and/or audio instructing the user to rotate.
- the processor 203 may be configured to determine an orientation of a subject laying in a bed such as a patient in a hospital bed or sleeping baby. Orientations can include an upright orientation, a supine orientation, a prone orientation, a side orientation, or a fall orientation.
- the processor 203 may perform image processing and/or pattern recognition on images obtained by the sensor(s) 211 to determine whether a subject is lying on their side, on their stomach, or on their back, whether a subject is standing up, whether a subject has fallen down, such as off of a bed, or the like.
- the processor 203 may determine an amount of time a subject has been oriented in a certain orientation.
- the processor 203 can generate one or more alarms depending on a subject's determined orientation and/or amount of time in a certain orientation. For example, the processor 203 can generate an alarm if a patient has been oriented in a certain position in excess of a time threshold which could result in bed sores or pressure ulcers. As another example, the processor 203 can generate an alarm if a sleeping baby has fallen out of a bed or has not moved from a certain position for longer than a threshold amount of time.
- the alarm can be one or more audio signals such as played through the speaker(s) 213 and/or computing device 221 , and/or visual signals displayed on a device such as computing device 221 .
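The orientation-duration alarm logic described above can be sketched as a small state tracker. The class and method names are hypothetical, and the threshold is a placeholder; real repositioning intervals for pressure-ulcer prevention would come from clinical protocols, not this sketch.

```python
from dataclasses import dataclass
from typing import Optional

@dataclass
class OrientationMonitor:
    """Alarm when a subject holds one orientation past a time threshold."""
    threshold_s: float
    _orientation: Optional[str] = None
    _since_s: float = 0.0

    def update(self, orientation: str, timestamp_s: float) -> bool:
        """Record the orientation at timestamp_s; True means raise an alarm."""
        if orientation != self._orientation:
            # Orientation changed: restart the timer for the new posture.
            self._orientation = orientation
            self._since_s = timestamp_s
            return False
        return (timestamp_s - self._since_s) >= self.threshold_s
```

A caller would feed it the orientation classified from each frame; a True return would trigger the audio or visual alarms routed to the speaker(s) 213 or computing device 221.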
- the processor 203 can determine physiological information relating to a thermal energy of a subject such as the body heat of a patient or sleeping baby based on information obtained from an infrared (IR) sensor or camera.
- the processor 203 can generate one or more alarms based on information received from an IR camera. For example, the processor 203 may determine that a body temperature of a sleeping baby is unusually high or low based on the information from the IR camera and may generate an alarm.
- the processor 203 may generate user interface data for rendering user interface displays.
- the user interface displays can include physiological information of a user and/or instructions to a user to perform physiological measurement.
- the user interface displays can include images of the user captured by the sensor(s) 211 such as real-time images and/or historical images.
- the processor 203 can generate instructions to control the operation of one or more remote devices such as scale 225 .
- the processor 203 can perform facial recognition such as based on images obtained from the sensor(s) 211 .
- the processor 203 may determine whether one or more care providers are present such as a health care provider in a hospital or a care provider in a home such as based on facial recognition.
- the processor 203 may be configured to determine whether one or more unauthorized persons are present such as in a hospital room or a home, such as an intruder, such as based on facial recognition.
- the processor 203 may determine one or more voice commands such as from a user and detected by a microphone which may control one or more operations of the soundbar.
- the processor 203 may be configured to detect screaming, crying, or the like. For example, the processor 203 may determine that a baby is crying.
- the processor 203 may be configured to determine a user's gait or ambulatory condition. For example, the processor 203 may determine that a user is walking abnormally such as with a limp, or unevenly which may indicate joint problems, bone length irregularities, muscle irregularities or weakness, neurological issues, or the like.
- the processor 203 may be configured to determine one or more physiological parameters of a user, such as pulse rate, blood oxygen content (SpO2), respiration rate, or the like, which may be based at least in part on information obtained from the sensor(s) 211 such as information from a camera.
- the processor 203 may detect breathing patterns.
- the processor 203 may determine whether a user is breathing abnormally such as hyperventilating, or experiencing breathing apnea, or not breathing, etc.
- the processor 203 may be configured to determine a skin color, tone, or shade of a user based on information obtained from a sensor 211 such as a camera. For example, the processor 203 may determine that a user has been sunburned, may detect one or more skin irregularities, such as sunspots, tan lines, skin cancer, lesions, or the like. The processor 203 may detect a trend in the skin color of the user. In some implementations, skin color may indicate perfusion or respiration. For example, the processor 203 may determine whether a baby is breathing properly based on a skin color of the baby.
- the processor 203 may be configured to monitor a user's sleep cycles such as based on a user's movement as detected by a camera or motion sensor.
- the processor 203 may be configured to monitor and/or detect a user's eye activity. For example, the processor 203 may detect a user's eye movement patterns such as by implementing eye tracking with a camera. The processor 203 may detect pupillometry of a user such as a user's pupil dilation in response to a light stimulus.
- the processor 203 may be configured to determine whether an infant has been moved.
- the processor 203 may determine whether an unauthorized person has moved the infant.
- the processor 203 may determine whether the infant is in an unauthorized location.
- the processor 203 may determine one or more positions, orientations, movements of an infant such as when sleeping or when awake.
- the processor 203 may determine whether an infant is crying, an amplitude or volume of the cry, a duration of the cry, or the like.
- the processor 203 may determine a level of distress of the infant based on one or more of the infant's cry, position, orientation, etc.
- the processor 203 may be configured to detect and/or analyze one or more facial features and/or expressions of a user.
- the processor 203 may analyze facial features to determine facial recognition, stroke detection, or the like. For example, the processor 203 may determine a user has experienced a stroke based on a user's smile or other facial muscles acting irregularly.
- the processor 203 may be configured to detect and/or analyze a speech of a user.
- the processor 203 may analyze a user's speech to recognize voice commands to control operation of the soundbar 201 , to detect a stroke of the user or the like. For example, the processor 203 may determine irregularities in the user's speech which may indicate the user has experienced a stroke.
- the processor 203 may analyze a user's speech, such as vocabulary, over time which may indicate cognitive abilities of the user. For example, the processor 203 may track an infant's increasing vocabulary over time or may track an elderly person's decreasing vocabulary usage over time which may indicate Alzheimer's, dementia, or other cognitive decline.
- the processor 203 may be configured to detect and/or analyze a user's movements such as movements relating to fine motor movements.
- the processor 203 may analyze a user's hand movements to detect shaking, tremors, etc. which may indicate a health status of a user such as Parkinson's, or other neuromuscular condition.
- the processor 203 may be configured to implement one or more tests or health checks to be taken by a user.
- the soundbar 201 may issue one or more requests (e.g., audio or visual) for a user to do one or more tasks or exercises in front of the soundbar 201 .
- the soundbar 201 may monitor the user as they perform the exercises such as with a camera.
- the soundbar 201 may request the user to reach as high as they can, to walk on their toes, to bend over, to touch a certain portion of a display screen in communication with the soundbar 201 , or the like.
- the processor 203 may determine one or more health conditions of the user such as a hand-eye coordination, a flexibility, a muscle tone, or the like.
- the soundbar 201 may request the user to smile, or frown, or move their eyes in a certain direction.
- the processor 203 may determine a cognitive state of the user such as a dementia, stroke, Alzheimer's, Parkinson's, or the like.
- the soundbar 201 may request the user to perform one or more cognitive tasks such as a simple puzzle, a memory exercise, or another task which may indicate the user's cognition.
- the processor 203 may track the user's cognition over time.
- the processor 203 may be configured to perform a hearing test of a user.
- the soundbar 201 may request the user to respond according to one or more sounds emitted by the soundbar 201 to determine a hearing capability of the user.
- the processor 203 may be configured to perform an eye test of the user.
- the soundbar 201 may request the user to respond according to one or more images displayed by a display in communication with the soundbar.
- the soundbar 201 may request the user to identify letters or numbers of various sizes displayed on the display to determine the user's eyesight.
- the soundbar 201 may request the user to identify various objects of certain colors displayed on the display to determine a user's color detection which may indicate color blindness.
- the processor 203 may be configured to determine whether a user has taken medication. For example, the processor 203 may analyze image data to determine whether one or more cupboards or drawers containing medicine was opened at a certain time. As another example, the processor 203 may recognize certain pills taken by the user based on image data as the user consumes the pills. The processor 203 may generate reminders for a user to take medication at the appropriate time.
- the storage component 205 can include one or more memory devices that store data, including without limitation, dynamic and/or static random-access memory (RAM), programmable read-only memory (PROM), erasable programmable read-only memory (EPROM), electrically erasable programmable read-only memory (EEPROM), and the like.
- the storage component 205 can be configured to store data such as data obtained from the sensor(s) 211 , such as camera images, processed and/or unprocessed physiological data, such as body mass index (BMI), or the like.
- the communication component 207 can facilitate communication (via wired and/or wireless connection) between the soundbar 201 (and/or components thereof) and separate devices, such as separate monitoring, display, and/or mobile devices.
- the communication component 207 can be configured to allow the soundbar 201 to communicate with other devices, systems, and/or networks over any of a variety of communication protocols.
- the communication component 207 can be configured to use any of a variety of wired communication configurations such as HDMI, USB, ethernet, coaxial, fiber optics, twisted pair, or the like.
- the communication component 207 can be configured to use any of a variety of wireless communication protocols, such as Wi-Fi (802.11x), Bluetooth®, ZigBee®, Z-wave®, cellular telephony, infrared, near-field communications (NFC), RFID, satellite transmission, proprietary protocols, combinations of the same, and the like.
- the communication component 207 can allow data and/or instructions to be transmitted and/or received to and/or from the soundbar 201 and separate computing devices.
- the communication component 207 can be configured to transmit and/or receive (for example, wirelessly) processed and/or unprocessed data such as physiological data, sensor data, image data, user interface data, or other information to separate computing devices, which can include, among others, a mobile device (for example, an iOS or Android enabled smartphone, tablet, laptop), a desktop computer, a wearable device such as a smartwatch, a server or other computing or processing device for display and/or further processing, among other things.
- Such separate computing devices can be configured to store and/or further process the received data and/or other information, to display information indicative of or derived from the received information.
- the communication component 207 of the soundbar 201 can be configured to wirelessly transmit processed and/or unprocessed data such as sensor data and/or other information to a mobile phone which can include one or more hardware processors configured to execute an application that generates a graphical user interface displaying information representative of the data or other information obtained from the soundbar 201 .
- the communication component 207 can be embodied in one or more components that are in communication with each other.
- the communication component 207 can comprise a wireless transceiver, an antenna, and/or a near field communication (NFC) component.
- the soundbar 201 can include a power source 209 .
- the power source 209 can provide power for hardware components of the soundbar 201 described herein.
- the power source 209 can be, for example, a lithium battery.
- the soundbar 201 can be configured to obtain power from a power source that is external to the soundbar 201 .
- the soundbar 201 can include or can be configured to connect to a cable which can itself connect to an external power source to provide power to the soundbar 201 .
- the soundbar 201 can include one or more sensors 211 .
- the sensor(s) 211 can include one or more types of sensors.
- the sensor(s) 211 may be sensitive and/or responsive to electromagnetic radiation.
- the sensor(s) 211 can generate image data responsive to electromagnetic radiation.
- the sensor(s) 211 can generate one or more voltages responsive to electromagnetic radiation.
- the sensor(s) 211 can include one or more light-sensitive sensors.
- the sensor(s) 211 can include one or more optical sensors.
- the sensor(s) 211 can include one or more photodiodes.
- the sensor(s) 211 can include one or more image sensors.
- the sensor(s) 211 can include one or more cameras.
- the camera can include a CCD camera and/or a CMOS camera.
- the camera can include a 3D camera, a depth camera, or a stereovision camera.
- the camera can include multiple lenses.
- the camera can include a stereo camera.
- the camera can include one or more lenses that shift viewpoints.
- the processor 203 can cause the one or more cameras to adjust a view of the one or more cameras. Adjusting the view of the one or more cameras can comprise one or more of rotating the one or more cameras, adjusting a zoom of the one or more cameras, pivoting the one or more cameras, tilting the one or more cameras, or panning the one or more cameras.
- the camera can be configured to capture images in the visible portion of the electromagnetic spectrum.
- the camera can be configured to capture images in the infrared portion of the electromagnetic spectrum.
- the camera can be an infrared (IR) camera.
- the camera can be configured to detect thermal energy.
- the sensor(s) 211 can include a light detection and ranging (LIDAR) sensor.
- the LIDAR sensor can be configured to emit a laser light and measure the time for the reflected light to return.
- the LIDAR sensor can be configured to determine distances between the LIDAR sensor and a point remote to the LIDAR sensor and/or between two points remote to the LIDAR sensor.
- the sensor(s) 211 can include a camera and a LIDAR sensor.
- the sensor(s) 211 can include multiple cameras and/or multiple types of cameras.
- the sensor(s) 211 can include a microphone configured to detect sounds such as voice, speech, crying, etc.
- the sensor(s) 211 can include a motion sensor or light sensor.
- the soundbar 201 can include one or more speakers 213 .
- the speaker(s) 213 can emit one or more audio signals such as music, voice commands, physiological information such as health parameters, alarms, and the like.
- the speaker(s) 213 can include one or more of tweeters, woofers, and/or subwoofers.
- the soundbar 201 may be in communication with one or more servers 227 remote to the soundbar 201 .
- the soundbar 201 or communication component 207 thereof, can communicate with the server(s) 227 via a network 210 .
- the network 210 can include any combination of networks, such as a personal area network (PAN), a local area network (LAN), a metropolitan area network (MAN), a wide area network (WAN), or the like.
- the soundbar 201 may, via the network 210 , communicate data to the server(s) 227 and/or receive data from the server(s) 227 including sensor data such as image data, physiological data (e.g., to be stored as historical physiological data), or the like.
- the server(s) 227 may include, and/or have access to (e.g., be in communication with) a storage device or system which can include any computer readable storage medium and/or device (or collection of data storage mediums and/or devices), including, but not limited to, one or more memory devices that store data, including without limitation, dynamic and/or static random access memory (RAM), programmable read-only memory (PROM), erasable programmable read-only memory (EPROM), electrically erasable programmable read-only memory (EEPROM), optical disks (e.g., CD-ROM, DVD-ROM, etc.), magnetic disks (e.g., hard disks, floppy disks, etc.), memory circuits (e.g., solid state drives, random-access memory (RAM), etc.), and/or the like.
- the server(s) 227 may include and/or be in communication with a hosted storage environment that includes a collection of physical data storage devices that may be remotely accessible and may be rapidly provisioned as needed (commonly referred to as “cloud” storage).
- Data stored in and/or accessible by the server(s) 227 can include physiological data including historical physiological data previously received from the soundbar 201 and/or sensor data including, for example, images obtained from a camera, or the like.
- data stored in and/or accessible by the server(s) 227 can include statistical information, such as physiological information relating to a group of people.
- the statistical information can include information relating to one or more groups of people.
- the groups can be defined by age, gender, race, nationality, or the like.
- statistical information can include physiological information relating to adult males living in the United States.
- the statistical information can include physiological information relating to infants born in China.
- the statistical information can include information for large amounts of people and may be representative of large populations, such as adults in the United States.
- the soundbar 201 may access the statistical information to perform one or more determinations relating to the user such as determining physiological states of the user, such as BMI, body temperature, weight distribution, body measurements, posture, gait, health status, or the like.
- the processor 203 may compare the user's physiological information with the statistical information.
- the soundbar 201 may not be in communication with the one or more server(s) 227 which may enhance security or privacy of sensitive information.
- the soundbar 201 may selectively communicate certain information with the server(s) 227 and may not communicate other information with the server(s) 227 .
- the soundbar 201 may store sensitive information locally. Sensitive information can include information relating to a user such as physiological information, health information, images of the user, or the like. A user may selectively control which information is communicated from the soundbar 201 to remote computing devices such as the server 227 .
- the soundbar 201 can include a privacy operation mode which may be implemented using a privacy switch or button. For example, a user may actuate a privacy switch on the soundbar 201 which may implement one or more privacy routines. For example, during a privacy operation mode, communication between the soundbar 201 and the server(s) 227 (or other remote computing devices) may be disabled and/or communication of certain sensitive information between the soundbar 201 and the server(s) 227 may be disabled. In some implementations, during a privacy mode of operation, one or more sensor(s) 211 of the soundbar 201 may be disabled, such as one or more cameras.
- the soundbar 201 can communicate with one or more computing devices 221 .
- the computing device 221 may be remote to the soundbar 201 .
- the soundbar 201 may communicate with the computing device 221 via the network 210 .
- the computing device 221 can include one or more of a mobile phone such as a smartphone, a laptop, a tablet, a wearable device such as a smartwatch, a display such as a TV monitor, an earbud, a headphone, an earphone, a sensor such as a physiological sensor, such as a wearable physiological sensor (e.g., worn on a wrist, finger, ear, etc. of the user), another soundbar, or the like.
- the soundbar 201 may be configured to transmit data such as user interface data for rendering a graphical user interface.
- the computing device 221 can be configured to render a user interface based on the user interface data received from the soundbar 201 .
- the computing device 221 can render a display on a screen.
- the user interface can include instructions to a user to perform a measurement, such as instructions to a user to turn around in a circle in front of the soundbar 201 , or to stand still, or the like.
- the user interface can include physiological data such as physiological parameters, such as a user's 223 weight, height, BMI, or the like.
- the user interface can include images obtained from a camera of the soundbar 201 .
- the soundbar 201 may be configured to receive information from the computing device 221 .
- the computing device 221 may transmit instructions to the soundbar 201 to control an operation of the soundbar 201 .
- a user 223 may control operation of the soundbar 201 via the computing device 221 while the user 223 may be remote to the soundbar 201 .
- the soundbar 201 may monitor a baby.
- the user 223 may control the soundbar 201 , via the computing device 221 , to play music via the speakers 213 to help the baby sleep.
- the soundbar 201 may obtain images of the baby via the sensor(s) 211 .
- the soundbar 201 can transmit the images to the computing device 221 .
- the user 223 can view the images of the baby via the computing device 221 while the user 223 is remote to the baby.
- the soundbar 201 may receive data such as physiological information from the computing device 221 such as where the computing device 221 includes physiological sensor(s).
- the computing device 221 may be associated with another user.
- the computing device may be associated with a healthcare provider.
- the soundbar 201 may implement a call, such as a video call or audio call, with the remote computing device 221 .
- the soundbar 201 can communicate image data to the computing device 221 .
- the image data may correspond to historical and/or real time images.
- the image data may facilitate a video call.
- the image data may aid a healthcare provider in diagnosing a physiological status of the user. Accordingly, the soundbar 201 can facilitate telehealth.
- the soundbar 201 may receive information from the computing device 221 relating to an audio playback modification and/or hearing profile of a user.
- the computing device 221 can transmit instructions to the soundbar 201 relating to an audio playback modification which may enhance an audio listening experience for the user depending on the user's particular hearing capabilities.
- the audio playback modifications can include frequency dependent gains such as amplitude adjustments to one or more frequencies in the audio playback signal.
- the audio playback modifications can include adjusting a phase of the audio playback signal.
- the audio playback modifications can include adjusting a latency of the audio playback signal.
- the computing device 221 can include an earphone, or earbud, or headphone. The earbud can perform one or more tests to determine hearing capabilities of the user.
- the earbud can perform otoacoustic emissions (OAE) testing to determine a hearing profile or hearing transfer function of the user.
- OAE can include distortion product OAE (DP-OAE), spontaneous OAE (S-OAE), and/or transient evoked OAE (TE-OAE).
- the earbud may determine one or more audio playback modifications to make to the audio playback signal to personalize the listening experience for the user based on the user's hearing.
- the earbud can transmit the hearing transfer function and/or the audio playback modifications to the soundbar 201 .
- the soundbar 201 may modify the audio playback signal emitted from its speakers based on the information received from the earbud. This may enhance an audio listening experience for the user because the audio playback is personalized for the user's hearing profile such that the user will hear the audio as it was intended to be heard.
- the soundbar 201 can optionally communicate with a scale 225 such as via the network 210 .
- the scale 225 can be configured to obtain physiological information of the user 223 .
- the scale 225 can obtain the user's 223 weight, BMI, body composition including lean muscle mass, fat mass, water mass, or the like.
- the soundbar 201 may receive physiological information from the scale 225 obtained by the scale 225 .
- the soundbar 201 may communicate instructions to the scale 225 to control operation of the scale 225 .
- the soundbar 201 may communicate instructions to the scale 225 to cause the scale 225 to rotate such as while the user 223 is on the scale 225 to perform measurement of the user 223 .
- the soundbar 201 may receive instructions from the scale 225 to control an operation of the soundbar 201 .
- the soundbar can receive a signal to cause one or more components of the soundbar 201 to activate, such as to cause the sensor(s) 211 to begin operating to generate sensor data.
- "real-time" or "substantially real-time" may refer to events (e.g., receiving, processing, transmitting, displaying, etc.) that occur at the same time or substantially the same time (e.g., neglecting any small delays such as those that are imperceptible and/or inconsequential to humans such as delays arising from electrical conduction or transmission).
- real-time may refer to events that occur within a time frame of each other that is on the order of milliseconds, seconds, tens of seconds, or minutes.
- “real-time” may refer to events that occur at a same time as, or during, another event.
- the term "system" generally encompasses both the hardware (for example, mechanical and electronic) and, in some implementations, associated software (for example, specialized computer programs for graphics control) components.
- Each of the processes, methods, and algorithms described in the preceding sections may be embodied in, and fully or partially automated by, code modules executed by one or more computer systems or computer processors including computer hardware.
- the code modules may be stored on any type of non-transitory computer-readable medium or computer storage device, such as hard drives, solid state memory, optical disc, and/or the like.
- the systems and modules may also be transmitted as generated data signals (for example, as part of a carrier wave or other analog or digital propagated signal) on a variety of computer-readable transmission mediums, including wireless-based and wired/cable-based mediums, and may take a variety of forms (for example, as part of a single or multiplexed analog signal, or as multiple discrete digital packets or frames).
- the processes and algorithms may be implemented partially or wholly in application-specific circuitry.
- the results of the disclosed processes and process steps may be stored, persistently or otherwise, in any type of non-transitory computer storage such as, for example, volatile or non-volatile storage.
- a general purpose processor can be a microprocessor, but in the alternative, the processor can be a controller, microcontroller, or state machine, combinations of the same, or the like.
- a processor can include electrical circuitry configured to process computer-executable instructions.
- in another embodiment, a processor includes an FPGA or other programmable device that performs logic operations without processing computer-executable instructions.
- a processor can also be implemented as a combination of computing devices, for example, a combination of a DSP and a microprocessor, a plurality of microprocessors, one or more microprocessors in conjunction with a DSP core, or any other such configuration.
- a processor may also include primarily analog components. For example, some, or all, of the signal processing algorithms described herein may be implemented in analog circuitry or mixed analog and digital circuitry.
- a computing environment can include any type of computer system, including, but not limited to, a computer system based on a microprocessor, a mainframe computer, a digital signal processor, a portable computing device, a device controller, or a computational engine within an appliance, to name a few.
- a software module can reside in RAM memory, flash memory, ROM memory, EPROM memory, EEPROM memory, registers, hard disk, a removable disk, a CD-ROM, or any other form of non-transitory computer-readable storage medium, media, or physical computer storage known in the art.
- An example storage medium can be coupled to the processor such that the processor can read information from, and write information to, the storage medium.
- the storage medium can be integral to the processor.
- the storage medium can be volatile or nonvolatile.
- the processor and the storage medium can reside in an ASIC.
- the ASIC can reside in a user terminal.
- the processor and the storage medium can reside as discrete components in a user terminal.
- Conditional language such as, among others, “can,” “could,” “might,” or “may,” unless specifically stated otherwise, or otherwise understood within the context as used, is generally intended to convey that certain embodiments include, while other embodiments do not include, certain features, elements and/or steps. Thus, such conditional language is not generally intended to imply that features, elements and/or steps are in any way required for one or more embodiments or that one or more embodiments necessarily include logic for deciding, with or without user input or prompting, whether these features, elements and/or steps are included or are to be performed in any particular embodiment.
- Disjunctive language such as the phrase "at least one of X, Y, or Z," unless specifically stated otherwise, is otherwise understood within the context as used in general to present that an item, term, and so forth, may be either X, Y, or Z, or any combination thereof (for example, X, Y, and/or Z). Thus, such disjunctive language is not generally intended to, and should not, imply that certain embodiments require at least one of X, at least one of Y, or at least one of Z to each be present.
- the terms “generally parallel” and “substantially parallel” refer to a value, amount, or characteristic that departs from exactly parallel by less than or equal to 10 degrees, 5 degrees, 3 degrees, or 1 degree.
- the terms “generally perpendicular” and “substantially perpendicular” refer to a value, amount, or characteristic that departs from exactly perpendicular by less than or equal to 10 degrees, 5 degrees, 3 degrees, or 1 degree.
- Terms such as “a device configured to” are intended to include one or more recited devices. Such one or more recited devices can also be collectively configured to carry out the stated recitations.
- a processor configured to carry out recitations A, B and C can include a first processor configured to carry out recitation A working in conjunction with a second processor configured to carry out recitations B and C.
- All of the methods and processes described herein may be embodied in, and partially or fully automated via, software code modules executed by one or more general purpose computers.
- the methods described herein may be performed by the computing system and/or any other suitable computing device.
- the methods may be executed on the computing devices in response to execution of software instructions or other executable code read from a tangible computer readable medium.
- a tangible computer readable medium is a data storage device that can store data that is readable by a computer system. Examples of computer readable mediums include read-only memory, random-access memory, other volatile or non-volatile memory devices, CD-ROMs, magnetic tape, flash drives, and optical data storage devices.
Abstract
A soundbar for medical monitoring may comprise a speaker, a sensor, and a hardware processor. The speaker can be configured to emit audio signals. The sensor can be configured to obtain sensor data relating to a physiology of a subject. The sensor can include a camera and the sensor data can include image data. The hardware processor can be configured to access the sensor data and determine a health status of the subject based on at least the sensor data.
Description
- Any and all applications, if any, for which a foreign or domestic priority claim is identified in the Application Data Sheet of the present application are hereby incorporated by reference under 37 CFR 1.57.
- The present disclosure relates to medical monitoring. Specifically, the disclosure describes, among other things, devices, systems, and methods for monitoring and/or displaying information regarding a user's physiological information.
- Soundbars are used to play audio signals such as music. Devices such as scales can determine physiology-related data of a subject. Scales can determine a subject's weight and/or body mass index (BMI). Current determinations of BMI are based on limited information such as a subject's weight and percent body fat.
- Various embodiments of systems, methods and devices within the scope of the appended claims each have several aspects, no single one of which is solely responsible for the desirable attributes described herein. Without limiting the scope of the appended claims, the description below describes some prominent features.
- Details of one or more embodiments of the subject matter described in this specification are set forth in the accompanying drawings and the description below. Other features, aspects, and advantages will become apparent from the description, the drawings, and the claims. Note that relative dimensions of the following figures may not be drawn to scale.
- A soundbar for monitoring a physiological health of a user can comprise a speaker, a sensor, and one or more hardware processors. The speaker can emit audio which can comprise one or more of music, an alert, information relating to a physiology of the user, or instructions to the user. The sensor can obtain sensor data as the user is within a proximity of the soundbar. The sensor can comprise one or more cameras configured to capture electromagnetic radiation including infrared radiation and visible light radiation. The sensor data can comprise image data of the user and can relate to a physiology of the user. The one or more hardware processors can determine a distribution of body heat of the user based on at least the image data of the user. The image data can comprise infrared image data indicating thermal energy. The one or more hardware processors can determine a distribution of body weight of the user based on at least the image data of the user. The image data can comprise visible light image data. The one or more hardware processors can determine a health index of the user from at least the distribution of body weight of the user or the distribution of body heat of the user.
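- By way of a hedged illustration only (the disclosure does not specify an algorithm), the two distributions might be reduced to per-region statistics and combined into a single index. The horizontal band boundaries, the uniformity-based formula, and the 100-point scale below are assumptions for demonstration, not part of the disclosure:

```python
def region_means(grid, bands):
    """Mean value of a 2D grid (e.g. per-pixel temperature from an infrared
    frame, or a per-pixel weight estimate) within each (start_row, end_row)
    horizontal band."""
    means = []
    for start, end in bands:
        vals = [v for row in grid[start:end] for v in row]
        means.append(sum(vals) / len(vals))
    return means

def health_index(heat_means, weight_means):
    """Toy index: 100 minus the largest deviation from the mean in each
    distribution, so a perfectly uniform subject scores 100."""
    def spread(xs):
        mean = sum(xs) / len(xs)
        return max(abs(x - mean) for x in xs)
    return 100.0 - spread(heat_means) - spread(weight_means)
```

A thermal frame and a weight-distribution estimate would each be supplied as a 2D grid; the index penalizes unevenness in either distribution.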
- In some implementations, the sensor can obtain the sensor data comprising the image data of the user as the user rotates in front of the soundbar, the image data of the user corresponding to a plurality of portions of the user's body.
- In some implementations, the soundbar can further comprise a communication component configured to communicate with one or more computing devices. The one or more computing devices can include a scale configured to measure a weight of the user. The one or more hardware processors can generate one or more instructions to the scale to cause the scale to rotate as the user stands on the scale and cause the sensor to obtain the sensor data as the scale rotates.
- In some implementations, the soundbar can further comprise a communication component configured to communicate with one or more computing devices. The one or more computing devices can include a scale configured to measure a weight of the user. The one or more hardware processors can access scale data obtained from the scale by the communication component, the scale data including at least a weight of the user; and determine the health index of the user based on at least the scale data.
- In some implementations, the scale data further includes at least one or more of a percent body fat of the user, a percent lean muscle mass of the user, a percent water of the user, a BMI of the user, a change in weight of the user, or ECG data of the user.
- In some implementations, the one or more hardware processors is further configured to determine a likelihood the user has an infectious disease based on at least the distribution of body heat of the user, the distribution of body heat of the user indicating a body temperature of the user, the infectious disease comprising a virus.
- In some implementations, the one or more hardware processors is further configured to process the image data to generate PPG data and determine one or more physiological parameters based on the PPG data.
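- Remote photoplethysmography from camera frames is commonly sketched as averaging a skin-region color channel per frame and locating the dominant cardiac frequency; the minimal sketch below follows that common approach, not a method stated in the disclosure. The frame rate, the green-channel choice, and the 0.7-3.0 Hz (42-180 bpm) search band are illustrative assumptions:

```python
import math

def ppg_signal(frames):
    """Average green-channel intensity per frame. Each frame is a list of
    rows of (r, g, b) pixel tuples covering a skin region of interest."""
    sig = []
    for frame in frames:
        greens = [px[1] for row in frame for px in row]
        sig.append(sum(greens) / len(greens))
    return sig

def heart_rate_bpm(signal, fps, lo=0.7, hi=3.0):
    """Dominant frequency of the detrended signal in [lo, hi] Hz, found by
    probing the discrete Fourier transform at 0.01 Hz steps."""
    mean = sum(signal) / len(signal)
    x = [s - mean for s in signal]
    n = len(x)
    best_f, best_p = lo, -1.0
    f = lo
    while f <= hi:
        re = sum(x[k] * math.cos(2 * math.pi * f * k / fps) for k in range(n))
        im = sum(x[k] * math.sin(2 * math.pi * f * k / fps) for k in range(n))
        p = re * re + im * im
        if p > best_p:
            best_f, best_p = f, p
        f += 0.01
    return best_f * 60.0
```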
- In some implementations, the one or more hardware processors is further configured to cause the speaker to emit one or more of an instruction to the user or information relating to the health index of the user.
- In some implementations, the soundbar further comprises a communication component configured to communicate with one or more computing devices, the one or more computing devices including a display, and the one or more hardware processors is further configured to: generate user interface data for rendering one or more user interfaces comprising indicia of the health index of the user; and cause the communication component to communicate the user interface data to the display to render the one or more user interfaces.
- In some implementations, the one or more user interfaces further comprises one or more images corresponding to the image data of the user, the one or more images comprising one or more historical images or one or more real-time images.
- In some implementations, the soundbar further comprises a communication component configured to communicate with one or more computing devices, the one or more computing devices including a scale configured to measure a weight of the user, and the one or more hardware processors is further configured to: cause the sensor to turn on or to obtain the sensor data responsive to one or more signals received from the scale by the communication component, the one or more signals generated by the scale responsive to a user standing on the scale.
- In some implementations, the one or more hardware processors is further configured to: determine the distribution of body heat of the user based on at least the image data of the user, the image data comprising historical image data or real-time image data; and determine the distribution of body weight of the user based on at least the image data of the user, the image data comprising historical image data or real-time image data.
- In some implementations, the one or more cameras comprise one or more of a plurality of cameras, a 3D camera, a depth camera, a stereovision camera, an infrared camera, or a light detection and ranging (LIDAR) sensor.
- In some implementations, the one or more hardware processors is further configured to cause the one or more cameras to adjust a view of the one or more cameras, wherein adjusting the view of the one or more cameras comprises one or more of rotating the one or more cameras, adjusting a zoom of the one or more cameras, pivoting the one or more cameras, tilting the one or more cameras, or panning the one or more cameras.
- In some implementations, the one or more hardware processors is further configured to cause a communication component of the soundbar to implement wireless communication with a remote computing device to establish a video call; cause the communication component to transmit the image data of the user to the remote computing device; and generate user interface data for rendering one or more user interfaces comprising one or more images received from the remote computing device by the communication component.
- A method for monitoring a health of a user can comprise: accessing sensor data originating from a sensor of a soundbar, the sensor comprising one or more cameras configured to capture electromagnetic radiation including infrared radiation and visible light radiation, the sensor data comprising image data of the user and relating to a physiology of the user; determining a distribution of body heat of the user based on at least the image data of the user, the image data comprising infrared image data indicating thermal energy; determining a distribution of body weight of the user based on at least the image data of the user, the image data comprising visible light image data; determining a health index of the user from at least the distribution of body weight of the user or the distribution of body heat of the user; and causing a speaker of the soundbar to emit an audio comprising one or more of music, an alert, information relating to the health index, or instructions to the user.
- In some implementations, the method can further comprise: generating one or more instructions to a scale to cause the scale to rotate as the user stands on the scale; and causing the sensor to obtain the sensor data as the scale rotates.
- Non-transitory computer-readable media including computer-executable instructions that, when executed by a computing system, can cause the computing system to perform operations comprising: accessing sensor data originating from a sensor of a soundbar, the sensor comprising one or more cameras configured to capture electromagnetic radiation including infrared radiation and visible light radiation, the sensor data comprising image data of the user and relating to a physiology of the user; determining a distribution of body heat of the user based on at least the image data of the user, the image data comprising infrared image data indicating thermal energy; determining a distribution of body weight of the user based on at least the image data of the user, the image data comprising visible light image data; determining a health index of the user from at least the distribution of body weight of the user or the distribution of body heat of the user; and causing a speaker of the soundbar to emit an audio comprising one or more of music, an alert, information relating to the health index, or instructions to the user.
- In some implementations, the computer-executable instructions, when executed by the computing system, further cause the computing system to perform operations comprising: generating one or more instructions to a scale to cause the scale to rotate as the user stands on the scale; and causing the sensor to obtain the sensor data as the scale rotates.
- The present disclosure provides a soundbar for performing physiological measurements. The soundbar may comprise a speaker, a sensor, a memory, and a hardware processor. The speaker can be configured to emit audio signals. The sensor can be configured to obtain sensor data relating to a physiology of a user. The sensor can include a camera. The sensor data can include one or more images. The memory can be configured to store the sensor data. The hardware processor can be configured to access the sensor data. The hardware processor can be configured to determine a distribution of body weight of the user based on at least the one or more images. The hardware processor can be configured to determine a health index of the user based on at least the distribution of body weight of the user.
- In some implementations, the sensor includes a 3D camera.
- In some implementations, the sensor includes an infrared camera configured to capture radiation in the infrared portion of the electromagnetic spectrum.
- In some implementations, the hardware processor is further configured to determine a distribution of heat of the user's body based on at least the sensor data obtained from the infrared camera.
- In some implementations, the hardware processor is further configured to compare multiple images of the user.
- In some implementations, the multiple images include one or more historical images and one or more current images.
- In some implementations, the hardware processor is further configured to perform image processing and/or pattern recognition on images obtained by the sensor.
- In some implementations, the hardware processor is further configured to determine an orientation of the user based on at least the image processing and/or pattern recognition of the images.
- In some implementations, the hardware processor is further configured to generate instructions to the user for the user to change orientation, wherein the instructions are based on at least the determined orientation.
- In some implementations, the soundbar is further in communication with a scale configured to determine a weight of the user.
- In some implementations, the hardware processor is further configured to generate instructions to cause the scale to rotate.
- In some implementations, the sensor includes a light detection and ranging (LIDAR) sensor.
- In some implementations, the hardware processor is further configured to determine one or more dimensions of one or more portions of the user's body based on at least the sensor data.
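- As one hedged example of deriving a body dimension from depth or LIDAR data, a height estimate can use the pixel extent of the subject's silhouette together with a pinhole-camera model. The range cutoff and focal length below are assumed calibration values, not parameters from the disclosure:

```python
def silhouette_rows(depth, max_range):
    """Row indices of a depth map (meters) containing at least one pixel
    closer than max_range, i.e. rows crossed by the subject."""
    return [i for i, row in enumerate(depth) if any(d < max_range for d in row)]

def estimate_height_m(depth, max_range, focal_px):
    """Pinhole-model estimate: vertical pixel extent of the silhouette,
    scaled by mean subject distance over the focal length in pixels."""
    rows = silhouette_rows(depth, max_range)
    if not rows:
        return 0.0
    subject = [d for row in depth for d in row if d < max_range]
    distance = sum(subject) / len(subject)
    pixel_extent = rows[-1] - rows[0] + 1
    return pixel_extent * distance / focal_px
```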
- In some implementations, the hardware processor is further configured to generate user interface data for rendering a display; and communicate the user interface data to a computing device to be displayed.
- The present disclosure provides a soundbar for medical monitoring. The soundbar may comprise a speaker, a sensor, and a hardware processor. The speaker can be configured to emit audio signals. The sensor can be configured to obtain sensor data relating to a physiology of a subject. The sensor can include a camera. The sensor data can include one or more images. The hardware processor can be configured to access the sensor data. The hardware processor can be configured to determine an orientation of the subject based on at least the one or more images. The hardware processor can be configured to determine an amount of time the subject has been oriented in the determined orientation. The hardware processor can be configured to generate an alarm based on at least the determined orientation and/or the determined amount of time.
- In some implementations, the subject is an infant.
- In some implementations, the subject is a hospital patient.
- In some implementations, the determined orientation includes one or more of an upright orientation, a supine orientation, a prone orientation, a side orientation, or a fall orientation.
- In some implementations, the hardware processor is further configured to generate the alarm in response to determining that the determined amount of time exceeds a threshold.
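- The orientation-and-duration alarm logic described above can be sketched as a small state tracker. The per-orientation thresholds are illustrative assumptions (e.g., a prone infant alarming after 30 seconds, a detected fall alarming immediately); the disclosure does not fix particular values:

```python
class OrientationMonitor:
    """Accumulates how long the subject has held the current orientation and
    returns True once a configured per-orientation threshold is exceeded."""

    def __init__(self, thresholds):
        self.thresholds = thresholds  # e.g. {"prone": 30, "fall": 0} seconds
        self.current = None
        self.since = None

    def update(self, orientation, t):
        """Feed one classified frame at timestamp t (seconds); the timer
        resets whenever the detected orientation changes."""
        if orientation != self.current:
            self.current, self.since = orientation, t
        held = t - self.since
        limit = self.thresholds.get(orientation)
        return limit is not None and held >= limit
```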
- The present disclosure provides a soundbar for medical monitoring. The soundbar may comprise a speaker, a sensor, and a hardware processor. The speaker can be configured to emit audio signals. The sensor can be configured to obtain sensor data relating to a physiology of a subject. The sensor can include an infrared sensor. The sensor data can include infrared energy data indicating thermal energy. The hardware processor can be configured to access the sensor data. The hardware processor can be configured to determine a distribution of body heat of the subject based on at least the infrared energy data. The hardware processor can be configured to generate an alarm based on at least the distribution of body heat.
- In some implementations, the subject is a baby.
- In some implementations, the subject is a hospital patient.
- The present disclosure provides a soundbar for medical monitoring. The soundbar may comprise a speaker, a sensor, and a hardware processor. The speaker can be configured to emit audio signals. The sensor can be configured to obtain sensor data relating to a physiology of a subject. The sensor can include a camera. The sensor data can include one or more images. The hardware processor can be configured to generate instructions for a user to perform a health-related activity. The hardware processor can be configured to access the sensor data including sensor data obtained while the user performs the health-related activity. The hardware processor can be configured to determine a health status of the user based on at least the sensor data obtained while the user performs the health-related activity.
- In some implementations, the instructions include one or more audio signals emitted from the speaker and/or user interface data rendered on a display in communication with the soundbar.
- The present disclosure provides a soundbar for medical monitoring. The soundbar may comprise a speaker, a sensor, and a hardware processor. The speaker can be configured to emit audio signals. The sensor can be configured to obtain sensor data relating to a physiology of a subject. The sensor can include a camera. The sensor data can include one or more images. The hardware processor can be configured to access the one or more images including at least images of the user's face. The hardware processor can be configured to analyze one or more facial features of the user based on at least the images of the user's face. The hardware processor can be configured to determine a health status of the user based on at least the images of the user's face.
- In some implementations, the health status includes a stroke.
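- One classical screening heuristic for stroke-related facial droop compares left and right facial landmarks. The sketch below assumes landmark coordinates are produced by upstream image processing (landmark extraction itself is out of scope here), and the relative threshold is an illustrative assumption, not a clinical criterion:

```python
def droop_asymmetry(left_corner, right_corner, eye_line_y):
    """Difference in vertical drop of the two mouth corners, measured from
    the eye line. Landmarks are (x, y) pixels with y increasing downward."""
    return abs((left_corner[1] - eye_line_y) - (right_corner[1] - eye_line_y))

def flag_possible_droop(left_corner, right_corner, eye_line_y, face_height,
                        rel_threshold=0.03):
    """Flag when the asymmetry exceeds a fraction of the face height."""
    return droop_asymmetry(left_corner, right_corner, eye_line_y) \
        > rel_threshold * face_height
```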
- The present disclosure provides a soundbar for medical monitoring. The soundbar may comprise a speaker, a sensor, and a hardware processor. The speaker can be configured to emit audio signals. The sensor can be configured to obtain sensor data relating to a physiology of a subject. The sensor can include a camera. The sensor data can include one or more images. The hardware processor can be configured to access the sensor data. The hardware processor can be configured to access statistical information including physiological data relating to a group of people. The hardware processor can be configured to determine a health status of the user based on at least the sensor data and the statistical information.
- The present disclosure provides a soundbar for medical monitoring. The soundbar may comprise a speaker, a sensor, and a hardware processor. The speaker can be configured to emit audio signals. The sensor can be configured to obtain sensor data relating to a physiology of a subject. The sensor can include a camera. The sensor data can include one or more images of a user while ambulating. The hardware processor can be configured to access the sensor data. The hardware processor can be configured to determine an ambulatory condition of the user based on at least the one or more images of a user while ambulating.
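- Determining an ambulatory condition from images could, for example, reduce to counting foot-ground contacts over time. A minimal sketch, assuming a hypothetical upstream classifier labels each frame with a foot-contact boolean (that classifier is not shown and not specified by the disclosure):

```python
def step_count(contact):
    """Number of False -> True transitions in a per-frame foot-contact
    sequence, i.e. the number of new ground contacts."""
    return sum(1 for a, b in zip(contact, contact[1:]) if not a and b)

def cadence_spm(contact, fps):
    """Steps per minute over the observed window of frames."""
    seconds = len(contact) / fps
    return step_count(contact) * 60.0 / seconds
```

A cadence far outside the user's baseline could then be surfaced as an ambulatory-condition indicator.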
- The present disclosure provides a soundbar for medical monitoring. The soundbar may comprise a speaker, a sensor, and a hardware processor. The speaker can be configured to emit audio signals. The sensor can be configured to obtain sensor data relating to a physiology of a subject. The sensor can include a camera. The sensor data can include one or more images of a user. The hardware processor can be configured to access the sensor data. The hardware processor can be configured to determine a skin condition of the user based on at least one or more images of the user.
- The present disclosure provides a soundbar for medical monitoring. The soundbar may comprise a speaker, a sensor, and a hardware processor. The speaker can be configured to emit audio signals. The sensor can be configured to obtain sensor data of a location. The sensor can include a camera. The sensor data can include one or more images. The hardware processor can be configured to access the sensor data. The hardware processor can be configured to detect a subject in the one or more images. The hardware processor can be configured to determine whether the subject is authorized to be in the location.
- In some implementations, the subject is a health care provider.
- In some implementations, the location is a hospital room.
- In some implementations, the location is a home.
- The present disclosure provides a soundbar for medical monitoring. The soundbar may comprise a speaker, a sensor, and a hardware processor. The speaker can be configured to emit audio signals. The sensor can be configured to obtain sensor data relating to a physiology of a subject. The sensor can include a camera. The sensor data can include one or more images. The hardware processor can be configured to receive audio related data from a computing device remote to the soundbar. The hardware processor can be configured to modify an audio playback signal based on at least the audio related data. The hardware processor can be configured to transmit the modified audio playback signal to the speaker to be emitted by the soundbar.
- In some implementations, the computing device is an earbud, headphone, and/or earphone.
- In some implementations, the audio related data includes frequency dependent gains.
- In some implementations, the audio related data includes a hearing transfer function.
- In some implementations, the audio related data includes adjustments to one or more of a latency or phase.
- Various combinations of the above and below recited features, embodiments, implementations, and aspects are also disclosed and contemplated by the present disclosure.
- Additional implementations of the disclosure are described below in reference to the appended claims, which may serve as an additional summary of the disclosure.
- In various implementations, systems and/or computer systems are disclosed that comprise a computer-readable storage medium having program instructions embodied therewith, and one or more processors configured to execute the program instructions to cause the systems and/or computer systems to perform operations comprising one or more aspects of the above- and/or below-described implementations (including one or more aspects of the appended claims).
- In various implementations, computer-implemented methods are disclosed in which, by one or more processors executing program instructions, one or more aspects of the above- and/or below-described implementations (including one or more aspects of the appended claims) are implemented and/or performed.
- In various implementations, computer program products comprising a computer-readable storage medium are disclosed, wherein the computer-readable storage medium has program instructions embodied therewith, the program instructions executable by one or more processors to cause the one or more processors to perform operations comprising one or more aspects of the above- and/or below-described implementations (including one or more aspects of the appended claims).
- Various embodiments will be described hereinafter with reference to the accompanying drawings. These embodiments are illustrated and described by example only, and are not intended to limit the scope of the disclosure. In the drawings, similar elements may have similar reference numerals.
- FIG. 1 illustrates an example system for monitoring a user's physiological information.
- FIG. 2 is a schematic block diagram illustrating an example implementation of a soundbar.
- The present disclosure will now be described with reference to the accompanying figures, wherein like numerals refer to like elements throughout. The following description is merely illustrative in nature and is in no way intended to limit the disclosure, its application, or uses. It should be understood that steps within a method may be executed in different order without altering the principles of the present disclosure. Furthermore, the devices, systems, and/or methods disclosed herein can include several novel features, no single one of which is solely responsible for its desirable attributes or which is essential to practicing the devices, systems, and/or methods disclosed herein.
- FIG. 1 illustrates an example implementation of a system 100 for monitoring physiological information of a user 103. The system 100 can include a soundbar 101. In some implementations, the system 100 may optionally include one or more of a display 109, a mobile device 105, and a scale 107. Various components or devices of the system 100 may be in communication, such as wireless communication. - The
soundbar 101 can perform one or more measurements relating to a physiology of the user 103. The soundbar 101 can monitor a user 103. The user may be a baby or a patient in a hospital or a user in a home. The soundbar 101 can include one or more sensor(s) 111. The sensor(s) 111 can include one or more image sensors. The sensor(s) 111 can include one or more cameras. The one or more cameras may include CCD cameras and/or CMOS cameras. The sensor(s) 111 can include one or more types of cameras. The sensor(s) 111 can include a camera configured to capture optical radiation in the visible portion of the electromagnetic spectrum. The sensor(s) 111 can include a camera or sensor configured to capture optical radiation in the infrared portion of the electromagnetic spectrum. The sensor(s) 111 can include a 3D camera. The sensor(s) 111 can include a high-resolution camera. The sensor(s) 111 can include a plurality of cameras. The sensor(s) 111 can include a stereovision camera. The sensor(s) 111 can include a depth camera. The sensor(s) 111 can include a light detection and ranging (LIDAR) sensor. The sensor(s) 111 can include a millimeter wave (mmWave) sensor. The sensor(s) 111 can include an ultrawide band sensor. The sensor(s) 111 can be integrated into the soundbar 101. The sensor(s) 111 may be disposed within a housing of the soundbar 101. The sensor(s) 111 and the soundbar 101 can form a single integrated unit. The soundbar 101 can include one or more speakers configured to emit audio, such as music, voice audio, etc. - The
soundbar 101, such as sensor(s) 111 thereof, can be configured to capture images of the user 103 to generate image data of the user 103. The soundbar 101, such as one or more hardware processors of the soundbar 101, can process the image data. The soundbar 101 can determine a health or wellness index of the user 103 based on at least the image data. The health index may indicate one or more physiological parameters of the user. The health index can indicate a body mass index (BMI) of the user 103 based on at least the information obtained from the sensor(s) 111. The health index can indicate a thermal distribution of the user 103. The health index can indicate a likelihood the user 103 is infected with a pathogen, such as a bacterium or virus. The soundbar 101 can determine how a weight of the user 103 is distributed on the user 103. The soundbar 101 can determine dimensions of the user's 103 body. The soundbar 101 can determine how heat is distributed on the user's 103 body. - The
soundbar 101 may be in communication with a display 109. The soundbar 101 may generate user interface data and may communicate the user interface data to the display 109. The display 109 may render one or more user interfaces based on at least user interface data received from the soundbar 101. The user interfaces can include indicia of a health index determined by the soundbar 101. The user interfaces can include physiological data of the user 103, such as physiological parameters. The user interfaces can include information received from one or more remote devices, such as the scale 107 or mobile device 105. The user interfaces can include images corresponding to image data generated by a camera of the soundbar 101. The images can correspond to the user 103. The images can include real-time images of the user 103. For example, the display 109 may render one or more user interfaces including images of the user 103 as the sensor(s) 111 generate the image data. The images can include historical images, such as images of the user 103 corresponding to image data previously generated by the sensor(s) 111. The display 109 displays an image 122 of the user 103. The image 122 can be a historical and/or real-time image of the user 103. For example, the image 122 may be an image that was captured by a camera sensor 111 of the soundbar in real-time as the user 103 stands in front of the soundbar 101 and looks at the display 109. As another example, the image 122 may be an image that was previously captured by a camera sensor 111 of the soundbar, such as one week prior, one month prior, one year prior, etc. The user 103 may view such a historical image to view differences between the user's 103 body then and now. The image 122 may be superimposed with dimensions captured by sensor(s) 111 such as a LIDAR sensor. LIDAR may image objects using one or more of ultraviolet, visible light, or near infrared. - The user interface can include thermal images. The
image 124 can correspond to image data generated by an infrared sensor or camera of the soundbar 101. The image 124 can include information relating to a body heat of the user 103. For example, the image 124 can be a thermal image of the user 103 with various shading or coloring indicating various temperatures. The image 124 can be a historical and/or real-time image of the user 103. - The
user 103 may be able to control the display 109. For example, the user 103 can control the images that are displayed and how they are displayed. For example, the user 103 can add and/or remove filters, such as infrared filters, to the image 124 to provide various effects. - In some implementations, the
images 122 and/or 124 may be an image captured by a camera and may accurately portray the user 103 in great detail. In some implementations, the images 122 and/or 124 may be a representation of the user 103 which may omit or alter visual details of the user to provide security or privacy. For example, the images 122 and/or 124 may include an avatar representation of the user 103, such as a stick figure, a cartoon, an animation, or the like. The sensor(s) 111 can include a privacy shutter. The privacy shutter can prevent the sensor(s) 111 from generating data relating to the user 103. The user 103 may actuate the privacy shutter. In some implementations, the soundbar 101 may automatically implement the privacy shutter. The privacy shutter can, for example, cover a lens of a camera. - The user interfaces rendered by the
display 109 can also include instructions to the user 103. The instructions can relate to performing a physiological measurement. For example, the display 109 may display instructions to the user 103 to orient themselves in a certain direction for optimal detection by the sensor(s) 111, or to stand still, or to rotate in a circle, or the like. The instructions can include text. The instructions can include one or more images, visual indicators, avatars, or the like. The soundbar 101 may generate the instructions based on real-time data generated by the sensor(s) 111. In some implementations, the soundbar 101 may emit audio indicating instructions to the user 103. - The user interfaces rendered by the
display 109 may include information or images that may not be related to the physiology of the user 103 and/or that may not be received from the soundbar 101. For example, the display 109 may display information received via satellite, broadcast, a network, or the internet, and such information can include broadcast media. In some implementations, the display 109 may display information received from the soundbar 101 (e.g., physiological data relating to the user 103) in one portion and may display information received from other sources in another portion. In some implementations, the display 109 may overlay or superimpose the information received from the soundbar 101 onto images received from other sources. For example, the user 103 may watch a television broadcast on the display 109 and may simultaneously view their health information on the display 109. - In some implementations, the
display 109 may display one or more instructions or requests to the user 103. For example, the display 109 may display one or more exercises for the user 103 to perform. The user 103 may follow along with visual cues provided on the display 109 of how to perform the exercise or what tasks to perform. The soundbar 101 may monitor the user 103 as they perform the exercise and may determine a health status or physiological condition of the user 103, such as based on processing image data of the user obtained by the sensor(s) 111 while performing the exercise. - The
soundbar 101 may be in communication with mobile device 105. The mobile device 105 can include a smartphone. The mobile device 105 can receive information from the soundbar 101 such as user interface data, images obtained from a camera of the soundbar 101, and/or physiological data of the user 103. The mobile device 105 may be configured to render user interfaces via a display which may be similar to user interfaces shown and/or described with reference to display 109. The mobile device 105 may communicate information to the soundbar 101. The mobile device 105 can communicate instructions to the soundbar 101 such as to control operation of the soundbar 101. For example, the user 103 may control operation of the soundbar 101 via the mobile device 105. For example, the user can control the speakers of the soundbar 101, the sensor(s) 111 of the soundbar 101, and/or measurements or monitoring operations performed by the soundbar 101. The mobile device 105 can communicate physiological information to the soundbar 101, such as physiological data generated by the mobile device 105 and/or one or more wearable devices associated with the mobile device 105. - The
soundbar 101 may be in communication with the scale 107. The scale may be configured to measure a weight of the user 103. The scale 107 may be configured to determine other physiological information of the user such as BMI, lean muscle mass, percent body fat, percent water weight of the user, changes in physiological information, etc. The scale 107 can include one or more electrodes. A user may contact the electrodes as the user 103 stands on the scale 107 (e.g., with their bare feet). The scale 107 can generate ECG data indicating a cardiac condition of the user 103. The scale 107 can communicate physiological information to the soundbar 101, such as physiological information generated and/or determined by the scale 107 (e.g., weight, BMI, etc.). In some implementations, the soundbar 101 may further refine physiological information received from the scale 107, such as by using sensor data generated by the sensor(s) 111. For example, the soundbar 101 may further refine a BMI received from the scale based on at least a height of the user 103 and/or a weight distribution of the user 103 determined based on image data generated by the sensor(s) 111. - The
scale 107 can be configured to rotate. In some implementations, the scale 107 may rotate in response to a motor of the scale 107 generating a force to cause the scale 107 to rotate. In some implementations, the scale 107 may rotate freely such as in response to an external force applied to the scale 107. The scale 107 can receive information from the soundbar 101. The information can include instructions. For example, the soundbar 101 may communicate instructions to the scale 107 to cause the scale to rotate as the user stands on the scale such that the sensor(s) 111 may perform sensing of a plurality of portions of the body of the user 103. In some implementations, the scale 107 may rotate automatically. For example, the scale 107 may rotate independent of any instructions received from the soundbar 101. As another example, the scale 107 may rotate in response to a user 103 standing on the scale 107. In some implementations, the scale 107 may communicate information to the soundbar 101 to control one or more operations of the soundbar 101. For example, the scale 107 may communicate instructions to cause the soundbar 101 (and/or sensors 111) to activate, to turn on, to begin generating sensor data, etc. In some implementations, the scale 107 may communicate instructions to the soundbar 101 to cause the sensor(s) 111 to generate sensor data (e.g., collecting images) in response to a user 103 standing on the scale 107. In some implementations, the soundbar 101 may perform measurement operations of the user 103 without the scale. For example, the user 103 may stand in front of the soundbar 101 (without the scale 107) as the sensor(s) 111 sense the user 103. The user 103 can rotate in front of the soundbar 101 (with or without the scale 107). -
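The rotate-and-scan coordination described above can be sketched roughly as follows. The device interfaces (`rotate_to`, `capture`) and the step count are illustrative assumptions for the sketch, not interfaces defined in this disclosure.

```python
# Hypothetical sketch: the soundbar steps the scale through a full turn
# while a camera captures one frame per angle, so multiple portions of
# the user's body can be sensed. Device interfaces are illustrative.

def scan_user(scale, camera, steps: int = 8) -> dict:
    """Capture one image of the user at each of `steps` angles."""
    frames = {}
    for i in range(steps):
        angle = i * 360 // steps
        scale.rotate_to(angle)          # soundbar -> scale instruction
        frames[angle] = camera.capture()
    return frames

# Minimal stand-in devices to exercise the sequence.
class FakeScale:
    def rotate_to(self, angle):
        self.angle = angle

class FakeCamera:
    def capture(self):
        return "frame"

frames = scan_user(FakeScale(), FakeCamera())
print(sorted(frames))  # [0, 45, 90, 135, 180, 225, 270, 315]
```

A freely rotating scale (no motor) would instead let the user turn themselves while the camera captures frames on a timer.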
FIG. 2 is a schematic block diagram illustrating an example implementation of a soundbar 201. The soundbar 201 can include a hardware processor 203, a storage component 205, a communication component 207, a power source 209, one or more sensors 211, and/or one or more speakers 213. The hardware processor 203 can include one or more processors configured to execute program instructions to cause the soundbar 201, or components thereof, or other systems or devices, to perform operations. The processor 203 can be configured, among other things, to process data, execute instructions to perform one or more functions, and/or control the operation of the soundbar 201. For example, the processor 203 can process data such as image data obtained from the sensor(s) 211 and can execute instructions to perform functions related to processing, storing, and/or transmitting such data. - The
processor 203 can calculate physiological information of a user, such as physiological parameters. The processor 203 can calculate physiological information based on at least data obtained from the sensor(s) 211. For example, the processor 203 can calculate a body mass index (BMI) of a user 223 based on image data obtained from an image sensor such as a camera and/or information from a LIDAR sensor. The processor 203 can determine a weight distribution of the user 223, such as where on the user's body their weight is distributed. A weight distribution determination may improve physiological determinations such as BMI determination. For example, the processor 203 may determine, based on image data, that a high percentage of the user's weight is distributed around the torso or mid-section of the user, which may indicate an unhealthy BMI. As another example, the processor 203 may determine that a user has their weight distributed in the legs, chest, and arms, which may indicate a healthy BMI. The processor 203 may determine physiological information of a user based on at least real-time and/or historical physiological information of the user. In some implementations, the processor 203 may determine physiological information of the user based on at least statistical information such as physiological information relating to a plurality of other people. For example, the processor 203 may determine physiological information of the user based on comparing image data or other data of the user with historical information of the user and/or with statistical information. The processor 203 may determine physiological information of a user based on at least data obtained from the scale 225. The processor 203 may determine a health index of the user based on at least one or more of a weight distribution of a user, such as determined by processing sensor data, a heat distribution of a user, such as determined by processing sensor data, and/or scale data received from the scale 225. - The
processor 203 may measure a user's body or body parts. For example, the processor 203 can determine the user's height. As another example, the processor 203 may determine dimensions of certain body parts of the user. For example, the processor 203 may determine the circumference of a user's stomach, chest, thighs, arms, or the like. The processor 203 can determine a length of a user's arms or legs or torso. The processor 203 can determine ratios of various portions of the user's body. The processor 203 can determine an alignment of various portions of the user's body. The processor 203 can determine a symmetry of various portions of the user's body. The processor 203 can determine the user's dimensions based on at least information obtained from the sensor(s) 211 such as a camera and/or LIDAR sensor. As an example, as a user rotates in front of the soundbar 201, the sensor(s) 211 may determine distances to determine the user's body measurements. The processor 203 may use the user's body dimensions to further improve physiological determinations such as determining a health index, BMI, or the like. - The
processor 203 can determine a heat distribution of the user 223. The processor 203 can determine a heat distribution based at least on information obtained from the sensor(s) 211, which can include an IR camera. The heat distribution may improve physiological determinations. For example, the processor 203 may analyze the user's circulation or blood flow based on the user's heat distribution. For example, low thermal energy at the user's peripheries such as arms and legs may indicate poor blood flow. The processor 203 can determine a temperature of a user, such as based on information obtained from the sensor(s) 211, such as an infrared sensor. The temperature may be a core body temperature. The temperature may be a surface body temperature. The processor 203 may be configured to determine one or more physiological states of the user based on at least the user's temperature, such as whether the user has a fever, is infected with a pathogen, is hypothermic, etc. - The
processor 203 can determine one or more physiological states of the user, such as weight distribution or heat distribution, based on at least processing sensor data, such as image data. The processor 203 may implement one or more image processing techniques to analyze image data obtained from the sensor(s) 211. The processor 203 may implement one or more pattern recognition techniques to analyze image data obtained from the sensor(s) 211. The processor 203 can compare one or more images obtained from the sensor(s) 211. For example, the processor 203 may compare a current image of the user 223 with a historical image of the user 223. Using the multiple images, the processor 203 may perform image processing to determine health trends of the user 223 such as whether the user 223 is changing weight, how the distribution of weight on the user 223 has changed over time, how the thermal distribution (which may indicate perfusion) of the user 223 has changed over time, or the like. - The
processor 203 can perform photoplethysmography (PPG) with sensor data originating from the sensor(s) 211. The processor 203 can perform remote photoplethysmography (PPG). The sensor data can include optical data which may comprise image data. The sensor(s) 211 may detect ambient light reflected from the skin of a user and in response generate sensor data from the captured reflected light. The processor 203 can process the sensor data to generate PPG data. PPG data can relate to volumetric changes, such as volumetric changes of blood vessels resulting from cardiac activity and/or volumetric changes of a thoracic cavity resulting from breathing. The processor 203 can implement one or more PPG techniques (which can include remote PPG techniques) to analyze the sensor data to determine one or more physiological parameters of the user. Sensor data may comprise one or more pixels. Pixels may correspond to red, blue, or green wavelengths. The processor 203 may analyze pixel color and/or variation to determine physiological parameters. The processor 203 can determine one or more of pulse rate, blood pressure, respiration rate, respiration volume, cardiac output, perfusion index, pleth variability, blood oxygen saturation (SpO2), and/or a PPG waveform. Accordingly, the present disclosure provides for a contactless method to determine a user's physiological parameters. The processor 203 can determine a user's physiological parameters without a device, such as a sensor, contacting the user. In some implementations, the processor 203 can implement one or more image processing techniques to identify a portion of an image for processing for analyzing PPG data such as to generate physiological parameters. In one example implementation, the processor 203 can identify a portion of an image corresponding to a specific body part of a user for analyzing PPG data. The body part may be referred to as a region of interest.
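The pixel-based pulse estimation described above can be illustrated with a minimal remote-PPG sketch. It assumes a per-frame mean of the green channel over a region of interest has already been extracted; the function name and frequency band are illustrative choices, not part of the disclosure.

```python
# A minimal remote-PPG sketch: the dominant frequency of the green-channel
# time series within the cardiac band approximates pulse rate.
import numpy as np

def pulse_rate_bpm(green_means: np.ndarray, fps: float) -> float:
    """Estimate pulse rate from a 1-D green-channel time series."""
    signal = green_means - green_means.mean()        # remove DC component
    spectrum = np.abs(np.fft.rfft(signal))
    freqs = np.fft.rfftfreq(len(signal), d=1.0 / fps)
    band = (freqs >= 0.7) & (freqs <= 4.0)           # ~42-240 bpm
    peak = freqs[band][np.argmax(spectrum[band])]
    return peak * 60.0

# Synthetic check: a 1.2 Hz (72 bpm) oscillation sampled at 30 fps.
t = np.arange(0, 10, 1 / 30)
series = 100 + 0.5 * np.sin(2 * np.pi * 1.2 * t)
print(round(pulse_rate_bpm(series, 30.0)))  # 72
```

Real camera signals would additionally need detrending and motion rejection before the spectral step; this sketch shows only the core band-limited peak search.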
The body part may be a body part containing a high density of blood vessels near a surface of the skin. The body part can be a face of a user, a forehead of a user, arms of a user, hands of a user, feet of a user, a throat of a user, a neck of a user, etc. The processor 203 may analyze sensor data corresponding to multiple regions of a user's body, such as a user's face and a user's hands. Analyzing sensor data corresponding to multiple regions of a user's body may improve the accuracy of determining physiological parameters of the user. - The
processor 203 can perform image processing on images obtained from a camera such as sensor(s) 211. The processor 203 can perform pattern recognition of images. As an example, the processor 203 can determine an orientation of a user based on images received from the sensor(s) 211. For example, based on images obtained from the sensor(s) 211, the processor 203 can determine which direction a user is facing, such as towards the soundbar 201, away from the soundbar 201, etc. Based on the user's determined orientation, the processor 203 can generate instructions to cause the user to change their orientation so that the soundbar 201 may perform an accurate measurement of the user. For example, the soundbar 201 may cause the scale 225 to rotate and/or may cause display and/or audio instructing the user to rotate. As another example of image processing to determine subject orientation, the processor 203 may be configured to determine an orientation of a subject lying in a bed, such as a patient in a hospital bed or a sleeping baby. Orientations can include an upright orientation, a supine orientation, a prone orientation, a side orientation, or a fall orientation. For example, the processor 203 may perform image processing and/or pattern recognition on images obtained by the sensor(s) 211 to determine whether a subject is lying on their side, on their stomach, on their back, whether a subject is standing up, whether a subject has fallen down, such as off of a bed, or the like. The processor 203 may determine an amount of time a subject has been oriented in a certain orientation. The processor 203 can generate one or more alarms depending on a subject's determined orientation and/or amount of time in a certain orientation. For example, the processor 203 can generate an alarm if a patient has been oriented in a certain position in excess of a time threshold, which could result in bed sores or pressure ulcers.
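The orientation-duration alarm logic described above can be sketched as follows; the orientation labels and the two-hour threshold are illustrative assumptions, not values from the disclosure.

```python
# Sketch of an orientation-duration alarm: track how long each classified
# orientation persists and alarm past a threshold; a detected fall alarms
# immediately. Labels and threshold are illustrative.

class OrientationMonitor:
    def __init__(self, threshold_s: float = 2 * 60 * 60):
        self.threshold_s = threshold_s  # e.g. two hours in one position
        self.current = None
        self.since = None

    def update(self, orientation: str, timestamp_s: float) -> bool:
        """Record a classified orientation sample; True means alarm."""
        if orientation == "fall":        # a detected fall alarms at once
            return True
        if orientation != self.current:  # orientation changed: reset timer
            self.current = orientation
            self.since = timestamp_s
            return False
        return timestamp_s - self.since >= self.threshold_s

monitor = OrientationMonitor()
print(monitor.update("supine", 0))      # False - new orientation
print(monitor.update("side", 3600))     # False - orientation changed
print(monitor.update("side", 11000))    # True - over two hours unchanged
```

In practice each `update` call would be fed by the per-frame orientation classification produced from the image data.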
As another example, the processor 203 can generate an alarm if a sleeping baby has fallen out of a bed or has not moved from a certain position for longer than a threshold amount of time. The alarm can be one or more audio signals such as played through the speaker(s) 213 and/or computing device 221, and/or visual signals displayed on a device such as computing device 221. - The
processor 203 can determine physiological information relating to a thermal energy of a subject, such as the body heat of a patient or sleeping baby, based on information obtained from an infrared (IR) sensor or camera. The processor 203 can generate one or more alarms based on information received from an IR camera. For example, the processor 203 may determine that a body temperature of a sleeping baby is unusually high or low based on the information from the IR camera and may generate an alarm. - The
processor 203 may generate user interface data for rendering user interface displays. The user interface displays can include physiological information of a user and/or instructions to a user to perform a physiological measurement. The user interface displays can include images of the user captured by the sensor(s) 211 such as real-time images and/or historical images. The processor 203 can generate instructions to control the operation of one or more remote devices such as scale 225. The processor 203 can perform facial recognition such as based on images obtained from the sensor(s) 211. - The
processor 203 may determine whether one or more care providers are present, such as a health care provider in a hospital or a care provider in a home, such as based on facial recognition. The processor 203 may be configured to determine whether one or more unauthorized persons are present, such as in a hospital room or a home, such as an intruder, such as based on facial recognition. - The
processor 203 may determine one or more voice commands, such as from a user and detected by a microphone, which may control one or more operations of the soundbar. The processor 203 may be configured to detect screaming, crying, or the like. For example, the processor 203 may determine that a baby is crying. - The
processor 203 may be configured to determine a user's gait or ambulatory condition. For example, the processor 203 may determine that a user is walking abnormally, such as with a limp or unevenly, which may indicate joint problems, bone length irregularities, muscle irregularities or weakness, neurological issues, or the like. - The
processor 203 may be configured to determine one or more physiological parameters of a user, such as pulse rate, blood oxygen content (SpO2), respiration rate, or the like, which may be based at least in part on information obtained from the sensor(s) 211 such as information from a camera. The processor 203 may detect breathing patterns. The processor 203 may determine whether a user is breathing abnormally, such as hyperventilating, experiencing breathing apnea, not breathing, etc. - The
processor 203 may be configured to determine a skin color, tone, or shade of a user based on information obtained from a sensor 211 such as a camera. For example, the processor 203 may determine that a user has been sunburned, or may detect one or more skin irregularities, such as sunspots, tan lines, skin cancer, lesions, or the like. The processor 203 may detect a trend in the skin color of the user. In some implementations, skin color may indicate perfusion or respiration. For example, the processor 203 may determine whether a baby is breathing properly based on a skin color of the baby. - The
processor 203 may be configured to monitor a user's sleep cycles such as based on a user's movement as detected by a camera or motion sensor. - The
processor 203 may be configured to monitor and/or detect a user's eye activity. For example, the processor 203 may detect a user's eye movement patterns, such as by implementing eye tracking with a camera. The processor 203 may perform pupillometry of a user, such as measuring a user's pupil dilation in response to a light stimulus. - The
processor 203 may be configured to determine whether an infant has been moved. The processor 203 may determine whether an unauthorized person has moved the infant. The processor 203 may determine whether the infant is in an unauthorized location. The processor 203 may determine one or more positions, orientations, or movements of an infant, such as when sleeping or when awake. The processor 203 may determine whether an infant is crying, an amplitude or volume of the cry, a duration of the cry, or the like. The processor 203 may determine a level of distress of the infant based on one or more of the infant's cry, position, orientation, etc. - The
processor 203 may be configured to detect and/or analyze one or more facial features and/or expressions of a user. The processor 203 may analyze facial features to perform facial recognition, stroke detection, or the like. For example, the processor 203 may determine that a user has experienced a stroke based on a user's smile or other facial muscles acting irregularly. - The
processor 203 may be configured to detect and/or analyze a speech of a user. The processor 203 may analyze a user's speech to recognize voice commands to control operation of the soundbar 201, to detect a stroke of the user, or the like. For example, the processor 203 may determine irregularities in the user's speech which may indicate the user has experienced a stroke. The processor 203 may analyze a user's speech, such as vocabulary, over time, which may indicate cognitive abilities of the user. For example, the processor 203 may track an infant's increasing vocabulary over time or may track an elderly person's decreasing vocabulary usage over time, which may indicate Alzheimer's, dementia, or other cognitive decline. - The
processor 203 may be configured to detect and/or analyze a user's movements, such as fine motor movements. For example, the processor 203 may analyze a user's hand movements to detect shaking, tremors, etc., which may indicate a health status of a user such as Parkinson's or another neuromuscular condition. - The
processor 203 may be configured to implement one or more tests or health checks to be taken by a user. For example, the soundbar 201 may issue one or more requests (e.g., audio or visual) for a user to do one or more tasks or exercises in front of the soundbar 201. The soundbar 201 may monitor the user as they perform the exercises, such as with a camera. For example, the soundbar 201 may request the user to reach as high as they can, to walk on their toes, to bend over, to touch a certain portion of a display screen in communication with the soundbar 201, or the like. Based on the user performing the one or more tasks or exercises, the processor 203 may determine one or more health conditions of the user, such as hand-eye coordination, flexibility, muscle tone, or the like. As another example, the soundbar 201 may request the user to smile, or frown, or move their eyes in a certain direction. The processor 203 may determine a cognitive state of the user, such as dementia, stroke, Alzheimer's, Parkinson's, or the like. As another example, the soundbar 201 may request the user to perform one or more cognitive tasks, such as a simple puzzle, a memory exercise, or another task which may indicate the user's cognition. The processor 203 may track the user's cognition over time. - The
processor 203 may be configured to perform a hearing test of a user. For example, the soundbar 201 may request the user to respond according to one or more sounds emitted by the soundbar 201 to determine a hearing capability of the user. - The
processor 203 may be configured to perform an eye test of the user. For example, the soundbar 201 may request the user to respond according to one or more images displayed by a display in communication with the soundbar. For example, the soundbar 201 may request the user to identify letters or numbers of various sizes displayed on the display to determine the user's eyesight. As another example, the soundbar 201 may request the user to identify various objects of certain colors displayed on the display to determine a user's color detection, which may indicate color blindness. - The
processor 203 may be configured to determine whether a user has taken medication. For example, the processor 203 may analyze image data to determine whether one or more cupboards or drawers containing medicine were opened at a certain time. As another example, the processor 203 may recognize certain pills taken by the user based on image data as the user consumes the pills. The processor 203 may generate reminders for a user to take medication at the appropriate time. - The
storage component 205 can include one or more memory devices that store data, including without limitation, dynamic and/or static random-access memory (RAM), programmable read-only memory (PROM), erasable programmable read-only memory (EPROM), electrically erasable programmable read-only memory (EEPROM), and the like. The storage component 205 can be configured to store data such as data obtained from the sensor(s) 211, such as camera images, and processed and/or unprocessed physiological data, such as body mass index (BMI), or the like. - The
communication component 207 can facilitate communication (via wired and/or wireless connection) between the soundbar 201 (and/or components thereof) and separate devices, such as separate monitoring, display, and/or mobile devices. For example, the communication component 207 can be configured to allow the soundbar 201 to communicate with other devices, systems, and/or networks over any of a variety of communication protocols. The communication component 207 can be configured to use any of a variety of wired communication configurations such as HDMI, USB, ethernet, coaxial, fiber optics, twisted pair, or the like. The communication component 207 can be configured to use any of a variety of wireless communication protocols, such as Wi-Fi (802.11x), Bluetooth®, ZigBee®, Z-wave®, cellular telephony, infrared, near-field communications (NFC), RFID, satellite transmission, proprietary protocols, combinations of the same, and the like. The communication component 207 can allow data and/or instructions to be transmitted and/or received to and/or from the soundbar 201 and separate computing devices. The communication component 207 can be configured to transmit and/or receive (for example, wirelessly) processed and/or unprocessed data such as physiological data, sensor data, image data, user interface data, or other information to separate computing devices, which can include, among others, a mobile device (for example, an iOS or Android enabled smartphone, tablet, laptop), a desktop computer, a wearable device such as a smartwatch, a server, or another computing or processing device for display and/or further processing, among other things. Such separate computing devices can be configured to store and/or further process the received data and/or other information, and to display information indicative of or derived from the received information.
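As a rough illustration of pushing processed readings to a separate computing device, the following sketch serializes a reading as JSON and sends it over a TCP connection. The field names, host, and port are assumptions for the sketch; the disclosure does not specify a wire format.

```python
# Illustrative only: serializing a processed physiological reading and
# sending it to a separate computing device over TCP. Field names, host,
# and port are assumptions, not part of the disclosure.
import json
import socket

def send_reading(host: str, port: int, reading: dict) -> None:
    """Send one JSON-encoded reading to a listening device."""
    payload = json.dumps(reading).encode("utf-8")
    with socket.create_connection((host, port)) as conn:
        conn.sendall(payload)

reading = {"pulse_bpm": 72, "spo2_pct": 98, "source": "soundbar"}
# send_reading("192.168.1.20", 9000, reading)  # hypothetical receiver
print(json.dumps(reading))
```

A real implementation would ride on one of the listed transports (e.g., Wi-Fi or Bluetooth®) with authentication and encryption appropriate to health data.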
As another example, the communication component 207 of the soundbar 201 can be configured to wirelessly transmit processed and/or unprocessed data such as sensor data and/or other information to a mobile phone, which can include one or more hardware processors configured to execute an application that generates a graphical user interface displaying information representative of the data or other information obtained from the soundbar 201. The communication component 207 can be embodied in one or more components that are in communication with each other. The communication component 207 can comprise a wireless transceiver, an antenna, and/or a near field communication (NFC) component. - The
soundbar 201 can include a power source 209. The power source 209 can provide power for hardware components of the soundbar 201 described herein. The power source 209 can be, for example, a lithium battery. Additionally or alternatively, the soundbar 201 can be configured to obtain power from a power source that is external to the soundbar 201. For example, the soundbar 201 can include or can be configured to connect to a cable which can itself connect to an external power source to provide power to the soundbar 201. - The
soundbar 201 can include one or more sensors 211. The sensor(s) 211 can include one or more types of sensors. The sensor(s) 211 may be sensitive and/or responsive to electromagnetic radiation. The sensor(s) 211 can generate image data responsive to electromagnetic radiation. The sensor(s) 211 can generate one or more voltages responsive to electromagnetic radiation. The sensor(s) 211 can include one or more light-sensitive sensors. The sensor(s) 211 can include one or more optical sensors. The sensor(s) 211 can include one or more photodiodes. The sensor(s) 211 can include one or more image sensors. The sensor(s) 211 can include one or more cameras. The camera can include a CCD camera and/or a CMOS camera. The camera can include a 3D camera, a depth camera, or a stereovision camera. The camera can include multiple lenses. The camera can include a stereo camera. The camera can include one or more lenses that shift viewpoints. The processor 203 can cause the one or more cameras to adjust a view of the one or more cameras. Adjusting the view of the one or more cameras can comprise one or more of rotating the one or more cameras, adjusting a zoom of the one or more cameras, pivoting the one or more cameras, tilting the one or more cameras, or panning the one or more cameras. The camera can be configured to capture images in the visible portion of the electromagnetic spectrum. The camera can be configured to capture images in the infrared portion of the electromagnetic spectrum. For example, the camera can be an infrared (IR) camera. The camera can be configured to detect thermal energy. The sensor(s) 211 can include a light detection and ranging (LIDAR) sensor. The LIDAR sensor can be configured to emit a laser light and measure the time for the reflected light to return. The LIDAR sensor can be configured to determine distances between the LIDAR sensor and a point remote to the LIDAR sensor and/or between two points remote to the LIDAR sensor.
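The time-of-flight relation underlying such a LIDAR distance measurement is distance = (speed of light × round-trip time) / 2, which can be sketched as:

```python
# Sketch of the LIDAR time-of-flight relation described above:
# the reflected pulse travels out and back, so the one-way distance
# is half the round trip.

C_M_PER_S = 299_792_458  # speed of light in vacuum, m/s

def lidar_distance_m(round_trip_s: float) -> float:
    """Distance to the reflecting point from the round-trip time."""
    return C_M_PER_S * round_trip_s / 2.0

# A return pulse arriving 20 ns after emission is roughly 3 m away.
print(round(lidar_distance_m(20e-9), 2))  # 3.0
```

Distances between two remote points can then be taken as differences between such per-point measurements.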
In some implementations, the sensor(s) 211 can include a camera and a LIDAR sensor. In some implementations, the sensor(s) 211 can include multiple cameras and/or multiple types of cameras. The sensor(s) 211 can include a microphone configured to detect sounds such as voice, speech, crying, etc. The sensor(s) 211 can include a motion sensor or light sensor. - The
soundbar 201 can include one or more speakers 213. The speaker(s) 213 can emit one or more audio signals such as music, voice commands, physiological information such as health parameters, alarms, and the like. The speaker(s) 213 can include one or more of tweeters, woofers, and/or subwoofers. - The
soundbar 201 may be in communication with one or more servers 227 remote to the soundbar 201. The soundbar 201, or communication component 207 thereof, can communicate with the server(s) 227 via a network 210. The network 210 can include any combination of networks, such as a personal area network (PAN), a local area network (LAN), a metropolitan area network (MAN), a wide area network (WAN), or the like. The soundbar 201 may, via the network 210, communicate data to the server(s) 227 and/or receive data from the server(s) 227 including sensor data such as image data, physiological data (e.g., to be stored as historical physiological data), or the like. - The server(s) 227 may include, and/or have access to (e.g., be in communication with) a storage device or system which can include any computer readable storage medium and/or device (or collection of data storage mediums and/or devices), including, but not limited to, one or more memory devices that store data, including without limitation, dynamic and/or static random access memory (RAM), programmable read-only memory (PROM), erasable programmable read-only memory (EPROM), electrically erasable programmable read-only memory (EEPROM), optical disks (e.g., CD-ROM, DVD-ROM, etc.), magnetic disks (e.g., hard disks, floppy disks, etc.), memory circuits (e.g., solid state drives, etc.), and/or the like. In some implementations, the server(s) 227 may include and/or be in communication with a hosted storage environment that includes a collection of physical data storage devices that may be remotely accessible and may be rapidly provisioned as needed (commonly referred to as “cloud” storage). Data stored in and/or accessible by the server(s) 227 can include physiological data including historical physiological data previously received from the
soundbar 201 and/or sensor data including, for example, images obtained from a camera, or the like. In some implementations, data stored in and/or accessible by the server(s) 227 can include statistical information, such as physiological information relating to a group of people. The statistical information can include information relating to one or more groups of people. The groups can be defined by age, gender, race, nationality, or the like. For example, statistical information can include physiological information relating to adult males living in the United States. As another example, the statistical information can include physiological information relating to infants born in China. The statistical information can include information for large numbers of people and may be representative of large populations, such as adults in the United States. The soundbar 201 may access the statistical information to perform one or more determinations relating to the user, such as determining physiological states of the user, such as BMI, body temperature, weight distribution, body measurements, posture, gait, health status, or the like. For example, the processor 203 may compare the user's physiological information with the statistical information. - In some implementations, the
soundbar 201 may not be in communication with the one or more server(s) 227, which may enhance security or privacy of sensitive information. In some implementations, the soundbar 201 may selectively communicate certain information with the server(s) 227 and may not communicate other information with the server(s) 227. In some implementations, the soundbar 201 may store sensitive information locally. Sensitive information can include information relating to a user such as physiological information, health information, images of the user, or the like. A user may selectively control which information is communicated from the soundbar 201 to remote computing devices such as the server 227. - In some implementations, the
soundbar 201 can include a privacy operation mode which may be implemented using a privacy switch or button. For example, a user may actuate a privacy switch on the soundbar 201 which may implement one or more privacy routines. For example, during a privacy operation mode, communication between the soundbar 201 and the server(s) 227 (or other remote computing devices) may be disabled and/or communication of certain sensitive information between the soundbar 201 and the server(s) 227 may be disabled. In some implementations, during a privacy mode of operation, one or more sensor(s) 211 of the soundbar 201 may be disabled, such as one or more cameras. - The
soundbar 201 can communicate with one or more computing devices 221. The computing device 221 may be remote to the soundbar 201. The soundbar 201 may communicate with the computing device 221 via the network 210. The computing device 221 can include one or more of a mobile phone such as a smartphone, a laptop, a tablet, a wearable device such as a smartwatch, a display such as a TV monitor, an earbud, a headphone, an earphone, a sensor such as a physiological sensor, such as a wearable physiological sensor (e.g., worn on a wrist, finger, ear, etc. of the user), another soundbar, or the like. The soundbar 201 may be configured to transmit data such as user interface data for rendering a graphical user interface. The computing device 221 can be configured to render a user interface based on the user interface data received from the soundbar 201. For example, the computing device 221 can render a display on a screen. The user interface can include instructions to a user to perform a measurement, such as instructions to a user to turn around in a circle in front of the soundbar, or to stand still, or the like. The user interface can include physiological data such as physiological parameters, such as a user's 223 weight, height, BMI, or the like. The user interface can include images obtained from a camera of the soundbar 201. - The
soundbar 201 may be configured to receive information from the computing device 221. For example, the computing device 221 may transmit instructions to the soundbar 201 to control an operation of the soundbar 201. Accordingly, a user 223 may control operation of the soundbar 201 via the computing device 221 while the user 223 may be remote to the soundbar 201. In one example implementation, the soundbar 201 may monitor a baby. The user 223 may control the soundbar 201, via the computing device 221, to play music via the speakers 213 to help the baby sleep. The soundbar 201 may obtain images of the baby via the sensor(s) 211. The soundbar 201 can transmit the images to the computing device 221. The user 223 can view the images of the baby via the computing device 221 while the user 223 is remote to the baby. The soundbar 201 may receive data such as physiological information from the computing device 221, such as where the computing device 221 includes physiological sensor(s). - In some implementations, the
computing device 221 may be associated with another user. For example, the computing device may be associated with a healthcare provider. The soundbar 201 may implement a call, such as a video call or audio call, with the remote computing device 221. The soundbar 201 can communicate image data to the computing device 221. The image data may correspond to historical and/or real-time images. The image data may facilitate a video call. The image data may aid a healthcare provider in diagnosing a physiological status of the user. Accordingly, the soundbar 201 can facilitate telehealth. - In some implementations, the
soundbar 201 may receive information from the computing device 221 relating to an audio playback modification and/or hearing profile of a user. For example, the computing device 221 can transmit instructions to the soundbar 201 relating to an audio playback modification which may enhance an audio listening experience for the user depending on the user's particular hearing capabilities. The audio playback modifications can include frequency-dependent gains such as amplitude adjustments to one or more frequencies in the audio playback signal. The audio playback modifications can include adjusting a phase of the audio playback signal. The audio playback modifications can include adjusting a latency of the audio playback signal. In one example implementation, the computing device 221 can include an earphone, or earbud, or headphone. The earbud can perform one or more tests to determine hearing capabilities of the user. For example, the earbud can perform otoacoustic emissions (OAE) testing to determine a hearing profile or hearing transfer function of the user. The OAE can include distortion product OAE (DP-OAE), spontaneous OAE (S-OAE), and/or transient evoked OAE (TE-OAE). Based on the OAE, the earbud may determine one or more audio playback modifications to make to the audio playback signal to personalize the listening experience for the user based on the user's hearing. The earbud can transmit the hearing transfer function and/or the audio playback modifications to the soundbar 201. The soundbar 201 may modify the audio playback signal emitted from its speakers based on the information received from the earbud. This may enhance an audio listening experience for the user because the audio playback is personalized for the user's hearing profile such that the user will hear the audio as it was intended to be heard. - The
soundbar 201 can optionally communicate with a scale 225, such as via the network 210. The scale 225 can be configured to obtain physiological information of the user 223. For example, the scale 225 can obtain the user's 223 weight, BMI, body composition including lean muscle mass, fat mass, water mass, or the like. The soundbar 201 may receive physiological information from the scale 225 obtained by the scale 225. The soundbar 201 may communicate instructions to the scale 225 to control operation of the scale 225. For example, the soundbar 201 may communicate instructions to the scale 225 to cause the scale 225 to rotate, such as while the user 223 is on the scale 225, to perform measurement of the user 223. The soundbar 201 may receive instructions from the scale 225 to control an operation of the soundbar 201. For example, the soundbar 201 can receive a signal to cause one or more components of the soundbar 201 to activate, such as to cause the sensor(s) 211 to begin operating to generate sensor data. - As used herein, “real-time” or “substantial real-time” may refer to events (e.g., receiving, processing, transmitting, displaying, etc.) that occur at the same time or substantially the same time (e.g., neglecting any small delays such as those that are imperceptible and/or inconsequential to humans such as delays arising from electrical conduction or transmission). As a non-limiting example, “real-time” may refer to events that occur within a time frame of each other that is on the order of milliseconds, seconds, tens of seconds, or minutes. In some embodiments, “real-time” may refer to events that occur at a same time as, or during, another event.
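The frequency-dependent gain adjustments described above for personalizing audio playback can be sketched as a simple spectral filter. This is an illustrative sketch only; the function name, the FFT-based approach, and the band layout are assumptions, not the disclosed implementation:

```python
import numpy as np

def apply_band_gains(signal, sample_rate, band_gains_db):
    """Apply frequency-dependent gains to an audio signal.

    band_gains_db maps a (low_hz, high_hz) band to a gain in decibels,
    e.g. boosting a band that a hearing profile indicates the user
    perceives weakly. Other frequencies are left unchanged.
    """
    spectrum = np.fft.rfft(signal)
    freqs = np.fft.rfftfreq(len(signal), d=1.0 / sample_rate)
    for (low_hz, high_hz), gain_db in band_gains_db.items():
        band = (freqs >= low_hz) & (freqs < high_hz)
        spectrum[band] *= 10 ** (gain_db / 20.0)  # dB -> linear amplitude
    return np.fft.irfft(spectrum, n=len(signal))
```

For example, a +6 dB gain applied to a band roughly doubles the amplitude of components in that band (10^(6/20) ≈ 1.995) while leaving components outside the band untouched.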
- As used herein, “system,” “instrument,” “apparatus,” and “device” generally encompass both the hardware (for example, mechanical and electronic) and, in some implementations, associated software (for example, specialized computer programs for graphics control) components.
- It is to be understood that not necessarily all objects or advantages may be achieved in accordance with any particular embodiment described herein. Thus, for example, those skilled in the art will recognize that certain embodiments may be configured to operate in a manner that achieves or optimizes one advantage or group of advantages as taught herein without necessarily achieving other objects or advantages as may be taught or suggested herein.
- Each of the processes, methods, and algorithms described in the preceding sections may be embodied in, and fully or partially automated by, code modules executed by one or more computer systems or computer processors including computer hardware. The code modules may be stored on any type of non-transitory computer-readable medium or computer storage device, such as hard drives, solid state memory, optical disc, and/or the like. The systems and modules may also be transmitted as generated data signals (for example, as part of a carrier wave or other analog or digital propagated signal) on a variety of computer-readable transmission mediums, including wireless-based and wired/cable-based mediums, and may take a variety of forms (for example, as part of a single or multiplexed analog signal, or as multiple discrete digital packets or frames). The processes and algorithms may be implemented partially or wholly in application-specific circuitry. The results of the disclosed processes and process steps may be stored, persistently or otherwise, in any type of non-transitory computer storage such as, for example, volatile or non-volatile storage.
- Many other variations than those described herein will be apparent from this disclosure. For example, depending on the embodiment, certain acts, events, or functions of any of the algorithms described herein can be performed in a different sequence, can be added, merged, or left out altogether (for example, not all described acts or events are necessary for the practice of the algorithms). Moreover, in certain embodiments, acts or events can be performed concurrently, for example, through multi-threaded processing, interrupt processing, or multiple processors or processor cores or on other parallel architectures, rather than sequentially. In addition, different tasks or processes can be performed by different machines and/or computing systems that can function together.
- The various illustrative logical blocks, modules, and algorithm elements described in connection with the embodiments disclosed herein can be implemented as electronic hardware, computer software, or combinations of both. To clearly illustrate this interchangeability of hardware and software, various illustrative components, blocks, modules, and elements have been described herein generally in terms of their functionality. Whether such functionality is implemented as hardware or software depends upon the particular application and design constraints imposed on the overall system. The described functionality can be implemented in varying ways for each particular application, but such implementation decisions should not be interpreted as causing a departure from the scope of the disclosure.
- The various features and processes described herein may be used independently of one another, or may be combined in various ways. All possible combinations and sub-combinations are intended to fall within the scope of this disclosure. In addition, certain method or process blocks may be omitted in some implementations. The methods and processes described herein are also not limited to any particular sequence, and the blocks or states relating thereto can be performed in other sequences that are appropriate. For example, described blocks or states may be performed in an order other than that specifically disclosed, or multiple blocks or states may be combined in a single block or state. The example blocks or states may be performed in serial, in parallel, or in some other manner. Blocks or states may be added to or removed from the disclosed example embodiments. The example systems and components described herein may be configured differently than described. For example, elements may be added to, removed from, or rearranged compared to the disclosed example embodiments.
- The various illustrative logical blocks and modules described in connection with the embodiments disclosed herein can be implemented or performed by a machine, such as a general purpose processor, a digital signal processor (“DSP”), an application specific integrated circuit (“ASIC”), a field programmable gate array (“FPGA”) or other programmable logic device, discrete gate or transistor logic, discrete hardware components, or any combination thereof designed to perform the functions described herein. A general purpose processor can be a microprocessor, but in the alternative, the processor can be a controller, microcontroller, or state machine, combinations of the same, or the like. A processor can include electrical circuitry configured to process computer-executable instructions. In another embodiment, a processor includes an FPGA or other programmable devices that performs logic operations without processing computer-executable instructions. A processor can also be implemented as a combination of computing devices, for example, a combination of a DSP and a microprocessor, a plurality of microprocessors, one or more microprocessors in conjunction with a DSP core, or any other such configuration. Although described herein primarily with respect to digital technology, a processor may also include primarily analog components. For example, some, or all, of the signal processing algorithms described herein may be implemented in analog circuitry or mixed analog and digital circuitry. A computing environment can include any type of computer system, including, but not limited to, a computer system based on a microprocessor, a mainframe computer, a digital signal processor, a portable computing device, a device controller, or a computational engine within an appliance, to name a few.
- The elements of a method, process, or algorithm described in connection with the embodiments disclosed herein can be embodied directly in hardware, in a software module stored in one or more memory devices and executed by one or more processors, or in a combination of the two. A software module can reside in RAM memory, flash memory, ROM memory, EPROM memory, EEPROM memory, registers, hard disk, a removable disk, a CD-ROM, or any other form of non-transitory computer-readable storage medium, media, or physical computer storage known in the art. An example storage medium can be coupled to the processor such that the processor can read information from, and write information to, the storage medium. In the alternative, the storage medium can be integral to the processor. The storage medium can be volatile or nonvolatile. The processor and the storage medium can reside in an ASIC. The ASIC can reside in a user terminal. In the alternative, the processor and the storage medium can reside as discrete components in a user terminal.
- Conditional language, such as, among others, “can,” “could,” “might,” or “may,” unless specifically stated otherwise, or otherwise understood within the context as used, is generally intended to convey that certain embodiments include, while other embodiments do not include, certain features, elements and/or steps. Thus, such conditional language is not generally intended to imply that features, elements and/or steps are in any way required for one or more embodiments or that one or more embodiments necessarily include logic for deciding, with or without user input or prompting, whether these features, elements and/or steps are included or are to be performed in any particular embodiment.
- Disjunctive language such as the phrase “at least one of X, Y, or Z,” unless specifically stated otherwise, is otherwise understood with the context as used in general to present that an item, term, and so forth, may be either X, Y, or Z, or any combination thereof (for example, X, Y, and/or Z). Thus, such disjunctive language is not generally intended to, and should not, imply that certain embodiments require at least one of X, at least one of Y, or at least one of Z to each be present.
- Language of degree used herein, such as the terms “approximately,” “about,” “generally,” and “substantially” as used herein represent a value, amount, or characteristic close to the stated value, amount, or characteristic that still performs a desired function or achieves a desired result. For example, the terms “approximately”, “about”, “generally,” and “substantially” may refer to an amount that is within less than 10% of, within less than 5% of, within less than 1% of, within less than 0.1% of, and within less than 0.01% of the stated amount. As another example, in certain embodiments, the terms “generally parallel” and “substantially parallel” refer to a value, amount, or characteristic that departs from exactly parallel by less than or equal to 10 degrees, 5 degrees, 3 degrees, or 1 degree. As another example, in certain embodiments, the terms “generally perpendicular” and “substantially perpendicular” refer to a value, amount, or characteristic that departs from exactly perpendicular by less than or equal to 10 degrees, 5 degrees, 3 degrees, or 1 degree.
- Any process descriptions, elements, or blocks in the flow diagrams described herein and/or depicted in the attached figures should be understood as potentially representing modules, segments, or portions of code which include one or more executable instructions for implementing specific logical functions or steps in the process. Alternate implementations are included within the scope of the embodiments described herein in which elements or functions may be deleted, executed out of order from that shown or discussed, including substantially concurrently or in reverse order, depending on the functionality involved, as would be understood by those skilled in the art.
- Unless otherwise explicitly stated, articles such as “a” or “an” should generally be interpreted to include one or more described items. Accordingly, phrases such as “a device configured to” are intended to include one or more recited devices. Such one or more recited devices can also be collectively configured to carry out the stated recitations. For example, “a processor configured to carry out recitations A, B and C” can include a first processor configured to carry out recitation A working in conjunction with a second processor configured to carry out recitations B and C.
- All of the methods and processes described herein may be embodied in, and partially or fully automated via, software code modules executed by one or more general purpose computers. For example, the methods described herein may be performed by the computing system and/or any other suitable computing device. The methods may be executed on the computing devices in response to execution of software instructions or other executable code read from a tangible computer readable medium. A tangible computer readable medium is a data storage device that can store data that is readable by a computer system. Examples of computer readable mediums include read-only memory, random-access memory, other volatile or non-volatile memory devices, CD-ROMs, magnetic tape, flash drives, and optical data storage devices.
- It should be emphasized that many variations and modifications may be made to the herein-described embodiments, the elements of which are to be understood as being among other acceptable examples. All such modifications and variations are intended to be included herein within the scope of this disclosure. The section headings used herein are merely provided to enhance readability and are not intended to limit the scope of the embodiments disclosed in a particular section to the features or elements disclosed in that section. The foregoing description details certain embodiments. It will be appreciated, however, that no matter how detailed the foregoing appears in text, the systems and methods can be practiced in many ways. As is also stated herein, it should be noted that the use of particular terminology when describing certain features or aspects of the systems and methods should not be taken to imply that the terminology is being re-defined herein to be restricted to including any specific characteristics of the features or aspects of the systems and methods with which that terminology is associated.
- Those of skill in the art would understand that information, messages, and signals may be represented using any of a variety of different technologies and techniques. For example, data, instructions, commands, information, signals, bits, symbols, and chips that may be referenced throughout the above description may be represented by voltages, currents, electromagnetic waves, magnetic fields or particles, optical fields or particles, or any combination thereof.
Claims (19)
1. A soundbar for emitting audio and monitoring a physiological health of a user, the soundbar comprising:
a speaker configured to emit audio, the audio comprising one or more of music, an alert, information relating to a physiology of the user, or instructions to the user;
a sensor configured to obtain sensor data as the user is within a proximity of the soundbar, the sensor comprising one or more cameras configured to capture electromagnetic radiation including infrared radiation and visible light radiation, the sensor data comprising image data of the user and relating to a physiology of the user; and
one or more hardware processors configured to:
determine a distribution of body heat of the user based on at least the image data of the user, the image data comprising infrared image data indicating thermal energy;
determine a distribution of body weight of the user based on at least the image data of the user, the image data comprising visible light image data; and
determine a health index of the user from at least the distribution of body weight of the user or the distribution of body heat of the user.
2. The soundbar of claim 1 , wherein the sensor comprising the one or more cameras is further configured to obtain the sensor data comprising the image data of the user as the user rotates in front of the soundbar, the image data of the user corresponding to a plurality of portions of the user's body.
3. The soundbar of claim 1 , further comprising a communication component configured to communicate with one or more computing devices, the one or more computing devices including a scale configured to measure a weight of the user, wherein the one or more hardware processors is further configured to:
generate one or more instructions to the scale to cause the scale to rotate as the user stands on the scale; and
cause the sensor to obtain the sensor data as the scale rotates.
4. The soundbar of claim 1 , further comprising a communication component configured to communicate with one or more computing devices, the one or more computing devices including a scale configured to measure a weight of the user, wherein the one or more hardware processors is further configured to:
access scale data obtained from the scale by the communication component, the scale data including at least a weight of the user; and
determine the health index of the user based on at least the scale data.
5. The soundbar of claim 4 , wherein the scale data further includes at least one or more of a percent body fat of the user, a percent lean muscle mass of the user, a percent water of the user, a BMI of the user, a change in weight of the user, or ECG data of the user.
6. The soundbar of claim 1 , wherein the one or more hardware processors is further configured to:
determine a likelihood the user has an infectious disease based on at least the distribution of body heat of the user, the distribution of body heat of the user indicating a body temperature of the user, the infectious disease comprising a virus.
7. The soundbar of claim 1 , wherein the one or more hardware processors is further configured to:
process the image data to generate PPG data and determine one or more physiological parameters based on the PPG data.
8. The soundbar of claim 1 , wherein the one or more hardware processors is further configured to:
cause the speaker to emit one or more of an instruction to the user or information relating to the health index of the user.
9. The soundbar of claim 1 , further comprising a communication component configured to communicate with one or more computing devices, the one or more computing devices including a display, wherein the one or more hardware processors is further configured to:
generate user interface data for rendering one or more user interfaces comprising indicia of the health index of the user; and
cause the communication component to communicate the user interface data to the display to render the one or more user interfaces.
10. The soundbar of claim 9 , wherein the one or more user interfaces further comprises one or more images corresponding to the image data of the user, the one or more images comprising one or more historical images or one or more real-time images.
11. The soundbar of claim 1 , further comprising a communication component configured to communicate with one or more computing devices, the one or more computing devices including a scale configured to measure a weight of the user, wherein the one or more hardware processors is further configured to:
cause the sensor to turn on or to obtain the sensor data responsive to one or more signals received from the scale by the communication component, the one or more signals generated by the scale responsive to a user standing on the scale.
12. The soundbar of claim 1 , wherein the one or more hardware processors is further configured to:
determine the distribution of body heat of the user based on at least the image data of the user, the image data comprising historical image data or real-time image data; and
determine the distribution of body weight of the user based on at least the image data of the user, the image data comprising historical image data or real-time image data.
13. The soundbar of claim 1 , wherein the one or more cameras comprise one or more of a plurality of cameras, a 3D camera, a depth camera, a stereovision camera, an infrared camera, or a light detection and ranging (LIDAR) sensor.
14. The soundbar of claim 1 , wherein the one or more hardware processors is further configured to:
cause the one or more cameras to adjust a view of the one or more cameras, wherein adjusting the view of the one or more cameras comprises one or more of rotating the one or more cameras, adjusting a zoom of the one or more cameras, pivoting the one or more cameras, tilting the one or more cameras, or panning the one or more cameras.
15. The soundbar of claim 1 , wherein the one or more hardware processors is further configured to:
cause a communication component of the soundbar to implement wireless communication with a remote computing device to establish a video call;
cause the communication component to transmit the image data of the user to the remote computing device; and
generate user interface data for rendering one or more user interfaces comprising one or more images received from the remote computing device by the communication component.
16. A method for monitoring a health of a user, the method comprising:
accessing sensor data originating from a sensor of a soundbar, the sensor comprising one or more cameras configured to capture electromagnetic radiation including infrared radiation and visible light radiation, the sensor data comprising image data of the user and relating to a physiology of the user;
determining a distribution of body heat of the user based on at least the image data of the user, the image data comprising infrared image data indicating thermal energy;
determining a distribution of body weight of the user based on at least the image data of the user, the image data comprising visible light image data;
determining a health index of the user from at least the distribution of body weight of the user or the distribution of body heat of the user; and
causing a speaker of the soundbar to emit an audio comprising one or more of music, an alert, information relating to the health index, or instructions to the user.
17. The method of claim 16, further comprising:
generating one or more instructions to a scale to cause the scale to rotate as the user stands on the scale; and
causing the sensor to obtain the sensor data as the scale rotates.
18. Non-transitory computer-readable media including computer-executable instructions that, when executed by a computing system, cause the computing system to perform operations comprising:
accessing sensor data originating from a sensor of a soundbar, the sensor comprising one or more cameras configured to capture electromagnetic radiation including infrared radiation and visible light radiation, the sensor data comprising image data relating to a physiology of a user;
determining a distribution of body heat of the user based on at least the image data of the user, the image data comprising infrared image data indicating thermal energy;
determining a distribution of body weight of the user based on at least the image data of the user, the image data comprising visible light image data;
determining a health index of the user from at least the distribution of body weight of the user or the distribution of body heat of the user; and
causing a speaker of the soundbar to emit an audio comprising one or more of music, an alert, information relating to the health index, or instructions to the user.
19. The non-transitory computer-readable media of claim 18, wherein the computer-executable instructions, when executed by the computing system, further cause the computing system to perform operations comprising:
generating one or more instructions to a scale to cause the scale to rotate as the user stands on the scale; and
causing the sensor to obtain the sensor data as the scale rotates.
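Claims 17 and 19 describe rotating a scale under the user while the soundbar's sensor captures data, yielding views of the user from multiple angles. A minimal sketch of that control loop follows; the `Scale` and `Sensor` interfaces are stand-ins assumed for demonstration, not APIs described in the application.

```python
# Illustrative sketch of the rotating-scale scan in claims 17 and 19.
# Scale and Sensor are assumed stand-in interfaces.

class Scale:
    """Stand-in for a network-controllable rotating scale."""
    def __init__(self):
        self.commanded_angles = []

    def rotate_to(self, degrees):
        # A real device would receive a rotation instruction here.
        self.commanded_angles.append(degrees)

class Sensor:
    """Stand-in for the soundbar's camera sensor."""
    def capture(self):
        return "frame"  # a real sensor would return image data

def rotating_scan(scale, sensor, steps=12):
    """Rotate the scale in equal angular steps while the user stands
    on it, capturing one sensor frame per angle for a full 360-degree
    view of the user."""
    frames = []
    for i in range(steps):
        angle = i * (360 // steps)
        scale.rotate_to(angle)                     # instruct the scale
        frames.append((angle, sensor.capture()))   # capture at this angle
    return frames
```

With `steps=4`, for example, the loop commands the scale to 0, 90, 180, and 270 degrees and pairs each angle with a captured frame.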
Priority Applications (1)
Application Number | Priority Date | Filing Date | Title |
---|---|---|---|
US18/486,730 | 2022-10-17 | 2023-10-13 | Physiological monitoring soundbar
Applications Claiming Priority (2)
Application Number | Priority Date | Filing Date | Title |
---|---|---|---|
US202263379864P | 2022-10-17 | 2022-10-17 | |
US18/486,730 | 2022-10-17 | 2023-10-13 | Physiological monitoring soundbar
Publications (1)
Publication Number | Publication Date |
---|---|
US20240122486A1 | 2024-04-18
Family
ID=90627562
Family Applications (1)
Application Number | Title | Priority Date | Filing Date |
---|---|---|---|
US18/486,730 | Physiological monitoring soundbar | 2022-10-17 | 2023-10-13
Country Status (1)
Country | Link |
---|---|
US | US20240122486A1
Cited By (6)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
US12082926B2 (en) | 2020-08-04 | 2024-09-10 | Masimo Corporation | Optical sensor with multiple detectors or multiple emitters |
USD1042852S1 (en) | 2021-06-24 | 2024-09-17 | Masimo Corporation | Physiological nose sensor |
USD1042596S1 (en) | 2022-12-12 | 2024-09-17 | Masimo Corporation | Monitoring camera |
US12107960B2 (en) | 2016-07-06 | 2024-10-01 | Masimo Corporation | Secure and zero knowledge data sharing for cloud applications |
US12109048B2 (en) | 2006-06-05 | 2024-10-08 | Masimo Corporation | Parameter upgrade system |
US12109012B2 (en) | 2006-12-09 | 2024-10-08 | Masimo Corporation | Plethysmograph variability processor |
Similar Documents
Publication | Publication Date | Title |
---|---|---|
US20240122486A1 (en) | Physiological monitoring soundbar | |
US20230222805A1 (en) | Machine learning based monitoring system | |
US10492721B2 (en) | Method and apparatus for improving and monitoring sleep | |
US9795324B2 (en) | System for monitoring individuals as they age in place | |
JP6330199B2 (en) | Body position optimization and biological signal feedback of smart wearable device | |
EP3250110B1 (en) | Method and apparatus for improving and monitoring sleep | |
Lamonaca et al. | Health parameters monitoring by smartphone for quality of life improvement | |
CN108882875A (en) | Equipment for neural blood vessel stimulation | |
TW201935186A (en) | Robot assisted interaction system and method thereof | |
JP7028787B2 (en) | Timely triggers for measuring physiological parameters using visual context | |
US10959646B2 (en) | Image detection method and image detection device for determining position of user | |
US20120194648A1 (en) | Video/ audio controller | |
WO2018222589A1 (en) | System and method for treating disorders with a virtual reality system | |
Chung et al. | Design and implementation of a novel system for correcting posture through the use of a wearable necklace sensor | |
US20230253103A1 (en) | Systems and methods for monitoring user activity | |
US20210265055A1 (en) | Smart Meditation and Physiological System for the Cloud | |
US20190274630A1 (en) | Output control device, output control method, and program | |
JP2022059140A (en) | Information processing device and program | |
JP2019155078A (en) | Posture and deep respiration improvement device, system and method | |
US20230390608A1 (en) | Systems and methods including ear-worn devices for vestibular rehabilitation exercises | |
WO2017016941A1 (en) | Wearable device, method and computer program product | |
KR20180045101A (en) | Biometric data display system using actual image and computer graphics image and method for displaying thereof | |
KR101641027B1 (en) | Remote monitoring wearable apparatus for objects to protect and apparatus and method for monitoring objects to protect | |
US20230107691A1 (en) | Closed Loop System Using In-ear Infrasonic Hemodynography and Method Therefor | |
US20220240802A1 (en) | In-ear device for blood pressure monitoring |
Legal Events
Date | Code | Title | Description |
---|---|---|---|
STPP | Information on status: patent application and granting procedure in general | Free format text: DOCKETED NEW CASE - READY FOR EXAMINATION
AS | Assignment | Owner name: MASIMO CORPORATION, CALIFORNIA; Free format text: ASSIGNMENT OF ASSIGNORS INTEREST; ASSIGNOR: KIANI, MASSI JOE E.; REEL/FRAME: 066137/0567; Effective date: 20240116