US20210057093A1 - Remote monitoring systems and methods for elderly and patient in-home and senior living facilities care

Remote monitoring systems and methods for elderly and patient in-home and senior living facilities care

Info

Publication number
US20210057093A1
US20210057093A1 (application US16/947,816; also published as US 2021/0057093 A1)
Authority
US
United States
Prior art keywords
activity
patient
sensor
elderly
processor
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Abandoned
Application number
US16/947,816
Inventor
Michael E. DeSa
Johnie Rose II
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
Vinya Intelligence Inc
Original Assignee
Vinya Intelligence Inc
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by Vinya Intelligence Inc filed Critical Vinya Intelligence Inc
Priority to US16/947,816
Publication of US20210057093A1
Legal status: Abandoned

Classifications

    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06QINFORMATION AND COMMUNICATION TECHNOLOGY [ICT] SPECIALLY ADAPTED FOR ADMINISTRATIVE, COMMERCIAL, FINANCIAL, MANAGERIAL OR SUPERVISORY PURPOSES; SYSTEMS OR METHODS SPECIALLY ADAPTED FOR ADMINISTRATIVE, COMMERCIAL, FINANCIAL, MANAGERIAL OR SUPERVISORY PURPOSES, NOT OTHERWISE PROVIDED FOR
    • G06Q50/00Systems or methods specially adapted for specific business sectors, e.g. utilities or tourism
    • G06Q50/06Electricity, gas or water supply
    • AHUMAN NECESSITIES
    • A61MEDICAL OR VETERINARY SCIENCE; HYGIENE
    • A61BDIAGNOSIS; SURGERY; IDENTIFICATION
    • A61B5/00Measuring for diagnostic purposes; Identification of persons
    • A61B5/0002Remote monitoring of patients using telemetry, e.g. transmission of vital signals via a communication network
    • A61B5/0015Remote monitoring of patients using telemetry, e.g. transmission of vital signals via a communication network characterised by features of the telemetry system
    • A61B5/0022Monitoring a patient using a global network, e.g. telephone networks, internet
    • AHUMAN NECESSITIES
    • A61MEDICAL OR VETERINARY SCIENCE; HYGIENE
    • A61BDIAGNOSIS; SURGERY; IDENTIFICATION
    • A61B5/00Measuring for diagnostic purposes; Identification of persons
    • A61B5/02Detecting, measuring or recording pulse, heart rate, blood pressure or blood flow; Combined pulse/heart-rate/blood pressure determination; Evaluating a cardiovascular condition not otherwise provided for, e.g. using combinations of techniques provided for in this group with electrocardiography or electroauscultation; Heart catheters for measuring blood pressure
    • A61B5/0205Simultaneously evaluating both cardiovascular conditions and different types of body conditions, e.g. heart and respiratory condition
    • AHUMAN NECESSITIES
    • A61MEDICAL OR VETERINARY SCIENCE; HYGIENE
    • A61BDIAGNOSIS; SURGERY; IDENTIFICATION
    • A61B5/00Measuring for diagnostic purposes; Identification of persons
    • A61B5/05Detecting, measuring or recording for diagnosis by means of electric currents or magnetic fields; Measuring using microwaves or radio waves 
    • AHUMAN NECESSITIES
    • A61MEDICAL OR VETERINARY SCIENCE; HYGIENE
    • A61BDIAGNOSIS; SURGERY; IDENTIFICATION
    • A61B5/00Measuring for diagnostic purposes; Identification of persons
    • A61B5/103Detecting, measuring or recording devices for testing the shape, pattern, colour, size or movement of the body or parts thereof, for diagnostic purposes
    • A61B5/11Measuring movement of the entire body or parts thereof, e.g. head or hand tremor, mobility of a limb
    • A61B5/1102Ballistocardiography
    • AHUMAN NECESSITIES
    • A61MEDICAL OR VETERINARY SCIENCE; HYGIENE
    • A61BDIAGNOSIS; SURGERY; IDENTIFICATION
    • A61B5/00Measuring for diagnostic purposes; Identification of persons
    • A61B5/103Detecting, measuring or recording devices for testing the shape, pattern, colour, size or movement of the body or parts thereof, for diagnostic purposes
    • A61B5/11Measuring movement of the entire body or parts thereof, e.g. head or hand tremor, mobility of a limb
    • A61B5/1113Local tracking of patients, e.g. in a hospital or private home
    • AHUMAN NECESSITIES
    • A61MEDICAL OR VETERINARY SCIENCE; HYGIENE
    • A61BDIAGNOSIS; SURGERY; IDENTIFICATION
    • A61B5/00Measuring for diagnostic purposes; Identification of persons
    • A61B5/103Detecting, measuring or recording devices for testing the shape, pattern, colour, size or movement of the body or parts thereof, for diagnostic purposes
    • A61B5/11Measuring movement of the entire body or parts thereof, e.g. head or hand tremor, mobility of a limb
    • A61B5/1113Local tracking of patients, e.g. in a hospital or private home
    • A61B5/1115Monitoring leaving of a patient support, e.g. a bed or a wheelchair
    • AHUMAN NECESSITIES
    • A61MEDICAL OR VETERINARY SCIENCE; HYGIENE
    • A61BDIAGNOSIS; SURGERY; IDENTIFICATION
    • A61B5/00Measuring for diagnostic purposes; Identification of persons
    • A61B5/103Detecting, measuring or recording devices for testing the shape, pattern, colour, size or movement of the body or parts thereof, for diagnostic purposes
    • A61B5/11Measuring movement of the entire body or parts thereof, e.g. head or hand tremor, mobility of a limb
    • A61B5/1116Determining posture transitions
    • AHUMAN NECESSITIES
    • A61MEDICAL OR VETERINARY SCIENCE; HYGIENE
    • A61BDIAGNOSIS; SURGERY; IDENTIFICATION
    • A61B5/00Measuring for diagnostic purposes; Identification of persons
    • A61B5/103Detecting, measuring or recording devices for testing the shape, pattern, colour, size or movement of the body or parts thereof, for diagnostic purposes
    • A61B5/11Measuring movement of the entire body or parts thereof, e.g. head or hand tremor, mobility of a limb
    • A61B5/1118Determining activity level
    • AHUMAN NECESSITIES
    • A61MEDICAL OR VETERINARY SCIENCE; HYGIENE
    • A61BDIAGNOSIS; SURGERY; IDENTIFICATION
    • A61B5/00Measuring for diagnostic purposes; Identification of persons
    • A61B5/103Detecting, measuring or recording devices for testing the shape, pattern, colour, size or movement of the body or parts thereof, for diagnostic purposes
    • A61B5/11Measuring movement of the entire body or parts thereof, e.g. head or hand tremor, mobility of a limb
    • A61B5/112Gait analysis
    • AHUMAN NECESSITIES
    • A61MEDICAL OR VETERINARY SCIENCE; HYGIENE
    • A61BDIAGNOSIS; SURGERY; IDENTIFICATION
    • A61B5/00Measuring for diagnostic purposes; Identification of persons
    • A61B5/103Detecting, measuring or recording devices for testing the shape, pattern, colour, size or movement of the body or parts thereof, for diagnostic purposes
    • A61B5/11Measuring movement of the entire body or parts thereof, e.g. head or hand tremor, mobility of a limb
    • A61B5/1126Measuring movement of the entire body or parts thereof, e.g. head or hand tremor, mobility of a limb using a particular sensing technique
    • AHUMAN NECESSITIES
    • A61MEDICAL OR VETERINARY SCIENCE; HYGIENE
    • A61BDIAGNOSIS; SURGERY; IDENTIFICATION
    • A61B5/00Measuring for diagnostic purposes; Identification of persons
    • A61B5/103Detecting, measuring or recording devices for testing the shape, pattern, colour, size or movement of the body or parts thereof, for diagnostic purposes
    • A61B5/11Measuring movement of the entire body or parts thereof, e.g. head or hand tremor, mobility of a limb
    • A61B5/1126Measuring movement of the entire body or parts thereof, e.g. head or hand tremor, mobility of a limb using a particular sensing technique
    • A61B5/1128Measuring movement of the entire body or parts thereof, e.g. head or hand tremor, mobility of a limb using a particular sensing technique using image analysis
    • AHUMAN NECESSITIES
    • A61MEDICAL OR VETERINARY SCIENCE; HYGIENE
    • A61BDIAGNOSIS; SURGERY; IDENTIFICATION
    • A61B5/00Measuring for diagnostic purposes; Identification of persons
    • A61B5/48Other medical applications
    • A61B5/4806Sleep evaluation
    • A61B5/4812Detecting sleep stages or cycles
    • AHUMAN NECESSITIES
    • A61MEDICAL OR VETERINARY SCIENCE; HYGIENE
    • A61BDIAGNOSIS; SURGERY; IDENTIFICATION
    • A61B5/00Measuring for diagnostic purposes; Identification of persons
    • A61B5/48Other medical applications
    • A61B5/4833Assessment of subject's compliance to treatment
    • AHUMAN NECESSITIES
    • A61MEDICAL OR VETERINARY SCIENCE; HYGIENE
    • A61BDIAGNOSIS; SURGERY; IDENTIFICATION
    • A61B5/00Measuring for diagnostic purposes; Identification of persons
    • A61B5/72Signal processing specially adapted for physiological signals or for diagnostic purposes
    • A61B5/7235Details of waveform analysis
    • A61B5/7264Classification of physiological signals or data, e.g. using neural networks, statistical classifiers, expert systems or fuzzy systems
    • AHUMAN NECESSITIES
    • A61MEDICAL OR VETERINARY SCIENCE; HYGIENE
    • A61BDIAGNOSIS; SURGERY; IDENTIFICATION
    • A61B5/00Measuring for diagnostic purposes; Identification of persons
    • A61B5/72Signal processing specially adapted for physiological signals or for diagnostic purposes
    • A61B5/7271Specific aspects of physiological measurement analysis
    • A61B5/7275Determining trends in physiological measurement data; Predicting development of a medical condition based on physiological measurements, e.g. determining a risk factor
    • AHUMAN NECESSITIES
    • A61MEDICAL OR VETERINARY SCIENCE; HYGIENE
    • A61BDIAGNOSIS; SURGERY; IDENTIFICATION
    • A61B5/00Measuring for diagnostic purposes; Identification of persons
    • A61B5/74Details of notification to user or communication with user or patient ; user input means
    • A61B5/742Details of notification to user or communication with user or patient ; user input means using visual displays
    • AHUMAN NECESSITIES
    • A61MEDICAL OR VETERINARY SCIENCE; HYGIENE
    • A61BDIAGNOSIS; SURGERY; IDENTIFICATION
    • A61B5/00Measuring for diagnostic purposes; Identification of persons
    • A61B5/74Details of notification to user or communication with user or patient ; user input means
    • A61B5/742Details of notification to user or communication with user or patient ; user input means using visual displays
    • A61B5/744Displaying an avatar, e.g. an animated cartoon character
    • AHUMAN NECESSITIES
    • A61MEDICAL OR VETERINARY SCIENCE; HYGIENE
    • A61BDIAGNOSIS; SURGERY; IDENTIFICATION
    • A61B5/00Measuring for diagnostic purposes; Identification of persons
    • A61B5/74Details of notification to user or communication with user or patient ; user input means
    • A61B5/746Alarms related to a physiological condition, e.g. details of setting alarm thresholds or avoiding false alarms
    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06NCOMPUTING ARRANGEMENTS BASED ON SPECIFIC COMPUTATIONAL MODELS
    • G06N20/00Machine learning
    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06NCOMPUTING ARRANGEMENTS BASED ON SPECIFIC COMPUTATIONAL MODELS
    • G06N5/00Computing arrangements using knowledge-based models
    • G06N5/02Knowledge representation; Symbolic representation
    • GPHYSICS
    • G08SIGNALLING
    • G08BSIGNALLING OR CALLING SYSTEMS; ORDER TELEGRAPHS; ALARM SYSTEMS
    • G08B21/00Alarms responsive to a single specified undesired or abnormal condition and not otherwise provided for
    • G08B21/02Alarms for ensuring the safety of persons
    • G08B21/04Alarms for ensuring the safety of persons responsive to non-activity, e.g. of elderly persons
    • G08B21/0407Alarms for ensuring the safety of persons responsive to non-activity, e.g. of elderly persons based on behaviour analysis
    • G08B21/0423Alarms for ensuring the safety of persons responsive to non-activity, e.g. of elderly persons based on behaviour analysis detecting deviation from an expected pattern of behaviour or schedule
    • GPHYSICS
    • G08SIGNALLING
    • G08BSIGNALLING OR CALLING SYSTEMS; ORDER TELEGRAPHS; ALARM SYSTEMS
    • G08B21/00Alarms responsive to a single specified undesired or abnormal condition and not otherwise provided for
    • G08B21/02Alarms for ensuring the safety of persons
    • G08B21/04Alarms for ensuring the safety of persons responsive to non-activity, e.g. of elderly persons
    • G08B21/0407Alarms for ensuring the safety of persons responsive to non-activity, e.g. of elderly persons based on behaviour analysis
    • G08B21/043Alarms for ensuring the safety of persons responsive to non-activity, e.g. of elderly persons based on behaviour analysis detecting an emergency event, e.g. a fall
    • GPHYSICS
    • G08SIGNALLING
    • G08BSIGNALLING OR CALLING SYSTEMS; ORDER TELEGRAPHS; ALARM SYSTEMS
    • G08B21/00Alarms responsive to a single specified undesired or abnormal condition and not otherwise provided for
    • G08B21/02Alarms for ensuring the safety of persons
    • G08B21/04Alarms for ensuring the safety of persons responsive to non-activity, e.g. of elderly persons
    • G08B21/0438Sensor means for detecting
    • G08B21/0453Sensor means for detecting worn on the body to detect health condition by physiological monitoring, e.g. electrocardiogram, temperature, breathing
    • GPHYSICS
    • G08SIGNALLING
    • G08BSIGNALLING OR CALLING SYSTEMS; ORDER TELEGRAPHS; ALARM SYSTEMS
    • G08B21/00Alarms responsive to a single specified undesired or abnormal condition and not otherwise provided for
    • G08B21/02Alarms for ensuring the safety of persons
    • G08B21/04Alarms for ensuring the safety of persons responsive to non-activity, e.g. of elderly persons
    • G08B21/0438Sensor means for detecting
    • G08B21/0484Arrangements monitoring consumption of a utility or use of an appliance which consumes a utility to detect unsafe condition, e.g. metering of water, gas or electricity, use of taps, toilet flush, gas stove or electric kettle
    • GPHYSICS
    • G08SIGNALLING
    • G08BSIGNALLING OR CALLING SYSTEMS; ORDER TELEGRAPHS; ALARM SYSTEMS
    • G08B21/00Alarms responsive to a single specified undesired or abnormal condition and not otherwise provided for
    • G08B21/02Alarms for ensuring the safety of persons
    • G08B21/04Alarms for ensuring the safety of persons responsive to non-activity, e.g. of elderly persons
    • G08B21/0438Sensor means for detecting
    • G08B21/0492Sensor dual technology, i.e. two or more technologies collaborate to extract unsafe condition, e.g. video tracking and RFID tracking
    • GPHYSICS
    • G08SIGNALLING
    • G08BSIGNALLING OR CALLING SYSTEMS; ORDER TELEGRAPHS; ALARM SYSTEMS
    • G08B7/00Signalling systems according to more than one of groups G08B3/00 - G08B6/00; Personal calling systems according to more than one of groups G08B3/00 - G08B6/00
    • G08B7/06Signalling systems according to more than one of groups G08B3/00 - G08B6/00; Personal calling systems according to more than one of groups G08B3/00 - G08B6/00 using electric transmission, e.g. involving audible and visible signalling through the use of sound and light sources
    • GPHYSICS
    • G16INFORMATION AND COMMUNICATION TECHNOLOGY [ICT] SPECIALLY ADAPTED FOR SPECIFIC APPLICATION FIELDS
    • G16HHEALTHCARE INFORMATICS, i.e. INFORMATION AND COMMUNICATION TECHNOLOGY [ICT] SPECIALLY ADAPTED FOR THE HANDLING OR PROCESSING OF MEDICAL OR HEALTHCARE DATA
    • G16H10/00ICT specially adapted for the handling or processing of patient-related medical or healthcare data
    • G16H10/60ICT specially adapted for the handling or processing of patient-related medical or healthcare data for patient-specific data, e.g. for electronic patient records
    • GPHYSICS
    • G16INFORMATION AND COMMUNICATION TECHNOLOGY [ICT] SPECIALLY ADAPTED FOR SPECIFIC APPLICATION FIELDS
    • G16HHEALTHCARE INFORMATICS, i.e. INFORMATION AND COMMUNICATION TECHNOLOGY [ICT] SPECIALLY ADAPTED FOR THE HANDLING OR PROCESSING OF MEDICAL OR HEALTHCARE DATA
    • G16H15/00ICT specially adapted for medical reports, e.g. generation or transmission thereof
    • GPHYSICS
    • G16INFORMATION AND COMMUNICATION TECHNOLOGY [ICT] SPECIALLY ADAPTED FOR SPECIFIC APPLICATION FIELDS
    • G16HHEALTHCARE INFORMATICS, i.e. INFORMATION AND COMMUNICATION TECHNOLOGY [ICT] SPECIALLY ADAPTED FOR THE HANDLING OR PROCESSING OF MEDICAL OR HEALTHCARE DATA
    • G16H20/00ICT specially adapted for therapies or health-improving plans, e.g. for handling prescriptions, for steering therapy or for monitoring patient compliance
    • GPHYSICS
    • G16INFORMATION AND COMMUNICATION TECHNOLOGY [ICT] SPECIALLY ADAPTED FOR SPECIFIC APPLICATION FIELDS
    • G16HHEALTHCARE INFORMATICS, i.e. INFORMATION AND COMMUNICATION TECHNOLOGY [ICT] SPECIALLY ADAPTED FOR THE HANDLING OR PROCESSING OF MEDICAL OR HEALTHCARE DATA
    • G16H20/00ICT specially adapted for therapies or health-improving plans, e.g. for handling prescriptions, for steering therapy or for monitoring patient compliance
    • G16H20/10ICT specially adapted for therapies or health-improving plans, e.g. for handling prescriptions, for steering therapy or for monitoring patient compliance relating to drugs or medications, e.g. for ensuring correct administration to patients
    • GPHYSICS
    • G16INFORMATION AND COMMUNICATION TECHNOLOGY [ICT] SPECIALLY ADAPTED FOR SPECIFIC APPLICATION FIELDS
    • G16HHEALTHCARE INFORMATICS, i.e. INFORMATION AND COMMUNICATION TECHNOLOGY [ICT] SPECIALLY ADAPTED FOR THE HANDLING OR PROCESSING OF MEDICAL OR HEALTHCARE DATA
    • G16H20/00ICT specially adapted for therapies or health-improving plans, e.g. for handling prescriptions, for steering therapy or for monitoring patient compliance
    • G16H20/30ICT specially adapted for therapies or health-improving plans, e.g. for handling prescriptions, for steering therapy or for monitoring patient compliance relating to physical therapies or activities, e.g. physiotherapy, acupressure or exercising
    • GPHYSICS
    • G16INFORMATION AND COMMUNICATION TECHNOLOGY [ICT] SPECIALLY ADAPTED FOR SPECIFIC APPLICATION FIELDS
    • G16HHEALTHCARE INFORMATICS, i.e. INFORMATION AND COMMUNICATION TECHNOLOGY [ICT] SPECIALLY ADAPTED FOR THE HANDLING OR PROCESSING OF MEDICAL OR HEALTHCARE DATA
    • G16H40/00ICT specially adapted for the management or administration of healthcare resources or facilities; ICT specially adapted for the management or operation of medical equipment or devices
    • G16H40/60ICT specially adapted for the management or administration of healthcare resources or facilities; ICT specially adapted for the management or operation of medical equipment or devices for the operation of medical equipment or devices
    • G16H40/63ICT specially adapted for the management or administration of healthcare resources or facilities; ICT specially adapted for the management or operation of medical equipment or devices for the operation of medical equipment or devices for local operation
    • GPHYSICS
    • G16INFORMATION AND COMMUNICATION TECHNOLOGY [ICT] SPECIALLY ADAPTED FOR SPECIFIC APPLICATION FIELDS
    • G16HHEALTHCARE INFORMATICS, i.e. INFORMATION AND COMMUNICATION TECHNOLOGY [ICT] SPECIALLY ADAPTED FOR THE HANDLING OR PROCESSING OF MEDICAL OR HEALTHCARE DATA
    • G16H40/00ICT specially adapted for the management or administration of healthcare resources or facilities; ICT specially adapted for the management or operation of medical equipment or devices
    • G16H40/60ICT specially adapted for the management or administration of healthcare resources or facilities; ICT specially adapted for the management or operation of medical equipment or devices for the operation of medical equipment or devices
    • G16H40/67ICT specially adapted for the management or administration of healthcare resources or facilities; ICT specially adapted for the management or operation of medical equipment or devices for the operation of medical equipment or devices for remote operation
    • GPHYSICS
    • G16INFORMATION AND COMMUNICATION TECHNOLOGY [ICT] SPECIALLY ADAPTED FOR SPECIFIC APPLICATION FIELDS
    • G16HHEALTHCARE INFORMATICS, i.e. INFORMATION AND COMMUNICATION TECHNOLOGY [ICT] SPECIALLY ADAPTED FOR THE HANDLING OR PROCESSING OF MEDICAL OR HEALTHCARE DATA
    • G16H50/00ICT specially adapted for medical diagnosis, medical simulation or medical data mining; ICT specially adapted for detecting, monitoring or modelling epidemics or pandemics
    • G16H50/20ICT specially adapted for medical diagnosis, medical simulation or medical data mining; ICT specially adapted for detecting, monitoring or modelling epidemics or pandemics for computer-aided diagnosis, e.g. based on medical expert systems
    • GPHYSICS
    • G16INFORMATION AND COMMUNICATION TECHNOLOGY [ICT] SPECIALLY ADAPTED FOR SPECIFIC APPLICATION FIELDS
    • G16HHEALTHCARE INFORMATICS, i.e. INFORMATION AND COMMUNICATION TECHNOLOGY [ICT] SPECIALLY ADAPTED FOR THE HANDLING OR PROCESSING OF MEDICAL OR HEALTHCARE DATA
    • G16H50/00ICT specially adapted for medical diagnosis, medical simulation or medical data mining; ICT specially adapted for detecting, monitoring or modelling epidemics or pandemics
    • G16H50/30ICT specially adapted for medical diagnosis, medical simulation or medical data mining; ICT specially adapted for detecting, monitoring or modelling epidemics or pandemics for calculating health indices; for individual health risk assessment
    • GPHYSICS
    • G16INFORMATION AND COMMUNICATION TECHNOLOGY [ICT] SPECIALLY ADAPTED FOR SPECIFIC APPLICATION FIELDS
    • G16HHEALTHCARE INFORMATICS, i.e. INFORMATION AND COMMUNICATION TECHNOLOGY [ICT] SPECIALLY ADAPTED FOR THE HANDLING OR PROCESSING OF MEDICAL OR HEALTHCARE DATA
    • G16H50/00ICT specially adapted for medical diagnosis, medical simulation or medical data mining; ICT specially adapted for detecting, monitoring or modelling epidemics or pandemics
    • G16H50/70ICT specially adapted for medical diagnosis, medical simulation or medical data mining; ICT specially adapted for detecting, monitoring or modelling epidemics or pandemics for mining of medical data, e.g. analysing previous cases of other patients
    • GPHYSICS
    • G16INFORMATION AND COMMUNICATION TECHNOLOGY [ICT] SPECIALLY ADAPTED FOR SPECIFIC APPLICATION FIELDS
    • G16HHEALTHCARE INFORMATICS, i.e. INFORMATION AND COMMUNICATION TECHNOLOGY [ICT] SPECIALLY ADAPTED FOR THE HANDLING OR PROCESSING OF MEDICAL OR HEALTHCARE DATA
    • G16H70/00ICT specially adapted for the handling or processing of medical references
    • G16H70/60ICT specially adapted for the handling or processing of medical references relating to pathologies
    • AHUMAN NECESSITIES
    • A61MEDICAL OR VETERINARY SCIENCE; HYGIENE
    • A61BDIAGNOSIS; SURGERY; IDENTIFICATION
    • A61B2503/00Evaluating a particular growth phase or type of persons or animals
    • A61B2503/08Elderly
    • AHUMAN NECESSITIES
    • A61MEDICAL OR VETERINARY SCIENCE; HYGIENE
    • A61BDIAGNOSIS; SURGERY; IDENTIFICATION
    • A61B5/00Measuring for diagnostic purposes; Identification of persons
    • A61B5/02Detecting, measuring or recording pulse, heart rate, blood pressure or blood flow; Combined pulse/heart-rate/blood pressure determination; Evaluating a cardiovascular condition not otherwise provided for, e.g. using combinations of techniques provided for in this group with electrocardiography or electroauscultation; Heart catheters for measuring blood pressure
    • A61B5/024Detecting, measuring or recording pulse rate or heart rate
    • AHUMAN NECESSITIES
    • A61MEDICAL OR VETERINARY SCIENCE; HYGIENE
    • A61BDIAGNOSIS; SURGERY; IDENTIFICATION
    • A61B5/00Measuring for diagnostic purposes; Identification of persons
    • A61B5/08Detecting, measuring or recording devices for evaluating the respiratory organs
    • A61B5/0816Measuring devices for examining respiratory frequency
    • GPHYSICS
    • G01MEASURING; TESTING
    • G01RMEASURING ELECTRIC VARIABLES; MEASURING MAGNETIC VARIABLES
    • G01R15/00Details of measuring arrangements of the types provided for in groups G01R17/00 - G01R29/00, G01R33/00 - G01R33/26 or G01R35/00
    • G01R15/14Adaptations providing voltage or current isolation, e.g. for high-voltage or high-current networks
    • G01R15/18Adaptations providing voltage or current isolation, e.g. for high-voltage or high-current networks using inductive devices, e.g. transformers
    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06NCOMPUTING ARRANGEMENTS BASED ON SPECIFIC COMPUTATIONAL MODELS
    • G06N20/00Machine learning
    • G06N20/10Machine learning using kernel methods, e.g. support vector machines [SVM]
    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06NCOMPUTING ARRANGEMENTS BASED ON SPECIFIC COMPUTATIONAL MODELS
    • G06N20/00Machine learning
    • G06N20/20Ensemble learning
    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06NCOMPUTING ARRANGEMENTS BASED ON SPECIFIC COMPUTATIONAL MODELS
    • G06N3/00Computing arrangements based on biological models
    • G06N3/02Neural networks
    • G06N3/04Architecture, e.g. interconnection topology
    • G06N3/044Recurrent networks, e.g. Hopfield networks
    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06NCOMPUTING ARRANGEMENTS BASED ON SPECIFIC COMPUTATIONAL MODELS
    • G06N5/00Computing arrangements using knowledge-based models
    • G06N5/01Dynamic search techniques; Heuristics; Dynamic trees; Branch-and-bound
    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06NCOMPUTING ARRANGEMENTS BASED ON SPECIFIC COMPUTATIONAL MODELS
    • G06N7/00Computing arrangements based on specific mathematical models
    • G06N7/01Probabilistic graphical models, e.g. probabilistic networks
    • GPHYSICS
    • G16INFORMATION AND COMMUNICATION TECHNOLOGY [ICT] SPECIALLY ADAPTED FOR SPECIFIC APPLICATION FIELDS
    • G16HHEALTHCARE INFORMATICS, i.e. INFORMATION AND COMMUNICATION TECHNOLOGY [ICT] SPECIALLY ADAPTED FOR THE HANDLING OR PROCESSING OF MEDICAL OR HEALTHCARE DATA
    • G16H40/00ICT specially adapted for the management or administration of healthcare resources or facilities; ICT specially adapted for the management or operation of medical equipment or devices
    • G16H40/20ICT specially adapted for the management or administration of healthcare resources or facilities; ICT specially adapted for the management or operation of medical equipment or devices for the management or administration of healthcare resources or facilities, e.g. managing hospital staff or surgery rooms

Definitions

  • the present application relates generally to in-home and senior living facilities monitoring of elderly and patients with dementia or chronic diseases.
  • a system for labeling daily living activities for a patient comprises at least one processor and at least one memory including computer program code, the at least one memory and the computer program code configured to, with the at least one processor, cause the system at least to: receive at least one measurement from at least one sensor; determine an activity of the patient based on the received at least one measurement; and label the activity of the patient or elderly person based on the determined activity.
  • a system for determining an activity of a patient comprises at least one processor and at least one memory including computer program code, the at least one memory and the computer program code configured to, with the at least one processor, cause the system to: receive at least one measurement from at least one sensor; and determine the activity of the patient based on at least one of: (i) rule-based heuristics, (ii) training data from a home of one or more patients, and (iii) the received at least one measurement from the at least one sensor.
  • the activity includes at least one common activity of daily living and common instrumental activity of daily living.
  • a method comprises: receiving at least one measurement from at least one sensor; determining an activity of a patient based on the received at least one measurement; and generating alerts when trends deviate from a set of pre-defined thresholds.
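  • By way of non-limiting illustration, the following minimal Python sketch shows one way such a method could be realized: each measurement is mapped to an activity label, and an alert is generated when a rolling trend falls outside a set of pre-defined thresholds. The function names, measurement fields, and threshold values are hypothetical and are not taken from the disclosure.

        from collections import deque

        # Hypothetical pre-defined thresholds: acceptable average daily counts per activity.
        THRESHOLDS = {"toileting": (2, 12), "eating": (1, 6)}

        def determine_activity(measurement):
            """Toy stand-in for the disclosed activity-determination step."""
            if measurement.get("water_gpm", 0) > 0.5:
                return "toileting"
            if measurement.get("kitchen_watts", 0) > 800:
                return "eating"
            return "other"

        def monitor(stream, window_days=7):
            """Receive measurements, label activities, and alert on threshold deviations."""
            history = deque(maxlen=window_days)      # one dict of daily counts per day
            for day in stream:                       # each `day` is a list of measurements
                counts = {}
                for m in day:
                    activity = determine_activity(m)
                    counts[activity] = counts.get(activity, 0) + 1
                history.append(counts)
                for activity, (low, high) in THRESHOLDS.items():
                    avg = sum(d.get(activity, 0) for d in history) / len(history)
                    if not (low <= avg <= high):
                        yield f"ALERT: {activity} trend {avg:.1f}/day outside [{low}, {high}]"

        # Example use: for alert in monitor(daily_measurement_stream): notify_caregiver(alert)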
  • One advantage resides in an in-home and senior living facility remote monitoring system that operates without the use of microphones, camera visualization, or other invasive data gathering sources.
  • the invention may take form in various components and arrangements of components, and in various steps and arrangements of steps.
  • the drawings are only for purposes of illustrating the preferred embodiments and are not to be construed as limiting the invention.
  • FIG. 1 is a block diagram of an exemplary integrated and interoperable contactless (meaning not requiring direct contact with the monitored person) sensor technology system with embedded microprocessors and a gateway that communicates with a server where processing develops an informational and actionable dashboard.
  • FIG. 2 is a block diagram of an exemplary data engineering process that applies a set of distinct sorting and algorithmic models at each separate stage from raw data input to the output of behavioral activity labels and vital signs enabling actionable steps delivered through a system that enables intervention and informational system visualized on a dashboard.
  • FIG. 3 illustrates the information technology methodology applied towards a consumer and clinical layered process of dashboard accessibility to human subjects, doctors, nurses, family caregivers, insurers, medical device controllers, chatbots, telehealth, home-healthcare services, and so forth.
  • FIG. 4 illustrates six distinct combined processes of underlying sciences as related to the hardware, software, controller hubs, algorithms with edge computing and communication, dashboards and gateways.
  • FIG. 5 illustrates an example dwelling that may be equipped with the systems and methods described herein.
  • a hardware engineering system includes sensors 1, 1a, 2, 3, 3a, 4, and 5, and a home-hub product that leads to backend servers embedded with proprietary software systems.
  • the installation occurs in a customer home, which, in some examples, may be a single-family home, town house, apartment or suite, or senior or assisted living unit with a standalone electric circuit breaker management system (see, e.g., FIG. 5).
  • the installation can be performed by a certified electrician and completed in less than half a day.
  • the engineering architecture includes an electrical current transformer system, an ultrasound acoustic water system, a sleeping, respiratory and heart rate system, a mobility tracking system, a local microprocessor for edge computing, a load sensing controller system, a local area network Wi-Fi system, and a Wi-Fi and cellular communications gateway embedded in a hub which may include a microprocessor, a wide area network to enable distributed data management, remote servers, algorithms, and a dashboard data visualization system.
  • there are four core sensors used although more sensors and modalities may be employed.
  • the described four sensors are directly and indirectly controlled by a single home-hub which is comprised of a microprocessor and network gateway.
  • the technology applied in this application is contactless and passive and does not require visual cameras or any form of personal interaction with chatbots, wearables or medical devices on the part of the monitored individual.
  • sensor may refer to a single sensor, or to a grouping of sensors that are the same or different kind of sensors.
  • sensor 1 may refer to a single electrical sensor, or to multiple electrical sensors placed throughout a house.
  • sensor 5 may refer to a grouping of sensors that includes both an ultra-wide band radar unit as well as an accelerometer.
  • sensor 1 is an electrical sensor.
  • sensor 1 is a set of electro-magnetic current transformer rings that are located at the main circuit breaker panel in the home. These can be located either at specific circuit breakers that are being monitored or at the main electric lines.
  • the system has a set of algorithms that enable the identification of specific appliances, devices or lights based on the characteristics of the electric load.
  • Sensor 1 a is a subset of electro-magnetic current transformer rings that are located on the circuit breakers related to the kitchen and other rooms.
  • the circuit breaker load identification algorithms permit usage tracking at the unit level (e.g. microwave, refrigerator, kettle, dishwasher, electric cooktop).
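  • As a non-limiting sketch of unit-level load identification, the following Python snippet matches a step change in circuit-level power draw to the nearest known appliance signature. The wattage values and matching rule are assumptions chosen for illustration; the actual disaggregation algorithms are not described at this level of detail.

        # Hypothetical steady-state appliance signatures, in watts.
        SIGNATURES = {"microwave": 1100, "kettle": 1500, "refrigerator": 150,
                      "dishwasher": 1300, "electric cooktop": 2000}

        def identify_appliance(delta_watts, tolerance=0.15):
            """Match a step change in circuit-level load to the closest known appliance."""
            best, best_err = None, float("inf")
            for name, watts in SIGNATURES.items():
                err = abs(delta_watts - watts) / watts
                if err < best_err:
                    best, best_err = name, err
            return best if best_err <= tolerance else "unknown"

        print(identify_appliance(1080))   # -> "microwave"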
  • the sensor 1 a need not have a battery and can connect through a local (Wi-Fi) wireless network interface, which may be embedded in a separate hub.
  • Sensor 2 is a water sensor that may be located on a main water pipe or any branch of pipes, which can be either PVC material or various types of metal. Sensor 2 applies an ultrasound acoustic measuring methodology to ascertain the on/off usage of water, flow and quantity of clean water drawn down through the measured pipe or channel.
  • the described algorithms are designed to learn (e.g., via a processor that executes learning algorithms, classifiers, Bayesian nets, or the like) how to attribute on/off usage of water in certain pipes, water drawdowns, and related flow duration into toileting and bathing within a predetermined period (e.g., a day, a week, two weeks, a month, or some other predetermined water flow period).
  • Sensor 2 may additionally or alternatively be located on water pipes directly linked into a most-frequently-used bathroom.
  • the sensor 2 connects through a local (Wi-Fi) wireless network interface embedded in the hub and, in some embodiments, does not operate with batteries.
  • This noninvasive sensor technology does not necessitate cutting pipes or wires and does not rely on utility provider participation. That is, the described ultrasonic technology unit emits inaudible sound waves (i.e., ultrasound waves) that are used to measure water flow.
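  • For illustration only, the snippet below shows how individual water-draw events (volume, duration, and bathroom presence) could be attributed to bathing or toileting. The cut-off values echo the heuristic example given later in this description but are assumptions here, not disclosed parameters.

        def label_water_event(gallons, minutes, bathroom_presence):
            """Illustrative attribution of a single water-draw event."""
            if bathroom_presence and gallons >= 12 and minutes < 20:
                return "bathing"
            if bathroom_presence and gallons <= 4:
                return "toileting"
            return "other water use"

        events = [(15.0, 12, True), (1.6, 1, True), (8.0, 30, False)]
        print([label_water_event(*e) for e in events])
        # ['bathing', 'toileting', 'other water use']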
  • Sensor 3 is a three-signal sensor that is located between the mattress and frame of the bed (this is also illustrated in FIG. 3 ).
  • sensor 3 is an accelerometer.
  • the sensor 3 may include a camera(s); in some embodiments, the sensor 3 does not include any cameras.
  • the system may use a ballistocardiographic (BCG) method, which is a non-invasive method that is based on the measurement of the body motion generated by the ejection of blood at each cardiac cycle.
  • the system uses the information from sensor 3 to define periods of sleep as rapid eye movement (REM), deep, and light phases.
  • the sensor 3 may be connected locally to a power receptacle and, in some embodiments, does not operate on batteries.
  • Sensor 3 a collects data related to respiratory rate (RR) and heart rate (HR) measurements using a set of algorithms. Sensor 3 a measures the mechanical aspects of respiratory activity through chest wall movement, and of heart activity through movements produced by ejection of blood from the heart into the great vessels. Accordingly, the system can produce a set of vital signs on an interval basis or continuously. In some embodiments, the sensor does not require a battery but rather connects through a local microprocessor to the hub 20 . The sensors 3 , 3 a are able to detect whether a subject is on the bed and for what duration.
  • the described sensors convert mechanical force into proportional electrical energy based on a permanent electric charge inside the cellular structure of the sensor core.
  • the transducer behaves like an “active” capacitor; consequently, the loading of the signal by the input impedance of the measuring device must be considered.
  • the low mass contributed by the transducer is useful due to its non-resonant behavior. Frequency response is inherently flat to over 20 kHz, with only the R-C roll-off at low frequencies distorting the profile.
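  • As a simplified, non-limiting sketch of how a heart rate could be derived from a BCG-like force signal, the Python snippet below applies peak detection to a synthetic waveform. It assumes the NumPy and SciPy packages and is not the proprietary vital-sign algorithm described above.

        import numpy as np
        from scipy.signal import find_peaks

        fs = 100.0                                   # sampling rate in Hz (assumed)
        t = np.arange(0, 30, 1 / fs)                 # 30 seconds of synthetic data
        rng = np.random.default_rng(0)
        bcg = np.sin(2 * np.pi * 1.2 * t) + 0.1 * rng.standard_normal(t.size)  # ~72 bpm

        # Detect one dominant peak per cardiac cycle; a minimum spacing of 0.4 s
        # (a 150 bpm ceiling) avoids double-counting within a single beat.
        peaks, _ = find_peaks(bcg, distance=int(0.4 * fs), height=0.5)
        beat_intervals = np.diff(peaks) / fs         # seconds between detected beats
        print(round(60.0 / beat_intervals.mean()))   # approximately 72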
  • Sensor 4 is a three-dimensional (3D) mobility sensor.
  • this sensor uses ultrawideband radar to track the mobility of the human. It is a 3D sensor that emits ultrawideband radar waves and senses waves which are reflected from objects. The energy used can penetrate most building walls.
  • the sensor 4 is positioned to maximize its coverage area and capture relevant mobility. In some embodiments, it does not operate on batteries and connects through a local microprocessor to the hub 20 .
  • Sensor 5 combines two high-resolution sensors that use ultrawide band radar (UWR) with BCG to track mobility, presence and respiratory rate (RR) (e.g. sensor 5 may include an ultra-wideband radar unit as well as accelerometers for the BCG).
  • the UWR sensor part of sensor 5 uses radio frequency reflections to collect position, mobility, and gait information on a human in an indoor environment, even when the person may be in another room from where the device is located.
  • BCG is a technique for detecting repetitive motions of the human body arising from the sudden ejection of blood into the great vessels with each heartbeat.
  • Sensor 5 can also be configured to combine electromagnetic waves produced by Wi-Fi to assess movement created by waves among the wireless signals.
  • Sensor 5 is configured to combine low-pixel (e.g., under 16×16) passive infrared sensor arrays so the sensor is able to augment detection of mobility of a target human in situations where there are many occupants.
  • Local microprocessor 10 is programmed to combine and to ‘read’ the aggregated data emanating from sensor 3 and sensor 4 and pass it to the hub.
  • the microprocessor program subsequently uses the local area network Wi-Fi 26 to transmit the data to the home-hub.
  • the gateway 24 uses either a broadband internet or cellular SIM card router system to transmit data to the server 22; this functionality can be switched off remotely or automatically with a proprietary software program.
  • although Wi-Fi 26 is described, it should be understood that communication, in some embodiments, may occur via any communication medium (e.g., a Wi-Fi network, any type of LTE network, a Zigbee network, Bluetooth, a cellular network, and so forth).
  • Transmitter 12, which in some embodiments comprises a load identifier, pushes aggregated data from electrical waveform sensor 1 (for example, SENSE), or sensor 1a (for example, EMCB), and sensor 2 and sensor 3 through an inbuilt processor to the Wi-Fi 26 embedded in the hub.
  • the aggregated and disaggregated electrical load and the aggregated and disaggregated water flow are identified through a disaggregation method at the circuit breaker level, the main water pipe, and localized water pipes, respectively.
  • Transmitter 12 pushes Sensor 3 data related to sleeping, heart rate and respiratory rate to the hub gateway 24. Once data passes through the gateway to a remote server 32, it undergoes further machine learning to increase validation, classification and attribution.
  • Hub 20 is a hardware unit comprised of a microprocessor 22 and a communications gateway 24 .
  • the hub 20 integrates aggregated data from the sensors 1, 1a, 2, 3, 3a, 4, and 5 by reading the data at different time intervals and pushes the data via the gateway 24 to a remote server 32 through an external wide area network 28.
  • the microprocessor 22 of the hub 20 provides edge processing.
  • the hub 20 provides memory and a gateway interface comprised of long-term evolution for machines (LTE-M) broadband interface, WiFi and/or cellular networks to the remote server in the cloud 30 .
  • Wide Area Network 28 connects external cloud application programming interfaces (API) into proprietary Cloud 30 as well as transmits data directly from Gateway 24 to Cloud 30 .
  • Cloud 30 pushes the data into a dedicated server 32 located in a secure controlled storage environment and/or to an on-premises server.
  • the data is processed, transformed and loaded into structured and semi-structured data matrices in the database 34 and is made available for applied algorithms to create the output of, e.g., at least five daily living activities: food preparation/eating, sleeping, bathing, toileting, and mobility, and at least two vital signs such as heart rate and respiratory rate.
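  • The structured matrices mentioned above could take many forms; the following pandas sketch shows one hypothetical shape, with one row per day and one column per daily living activity. The column names and values are illustrative assumptions.

        import pandas as pd

        # Hypothetical labeled events as they might arrive from the processing pipeline.
        raw = pd.DataFrame({
            "date":     ["2020-08-01", "2020-08-01", "2020-08-01", "2020-08-02"],
            "activity": ["sleeping", "toileting", "eating", "toileting"],
            "value":    [7.5, 1, 1, 1],     # hours for sleeping, counts otherwise
        })

        # Transform and load into a structured matrix: one row per day,
        # one column per daily living activity.
        matrix = raw.pivot_table(index="date", columns="activity",
                                 values="value", aggfunc="sum", fill_value=0)
        print(matrix)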
  • Algorithms 36 reside either on the edge device such as controller or within Cloud 30 and are selected and applied distinctly for each machine learning step towards determining the plurality of activities with a predefined confidence level.
  • Dashboard 38 receives daily living activities data from the database 34 from the respective sensors, and the data is subsequently converted into separate layered visualizations for caregivers, family members, home care administrators, clinicians, and others.
  • the database sets up a secure data pipe to transmit data that feeds into the user experience (UX) architecture set up for iOS, Android and/or any other suitable Web-based dashboard systems.
  • FIG. 2 represents the algorithmic methods utilized to collect and process raw input data 210 that are read from the sensors 205 (see also sensors 1, 1a, 2, 3, 3a, 4, and 5 of FIG. 1).
  • the microprocessor 22 has built-in algorithms that dynamically change based on the human's daily living activities.
  • the code operates alongside selected algorithms to determine the structured and semi-structured data types that are required to develop the optimal output results.
  • Structured data are data whose elements are addressable for effective analysis. They include all data which can be stored in the database (e.g., AWS S3) in a table with rows and columns.
  • these algorithms are adjusted dynamically; for example, the models may be “tuned” or further trained.
  • Several standard machine learning methods, as well as non-standard models, are applied to develop time sequences 212, feature generation 215, activity classification 220, activity discovery 225, and human subject attribution 230 before the output 235 is developed that is related to routine activities 240, 250 and anomaly detections 232.
  • Visuals 245 , 255 may be created based on the activities 240 , 250 .
  • Also shown is a model database 60 from which models, algorithms, etc., are retrieved for performing various steps of the method of FIG. 2.
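  • As a schematic, non-limiting sketch, the snippet below chains placeholder functions in the order of the stages named in FIG. 2 (time sequences 212, feature generation 215, activity classification 220, activity discovery 225, human subject attribution 230, output 235). Each placeholder stands in for the actual models, which are not reproduced here.

        def build_time_sequences(raw):        # 212: order events into time series
            return sorted(raw, key=lambda e: e["timestamp"])

        def generate_features(sequences):     # 215: summarize windows into feature vectors
            return [{"n_events": len(sequences)}]

        def classify_activities(features):    # 220: label feature vectors (placeholder)
            return [{"label": "sleeping", **f} for f in features]

        def discover_activities(labeled):     # 225: find recurring unlabeled patterns
            return labeled

        def attribute_subject(activities):    # 230: assign activities to the monitored person
            return [{"subject": "resident", **a} for a in activities]

        def run_pipeline(raw_events):         # 235: output of the chained stages
            data = raw_events
            for stage in (build_time_sequences, generate_features, classify_activities,
                          discover_activities, attribute_subject):
                data = stage(data)
            return data

        print(run_pipeline([{"timestamp": 1}, {"timestamp": 0}]))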
  • FIG. 3 shows an embedded, layered system of information and visualization.
  • the administrator, healthcare provider, patient, elderly, home healthcare, or any individual or entity approved by the monitored individual or their surrogate is provided access to patterns of five daily living activities and two vital signs on the dashboard 310 ; and this information is available to authorized family caregivers as well.
  • the patient cannot tamper with any of the technologies, with the exception of the bed sensor (e.g., sensor 3 of FIG. 1), which gives the individual the option to switch the power source to the bed sensor on or off.
  • the dashboard's layered system provides insight into the daily routine of living activities, the history of specific activities and related anomalies, and the vital signs of heart rate (HR) and respiratory rate (RR) as well as vital sign histories.
  • An algorithm creates an optional layered alarm system of user-tailored alerts via the dashboard from low-threshold signals to caregivers and high-threshold alerts to healthcare professionals.
  • the individual can set thresholds based on personal behavioral experiences, or the system can set a default range which improves with time; however, the individual is not required to activate any alarms.
  • the dashboard can be integrated into family caregivers' iOS- and Android-based phone systems and web-based systems, and also into electronic health record systems, e.g., EPIC and CERNER-GENESIS.
  • the microcontroller located in the hub 20 has software to enable trouble-shooting readings within the hub 20 .
  • a Java-based software program is coded to read data from Sensor 4; it uses WebSocket for secure handshake protocols and then transmits data to the external server in JSON format. WebSockets are used to connect and send data and provide full-duplex communication with the server at speeds of up to twice those of REST APIs. Over-the-air (OTA) protocols are established to permit remote monitoring and updates.
  • sensor 4 data are read by the microcontroller 22 located in the hub 20 , and the sensor receives commands from WebSockets to transmit data to servers.
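  • The disclosure describes a Java program for this step; purely as an illustrative stand-in, the Python sketch below streams sensor 4 readings as JSON over a WebSocket using the third-party websockets package. The endpoint URI and message fields are placeholders.

        import asyncio
        import json
        import websockets   # third-party package; assumed available

        SERVER_URI = "wss://example.invalid/sensor4"   # placeholder endpoint

        async def push_readings(readings):
            """Open a WebSocket connection and send each reading as a JSON message."""
            async with websockets.connect(SERVER_URI) as ws:
                for reading in readings:
                    await ws.send(json.dumps(reading))

        sample = [{"sensor": 4, "x": 1.2, "y": 0.4, "z": 2.1, "ts": 1597000000}]
        # Example (requires a reachable server): asyncio.run(push_readings(sample))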
  • Sensors 1, 1a, and 2 have independent application programming interfaces (APIs), which permit “crawling”; this has been enabled at a frequency based on the described models.
  • a “crawler” is best described as a program that simulates the user's behavior on a website, following all the steps a user takes with the browser, such as entering search parameters (e.g., destination, date, etc.), requesting a result by clicking on the search button, and then scanning through the results.
  • a REST API server is established to receive crawled data to provide flexibility.
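  • A hedged sketch of the crawl-and-forward pattern is shown below using the requests package: a sensor API is polled and each reading is pushed to a REST ingestion endpoint. Both URLs and the polling period are placeholders, not values from the disclosure.

        import time
        import requests   # assumed available

        SENSOR_API = "https://example.invalid/sensor1/latest"    # placeholder vendor API
        REST_SINK  = "https://example.invalid/ingest"             # placeholder REST API server

        def crawl_once():
            """Fetch the latest reading from the sensor API and forward it to the REST sink."""
            reading = requests.get(SENSOR_API, timeout=10).json()
            requests.post(REST_SINK, json=reading, timeout=10).raise_for_status()

        def crawl_forever(period_seconds=300):
            """Poll at a model-driven frequency (a fixed placeholder period here)."""
            while True:
                crawl_once()
                time.sleep(period_seconds)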
  • the remote server uses AWS S3 and DynamoDB®, which is controlled by an administrative panel that permits users to create and download CSV files of structured and semi-structured data.
  • the home hub 20 is a container that co-locates a microprocessor 22 and communications gateway 24 .
  • the distinct feature of the hub 20 and its related microprocessor is the ability to integrate, control and enable the interoperability of different underlying sciences (electricity, water, BCG, infrared, WiFi and ultra-wideband radar) to produce time series histories of at least five common behavioral activities (eating, sleeping, bathing, toileting, walking) and two vital signs, i.e., heart rate and respiratory rate.
  • the software program controls the sequencing of each data input to generate a value-based structure to enable modeling.
  • the hub has an in-built gateway that can adopt a low-capacity or high-capacity data router using either Cellular, WiFi, NB-IoT or LTE-M; these routers can also be programmed to shut down so that the hub can be connected to third-party gateways.
  • Table 1 below identifies the programming languages that are associated with each hardware unit within a context for a specific purpose and the type of output and related value it provides.
  • Algorithms: Summarizing features are generated from raw sensor inputs. These features serve as inputs to classification algorithms which label characteristic patterns of features as certain daily behavioral activities (e.g. bathing, toileting, eating, etc.).
  • the first-level approach is one of applying logic-based heuristics (e.g., 12 gallons or more of water running in less than 20 minutes, on/off water usage in specific pipes related to bathing and toileting, combined with presence in the bathroom, with or without electricity use from the bathroom, suggests bathing). Heuristic results are combined with machine learning classification algorithm outputs. Classification approaches utilized include, but are not limited to, support vector machines, logistic regression, and random forest models.
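  • For illustration, the snippet below encodes the bathing heuristic paraphrased above and one assumed way of merging it with a classifier probability; the disclosure does not specify the exact combination rule, so the OR-style merge and probability cutoff are assumptions.

        def bathing_heuristic(gallons, minutes, bathroom_presence):
            """Logic-based first-level rule paraphrasing the example in the text."""
            return bathroom_presence and gallons >= 12 and minutes < 20

        def combine(heuristic_hit, classifier_prob, prob_cutoff=0.7):
            """Assumed merge: label 'bathing' if either source is confident."""
            return "bathing" if (heuristic_hit or classifier_prob >= prob_cutoff) else "not bathing"

        print(combine(bathing_heuristic(14, 15, True), classifier_prob=0.55))  # bathing
        print(combine(bathing_heuristic(3, 2, True), classifier_prob=0.2))     # not bathing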
  • Sensor windows: Because the streams of sensor data to be categorized are continually flowing, a method is needed to define a discrete series of contiguous sensor events for analysis.
  • a sliding window method can be employed in which an activity window S_i containing N sensor events is defined by sensor event i and the N-1 sensor firings preceding it.
  • Each activity window has an associated “feature vector” which contains the time of the first sensor event s_1, the time of the last sensor event s_i, and one element for each sensor in the home describing the number of times each respective sensor has fired during window S_i.
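  • The described feature vector can be sketched directly; the snippet below builds it for a window S_i of N events, recording the first and last event times and per-sensor firing counts. The event and sensor names are illustrative.

        def feature_vector(events, i, n, sensor_ids):
            """Feature vector for window S_i: sensor event i plus the N-1 firings
            preceding it; `events` is a list of (timestamp, sensor_id) tuples."""
            window = events[max(0, i - n + 1): i + 1]
            counts = {s: 0 for s in sensor_ids}
            for _, sensor in window:
                counts[sensor] += 1
            return {"t_first": window[0][0],
                    "t_last": window[-1][0],
                    **{f"count_{s}": c for s, c in counts.items()}}

        events = [(0, "kitchen"), (5, "kitchen"), (9, "bathroom"), (12, "bathroom")]
        print(feature_vector(events, i=3, n=3, sensor_ids=["kitchen", "bathroom"]))
        # {'t_first': 5, 't_last': 12, 'count_kitchen': 1, 'count_bathroom': 2}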
  • because a given window may encompass sensor firings from different functional areas of the home over different time intervals, the influence of more physically remote sensors may be discounted based on the mutual information method outlined by Krishnan and Cook.
  • a mutual information matrix describes the extent to which all possible pairs of sensors are activated simultaneously (i.e., adjacent sensors will be most closely related).
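  • As a hedged sketch of such a matrix, the snippet below computes pairwise mutual information between binary sensor-activation series using scikit-learn's mutual_info_score; it illustrates the idea that co-activating (adjacent) sensors score highest, without reproducing the exact Krishnan and Cook formulation.

        import numpy as np
        from sklearn.metrics import mutual_info_score

        def mutual_info_matrix(firings):
            """firings: dict of sensor_id -> binary series (1 if the sensor fired in a
            given time slot). Returns sensor ids and a symmetric MI matrix."""
            ids = list(firings)
            m = np.zeros((len(ids), len(ids)))
            for a, sa in enumerate(ids):
                for b, sb in enumerate(ids):
                    m[a, b] = mutual_info_score(firings[sa], firings[sb])
            return ids, m

        ids, m = mutual_info_matrix({
            "bath_door":  [1, 1, 0, 0, 1, 0],
            "bath_water": [1, 1, 0, 0, 1, 0],   # co-fires with the door sensor -> high MI
            "kitchen":    [0, 1, 1, 0, 0, 1],   # fires on a different schedule -> low MI
        })
        print(ids)
        print(m.round(2))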
  • deep learning models, specifically neural networks such as long short-term memory (LSTM) networks, are continually updated with time series inputs of labeled activities, mobility measures, discovered activity sequences, and vital signs. These models identify significant changes in these inputs over time, allowing the recognition of anomalous activity on the part of the monitored individual.
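  • The disclosed models are LSTM-based deep learning models; as a far simpler stand-in that conveys the anomaly-detection idea, the sketch below flags days whose activity count deviates strongly from a trailing window using a rolling z-score (NumPy assumed).

        import numpy as np

        def anomalous_days(daily_counts, window=14, z_cutoff=2.5):
            """Flag indices whose count deviates strongly from the trailing window.
            The rolling z-score is a stand-in for the LSTM models, not a claim
            about how they actually work."""
            flags = []
            for i in range(window, len(daily_counts)):
                past = np.asarray(daily_counts[i - window:i], dtype=float)
                mu, sigma = past.mean(), past.std() or 1.0
                if abs(daily_counts[i] - mu) / sigma > z_cutoff:
                    flags.append(i)
            return flags

        toilet_visits = [6, 7, 6, 5, 7, 6, 6, 7, 5, 6, 7, 6, 6, 5, 14]  # final day spikes
        print(anomalous_days(toilet_visits))   # [14] (index of the anomalous day)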
  • time-stamped ground truth data is collected at the time of installation based on activation of alternating current electrical devices and water fixtures, plus scripted human activities (which may involve the monitored individual or others).
  • a monitored individual may wear an accelerometer, gyroscope, and/or radio frequency identification tag to compile additional ground truth model training data (for a period of two weeks or less).
  • data from multiple monitored individuals in different homes may be compiled to further train/tune classification models for improved classification accuracy. This training period typically lasts fewer than 14 days.
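  • A non-limiting sketch of this training step is shown below using scikit-learn's RandomForestClassifier on hypothetical window feature vectors and ground-truth labels; the feature encoding and labels are assumptions for illustration.

        from sklearn.ensemble import RandomForestClassifier

        # Hypothetical per-window feature vectors (e.g., per-sensor firing counts)
        # and ground-truth labels gathered during the scripted installation period.
        X_train = [[3, 0, 1], [0, 4, 0], [1, 1, 5], [4, 0, 0], [0, 5, 1]]
        y_train = ["eating", "bathing", "sleeping", "eating", "bathing"]

        model = RandomForestClassifier(n_estimators=100, random_state=0)
        model.fit(X_train, y_train)

        # After the (typically under 14-day) training period, the tuned model labels new windows.
        print(model.predict([[0, 4, 1]]))   # likely ['bathing'] for this toy data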
  • additional ground truth data is gathered through periodic interaction with monitored individuals.
  • FIG. 3 illustrates the methodology applied towards a layered process of dashboard accessibility to human subjects, doctors, nurses, caregivers, insurers, medical device controllers, chatbots, telehealth, home-healthcare services, and so forth.
  • the dashboard 290 is programmed to be visually available and integrated into the clinical workflow process at nurse stations, EPIC, CERNER-GENESIS, and other electronic health record systems and on mobile phone applications or web-based portals.
  • the system has an optional inbuilt crisis alert mechanism that is triggered into a red-yellow-green signal to caregivers and clinical managers. This is dynamically programmed based on individual input into the settings of the dashboard.
  • An example is a monitored individual who does not eat for an entire day or uses the toilet several more times than usual in a day, thus triggering a yellow alarm. Red would correspond to a scenario where contact needs to be made with the patient or elderly person immediately.
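  • Purely as an illustration, the snippet below evaluates a red-yellow-green signal echoing the examples just given; the red condition (prolonged total inactivity) and the numeric margins are assumptions, since the actual settings are user-configurable.

        def crisis_signal(meals_today, toilet_today, toilet_usual, hours_since_any_activity):
            """Illustrative red-yellow-green evaluation; thresholds are assumed."""
            if hours_since_any_activity >= 24:
                return "red"      # immediate contact needed
            if meals_today == 0 or toilet_today > toilet_usual + 3:
                return "yellow"   # notify caregiver
            return "green"

        print(crisis_signal(meals_today=0, toilet_today=6, toilet_usual=5,
                            hours_since_any_activity=10))   # yellow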
  • the data available on the dashboard 290 has several available time horizons depending upon the analysis required. For example, it is possible to seek the history and timings of toilet use over the past 30 or 60 days.
  • the dashboard architecture comprises a plurality of sensors 300 (e.g., the sensors 1 - 5 of FIG. 1 ) that provide collected information to an edge computing device or module 302 (e.g., a processor) on premises (i.e., at the location, home, etc., of the monitored patient).
  • the edge computing device transmits the collected information to the cloud 30 ( FIG. 1 ).
  • the transmitted information is received by a server 32 ( FIG. 1 ) via the cloud 30 , and the information is stored in a database 34 ( FIG. 1 ).
  • Algorithms 306 (discussed below herein and above with regard to reference numeral 36 of FIG. 1) for processing the information are executed on the information, and the processed information is stored in a second database 308.
  • the processed information is provided to an API 310 for additional processing (discussed below herein).
  • a summary screen 312 is generated by the API and presented to a user (e.g., a monitoring technician, caregiver or physician) for viewing on a computing device (e.g., a computer, a mobile device or smartphone, etc.).
  • the Summary screen comprises a plurality of selectable features (e.g., clickable icons or the like) including but not limited to an activities feature 314 , which when selected causes a list of selectable activities to be presented (e.g., via a drop down or pop up menu or the like).
  • the selectable features include, without being limited to: bathing 316, mobility 318, sleeping 320, eating 322, and toileting 324.
  • also included is a vital signs feature 326, which upon selection causes a plurality of selectable vital sign features (e.g., icons or the like) to be presented to the user via a drop-down or pop-up menu or the like.
  • the additional vital sign features include, without being limited to, heart rate 328, respiratory rate 330, etc.
  • the dashboard architecture further comprises a details screen 332 that is presented to the user upon selection of a particular feature on the summary screen. In the illustrated example, the sleeping activity 320 has been selected from the summary screen and is presented in greater detail on the detail screen.
  • the detail screen also provides a time range selector 336 via which a user can select a time range for viewing data associated with the selected activity or vital. Once the time range is selected, a history of activity 338 is presented to the user.
  • Also provided is a personal thresholds alarm configuration tab or icon 340 via which the user can personalize alarm thresholds for activities or vitals.
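  • One hypothetical shape for the payloads behind the summary and details screens is sketched below as JSON; the field names are illustrative assumptions and are not defined in the disclosure.

        import json

        # Hypothetical payloads the dashboard API 310 might return.
        summary = {
            "activities": ["bathing", "mobility", "sleeping", "eating", "toileting"],
            "vital_signs": ["heart_rate", "respiratory_rate"],
        }

        details = {
            "feature": "sleeping",
            "time_range_days": 30,
            "history": [{"date": "2020-08-01", "hours": 7.5},
                        {"date": "2020-08-02", "hours": 6.8}],
            "personal_thresholds": {"min_hours": 5, "max_hours": 10},
        }

        print(json.dumps({"summary": summary, "details": details}, indent=2))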
  • Detecting declines in functional or cognitive status, and early indicators of chronic disease exacerbation in the home setting can provide the opportunity to intervene earlier to prevent accidents, complications, and more severe exacerbations.
  • Such intervention has the potential to reduce patient morbidity and risk of mortality, decrease emergency department (ED) and hospital utilization, and reduce system cost.
  • the ability for caregivers and loved ones to monitor the functional status and safety of patients over time offers greater opportunity for seniors to age at home, delaying institutionalization.
  • the systems and methods disclosed here can also permit monitoring of adverse therapeutic drug effects and functioning of implantable medical devices such as heart pacemakers and neural stimulators through auxiliary sensors.
  • This sensor configuration is enabled by the integration and interoperability of a set of collaborative sensors.
  • the ability to unobtrusively and passively monitor chronic disease patients in the home offers the potential for earlier identification of exacerbations or decline beyond clinically important thresholds with less reliance on patient or family history and adherence, and less patient burden.
  • tracking changes in daily activities of the elderly and those with some degree of cognitive impairment can allow loved ones and providers to monitor the overall well-being of a patient and to identify areas where the individual is having difficulty safely performing the activities necessary for independent living.
  • FIG. 4 illustrates a process flow 400 showing six distinct processes related to the hardware, software, controller hubs, algorithms, dashboards and gateways.
  • Exemplary hardware is based on a set of underlying sciences (electricity, ultrasound water, ballistocardiography (BCG), and infrared and reflected electromagnetic waves) that have been proprietarily selected for an optimum level of adoptability, scalability and validity.
  • electrical usage and load disaggregation research has validated energy efficiency models through electrical non-intrusive load monitoring of residential buildings (Berges, Goldman, Matthews, Soibelman, & Anderson, 2011; Zoha, Gluhak, Imran, & Rajasegarar, 2012).
  • Non-intrusive electric load is monitored through electro-magnetic waves enabled with Rogowski coils that are embedded in current transformer sensors and located in the circuit breaker board.
  • Wi-Fi systems are used to assess movement created by waves among the wireless signals (C.-Y Hsu, R. Hristov, G.-H Lee, M. Zhao and D. Katabi 2019 “Enabling Identification and Behavioral Sensing in Homes using Radio Reflections” and in some cases the use of infrared sensors with pixel levels below 16 ⁇ 16 (Wei-Han Chen and Hsi-Pin Ma, 2015 “A fall detection system based on infrared array sensors with tracking capability for the elderly at home,”
  • HR heart rate
  • RR respiratory rate
  • BCG ballistocardiography method
  • Ballistocardiography measures movements linked with cardiac contraction & ejection of blood and with the deceleration of blood flow through blood vessels (Pinheiro, Postolache, & Gir ⁇ o, 2010). Chest wall and gross movement is also detected through this method.
  • the controller hub 20 combines hardware and software to enable distinct capabilities and edge processing in various implementations. This is a distinct home hub that integrates and enables the interoperability of the sensors.
  • the quad core microprocessor 22 may combine a single wide coverage WiFi low power chip with connectivity, allowing the system to reliably transmit data. It has multiple power modes and provides dynamic power scaling. It integrates an antenna switch, radio frequency, power amplifier, filters and other power management modules.
  • the hub 20 is embedded with a communications gateway 24 that transmits data to an external server 32 . Data is read from the IoT sensors on a proprietarily defined time interval-based system that relies on degree of relevance, which changes dynamically.
  • each sensor differs by sensor based upon relevance and human subject conditions; thus, it can stream data in batches at intervals ranging from seconds to 12-hours; this rate can vary within its sub-components.
  • the hub has the capability of beaming raw sensor data to the remote server 32 through the communications gateway 24 or processing the raw sensor data locally (at the “edge”) to transmit data which has been transformed for input into algorithms. Whether edge processing occurs may change dynamically based on human subject health.
  • the controller hub is embedded with a cellular transmission system, a local area network WiFi capability and a microprocessor, which has a set of instructions programmed to dynamically manage human data based on a set of health rules.
  • the hub 20 provides processing, memory, WiFi, and a gateway interface comprised of long-term evolution for machines (LTE-M) broadband interface or 3G cellular networks to the remote server in the cloud.
  • LTE-M long-term evolution for machines
  • 3G cellular networks 3G cellular networks
  • LTE-M long-term evolution for machines
  • NB-IoT network system which is a low power wide area technology that enables the sensor to improve power consumption and spectrum efficiency.
  • This new physical layer signal can provide extended coverage into deep indoors environments with ultra-low device complexity.
  • the LTE-M optimally addresses the low-powered sensors being used.
  • the application combines standards of IEEE, that include WIFI®, ZigBee, Z WAVE®, BLUETOOTH®, local area network (LAN) including using Ethernet, cellular networks and wide area networks (WAN). All data packets have unique encrypted security codes that aims to protect human subjects' data.
  • the software code creates a distinct set of processes that enables the hub platform to operationalize an interoperable sensor system across water, electricity and human body actions.
  • Programming languages and formats used include, but are not limited to, Java, Java Script, C++, Python. These communication protocols are programmed in Java Script and no user interface is permitted with the proprietary hub.
  • the hub 20 is encrypted and HIPAA and FCC compliant and does not provide raw sensor data to any external entity or individual.
  • the remote server 32 is located in the cloud 30 and uses Node.js as an open source cross-platform to execute its Java Script programming functions.
  • a set of distinct firmware Java code is used to integrate individual sensors to related microprocessors.
  • a back-end data engineering program function to receive, validate; organize, and store data is conducted in the server to transform the raw-sensor data into structured and semi-structured formats (data elements) and sent to the database 34 where it can be accessed by registered internal individuals.
  • the database 34 downloads are made available to authorized desktops for data engineering and data science processes.
  • the database 34 is an AWS S3.
  • the AWS S3 is a document database with the scalability and flexibility that permits querying and indexing.
  • the platform has strict data encapsulation, meaning there are several layers built in that enforce limited access to data.
  • Data output for visualization is located in AWS and is exported through an API connect. All external access is mediated through our application programming interface (API), where we have implemented security and audit checks to authenticate access to data.
  • API application programming interface
  • raw sensor inputs feed a sequential pattern mining algorithm which recognizes, but does not label, similar sensor sequences.
  • Labeled activities and stereotypical unlabeled sequences are probabilistically attributed to individuals in a multi-occupant dwelling based on location in the dwelling, body habitus, and/or gait characteristics.
  • Neural network specifically Long Short-Term Networks
  • deep learning models are continually updated with time series inputs of labeled activities, mobility measures, discovered activities sequences, and vital signs. These models identify significant changes in these inputs over time, allowing the recognition of anomalous activity on the part of the monitored individual.
  • FIG. 5 illustrates an example dwelling that may be equipped with the systems and methods described herein.
  • Electrical sensors 1 , 1 a are in each of the main rooms of the dwelling, including a living room, a bedroom, a kitchen, a dining room, and a bathroom.
  • a water usage sensor 2 is shown in the kitchen or wherever the water lines may be, as well as in the bathroom.
  • a seep sensor 3 and vital signs sensor 3 a is provided in the bedroom.
  • Mobility sensors 4 are positioned in the living room and dining room, although other rooms can be equipped therewith.
  • a vital sign sensor 5 for respiratory rate is also located in the living room. It will be understood that that any desired room may be equipped with any desired sensor configuration.

Abstract

The present application relates generally to a monitoring system to assist with aging-in-place for elderly individuals and patients with chronic diseases living at home or in senior living or assisted living facilities. In one aspect, the system integrates machine learning with signals from an interoperable system of electricity usage, water usage, ballistocardiography (BCG), ultra-wideband radar, WiFi-based computer visioning, and infrared sensing. The systems and methods may be used to identify and track common daily human activities, instrumental activities of daily living, and the physical status of the patient or elderly person in the home or at senior and assisted living facilities. Anomalies can be determined by identifying disruptions in previously established patterns.

Description

    CROSS-REFERENCE TO RELATED APPLICATION
  • Under 35 U.S.C. § 119, this application claims priority to, and the benefit of, U.S. provisional patent application No. 62/889,306, entitled “REMOTE MONITORING SYSTEMS AND METHODS FOR ELDERLY & PATIENT IN-HOME CARE”, and filed on Aug. 20, 2019, the entirety of which is hereby incorporated by reference.
  • BACKGROUND
  • The present application relates generally to in-home and senior living facilities monitoring of elderly and patients with dementia or chronic diseases.
  • Many elderly human subjects are aging at home alone or at senior living facilities, and in most situations suffer from dementia and/or chronic diseases. These chronic health conditions and conditions of aging require constant remote patient monitoring (RPM). There are several limitations to the prevailing knowledge and technologies that assist with RPM. For example, clinicians have limited insight into human subject behavior leading up to an acute event or indicating gradual decline, and must rely on clinical or caregiver interaction for data. Technologies that monitor mobility or falls are confronted by poor adherence, difficulty identifying a specific individual, and limited coverage range. Additionally, most technologies do not comprehensively encompass the combined core daily activities of a human subject (such as eating, sleeping, bathing, toileting and mobility in the home) and are faced with weak interoperability of sensor systems. The need to monitor a subject's daily core behavioral activities and vital signs is central to chronic disease management and safe aging-in-place. Most of the prevailing methods use fixed visiting nursing program schedules, telehealth, wearable technologies and family caregiver insights, all of which have limitations.
  • The following provides new and improved systems, technologies and methods which overcome the above-referenced problems and others.
  • BRIEF DESCRIPTION
  • According to one aspect, a system for labeling daily living activities for a patient comprises at least one processor and at least one memory including computer program code, the at least one memory and the computer program code configured to, with the at least one processor, cause the system at least to: receive at least one measurement from at least one sensor; determine an activity of the patient based on the received at least one measurement; and label an activity of the patient or elderly based on the determined activity.
  • According to another aspect, a system for determining an activity of a patient comprises at least one processor and at least one memory including computer program code, the at least one memory and the computer program code configured to, with the at least one processor, cause the system to: receive at least one measurement from at least one sensor; and determine the activity of the patient based on at least one of: (i) rule-based heuristics, (ii) training data from a home of one or more patients, and (iii) the received at least one measurement from the at least one sensor. The activity includes at least one common activity of daily living and common instrumental activity of daily living.
  • According to another aspect, a method comprises: receiving at least one measurement from at least one sensor; determining an activity of a patient based on the received at least one measurement; and generating alerts when trends deviate from a set of pre-defined thresholds.
  • One advantage resides in an in-home and senior living facility remote monitoring system that operates without the use of microphones, camera visualization, or other invasive data gathering sources.
  • Other advantages will become apparent to one of ordinary skill in the art upon reading and understanding this disclosure. It is to be understood that a specific embodiment may attain none, one, two, more, or all of these advantages.
  • The invention may take form in various components and arrangements of components, and in various steps and arrangements of steps. The drawings are only for purposes of illustrating the preferred embodiments and are not to be construed as limiting the invention.
  • BRIEF DESCRIPTION OF THE DRAWINGS
  • FIG. 1 is a block diagram of an exemplary integrated and interoperable contactless (meaning not requiring direct contact with the monitored person) sensor technology system with embedded microprocessors and a gateway that communicates with a server where processing develops an informational and actionable dashboard.
  • FIG. 2 is a block diagram of an exemplary data engineering process that applies a set of distinct sorting and algorithmic models at each separate stage, from raw data input to the output of behavioral activity labels and vital signs, enabling actionable steps delivered through an intervention and informational system visualized on a dashboard.
  • FIG. 3 illustrates the information technology methodology applied towards a consumer and clinical layered process of dashboard accessibility to human subjects, doctors, nurses, family caregivers, insurers, medical device controllers, chatbots, telehealth, home-healthcare services, and so forth.
  • FIG. 4 illustrates six distinct combined processes of underlying sciences as related to the hardware, software, controller hubs, algorithms with edge computing and communication, dashboards and gateways.
  • FIG. 5 illustrates an example dwelling that may be equipped with the systems and methods described herein.
  • DETAILED DESCRIPTION
  • Engineering Architecture: The following illustrates an example implementation of the systems and methods described herein.
  • With reference to the example of FIG. 1, a hardware engineering system includes sensors 1, 1 a, 2, 3, 3 a, 4, and 5, and a home-hub product that leads to backend servers embedded with proprietary software systems. The installation occurs in a customer home, which, in some examples, may be a single family home, town house, apartment or suite, or senior or assisted living unit with a standalone electric circuit breaker management system (see, e.g., FIG. 5). The installation can be performed by a certified electrician and completed in less than half a day. The engineering architecture includes an electrical current transformer system, an ultrasound acoustic water system, a sleeping, respiratory and heart rate system, a mobility tracking system, a local microprocessor for edge computing, a load sensing controller system, a local area network Wi-Fi system, and a WiFi and cellular communications gateway embedded in a hub which may include a microprocessor, a wide area network to enable distributed data management, remote servers, algorithms, and a dashboard data visualization system. In one embodiment, there are four core sensors used, although more sensors and modalities may be employed. The described four sensors are directly and indirectly controlled by a single home-hub which comprises a microprocessor and network gateway. The technology applied in this application is contactless and passive and does not require visual cameras or any form of personal interaction with chatbots, wearables or medical devices on the part of the monitored individual.
  • The following describes sensors used in the systems and methods of the present application. It should be understood that the term “sensor” may refer to a single sensor, or to a grouping of sensors that are the same or different kind of sensors. For example, sensor 1 may refer to a single electrical sensor, or to multiple electrical sensors placed throughout a house. In another example, sensor 5 may refer to a grouping of sensors that includes both an ultra-wide band radar unit as well as an accelerometer.
  • In one embodiment, sensor 1 is an electrical sensor. In one implementation, sensor 1 is a set of electro-magnetic current transformer rings that are located on the main circuit board in the home. These can be located either at specific circuit breakers that are being monitored or at the main electric lines. The system has a set of algorithms that enable the identification of specific appliances, devices or lights based on the characteristics of the electric load.
  • For instance, sensor 1 a may be a subset of electro-magnetic current transformer rings that are located on the circuit breakers related to the kitchen and other rooms. The circuit breaker load identification algorithms permit usage tracking at the unit level (e.g. microwave, refrigerator, kettle, dishwasher, electric cooktop). The sensor 1 a need not have a battery and can connect through a local (Wi-Fi) wireless network interface, which may be embedded in a separate hub. These high-resolution technologies measure the use of specific electrical devices (e.g. dishwasher, refrigeration, plug/outlet loads) from a single point in the home through an electric load disaggregation model.
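  • As a simplified illustration of circuit-level load identification (not the proprietary disaggregation model), the sketch below matches step changes in per-circuit power to a small table of appliance signatures; the signature values and thresholds are hypothetical.

```python
# Minimal illustrative sketch of circuit-level load identification.  The
# appliance power signatures and thresholds below are hypothetical, not values
# taken from the disclosed system.

APPLIANCE_SIGNATURES_W = {      # nominal steady-state draw, in watts
    "kettle": 1500,
    "microwave": 1100,
    "refrigerator_compressor": 150,
    "dishwasher_heater": 1200,
}

def identify_appliance(power_step_w, tolerance=0.15):
    """Match a step change in circuit power to the closest known signature."""
    best, best_err = None, tolerance
    for name, nominal in APPLIANCE_SIGNATURES_W.items():
        err = abs(power_step_w - nominal) / nominal
        if err < best_err:
            best, best_err = name, err
    return best

def detect_events(samples_w, min_step_w=100.0):
    """Yield (index, 'on'/'off', appliance) for each matched power step."""
    for i in range(1, len(samples_w)):
        step = samples_w[i] - samples_w[i - 1]
        if abs(step) >= min_step_w:
            label = identify_appliance(abs(step))
            if label:
                yield i, ("on" if step > 0 else "off"), label

# Example: a kitchen-circuit trace in which a kettle switches on, then off.
trace = [120, 118, 1630, 1625, 1628, 121, 119]
print(list(detect_events(trace)))   # [(2, 'on', 'kettle'), (5, 'off', 'kettle')]
```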
  • Sensor 2 is a water sensor that may be located on a main water pipe or any branch of pipes, which can be either PVC material or various types of metal. Sensor 2 applies an ultrasound acoustic measuring methodology to ascertain the on/off usage of water, flow and quantity of clean water drawn down through the measured pipe or channel. The described algorithms are designed to learn (e.g., via a processor that executes learning algorithms, classifiers, Bayesian nets, or the like) how to attribute on/off usage of water in certain pipes, water drawdowns, and related flow duration into toileting and bathing within a predetermined period (e.g., a day, a week, two weeks, a month, or some other predetermined water flow period). Sensor 2 may additionally or alternatively be located on water pipes directly linked into a most-frequently-used bathroom. The sensor 2 connects through a local (Wi-Fi) wireless network interface embedded in the hub and, in some embodiments, does not operate with batteries. This noninvasive sensor technology does not necessitate cutting pipes or wires and does not rely on utility provider participation. That is, the described ultrasonic technology unit emits inaudible sound waves (i.e., ultrasound waves) that are used to measure water flow.
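  • The following is a minimal sketch of how on/off water-draw events might be mapped to toileting and bathing labels, assuming per-event volume and duration are available from the ultrasonic flow measurements; the threshold values are hypothetical placeholders, not values from the present disclosure.

```python
# Illustrative sketch (not the disclosed algorithm): classify a clean-water draw
# event on a bathroom branch pipe as toileting or bathing from its volume and
# duration.  The threshold values are hypothetical placeholders.

from dataclasses import dataclass

@dataclass
class WaterEvent:
    gallons: float        # total volume drawn during the on/off event
    duration_min: float   # on-to-off duration in minutes

def label_water_event(event):
    if event.gallons <= 3.0 and event.duration_min <= 2.0:
        return "toileting"   # short, small fixed-volume draw (tank refill)
    if event.gallons >= 12.0 and event.duration_min <= 20.0:
        return "bathing"     # sustained high-volume draw, e.g. a shower
    return "other"           # dishwashing, laundry, leaks, etc.

print(label_water_event(WaterEvent(gallons=1.6, duration_min=0.8)))    # toileting
print(label_water_event(WaterEvent(gallons=18.0, duration_min=9.0)))   # bathing
```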
  • Sensor 3 is a three-signal sensor that is located between the mattress and frame of the bed (this is also illustrated in FIG. 3). In one implementation, sensor 3 is an accelerometer. In some embodiments, the sensor 3 may include a camera(s); in some embodiments, the sensor 3 does not include any cameras. The system may use a ballistocardiographic (BCG) method, which is a non-invasive method based on the measurement of the body motion generated by the ejection of blood at each cardiac cycle. The system uses the information from sensor 3 to define periods of sleep as rapid eye movement (REM), deep, and light phases. Only one sensor is required (although more may be used), and it can be located in proximity to the monitored human even if the bed has two occupants. The sensor 3 may be connected locally to a power receptacle and, in some embodiments, does not operate on batteries.
  • Sensor 3 a collects data related to respiratory rate (RR) and heart rate (HR) measurements using a set of algorithms. Sensor 3 a measures the mechanical aspects of respiratory activity through chest wall movement, and of heart activity through movements produced by ejection of blood from the heart into the great vessels. Accordingly, the system can produce a set of vital signs on an interval basis or continuously. In some embodiments, the sensor does not require a battery but rather connects through a local microprocessor to the hub 20. The sensors 3, 3 a are able to detect whether a subject is on the bed and for what duration.
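  • A simplified, hypothetical sketch of HR/RR extraction from a BCG-type signal is shown below: band-pass filtering followed by peak counting. It is not the proprietary algorithm; the sample rate, cut-off frequencies and minimum peak spacing are illustrative assumptions.

```python
# Simplified, hypothetical sketch of extracting heart rate (HR) and respiratory
# rate (RR) from a raw BCG/accelerometer trace by band-pass filtering and peak
# counting.  This is not the proprietary algorithm; the sample rate, cut-off
# frequencies and minimum peak spacing are illustrative assumptions.

import numpy as np
from scipy.signal import butter, filtfilt, find_peaks

def rate_bpm(signal, fs, low_hz, high_hz, min_gap_s):
    """Band-pass the signal, count peaks, and return events per minute."""
    b, a = butter(2, [low_hz, high_hz], btype="band", fs=fs)
    filtered = filtfilt(b, a, signal)
    peaks, _ = find_peaks(filtered, distance=int(min_gap_s * fs))
    return 60.0 * len(peaks) / (len(signal) / fs)

fs = 50.0                                     # sample rate, Hz
t = np.arange(0, 60, 1 / fs)                  # one minute of synthetic data
bcg = (0.5 * np.sin(2 * np.pi * 1.2 * t)      # ~72 bpm cardiac component
       + 2.0 * np.sin(2 * np.pi * 0.25 * t)   # ~15 breaths/min respiratory component
       + 0.1 * np.random.randn(t.size))       # measurement noise

print("HR ~", round(rate_bpm(bcg, fs, 0.8, 2.5, min_gap_s=0.4)))   # approximately 72
print("RR ~", round(rate_bpm(bcg, fs, 0.1, 0.5, min_gap_s=2.0)))   # approximately 15
```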
  • In contrast to the crystalline structure of piezo sensors, the described sensors convert mechanical force into proportional electrical energy based on a permanent electric charge inside the cellular structure of the sensor core. The transducer behaves like an “active” capacitor; consequently, the loading of the signal by the input impedance of the measuring device must be considered. The low mass contributed by the transducer is useful due to its non-resonant behavior. Frequency response is inherently flat to over 20 kHz, with only the R-C roll-off at low frequencies distorting the profile.
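  • As an illustrative numerical example (assumed values, not taken from the present disclosure), modeling the transducer as an active capacitor of capacitance C loaded by the measuring device's input resistance R gives a first-order high-pass corner at f_c = 1/(2πRC). For instance, C ≈ 1 nF into R ≈ 1 GΩ yields f_c ≈ 0.16 Hz, which is low enough to preserve the roughly 1 Hz cardiac and 0.2-0.4 Hz respiratory components of the BCG signal; this is why the loading by the input impedance of the measuring device must be considered.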
  • Sensor 4 is a three-dimensional (3D) mobility sensor. In one embodiment, this sensor uses ultrawideband radar to track the mobility of the human. It is a 3D sensor that emits ultrawideband radar waves and senses waves which are reflected from objects. The energy used can penetrate most building walls. The sensor 4 is positioned to maximize its coverage area and capture relevant mobility. In some embodiments, it does not operate on batteries and connects through a local microprocessor to the hub 20.
  • Sensor 5 combines two high-resolution sensors that use ultrawide band radar (UWR) with BCG to track mobility, presence and respiratory rate (RR) (e.g. sensor 5 may include an ultra-wideband radar unit as well as accelerometers for the BCG). The UWR sensor part of sensor 5 uses radio frequency reflections to collect position, mobility, and gait information on a human in an indoor environment, even when the person may be in another room from where the device is located. BCG is a technique for detecting repetitive motions of the human body arising from the sudden ejection of blood into the great vessels with each heartbeat.
  • Sensor 5 can also be configured to combine electromagnetic waves produced by Wi-Fi to assess movement created by waves among the wireless signals. In another embodiment, sensor 5 is configured to combine low pixel (e.g., under 16×16) passive infrared sensor arrays so the sensor is able to augment detection of mobility of a target human in situations where there are many occupants.
  • Local microprocessor 10 is programmed to combine and to ‘read’ the aggregated data emanating from sensor 3 and sensor 4 to the hub. The microprocessor program subsequently uses the local area network Wi-Fi 26 embedded in the hub to transmit the data to the home-hub. In some embodiments, the gateway 24 uses either a broadband internet or cellular sim card router system to transmit data to the server 32; this functionality can be switched off remotely or automatically with a proprietary software program. Although the example of FIG. 1 shows Wi-Fi 26, it should be understood that communication, in some embodiments, may occur via any communication medium (e.g., a Wi-Fi network, any type of LTE network, a Zigbee network, Bluetooth, a cellular network and so forth).
  • Transmitter 12, which in some embodiments comprises a load identifier, pushes aggregated data from electrical waveform sensor 1 (example, SENSE), or sensor 1 a (example, EMCB), and sensor 2 and sensor 3 through an inbuilt processor to the Wi-Fi 26 embedded in the hub. The aggregated and disaggregated electrical load and the aggregated and disaggregated water flow are identified through a disaggregation method at the circuit breaker level and at the main and localized water pipes, respectively. Transmitter 12 pushes sensor 3 data related to sleeping, heart rate and respiratory rate to the hub gateway 24. Once data passes through the gateway to a remote server 32, it undergoes further machine learning to increase validation, classification and attribution.
  • Hub 20 is a hardware unit comprised of a microprocessor 22 and a communications gateway 24. The hub 20 integrates aggregated data from the sensors 1, 1 a, 2, 3, 3 a, 4, and 5 by reading the data at different time intervals and pushes the data via the gateway 24 to a remote server 32 through an external wide area network 28. The hub microprocessor 22 provides edge processing. The hub 20 provides memory and a gateway interface comprising a long-term evolution for machines (LTE-M) broadband interface, WiFi and/or cellular networks to the remote server in the cloud 30.
  • Wide Area Network 28 connects external cloud application programming interfaces (API) into proprietary Cloud 30 as well as transmits data directly from Gateway 24 to Cloud 30.
  • Cloud 30 pushes the data into a dedicated server 32 located in a secure controlled storage environment and/or to an on-premises server. The data is processed, transformed and loaded into structured and semi-structured data matrices in the database 34 and is made available for applied algorithms to create the output of, e.g., at least five daily living activities: food preparation/eating, sleeping, bathing, toileting, and mobility, and at least two vital signs such as heart rate and respiratory rate.
  • Algorithms 36 reside either on the edge device such as controller or within Cloud 30 and are selected and applied distinctly for each machine learning step towards determining the plurality of activities with a predefined confidence level.
  • Dashboard 38 receives daily living activities data from the database 34 from the respective sensors, and the data is subsequently converted into separate layered visualizations for caregivers, family members, home care administrators, clinicians, and others. In one embodiment, the database sets up a secure data pipe to transmit data that feeds into the user experience (UX) architecture set up for iOS, Android and/or any other suitable Web-based dashboard systems.
  • With continued reference to FIG. 1, FIG. 2 represents the algorithmic methods utilized to collect and process raw input data 210 that are read from the sensors 205 (see also sensors 1, 1 a, 2, 3, 3 a, 4, and 5 of FIG. 1). At the raw data stage of the hub 20, the microprocessor 22 has built-in algorithms that dynamically change based on the human's daily living activities. As raw data arrives into the remote server, the code operates alongside selected algorithms to determine the structured and semi-structured data types that are required to develop the optimal output results. Structured data are data whose elements are addressable for effective analysis; they include all data which can be stored in the database (AWS S3) in a table with rows and columns, and they have relational keys that can easily be mapped into pre-designed fields. An example of this is heart rate or water flow. Semi-structured data is information that does not reside in a relational database but that has some organizational properties that make it easier to analyze. With some processing, this data can be stored in a relational database. An example of this is Extensible Markup Language (XML) data.
  • Based on health conditions, these algorithms are adjusted dynamically; for example, the models may be “tuned” or further trained. Several standard machine learning methods, as well as non-standard models, are applied to develop time sequences 212, feature generation 215, activity classification 220, activity discovery 225, and human subject attribution 230 before the output 235 is developed, which relates to routine activities 240, 250 and anomaly detections 232. Visuals 245, 255 may be created based on the activities 240, 250. Also included is a model database 60 from which models, algorithms, etc., are retrieved for performing various steps of the method of FIG. 2.
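  • The staged flow of FIG. 2 can be pictured schematically as a chain of stage functions, as in the sketch below; the function bodies are placeholders standing in for the models retrieved from the model database 60, not the disclosed algorithms.

```python
# Schematic sketch of the staged processing flow of FIG. 2 (210 -> 235).  The
# stage functions are simple placeholders standing in for the models retrieved
# from the model database 60; they are not the disclosed algorithms.

def build_time_sequences(raw_records):                  # time sequences 212
    return sorted(raw_records, key=lambda r: r["timestamp"])

def generate_features(sequence):                        # feature generation 215
    return [{"sensor": r["sensor"], "hour": (r["timestamp"] // 3600) % 24} for r in sequence]

def classify_activities(features):                      # activity classification 220
    return [dict(f, label="bathing" if f["sensor"] == "water_bathroom" else "other")
            for f in features]

def discover_activities(labeled):                       # activity discovery 225
    return labeled                                      # pattern mining would refine labels here

def attribute_to_subject(labeled):                      # human subject attribution 230
    return [dict(x, subject="resident_1") for x in labeled]

def detect_anomalies(attributed, usual_hours=range(6, 23)):   # anomaly detection 232
    return [x for x in attributed if x["hour"] not in usual_hours]

def process(raw_records):                               # raw input 210 -> output 235
    staged = attribute_to_subject(discover_activities(
        classify_activities(generate_features(build_time_sequences(raw_records)))))
    return {"routine": staged, "anomalies": detect_anomalies(staged)}

print(process([{"timestamp": 2 * 3600, "sensor": "water_bathroom"}]))   # 2 a.m. bathroom water use
```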
  • The example of FIG. 3 shows an embedded, layered system of information and visualization. The administrator, healthcare provider, patient, elderly person, home healthcare provider, or any individual or entity approved by the monitored individual or their surrogate is provided access to patterns of five daily living activities and two vital signs on the dashboard 310, and this information is available to authorized family caregivers as well. In some embodiments, the patient cannot tamper with any of the technologies, with the exception of the bed sensor (e.g., sensor 3 of FIG. 1), which gives the individual the option to switch the power source to the bed sensor on or off. The dashboard's layered system provides insight into the daily routine of living activities, the history of specific activities and related anomalies, and the vital signs of heart rate (HR) and respiratory rate (RR) as well as vital sign histories. An algorithm creates an optional layered alarm system of user-tailored alerts via the dashboard, ranging from low-threshold signals to caregivers to high-threshold alerts to healthcare professionals. The individual can set thresholds based on personal behavioral experiences, or the system can set a default range which improves with time; however, the individual is not required to activate any alarms. Furthermore, the dashboard can be integrated into family caregivers' iOS and Android phone systems, web-based systems, and also into electronic health record systems, e.g. EPIC and CERNER-GENESIS.
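  • A minimal sketch of the layered, user-tailored alert logic is shown below, assuming a per-metric low threshold that notifies a family caregiver and a high threshold that notifies a healthcare professional; the metric, the threshold values and the recipient names are hypothetical.

```python
# Minimal sketch of a layered, user-tailored alert: a low threshold notifies a
# family caregiver (yellow), a high threshold notifies a healthcare professional
# (red).  Metric, thresholds and recipient names are hypothetical.

def route_alert(metric, value, low, high):
    """Return (severity, recipient, metric) for a daily metric, or None if within range."""
    if value >= high:
        return ("red", "healthcare_professional", metric)
    if value >= low:
        return ("yellow", "family_caregiver", metric)
    return None

# Example: toileting events per day against personalized thresholds.
print(route_alert("toileting_count", 14, low=10, high=16))   # ('yellow', 'family_caregiver', ...)
print(route_alert("toileting_count", 18, low=10, high=16))   # ('red', 'healthcare_professional', ...)
print(route_alert("toileting_count", 6,  low=10, high=16))   # None
```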
  • “HUB” Hardware & Software Engineering: In the example of FIG. 1, the microcontroller located in the hub 20 has software to enable trouble-shooting readings within the hub 20. A Java-based software program is coded to read data from sensor 4, uses WebSocket for secure handshake protocols, and then transmits data to the external server in a JSON format (a simplified sketch of this transmission pattern follows Table 1 below). WebSockets are used to connect and send data and provide full duplex communication with the server at speeds of up to twice those of REST APIs. Over the air (OTA) protocols are established to permit remote monitoring and updates. Similarly, sensor 4 data are read by the microcontroller 22 located in the hub 20, and the sensor receives commands from WebSockets to transmit data to servers. Sensors 1, 1 a, and 2 have independent application programming interfaces (APIs), which permit “crawling”, and this has been enabled at a frequency based on our models. A “crawler” is best described as a program that simulates the user's behavior on a website, following all the steps a user does with the browser, such as entering search parameters (e.g. destination, date, etc.), requesting a result by clicking on the search button and then scanning through the results. In addition, a REST API server is established to receive crawled data to provide flexibility. The remote server uses AWS S3 and DynamoDB®, which is controlled by an administrative panel that permits users to create and download csv files of structured and semi-structured data. The home hub 20 is a container that co-locates a microprocessor 22 and communications gateway 24. The distinct feature of the hub 20 and its related microprocessor is the ability to integrate, control and enable the interoperability of different underlying sciences (electricity, water, BCG, infrared, WiFi and ultra-wideband radar) to produce time series histories of at least five common behavioral activities (eating, sleeping, bathing, toileting, walking) and two vital signs, i.e. heart rate and respiratory rate. The software program controls the sequencing of each data input to generate a value-based structure to enable modeling. The hub has an in-built gateway that can adopt a low capacity or high capacity data router using Cellular, WiFi, NB-IoT or LTE-M; these routers can also be programmed to shut down so that the hub can be connected to third-party gateways. Table 1 below identifies the programming languages that are associated with each hardware unit within a context for a specific purpose and the type of output and related value it provides.
  • TABLE 1
    PURPOSE | CONTEXT | INPUT | LANGUAGE | ATTRIBUTES
    monitoring movement | home | ultrawideband radar sensor, WiFi, infrared | python/C++/algorithms | health alert; health assessment
    monitoring water usage | master bathroom | ultrasound acoustic sensor | python/algorithms | home risk; health assessment
    monitoring cooking & eating | residence kitchen | electromagnetic signals sensor, WIFI, infrared | python/algorithms | health assessment; home risk
    monitoring RR/HR | subject's bed | ballistocardiography sensor | python/C++/algorithms | clinical management; health data
    monitoring sleeping | subject's bed | ballistocardiography sensor | python/C++/algorithms | health assessment; health data
    caregiver & clinical monitoring | home | integrated sensors, JSON | NoJS/algorithms | dashboard
    hub controller | home | integrate sensors, pull/push data | C++/python | interoperability
    remote server | remote | pull data | java script/algorithms | data security
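  • The hub-to-server push described above (sensor sample read, WebSocket handshake, JSON payload) can be sketched as follows. The disclosure describes a Java implementation; this Python version, using the third-party websockets package, is illustrative only, and the endpoint URI and payload fields are hypothetical.

```python
# Simplified sketch of the hub-to-server push described above: read a sensor
# sample and transmit it over a WebSocket connection as JSON.  The disclosure
# describes a Java implementation; this Python version (using the third-party
# "websockets" package) is illustrative only, and the endpoint URI and payload
# fields are hypothetical.

import asyncio
import json
import time
import websockets  # pip install websockets

SERVER_URI = "wss://example-remote-server.invalid/ingest"   # placeholder endpoint

def read_mobility_sample():
    # Stand-in for reading aggregated data from the mobility sensor (sensor 4).
    return {"sensor": "sensor_4_uwb", "timestamp": time.time(), "presence": True}

async def push_sample():
    async with websockets.connect(SERVER_URI) as ws:   # TLS handshake via wss://
        await ws.send(json.dumps(read_mobility_sample()))
        ack = await ws.recv()                          # optional server acknowledgement
        print("server replied:", ack)

if __name__ == "__main__":
    asyncio.run(push_sample())
```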
  • Algorithms: Summarizing features are generated from raw sensor inputs. These features serve as inputs to classification algorithms which label characteristic patterns of features as certain daily behavioral activities (e.g. bathing, toileting, eating, etc.). The first-level approach is one of applying logic-based heuristics (e.g. 12 gallons or more of water running in less than 20 minutes, on/off water usage in specific pipes related to bathing and toileting, combined with presence in the bathroom, +/− electricity use from the bathroom, suggests bathing). Heuristic results are combined with machine learning classification algorithm outputs. Classification approaches utilized include, but are not limited to, support vector machines, logistic regression, and random forest models. To provide the most robust classifications, we use a model fusion technique to create a single label from the combined outputs of each model type (heuristics plus one or more machine learning classification models). Labeled activities and stereotypical unlabeled sequences are probabilistically attributed to individuals in a multi-occupant dwelling based on location in the dwelling, body habitus, and/or gait characteristics.
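  • The combination of a logic-based heuristic with classifier outputs can be sketched with a simple majority vote, as below; the vote rule and the mocked model outputs are placeholders and do not represent the disclosed fusion technique.

```python
# Illustrative sketch of combining a logic-based heuristic with machine-learning
# classifier outputs via simple majority vote.  The vote rule and the mocked
# model outputs are placeholders, not the disclosed model fusion technique.

from collections import Counter

def bathing_heuristic(gallons, minutes, in_bathroom):
    # e.g. 12 gallons or more of water running in less than 20 minutes,
    # combined with presence in the bathroom, suggests bathing.
    if gallons >= 12 and minutes < 20 and in_bathroom:
        return "bathing"
    return None

def fuse_labels(heuristic_label, model_labels):
    votes = list(model_labels)
    if heuristic_label is not None:
        votes.append(heuristic_label)
    return Counter(votes).most_common(1)[0][0]

# Mocked outputs of an SVM, a logistic regression and a random forest classifier:
model_labels = ["bathing", "toileting", "bathing"]
print(fuse_labels(bathing_heuristic(gallons=15, minutes=12, in_bathroom=True),
                  model_labels))   # bathing
```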
  • Sensor windows: Because the streams of sensor data to be categorized are continually flowing, a method is needed to define a discrete series of contiguous sensor events for analysis. A sliding window method can be employed in which an activity window Si containing N sensor events is defined by sensor event i and the N−1 sensor firings preceding it. Each activity window has an associated “feature vector” which contains the time of the first sensor event in the window, the time of the last sensor event si, and one element for each sensor in the home describing the number of times each respective sensor has fired during window Si. Because a given window (with length defined by the number of sensor events) may encompass sensor firings from different functional areas of the home over different time intervals, the influence of more physically remote sensors may be discounted based on the mutual information method outlined by Krishnan and Cook. A mutual information matrix describing the extent to which all possible pairs of sensors are activated simultaneously (i.e. adjacent sensors will be most closely related) will be established based on an equipment calibration routine at installation. Neural network (specifically Long Short-Term Memory (LSTM) networks) and deep learning models are continually updated with time series inputs of labeled activities, mobility measures, discovered activity sequences, and vital signs. These models identify significant changes in these inputs over time, allowing the recognition of anomalous activity on the part of the monitored individual.
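  • The window/feature-vector construction can be sketched as follows, assuming each sensor event is a (timestamp, sensor id) pair; the mutual-information discounting of physically remote sensors is omitted for brevity, and the sensor names are hypothetical.

```python
# Sketch of the sliding-window feature vector described above: for sensor event i,
# the window S_i holds that event and the N-1 preceding events, and the feature
# vector records the first and last event times plus a firing count per home
# sensor.  The mutual-information discounting of remote sensors is omitted here,
# and the sensor names are hypothetical.

from collections import Counter

HOME_SENSORS = ["bed", "bathroom_water", "kitchen_power", "living_room_radar"]

def window_features(events, i, n):
    """events: list of (timestamp, sensor_id); returns the feature vector for S_i."""
    window = events[max(0, i - n + 1): i + 1]
    counts = Counter(sensor for _, sensor in window)
    return {
        "t_first": window[0][0],
        "t_last": window[-1][0],
        **{s: counts.get(s, 0) for s in HOME_SENSORS},
    }

events = [(0, "bed"), (60, "bathroom_water"), (65, "bathroom_water"), (300, "kitchen_power")]
print(window_features(events, i=3, n=3))
# {'t_first': 60, 't_last': 300, 'bed': 0, 'bathroom_water': 2,
#  'kitchen_power': 1, 'living_room_radar': 0}
```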
  • To train classification algorithms, time-stamped ground truth data is collected at the time of installation based on activation of alternating current electrical devices and water fixtures, plus scripted human activities (which may involve the monitored individual or others). In some embodiments, a monitored individual may wear an accelerometer, gyroscope, and/or radio frequency identification tag to compile additional ground truth model training data (for a period of two weeks or less). In some embodiments, data from multiple monitored individuals in different homes may be compiled to further train/tune classification models for improved classification accuracy. This training period typically lasts fewer than 14 days. In some embodiments, additional ground truth data is gathered through periodic interaction with monitored individuals. These algorithms are distinct due to the nature of the data source, which predicts highly validated behavioral activities of daily living of a human subject along with RR/HR. It is unique to obtain one million observations of a human subject's real core activities of daily living in order to train the machine to predict daily human activities.
  • Dashboards: FIG. 3 illustrates the methodology applied towards a layered process of dashboard accessibility to human subjects, doctors, nurses, caregivers, insurers, medical device controllers, chatbots, telehealth, home-healthcare services, and so forth. The dashboard 290 is programmed to be visually available and integrated into the clinical workflow process at nurse stations, EPIC, CERNER-GENESIS, and other electronic health record systems, and on mobile phone applications or web-based portals. The system has an optional inbuilt crisis alert mechanism that is triggered into a red-yellow-green signal to caregivers and clinical managers. This is dynamically programmed based on individual input into the settings of the dashboard. An example is when a monitored individual does not eat for an entire day or uses the toilet several more times than usual in a day, triggering a yellow alarm. Red would correspond to a scenario where contact needs to be made with the patient or elderly person immediately. The data available on the dashboard 290 has several available time horizons depending upon the analysis required. For example, it is possible to seek the history and timings of toilet use over the past 30 or 60 days.
  • To this end, the dashboard architecture comprises a plurality of sensors 300 (e.g., the sensors 1-5 of FIG. 1) that provide collected information to an edge computing device or module 302 (e.g., a processor) on premises (i.e., at the location, home, etc., of the monitored patient). The edge computing device transmits the collected information to the cloud 30 (FIG. 1). The transmitted information is received by a server 32 (FIG. 1) via the cloud 30, and the information is stored in a database 34 (FIG. 1). Algorithms 306 (discussed below herein and above with regard to reference numeral 36 of FIG. 1) for processing the information are executed on the information, and processed information is stored in a second database 308. The processed information is provided to an API 310 for additional processing (discussed below herein).
  • A summary screen 312 is generated by the API and presented to a user (e.g., a monitoring technician, caregiver or physician) for viewing on a computing device (e.g., a computer, a mobile device or smartphone, etc.). The summary screen comprises a plurality of selectable features (e.g., clickable icons or the like) including but not limited to an activities feature 314, which when selected causes a list of selectable activities to be presented (e.g., via a drop down or pop up menu or the like). The selectable activities include, without being limited to: bathing 316, mobility 318, sleeping 320, eating 322, and toileting 324. Also provided is a vital signs feature 326, which upon selection causes a plurality of selectable vital sign features (e.g., icons or the like) to be presented to the user via a drop down or pop up menu or the like. The vital signs features include, without being limited to: heart rate 328, respiratory rate 330, etc. The dashboard architecture further comprises a details screen 332 that is presented to the user upon selection of a particular feature on the summary screen. In the illustrated example, the sleeping activity 320 has been selected from the summary screen and is presented in greater detail on the details screen. The details screen also provides a time range selector 336 via which a user can select a time range for viewing data associated with the selected activity or vital sign. Once the time range is selected, a history of activity 338 is presented to the user. Also provided is a personal thresholds alarm configuration tab or icon 340 via which the user can personalize alarm thresholds for activities or vital signs.
  • Detecting declines in functional or cognitive status, and early indicators of chronic disease exacerbation in the home setting can provide the opportunity to intervene earlier to prevent accidents, complications, and more severe exacerbations. Such intervention has the potential to reduce patient morbidity and risk of mortality, decrease emergency department (ED) and hospital utilization, and reduce system cost. The ability for caregivers and loved ones to monitor the functional status and safety of patients over time offers greater opportunity for seniors to age at home, delaying institutionalization.
  • To continuously and objectively monitor patients in their daily life requires an unobtrusive autonomous system in the home. In addition to primary effects of chronic diseases or syndromes, the systems and methods disclosed here can also permit monitoring of adverse therapeutic drug effects and functioning of implantable medical devices such as heart pacemakers and neural stimulators through auxiliary sensors. This sensor configuration is enabled by the integration and interoperability of a set of collaborative sensors. The ability to unobtrusively and passively monitor chronic disease patients in the home offers the potential for earlier identification of exacerbations or decline beyond clinically important thresholds with less reliance on patient or family history and adherence, and less patient burden. Beyond the ability to predict chronic disease exacerbations, tracking changes in daily activities of the elderly and those with some degree of cognitive impairment can allow loved ones and providers to monitor the overall well-being of a patient and to identify areas where the individual is having difficulty safely performing the activities necessary for independent living.
  • FIG. 4 illustrates a process flow 400 showing six distinct processes related to the hardware, software, controller hubs, algorithms, dashboards and gateways.
  • Exemplary hardware is based on a set of underlying sciences (electricity, ultrasound water, ballistocardiography (BCG), and infrared and reflected electromagnetic waves) that have been proprietarily selected for an optimum level of adoptability, scalability and validity. In the case of electrical usage and load disaggregation, research has validated energy efficiency models through electrical non-intrusive load monitoring of residential buildings (Berges, Goldman, Matthews, Soibelman, & Anderson, 2011; Zoha, Gluhak, Imran, & Rajasegarar, 2012). Non-intrusive electric load is monitored through electro-magnetic waves enabled with Rogowski coils that are embedded in current transformer sensors and located in the circuit breaker board (Samimi, Mahari, Farahnakian, & Mohseni, 2014). To ascertain disaggregated water usage, non-invasive inaudible ultrasound acoustic and vibration pulse waves are applied to measure water flow (Britton, Cole, Stewart, & Wiskar, 2008). To monitor the mobility of a human subject in the home, an application of antenna-based ultra-wideband Radio Frequency (RF) tridimensional (3D) sensing and image processing is installed (Brena et al., 2017). The antenna array illuminates the area in front of it and senses the returning signals. The signals are produced and recorded by an integrated circuit chip and the data is communicated to the remote server via the hub gateway 24. Wi-Fi systems are used to assess movement created by waves among the wireless signals (C.-Y. Hsu, R. Hristov, G.-H. Lee, M. Zhao and D. Katabi, 2019, “Enabling Identification and Behavioral Sensing in Homes using Radio Reflections”), and in some cases infrared sensors with pixel levels below 16×16 are used (Wei-Han Chen and Hsi-Pin Ma, 2015, “A fall detection system based on infrared array sensors with tracking capability for the elderly at home”). To monitor sleep, heart rate (HR) and respiratory rate (RR), the ballistocardiography (BCG) method is applied and measures sleep stages based on heart rate, respiratory rate, and gross movement. Ballistocardiography measures movements linked with cardiac contraction and ejection of blood and with the deceleration of blood flow through blood vessels (Pinheiro, Postolache, & Girão, 2010). Chest wall and gross movement is also detected through this method.
  • With continued reference to FIGS. 1-4, the controller hub 20 combines hardware and software to enable distinct capabilities and edge processing in various implementations. This is a distinct home hub that integrates and enables the interoperability of the sensors. The quad core microprocessor 22 may combine a single wide coverage WiFi low power chip with connectivity, allowing the system to reliably transmit data. It has multiple power modes and provides dynamic power scaling. It integrates an antenna switch, radio frequency, power amplifier, filters and other power management modules. The hub 20 is embedded with a communications gateway 24 that transmits data to an external server 32. Data is read from the IoT sensors on a proprietarily defined time interval-based system that relies on degree of relevance, which changes dynamically. The interrogation interval differs by sensor based upon relevance and human subject conditions; thus, each sensor can stream data in batches at intervals ranging from seconds to 12 hours, and this rate can vary within its sub-components. The hub has the capability of beaming raw sensor data to the remote server 32 through the communications gateway 24 or processing the raw sensor data locally (at the “edge”) to transmit data which has been transformed for input into algorithms. Whether edge processing occurs may change dynamically based on human subject health. The controller hub is embedded with a cellular transmission system, a local area network WiFi capability and a microprocessor, which has a set of instructions programmed to dynamically manage human data based on a set of health rules.
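  • The relevance-driven scheduling and the raw-versus-edge-processed transmission decision can be sketched as below; the per-sensor intervals and the health rule are hypothetical placeholders, not the proprietary interval system.

```python
# Sketch of the hub's relevance-driven scheduling: each sensor is read on its own
# interval (from seconds up to 12 hours), and the hub either forwards raw data or
# edge-processes it first, depending on the subject's current health state.
# The interval values and the health rule below are hypothetical.

READ_INTERVALS_S = {            # per-sensor batch interval, in seconds
    "vitals_bcg": 30,
    "mobility_uwb": 300,
    "water_flow": 3600,
    "electric_load": 12 * 3600,
}

def next_read_interval(sensor, elevated_risk):
    base = READ_INTERVALS_S[sensor]
    return max(30, base // 4) if elevated_risk else base   # read more often when risk is elevated

def transmit(sensor, batch, elevated_risk):
    if elevated_risk:
        return {"sensor": sensor, "mode": "raw", "payload": batch}
    summary = {"n": len(batch), "mean": sum(batch) / len(batch)} if batch else {"n": 0}
    return {"sensor": sensor, "mode": "edge_summary", "payload": summary}

print(next_read_interval("water_flow", elevated_risk=True))          # 900
print(transmit("vitals_bcg", [62, 64, 63], elevated_risk=False))     # edge summary
```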
  • Regarding gateway communications, the hub 20 provides processing, memory, WiFi, and a gateway interface comprising a long-term evolution for machines (LTE-M) broadband interface or 3G cellular networks to the remote server in the cloud. In rare circumstances of low data usage, an NB-IoT network system may be used; this is a low power wide area technology that enables the sensor to improve power consumption and spectrum efficiency. This new physical layer signal can provide extended coverage into deep indoor environments with ultra-low device complexity. The LTE-M optimally addresses the low-powered sensors being used.
  • The application combines IEEE and related standards, including WIFI®, ZigBee, Z-WAVE®, BLUETOOTH®, local area networks (LAN) including Ethernet, cellular networks and wide area networks (WAN). All data packets have unique encrypted security codes that aim to protect human subjects' data.
  • Software: The software code creates a distinct set of processes that enables the hub platform to operationalize an interoperable sensor system across water, electricity and human body actions. Programming languages and formats used include, but are not limited to, Java, JavaScript, C++, and Python. These communication protocols are programmed in JavaScript, and no user interface is permitted with the proprietary hub. The hub 20 is encrypted and HIPAA and FCC compliant and does not provide raw sensor data to any external entity or individual. The remote server 32 is located in the cloud 30 and uses Node.js as an open source cross-platform runtime to execute its JavaScript programming functions. A set of distinct firmware Java code is used to integrate individual sensors with related microprocessors. A back-end data engineering program function to receive, validate, organize, and store data is conducted in the server to transform the raw sensor data into structured and semi-structured formats (data elements), which are sent to the database 34 where they can be accessed by registered internal individuals. The database 34 downloads are made available to authorized desktops for data engineering and data science processes. In one implementation, the database 34 is an AWS S3. The AWS S3 is a document database with the scalability and flexibility that permits querying and indexing. The platform has strict data encapsulation, meaning there are several layers built in that enforce limited access to data. Data output for visualization is located in AWS and is exported through an API connection. All external access is mediated through our application programming interface (API), where we have implemented security and audit checks to authenticate access to data.
  • Algorithms: Research has validated that behavioral activities detected through passive infrared sensors 205 are predictive of health deterioration in seniors (Cook, Krishnan, & Rashidi, 2013; Dawadi, Cook, & Schmitter-Edgecombe, 2016; Sprint, Cook, Fritz, & Schmitter-Edgecombe, 2016). Summarizing features are generated from raw sensor inputs. These features serve as inputs to classification algorithms which label characteristic patterns of features as certain daily behavioral activities (e.g. bathing, toileting, eating, etc.). Classification approaches utilized include support vector machines, logistic regression, and random forest models; to provide the most robust classifications, we use a model fusion technique to create a single label from the combined outputs of each model type. In parallel, raw sensor inputs feed a sequential pattern mining algorithm which recognizes, but does not label, similar sensor sequences. Labeled activities and stereotypical unlabeled sequences are probabilistically attributed to individuals in a multi-occupant dwelling based on location in the dwelling, body habitus, and/or gait characteristics. Neural network (specifically Long Short-Term Memory (LSTM) networks) and deep learning models are continually updated with time series inputs of labeled activities, mobility measures, discovered activity sequences, and vital signs. These models identify significant changes in these inputs over time, allowing the recognition of anomalous activity on the part of the monitored individual.
  • FIG. 5 illustrates an example dwelling that may be equipped with the systems and methods described herein. Electrical sensors 1, 1 a are in each of the main rooms of the dwelling, including a living room, a bedroom, a kitchen, a dining room, and a bathroom. A water usage sensor 2 is shown in the kitchen, or wherever the water lines may be, as well as in the bathroom. A sleep sensor 3 and vital signs sensor 3 a are provided in the bedroom. Mobility sensors 4 are positioned in the living room and dining room, although other rooms can be equipped therewith. A vital sign sensor 5 for respiratory rate is also located in the living room. It will be understood that any desired room may be equipped with any desired sensor configuration.
  • Of course, modifications and alterations will occur by others upon reading and understanding the preceding description. It is intended that the invention be construed as including all such modifications and alterations insofar as they come within the scope of the appended claims or the equivalents thereof.

Claims (38)

1. A system for labeling daily living activities for a patient comprising:
at least one processor;
and at least one memory including computer program code;
the at least one memory and the computer program code configured to, with the at least one processor, cause the system at least to:
receive at least one measurement from at least one sensor;
determine an activity of the patient based on the received at least one measurement; and
label an activity of the patient or elderly based on the determined activity.
2. The system of claim 1, wherein the determined activity includes at least one of the following activities of daily living:
Ambulating;
Eating;
Bathing;
Dressing;
Toileting;
Transferring;
Continence;
Activities outside home;
Cooking;
Presence in kitchen;
Household Chores;
Taking medications;
Social Communications;
Banking;
Sleeping;
Lying in bed.
3. The system of claim 1, wherein the determined activity includes one or more of:
respiratory rate;
heart rate;
toilet flushes;
paroxysmal torso motion stemming from coughing;
use of a medical device;
night-time walking;
sleep angle;
sleep stages;
gait speed;
bed/chair-to-standing time;
stair ascent/descent time;
amount/speed of locomotion;
cooking;
eating;
bathing/showering;
personal hygiene;
household chores; and
home leaving regularity.
4. The system of claim 1, wherein the at least one memory and the computer program code are further configured to, with the at least one processor, cause the system to determine the activity of the patient further based on: (i) ground truth observations, and (ii) training data from a specific home of the patient or elderly and/or from a population of patients.
5. The system of claim 1, wherein the activity includes a time the patient or elderly spends performing the activity.
6. The system of claim 1, further including:
an audio alarm;
wherein the at least one memory and the computer program code are further configured to, with the at least one processor, cause the audio alarm to emit an audible alarm if the trends in activities move outside a predefined time period or threshold.
7. The system of claim 1, further including:
a visual alarm;
wherein the at least one memory and the computer program code are further configured to, with the at least one processor, cause the visual alarm to emit a visual alarm if the trends in activities move outside a predefined time period or threshold.
8. The system of claim 1, wherein the at least one sensor includes:
a first electrical measurement device configured to measure an overall power usage of a home of the patient or elderly over time;
a second electrical measurement device configured to measure a power usage of a kitchen of the home over time;
a water sensor configured to monitor water usage of the home over time;
a water sensor configured to monitor water usage in a specific bathroom over time;
a sleeping sensor configured to measure sleep of the patient or elderly;
a vital sign sensor configured to measure a vital sign of the patient or the elderly;
a mobility sensor configured to measure mobility of the patient or elderly; and
a mobility sensor configured to measure gait, sit and stand up movements of the patient or elderly.
9. The system of claim 1, wherein the at least one memory and the computer program code are further configured to, with the at least one processor, cause the system to:
generate an amber alert if the score exceeds a first predetermined threshold; and
generate a red alert if the score exceeds a second predetermined threshold.
10. The system of claim 1, further including:
a water sensor configured to measure toileting of the patient; and
wherein:
the activity is excessive or scant toileting; and
the at least one memory and the computer program code are further configured to, with the at least one processor, cause the system to label toileting activity based on the water sensor and signal a toileting activity of the patient or elderly.
11. The system of claim 1, wherein the labeled activity is bathing, and the at least one memory and the computer program code are further configured to, with the at least one processor, cause the system to label the bathing activity anomaly based on water usage of the patient or elderly and generate alerts when trends deviate from a set of pre-defined thresholds.
12. The system of claim 1, wherein the labeled activity is eating and the at least one memory and the computer program code are further configured to, with the at least one processor, cause the system to label the eating activity anomaly based on electricity and water usage of the patient or elderly over a period of time that deviates from a pre-defined threshold, or based on movement in space of the patient or elderly over time.
13. The system of claim 1, wherein the labeled activity is cooking and the at least one memory and the computer program code are further configured to, with the at least one processor, cause the system to label the cooking activity anomaly based on electricity and water usage of the patient or movement in space of the patient or elderly.
14. The system of claim 1, wherein the labeled activity is continence and the at least one memory and the computer program code are further configured to, with the at least one processor, cause the system to label the continence activity anomaly based on water usage and bed behavioral activities of the patient or elderly.
15. The system of claim 1, wherein the labeled activity is dressing and the at least one memory and the computer program code are further configured to, with the at least one processor, cause the system to label the dressing activity anomaly based on mobility sensors and electricity usage of the patient or elderly.
16. The system of claim 1, wherein the labeled activity is transferring and the at least one memory and the computer program code are further configured to, with the at least one processor, cause the system to predict the transferring activity anomaly based on electricity and mobility sensors of the patient or elderly.
17. The system of claim 1, wherein the labeled activity comprises one or more activities away from the home and the at least one memory and the computer program code are further configured to, with the at least one processor, cause the system to predict the shopping activity or non-activity based on water, electricity and mobility sensors located at the patient's or elderly person's residence.
18. The system of claim 1, wherein the labeled activity is household chores and the at least one memory and the computer program code are further configured to, with the at least one processor, cause the system to predict the household chores activities or non-activity based on electricity, water and mobility sensors located at the patient's or elderly person's residence.
19. The system of claim 1, wherein the labeled activity is medication adherence and the at least one memory and the computer program code are further configured to, with the at least one processor, cause the system to label the medication adherence activities or non-activities, as well as the side effects of medication, based on electricity, water, mobility and sleep sensors located at the patient's or elderly person's residence.
20. A system for determining an activity of a patient comprising:
at least one processor;
and at least one memory including computer program code;
the at least one memory and the computer program code configured to, with the at least one processor, cause the system to:
receive at least one measurement from at least one sensor; and
determine the activity of the patient based on at least one of: (i) rule-based heuristics, (ii) training data from a home of one or more patients, and (iii) the received at least one measurement from the at least one sensor;
wherein the activity includes at least one of:
a common activity of daily living and a common instrumental activity of daily living.
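For illustration only, a minimal sketch of the rule-based heuristics recited in claim 20 might combine per-window sensor aggregates into a coarse activity label; the sensor names, units, and thresholds below are assumptions, not values taken from the specification.

```python
# Illustrative sketch: rule-based heuristics over assumed per-window sensor
# aggregates (water, electricity, bed presence, motion) yielding an activity label.
from typing import Dict, Optional


def heuristic_activity(readings: Dict[str, float]) -> Optional[str]:
    """Return an activity label for one time window, or None if no rule fires.

    `readings` is assumed to hold per-window aggregates such as
    kitchen_power_w, bathroom_water_lpm, bed_presence, and motion_events.
    """
    if readings.get("kitchen_power_w", 0.0) > 500 and readings.get("motion_events", 0) > 0:
        return "cooking"
    if readings.get("bathroom_water_lpm", 0.0) > 5.0:
        return "bathing"
    if readings.get("bed_presence", 0.0) > 0.5 and readings.get("motion_events", 0) == 0:
        return "sleeping"
    return None


if __name__ == "__main__":
    window = {"kitchen_power_w": 1200.0, "motion_events": 4}
    print(heuristic_activity(window))  # -> "cooking"
```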
21. The system of claim 20, wherein the at least one sensor includes a ballistocardiography (BCG) measurement device.
22. The system of claim 20, wherein the at least one sensor includes at least one of an ultra-wideband (UWB) radar, WiFi computer-aided visioning, or infrared sensors, and the at least one measurement includes a presence measurement.
23. The system of claim 20, wherein the at least one sensor includes an ultra-wideband (UWB) radar, and the at least one measurement includes a respiratory rate measurement.
24. The system of claim 20, wherein the at least one memory and the computer program code are further configured to, with the at least one processor, cause the system to determine the activity of the patient further based on ground truth observations.
25. The system of claim 20, wherein the at least one memory and the computer program code are further configured to, with the at least one processor, cause the system to determine the activity of the patient using machine learning classification algorithms.
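As a purely illustrative sketch of the machine-learning classification of claim 25 (the claims do not name a particular model; the scikit-learn random forest and the synthetic feature layout below are assumptions), labeled per-window sensor features could be used to train a classifier.

```python
# Illustrative sketch: a generic supervised classifier trained on hypothetical
# labeled sensor features per time window; not the patented algorithm.
import numpy as np
from sklearn.ensemble import RandomForestClassifier

# Hypothetical training data: rows are time windows, columns are features
# such as [water_liters, electricity_wh, motion_events].
X_train = np.array([
    [0.0, 1200.0, 5],   # cooking
    [40.0, 50.0, 2],    # bathing
    [0.0, 10.0, 0],     # sleeping
    [0.5, 900.0, 6],    # cooking
    [35.0, 40.0, 1],    # bathing
    [0.0, 5.0, 0],      # sleeping
])
y_train = ["cooking", "bathing", "sleeping", "cooking", "bathing", "sleeping"]

clf = RandomForestClassifier(n_estimators=50, random_state=0).fit(X_train, y_train)
print(clf.predict([[0.2, 1000.0, 4]]))  # expected: ['cooking']
```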
26. The system of claim 20, wherein the training data comprises data acquired within a two-week time period.
27. The system of claim 20, wherein the heuristics and/or classification machine learning algorithms are adjusted based on data aggregated from the homes of other monitored individuals.
28. The system of claim 20, further comprising a dashboard, and wherein the at least one memory and the computer program code are further configured to, with the at least one processor, cause the system to:
display an avatar of the patient on the dashboard; and
display a three-alarm system on the dashboard;
wherein the data visualization dashboard is available on multiple platforms.
29. The system of claim 20, further comprising a dashboard configured to connect to end-user health record systems and to mobile applications.
30. The system of claim 20, further comprising a hub configured for edge processing.
31. The system of claim 20, wherein:
the at least one sensor includes an electrical sensor, and the at least one measurement includes an electrical measurement from the at least one electrical sensor;
the system further includes a load identifier configured to receive the electrical measurement and disaggregate the electrical measurement; and
the at least one memory and the computer program code are further configured to, with the at least one processor, cause the system to determine the activity based on the disaggregated electrical measurement.
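For illustration only, the load identifier of claim 31 could be approximated by a naive event-based disaggregator; the appliance signatures, step threshold, and function name below are assumptions rather than the patented method.

```python
# Illustrative sketch: detect step increases in whole-home power and match
# each step to the closest assumed appliance signature.
from typing import List, Tuple

# Hypothetical appliance signatures in watts.
SIGNATURES = {"kettle": 1500.0, "microwave": 1100.0, "television": 120.0, "lamp": 60.0}


def disaggregate(power_w: List[float], min_step_w: float = 40.0) -> List[Tuple[int, str, float]]:
    """Return (sample_index, appliance, delta_watts) for each detected ON event."""
    events = []
    for i in range(1, len(power_w)):
        delta = power_w[i] - power_w[i - 1]
        if delta > min_step_w:
            appliance = min(SIGNATURES, key=lambda a: abs(SIGNATURES[a] - delta))
            events.append((i, appliance, delta))
    return events


if __name__ == "__main__":
    series = [200, 205, 1705, 1710, 1700, 260, 250]  # a kettle-sized load switches on at index 2
    print(disaggregate(series))  # [(2, 'kettle', 1500)]
```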
32. The system of claim 20, further comprising a water sensor configured to measure water usage of the home;
the at least one memory and the computer program code are further configured to, with the at least one processor, cause the system to determine the activity based on the measured water usage.
33. The system of claim 20, further comprising a dashboard configured to connect to a wearable sensing device.
34. The system of claim 20, further comprising a dashboard configured to connect to an internet-connected smart speaker.
35. The system of claim 20, further comprising an edge processing hub configured to use artificial intelligence to prioritize data streams based on anomalies in patient behavior.
36. The system of claim 20, further comprising an edge processing hub configured to integrate a suite of disparate sensors to create comprehensive data streams based on behavioral activities spanning five or more core common daily activities.
37. The system of claim 20, further comprising a water disaggregation algorithm specific to at least one of bathing, toileting, and kitchen water usage related to common daily activities.
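A minimal, hedged sketch of the water disaggregation of claim 37 (the flow-rate and duration cut-offs are assumptions, not values from the specification) might label individual water-draw events by their duration and average flow rate.

```python
# Illustrative sketch: classify water-draw events into toileting, bathing, or
# kitchen usage from assumed duration and flow-rate cut-offs.
from dataclasses import dataclass


@dataclass
class WaterEvent:
    duration_s: float
    avg_flow_lpm: float


def label_water_event(event: WaterEvent) -> str:
    if event.duration_s < 90 and event.avg_flow_lpm > 6.0:
        return "toileting"   # short, high-flow tank refill
    if event.duration_s > 240 and event.avg_flow_lpm > 5.0:
        return "bathing"     # long, sustained high flow
    return "kitchen"         # everything else: sink, dishwashing, drinking water


if __name__ == "__main__":
    print(label_water_event(WaterEvent(45, 8.0)))   # toileting
    print(label_water_event(WaterEvent(420, 7.0)))  # bathing
    print(label_water_event(WaterEvent(60, 2.0)))   # kitchen
```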
38. A method, comprising:
receiving at least one measurement from at least one sensor;
determining an activity of a patient based on the received at least one measurement; and
generating alerts when trends deviate from a set of pre-defined thresholds.
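For illustration only, the trend-deviation alerting of claim 38 can be sketched as a comparison of each new daily measurement against a rolling baseline; the window length, deviation factor, and example data below are assumptions.

```python
# Illustrative sketch: raise an alert when a new daily value falls outside a
# rolling mean +/- k * standard-deviation band over the trailing window.
from statistics import mean, stdev
from typing import List, Optional


def trend_alert(history: List[float], today: float, window: int = 14, k: float = 2.0) -> Optional[str]:
    """Return an alert string if `today` falls outside the baseline band,
    otherwise None."""
    recent = history[-window:]
    if len(recent) < 3:
        return None  # not enough history to form a baseline
    mu, sigma = mean(recent), stdev(recent)
    band = k * sigma
    if today > mu + band:
        return f"above baseline ({today:.1f} vs {mu:.1f} +/- {band:.1f})"
    if today < mu - band:
        return f"below baseline ({today:.1f} vs {mu:.1f} +/- {band:.1f})"
    return None


if __name__ == "__main__":
    sleep_hours = [7.2, 6.8, 7.0, 7.4, 6.9, 7.1, 7.3, 7.0, 6.7, 7.2, 7.1, 6.9, 7.0, 7.2]
    print(trend_alert(sleep_hours, 4.5))  # deviates well below the rolling baseline
```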
US16/947,816 2019-08-20 2020-08-19 Remote monitoring systems and methods for elderly and patient in-home and senior living facilities care Abandoned US20210057093A1 (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
US16/947,816 US20210057093A1 (en) 2019-08-20 2020-08-19 Remote monitoring systems and methods for elderly and patient in-home and senior living facilities care

Applications Claiming Priority (2)

Application Number Priority Date Filing Date Title
US201962889306P 2019-08-20 2019-08-20
US16/947,816 US20210057093A1 (en) 2019-08-20 2020-08-19 Remote monitoring systems and methods for elderly and patient in-home and senior living facilities care

Publications (1)

Publication Number Publication Date
US20210057093A1 true US20210057093A1 (en) 2021-02-25

Family

ID=74645783

Family Applications (2)

Application Number Title Priority Date Filing Date
US16/691,696 Active 2040-04-16 US11270799B2 (en) 2019-08-20 2019-11-22 In-home remote monitoring systems and methods for predicting health status decline
US16/947,816 Abandoned US20210057093A1 (en) 2019-08-20 2020-08-19 Remote monitoring systems and methods for elderly and patient in-home and senior living facilities care

Family Applications Before (1)

Application Number Title Priority Date Filing Date
US16/691,696 Active 2040-04-16 US11270799B2 (en) 2019-08-20 2019-11-22 In-home remote monitoring systems and methods for predicting health status decline

Country Status (1)

Country Link
US (2) US11270799B2 (en)

Cited By (3)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US20230137193A1 (en) * 2021-11-01 2023-05-04 Optum, Inc. Behavior deviation detection with activity timing prediction
US11747463B2 (en) 2021-02-25 2023-09-05 Cherish Health, Inc. Technologies for tracking objects within defined areas
WO2023164473A3 (en) * 2022-02-22 2023-12-07 Nomo International, Inc. Non-intrusive monitoring system

Families Citing this family (9)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US11918330B2 (en) 2017-03-08 2024-03-05 Praesidium, Inc. Home occupant detection and monitoring system
US20230181059A1 (en) * 2020-05-14 2023-06-15 Vayyar Imaging Ltd. Systems and methods for ongoing monitoring of health and wellbeing
JP7467685B2 (en) * 2020-05-19 2024-04-15 デンタルイージー・インコーポレイテッド Bio-measurable dental treatment room
CN113296087B (en) * 2021-05-25 2023-09-22 沈阳航空航天大学 Frequency modulation continuous wave radar human body action recognition method based on data enhancement
EP4184522A1 (en) * 2021-11-23 2023-05-24 Koninklijke Philips N.V. A device and method for providing clinical information of a subject
WO2023150172A1 (en) * 2022-02-02 2023-08-10 Sleep Number Corporation Bed with features for determining risk of congestive heart failure
CN114913671A (en) * 2022-03-28 2022-08-16 山东浪潮科学研究院有限公司 Old people nursing method and system based on edge calculation
EP4254425A1 (en) 2022-03-29 2023-10-04 Heartkinetics Method, system and computer program for detecting a heart health state
WO2023217745A1 (en) * 2022-05-10 2023-11-16 Signify Holding B.V. A system and method for assessing a health status of a user based on interactions with lighting control interfaces

Family Cites Families (15)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US7937461B2 (en) * 2000-11-09 2011-05-03 Intel-Ge Care Innovations Llc Method for controlling a daily living activity monitoring system from a remote location
US6645153B2 (en) * 2002-02-07 2003-11-11 Pacesetter, Inc. System and method for evaluating risk of mortality due to congestive heart failure using physiologic sensors
US7733224B2 (en) * 2006-06-30 2010-06-08 Bao Tran Mesh network personal emergency response appliance
US20080077020A1 (en) * 2006-09-22 2008-03-27 Bam Labs, Inc. Method and apparatus for monitoring vital signs remotely
US7586418B2 (en) * 2006-11-17 2009-09-08 General Electric Company Multifunctional personal emergency response system
US10311694B2 (en) * 2014-02-06 2019-06-04 Empoweryu, Inc. System and method for adaptive indirect monitoring of subject for well-being in unattended setting
WO2016081510A1 (en) * 2014-11-17 2016-05-26 Curb Inc. Resource monitoring system with disaggregation of devices and device-specific notifications
EP3253445A4 (en) * 2015-02-03 2018-02-21 Nibs Neuroscience Technologies Ltd. Early diagnosis and treatment of alzheimer disease and mild cognitive impairment
US10896756B2 (en) * 2015-04-21 2021-01-19 Washington State University Environmental sensor-based cognitive assessment
CA2932204A1 (en) * 2015-06-25 2016-12-25 Alaya Care Inc. Method for predicting adverse events for home healthcare of remotely monitored patients
US10971253B2 (en) * 2015-06-30 2021-04-06 K4Connect Inc. Climate control system including indoor and setpoint temperature difference and exterior temperature based HVAC mode switching and related methods
US20190134096A1 (en) * 2016-05-06 2019-05-09 Hadasit Medical Research Services & Development Limited Hyperimmune colostrum in the modulation and treatment of conditions associated with the mammalian microbiome
US11883157B2 (en) * 2017-11-21 2024-01-30 Omniscient Medical As System, sensor and method for monitoring health related aspects of a patient
US11623102B2 (en) * 2018-07-31 2023-04-11 Medtronic, Inc. Wearable defibrillation apparatus configured to apply a machine learning algorithm
US20200098471A1 (en) 2018-09-25 2020-03-26 Eaton Intelligent Power Ltd. Actions based on customer premises data

Patent Citations (9)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US20120086573A1 (en) * 2005-12-30 2012-04-12 Healthsense, Inc. Monitoring activity of an individual
US20100141397A1 (en) * 2008-12-08 2010-06-10 Min Ho Kim System for activity recognition
US9875450B1 (en) * 2012-08-29 2018-01-23 James Robert Hendrick, III System and method of automated healthcare assessments and event inferences
US20160157735A1 (en) * 2014-12-09 2016-06-09 Jack Ke Zhang Techniques for near real time wellness monitoring using a wrist-worn device
US20180254096A1 (en) * 2015-09-15 2018-09-06 Commonwealth Scientific And Industrial Research Organisation Activity capability monitoring
US20170119283A1 (en) * 2015-10-28 2017-05-04 Koninklijke Philips N.V. Monitoring activities of daily living of a person
US20190244508A1 (en) * 2016-10-20 2019-08-08 Signify Holding B.V. A system and method for monitoring activities of daily living of a person
US10825318B1 (en) * 2018-04-09 2020-11-03 State Farm Mutual Automobile Insurance Company Sensing peripheral heuristic evidence, reinforcement, and engagement system
US20200143655A1 (en) * 2018-11-06 2020-05-07 iEldra Inc. Smart activities monitoring (sam) processing of data

Non-Patent Citations (3)

* Cited by examiner, † Cited by third party
Title
Chen et al., A Knowledge-Driven Approach to Activity Recognition in Smart Homes, June 2012, IEEE Transactions on Knowledge and Data Engineering Vol. 24 No. 6 (Year: 2012) *
Emi et al., SARRIMA: Smart ADL Recognizer and Resident Identifier in Multi-resident Accommodations, October 2015, Proceedings of the conference on Wireless Health (Year: 2015) *
Ni et al., The Elderly's Independent Living in Smart Homes: A Characterization of Activities and Sensing Infrastructure Survey to Facilitate Services Development, May 2015, Sensors 15 (Year: 2015) *

Also Published As

Publication number Publication date
US20210057101A1 (en) 2021-02-25
US11270799B2 (en) 2022-03-08

Similar Documents

Publication Publication Date Title
US20210057093A1 (en) Remote monitoring systems and methods for elderly and patient in-home and senior living facilities care
JP6502502B2 (en) System and method for monitoring human daily activities
Amiribesheli et al. A review of smart homes in healthcare
US9526421B2 (en) Mobile wireless customizable health and condition monitor
Ding et al. Sensor technology for smart homes
CN109843173B (en) System and method for monitoring activities of daily living of a person
US20120053472A1 (en) Inexpensive non-invasive safety monitoring apparatus
Mardini et al. A survey of healthcare monitoring systems for chronically ill patients and elderly
Palumbo et al. AAL middleware infrastructure for green bed activity monitoring
CN108135538B (en) Monitoring a person's physical or psychological ability
Esch A survey on ambient intelligence in healthcare
EP3807890B1 (en) Monitoring a subject
US10736541B2 (en) Monitoring liquid and/or food consumption of a person
KR101659939B1 (en) System for managing object for care with dual parameter
Mukhopadhyay et al. Are technologies assisted homes safer for the elderly?
Žarić et al. Ambient assisted living systems in the context of human centric sensing and IoT concept: EWall case study
Power et al. Developing a sensor based homecare system: The role of bluetooth low-energy in activity monitoring
Popescu et al. Smart sensor network for continuous monitoring at home of elderly population with chronic diseases
Torres-Sospedra et al. In-home monitoring system based on WiFi fingerprints for ambient assisted living
Subramaniam et al. Ble-enabled medication events monitoring system (mems) for community dwelling seniors
KR20170046598A (en) Method and apparatus for monitoring bionic signal based wireless network
Power et al. Developing a Sensor based Homecare System
Arif et al. Web Services for Telegeriatric and Independent Living of the Elderly in their Homes.
Elango et al. LoRaWAN-Based Intelligent Home and Health Monitoring of Elderly People
Guevara Sensor network for early illness detection in the elderly

Legal Events

Date Code Title Description
STPP Information on status: patent application and granting procedure in general

Free format text: DOCKETED NEW CASE - READY FOR EXAMINATION

STPP Information on status: patent application and granting procedure in general

Free format text: NON FINAL ACTION MAILED

STCB Information on status: application discontinuation

Free format text: ABANDONED -- FAILURE TO RESPOND TO AN OFFICE ACTION