US20210378511A1 - Integrated Sensor Network Methods and Systems - Google Patents
- Publication number: US20210378511A1 (application US 17/409,974)
- Authority: US (United States)
- Prior art keywords: time period, sensor, sensor data, alert, living unit
- Legal status: Abandoned (assumed status; Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed)
Classifications
-
- A—HUMAN NECESSITIES
- A61—MEDICAL OR VETERINARY SCIENCE; HYGIENE
- A61B—DIAGNOSIS; SURGERY; IDENTIFICATION
- A61B5/00—Measuring for diagnostic purposes; Identification of persons
- A61B5/0002—Remote monitoring of patients using telemetry, e.g. transmission of vital signals via a communication network
- A61B5/0015—Remote monitoring of patients using telemetry, e.g. transmission of vital signals via a communication network characterised by features of the telemetry system
- A61B5/002—Monitoring the patient using a local or closed circuit, e.g. in a room or building
-
- A—HUMAN NECESSITIES
- A61—MEDICAL OR VETERINARY SCIENCE; HYGIENE
- A61B—DIAGNOSIS; SURGERY; IDENTIFICATION
- A61B5/00—Measuring for diagnostic purposes; Identification of persons
- A61B5/0002—Remote monitoring of patients using telemetry, e.g. transmission of vital signals via a communication network
- A61B5/0015—Remote monitoring of patients using telemetry, e.g. transmission of vital signals via a communication network characterised by features of the telemetry system
- A61B5/0022—Monitoring a patient using a global network, e.g. telephone networks, internet
-
- A—HUMAN NECESSITIES
- A61—MEDICAL OR VETERINARY SCIENCE; HYGIENE
- A61B—DIAGNOSIS; SURGERY; IDENTIFICATION
- A61B5/00—Measuring for diagnostic purposes; Identification of persons
- A61B5/02—Detecting, measuring or recording pulse, heart rate, blood pressure or blood flow; Combined pulse/heart-rate/blood pressure determination; Evaluating a cardiovascular condition not otherwise provided for, e.g. using combinations of techniques provided for in this group with electrocardiography or electroauscultation; Heart catheters for measuring blood pressure
- A61B5/0205—Simultaneously evaluating both cardiovascular conditions and different types of body conditions, e.g. heart and respiratory condition
-
- A—HUMAN NECESSITIES
- A61—MEDICAL OR VETERINARY SCIENCE; HYGIENE
- A61B—DIAGNOSIS; SURGERY; IDENTIFICATION
- A61B5/00—Measuring for diagnostic purposes; Identification of persons
- A61B5/103—Detecting, measuring or recording devices for testing the shape, pattern, colour, size or movement of the body or parts thereof, for diagnostic purposes
- A61B5/11—Measuring movement of the entire body or parts thereof, e.g. head or hand tremor, mobility of a limb
- A61B5/1113—Local tracking of patients, e.g. in a hospital or private home
-
- A—HUMAN NECESSITIES
- A61—MEDICAL OR VETERINARY SCIENCE; HYGIENE
- A61B—DIAGNOSIS; SURGERY; IDENTIFICATION
- A61B5/00—Measuring for diagnostic purposes; Identification of persons
- A61B5/103—Detecting, measuring or recording devices for testing the shape, pattern, colour, size or movement of the body or parts thereof, for diagnostic purposes
- A61B5/11—Measuring movement of the entire body or parts thereof, e.g. head or hand tremor, mobility of a limb
- A61B5/1118—Determining activity level
-
- G—PHYSICS
- G08—SIGNALLING
- G08B—SIGNALLING OR CALLING SYSTEMS; ORDER TELEGRAPHS; ALARM SYSTEMS
- G08B21/00—Alarms responsive to a single specified undesired or abnormal condition and not otherwise provided for
- G08B21/02—Alarms for ensuring the safety of persons
- G08B21/04—Alarms for ensuring the safety of persons responsive to non-activity, e.g. of elderly persons
- G08B21/0407—Alarms for ensuring the safety of persons responsive to non-activity, e.g. of elderly persons based on behaviour analysis
- G08B21/0423—Alarms for ensuring the safety of persons responsive to non-activity, e.g. of elderly persons based on behaviour analysis detecting deviation from an expected pattern of behaviour or schedule
-
- G—PHYSICS
- G16—INFORMATION AND COMMUNICATION TECHNOLOGY [ICT] SPECIALLY ADAPTED FOR SPECIFIC APPLICATION FIELDS
- G16H—HEALTHCARE INFORMATICS, i.e. INFORMATION AND COMMUNICATION TECHNOLOGY [ICT] SPECIALLY ADAPTED FOR THE HANDLING OR PROCESSING OF MEDICAL OR HEALTHCARE DATA
- G16H50/00—ICT specially adapted for medical diagnosis, medical simulation or medical data mining; ICT specially adapted for detecting, monitoring or modelling epidemics or pandemics
- G16H50/20—ICT specially adapted for medical diagnosis, medical simulation or medical data mining; ICT specially adapted for detecting, monitoring or modelling epidemics or pandemics for computer-aided diagnosis, e.g. based on medical expert systems
-
- G—PHYSICS
- G16—INFORMATION AND COMMUNICATION TECHNOLOGY [ICT] SPECIALLY ADAPTED FOR SPECIFIC APPLICATION FIELDS
- G16Z—INFORMATION AND COMMUNICATION TECHNOLOGY [ICT] SPECIALLY ADAPTED FOR SPECIFIC APPLICATION FIELDS, NOT OTHERWISE PROVIDED FOR
- G16Z99/00—Subject matter not provided for in other main groups of this subclass
-
- A—HUMAN NECESSITIES
- A61—MEDICAL OR VETERINARY SCIENCE; HYGIENE
- A61B—DIAGNOSIS; SURGERY; IDENTIFICATION
- A61B2503/00—Evaluating a particular growth phase or type of persons or animals
- A61B2503/08—Elderly
-
- A—HUMAN NECESSITIES
- A61—MEDICAL OR VETERINARY SCIENCE; HYGIENE
- A61B—DIAGNOSIS; SURGERY; IDENTIFICATION
- A61B2505/00—Evaluating, monitoring or diagnosing in the context of a particular type of medical care
- A61B2505/07—Home care
-
- A—HUMAN NECESSITIES
- A61—MEDICAL OR VETERINARY SCIENCE; HYGIENE
- A61B—DIAGNOSIS; SURGERY; IDENTIFICATION
- A61B2560/00—Constructional details of operational features of apparatus; Accessories for medical measuring apparatus
- A61B2560/02—Operational features
- A61B2560/0242—Operational features adapted to measure environmental factors, e.g. temperature, pollution
-
- A—HUMAN NECESSITIES
- A61—MEDICAL OR VETERINARY SCIENCE; HYGIENE
- A61B—DIAGNOSIS; SURGERY; IDENTIFICATION
- A61B5/00—Measuring for diagnostic purposes; Identification of persons
- A61B5/02—Detecting, measuring or recording pulse, heart rate, blood pressure or blood flow; Combined pulse/heart-rate/blood pressure determination; Evaluating a cardiovascular condition not otherwise provided for, e.g. using combinations of techniques provided for in this group with electrocardiography or electroauscultation; Heart catheters for measuring blood pressure
- A61B5/021—Measuring pressure in heart or blood vessels
-
- A—HUMAN NECESSITIES
- A61—MEDICAL OR VETERINARY SCIENCE; HYGIENE
- A61B—DIAGNOSIS; SURGERY; IDENTIFICATION
- A61B5/00—Measuring for diagnostic purposes; Identification of persons
- A61B5/02—Detecting, measuring or recording pulse, heart rate, blood pressure or blood flow; Combined pulse/heart-rate/blood pressure determination; Evaluating a cardiovascular condition not otherwise provided for, e.g. using combinations of techniques provided for in this group with electrocardiography or electroauscultation; Heart catheters for measuring blood pressure
- A61B5/024—Detecting, measuring or recording pulse rate or heart rate
-
- A—HUMAN NECESSITIES
- A61—MEDICAL OR VETERINARY SCIENCE; HYGIENE
- A61B—DIAGNOSIS; SURGERY; IDENTIFICATION
- A61B5/00—Measuring for diagnostic purposes; Identification of persons
- A61B5/08—Detecting, measuring or recording devices for evaluating the respiratory organs
- A61B5/0816—Measuring devices for examining respiratory frequency
-
- A—HUMAN NECESSITIES
- A61—MEDICAL OR VETERINARY SCIENCE; HYGIENE
- A61B—DIAGNOSIS; SURGERY; IDENTIFICATION
- A61B5/00—Measuring for diagnostic purposes; Identification of persons
- A61B5/72—Signal processing specially adapted for physiological signals or for diagnostic purposes
- A61B5/7235—Details of waveform analysis
- A61B5/7264—Classification of physiological signals or data, e.g. using neural networks, statistical classifiers, expert systems or fuzzy systems
-
- G—PHYSICS
- G16—INFORMATION AND COMMUNICATION TECHNOLOGY [ICT] SPECIALLY ADAPTED FOR SPECIFIC APPLICATION FIELDS
- G16H—HEALTHCARE INFORMATICS, i.e. INFORMATION AND COMMUNICATION TECHNOLOGY [ICT] SPECIALLY ADAPTED FOR THE HANDLING OR PROCESSING OF MEDICAL OR HEALTHCARE DATA
- G16H15/00—ICT specially adapted for medical reports, e.g. generation or transmission thereof
Definitions
- This application relates to methods and systems for sensor networks, and more specifically to methods and systems for integrated sensor networks.
- FIGS. 1 and 2 are block diagrams of example systems, according to example embodiments.
- FIG. 3 is a block diagram of an example operator device that may be deployed within the system of FIG. 1, according to an example embodiment.
- FIG. 4 is a block diagram of an example provider device that may be deployed within the system of FIG. 1, according to an example embodiment.
- FIG. 5 is a block diagram of an example sensor processing subsystem that may be deployed within the operator device of FIG. 3 or the provider device of FIG. 4, according to an example embodiment.
- FIGS. 6-8 are block diagrams of flowcharts illustrating methods for sensor processing, according to example embodiments.
- FIGS. 9 and 10 are block diagrams of flowcharts illustrating methods for display generation, according to example embodiments.
- FIG. 11 is a block diagram of a flowchart illustrating a method for determining dis-similarity of density maps, according to an example embodiment.
- FIG. 12 is a block diagram of a flowchart illustrating a method for performing cluster analysis, according to an example embodiment.
- FIGS. 13-27 are diagrams, according to example embodiments.
- FIG. 28 is a block diagram of a machine in the example form of a computer system within which a set of instructions for causing the machine to perform any one or more of the methodologies discussed herein may be executed.
- Example methods and systems for an integrated sensor network are described.
- Numerous specific details are set forth in order to provide a thorough understanding of example embodiments. It will be evident, however, to one of ordinary skill in the art that embodiments of the invention may be practiced without these specific details.
- One alternative consideration for monitoring older adults includes the use of smart sensor technologies as part of an integrated sensor network that detects activity levels around them and electronically sends the activity data to a central repository.
- data can be accessed and viewed by health care providers, families or others interested in the health of the older person being monitored.
- the integrated sensor network includes simple motion sensors, a stove sensor, video sensors, and a bed sensor that captures sleep restlessness and pulse and respiration levels. Patterns in the sensor data may represent physical and cognitive health conditions. Recognition may be performed when activity patterns begin to deviate from the norm. Performing the recognition may enable early detection of potential problems that may lead to serious health events if left unattended.
- FIG. 1 illustrates an example system 100 in which an integrated sensor network may be used.
- the system 100 is an example platform in which one or more embodiments of the methods may be used.
- the integrated sensor network may also be used on other platforms.
- An operator may use the integrated sensor network by using the operator device 102 .
- the integrated sensor network may be used by a person residing in a living unit.
- the operator device 102 may be located in the living unit, outside of the living unit but in a living unit community, or a location outside of the living unit community. Examples of operators include clinicians, researchers, and the like.
- the operator may use the operator device 102 as a stand-alone device to use the integrated sensor network, or may use the operator device 102 in combination with a provider device 106 available over a network 104 .
- the provider device 106 is also under the control of the operator but at a location outside of the living unit community.
- the operator device 102 may be in a client-server relationship with the provider device 106 , a peer-to-peer relationship with the provider device 106 , or in a different type of relationship with the provider device 106 .
- the client-server relationship may include a thin client on the operator device 102 .
- the client-server relationship may include a thick client on the operator device 102 .
- the network 104 over which the operator device 102 and the provider device 106 may communicate includes, by way of example, a Global System for Mobile Communications (GSM) network, a code division multiple access (CDMA) network, a 3rd Generation Partnership Project (3GPP) network, an Internet Protocol (IP) network, a Wireless Application Protocol (WAP) network, a WiFi network, or an IEEE 802.11 standards network, as well as various combinations thereof.
- In one embodiment, the provider device 106 is a single device. In another embodiment, the provider device 106 may include multiple computer systems. For example, the provider device 106 may include multiple computer systems in a cloud computing configuration.
- sensors 108 forming a sensor network are included in the system 100 to obtain sensor data 112 .
- sensors 108 include motion sensors, a bed sensor, and a stove sensor.
- the multiple sensors 108 are passive, nonwearable sensors.
- the operator device 102 , the provider device 106 , or both may communicate with a database 110 .
- the database 110 may contain sensor data 112 , health data 114 , and generated data 116 .
- the sensor data 112 may be received from the sensors 108 or otherwise accessed (e.g., indirectly accessed by the provider device 106 from the operator device 102).
- the health data 114 includes health related information about people. In general, the health data 114 is for the people associated with a particular doctor, healthcare organization, and/or living unit community.
- the generated data 116 includes information received and stored based on use of the integrated network.
- FIG. 2 illustrates an example system 200 , according to an example embodiment.
- the system 200 is a specific example of the system 100 .
- the sensor data 112 is received by the operator device 102 from the sensors 108 and stored in a database 202 .
- the operator device 102 is a logging device that simply collects the sensor data 112 and does not regularly receive input from the person, the operator, or otherwise.
- the operator device 102 transmits the sensor data 112 to the provider device 106 for storage in the database 110 on a regular basis.
- the sensor data 112 may be transmitted, hourly, daily, weekly, or at other greater or lesser time increments.
- the provider device 106 of the system 200 may include multiple provider devices including client provider devices and server provider devices.
- the operator may communicate with a server provider device through a user interface or otherwise.
- FIG. 3 illustrates an example operator device 102 that may be deployed in the system 100 (see FIG. 1 ), or otherwise deployed in another system.
- the operator device 102 is shown to include a sensor processing subsystem 302 to enable use of the integrated sensor network.
- FIG. 4 illustrates an example provider device 106 that may be deployed in the system 100 (see FIG. 1 ), or otherwise deployed in another system.
- the provider device 106 is shown to include a sensor processing subsystem 302 to enable use of the integrated sensor network.
- In one embodiment, the functionality that enables use of the integrated sensor network resides solely on the sensor processing subsystem 302 deployed in the operator device 102. In another embodiment, the functionality resides solely on the sensor processing subsystem 302 deployed in the provider device 106. In another embodiment, the functionality is partially performed on the sensor processing subsystem 302 deployed in the operator device 102 and partially performed on the sensor processing subsystem 302 deployed in the provider device 106. The functionality may otherwise be distributed among the operator device 102, the provider device 106, or another device.
- FIG. 5 illustrates an example sensor processing subsystem 302 that may be deployed in the operator device 102 , the provider device 106 , or otherwise deployed in another system.
- One or more modules are included in the sensor processing subsystem 302 to process the sensor data 112 .
- the modules of the sensor processing subsystem 302 that may be included are a sensor data module 502, an activity pattern identification module 504, a deviation module 506, an alert module 508, a health data module 510, a parameter calculation module 512, a feedback module 514, a correlation module 516, a change determination module 518, a display generation module 520, a density module 522, and/or a clustering module 524.
- Other modules may also be included.
- the modules may be distributed so that some of the modules may be deployed in the operator device 102 and some of the modules may be deployed in the provider device 106 .
- the sensor processing subsystem 302 includes a processor, memory coupled to the processor, and a number of the aforementioned modules deployed in the memory and executed by the processor.
- the sensor data module 502 accesses the sensor data 112 .
- the sensor data 112 may be associated with motion sensors deployed in a living unit, a bed sensor deployed in the living unit, a stove sensor deployed in the living unit, other environmentally-mounted, nonwearable sensors, or combinations thereof. In general, the sensors 108 are passive, non-wearable sensors.
- the sensor data 112 accessed by the sensor data module 502 may be for a time period.
- the living unit is an apartment. In other embodiments, the living unit is a house.
- the activity pattern identification module 504 identifies an activity pattern for the time period based on at least a portion of the sensor data 112 accessed by the sensor data module 502 .
- the activity pattern represents a physical and cognitive health condition of a person living in the living unit.
- In one embodiment, the activity pattern includes a single feature. In another embodiment, the activity pattern includes multiple features.
- the sensor data module 502 identifies at least a portion of the sensor data 112 associated with the first time period as being associated with the person. Identification of the activity pattern for the first time period is then based on at least the portion of the sensor data 112 associated with both the first time period and the person.
- the sensor data module 502 accesses additional sensor data 112 for an additional time period that occurs after a first time period.
- the deviation module 506 may then determine whether a deviation of the activity pattern of the first time period has occurred for the additional time period.
- the alert module 508 generates an alert based on a determination that the deviation has occurred.
- the alert module 508 transmits the alert. In some embodiments, the alert module 508 stores the alert. The alert module 508 may otherwise use or process the alert.
- event listeners (the observers) register with an event provider associated with the alert module 508 to be notified of sensor events (the changes).
- the event provider may support a filtering operation. That is, a template for the sensor events can be specified so that event listeners are only notified if a sensor event matches the template.
- the alert module 508 provides a cohesive yet flexible mechanism for incorporating different types of alert conditions. State machines may be used by alert providers to model alert specifications. As sensor events are observed, an alert model associated with the alert module 508 may transition to a new state and, if warranted, will generate an alert condition.
- Timers may be included for state transitions.
- the state machine generalization supports simple one-sensor alerts as well as alerts that involve more complex interactions among multiple sensors.
- the alert module 508 easily accepts inputs from multiple sources. Sensor events may be replayed from the database 110 through the use of the generated data 116 , to facilitate testing of alert algorithms. Alerts may be sent to different output streams, including a pager system for immediate alerts as well as emailed alerts for daily summaries.
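The observer pattern described above, with template-based filtering of sensor events, might be sketched as follows. This is an illustrative assumption about one possible implementation; the class names, fields, and template format are invented here, not taken from the patent:

```python
from dataclasses import dataclass

@dataclass
class SensorEvent:
    """A minimal sensor event; fields are illustrative."""
    sensor_id: str
    kind: str        # e.g. "motion", "bed", "stove"
    value: float

class EventProvider:
    """Notifies registered listeners of sensor events that match their template."""

    def __init__(self):
        self._listeners = []  # list of (template, callback) pairs

    def register(self, callback, template=None):
        # template: dict of attribute -> required value; None matches all events
        self._listeners.append((template or {}, callback))

    def publish(self, event):
        # Notify only listeners whose template matches every specified attribute.
        for template, callback in self._listeners:
            if all(getattr(event, k) == v for k, v in template.items()):
                callback(event)
```

A listener registered with a template such as `{"kind": "stove"}` would then be notified only of stove events, while a listener registered without a template would see every event.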
- the activity pattern identification module 504 identifies the activity pattern for the second time period based on access of the additional sensor data 112 associated with the second time period.
- the determination of whether the deviation has occurred by the deviation module 506 may then include determining whether the deviation of the activity pattern of the second time period from the activity pattern of the first time period exceeds a threshold.
- the health data module 510 analyzes health data associated with the person. Generation of the alert by the alert module 508 may be based on whether the deviation of the activity pattern of the second time period from the activity pattern of the first time period exceeds the threshold and on analysis of the health data 114.
- the parameter calculation module 512 calculates statistical parameters of at least a portion of the sensor data 112 for the time period. A determination of whether the deviation has occurred by the deviation module 506 may then include determining whether at least a portion of the additional sensor data 112 for the additional time period is outside of a threshold based on the statistical parameters.
- the alert generated by the alert module 508 may be a hits-based alert.
- the activity pattern for the time period is based on total number of sensor hits of a sensor 108 during a day of the time period.
- the alert generated by the alert module 508 may be a time-based alert.
- the activity pattern for the time period is based on total time that the sensor 108 fired during a particular day of the time period.
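A hits-based alert of the kind described above could be sketched as follows, assuming (as one plausible reading) that the statistical parameters are the mean and standard deviation of daily hit totals over a baseline period and that the threshold is a multiple `k` of the standard deviation; the function name and `k` default are illustrative:

```python
import statistics

def hits_alert(baseline_daily_hits, new_day_hits, k=2.0):
    """Flag a day whose total sensor hits fall outside mean +/- k * stddev
    of the baseline period's daily totals."""
    mean = statistics.mean(baseline_daily_hits)
    std = statistics.pstdev(baseline_daily_hits)
    return abs(new_day_hits - mean) > k * std
```

A time-based alert would be structurally identical, with total firing time per day substituted for the hit counts.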
- the alert module 508 transmits the alert including a link to a web interface.
- the web interface includes the sensor data 112 of the second time period in the context of the sensor data 112 of the first time period.
- the feedback module 514 may be deployed in the sensor processing subsystem 302 to receive and process feedback, requests, selections, or the like.
- the feedback module 514 receives a feedback response to the alert.
- the feedback response includes feedback regarding clinical relevance of the alert.
- the feedback module 514 may then take action based on receipt of the feedback response.
- the action includes adjusting the threshold based on the receipt of the feedback response. In one embodiment, the action includes recording ignored indicia for the person based on the receipt of the feedback response. The ignored indicia may be associated with a feature of the alert.
- the sensor processing subsystem 302 includes the correlation module 516 and the change determination module 518 to predict changes in a health condition.
- the health condition may be a physical condition, a mental condition, or a physical and a mental condition.
- the health condition is pulse pressure. Pulse pressure may be the difference between systolic blood pressure (SBP) and the diastolic blood pressure (DBP).
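The pulse pressure definition above is a simple difference, which can be expressed directly:

```python
def pulse_pressure(sbp, dbp):
    """Pulse pressure as defined above: systolic minus diastolic blood
    pressure, typically in mmHg."""
    return sbp - dbp
```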
- the sensor data module 502 accesses the sensor data 112 associated with a person and the health data module 510 accesses the health data 114 of the person for a first time period.
- the correlation module 516 then correlates the health data to at least a portion of the sensor data 112 for the first time period.
- the sensor data module 502 accesses additional sensor data 112 for a second time period.
- the change determination module 518 determines whether a change in a health condition of the person has occurred based on the additional sensor data 112 and correlation of the health condition data to at least the portion of the sensor data 112 for the first time period.
- the alert module 508 may generate an alert when a determination is made that the change in the health condition has occurred.
- the display generation module 520 generates a display.
- the alert module 508 generates the alert and the display generation module 520 generates a display based on the alert.
- the sensor data module 502 accesses the sensor data 112 and the display generation module 520 generates a display based on the sensor data 112 .
- the sensor data 112 is grouped on the display based on multiple categories. The categories may include, by way of example, motion, pulse, breathing, and bed restlessness.
- the feedback module 514 receives a selection of a person and a date range.
- the sensor data module 502 may then access the sensor data 112 based on receipt of the selection.
- a user may interface with the sensor processing subsystem 302 to zoom in or zoom out on the display.
- the feedback module 514 receives a time interval modification request.
- the display generation module 520 may then generate a display based on access of the sensor data 112 associated with the time period and receipt of the time interval modification request.
- the feedback module 514 receives a time increment modification request.
- the display generation module 520 may then generate the display based on access of the sensor data 112 associated with the time period and receipt of the time increment modification request.
- the density module 522 determines an away-from-home time period for a person associated with the living unit during the time period.
- the generation of the display by the display generation module 520 then generates the display based on access of the sensor data 112 and a determination of the away-from-home time period.
- a determination of the away-from-home time period by the density module 522 includes analyzing the sensor data 112 to determine whether a living unit departure sensor sequence and a living unit return sensor sequence have occurred and calculating a time difference between occurrence of the living unit departure sensor sequence and occurrence of the living unit return sensor sequence.
- analyzing the sensor data 112 includes applying fuzzy logic to at least a portion of the sensor data 112 to determine whether a living unit departure sensor sequence and a living unit return sensor sequence have occurred.
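The departure/return sequence matching and time-difference calculation described above might be sketched like this. The sequence definitions, event format, and in-order (non-contiguous) matching strategy are assumptions made for illustration only:

```python
def away_duration(events, departure_seq, return_seq):
    """events: list of (timestamp, sensor_id) pairs sorted by time.
    Returns the time between the end of the departure sequence and the
    end of the return sequence, or None if either sequence is absent."""

    def find_seq(seq, start):
        # Match the sensors of seq in order (not necessarily adjacent),
        # returning the timestamp of the last matched event and the next index.
        i = start
        for sensor in seq:
            while i < len(events) and events[i][1] != sensor:
                i += 1
            if i == len(events):
                return None, None
            matched_at = i
            i += 1
        return events[matched_at][0], i

    depart_time, next_i = find_seq(departure_seq, 0)
    if depart_time is None:
        return None
    return_time, _ = find_seq(return_seq, next_i)
    if return_time is None:
        return None
    return return_time - depart_time
```

A fuzzy-logic variant, as the patent mentions, would replace the exact sensor matching with graded membership scores, but the time-difference step would be the same.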
- the density module 522 computes a number of motion sensor hits for multiple hours. A sensor hit is associated with a motion sensor. The density module 522 may then calculate density for the multiple hours. The generation of the display by the display generation module 520 may then be based on calculation of the density.
- the display generation module 520 selects color mappings and then generates the display based on the selection of color mappings.
- a color mapping has a color based on the density and is associated with a position on a display based on the hour and day.
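The hour-by-day density computation and color mapping described above might look like the following sketch; the normalization scheme, bucket boundaries, and palette are illustrative assumptions:

```python
def density_map(hits, num_days):
    """hits: list of (day_index, hour) tuples, one per motion-sensor firing.
    Returns a num_days x 24 grid of densities normalized to [0, 1]."""
    counts = [[0] * 24 for _ in range(num_days)]
    for day, hour in hits:
        counts[day][hour] += 1
    peak = max((c for row in counts for c in row), default=0) or 1
    return [[c / peak for c in row] for row in counts]

def color_for(density):
    """Map a density in [0, 1] to a color bucket (low -> high activity)."""
    palette = ["white", "yellow", "orange", "red"]
    return palette[min(int(density * len(palette)), len(palette) - 1)]
```

Each cell's color is then placed on the display at the position given by its hour and day, producing the density map the display generation module 520 renders.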
- Dis-similarity between density maps may be computed by use of the density module 522 .
- the density module 522 accesses a first density map and a second density map, the first density map having first color mappings and the second density map having second color mappings. The density module 522 computes a dis-similarity between the first density map and the second density map based on a textural feature of each and generates a computational result based on computing the dis-similarity.
- Textural features may include, by way of example, spatial, frequency, and perceptual properties.
- the display generation module 520 may then generate a display based on computation of the dis-similarity.
- the density module 522 may transmit a notification based on computation of the dis-similarity, store the computational result, or both.
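The dis-similarity computation could be reduced to a distance between feature vectors summarizing each density map. The assumption that each map is summarized by texture statistics (e.g. contrast, energy, entropy) and the choice of Euclidean distance are illustrative stand-ins for whatever similarity measure an implementation actually uses:

```python
import math

def dissimilarity(features_a, features_b):
    """Euclidean distance between two texture-feature vectors, used here
    as a simple dis-similarity score between two density maps."""
    return math.sqrt(sum((a - b) ** 2 for a, b in zip(features_a, features_b)))
```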
- Clustering may be performed by the clustering module 524 to analyze the sensor data 112 based on clusters.
- the clustering module 524 generates feature clusters for a time period.
- a feature cluster is associated with multiple feature vectors, wherein a feature vector is associated with the sensor data 112 from at least some of the motion sensors and/or a bed sensor.
- the sensor data module 502 accesses additional sensor data 112 associated with a feature for a different time period.
- the clustering module 524 may then determine whether the additional sensor data 112 falls within the feature clusters or belongs in a new cluster.
- Based on a result of the determination, the clustering module 524 generates a notification.
- the notification may be a cluster addition notification based on a determination that the additional sensor data 112 falls within the feature clusters.
- the notification may be a new cluster notification based on a determination that the additional sensor data 112 belongs in the new cluster.
- FIG. 6 illustrates a method 600 for sensor processing according to an example embodiment.
- the method 600 may be performed by the operator device 102 or the processor device 106 of the system 100 (see FIG. 1 ), or may be otherwise performed.
- the sensor data 112 is accessed from the motion sensors and the bed sensor deployed in a living unit for a first time period.
- the deployed sensors are passive, non-wearable sensors.
- At least a portion of the sensor data 112 associated with the first time period may be identified as being associated with the person at block 604 .
- An activity pattern is identified for the first time period at block 606 based on at least a portion of sensor data 112 associated with the first time period.
- the activity pattern represents a physical and cognitive health condition of a person residing in the living unit.
- identification of the activity pattern for the first time period is based on at least the portion of sensor data 112 associated with the first time period associated with the person.
- additional sensor data 112 is accessed from the motion sensors and the bed sensor deployed in the living unit for a second time period.
- the second time period occurs after the first time period.
- the first time period has the same time duration as the second time period.
- the first time period is for a period of fourteen consecutive days and the second time period is for a period of a single day. Different periods of time may be used for the first time period and the second time period.
- the operations performed at block 602 include accessing the sensor data 112 from a stove sensor deployed in the living unit and the operations performed at block 608 include accessing the additional sensor data 112 from the stove sensor deployed in the living unit.
- a determination of whether a deviation of the activity pattern of the first time period has occurred for the second time period is performed at block 610 .
- the activity pattern includes multiple features and the deviation is associated with a feature of the multiple features.
- the activity pattern for the second time period is identified based on access of the additional sensor data 112 associated with the second time period.
- the determination performed at block 610 may then include determining whether the deviation of the activity pattern of the second time period from the activity pattern of the first time period exceeds a threshold.
- the health data 114 associated with the person may be analyzed at block 612 , while an alert is generated at block 614 .
- the alert is generated based on a determination that the deviation has occurred.
- the alert is generated based on a determination that the deviation of the activity pattern of the second time period from the activity pattern of the first time period exceeds the threshold and on analysis of the health data 114 .
- the alert is transmitted, while in some embodiments the alert is stored.
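The deviation test of blocks 610-614 can be illustrated with a minimal sketch. The function name, the choice of a z-score rule, and the factor of two standard deviations are all illustrative assumptions; the description leaves the exact deviation test unspecified.

```python
def detect_deviation(baseline_values, current_value, threshold_factor=2.0):
    # Statistics of the activity feature over the first time period
    # (e.g., fourteen daily sensor-hit totals)
    n = len(baseline_values)
    mean = sum(baseline_values) / n
    variance = sum((v - mean) ** 2 for v in baseline_values) / n
    std = variance ** 0.5
    # A deviation has occurred when the second period's value falls
    # outside threshold_factor standard deviations of the baseline mean
    return abs(current_value - mean) > threshold_factor * std

# Fourteen consecutive days (first time period) vs. a single day (second)
baseline = [120, 132, 128, 115, 140, 125, 130, 122, 135, 127, 118, 133, 129, 124]
assert detect_deviation(baseline, 40)       # sharp drop in activity -> alert
assert not detect_deviation(baseline, 126)  # typical day -> no alert
```

In this sketch, the alert of block 614 would be generated whenever the function returns true and any configured health-data checks also pass.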
- FIG. 7 illustrates a method 700 for sensor processing according to an example embodiment.
- the method 700 may be performed by the operator device 102 or the processor device 106 of the system 100 (see FIG. 1 ), or may be otherwise performed.
- the sensor data 112 is accessed from motion sensors and a bed sensor deployed in a living unit for a first time period.
- An activity pattern for the first time period is identified at block 704 based on at least a portion of the sensor data 112 associated with the first time period.
- the activity pattern represents a physical and cognitive health condition of a person residing in the living unit.
- the activity pattern includes a single feature. In another embodiment, the activity pattern includes multiple features.
- additional sensor data 112 is accessed from the motion sensors and the bed sensor deployed in the living unit for a second time period.
- the second time period occurs after the first time period.
- Statistical parameters of at least a portion of the sensor data 112 for the first time period are calculated at block 708 .
- a determination of whether a deviation of the activity pattern of the first time period has occurred for the second time period is performed at block 710 .
- the determination includes determining whether at least a portion of the additional sensor data 112 for the second time period is outside of a threshold.
- the threshold is based on the statistical parameters.
- An alert is generated at block 712 based on a determination that the deviation has occurred.
- the alert is a hits-based alert.
- the activity pattern for the first time period may then be based on total number of sensor hits of a particular sensor 108 during a particular day of the first time period.
- the alert is a time-based alert.
- the activity pattern for the first time period may then be based on total time that a particular sensor 108 fired during a particular day of the first time period.
- the alert generated may be adapted or customized based on received feedback.
- the alert including a link to a web interface may be transmitted at block 714 .
- the web interface may include the sensor data 112 of the second time period in the context of the sensor data 112 of the first time period.
- a feedback response may be received to the alert at block 716 .
- the feedback response includes feedback regarding clinical relevance of the alert.
- An action may be taken at block 718 based on receipt of the feedback response.
- taking the action may include adjusting the threshold based on the receipt of the feedback response.
- taking the action may include recording ignored indicia for the person based on the receipt of the feedback response. The ignored indicia may be associated with a feature of the alert.
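One way to take the action of block 718 is to nudge the alerting threshold using the clinical-relevance rating from the feedback response. The five-point rating scale, the step size, and the lower bound are illustrative assumptions, not details from the description.

```python
def adjust_threshold(threshold, rating, step=0.25):
    """Adapt the alert threshold from a clinical-relevance rating (1-5).

    Low ratings loosen the threshold so fewer similar alerts fire; high
    ratings tighten it so the feature is watched more sensitively.
    """
    if rating <= 2:                     # alert judged not clinically relevant
        return threshold + step
    if rating >= 4:                     # alert judged clinically relevant
        return max(0.5, threshold - step)
    return threshold                    # neutral rating: leave unchanged

assert adjust_threshold(2.0, 1) == 2.25
assert adjust_threshold(2.0, 5) == 1.75
assert adjust_threshold(2.0, 3) == 2.0
```

Recording ignored indicia could be handled similarly, e.g. by suppressing future alerts for a feature whose ratings stay low.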
- FIG. 8 illustrates a method 800 for sensor processing according to an example embodiment.
- the method 800 may be performed by the operator device 102 or the processor device 106 of the system 100 (see FIG. 1 ), or may be otherwise performed.
- the health data 114 of a person for a first time period is accessed at block 802 .
- the sensor data 112 from motion sensors and a bed sensor deployed in a living unit for the first time period is accessed at block 804 .
- the health data is correlated to at least a portion of the sensor data 112 for the first time period at block 806 .
- Additional sensor data 112 is accessed at block 808 from the motion sensors and the bed sensor deployed in the living unit for a second time period.
- the second time period generally occurs after the first time period.
- a determination of whether a change in a health condition of the person has occurred is made based on the additional sensor data 112 and correlation of the health data 114 to at least the portion of the sensor data 112 for the first time period.
- An alert may be generated at block 812 when a determination is made that the change in the health condition has occurred.
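The correlation of block 806 can be sketched with a plain Pearson coefficient between a health measure and a sensor feature over the first time period. The choice of Pearson correlation and the example series are assumptions; the description does not specify the correlation technique.

```python
def pearson(xs, ys):
    """Pearson correlation between a health measure and a sensor feature
    sampled over the same first time period."""
    n = len(xs)
    mx, my = sum(xs) / n, sum(ys) / n
    cov = sum((x - mx) * (y - my) for x, y in zip(xs, ys))
    sx = sum((x - mx) ** 2 for x in xs) ** 0.5
    sy = sum((y - my) ** 2 for y in ys) ** 0.5
    return cov / (sx * sy)

# Hypothetical weekly values for one resident over the first time period
pulse_pressure = [40, 42, 45, 47, 50]   # from health data 114
restlessness   = [10, 12, 15, 16, 19]   # from bed sensor data 112
r = pearson(pulse_pressure, restlessness)
assert r > 0.95  # strongly correlated in this example
```

A strong learned correlation would then let the block 810 determination flag a health-condition change when second-period sensor data implies a health value far from what the correlation predicts.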
- FIG. 9 illustrates a method 900 for display generation according to an example embodiment.
- the method 900 may be performed by the operator device 102 or the processor device 106 of the system 100 (see FIG. 1 ), or may be otherwise performed.
- a selection of a person and/or a date range may be received at block 902 .
- the sensor data 112 is accessed from motion sensors and a bed sensor deployed in a living unit for a time period at block 904 .
- the access of the sensor data 112 from the motion sensors and the bed sensor for the time period is based on receipt of the selection.
- a request may be received at block 906 .
- the request is a time interval modification request. In some embodiments, the request is a time increment modification request.
- a display is generated at block 908 based on access of the sensor data 112 associated with the time period.
- the sensor data 112 is grouped on the display based on multiple categories.
- the categories may include motion, pulse, breathing, and restlessness.
- generation of the display is based on access of the sensor data 112 associated with the time period and receipt of the time interval modification request. In some embodiments, generation of the display is based on access of the sensor data 112 associated with the time period and receipt of the time increment modification request.
- FIG. 10 illustrates a method 1000 for display generation according to an example embodiment.
- the method 1000 may be performed by the operator device 102 or the processor device 106 of the system 100 (see FIG. 1 ), or may be otherwise performed.
- a selection of a person and/or a date range may be received at block 1002 .
- the sensor data 112 is accessed from motion sensors and a bed sensor deployed in a living unit for a time period at block 1004 .
- the access of the sensor data 112 from the motion sensors and the bed sensor for the time period is based on receipt of the selection.
- a determination of an away-from-home time period for a person associated with the living unit during the time period is made at block 1006 .
- the determination of the away-from-home time period includes analyzing the sensor data 112 to determine whether a living unit departure sensor sequence and a living unit return sensor sequence have occurred and calculating a time difference between occurrence of the living unit departure sensor sequence and occurrence of the living unit return sensor sequence. The away-from-home time period may then be based on the time difference.
- analyzing the sensor data 112 includes applying fuzzy logic to at least a portion of the sensor data 112 to determine whether a living unit departure sensor sequence and a living unit return sensor sequence have occurred.
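The departure/return analysis of block 1006 can be sketched as follows. The sensor names and the crisp quiet-period heuristic (a simple stand-in for the fuzzy-logic analysis named above) are illustrative assumptions.

```python
from datetime import datetime, timedelta

def away_time(events, quiet=timedelta(minutes=30)):
    """Estimate the away-from-home time period from a chronological list
    of (sensor_id, timestamp) events.

    A 'front_door' hit followed by at least `quiet` with no interior
    motion is treated as the departure sensor sequence; the next interior
    event is treated as the return sensor sequence. The away-from-home
    period is the time difference between the two.
    """
    for i, (sensor, t) in enumerate(events):
        if sensor != "front_door":
            continue
        interior = [(s, u) for s, u in events[i + 1:] if s != "front_door"]
        if not interior:
            return None                      # no return observed
        _, return_t = interior[0]
        if return_t - t >= quiet:
            return return_t - t              # time difference = away period
    return None

events = [
    ("kitchen",     datetime(2010, 6, 1, 8, 0)),
    ("front_door",  datetime(2010, 6, 1, 9, 0)),
    ("living_room", datetime(2010, 6, 1, 11, 30)),
]
assert away_time(events) == timedelta(hours=2, minutes=30)
```

A fuzzy-logic version would replace the fixed `quiet` cutoff with graded membership in "departure-like" and "return-like" sequences.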
- a number of motion sensor hits for multiple hours of the time period may be computed at block 1008 .
- a single motion sensor hit is associated with a single motion sensor of the plurality of motion sensors.
- a density for the hours may be calculated at block 1010 .
- the density for an hour may be based on the number of motion sensor hits during the hour and the determination of the away-from-home time period.
- a display is generated at block 1012 based on access of the sensor data 112 associated with the time period and a determination of the away-from-home time period. In some embodiments, generation of the display is based on calculation of the density.
- generation of the display includes selecting color mappings and generating the display based on selection of the color mappings.
- a color mapping has a color based on the density and is associated with a position based on an hour of a day.
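Blocks 1008-1012 can be sketched together: count hits per hour, convert the counts to densities that respect the away-from-home determination, and map each density to a display color. The density formula, the color palette, and the bucket thresholds are all illustrative assumptions.

```python
def hourly_density(hit_counts, away_hours):
    """Density per hour of one day: the number of motion sensor hits in
    the hour, with hours inside the away-from-home period excluded
    (represented here as None)."""
    return {h: (None if h in away_hours else hit_counts.get(h, 0))
            for h in range(24)}

def color_for(density, palette=("white", "yellow", "orange", "red")):
    """Select a color mapping for a density at an (hour, day) position."""
    if density is None:
        return "gray"                   # resident away from home
    for i, bound in enumerate([0, 10, 30]):  # illustrative bucket bounds
        if density <= bound:
            return palette[i]
    return palette[-1]

d = hourly_density({8: 25, 14: 45}, away_hours={10, 11})
assert d[10] is None and d[8] == 25
assert color_for(d[8]) == "orange" and color_for(d[14]) == "red"
```

Repeating this for each day of the time period yields the hour-by-day density map rendered at block 1012.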
- FIG. 11 illustrates a method 1100 for determining dis-similarity of density maps according to an example embodiment.
- the method 1100 may be performed by the operator device 102 or the processor device 106 of the system 100 (see FIG. 1 ), or may be otherwise performed.
- a dis-similarity measure based on texture features may be used for comparing density maps and automatically determining changes in activity patterns.
- the dis-similarity between two density maps may be computed to aid caregivers in evaluating changes of residents.
- the texture features may be used to evaluate the dis-similarity of density maps by capturing spatial, frequency, and perceptual properties such as periodicity, coarseness, and complexity. Texture features may be extracted using the co-occurrence distribution (e.g., the gray-level co-occurrence statistical method using the density values directly).
- the density maps need not have the color mapping to determine the dis-similarity.
- a first density map and a second density map are accessed at block 1102 .
- the first density map has first color mappings.
- the second density map has second color mappings.
- a color mapping has a color based on density and is associated with a position based on an hour of a day.
- the density is based on a number of motion sensor hits during the hour and a determination of the away-from-home time period.
- a dis-similarity between the first density map and the second density map is computed at block 1104 based on a texture feature of the first density map and the second density map.
- texture features include spatial, frequency, and perceptual properties.
- the computation may be performed based on a single texture feature or multiple texture features.
- An angular second moment feature may measure homogeneity of the image.
- the contrast feature may measure the amount of local variations in an image.
- the inverse difference moment may also measure image homogeneity.
- Entropy may measure the disorder.
- Other non-textual features may also be used to discriminate the dis-similarity of density maps. For example, average motion density per hour and average time away from the living unit per day may be used during the computation performed at block 1104 .
- the dis-similarity of two different density maps is represented by a number that is computed in feature space as the distance from one map to another.
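The texture-feature computation above can be sketched in miniature: build a co-occurrence distribution over the density values directly, extract the four named features (angular second moment, contrast, inverse difference moment, entropy), and measure dis-similarity as a distance in feature space. Using only horizontal neighbor pairs and unweighted Euclidean distance are simplifying assumptions.

```python
import math
from collections import Counter

def glcm_features(grid):
    """Extract [ASM, contrast, IDM, entropy] from the horizontal
    co-occurrence distribution of a small integer-valued density map."""
    pairs = Counter()
    for row in grid:
        for a, b in zip(row, row[1:]):       # horizontally adjacent cells
            pairs[(a, b)] += 1
    total = sum(pairs.values())
    asm = contrast = idm = entropy = 0.0
    for (a, b), count in pairs.items():
        p = count / total
        asm += p * p                          # homogeneity of the map
        contrast += (a - b) ** 2 * p          # local variation
        idm += p / (1 + (a - b) ** 2)         # inverse difference moment
        entropy -= p * math.log(p)            # disorder
    return [asm, contrast, idm, entropy]

def dissimilarity(map_a, map_b):
    """Dis-similarity as Euclidean distance between maps in feature space."""
    fa, fb = glcm_features(map_a), glcm_features(map_b)
    return math.sqrt(sum((x - y) ** 2 for x, y in zip(fa, fb)))

flat = [[1, 1, 1, 1]] * 3                            # sedentary, even pattern
busy = [[0, 3, 0, 3], [3, 0, 3, 0], [0, 3, 0, 3]]    # high-contrast pattern
assert dissimilarity(flat, flat) == 0.0
assert dissimilarity(flat, busy) > 1.0
```

Non-texture features such as average motion density per hour could be appended to the feature vector before taking the distance.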
- a computational result is generated at block 1106 based on computing the dis-similarity.
- a display may be generated at block 1108 based on computation of the dis-similarity.
- a notification based on computation of the dis-similarity may be transmitted.
- the computational result may be stored.
- FIG. 12 illustrates a method 1200 for performing cluster analysis according to an example embodiment.
- the method 1200 may be performed by the operator device 102 or the processor device 106 of the system 100 (see FIG. 1 ), or may be otherwise performed.
- Feature clusters are generated for a time period at block 1202 .
- the time period includes multiple days.
- a feature cluster is associated with multiple feature vectors.
- a feature vector is associated with the sensor data 112 from at least some of the motion sensors and/or a bed sensor deployed in a living unit.
- Additional sensor data 112 associated with a particular feature for a different time period is accessed at block 1204 .
- a determination of whether the additional sensor data 112 falls within the feature clusters or belongs in a new cluster is made at block 1206 .
- a notification is generated at block 1208 based on a result of a determination.
- a cluster addition notification is generated based on a determination that the additional sensor data 112 falls within the feature clusters. In some embodiments, a new cluster notification is generated based on a determination that the additional sensor data 112 belongs in the new cluster.
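The block 1206 membership decision can be sketched with a nearest-centroid test. Representing each feature cluster by a centroid and a fixed radius is an illustrative assumption; the description does not name the clustering algorithm.

```python
import math

def nearest_cluster(centroids, vector, radius):
    """Decide whether a new feature vector falls within an existing
    feature cluster (within `radius` of some centroid) or belongs in a
    new cluster, returning the notification type of block 1208."""
    dists = [math.dist(c, vector) for c in centroids]
    best = min(range(len(dists)), key=dists.__getitem__)
    if dists[best] <= radius:
        return ("cluster_addition", best)
    return ("new_cluster", None)

# Two clusters of daily (motion hits, bed restlessness) feature vectors
centroids = [(120.0, 5.0), (60.0, 20.0)]
assert nearest_cluster(centroids, (118.0, 6.0), radius=10.0)[0] == "cluster_addition"
assert nearest_cluster(centroids, (10.0, 50.0), radius=10.0)[0] == "new_cluster"
```

A "new_cluster" result would trigger the new cluster notification, flagging a day whose activity pattern resembles no previously observed pattern.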
- FIG. 13 is a diagram 1300 of a user interface, according to an example embodiment.
- the user interface shows motion sensor data for multiple sensors over a period of fourteen days.
- FIG. 14 is a diagram 1400 of a user interface, according to an example embodiment.
- the user interface shows motion sensor data over a period of twenty-eight days.
- the diagram 1400 is a “zoomed out” version of the diagram 1300 (see FIG. 13 ).
- FIG. 15 is a diagram 1500 of a user interface, according to an example embodiment.
- the user interface shows motion sensor data over a period of a day.
- the diagram 1500 is a “zoomed in” version of the diagram 1300 (see FIG. 13 ).
- FIG. 16 is a diagram 1600 of a user interface, according to an example embodiment.
- the user interface shows motion sensor data for a single sensor over a period of fourteen days.
- FIG. 17 is a diagram 1700 of an example alert, according to an example embodiment.
- the alert shown in the diagram 1700 may be transmitted as an e-mail or otherwise transmitted.
- the alert is shown to include links to a user interface associated with sensors.
- the links included in the diagram are a link to a bathroom sensor, a kitchen sensor, and a living room sensor.
- links are also included to feedback web pages to capture a user's rating of the significance of the alert.
- FIG. 18 is a diagram 1800 of a user interface, according to an example embodiment.
- the diagram 1800 shows a user interface that may be presented based on selection of a link included in an alert of the diagram 1700 .
- As shown in the diagram 1800 , selections for a resident ID, a time period (including starting date, starting hour, ending date, and ending hour), a time interval, and an increment may be available for customization.
- the operator may modify default selections and then press a submit button.
- FIG. 19 is a diagram 1900 of a user interface, according to an example embodiment.
- the diagram 1900 shows sensor firing data for a fourteen day period.
- the diagram 1900 may be presented based on selection of a submit button from the diagram 1800 .
- FIG. 20 is a diagram 2000 of a user interface, according to an example embodiment.
- the diagram enables an operator to provide alert feedback.
- the operator may include a rating of the significance of the alert, thoughts about the alert (e.g., not enough of a change and not a good parameter), and comments through the user interface. Other or different feedback may be collected.
- the operator may also designate the perspective (e.g., classification) of the operator submitting the feedback.
- the user interface shown in the diagram 2000 may be used to provide adaptive, customizable alerts by adjusting the sensor parameters and thresholds, based on the alert feedback ratings.
- FIGS. 21-23 are diagrams 2100 - 2300 of density maps, according to an example embodiment. While the diagrams 2100 - 2300 are shown in this document in black and white, the displays associated with the diagrams 2100 - 2300 are typically generated in color based on color mappings.
- the diagram 2100 is a density map of a person with a sedentary lifestyle pattern for one month.
- the diagram 2200 is a density map of a person with an active lifestyle pattern for one month.
- the diagram 2300 is a density map of a person with an irregular lifestyle pattern showing a cognitive problem for one month.
- health care providers in some embodiments may identify a typical pattern of activity for an individual and watch for changes in the pattern.
- FIG. 24 is a diagram 2400 of a floor plan of a living unit, according to an example embodiment.
- the diagram 2400 shows example locations of motion sensors, a bed sensor, and a stove sensor in the living unit.
- the motion sensors may detect presence in a particular room as well as specific activities. For example, a motion sensor installed on the ceiling above the shower detects showering activity; motion sensors installed discretely in cabinets and the refrigerator detect kitchen activity. For convenience, a motion sensor may also be installed on the ceiling above the door of the living unit, to detect movement in and out of the doorway (e.g., for living unit exits).
- the motion sensors in some embodiments, are commercially available passive infrared (PIR) sensors which transmit using the wireless X10 protocol. Other types of sensors may be used.
- the sensors detect movement of warm bodies and transmit an event about every 7 seconds when movement is still detected. This artifact is useful for capturing a general lifestyle pattern; for example, a sedentary pattern will result in a smaller number of sensor events over time compared to a more active “puttering” pattern.
- the bed sensor may be a transducer which detects presence in the bed, pulse and respiration rates, and bed restlessness. Pulse and respiration rates may be reported as low, normal, and high, based on thresholds, or pulse and respiration rates may be reported as numerical rates. In some embodiments, bed restlessness is reported based on the persistence of movement in the bed. All of the output of the bed sensor may contribute to the general pattern of the resident.
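The threshold-based reporting described for the bed sensor can be sketched as a simple classifier. The numeric thresholds below are illustrative placeholders, not clinical values from the description.

```python
def classify_rate(value, low, high):
    """Report a pulse or respiration rate as low, normal, or high based
    on per-measure thresholds, as the bed sensor may do."""
    if value < low:
        return "low"
    if value > high:
        return "high"
    return "normal"

assert classify_rate(48, low=50, high=100) == "low"     # pulse, beats/min
assert classify_rate(72, low=50, high=100) == "normal"
assert classify_rate(26, low=10, high=24) == "high"     # respiration, breaths/min
```

An embodiment reporting numerical rates would simply skip this classification and log the raw values.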
- the stove sensor may detect motion in the kitchen as well as the temperature of the stove/oven unit. This may be performed through a modified X10 PIR motion sensor. When a high temperature is detected, a “stove on” event may be generated. When the temperature drops below a threshold again, a “stove off” event may be generated. This sensor is included so that an alert could be generated if the stove is left on and there is no indication of someone in the kitchen for a specified period of time.
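The stove-left-on condition above reduces to a small predicate: the stove is on and no kitchen presence has been seen for the specified period. The 30-minute default and the argument names are illustrative assumptions.

```python
from datetime import datetime, timedelta

def stove_alert(stove_is_on, last_kitchen_motion, now,
                unattended=timedelta(minutes=30)):
    """Alert when a 'stove on' event is active and there has been no
    indication of someone in the kitchen for the unattended period."""
    return stove_is_on and (now - last_kitchen_motion >= unattended)

now = datetime(2010, 6, 1, 12, 0)
assert stove_alert(True, datetime(2010, 6, 1, 11, 15), now)        # 45 min unattended
assert not stove_alert(False, datetime(2010, 6, 1, 11, 15), now)   # stove is off
assert not stove_alert(True, datetime(2010, 6, 1, 11, 55), now)    # someone just there
```

The "stove on"/"stove off" temperature events would flip `stove_is_on`, while kitchen motion events refresh `last_kitchen_motion`.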
- all of the sensor data 112 for the person is transmitted wirelessly via the X10 protocol to a data monitor PC which is located in the living unit of the person.
- the data monitor may add a date-time stamp for each sensor event and may log it as the sensor data into a file that is periodically sent to a dedicated central server which stores the data in a relational database.
- the data monitors may be connected to the central server through a dedicated local network, for security purposes.
- identifiers may be stripped from the data before transmission.
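The data monitor's role above can be sketched in a few lines: stamp each sensor event with a date-time and append it to the log file that is periodically shipped to the central server. The record format and the injectable `clock` parameter are assumptions for illustration.

```python
from datetime import datetime

def log_event(sensor_event, log_file, clock=datetime.now):
    """Sketch of the data monitor: add a date-time stamp to a sensor
    event and append it to the log file destined for the central server.
    (Identifiers would be stripped before transmission.)"""
    stamp = clock().isoformat(timespec="seconds")
    log_file.write(f"{stamp},{sensor_event}\n")

import io
buf = io.StringIO()
log_event("kitchen_motion", buf, clock=lambda: datetime(2010, 6, 1, 8, 0, 7))
assert buf.getvalue() == "2010-06-01T08:00:07,kitchen_motion\n"
```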
- FIG. 25 is a diagram 2500 of predicted pulse pressure from the sensor data 112 and measured pulse pressure, according to an example embodiment.
- FIG. 26 is a diagram 2600 of a comparison of Euclidean distance for a person, according to an example embodiment.
- the diagram 2600 may be generated as a result of the operations performed at block 1108 (see FIG. 11 ).
- FIG. 27 is a diagram 2700 of multiple density maps associated with the diagram 2600 .
- FIG. 28 shows a block diagram of a machine in the example form of a computer system 2800 within which a set of instructions may be executed causing the machine to perform any one or more of the methods, processes, operations, or methodologies discussed herein.
- the operator device 102 , the provider device 106 , or both may include the functionality of the one or more computer systems 2800 .
- the machine operates as a standalone device or may be connected (e.g., networked) to other machines.
- the machine may operate in the capacity of a server or a client machine in server-client network environment, or as a peer machine in a peer-to-peer (or distributed) network environment.
- the machine may be a server computer, a client computer, a personal computer (PC), a tablet PC, a set-top box (STB), a Personal Digital Assistant (PDA), a cellular telephone, a web appliance, a network router, switch or bridge, a kiosk, a point of sale (POS) device, a cash register, an Automated Teller Machine (ATM), or any machine capable of executing a set of instructions (sequential or otherwise) that specify actions to be taken by that machine.
- the term “machine” shall also be taken to include any collection of machines that individually or jointly execute a set (or multiple sets) of instructions to perform any one or more of the methodologies discussed herein.
- the example computer system 2800 includes a processor 2802 (e.g., a central processing unit (CPU), a graphics processing unit (GPU), or both), a main memory 2804 and a static memory 2806 , which communicate with each other via a bus 2808 .
- the computer system 2800 may further include a video display unit 2810 (e.g., a liquid crystal display (LCD) or a cathode ray tube (CRT)).
- the computer system 2800 also includes an alphanumeric input device 2812 (e.g., a keyboard), a cursor control device 2814 (e.g., a mouse), a drive unit 2816 , a signal generation device 2818 (e.g., a speaker) and a network interface device 2820 .
- the drive unit 2816 includes a machine-readable medium 2822 on which is stored one or more sets of instructions (e.g., software 2824 ) embodying any one or more of the methodologies or functions described herein.
- the software 2824 may also reside, completely or at least partially, within the main memory 2804 and/or within the processor 2802 during execution thereof by the computer system 2800 , the main memory 2804 and the processor 2802 also constituting machine-readable media.
- the software 2824 may further be transmitted or received over a network 2826 via the network interface device 2820 .
- While the machine-readable medium 2822 is shown in an example embodiment to be a single medium, the term “machine-readable medium” should be taken to include a single medium or multiple media (e.g., a centralized or distributed database, and/or associated caches and servers) that store the one or more sets of instructions.
- the term “machine-readable medium” shall also be taken to include any medium that is capable of storing or encoding a set of instructions for execution by the machine and that cause the machine to perform any one or more of the methodologies of the present invention.
- the term “machine-readable medium” shall accordingly be taken to include, but not be limited to, solid-state memories, optical media, and magnetic media.
- the machine-readable medium is a non-transitory machine readable medium.
- a module may be a unit of distinct functionality that may be presented in software, hardware, or combinations thereof.
- Where the functionality of a module is performed in any part through software, the module includes a machine-readable medium.
- the modules may be regarded as being communicatively coupled.
- sensor data may be accessed from a plurality of motion sensors and a bed sensor deployed in a living unit for a first time period.
- An activity pattern for the first time period may be identified based on at least a portion of sensor data associated with the first time period.
- the activity pattern may represent a physical and cognitive health condition of a person residing in the living unit.
- Additional sensor data may be accessed from the plurality of motion sensors and the bed sensor deployed in the living unit for a second time period.
- the second time period may occur after the first time period.
- a determination of whether a deviation of the activity pattern of the first time period has occurred for the second time period may be performed.
- An alert may be generated based on a determination that the deviation has occurred.
- health data of a person may be accessed for a first time period.
- Sensor data from a plurality of motion sensors and a bed sensor deployed in a living unit may be accessed for the first time period.
- the person may live in the living unit.
- Health data may be correlated to at least a portion of the sensor data for the first time period.
- Additional sensor data may be accessed from the plurality of motion sensors and the bed sensor deployed in the living unit for a second time period.
- the second time period may occur after the first time period.
- a determination of whether a change in a health condition of the person has occurred may be made based on the additional sensor data and correlation of the health data to at least the portion of the sensor data for the first time period.
- sensor data may be accessed from a plurality of motion sensors and a bed sensor deployed in a living unit for a time period.
- a display may be generated based on access of the sensor data associated with the time period.
- a first density map and a second density map may be accessed.
- the first density map may have a plurality of first color mappings.
- the second density map may have a plurality of second color mappings.
- a particular first color mapping may have a color based on density and may be associated with a position based on a particular hour and a particular day. Density may be based on a number of motion sensor hits during the particular hour and a determination of the away-from-home time period.
- a dis-similarity between the first density map and the second density map may be computed based on a texture feature of the first density map and the second density map.
- a computational result may be generated based on computing the dis-similarity.
- a plurality of feature clusters may be generated for a time period.
- the time period may include a plurality of days.
- a particular feature cluster may be associated with a plurality of feature vectors.
- a particular feature vector may be associated with sensor data from at least some of a plurality of motion sensors and a bed sensor deployed in a living unit. Additional sensor data associated with a particular feature for a different time period may be accessed. A determination of whether the additional sensor data falls within the plurality of feature clusters or belongs in a new cluster may be made. A notification may be generated based on a result of a determination.
Landscapes
- Health & Medical Sciences (AREA)
- Life Sciences & Earth Sciences (AREA)
- Engineering & Computer Science (AREA)
- General Health & Medical Sciences (AREA)
- Public Health (AREA)
- Biomedical Technology (AREA)
- Medical Informatics (AREA)
- Physics & Mathematics (AREA)
- Pathology (AREA)
- Biophysics (AREA)
- Heart & Thoracic Surgery (AREA)
- Molecular Biology (AREA)
- Surgery (AREA)
- Animal Behavior & Ethology (AREA)
- Veterinary Medicine (AREA)
- Physiology (AREA)
- Cardiology (AREA)
- Dentistry (AREA)
- Oral & Maxillofacial Surgery (AREA)
- Computer Networks & Wireless Communication (AREA)
- Business, Economics & Management (AREA)
- Emergency Management (AREA)
- General Physics & Mathematics (AREA)
- Pulmonology (AREA)
- Gerontology & Geriatric Medicine (AREA)
- Psychiatry (AREA)
- Psychology (AREA)
- Social Psychology (AREA)
- Primary Health Care (AREA)
- Epidemiology (AREA)
- Data Mining & Analysis (AREA)
- Databases & Information Systems (AREA)
- Measuring And Recording Apparatus For Diagnosis (AREA)
- Image Analysis (AREA)
- Measurement Of The Respiration, Hearing Ability, Form, And Blood Characteristics Of Living Organisms (AREA)
Abstract
Methods and systems for an integrated sensor network are described. In one embodiment, sensor data may be accessed from a plurality of motion sensors and a bed sensor deployed in a living unit for a first time period. An activity pattern for the first time period may be identified based on at least a portion of sensor data associated with the first time period. The activity pattern may represent a physical and cognitive health condition of a person residing in the living unit. Additional sensor data may be accessed from the motion sensors and the bed sensor deployed for a second time period. A determination of whether a deviation of the activity pattern of the first time period has occurred for the second time period may be performed. An alert may be generated based on a determination that the deviation has occurred. In some embodiments, user feedback is captured on the significance of the alerts, and the alert method is customized based on this feedback. Additional methods and systems are disclosed.
Description
- This patent application is a continuation of U.S. patent application Ser. No. 16/251,478, filed Jan. 18, 2019, and entitled “Integrated Sensor Network Methods and Systems”, now U.S. Pat. No. ______, which is a continuation of U.S. patent application Ser. No. 12/791,628, filed Jun. 1, 2010, and entitled “Integrated Sensor Network Methods and Systems”, now U.S. Pat. No. 10,188,295, which claims priority to U.S. Provisional Patent Application Ser. No. 61/217,623, filed Jun. 1, 2009, and entitled “Monitoring System for Eldercare”, the entire disclosures of each of which are herein incorporated by reference.
- This invention was made with government support under Grant No. IIS-0428420, Grant No. 90AM3013 awarded by the U.S. Administration on Aging, and Grant No. 1R21NR011197-01 awarded by the National Institutes of Health. The government has certain rights in the invention.
- This application relates to methods and systems for sensor networks, and more specifically to methods and systems for integrated sensor networks.
- Countries on multiple continents are experiencing an aging population. The number of older adults is growing dramatically. With this demographic shift, there is a desire to keep older adults healthy, functionally able, and living independently, in part because this provides a better quality of life, and in part because the aging population will stress current facilities and resources designed to care for elders. Challenges exist in keeping people healthy and functionally able as they age.
-
FIGS. 1 and 2 are block diagrams of example systems, according to example embodiments; -
FIG. 3 is a block diagram of an example operator device that may be deployed within the system of FIG. 1, according to an example embodiment; -
FIG. 4 is a block diagram of an example provider device that may be deployed within the system of FIG. 1, according to an example embodiment; -
FIG. 5 is a block diagram of an example sensor processing subsystem that may be deployed within the operator device of FIG. 3 or the provider device of FIG. 4, according to an example embodiment; -
FIGS. 6-8 are block diagrams of flowcharts illustrating methods for sensor processing, according to example embodiments; -
FIGS. 9 and 10 are block diagrams of flowcharts illustrating methods for display generation, according to example embodiments; -
FIG. 11 is a block diagram of a flowchart illustrating a method for determining dis-similarity of density maps, according to an example embodiment; -
FIG. 12 is a block diagram of a flowchart illustrating a method for performing cluster analysis, according to an example embodiment; -
FIGS. 13-27 are diagrams, according to example embodiments; and -
FIG. 28 is a block diagram of a machine in the example form of a computer system within which a set of instructions for causing the machine to perform any one or more of the methodologies discussed herein may be executed. - Example methods and systems for an integrated sensor network are described. In the following description, for purposes of explanation, numerous specific details are set forth in order to provide a thorough understanding of example embodiments. It will be evident, however, to one of ordinary skill in the art that embodiments of the invention may be practiced without these specific details.
- Most adults would prefer to remain as active as possible and to live independently in unrestricted environments as they age. However, because chronic illness and declining health affect most people as they get older, placement in more restricted housing environments like assisted living or nursing homes is fairly common. This sort of placement occurs in part because health assessments and medical care have traditionally required face-to-face meetings.
- One alternative consideration for monitoring older adults includes the use of smart sensor technologies as part of an integrated sensor network that detects activity levels around them and electronically sends the activity data to a central repository. Through web technologies, data can be accessed and viewed by health care providers, families or others interested in the health of the older person being monitored.
- The integrated sensor network includes simple motion sensors, a stove sensor, video sensors, and a bed sensor that captures sleep restlessness and pulse and respiration levels. Patterns in the sensor data may represent physical and cognitive health conditions. Recognition may be performed when activity patterns begin to deviate from the norm. Performing the recognition may enable early detection of potential problems that may lead to serious health events if left unattended.
-
FIG. 1 illustrates an example system 100 in which an integrated sensor network may be used. The system 100 is an example platform in which one or more embodiments of the methods may be used. However, the integrated sensor network may also be used on other platforms.
- An operator may use the integrated sensor network by using the operator device 102. The integrated sensor network may be used by a person residing in a living unit. The operator device 102 may be located in the living unit, outside of the living unit but in a living unit community, or at a location outside of the living unit community. Examples of operators include clinicians, researchers, and the like.
- The operator may use the operator device 102 as a stand-alone device to use the integrated sensor network, or may use the operator device 102 in combination with a provider device 106 available over a network 104. In some embodiments, the provider device 106 is also under the control of the operator but at a location outside of the living unit community.
- The operator device 102 may be in a client-server relationship with the provider device 106, a peer-to-peer relationship with the provider device 106, or in a different type of relationship with the provider device 106. In one embodiment, the client-server relationship may include a thin client on the operator device 102. In another embodiment, the client-server relationship may include a thick client on the operator device 102.
- The network 104 over which the operator device 102 and the provider device 106 may communicate includes, by way of example, a Global System for Mobile Communications (GSM) network, a code division multiple access (CDMA) network, a 3rd Generation Partnership Project (3GPP) network, an Internet Protocol (IP) network, a Wireless Application Protocol (WAP) network, a WiFi network, or an IEEE 802.11 standards network, as well as various combinations thereof. Other conventional and/or later developed wired and wireless networks may also be used.
- In one embodiment, the provider device 106 is a single device. In one embodiment, the provider device 106 may include multiple computer systems. For example, the provider device 106 may include multiple computer systems in a cloud computing configuration.
- Multiple sensors 108 forming a sensor network are included in the system 100 to obtain sensor data 112. Examples of sensors 108 include motion sensors, a bed sensor, and a stove sensor. In general, the multiple sensors 108 are passive, non-wearable sensors.
- The operator device 102, the provider device 106, or both may communicate with a database 110. The database 110 may contain sensor data 112, health data 114, and generated data 116.
- The sensor data 112 may be received from the sensors 108 or otherwise accessed (e.g., indirectly accessed by the provider device 106 from the operator device 102). The health data 114 includes health-related information about people. In general, the health data 114 is for the people associated with a particular doctor, healthcare organization, and/or living unit community. The generated data 116 includes information received and stored based on use of the integrated network.
- FIG. 2 illustrates an example system 200, according to an example embodiment. The system 200 is a specific example of the system 100. As shown in the system 200, the sensor data 112 is received by the operator device 102 from the sensors 108 and stored in a database 202. The operator device 102 is a logging device that simply collects the sensor data 112 and does not regularly receive input from the person, the operator, or otherwise.
- The operator device 102 transmits the sensor data 112 to the provider device 106 for storage in the database 110 on a regular basis. The sensor data 112 may be transmitted hourly, daily, weekly, or at other greater or lesser time increments. The provider device 106 of the system 200 may include multiple provider devices, including client provider devices and server provider devices. The operator may communicate with a server provider device through a user interface or otherwise.
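The log-then-transmit behavior described above can be pictured with a small sketch. This is an illustrative toy, not the patented implementation; the class name, event strings, and flush interval are all assumptions:

```python
# A toy sketch of a logging device that buffers sensor readings and hands
# off a batch on a fixed schedule rather than streaming continuously.
class SensorLogger:
    def __init__(self, flush_interval=86400):    # e.g., daily, in seconds
        self.flush_interval = flush_interval
        self.buffer = []
        self.last_flush = 0
        self.transmitted = []                    # batches sent so far

    def log(self, timestamp, reading):
        self.buffer.append((timestamp, reading))
        if timestamp - self.last_flush >= self.flush_interval:
            # Transmit the accumulated batch and start a new one.
            self.transmitted.append(list(self.buffer))
            self.buffer.clear()
            self.last_flush = timestamp

logger = SensorLogger(flush_interval=3600)       # hourly for the example
logger.log(100, "motion:kitchen")
logger.log(200, "bed:restless")
logger.log(3700, "motion:bath")                  # crosses the hour: flush
```

In a deployment, the transmit step would be a network call to the provider device rather than an in-memory list append; the buffering logic is the same.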
FIG. 3 illustrates an example operator device 102 that may be deployed in the system 100 (see FIG. 1), or otherwise deployed in another system. The operator device 102 is shown to include a signal processing subsystem 302 to enable use of the integrated sensor network.
- FIG. 4 illustrates an example provider device 106 that may be deployed in the system 100 (see FIG. 1), or otherwise deployed in another system. The provider device 106 is shown to include a signal processing subsystem 302 to enable use of the integrated sensor network.
- In one embodiment, the functionality that enables use of the integrated sensor network resides solely on the sensor processing subsystem 302 deployed in the operator device 102. In another embodiment, the functionality resides solely on the sensor processing subsystem 302 deployed in the provider device 106. In another embodiment, the functionality is partially performed on the sensor processing subsystem 302 deployed in the operator device 102 and partially performed on the sensor processing subsystem 302 deployed in the provider device 106. The functionality may otherwise be distributed among the operator device 102, the provider device 106, or another device.
- FIG. 5 illustrates an example sensor processing subsystem 302 that may be deployed in the operator device 102, the provider device 106, or otherwise deployed in another system. One or more modules are included in the sensor processing subsystem 302 to process the sensor data 112. The modules of the signal processing subsystem 302 that may be included are a sensor data module 502, an activity pattern identification module 504, a deviation module 506, an alert module 508, a health data module 510, a parameter calculation module 512, a feedback module 514, a correlation module 516, a change determination module 518, a display generation module 520, a density module 522, and/or a clustering module 524. Other modules may also be included. In various embodiments, the modules may be distributed so that some of the modules may be deployed in the operator device 102 and some of the modules may be deployed in the provider device 106. In one particular embodiment, the signal processing subsystem 302 includes a processor, memory coupled to the processor, and a number of the aforementioned modules deployed in the memory and executed by the processor.
- The
sensor data module 502 accesses the sensor data 112. The sensor data 112 may be associated with motion sensors deployed in a living unit, a bed sensor deployed in a living unit, a stove sensor deployed in the living unit, other environmentally-mounted, non-wearable sensors, or combinations thereof. In general, the sensors 108 are passive, non-wearable sensors. The sensor data 112 accessed by the sensor data module 502 may be for a time period.
- In some embodiments, the living unit is an apartment. In other embodiments, the living unit is a house.
- The activity pattern identification module 504 identifies an activity pattern for the time period based on at least a portion of the sensor data 112 accessed by the sensor data module 502. In some embodiments, the activity pattern represents a physical and cognitive health condition of a person living in the living unit. In one embodiment, the activity pattern includes a single feature. In another embodiment, the activity pattern includes multiple features.
- In some embodiments, the
sensor data module 502 identifies at least a portion of the sensor data 112 associated with the time period as being associated with the person. Identification of the activity pattern for the first time period is based on at least the portion of the sensor data 112 associated with the first time period that is associated with the person.
- In some embodiments, the sensor data module 502 accesses additional sensor data 112 for an additional time period that occurs after a first time period. The deviation module 506 may then determine whether a deviation of the activity pattern of the first time period has occurred for the additional time period. The alert module 508 generates an alert based on a determination that the deviation has occurred.
- In some embodiments, the
alert module 508 transmits the alert. In some embodiments, the alert module 508 stores the alert. The alert module 508 may otherwise use or process the alert.
- In some embodiments, event listeners (the observers) register with an event provider associated with the alert module 508 to be notified of sensor events (the changes). The event provider may support a filtering operation. That is, a template for the sensor events can be specified so that event listeners are only notified if a sensor event matches the template.
- The alert module 508 provides a cohesive yet flexible mechanism for incorporating different types of alert conditions. State machines may be used by alert providers to model alert specifications. As sensor events are observed, an alert model associated with the alert module 508 may transition to a new state and, if warranted, will generate an alert condition.
- Timers may be included for state transitions. The state machine generalization supports simple one-sensor alerts as well as alerts that involve more complex interactions among multiple sensors. The alert module 508 easily accepts inputs from multiple sources. Sensor events may be replayed from the database 110 through the use of the generated data 116 to facilitate testing of alert algorithms. Alerts may be sent to different output streams, including a pager system for immediate alerts as well as emailed alerts for daily summaries.
- In some embodiments, the activity
pattern identification module 504 identifies the activity pattern for the second time period based on access of the additional sensor data 112 associated with the second time period. The determination of whether the deviation has occurred by the deviation module 506 may then include determining whether the deviation of the activity pattern of the second time period from the activity pattern of the first time period exceeds a threshold.
- In some embodiments, the health data module 510 analyzes health data associated with the person. Generation of the alert by the alert module 508 may be based on whether the deviation of the activity pattern of the second time period from the activity pattern of the first time period exceeds the threshold and on analysis of the health data 114.
- In some embodiments, the parameter calculation module 512 calculates statistical parameters of at least a portion of the sensor data 112 for the time period. A determination of whether the deviation has occurred by the deviation module 506 may then include determining whether at least a portion of the additional sensor data 112 for the additional time period is outside of a threshold based on the statistical parameters.
- The alert generated by the
alert module 508 may be a hits-based alert. In one embodiment, the activity pattern for the time period is based on the total number of sensor hits of a sensor 108 during a day of the time period.
- The alert generated by the alert module 508 may be a time-based alert. In one embodiment, the activity pattern for the time period is based on the total time that the sensor 108 fired during a particular day of the time period.
- In some embodiments, the alert module 508 transmits the alert including a link to a web interface. In one embodiment, the web interface includes the sensor data 112 of the second time period in the context of the sensor data 112 of the first time period.
- The
feedback module 514 may be deployed in the sensor processing subsystem 302 to receive and process feedback, requests, selections, or the like.
- In some embodiments, the feedback module 514 receives a feedback response to the alert. The feedback response includes feedback regarding the clinical relevance of the alert. The feedback module 514 may then take action based on receipt of the feedback response.
- In one embodiment, the action includes adjusting the threshold based on the receipt of the feedback response. In one embodiment, the action includes recording ignored indicia for the person based on the receipt of the feedback response. The ignored indicia may be associated with a feature of the alert.
- In some embodiments, the
sensor processing subsystem 302 includes the correlation module 516 and the change determination module 518 to predict changes in a health condition. The health condition may be a physical condition, a mental condition, or a physical and a mental condition. In one embodiment, the health condition is pulse pressure. Pulse pressure may be the difference between the systolic blood pressure (SBP) and the diastolic blood pressure (DBP).
- By way of example, the sensor data module 502 accesses the sensor data 112 associated with a person and the health data module 510 accesses the health data 114 of the person for a first time period. The correlation module 516 then correlates the health data to at least a portion of the sensor data 112 for the first time period. The sensor data module 502 accesses additional sensor data 112 for a second time period.
- The change determination module 518 then determines whether a change in a health condition of the person has occurred based on the additional sensor data 112 and correlation of the health data 114 to at least the portion of the sensor data 112 for the first time period. The alert module 508 may generate an alert when a determination is made that the change in the health condition has occurred.
- The
display generation module 520 generates a display. In some embodiments, the alert module 508 generates the alert and the display generation module 520 generates a display based on the alert.
- In some embodiments, the sensor data module 502 accesses the sensor data 112 and the display generation module 520 generates a display based on the sensor data 112. In one embodiment, the sensor data 112 is grouped on the display based on multiple categories. The categories may include, by way of example, motion, pulse, breathing, and bed restlessness.
- In some embodiments, the feedback module 514 receives a selection of a person and a date range. The sensor data module 502 may then access the sensor data 112 based on receipt of the selection.
- A user may interface with the sensor processing subsystem 302 to zoom in or zoom out on the display. In some embodiments, the feedback module 514 receives a time interval modification request. The display generation module 520 may then generate a display based on access of the sensor data 112 associated with the time period and receipt of the time interval modification request.
- In some embodiments, the feedback module 514 receives a time increment modification request. The display generation module 520 may then generate the display based on access of the sensor data 112 associated with the time period and receipt of the time increment modification request.
- The
density module 522 determines an away-from-home time period for a person associated with the living unit during the time period. The display generation module 520 then generates the display based on access of the sensor data 112 and a determination of the away-from-home time period.
- In some embodiments, a determination of the away-from-home time period by the density module 522 includes analyzing the sensor data 112 to determine whether a living unit departure sensor sequence and a living unit return sensor sequence have occurred and calculating a time difference between occurrence of the living unit departure sensor sequence and occurrence of the living unit return sensor sequence.
- In one embodiment, analyzing the sensor data 112 includes applying fuzzy logic to at least a portion of the sensor data 112 to determine whether a living unit departure sensor sequence and a living unit return sensor sequence have occurred.
- The
density module 522 computes a number of motion sensor hits for multiple hours. A sensor hit is associated with a motion sensor. The density module 522 may then calculate a density for the multiple hours. The generation of the display by the display generation module 520 may then be based on calculation of the density.
- In one embodiment, the display generation module 520 selects color mappings and then generates the display based on the selection of the color mappings. In general, a color mapping has a color based on the density and is associated with a position on a display based on the hour and day.
- Dis-similarity between density maps may be computed by use of the density module 522. In some embodiments, the density module 522 accesses a first density map and a second density map, the first density map having a first color mapping and the second density map having a second color mapping, computes a dis-similarity between the first density map and the second density map based on a texture feature of the first density map and the second density map, and generates a computational result based on computing the dis-similarity. Texture features may include, by way of example, spatial, frequency, and perceptual properties. The display generation module 520 may then generate a display based on computation of the dis-similarity. The density module 522 may transmit a notification based on computation of the dis-similarity, store the computational result, or both.
- Clustering may be performed by the
clustering module 524 to analyze the sensor data 112 based on clusters. In some embodiments, the clustering module 524 generates feature clusters for a time period. A feature cluster is associated with multiple feature vectors, wherein a feature vector is associated with the sensor data 112 from at least some of the motion sensors and/or a bed sensor. The sensor data module 502 accesses additional sensor data 112 associated with a feature for a different time period. The clustering module 524 may then determine whether the additional sensor data 112 falls within the feature clusters or belongs in a new cluster.
- Based on a result of the determination, the clustering module 524 generates a notification. The notification may be a cluster addition notification based on a determination that the additional sensor data 112 falls within the feature clusters. The notification may be a new cluster notification based on a determination that the additional sensor data 112 belongs in the new cluster.
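The cluster check described above can be illustrated with a simple nearest-centroid sketch. The Euclidean distance, the threshold value, and the two-dimensional feature vectors are assumptions for illustration, not the patent's clustering algorithm:

```python
# Assign a new feature vector to its nearest existing cluster, or flag it
# as the seed of a new cluster when it is too far from every centroid.
def euclidean(a, b):
    return sum((x - y) ** 2 for x, y in zip(a, b)) ** 0.5

def classify(centroids, vector, threshold=5.0):
    best = min(range(len(centroids)),
               key=lambda i: euclidean(centroids[i], vector))
    if euclidean(centroids[best], vector) <= threshold:
        return ("cluster_addition", best)     # joins an existing cluster
    return ("new_cluster", None)              # warrants a new cluster

# Baseline clusters of (daytime motion hits, nightly restlessness) features
centroids = [(120.0, 3.0), (40.0, 8.0)]
classify(centroids, (118.0, 3.5))   # close to the first cluster
classify(centroids, (200.0, 1.0))   # far from both: new-cluster notification
```

The two return values correspond to the cluster addition notification and the new cluster notification described in the text.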
FIG. 6 illustrates a method 600 for sensor processing according to an example embodiment. The method 600 may be performed by the operator device 102 or the provider device 106 of the system 100 (see FIG. 1), or may be otherwise performed.
- At block 602, the sensor data 112 is accessed from the motion sensors and the bed sensor deployed in a living unit for a first time period. In general, the deployed sensors are passive, non-wearable sensors.
- At least a portion of the sensor data 112 associated with the first time period may be identified as being associated with the person at block 604.
- An activity pattern is identified for the first time period at block 606 based on at least a portion of the sensor data 112 associated with the first time period. In one embodiment, the activity pattern represents a physical and cognitive health condition of a person residing in the living unit. In some embodiments, identification of the activity pattern for the first time period is based on at least the portion of the sensor data 112 associated with the first time period that is associated with the person.
- At block 608, additional sensor data 112 is accessed from the motion sensors and the bed sensor deployed in the living unit for a second time period. The second time period occurs after the first time period. In some embodiments, the first time period has the same duration as the second time period.
- In one embodiment, the first time period is a period of fourteen consecutive days and the second time period is a period of a single day. Different periods of time may be used for the first time period and the second time period.
- In some embodiments, the operations performed at block 602 include accessing the sensor data 112 from a stove sensor deployed in the living unit, and the operations performed at block 608 include accessing the additional sensor data 112 from the stove sensor deployed in the living unit.
- A determination of whether a deviation of the activity pattern of the first time period has occurred for the second time period is performed at block 610. In some embodiments, the activity pattern includes multiple features and the deviation is associated with a feature of the multiple features.
- In some embodiments, the activity pattern for the second time period is identified based on access of the additional sensor data 112 associated with the second time period. The determination performed at block 610 may then include determining whether the deviation of the activity pattern of the second time period from the activity pattern of the first time period exceeds a threshold.
- The health data 114 associated with the person may be analyzed at block 612, while an alert is generated at block 614. In some embodiments, the alert is generated based on a determination that the deviation has occurred. In some embodiments, the alert is generated based on whether the deviation of the activity pattern of the second time period from the activity pattern of the first time period exceeds the threshold and on analysis of the health data 114. In some embodiments, the alert is transmitted, while in some embodiments the alert is stored.
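The alert generation in the method 600 can be pictured with a small state machine of the kind described earlier (states, a timer, transitions driven by sensor events). The scenario, class name, event strings, and 300-second timeout below are illustrative assumptions, not the patented implementation:

```python
# A minimal state-machine alert sketch: raise an alert if a bed-exit event
# is not followed by a motion event within `timeout` seconds.
class BedExitAlert:
    def __init__(self, timeout=300):
        self.timeout = timeout
        self.state = "IDLE"
        self.exit_time = None
        self.alerts = []

    def observe(self, event, timestamp):
        if self.state == "IDLE" and event == "bed_exit":
            self.state = "WAITING"      # start the no-motion timer
            self.exit_time = timestamp
        elif self.state == "WAITING":
            if event == "motion":
                self.state = "IDLE"     # resident is moving: no alert
            elif timestamp - self.exit_time > self.timeout:
                # Timer expired with no motion observed: alert condition.
                self.alerts.append(("no_motion_after_bed_exit", timestamp))
                self.state = "IDLE"

machine = BedExitAlert(timeout=300)
machine.observe("bed_exit", 0)
machine.observe("heartbeat", 400)   # any later event checks the timer
```

A real alert provider would also need wall-clock timers so an alert can fire without waiting for a next event; here the timer is only checked when an event arrives, which keeps the sketch small.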
FIG. 7 illustrates a method 700 for sensor processing according to an example embodiment. The method 700 may be performed by the operator device 102 or the provider device 106 of the system 100 (see FIG. 1), or may be otherwise performed.
- At block 702, the sensor data 112 is accessed from motion sensors and a bed sensor deployed in a living unit for a first time period.
- An activity pattern for the first time period is identified at block 704 based on at least a portion of the sensor data 112 associated with the first time period. In some embodiments, the activity pattern represents a physical and cognitive health condition of a person residing in the living unit. In one embodiment, the activity pattern includes a single feature. In another embodiment, the activity pattern includes multiple features.
- At block 706, additional sensor data 112 is accessed from the motion sensors and the bed sensor deployed in the living unit for a second time period. The second time period occurs after the first time period.
- Statistical parameters of at least a portion of the sensor data 112 for the first time period are calculated at block 708.
- A determination of whether a deviation of the activity pattern of the first time period has occurred for the second time period is performed at block 710. In some embodiments, the determination includes determining whether at least a portion of the additional sensor data 112 for the second time period is outside of a threshold. In general, the threshold is based on the statistical parameters.
- An alert is generated at block 712 based on a determination that the deviation has occurred. In some embodiments, the alert is a hits-based alert. The activity pattern for the first time period may then be based on the total number of sensor hits of a particular sensor 108 during a particular day of the first time period. In some embodiments, the alert is a time-based alert. The activity pattern for the first time period may then be based on the total time that a particular sensor 108 fired during a particular day of the first time period.
- In some embodiments, the alert generated may be adapted or customized based on received feedback.
- The alert, including a link to a web interface, may be transmitted at block 714. The web interface may include the sensor data 112 of the second time period in the context of the sensor data 112 of the first time period.
- A feedback response to the alert may be received at block 716. The feedback response includes feedback regarding the clinical relevance of the alert.
- An action may be taken at block 718 based on receipt of the feedback response. In some embodiments, taking the action may include adjusting the threshold based on the receipt of the feedback response. In some embodiments, taking the action may include recording ignored indicia for the person based on the receipt of the feedback response. The ignored indicia may be associated with a feature of the alert.
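The hits-based and time-based features and the statistical threshold used in the method 700 might be sketched as follows. The event format, the fourteen-day baseline values, and the choice of two standard deviations are assumptions:

```python
import statistics

def daily_features(events):
    """events: (start, end) firing intervals for one sensor on one day."""
    hits = len(events)                                  # hits-based feature
    active = sum(end - start for start, end in events)  # time-based feature
    return hits, active

def deviates(baseline, value, k=2.0):
    """Flag a deviation when the new day's value lies more than k standard
    deviations from the baseline mean."""
    mean = statistics.mean(baseline)
    stdev = statistics.stdev(baseline)
    return abs(value - mean) > k * stdev

day_events = [(100, 160), (400, 430), (900, 960)]
hits, active = daily_features(day_events)            # 3 hits, 150 seconds

baseline_hits = [120, 115, 130, 125, 118, 122, 128,
                 119, 124, 126, 121, 117, 129, 123]  # 14-day baseline
deviates(baseline_hits, 40)    # sharp drop in activity: alert
deviates(baseline_hits, 122)   # typical day: no alert
```

The feedback step at block 718 would correspond here to tuning `k` (or the baseline window) when a clinician marks an alert as not clinically relevant.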
FIG. 8 illustrates a method 800 for sensor processing according to an example embodiment. The method 800 may be performed by the operator device 102 or the provider device 106 of the system 100 (see FIG. 1), or may be otherwise performed.
- The health data 114 of a person for a first time period is accessed at block 802.
- The sensor data 112 from motion sensors and a bed sensor deployed in a living unit for the first time period is accessed at block 804.
- The health data is correlated to at least a portion of the sensor data 112 for the first time period at block 806.
- Additional sensor data 112 is accessed at block 808 from the motion sensors and the bed sensor deployed in the living unit for a second time period. The second time period generally occurs after the first time period.
- At block 810, a determination of whether a change in a health condition of the person has occurred is made based on the additional sensor data 112 and correlation of the health data 114 to at least the portion of the sensor data 112 for the first time period.
- An alert may be generated at block 812 when a determination is made that the change in the health condition has occurred.
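The correlation step in the method 800 can be sketched for the pulse-pressure example given earlier (pulse pressure being SBP minus DBP). The nightly restlessness scores, the blood-pressure readings, and the use of Pearson correlation are illustrative assumptions:

```python
# Correlate a bed-sensor feature with a derived health measure.
def pulse_pressure(sbp, dbp):
    return sbp - dbp

def pearson(xs, ys):
    n = len(xs)
    mx, my = sum(xs) / n, sum(ys) / n
    cov = sum((x - mx) * (y - my) for x, y in zip(xs, ys))
    sx = sum((x - mx) ** 2 for x in xs) ** 0.5
    sy = sum((y - my) ** 2 for y in ys) ** 0.5
    return cov / (sx * sy)

restlessness = [2, 3, 5, 4, 6]   # nightly bed-restlessness scores
readings = [(120, 80), (122, 80), (130, 78), (126, 79), (134, 76)]
pp = [pulse_pressure(s, d) for s, d in readings]
r = pearson(restlessness, pp)    # strongly positive in this toy data
```

Once such a correlation is established for the first time period, a later sensor-data change can be translated into a predicted health-condition change, which is the basis for the alert at block 812.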
FIG. 9 illustrates a method 900 for display generation according to an example embodiment. The method 900 may be performed by the operator device 102 or the provider device 106 of the system 100 (see FIG. 1), or may be otherwise performed.
- A selection of a person and/or a date range may be received at block 902.
- The sensor data 112 is accessed from motion sensors and a bed sensor deployed in a living unit for a time period at block 904. In some embodiments, the access of the sensor data 112 from the motion sensors and the bed sensor for the time period is based on receipt of the selection.
- A request may be received at block 906. In some embodiments, the request is a time interval modification request. In some embodiments, the request is a time increment modification request.
- A display is generated at block 908 based on access of the sensor data 112 associated with the time period. In some embodiments, the sensor data 112 is grouped on the display based on multiple categories. For example, the categories may include motion, pulse, breathing, and restlessness.
- In some embodiments, generation of the display is based on access of the sensor data 112 associated with the time period and receipt of the time interval modification request. In some embodiments, generation of the display is based on access of the sensor data 112 associated with the time period and receipt of the time increment modification request.
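The grouping and time-increment bucketing behind the display of the method 900 can be sketched as follows. The record format, sensor names, and category mapping are assumptions for illustration:

```python
# Group raw sensor records into display categories (motion, pulse,
# breathing, restlessness) and bucket them into time increments.
CATEGORY = {"kitchen_motion": "motion", "hall_motion": "motion",
            "bed_pulse": "pulse", "bed_breathing": "breathing",
            "bed_restlessness": "restlessness"}

def group_for_display(records, increment=3600):
    """records: (timestamp, sensor, value) tuples.
    Returns {category: {bucket_start: [values]}}."""
    grouped = {}
    for ts, sensor, value in records:
        bucket = (ts // increment) * increment    # time-increment bucket
        cat = CATEGORY.get(sensor, "other")
        grouped.setdefault(cat, {}).setdefault(bucket, []).append(value)
    return grouped

records = [(10, "kitchen_motion", 1), (70, "bed_pulse", 64),
           (3700, "hall_motion", 1)]
view = group_for_display(records)
# motion events land in two hourly buckets; pulse in the first
```

A time increment modification request would simply call `group_for_display` again with a different `increment` before re-rendering.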
FIG. 10 illustrates amethod 1000 for display generation according to an example embodiment. Themethod 1000 may be performed by theoperator device 102 or theprocessor device 106 of the system 100 (seeFIG. 1 ), or may be otherwise performed. - A selection of a person and/or a date range may be received at
block 1002. - The
sensor data 112 is accessed from motion sensors and a bed sensor deployed in a living unit for a time period atblock 1004. In some embodiments, the access of thesensor data 112 from the motion sensors and the bed sensor for the time period is based on receipt of the selection. - A determination of an away-from-home time period for a person associated with the living unit during the time period is made at
block 1006. In some embodiments, the determination of the away-from-home time period includes analyzing thesensor data 112 to determine whether a living unit departure sensor sequence and a living unit return sensor sequence has occurred and calculating a time difference between occurrence of the living unit departure sensor sequence and occurrence of the living unit return sensor sequence,. The away-from home time period may then be based on the time difference. - In one embodiment, analyzing the
sensor data 112 includes applying fuzzy logic to at least a portion of the sensor data 112 to determine whether a living unit departure sensor sequence and a living unit return sensor sequence has occurred. - A number of motion sensor hits for multiple hours of the time period may be computed at
block 1008. A single motion sensor hit is associated with a single motion sensor of the plurality of motion sensors. - A density for the hours may be calculated at
block 1010. The density for an hour may be based on the number of motion sensor hits during the hour and the determination of the away-from-home time period. - A display is generated at
block 1012 based on access of the sensor data 112 associated with the time period and a determination of the away-from-home time period. In some embodiments, generation of the display is based on calculation of the density. - In some embodiments, generation of the display includes selecting color mappings and generating the display based on selection of the color mappings. In general, a color mapping has a color based on the density and is associated with a position based on an hour of a day.
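The away-from-home derivation and the hourly density calculation described above can be sketched as follows. This is a minimal illustration, not the patented implementation: it assumes the departure and return sensor sequences have already been recognized upstream (e.g., by the fuzzy-logic analysis), and the event labels, function names, and (date, hour) binning are assumptions made for the example.

```python
from collections import Counter
from datetime import datetime, date

def away_from_home_seconds(events):
    """Sum the time differences between each living unit departure
    sequence and the following return sequence."""
    total = 0.0
    departed_at = None
    for ts, label in events:
        if label == "departure_sequence" and departed_at is None:
            departed_at = ts
        elif label == "return_sequence" and departed_at is not None:
            total += (ts - departed_at).total_seconds()
            departed_at = None
    return total

def hourly_density(hits, away_hours=frozenset()):
    """Count motion sensor hits per (date, hour) bin; hours covered by the
    away-from-home time period are reported as None so that absence is not
    mistaken for inactivity."""
    counts = Counter((ts.date(), ts.hour) for ts in hits)
    keys = set(counts) | set(away_hours)
    return {k: (None if k in away_hours else counts[k]) for k in keys}

events = [
    (datetime(2010, 6, 1, 9, 0), "departure_sequence"),
    (datetime(2010, 6, 1, 11, 30), "return_sequence"),
]
print(away_from_home_seconds(events) / 3600)  # hours away: 2.5
```

A density map display would then color each (day, hour) cell according to its count.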
-
FIG. 11 illustrates a method 1100 for determining dis-similarity of density maps according to an example embodiment. The method 1100 may be performed by the operator device 102 or the processor device 106 of the system 100 (see FIG. 1), or may be otherwise performed. - A dis-similarity measure based on texture features may be used for comparing density maps and automatically determining changes in activity patterns. The dis-similarity between two density maps may be computed to aid caregivers in evaluating changes of residents. The texture features may be used to evaluate the dis-similarity of density maps by capturing spatial, frequency, and perceptual properties such as periodicity, coarseness, and complexity. Texture features may be extracted using the co-occurrence distribution (e.g., the gray-level co-occurrence statistical method using the density values directly).
- In some embodiments, the density maps need not have the color mapping to determine the dis-similarity.
- A first density map and a second density map are accessed at
block 1102. The first density map has first color mappings. The second density map has second color mappings. In general, a color mapping has a color based on density and is associated with a position based on an hour of a day. In some embodiments, the density is based on a number of motion sensor hits during the hour and a determination of the away-from-home time period. - A dis-similarity between the first density map and the second density map is computed at
block 1104 based on a texture feature of the first density map and the second density map. Examples of texture features include spatial, frequency, and perceptual properties. The computation may be performed based on a single texture feature or multiple texture features. - An angular second moment feature (ASM) may measure homogeneity of the image. The contrast feature may measure the amount of local variations in an image. The inverse difference moment may also measure image homogeneity. Entropy may measure the disorder. Other non-texture features may also be used to discriminate the dis-similarity of density maps. For example, average motion density per hour and average time away from the living unit per day may be used during the computation performed at
block 1104. - The dis-similarity of two different density maps, in some embodiments, is represented by a number that is computed in feature space as the distance from one map to another.
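As a hedged sketch of this computation: the following builds a gray-level co-occurrence matrix from the density values directly, derives the four texture features named above, and takes the Euclidean distance in feature space. The quantization level, the horizontal-neighbor offset, and the equal weighting of features are assumptions made for illustration, not the specific parameters of the described embodiment.

```python
import numpy as np

def glcm(density_map, levels=8):
    """Gray-level co-occurrence matrix over horizontally adjacent cells,
    built from the density values directly (quantized into `levels` bins;
    values are assumed to be normalized to [0, 1])."""
    q = np.minimum((np.asarray(density_map) * levels).astype(int), levels - 1)
    m = np.zeros((levels, levels))
    for a, b in zip(q[:, :-1].ravel(), q[:, 1:].ravel()):
        m[a, b] += 1
    return m / m.sum()

def texture_features(density_map):
    p = glcm(density_map)
    i, j = np.indices(p.shape)
    asm = (p ** 2).sum()                            # angular second moment: homogeneity
    contrast = ((i - j) ** 2 * p).sum()             # amount of local variation
    idm = (p / (1 + (i - j) ** 2)).sum()            # inverse difference moment
    entropy = -(p[p > 0] * np.log(p[p > 0])).sum()  # disorder
    return np.array([asm, contrast, idm, entropy])

def dissimilarity(map_a, map_b):
    """Distance from one density map to the other in texture-feature space."""
    return float(np.linalg.norm(texture_features(map_a) - texture_features(map_b)))
```

Identical maps yield a distance of zero; maps with different activity textures yield a positive number that grows with the change.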
- A computational result is generated at
block 1106 based on computing the dis-similarity. - A display may be generated at
block 1108 based on computation of the dis-similarity. In some embodiments, a notification based on computation of the dis-similarity may be transmitted. In some embodiments, the computational result may be stored. -
FIG. 12 illustrates a method 1200 for performing cluster analysis according to an example embodiment. The method 1200 may be performed by the operator device 102 or the processor device 106 of the system 100 (see FIG. 1), or may be otherwise performed. - Feature clusters are generated for a time period at
block 1202. The time period includes multiple days. A feature cluster is associated with multiple feature vectors. A feature vector is associated with the sensor data 112 from at least some of the motion sensors and/or a bed sensor deployed in a living unit. -
Additional sensor data 112 associated with a particular feature for a different time period is accessed at block 1204. - A determination of whether the
additional sensor data 112 falls within the feature clusters or belongs in a new cluster is made at block 1206. - A notification is generated at
block 1208 based on a result of a determination. - In some embodiments, a cluster addition notification is generated based on a determination that the
additional sensor data 112 falls within the feature clusters. In some embodiments, a new cluster notification is generated based on a determination that the additional sensor data 112 belongs in the new cluster. -
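One way to make the in-cluster versus new-cluster determination concrete is a distance test against cluster centroids. This sketch assumes a cluster is summarized as a (centroid, radius) pair and uses an assumed k·radius gate; the representation, names, and threshold are illustrative, not the embodiment's specific test.

```python
import numpy as np

def classify_feature_vector(clusters, x, k=2.0):
    """Return ('cluster addition', name) when the new feature vector x falls
    within an existing cluster, else ('new cluster', None) so a new-cluster
    notification can be generated."""
    x = np.asarray(x, dtype=float)
    for name, (centroid, radius) in clusters.items():
        if np.linalg.norm(x - np.asarray(centroid)) <= k * radius:
            return ("cluster addition", name)
    return ("new cluster", None)
```

Here a feature vector might summarize, for example, motion hits per day and bed restlessness over the time period.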
FIG. 13 is a diagram 1300 of a user interface, according to an example embodiment. The user interface shows motion sensor data for multiple sensors over a period of fourteen days. -
FIG. 14 is a diagram 1400 of a user interface, according to an example embodiment. The user interface shows motion sensor data over a period of twenty-eight days. The diagram 1400 is a “zoomed out” version of the diagram 1300 (see FIG. 13). -
FIG. 15 is a diagram 1500 of a user interface, according to an example embodiment. The user interface shows motion sensor data over a period of a day. The diagram 1500 is a “zoomed in” version of the diagram 1300 (see FIG. 13). -
FIG. 16 is a diagram 1600 of a user interface, according to an example embodiment. The user interface shows motion sensor data for a single sensor over a period of fourteen days. -
FIG. 17 is a diagram 1700 of an example alert, according to an example embodiment. The alert shown in the diagram 1700 may be transmitted as an e-mail or otherwise transmitted. The alert is shown to include links to a user interface associated with sensors. The links included in the diagram are a link to a bathroom sensor, a kitchen sensor, and a living room sensor. In the example alert shown in the diagram 1700, links are also included to feedback web pages to capture a user's rating of the significance of the alert. -
FIG. 18 is a diagram 1800 of a user interface, according to an example embodiment. The diagram 1800 shows a user interface that may be presented based on selection of a link included in an alert of the diagram 1700. - As shown in the diagram 1800, selections for a resident ID; a time period including a starting date, starting hour, ending date, and ending hour; a time interval; and an increment may be available for customization. The operator may modify default selections and then press a submit button.
-
FIG. 19 is a diagram 1900 of a user interface, according to an example embodiment. The diagram 1900 shows sensor firing data for a fourteen-day period. The diagram 1900 may be presented based on selection of a submit button from the diagram 1800. -
FIG. 20 is a diagram 2000 of a user interface, according to an example embodiment. The diagram enables an operator to provide alert feedback. The operator may include a rating of the significance of the alert, thoughts about the alert (e.g., not enough of a change and not a good parameter), and comments through the user interface. Other or different feedback may be collected. The operator may also designate the perspective (e.g., classification) of the operator submitting the feedback. - In some embodiments, the user interface shown in the diagram 2000 may be used to provide adaptive, customizable alerts by adjusting the sensor parameters and thresholds, based on the alert feedback ratings.
-
FIGS. 21-23 are diagrams 2100-2300 of density maps, according to an example embodiment. While the diagrams 2100-2300 are shown in this document in black and white, the displays associated with the diagrams 2100-2300 are typically generated in color based on color mappings. - The diagram 2100 is a density map of a person with a sedentary lifestyle pattern for one month. The diagram 2200 is a density map of a person with an active lifestyle pattern for one month. The diagram 2300 is a density map of a person with an irregular lifestyle pattern showing a cognitive problem for one month.
- By monitoring the motion density maps over time, health care providers in some embodiments may identify a typical pattern of activity for an individual and watch for changes in the pattern.
-
FIG. 24 is a diagram 2400 of a floor plan of a living unit, according to an example embodiment. The diagram 2400 shows example locations of motion sensors, a bed sensor, and a stove sensor in the living unit. - The motion sensors may detect presence in a particular room as well as specific activities. For example, a motion sensor installed on the ceiling above the shower detects showering activity; motion sensors installed discreetly in cabinets and the refrigerator detect kitchen activity. For convenience, a motion sensor may also be installed on the ceiling above the door of the living unit, to detect movement in and out of the doorway (e.g., for living unit exits). The motion sensors, in some embodiments, are commercially available passive infrared (PIR) sensors which transmit using the wireless X10 protocol. Other types of sensors may be used.
- In some embodiments, the sensors detect movement of warm bodies and transmit an event about every 7 seconds when movement is still detected. This artifact is useful for capturing a general lifestyle pattern; for example, a sedentary pattern will result in a smaller number of sensor events over time compared to a more active “puttering” pattern.
- The bed sensor may be a transducer which detects presence in the bed, pulse and respiration rates, and bed restlessness. Pulse and respiration rates may be reported as low, normal, and high, based on thresholds, or pulse and respiration rates may be reported as numerical rates. In some embodiments, bed restlessness is reported based on the persistence of movement in the bed. All of the output of the bed sensor may contribute to the general pattern of the resident.
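The low/normal/high reporting can be sketched as a simple threshold mapping; the threshold values in the example are illustrative placeholders, not the bed sensor's actual settings.

```python
def classify_rate(value, low_threshold, high_threshold):
    """Map a numeric pulse or respiration rate onto the low/normal/high
    categories that the bed sensor may report."""
    if value < low_threshold:
        return "low"
    if value > high_threshold:
        return "high"
    return "normal"

# Illustrative pulse thresholds only; real limits would be clinically chosen.
print(classify_rate(72, 55, 90))  # normal
```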
- The stove sensor may detect motion in the kitchen as well as the temperature of the stove/oven unit. This may be performed through a modified X10 PIR motion sensor. When a high temperature is detected, a “stove on” event may be generated. When the temperature drops below a threshold again, a “stove off” event may be generated. This sensor is included so that an alert could be generated if the stove is left on and there is no indication of someone in the kitchen for a specified period of time.
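The stove-alert rule just described (stove on with no indication of someone in the kitchen for a specified period) can be sketched as below; the event labels and the 30-minute default are assumptions made for the example, not the embodiment's actual parameters.

```python
from datetime import datetime, timedelta

def stove_alert(events, max_unattended=timedelta(minutes=30)):
    """Scan chronological (timestamp, label) events and return True when the
    stove has been on with no kitchen presence for longer than max_unattended."""
    stove_on_since = None
    last_presence = None
    for ts, label in events:
        if label == "stove_on":
            stove_on_since = ts
            last_presence = ts  # turning the stove on implies someone was there
        elif label == "stove_off":
            stove_on_since = None
        elif label == "kitchen_motion":
            last_presence = ts
        # Every incoming event gives a chance to re-check the condition.
        if stove_on_since is not None and ts - last_presence > max_unattended:
            return True
    return False
```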
- In some embodiments, all of the
sensor data 112 for the person is transmitted wirelessly via the X10 protocol to a data monitor PC which is located in the living unit of the person. The data monitor may add a date-time stamp for each sensor event and may log it as the sensor data into a file that is periodically sent to a dedicated central server which stores the data in a relational database. The data monitors may be connected to the central server through a dedicated local network, for security purposes. In addition, as a precaution, identifiers may be stripped from the data before transmission. -
FIG. 25 is a diagram 2500 of predicted pulse pressure from the sensor data 112 and measured pulse pressure, according to an example embodiment. -
FIG. 26 is a diagram 2600 of a comparison of Euclidean distance for a person, according to an example embodiment. The diagram 2600 may be generated as a result of the operations performed at block 1108 (see FIG. 11). -
FIG. 27 is a diagram 2700 of multiple density maps associated with the diagram 2600. -
FIG. 28 shows a block diagram of a machine in the example form of a computer system 2800 within which a set of instructions may be executed causing the machine to perform any one or more of the methods, processes, operations, or methodologies discussed herein. The operator device 102, the provider device 106, or both may include the functionality of the one or more computer systems 2800. - In an example embodiment, the machine operates as a standalone device or may be connected (e.g., networked) to other machines. In a networked deployment, the machine may operate in the capacity of a server or a client machine in a server-client network environment, or as a peer machine in a peer-to-peer (or distributed) network environment. The machine may be a server computer, a client computer, a personal computer (PC), a tablet PC, a set-top box (STB), a Personal Digital Assistant (PDA), a cellular telephone, a web appliance, a network router, switch or bridge, a kiosk, a point of sale (POS) device, a cash register, an Automated Teller Machine (ATM), or any machine capable of executing a set of instructions (sequential or otherwise) that specify actions to be taken by that machine. Further, while only a single machine is illustrated, the term “machine” shall also be taken to include any collection of machines that individually or jointly execute a set (or multiple sets) of instructions to perform any one or more of the methodologies discussed herein.
- The
example computer system 2800 includes a processor 2812 (e.g., a central processing unit (CPU), a graphics processing unit (GPU), or both), a main memory 2804 and a static memory 2806, which communicate with each other via a bus 2808. The computer system 2800 may further include a video display unit 2810 (e.g., a liquid crystal display (LCD) or a cathode ray tube (CRT)). The computer system 2800 also includes an alphanumeric input device 2812 (e.g., a keyboard), a cursor control device 2814 (e.g., a mouse), a drive unit 2816, a signal generation device 2818 (e.g., a speaker) and a network interface device 2820. - The
drive unit 2816 includes a machine-readable medium 2822 on which is stored one or more sets of instructions (e.g., software 2824) embodying any one or more of the methodologies or functions described herein. The software 2824 may also reside, completely or at least partially, within the main memory 2804 and/or within the processor 2812 during execution thereof by the computer system 2800, the main memory 2804 and the processor 2812 also constituting machine-readable media. - The
software 2824 may further be transmitted or received over a network 2826 via the network interface device 2820. - While the machine-readable medium 2822 is shown in an example embodiment to be a single medium, the term “machine-readable medium” should be taken to include a single medium or multiple media (e.g., a centralized or distributed database, and/or associated caches and servers) that store the one or more sets of instructions. The term “machine-readable medium” shall also be taken to include any medium that is capable of storing or encoding a set of instructions for execution by the machine and that cause the machine to perform any one or more of the methodologies of the present invention. The term “machine-readable medium” shall accordingly be taken to include, but not be limited to, solid-state memories, optical media, and magnetic media. In some embodiments, the machine-readable medium is a non-transitory machine-readable medium. - Certain systems, apparatus, applications or processes are described herein as including a number of modules. A module may be a unit of distinct functionality that may be presented in software, hardware, or combinations thereof. When the functionality of a module is performed in any part through software, the module includes a machine-readable medium. The modules may be regarded as being communicatively coupled.
- In an example embodiment, sensor data may be accessed from a plurality of motion sensors and a bed sensor deployed in a living unit for a first time period. An activity pattern for the first time period may be identified based on at least a portion of sensor data associated with the first time period. The activity pattern may represent a physical and cognitive health condition of a person residing in the living unit. Additional sensor data may be accessed from the plurality of motion sensors and the bed sensor deployed in the living unit for a second time period. The second time period may occur after the first time period. A determination of whether a deviation of the activity pattern of the first time period has occurred for the second time period may be performed. An alert may be generated based on a determination that the deviation has occurred.
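One plausible reading of the deviation test (in the spirit of the statistical-parameter embodiment in the claims) is an out-of-band check of the second time period against the first period's mean and standard deviation; the k-sigma band and the daily-count feature are assumptions made for this sketch.

```python
import statistics

def deviation_alert(first_period_counts, second_period_counts, k=2.0):
    """Flag a deviation when the second time period's mean activity falls
    outside mean +/- k * stdev of the first (baseline) time period."""
    mu = statistics.mean(first_period_counts)
    sigma = statistics.stdev(first_period_counts)
    return abs(statistics.mean(second_period_counts) - mu) > k * sigma

# Baseline daily motion-hit totals vs. a markedly quieter second period.
print(deviation_alert([100, 110, 90, 105, 95], [40, 35, 45]))  # True
```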
- In an example embodiment, health data of a person may be accessed for a first time period. Sensor data from a plurality of motion sensors and a bed sensor deployed in a living unit may be accessed for the first time period. The person may live in the living unit. Health data may be correlated to at least a portion of the sensor data for the first time period. Additional sensor data may be accessed from the plurality of motion sensors and the bed sensor deployed in the living unit for a second time period. The second time period may occur after the first time period. A determination of whether a change in a health condition of the person has occurred may be made based on the additional sensor data and correlation of the health data to at least the portion of the sensor data for the first time period.
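The correlation step can be illustrated with an ordinary least-squares fit relating sensor-derived features to a measured health value (FIG. 25, for example, compares predicted and measured pulse pressure). The linear model and the feature choice are assumptions made for the sketch, not the embodiment's specific method.

```python
import numpy as np

def fit_health_model(sensor_features, health_values):
    """Least-squares fit correlating rows of sensor-derived features with a
    measured health value; returns coefficients including a bias term."""
    X = np.column_stack([np.asarray(sensor_features, dtype=float),
                         np.ones(len(sensor_features))])
    coef, *_ = np.linalg.lstsq(X, np.asarray(health_values, dtype=float), rcond=None)
    return coef

def predict_health(coef, sensor_features):
    """Predict the health value for new rows of sensor-derived features."""
    X = np.column_stack([np.asarray(sensor_features, dtype=float),
                         np.ones(len(sensor_features))])
    return X @ coef
```

A change in health condition could then be suspected when new measurements drift away from the model's predictions.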
- In an example embodiment, sensor data may be accessed from a plurality of motion sensors and a bed sensor deployed in a living unit for a time period. A display may be generated based on access of the sensor data associated with the time period.
- In an example embodiment, a first density map and a second density map may be accessed. The first density map may have a plurality of first color mappings. The second density map may have a plurality of second color mappings. A particular first color mapping may have a color based on density and may be associated with a position based on a particular hour and a particular day. Density may be based on a number of motion sensor hits during the particular hour and a determination of the away-from-home time period. A dis-similarity between the first density map and the second density map may be computed based on a texture feature of the first density map and the second density map. A computational result may be generated based on computing the dis-similarity.
- In an example embodiment, a plurality of feature clusters may be generated for a time period. The time period may include a plurality of days. A particular feature cluster may be associated with a plurality of feature vectors. A particular feature vector may be associated with sensor data from at least some of a plurality of motion sensors and a bed sensor deployed in a living unit. Additional sensor data associated with a particular feature for a different time period may be accessed. A determination of whether the additional sensor data falls within the plurality of feature clusters or belongs in a new cluster may be made. A notification may be generated based on a result of a determination.
- Thus, methods and systems for an integrated network have been described. Although embodiments of the present invention have been described with reference to specific example embodiments, it will be evident that various modifications and changes may be made to these embodiments without departing from the broader spirit and scope of the embodiments of the invention. Accordingly, the specification and drawings are to be regarded in an illustrative rather than a restrictive sense.
- Various activities described with respect to the methods identified herein can be executed in serial or parallel fashion. Although “End” blocks are shown in the flowcharts, the methods may be performed continuously.
- The Abstract of the Disclosure is provided to comply with 37 C.F.R. § 1.72(b), requiring an abstract that will allow the reader to quickly ascertain the nature of the technical disclosure. It is submitted with the understanding that it will not be used to interpret or limit the scope or meaning of the claims. In addition, in the foregoing Detailed Description, it can be seen that various features are grouped together in a single embodiment for the purpose of streamlining the disclosure. This method of disclosure is not to be interpreted as reflecting an intention that the claimed embodiments require more features than are expressly recited in each claim. Rather, as the following claims reflect, inventive subject matter may lie in less than all features of a single disclosed embodiment. Thus, the following claims are hereby incorporated into the Detailed Description, with each claim standing on its own as a separate embodiment.
Claims (20)
1. A system comprising:
a sensor network, the sensor network comprising a plurality of motion sensors and a bed sensor for deployment in a living unit; and
a processor for cooperation with the sensor network, the processor configured to:
access sensor data from the plurality of motion sensors and the bed sensor for a first time period;
identify an activity pattern for the first time period based on at least a portion of sensor data associated with the first time period;
access additional sensor data from the plurality of motion sensors and the bed sensor for a second time period, the second time period occurring after the first time period;
determine whether a deviation of the activity pattern of the first time period has occurred for the second time period; and
generate an alert based on a determination that the deviation has occurred.
2. The system of claim 1 , wherein the activity pattern includes a plurality of features and the deviation is associated with a particular feature of the plurality of features.
3. The system of claim 1 , wherein the processor is further configured to:
identify the activity pattern for the second time period based on access of the additional sensor data associated with the second time period; and
wherein the determination of whether the deviation has occurred includes a determination of whether the deviation of the activity pattern of the second time period from the activity pattern of the first time period exceeds a threshold.
4. The system of claim 1 , wherein the plurality of motion sensors and the bed sensor are passive, non-wearable sensors.
5. The system of claim 1 , wherein the first time period has a same time duration as the second time period.
6. The system of claim 1 , wherein the processor is further configured to:
calculate statistical parameters of at least a portion of the sensor data for the first time period; and
wherein the determination of whether the deviation has occurred includes a determination of whether at least a portion of the additional sensor data for the second time period is outside of a threshold, wherein the threshold is based on the statistical parameters.
7. The system of claim 1 , wherein the processor is further configured to:
transmit the alert including a link to a web interface, the web interface including the sensor data of the second time period in the context of the sensor data of the first time period.
8. The system of claim 7 , wherein the processor is further configured to:
receive a feedback response to the alert, the feedback response including feedback regarding clinical relevance of the alert; and
take action based on receipt of the feedback response.
9. The system of claim 8 , wherein taking the action comprises:
adjusting the threshold, the sensor parameter used for the alert, or both based on the receipt of the feedback response.
10. The system of claim 1 , wherein the activity pattern includes a plurality of features.
11. A method comprising:
accessing sensor data from a plurality of motion sensors and a bed sensor deployed in a living unit for a first time period;
identifying an activity pattern for the first time period based on at least a portion of sensor data associated with the first time period;
accessing additional sensor data from the plurality of motion sensors and the bed sensor deployed in the living unit for a second time period, the second time period occurring after the first time period;
determining whether a deviation of the activity pattern of the first time period has occurred for the second time period; and
generating an alert based on a determination that the deviation has occurred.
12. A method comprising:
accessing sensor data from a plurality of motion sensors and a bed sensor deployed in a living unit for a time period; and
generating a display based on access of the sensor data associated with the time period.
13. The method of claim 12 , wherein the sensor data is grouped on the display based on a plurality of categories, the plurality of categories include motion, pulse, breathing, and restlessness.
14. The method of claim 12 , further comprising:
receiving a selection of a person and a date range; and
wherein accessing sensor data from the plurality of motion sensors and the bed sensor for the time period is based on receipt of the selection.
15. The method of claim 12 , further comprising:
receiving a time interval modification request; and
wherein generation of the display is based on access of the sensor data associated with the time period and receipt of the time interval modification request.
16. The method of claim 12 , further comprising:
receiving a time increment modification request; and
wherein generation of the display is based on access of the sensor data associated with the time period and receipt of the time increment modification request.
17. The method of claim 12 , further comprising:
determining an away-from-home time period for a person associated with the living unit during the time period; and
wherein generation of the display is based on access of the sensor data and a determination of the away-from-home time period.
18. The method of claim 17 , wherein determining the away-from-home time period comprises:
analyzing the sensor data to determine whether a living unit departure sensor sequence and a living unit return sensor sequence has occurred; and
calculating a time difference between occurrence of the living unit departure sensor sequence and occurrence of the living unit return sensor sequence, wherein the away-from-home time period is based on the time difference.
19. The method of claim 17 , further comprising:
computing a number of motion sensor hits for a plurality of hours, a single motion sensor hit being associated with a particular motion sensor of the plurality of motion sensors, the time period including the plurality of hours; and
calculating density for the plurality of hours, the density for a particular hour of the plurality of hours being based on the number of motion sensor hits during the particular hour and the determination of the away-from-home time period; and
wherein generation of the display is based on calculation of the density.
20. The method of claim 19 , wherein generation of the display comprises:
selecting a plurality of color mappings, a particular color mapping having a color based on the density and being associated with a position based on the particular hour and a particular day, the time period including a plurality of days; and
wherein generation of the display is based on selection of the plurality of color mappings.
Priority Applications (1)
Application Number | Priority Date | Filing Date | Title |
---|---|---|---|
US17/409,974 US20210378511A1 (en) | 2009-06-01 | 2021-08-24 | Integrated Sensor Network Methods and Systems |
Applications Claiming Priority (4)
Application Number | Priority Date | Filing Date | Title |
---|---|---|---|
US21762309P | 2009-06-01 | 2009-06-01 | |
US12/791,628 US10188295B2 (en) | 2009-06-01 | 2010-06-01 | Integrated sensor network methods and systems |
US16/251,478 US11147451B2 (en) | 2009-06-01 | 2019-01-18 | Integrated sensor network methods and systems |
US17/409,974 US20210378511A1 (en) | 2009-06-01 | 2021-08-24 | Integrated Sensor Network Methods and Systems |
Related Parent Applications (1)
Application Number | Title | Priority Date | Filing Date |
---|---|---|---|
US16/251,478 Continuation US11147451B2 (en) | 2009-06-01 | 2019-01-18 | Integrated sensor network methods and systems |
Publications (1)
Publication Number | Publication Date |
---|---|
US20210378511A1 true US20210378511A1 (en) | 2021-12-09 |
Family
ID=43219597
Family Applications (5)
Application Number | Title | Priority Date | Filing Date |
---|---|---|---|
US12/791,496 Active 2032-10-10 US8890937B2 (en) | 2009-06-01 | 2010-06-01 | Anonymized video analysis methods and systems |
US12/791,628 Active 2033-10-18 US10188295B2 (en) | 2009-06-01 | 2010-06-01 | Integrated sensor network methods and systems |
US16/251,478 Active 2030-09-09 US11147451B2 (en) | 2009-06-01 | 2019-01-18 | Integrated sensor network methods and systems |
US16/254,339 Abandoned US20190167103A1 (en) | 2009-06-01 | 2019-01-22 | Integrated Sensor Network Methods and Systems |
US17/409,974 Abandoned US20210378511A1 (en) | 2009-06-01 | 2021-08-24 | Integrated Sensor Network Methods and Systems |
Family Applications Before (4)
Application Number | Title | Priority Date | Filing Date |
---|---|---|---|
US12/791,496 Active 2032-10-10 US8890937B2 (en) | 2009-06-01 | 2010-06-01 | Anonymized video analysis methods and systems |
US12/791,628 Active 2033-10-18 US10188295B2 (en) | 2009-06-01 | 2010-06-01 | Integrated sensor network methods and systems |
US16/251,478 Active 2030-09-09 US11147451B2 (en) | 2009-06-01 | 2019-01-18 | Integrated sensor network methods and systems |
US16/254,339 Abandoned US20190167103A1 (en) | 2009-06-01 | 2019-01-22 | Integrated Sensor Network Methods and Systems |
Country Status (1)
Country | Link |
---|---|
US (5) | US8890937B2 (en) |
Families Citing this family (48)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
US8452052B2 (en) * | 2008-01-21 | 2013-05-28 | The Boeing Company | Modeling motion capture volumes with distance fields |
US8890937B2 (en) | 2009-06-01 | 2014-11-18 | The Curators Of The University Of Missouri | Anonymized video analysis methods and systems |
US9053562B1 (en) | 2010-06-24 | 2015-06-09 | Gregory S. Rabin | Two dimensional to three dimensional moving image converter |
US20120010488A1 (en) * | 2010-07-01 | 2012-01-12 | Henry Barry J | Method and apparatus for improving personnel safety and performance using logged and real-time vital sign monitoring |
US9020886B2 (en) * | 2010-12-23 | 2015-04-28 | Ncr Corporation | Peer to peer diagnostic tool |
US11080513B2 (en) * | 2011-01-12 | 2021-08-03 | Gary S. Shuster | Video and still image data alteration to enhance privacy |
WO2012118998A2 (en) * | 2011-03-02 | 2012-09-07 | The Regents Of The University Of California | Apparatus, system, and method for detecting activities and anomalies in time series data |
US20130176142A1 (en) * | 2011-06-10 | 2013-07-11 | Aliphcom, Inc. | Data-capable strapband |
US20130127620A1 (en) | 2011-06-20 | 2013-05-23 | Cerner Innovation, Inc. | Management of patient fall risk |
US8884751B2 (en) * | 2011-07-01 | 2014-11-11 | Albert S. Baldocchi | Portable monitor for elderly/infirm individuals |
US10546481B2 (en) | 2011-07-12 | 2020-01-28 | Cerner Innovation, Inc. | Method for determining whether an individual leaves a prescribed virtual perimeter |
US9741227B1 (en) | 2011-07-12 | 2017-08-22 | Cerner Innovation, Inc. | Method and process for determining whether an individual suffers a fall requiring assistance |
US11013415B2 (en) | 2011-08-31 | 2021-05-25 | The Curators Of The University Of Missouri | Hydraulic bed sensor and system for non-invasive monitoring of physiological data |
US9408561B2 (en) * | 2012-04-27 | 2016-08-09 | The Curators Of The University Of Missouri | Activity analysis, fall detection and risk assessment systems and methods |
US9597016B2 (en) * | 2012-04-27 | 2017-03-21 | The Curators Of The University Of Missouri | Activity analysis, fall detection and risk assessment systems and methods |
US9710761B2 (en) | 2013-03-15 | 2017-07-18 | Nordic Technology Group, Inc. | Method and apparatus for detection and prediction of events based on changes in behavior |
AU2014257036A1 (en) * | 2013-04-23 | 2015-11-12 | Canary Connect, Inc. | Security and/or monitoring devices and systems |
US10096223B1 (en) | 2013-12-18 | 2018-10-09 | Cerner Innovation, Inc. | Method and process for determining whether an individual suffers a fall requiring assistance
US9729833B1 (en) | 2014-01-17 | 2017-08-08 | Cerner Innovation, Inc. | Method and system for determining whether an individual takes appropriate measures to prevent the spread of healthcare-associated infections along with centralized monitoring |
US10078956B1 (en) | 2014-01-17 | 2018-09-18 | Cerner Innovation, Inc. | Method and system for determining whether an individual takes appropriate measures to prevent the spread of healthcare-associated infections |
US10225522B1 (en) | 2014-01-17 | 2019-03-05 | Cerner Innovation, Inc. | Method and system for determining whether an individual takes appropriate measures to prevent the spread of healthcare-associated infections |
US20210202103A1 (en) * | 2014-03-28 | 2021-07-01 | Hc1.Com Inc. | Modeling and simulation of current and future health states |
US10971264B1 (en) * | 2014-05-21 | 2021-04-06 | Intrado Corporation | Patient tracking and dynamic updating of patient profile |
US10129384B2 (en) | 2014-09-29 | 2018-11-13 | Nordic Technology Group Inc. | Automatic device configuration for event detection |
US9378421B2 (en) * | 2014-09-29 | 2016-06-28 | Xerox Corporation | System and method for seat occupancy detection from ceiling mounted camera using robust adaptive threshold criteria |
US10090068B2 (en) | 2014-12-23 | 2018-10-02 | Cerner Innovation, Inc. | Method and system for determining whether a monitored individual's hand(s) have entered a virtual safety zone |
US10524722B2 (en) | 2014-12-26 | 2020-01-07 | Cerner Innovation, Inc. | Method and system for determining whether a caregiver takes appropriate measures to prevent patient bedsores |
US10091463B1 (en) | 2015-02-16 | 2018-10-02 | Cerner Innovation, Inc. | Method for determining whether an individual enters a prescribed virtual zone using 3D blob detection |
WO2016161119A1 (en) * | 2015-04-01 | 2016-10-06 | Smartcare Consultants, Llc | System for determining behavioral patterns and deviations from determined behavioral patterns |
WO2016172549A1 (en) * | 2015-04-23 | 2016-10-27 | Gen-Nine, Inc. | Activity and exercise monitoring system |
US10342478B2 (en) | 2015-05-07 | 2019-07-09 | Cerner Innovation, Inc. | Method and system for determining whether a caretaker takes appropriate measures to prevent patient bedsores |
US9892611B1 (en) | 2015-06-01 | 2018-02-13 | Cerner Innovation, Inc. | Method for determining whether an individual enters a prescribed virtual zone using skeletal tracking and 3D blob detection |
US10206630B2 (en) | 2015-08-28 | 2019-02-19 | Foresite Healthcare, Llc | Systems for automatic assessment of fall risk |
US11864926B2 (en) | 2015-08-28 | 2024-01-09 | Foresite Healthcare, Llc | Systems and methods for detecting attempted bed exit |
US9892310B2 (en) | 2015-12-31 | 2018-02-13 | Cerner Innovation, Inc. | Methods and systems for detecting prohibited objects in a patient room |
CN107534736B (en) * | 2016-03-30 | 2020-04-28 | 华为技术有限公司 | Image registration method and device of terminal and terminal |
CA3030850C (en) | 2016-06-28 | 2023-12-05 | Foresite Healthcare, Llc | Systems and methods for use in detecting falls utilizing thermal sensing |
US10147184B2 (en) | 2016-12-30 | 2018-12-04 | Cerner Innovation, Inc. | Seizure detection |
US10943088B2 (en) | 2017-06-14 | 2021-03-09 | Target Brands, Inc. | Volumetric modeling to identify image areas for pattern recognition |
US10643446B2 (en) | 2017-12-28 | 2020-05-05 | Cerner Innovation, Inc. | Utilizing artificial intelligence to detect objects or patient safety events in a patient room |
US10482321B2 (en) | 2017-12-29 | 2019-11-19 | Cerner Innovation, Inc. | Methods and systems for identifying the crossing of a virtual barrier |
AU2019350718A1 (en) | 2018-09-24 | 2021-04-29 | The Curators Of The University Of Missouri | Model-based sensor technology for detection of cardiovascular status |
US10922936B2 (en) | 2018-11-06 | 2021-02-16 | Cerner Innovation, Inc. | Methods and systems for detecting prohibited objects |
US11507769B2 (en) | 2018-12-12 | 2022-11-22 | International Business Machines Corporation | Interpreting sensor transmission patterns to analyze anomalies in a smart environment |
WO2021040428A1 (en) * | 2019-08-29 | 2021-03-04 | Samsung Electronics Co., Ltd. | Method and system for identifying an activity of a user |
US20220058177A1 (en) * | 2020-08-21 | 2022-02-24 | Sap Se | Customized processing of sensor data |
US20220147736A1 (en) * | 2020-11-09 | 2022-05-12 | Altumview Systems Inc. | Privacy-preserving human action recognition, storage, and retrieval via joint edge and cloud computing |
CN114027822B (en) * | 2021-04-19 | 2022-11-25 | 北京超思电子技术有限责任公司 | Respiration rate measuring method and device based on PPG signal |
Citations (6)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
WO2000018449A2 (en) * | 1998-09-30 | 2000-04-06 | Minimed Inc. | Communication station and software for interfacing with an infusion pump, analyte monitor, analyte meter, or the like |
US20050131736A1 (en) * | 2003-12-16 | 2005-06-16 | Adventium Labs And Red Wing Technologies, Inc. | Activity monitoring |
US20060241510A1 (en) * | 2005-04-25 | 2006-10-26 | Earlysense Ltd. | Techniques for prediction and monitoring of clinical episodes |
US20080117060A1 (en) * | 2006-11-17 | 2008-05-22 | General Electric Company | Multifunctional personal emergency response system |
US20090079813A1 (en) * | 2007-09-24 | 2009-03-26 | Gesturetek, Inc. | Enhanced Interface for Voice and Video Communications |
US20110193704A1 (en) * | 2009-08-31 | 2011-08-11 | Abbott Diabetes Care Inc. | Displays for a medical device |
Family Cites Families (60)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
US4320766A (en) | 1979-03-13 | 1982-03-23 | Instrumentarium Oy | Apparatus in medicine for the monitoring and/or recording of the body movements of a person on a bed, for instance of a patient
NL8701288A (en) | 1987-06-02 | 1989-01-02 | Henrikus Bernardus Maria Verme | Monitoring system for physically or mentally handicapped - detects when person leaves bed or chair using meandering hose serving pressure sensor unit |
US5097841A (en) | 1988-09-22 | 1992-03-24 | Terumo Kabushiki Kaisha | Disposable pressure transducer and disposable pressure transducer apparatus |
US5309921A (en) * | 1992-02-11 | 1994-05-10 | Spectrum Medical Technologies | Apparatus and method for respiratory monitoring |
US6002994A (en) * | 1994-09-09 | 1999-12-14 | Lane; Stephen S. | Method of user monitoring of physiological and non-physiological measurements |
US6546813B2 (en) | 1997-01-08 | 2003-04-15 | The Trustees Of Boston University | Patient monitoring system employing array of force sensors on a bedsheet or similar substrate |
US5844488A (en) | 1997-09-23 | 1998-12-01 | Musick; Jeff L. | Bed sensor and alarm |
CN2394612Y (en) | 1999-07-16 | 2000-09-06 | 陈纪铭 | Monitoring sick bed |
US7522186B2 (en) | 2000-03-07 | 2009-04-21 | L-3 Communications Corporation | Method and apparatus for providing immersive surveillance |
US7656299B2 (en) | 2007-01-17 | 2010-02-02 | Hoana Medical, Inc. | Bed exit and patient detection system |
US20020077781A1 (en) | 2000-10-17 | 2002-06-20 | Spx Corporation | Data monitoring and display method and apparatus |
US6915008B2 (en) | 2001-03-08 | 2005-07-05 | Point Grey Research Inc. | Method and apparatus for multi-nodal, three-dimensional imaging |
CN2477135Y (en) | 2001-03-30 | 2002-02-20 | 王金华 | Precision electronic weight measuring bed |
US20030058111A1 (en) * | 2001-09-27 | 2003-03-27 | Koninklijke Philips Electronics N.V. | Computer vision based elderly care monitoring system |
US7202791B2 (en) * | 2001-09-27 | 2007-04-10 | Koninklijke Philips N.V. | Method and apparatus for modeling behavior using a probability distribution function
GB0207207D0 (en) * | 2002-03-27 | 2002-05-08 | Smith Simon L | Activity and behavioural monitor and alarm device |
US20040030531A1 (en) | 2002-03-28 | 2004-02-12 | Honeywell International Inc. | System and method for automated monitoring, recognizing, supporting, and responding to the behavior of an actor |
CA2393880A1 (en) | 2002-07-17 | 2004-01-17 | Tactex Controls Inc. | Bed occupant monitoring system |
JP4013195B2 (en) | 2002-11-07 | 2007-11-28 | 株式会社シービーシステム開発 | Sleep state monitoring device |
KR100507780B1 (en) * | 2002-12-20 | 2005-08-17 | 한국전자통신연구원 | Apparatus and method for high-speed marker-free motion capture |
DE10305289A1 (en) | 2003-02-08 | 2004-08-19 | Rapp, Josef, Dr. | Multimedia support in the form of a bed or chair, e.g. for use in medical therapeutic applications that incorporate a massage function, has an acceleration or inclination sensor so that acceleration can be controlled |
US7330566B2 (en) * | 2003-05-15 | 2008-02-12 | Microsoft Corporation | Video-based gait recognition |
US7532924B2 (en) | 2003-09-22 | 2009-05-12 | Cardiac Pacemakers, Inc. | Cardiac rhythm management system with exercise test interface |
US20050088515A1 (en) | 2003-10-23 | 2005-04-28 | Geng Z. J. | Camera ring for three-dimensional (3D) surface imaging |
US7396331B2 (en) | 2003-10-27 | 2008-07-08 | Home Guardian, Llc | System and process for non-invasive collection and analysis of physiological signals |
US7831087B2 (en) * | 2003-10-31 | 2010-11-09 | Hewlett-Packard Development Company, L.P. | Method for visual-based recognition of an object |
WO2005055824A1 (en) | 2003-12-04 | 2005-06-23 | Hoana Medical, Inc. | Intelligent medical vigilance system |
FR2865032B1 (en) | 2004-01-08 | 2006-09-29 | Balea | MOBILE WEIGHING BASE |
US20070118054A1 (en) | 2005-11-01 | 2007-05-24 | Earlysense Ltd. | Methods and systems for monitoring patients for clinical episodes |
US7218325B1 (en) | 2004-03-31 | 2007-05-15 | Trading Technologies International, Inc. | Graphical display with integrated recent period zoom and historical period context data |
US8589315B2 (en) | 2004-08-14 | 2013-11-19 | Hrl Laboratories, Llc | Behavior recognition using cognitive swarms and fuzzy graphs |
US7843351B2 (en) * | 2004-09-01 | 2010-11-30 | Robert Bourne | Back training device |
US20060055543A1 (en) * | 2004-09-10 | 2006-03-16 | Meena Ganesh | System and method for detecting unusual inactivity of a resident |
HRPK20041063B3 (en) | 2004-11-15 | 2007-10-31 | Nikolić Gojko | "intelligent" sick-bed mattress quilt |
JP2006288932A (en) | 2005-04-14 | 2006-10-26 | Keakomu:Kk | Leaving-from-bed monitoring system |
US8022987B2 (en) * | 2005-06-30 | 2011-09-20 | Sandia Corporation | Information-based self-organization of sensor nodes of a sensor network |
US7420472B2 (en) * | 2005-10-16 | 2008-09-02 | Bao Tran | Patient monitoring apparatus |
JP5388580B2 (en) | 2005-11-29 | 2014-01-15 | ベンチャー ゲイン リミテッド ライアビリティー カンパニー | Residue-based management of human health |
US7589637B2 (en) * | 2005-12-30 | 2009-09-15 | Healthsense, Inc. | Monitoring activity of an individual |
US20090178199A1 (en) | 2006-05-05 | 2009-07-16 | Koninklijke Philips Electronics N.V. | Sensor unit, bed for a patient and method of modifying a patient's bed |
US20070262247A1 (en) | 2006-05-11 | 2007-11-15 | Carlos Becerra | Sensory feedback bed |
US20070268480A1 (en) | 2006-05-18 | 2007-11-22 | Kaye Mitchell G | Bed angle sensor for reducing ventilator-associated pneumonia |
WO2008020363A2 (en) | 2006-08-14 | 2008-02-21 | Philips Intellectual Property & Standards Gmbh | A bed with integrated sensor unit for a patient |
US20080077020A1 (en) | 2006-09-22 | 2008-03-27 | Bam Labs, Inc. | Method and apparatus for monitoring vital signs remotely |
KR100743112B1 (en) | 2006-10-20 | 2007-07-27 | 김정현 | Sensor for bed-wetting |
GB2445760A (en) | 2007-01-19 | 2008-07-23 | Wound Solutions Ltd | A flexible pressure sensor |
US20080275349A1 (en) * | 2007-05-02 | 2008-11-06 | Earlysense Ltd. | Monitoring, predicting and treating clinical episodes |
AU2008258283B2 (en) | 2007-06-08 | 2013-06-20 | Sonomedical Pty Ltd | Passive monitoring sensor system for use with mattress |
US8120498B2 (en) | 2007-09-24 | 2012-02-21 | Intel-Ge Care Innovations Llc | Capturing body movement related to a fixed coordinate system |
KR20090032339A (en) | 2007-09-27 | 2009-04-01 | 한국전자통신연구원 | Method and apparatus for a service to inquire after elderly people's health
US8149273B2 (en) | 2007-11-30 | 2012-04-03 | Fuji Xerox Co., Ltd. | System and methods for vital sign estimation from passive thermal video |
CN101499106B (en) | 2008-01-30 | 2010-11-10 | 蒲卫 | Digitized hospital bed monitoring network system |
DE102008011142B4 (en) | 2008-02-26 | 2022-09-08 | Dr. Ing. H.C. F. Porsche Aktiengesellschaft | surveillance system |
US20090243833A1 (en) | 2008-03-31 | 2009-10-01 | Ching Ching Huang | Monitoring system and method for patient care |
US20100163315A1 (en) | 2008-10-25 | 2010-07-01 | Rachel Dunford York | Bed monitoring pad |
US8890937B2 (en) | 2009-06-01 | 2014-11-18 | The Curators Of The University Of Missouri | Anonymized video analysis methods and systems |
US20100330543A1 (en) * | 2009-06-24 | 2010-12-30 | Alexander Black | Method and system for a child review process within a networked community |
US8998817B2 (en) | 2009-08-28 | 2015-04-07 | Up-Med Gmbh | Blood pressure measuring device and method for measuring the blood pressure of a living being |
US20110166992A1 (en) | 2010-01-06 | 2011-07-07 | Firethorn Holdings, Llc | System and method for creating and managing a stored value account associated with a client unique identifier |
US20110308015A1 (en) | 2010-03-22 | 2011-12-22 | Paul Newham | Fitted Mattress Cover with Integrated Sensor Elements for Monitoring the Presence of a Person on a Mattress |
- 2010
  - 2010-06-01 US US12/791,496 patent/US8890937B2/en active Active
  - 2010-06-01 US US12/791,628 patent/US10188295B2/en active Active
- 2019
  - 2019-01-18 US US16/251,478 patent/US11147451B2/en active Active
  - 2019-01-22 US US16/254,339 patent/US20190167103A1/en not_active Abandoned
- 2021
  - 2021-08-24 US US17/409,974 patent/US20210378511A1/en not_active Abandoned
Also Published As
Publication number | Publication date |
---|---|
US11147451B2 (en) | 2021-10-19 |
US8890937B2 (en) | 2014-11-18 |
US10188295B2 (en) | 2019-01-29 |
US20100302043A1 (en) | 2010-12-02 |
US20190167103A1 (en) | 2019-06-06 |
US20190167102A1 (en) | 2019-06-06 |
US20100328436A1 (en) | 2010-12-30 |
Similar Documents
Publication | Publication Date | Title |
---|---|---|
US20210378511A1 (en) | Integrated Sensor Network Methods and Systems | |
Suryadevara et al. | Wireless sensor network based home monitoring system for wellness determination of elderly | |
US9019099B2 (en) | Systems and methods for patient monitoring | |
US6524239B1 (en) | Apparatus for non-instrusively measuring health parameters of a subject and method of use thereof | |
US9257029B1 (en) | System and method of remote monitoring and diagnostics for health conditions and emergency interventions | |
JP5555443B2 (en) | System for monitoring human cognitive ability | |
US20100081889A1 (en) | Systems and methods for monitoring and evaluating individual performance | |
Glascock et al. | The impact of behavioral monitoring technology on the provision of health care in the home. | |
Suryadevara et al. | Sensor data fusion to determine wellness of an elderly in intelligent home monitoring environment | |
JP2013078567A (en) | Method for providing remote health monitoring data and associated system | |
EP3807890B1 (en) | Monitoring a subject | |
EP3160328B1 (en) | Device, system and computer program for detecting a health condition of a subject | |
CA2866969C (en) | Method and system for determining hrv and rrv and use to identify potential condition onset | |
JP2022009084A (en) | Cognitive function prediction system and program | |
WO2017146643A1 (en) | A patient monitoring system | |
KR20180012379A (en) | Apparatus and method for analyzing user health status using activity history | |
US20160228067A1 (en) | System and method for intelligent monitoring of patient vital signs | |
Huang et al. | Knowledge discovery from lifestyle profiles to support self-management of chronic heart failure | |
WO2024150512A1 (en) | Method for generating trained model, device for generating trained model, and program | |
US20240188883A1 (en) | Hypnodensity-based sleep apnea monitoring system and method of operation thereof | |
US20240282455A1 (en) | Methods, systems, and devices for analyzing objective health data and subject-conveyed health information for health information reporting, treatment planning, diagnoses, and treatment | |
JP2024065638A (en) | Computer-executed method, program and information processing device for acquiring index indicating stability of sleep pattern | |
Budai et al. | Monitoring health status of relatives—“The smoke in the chimney” approach | |
CN117978565A (en) | Intelligent home interaction control method and system based on WeChat applet |
Legal Events
Date | Code | Title | Description
---|---|---|---
| AS | Assignment | Owner name: THE CURATORS OF THE UNIVERSITY OF MISSOURI, MISSOURI. Free format text: ASSIGNMENT OF ASSIGNORS INTEREST;ASSIGNORS:SKUBIC, MARJORIE;RANTZ, MARILYN J.;POPESCU, MIHAIL;AND OTHERS;SIGNING DATES FROM 20100628 TO 20100811;REEL/FRAME:057268/0459 |
| STPP | Information on status: patent application and granting procedure in general | Free format text: DOCKETED NEW CASE - READY FOR EXAMINATION |
| STPP | Information on status: patent application and granting procedure in general | Free format text: NON FINAL ACTION MAILED |
| STCB | Information on status: application discontinuation | Free format text: ABANDONED -- FAILURE TO RESPOND TO AN OFFICE ACTION |