CA2773507C - Fall detection and reporting technology
- Publication number
- CA2773507C
- Authority
- CA
- Canada
- Prior art keywords
- patient
- fall event
- potential fall
- determining
- person
- Prior art date
- Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
- Active
Classifications
-
- G—PHYSICS
- G08—SIGNALLING
- G08B—SIGNALLING OR CALLING SYSTEMS; ORDER TELEGRAPHS; ALARM SYSTEMS
- G08B21/00—Alarms responsive to a single specified undesired or abnormal condition and not otherwise provided for
- G08B21/02—Alarms for ensuring the safety of persons
- G08B21/04—Alarms for ensuring the safety of persons responsive to non-activity, e.g. of elderly persons
- G08B21/0407—Alarms for ensuring the safety of persons responsive to non-activity, e.g. of elderly persons based on behaviour analysis
- G08B21/043—Alarms for ensuring the safety of persons responsive to non-activity, e.g. of elderly persons based on behaviour analysis detecting an emergency event, e.g. a fall
-
- G—PHYSICS
- G16—INFORMATION AND COMMUNICATION TECHNOLOGY [ICT] SPECIALLY ADAPTED FOR SPECIFIC APPLICATION FIELDS
- G16Z—INFORMATION AND COMMUNICATION TECHNOLOGY [ICT] SPECIALLY ADAPTED FOR SPECIFIC APPLICATION FIELDS, NOT OTHERWISE PROVIDED FOR
- G16Z99/00—Subject matter not provided for in other main groups of this subclass
-
- A—HUMAN NECESSITIES
- A61—MEDICAL OR VETERINARY SCIENCE; HYGIENE
- A61B—DIAGNOSIS; SURGERY; IDENTIFICATION
- A61B5/00—Measuring for diagnostic purposes; Identification of persons
- A61B5/0002—Remote monitoring of patients using telemetry, e.g. transmission of vital signals via a communication network
- A61B5/0015—Remote monitoring of patients using telemetry, e.g. transmission of vital signals via a communication network characterised by features of the telemetry system
- A61B5/0022—Monitoring a patient using a global network, e.g. telephone networks, internet
-
- A—HUMAN NECESSITIES
- A61—MEDICAL OR VETERINARY SCIENCE; HYGIENE
- A61B—DIAGNOSIS; SURGERY; IDENTIFICATION
- A61B5/00—Measuring for diagnostic purposes; Identification of persons
- A61B5/103—Detecting, measuring or recording devices for testing the shape, pattern, colour, size or movement of the body or parts thereof, for diagnostic purposes
- A61B5/11—Measuring movement of the entire body or parts thereof, e.g. head or hand tremor, mobility of a limb
- A61B5/1116—Determining posture transitions
- A61B5/1117—Fall detection
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06T—IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
- G06T7/00—Image analysis
- G06T7/10—Segmentation; Edge detection
- G06T7/194—Segmentation; Edge detection involving foreground-background segmentation
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06T—IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
- G06T7/00—Image analysis
- G06T7/70—Determining position or orientation of objects or cameras
- G06T7/73—Determining position or orientation of objects or cameras using feature-based methods
-
- G—PHYSICS
- G08—SIGNALLING
- G08B—SIGNALLING OR CALLING SYSTEMS; ORDER TELEGRAPHS; ALARM SYSTEMS
- G08B21/00—Alarms responsive to a single specified undesired or abnormal condition and not otherwise provided for
- G08B21/02—Alarms for ensuring the safety of persons
-
- G—PHYSICS
- G16—INFORMATION AND COMMUNICATION TECHNOLOGY [ICT] SPECIALLY ADAPTED FOR SPECIFIC APPLICATION FIELDS
- G16H—HEALTHCARE INFORMATICS, i.e. INFORMATION AND COMMUNICATION TECHNOLOGY [ICT] SPECIALLY ADAPTED FOR THE HANDLING OR PROCESSING OF MEDICAL OR HEALTHCARE DATA
- G16H40/00—ICT specially adapted for the management or administration of healthcare resources or facilities; ICT specially adapted for the management or operation of medical equipment or devices
- G16H40/60—ICT specially adapted for the management or administration of healthcare resources or facilities; ICT specially adapted for the management or operation of medical equipment or devices for the operation of medical equipment or devices
- G16H40/67—ICT specially adapted for the management or administration of healthcare resources or facilities; ICT specially adapted for the management or operation of medical equipment or devices for the operation of medical equipment or devices for remote operation
-
- A—HUMAN NECESSITIES
- A61—MEDICAL OR VETERINARY SCIENCE; HYGIENE
- A61B—DIAGNOSIS; SURGERY; IDENTIFICATION
- A61B5/00—Measuring for diagnostic purposes; Identification of persons
- A61B5/0059—Measuring for diagnostic purposes; Identification of persons using light, e.g. diagnosis by transillumination, diascopy, fluorescence
- A61B5/0077—Devices for viewing the surface of the body, e.g. camera, magnifying lens
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06T—IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
- G06T2207/00—Indexing scheme for image analysis or image enhancement
- G06T2207/10—Image acquisition modality
- G06T2207/10016—Video; Image sequence
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06T—IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
- G06T2207/00—Indexing scheme for image analysis or image enhancement
- G06T2207/30—Subject of image; Context of image processing
- G06T2207/30196—Human being; Person
-
- Y—GENERAL TAGGING OF NEW TECHNOLOGICAL DEVELOPMENTS; GENERAL TAGGING OF CROSS-SECTIONAL TECHNOLOGIES SPANNING OVER SEVERAL SECTIONS OF THE IPC; TECHNICAL SUBJECTS COVERED BY FORMER USPC CROSS-REFERENCE ART COLLECTIONS [XRACs] AND DIGESTS
- Y02—TECHNOLOGIES OR APPLICATIONS FOR MITIGATION OR ADAPTATION AGAINST CLIMATE CHANGE
- Y02A—TECHNOLOGIES FOR ADAPTATION TO CLIMATE CHANGE
- Y02A90/00—Technologies having an indirect contribution to adaptation to climate change
- Y02A90/10—Information and communication technologies [ICT] supporting adaptation to climate change, e.g. for weather forecasting or climate simulation
Abstract
Fall detection and reporting technology, in which output from at least one sensor configured to sense, in a room of a building, activity associated with a patient falling is monitored and a determination is made to capture one or more images of the room based on the monitoring. An image of the room is captured with a camera positioned to include the patient within a field of view of the camera and the captured image of the room is analyzed to detect a state of the patient at a time of capturing the image. A potential fall event for the patient is determined based on the detected state of the patient and a message indicating the potential fall event for the patient is sent based on the determination of the potential fall event for the patient. Techniques are also described for fall detection and reporting using an on-body sensing device.
Description
FALL DETECTION AND REPORTING TECHNOLOGY
CROSS REFERENCE TO RELATED APPLICATIONS
The present application claims the benefit of U.S. Provisional Application No.
61/471,495, filed April 4, 2011.
TECHNICAL FIELD
This disclosure relates to fall detection and reporting technology.
BACKGROUND
Falls are a public health concern and a cause of institutionalization in the senescent population, whom they disproportionately affect. Loosely defined as an unintentional and uncontrolled movement towards the ground or lower level, a fall can have debilitating and sometimes fatal consequences. Although falls increase rates of morbidity and mortality, earlier detection and reporting of such events can improve outcomes.
Practical, early detection and reporting of falls has been an elusive goal.
Efforts to detect falls have classically employed wearable technologies to capture user input (e.g., panic button press) or to characterize and classify movements and postures.
Although these technologies demonstrate reasonable utility in ideal conditions, user non-compliance and fall-related incapacitation reduce general efficacy in application.
Furthermore, the inability to verify the incidence of detected falls (both true and false) leads to inaccurate fall reporting and undesirable handling of potential fall events.
SUMMARY
Techniques are described for fall detection and reporting technology. In one aspect, a method includes monitoring output from at least one sensor configured to sense, in a room of a building, activity associated with a patient falling and, based on the monitoring of output from the at least one sensor, determining to capture one or more images of the room. The method also includes capturing, with a camera positioned to include the patient within a field of view of the camera, an image of the room and analyzing the captured image of the room to detect a state of the patient at a time of capturing the image. The method further includes determining, based on the detected state of the patient, a potential fall event for the patient and, based on the determination of the potential fall event for the patient, sending, by a communication device, a message indicating the potential fall event for the patient.
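At its core, the claimed method is a monitor-capture-analyze-report loop. The sketch below shows one possible orchestration of that flow, assuming hypothetical sensor, camera, analyzer, and messenger objects; none of these names come from the patent itself.

```python
# Hypothetical orchestration of the claimed flow. Every object and
# method name below is illustrative, not taken from the patent.
from dataclasses import dataclass


@dataclass
class PatientState:
    orientation: str  # e.g. "upright" or "horizontal"
    active: bool      # whether recent movement was observed


def monitor_and_report(sensor, camera, analyzer, messenger):
    """Run one pass of the pipeline: monitor, capture, analyze, report."""
    reading = sensor.read()                # monitor output from the sensor
    if not sensor.suggests_fall(reading):  # no fall-related activity sensed
        return
    image = camera.capture()               # capture an image of the room
    state = analyzer.detect_state(image)   # detect the patient's state
    if state.orientation != "upright" and not state.active:
        # Potential fall event: report it via the communication device.
        messenger.send("potential fall event detected")
```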
Implementations may include one or more of the following features. For example, the at least one sensor configured to sense activity associated with the patient falling may be an on-body sensor configured to detect an impact and determine a change in an orientation of the patient. In this example, the method may include receiving data indicating a detected change in an orientation of the patient and an amount of the orientation change, receiving data indicating a detected impact and a severity of the detected impact, and determining, based on the received amount of orientation change and the received data indicating the severity of the impact of the patient, a threshold for inactivity of the patient. The method also may include determining, based on output from the on-body sensor, that the patient has been inactive for a period of time greater than the determined threshold and determining to capture one or more images of the room based on the determination that the patient has been inactive for a period of time greater than the determined threshold.
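To make the on-body branch concrete, the sketch below derives an inactivity threshold from the reported orientation change and impact severity and then checks elapsed inactivity against it. The baseline period, angle and g-force cutoffs, and scaling factors are invented for illustration; the patent does not specify values.

```python
def inactivity_threshold(orientation_change_deg: float,
                         impact_severity_g: float) -> float:
    """Derive a patient inactivity threshold in seconds (assumed rule).

    A large orientation change combined with a severe impact shortens
    the grace period so images are captured sooner; milder events get
    a longer one.
    """
    base = 60.0  # assumed baseline grace period in seconds
    if orientation_change_deg > 70 and impact_severity_g > 2.0:
        return base * 0.25  # severe fall signature: react quickly
    if orientation_change_deg > 45 or impact_severity_g > 1.5:
        return base * 0.5
    return base


def should_capture_images(inactive_seconds: float,
                          orientation_change_deg: float,
                          impact_severity_g: float) -> bool:
    # Trigger image capture once inactivity exceeds the derived threshold.
    return inactive_seconds > inactivity_threshold(
        orientation_change_deg, impact_severity_g)
```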
In addition, the at least one sensor configured to sense activity associated with the patient falling may be a button located in the room at a position that permits the patient to press the button after a fall and the method may include determining that the button has been pressed. The at least one sensor configured to sense activity associated with the patient falling may be a sensor configured to determine a presence of the patient in the room and the method may include receiving, from the sensor configured to determine the presence of the patient in the room, a signal indicating that the patient is present in the room and, after a threshold period of time, determining that the patient has not left the room and that no further signals have been received from the sensor configured to determine the presence of the patient in the room. The method may include determining to capture one or more images of the room based on determining that the patient has not left the room and that no further
signals have been received from the sensor configured to determine the presence of the patient in the room.
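For the presence-sensor branch, one plausible implementation is a watchdog that triggers image capture when a patient who entered the room neither leaves nor produces further presence signals within a window. The poll() interface, its event strings, and the 30-minute default are assumptions.

```python
import time


def watch_presence(presence_sensor, timeout_seconds: float = 1800.0) -> bool:
    """Return True when image capture should be triggered.

    After a presence signal, silence for timeout_seconds with no exit
    event is treated as possible incapacitation in the room.
    """
    last_signal = time.monotonic()
    while time.monotonic() - last_signal < timeout_seconds:
        event = presence_sensor.poll()      # hypothetical non-blocking poll
        if event == "left_room":
            return False                    # patient left: nothing to do
        if event == "presence":
            last_signal = time.monotonic()  # fresh signal resets the window
        time.sleep(1.0)
    return True                             # silent too long: capture images
```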
In some examples, the method may include performing image foreground segmentation on the captured image to create a segmented image, performing template matching on the segmented image to identify a human shape in the segmented image, and calculating a position and an orientation associated with the identified human shape in the segmented image. In these examples, the method may include determining a potential fall event for the patient based on the calculated position and the calculated orientation. Further, in these examples, the method may include monitoring successive image and sensor data after calculating the position and the orientation, comparing the successive image and sensor data with prior image and sensor data, determining an activity level of the patient based on the comparison of the successive image and sensor data with the prior image and sensor data, classifying the potential fall event based on the determined activity level of the patient, and handling reporting for the potential fall event based on the classification of the potential fall event.
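A rough sketch of this image-analysis chain using standard OpenCV primitives follows: background subtraction for foreground segmentation, shape matching against a reference human-silhouette contour for template matching, and a rotated bounding rectangle for the position and orientation. It assumes the OpenCV 4 findContours signature; the template contour, area cutoff, and match-score threshold are illustrative choices.

```python
import cv2
import numpy as np

# Background model updated with each frame of the camera stream.
subtractor = cv2.createBackgroundSubtractorMOG2()


def analyze_frame(frame: np.ndarray, human_template: np.ndarray):
    """Return ((cx, cy), angle_deg) of the best human-shaped blob, or None.

    human_template is a reference contour of an upright human silhouette.
    """
    mask = subtractor.apply(frame)   # foreground segmentation
    mask = cv2.medianBlur(mask, 5)   # suppress segmentation noise
    contours, _ = cv2.findContours(mask, cv2.RETR_EXTERNAL,
                                   cv2.CHAIN_APPROX_SIMPLE)
    best = None
    for contour in contours:
        if cv2.contourArea(contour) < 500:  # ignore small blobs (assumed)
            continue
        score = cv2.matchShapes(contour, human_template,
                                cv2.CONTOURS_MATCH_I1, 0.0)
        if score < 0.3 and (best is None or score < best[0]):
            best = (score, contour)         # best template match so far
    if best is None:
        return None
    # Rotated bounding rectangle gives position (center) and orientation.
    (cx, cy), (_w, _h), angle = cv2.minAreaRect(best[1])
    return (cx, cy), angle
```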
In some implementations, the method may include analyzing the monitored output from the at least one sensor over a period of time to determine activities of the patient over the period of time and accessing information indicative of expected activities of the patient over the period of time. In these implementations, the method may include comparing the determined activities of the patient over the period of time to the expected activities of the patient over the period of time and, based on the comparison revealing that the determined activities of the patient over the period of time do not match the expected activities of the patient over the period of time, determining a level of fall risk associated with the patient.
The method may include determining that the level of fall risk associated with the patient exceeds a threshold and, based on the determination that the level of fall risk associated with the patient exceeds the threshold, sending a message to a monitoring server that is located remotely from the building. The method also may include determining that the level of fall risk associated with the patient exceeds a threshold and, based on the determination that the level of fall risk associated with the patient exceeds the threshold,
automatically performing one or more operations to reduce the level of fall risk associated with the patient.
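One way to realize this activity comparison is to score the shortfall of observed activities against expected ones and escalate when the score crosses a threshold. The deviation rule, the 0.5 factors, and the callback names below are assumptions.

```python
def fall_risk_level(observed: dict, expected: dict) -> float:
    """Score in [0, 1]: fraction of expected activities that deviated.

    Both dicts map an activity name (e.g. "kitchen_visits") to a daily
    count; an observed count well below expectation counts as a deviation.
    """
    deviations = sum(
        1 for activity, expected_count in expected.items()
        if observed.get(activity, 0) < 0.5 * expected_count  # assumed rule
    )
    return deviations / max(len(expected), 1)


RISK_THRESHOLD = 0.5  # assumed


def respond_to_risk(risk: float, notify_server, reduce_risk):
    if risk > RISK_THRESHOLD:
        notify_server(risk)  # message the remote monitoring server
        reduce_risk()        # e.g. an automatic risk-reduction operation
```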
In some examples, the method may include sending, to the patient, the message indicating the potential fall event and providing the patient with an opportunity to cancel the potential fall event. In these examples, the method may include determining that the patient has not cancelled the potential fall event within a threshold period of time and, based on
determining that the patient has not cancelled the potential fall event within the threshold period of time, sending a message to a monitoring server indicating the potential fall event.
Further, in these examples, the method may include receiving, from the patient, an indication to cancel the potential fall event and, based on receiving the indication to cancel the potential fall event, determining an overall activity of the patient between detecting the potential fall event and receiving the indication to cancel the potential fall event. In addition, the method may include determining that the overall activity of the patient is above a threshold of activity and, based on determining that the overall activity of the patient is above the threshold of activity, signaling that the potential fall event was detection of a false fall. The method also may include determining that the overall activity of the patient is below a threshold of activity and, based on determining that the overall activity of the patient is below the threshold of activity, determining an orientation of the patient.
The method further may include determining that the determined orientation of the patient is upright and, based on determining that the determined orientation of the patient is upright, signaling that the potential fall event was detection of a minor fall.
In some implementations, the method may include determining that the determined orientation of the patient is not upright and, based on determining that the determined orientation of the patient is not upright, sending another message to the patient that provides the patient with another opportunity to cancel the potential fall event. In these
implementations, the method may include determining that the patient has not cancelled the potential fall event within a threshold period of time after sending another message to the patient that provides the patient with another opportunity to cancel the potential fall event and, based on determining that the patient has not cancelled the potential fall event within the
threshold period of time after sending another message to the patient that provides the patient with another opportunity to cancel the potential fall event, sending a message to a monitoring server indicating the potential fall event. Also, in these implementations, the method may include, after sending another message to the patient that provides the patient with another opportunity to cancel the potential fall event, receiving, from the patient, an indication to cancel the potential fall event and, based on receiving the indication to cancel the potential fall event, signaling that the potential fall event was a cancelled fall event.
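Taken together, the cancellation logic above amounts to a small decision tree. The sketch below mirrors it; the activity threshold, the orientation label, and the function signature are illustrative rather than taken from the patent.

```python
from enum import Enum


class FallClass(Enum):
    FALSE_FALL = "false fall"
    MINOR_FALL = "minor fall"
    CANCELLED = "cancelled fall event"
    REPORTED = "reported to monitoring server"


def classify_fall_event(cancelled: bool, overall_activity: float,
                        orientation: str, second_cancel) -> FallClass:
    """Classify a potential fall event after the cancellation window.

    second_cancel is a callable returning True if the patient cancels
    when given another opportunity. ACTIVITY_THRESHOLD is assumed.
    """
    ACTIVITY_THRESHOLD = 0.3
    if not cancelled:
        return FallClass.REPORTED    # no cancellation: escalate to server
    if overall_activity > ACTIVITY_THRESHOLD:
        return FallClass.FALSE_FALL  # patient moved freely: false alarm
    if orientation == "upright":
        return FallClass.MINOR_FALL  # little movement but upright again
    # Not upright: offer one more chance to cancel before escalating.
    return FallClass.CANCELLED if second_cancel() else FallClass.REPORTED
```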
Implementations of the described techniques may include hardware, a method or process implemented at least partially in hardware, or a computer-readable storage medium encoded with executable instructions that, when executed by a processor, perform operations.
In some implementations, there is a method comprising: monitoring output from at least one sensor configured to sense, in a room of a building, activity associated with a patient falling; based on the monitoring of output from the at least one sensor, determining to capture one or more images of the room; based on the determination to capture one or more images of the room, capturing, with a camera positioned to include the patient within a field of view of the camera, an image of the room;
analyzing the captured image of the room to detect a state of the patient at a time of capturing the image; determining, based on the detected state of the patient, a potential fall event for the patient; and based on the determination of the potential fall event for the patient, sending, by a communication device, a message indicating the potential fall event for the patient. The at least one sensor configured to sense activity associated with the patient falling is an on-body sensor configured to detect an impact and determine a change in an orientation of the patient. Monitoring output from the at least one sensor comprises:
receiving data indicating a detected change in an orientation of the patient and an amount of the orientation change; receiving data indicating a detected impact and a severity of the detected impact; determining, based on the received amount of orientation change and the received data indicating the severity of the impact of the patient, a threshold for inactivity of the patient; and determining, based on output from the on-body sensor, that the patient has been inactive for a period of time greater than the determined threshold. Determining to capture one or more images of the room comprises determining to capture one or more images of the room based on the determination that the patient has been inactive for a period of time greater than the determined threshold.
In some implementations, there is a method comprising: monitoring output from at least one sensor configured to sense, in a room of a building, activity associated with a patient falling; based on the monitoring of output from the at least one sensor, determining to capture one or more images of the room; based on the determination to capture one or more images of the room, capturing, with a camera positioned to include the patient within a field of view of the camera, an image of the room;
analyzing the captured image of the room to detect a state of the patient at a time of capturing the image; determining, based on the detected state of the patient, a potential fall event for the patient; and based on the determination of the potential fall event for the patient, sending, by a communication device, a message indicating the potential fall event for the patient. The at least one sensor configured to sense activity associated with the patient falling is a sensor configured to determine a presence of the patient in the room.
Monitoring output from the at least one sensor comprises: receiving, from the sensor configured to determine the presence of the patient in the room, a signal indicating that the patient is present in the room; and after a threshold period of time, determining that the patient has not left the room and that no further signals have been received from the sensor configured to determine the presence of the patient in the room.
Determining to capture one or more images of the room comprises determining to capture one or more images of the room based on determining that the patient has not left the room and that no further signals have been received from the sensor configured to determine the presence of the patient in the room.
In some implementations, there is a method comprising: monitoring output from at least one sensor configured to sense, in a room of a building, activity associated with a patient falling; based on the monitoring of output from the at least one sensor, determining to capture one or more images of the room; based on the determination to capture one or more images of the room, capturing, with a camera positioned to include the patient within a field of view of the camera, an image of the room;
analyzing the captured image of the room to detect a state of the patient at a time of capturing the image; determining, based on the detected state of the patient, a potential fall event for the patient; and based on the determination of the potential fall event for the patient, sending, by a communication device, a message indicating the potential fall event for the patient. Analyzing the captured image of the room to detect a state of the patient at a time of capturing the image comprises: performing image foreground segmentation on the captured image to create a segmented image; performing template matching on the segmented image to identify a human shape in the segmented image; and calculating a position and an orientation associated with the identified human shape in the segmented image. Determining the potential fall event for the patient comprises determining a potential fall event for the patient based on the calculated position and the calculated orientation.
In some implementations, there is a method comprising: monitoring output from at least one sensor configured to sense, in a room of a building, activity associated with a patient falling; based on the monitoring of output from the at least one sensor, determining to capture one or more images of the room; based on the determination to capture one or more images of the room, capturing, with a camera positioned to include the patient within a field of view of the camera, an image of the room;
analyzing the captured image of the room to detect a state of the patient at a time of capturing the image; determining, based on the detected state of the patient, a potential fall event for the patient; based on the determination of the potential fall event for the patient, sending, by a communication device, a message indicating the potential fall event for the patient;
analyzing the monitored output from the at least one sensor over a period of time to determine activities of the patient over the period of time; accessing information indicative of expected activities of the patient over the period of time;
comparing the determined activities of the patient over the period of time to the expected activities of the patient over the period of time; and based on the comparison revealing that the determined activities of the patient over the period of time do not match the expected activities of the patient over the period of time, determining a level of fall risk associated with the patient.
In some implementations, there is a method comprising: monitoring output from at least one sensor configured to sense, in a room of a building, activity associated with a patient falling; based on the monitoring of output from the at least one sensor, determining to capture one or more images of the room; based on the determination to capture one or more images of the room, capturing, with a camera positioned to include the patient within a field of view of the camera, an image of the room;
analyzing the captured image of the room to detect a state of the patient at a time of capturing the image; determining, based on the detected state of the patient, a potential fall event for the patient; and based on the determination of the potential fall event for the patient, sending, by a communication device, a message indicating the potential fall event for the patient, wherein sending the message indicating the potential fall event for the patient comprises sending, to the patient, the message indicating the potential fall event and providing the patient with an opportunity to cancel the potential fall event.
In some implementations, there is a system comprising at least one processor.
The system further comprises at least one memory coupled to the at least one processor having stored thereon instructions which, when executed by the at least one processor, causes the at least one processor to perform operations comprising: monitoring output from at least one sensor configured to sense, in a room of a building, activity associated with a patient falling; based on the monitoring of output from the at least one sensor, determining to capture one or more images of the room; based on the determination to capture one or more images of the room, capturing, with a camera positioned to include the patient within a field of view of the camera, an image of the room;
analyzing the captured image of the room to detect a state of the patient at a time of capturing the image; determining, based on the detected state of the patient, a potential fall event for the patient; and based on the determination of the potential fall event for the patient, sending, by a communication device, a message indicating the potential fall event for the patient. The at least one sensor configured to sense activity associated with the patient falling is an on-body sensor configured to detect an impact and determine a change in an orientation of the patient. The monitoring output from the at least one sensor comprises:
receiving data indicating a detected change in an orientation of the patient and an amount of the orientation change; receiving data indicating a detected impact and a severity of the detected impact; determining, based on the received amount of orientation change and the received data indicating the severity of the impact of the patient, a threshold for inactivity of the patient; and determining, based on output from the on-body sensor, that the patient has been inactive for a period of time greater than the determined threshold. Determining to capture one or more images of the room comprises determining to capture one or more images of the room based on the determination that the patient has been inactive for a period of time greater than the determined threshold.
In some implementations, there is a system comprising at least one processor.
The system further comprises at least one memory coupled to the at least one processor having stored thereon instructions which, when executed by the at least one processor, causes the at least one processor to perform operations comprising: monitoring output from at least one sensor configured to sense, in a room of a building, activity associated with a patient falling; based on the monitoring of output from the at least one sensor, determining to capture one or more images of the room; based on the determination to capture one or more images of the room, capturing, with a camera positioned to include the patient within a field of view of the camera, an image of the room;
analyzing the captured image of the room to detect a state of the patient at a time of capturing the image; determining, based on the detected state of the patient, a potential fall event for the patient; and based on the determination of the potential fall event for the patient, sending, by a communication device, a message indicating the potential fall event for the patient. The at least one sensor configured to sense activity associated with the patient falling is a sensor configured to determine a presence of the patient in the room. The monitoring output from the at least one sensor comprises: receiving, from the sensor configured to determine the presence of the patient in the room, a signal indicating that the patient is present in the room; and after a threshold period of time, determining that the patient has not left the room and that no further signals have been received from the sensor configured to determine the presence of the patient in the room.
Determining to capture one or more images of the room comprises determining to capture one or more images of the room based on determining that the patient has not left the room and that no further signals have been received from the sensor configured to determine the presence of the patient in the room.
In some implementations, there is a system comprising at least one processor.
The system further comprises at least one memory coupled to the at least one processor having stored thereon instructions which, when executed by the at least one processor, causes the at least one processor to perform operations comprising: monitoring output from at least one sensor configured to sense, in a room of a building, activity associated with a patient falling; based on the monitoring of output from the at least one sensor, determining to capture one or more images of the room; based on the determination to capture one or more images of the room, capturing, with a camera positioned to include the patient within a field of view of the camera, an image of the room;
analyzing the captured image of the room to detect a state of the patient at a time of capturing the image; determining, based on the detected state of the patient, a potential fall event for the patient; and based on the determination of the potential fall event for the patient, sending, by a communication device, a message indicating the potential fall event for the patient. Analyzing the captured image of the room to detect a state of the patient at a time of capturing the image comprises: performing image foreground segmentation on the captured image to create a segmented image; performing template matching on the segmented image to identify a human shape in the segmented image; and calculating a position and an orientation associated with the identified human shape in the segmented image. Determining the potential fall event for the patient comprises determining a potential fall event for the patient based on the calculated position and the calculated orientation.
In some implementations, there is a system comprising at least one processor.
The system further comprises at least one memory coupled to the at least one processor having stored thereon instructions which, when executed by the at least one processor, causes the at least one processor to perform operations comprising: monitoring output from at least one sensor configured to sense, in a room of a building, activity associated with a patient falling; based on the monitoring of output from the at least one sensor, determining to capture one or more images of the room; based on the determination to capture one or more images of the room, capturing, with a camera positioned to include the patient within a field of view of the camera, an image of the room;
analyzing the captured image of the room to detect a state of the patient at a time of capturing the image; determining, based on the detected state of the patient, a potential fall event for the patient; based on the determination of the potential fall event for the patient, sending, by a communication device, a message indicating the potential fall event for the patient;
analyzing the monitored output from the at least one sensor over a period of time to determine activities of the patient over the period of time; accessing information indicative of expected activities of the patient over the period of time;
comparing the determined activities of the patient over the period of time to the expected activities of the patient over the period of time; and based on the comparison revealing that the determined activities of the patient over the period of time do not match the expected activities of the patient over the period of time, determining a level of fall risk associated with the patient.
In some implementations, there is a system comprising at least one processor.
The system further comprises at least one memory coupled to the at least one processor having stored thereon instructions which, when executed by the at least one processor, causes the at least one processor to perform operations comprising: monitoring output from at least one sensor configured to sense, in a room of a building, activity associated with a patient falling; based on the monitoring of output from the at least one sensor, determining to capture one or more images of the room; based on the determination to capture one or more images of the room, capturing, with a camera positioned to include the patient within a field of view of the camera, an image of the room;
analyzing the captured image of the room to detect a state of the patient at a time of capturing the image; determining, based on the detected state of the patient, a potential fall event for the patient; and based on the determination of the potential fall event for the patient, sending, by a communication device, a message indicating the potential fall event for the patient. Sending the message indicating the potential fall event for the patient comprises sending, to the patient, the message indicating the potential fall event and providing the patient with an opportunity to cancel the potential fall event.
In some implementations, there is a method comprising: monitoring output from at least one sensor configured to sense activity associated with a patient falling;
analyzing the monitored output from the at least one sensor over a period of time to determine activities of the patient over the period of time; accessing information indicative of expected activities of the patient over the period of time;
comparing the determined activities of the patient over the period of time to the expected activities of the patient over the period of time; based on the comparison revealing that the determined activities of the patient over the period of time do not match the expected activities of the patient over the period of time, determining, by at least one processor, a level of fall risk associated with the patient; and performing an operation directed to assisting the patient with a fall event based on the determined level of fall risk associated with the patient.
In some implementations, there is a system comprising at least one processor.
The system further comprises at least one memory coupled to the at least one processor having stored thereon instructions which, when executed by the at least one processor, causes the at least one processor to perform operations comprising: monitoring output from at least one sensor configured to sense activity associated with a patient falling;
analyzing the monitored output from the at least one sensor over a period of time to determine activities of the patient over the period of time; accessing information indicative of expected activities of the patient over the period of time;
comparing the determined activities of the patient over the period of time to the expected activities of the patient over the period of time; based on the comparison revealing that the determined activities of the patient over the period of time do not match the expected activities of the patient over the period of time, determining a level of fall risk associated with the patient; and performing an operation directed to assisting the patient with a fall event based on the determined level of fall risk associated with the patient.
In some implementations, there is a method comprising: monitoring output from at least one sensor configured to sense activity associated with a fall; based on the monitoring of output from the at least one sensor, determining to capture one or more images of an area associated with the at least one sensor; based on the determination to capture one or more images of the area associated with the at least one sensor, capturing, with a camera, an image of the area associated with the at least one sensor;
detecting, in the captured image, a state of a person included in the captured image;
determining, based on the detected state of the person, a potential fall event in the area associated with the at least one sensor; and based on the determination of the potential fall event, sending, by a communication device and to a device associated with the person involved in the potential fall event, a message that indicates the potential fall event and provides the person with an opportunity to cancel the potential fall event.
In some implementations, there is a system comprising at least one processor.
The system further comprises at least one memory coupled to the at least one processor having stored thereon instructions which, when executed by the at least one processor, causes the at least one processor to perform operations comprising: monitoring output from at least one sensor configured to sense activity associated with a fall;
based on the monitoring of output from the at least one sensor, determining to capture one or more images of an area associated with the at least one sensor; based on the determination to capture one or more images of the area associated with the at least one sensor, capturing, with a camera, an image of the area associated with the at least one sensor;
detecting, in the captured image, a state of a person included in the captured image;
determining, based on the detected state of the person, a potential fall event in the area associated with the at least one sensor; and based on the determination of the potential fall event, sending, by a communication device and to a device associated with the person involved in the potential fall event, a message that indicates the potential fall event and provides the person with an opportunity to cancel the potential fall event.
In some implementations, there is a method comprising: determining to capture one or more images of an area; and based on the determination to capture one or more images of the area, capturing, with a camera positioned to include a person within a field of view of the camera, an image of the area. The method further comprises analyzing the captured image of the area to detect a state of the person by: performing image foreground segmentation on the captured image of the area to create a segmented image;
performing template matching on the segmented image to identify a human shape in the segmented image; and calculating a position and an orientation associated with the identified human shape in the segmented image. The method further comprises:
determining, based on the calculated position and the calculated orientation associated with the identified human shape in the segmented image, a potential fall event for the person; and performing an operation directed to assisting the person with the potential fall event based on the determined potential fall event for the person.
In some implementations, there is a system comprising at least one processor.
The system further comprises at least one memory coupled to the at least one processor having stored thereon instructions which, when executed by the at least one processor, causes the at least one processor to perform operations comprising:
determining to capture one or more images of an area; and based on the determination to capture one or more images of the area, capturing, with a camera positioned to include a person within a field of view of the camera, an image of the area. The operations further comprise analyzing the captured image of the area to detect a state of the person by:
performing image foreground segmentation on the captured image of the area to create a segmented image; performing template matching on the segmented image to identify a human shape in the segmented image; and calculating a position and an orientation associated with the identified human shape in the segmented image. The operations further comprise: determining, based on the calculated position and the calculated orientation associated with the identified human shape in the segmented image, a potential fall event for the person; and performing an operation directed to assisting the person with the potential fall event based on the determined potential fall event for the person.
In some implementations, there is a method comprising: triggering fall detection processing based on detection of a fall-related signature; calculating orientation change based on triggering fall detection processing; determining a minimum required inactivity period based on the fall-related signature and the orientation change; and detecting a fall based on the patient remaining inactive for at least the minimum required inactivity period.
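A fall-related signature on an on-body sensor is commonly detected as a spike in acceleration magnitude. The sketch below shows one such trigger check; the 2.5 g threshold and the sample format are assumptions.

```python
import math

GRAVITY = 9.81  # m/s^2


def is_fall_signature(samples, impact_threshold_g: float = 2.5) -> bool:
    """Return True if any accelerometer sample shows an impact spike.

    samples is an iterable of (ax, ay, az) readings in m/s^2; a reading
    whose magnitude exceeds the threshold triggers fall detection
    processing.
    """
    for ax, ay, az in samples:
        magnitude_g = math.sqrt(ax * ax + ay * ay + az * az) / GRAVITY
        if magnitude_g >= impact_threshold_g:
            return True
    return False
```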
Implementations of the described techniques may include hardware, a method or process implemented at least partially in hardware, or a computer-readable storage medium encoded with executable instructions that, when executed by a processor, perform operations.
In some implementations, there is a method comprising: monitoring output from at least one sensor configured to sense, in a room of a building, activity associated with a patient falling; based on the monitoring of output from the at least one sensor, determining to capture one or more images of the room; based on the determination to capture one or more images of the room, capturing, with a camera positioned to include the patient within a field of view of the camera, an image of the room;
analyzing the captured image of the room to detect a state of the patient at a time of capturing the image; determining, based on the detected state of the patient, a potential fall event for the patient; and based on the determination of the potential fall event for the patient, sending, by a communication device, a message indicating the potential fall event for the patient. The at least one sensor configured to sense activity associated with the patient falling is an on-body sensor configured to detect an impact and determine a change in an orientation of the patient. Monitoring output from the at least one sensor comprises:
receiving data indicating a detected change in an orientation of the patient and an amount of the orientation change; receiving data indicating a detected impact and a severity of the detected impact; determining, based on the received amount of orientation change and the received data indicating the severity of the impact of the patient, a threshold for inactivity of the patient; and determining, based on output from the on-body sensor, that the patient has been inactive for a period of time greater than the determined threshold. Determining to capture one or more images of the room comprises determining to capture one or more images of the room based on the determination that the patient has been inactive for a period of time greater than the determined threshold.
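For illustration, one plausible way to derive the inactivity threshold from the reported orientation change and impact severity is sketched below in Python; the scaling constants are assumptions, chosen only to show the idea that stronger fall evidence can shorten the required quiet period:

    def inactivity_threshold(orientation_change_deg, impact_severity_g, base_seconds=60.0):
        # Stronger evidence of a fall (large orientation change, hard impact)
        # yields a shorter required inactivity period before image capture.
        evidence = (min(orientation_change_deg / 90.0, 1.0) *
                    min(impact_severity_g / 3.0, 1.0))
        return base_seconds * (1.0 - 0.8 * evidence)   # ranges from 12 s to 60 s

    def should_capture_images(inactive_seconds, orientation_change_deg, impact_severity_g):
        return inactive_seconds > inactivity_threshold(orientation_change_deg,
                                                       impact_severity_g)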
In some implementations, there is a method comprising: monitoring output from at least one sensor configured to sense, in a room of a building, activity associated with a patient falling; based on the monitoring of output from the at least one sensor, determining to capture one or more images of the room; based on the determination to capture one or more images of the room, capturing, with a camera positioned to include the patient within a field of view of the camera, an image of the room;
analyzing the captured image of the room to detect a state of the patient at a time of capturing the image; determining, based on the detected state of the patient, a potential fall event for the patient; and based on the determination of the potential fall event for the patient, sending, by a communication device, a message indicating the potential fall event for the patient. The at least one sensor configured to sense activity associated with the patient falling is a sensor configured to determine a presence of the patient in the room.
Monitoring output from the at least one sensor comprises: receiving, from the sensor configured to determine the presence of the patient in the room, a signal indicating that the patient is present in the room; and after a threshold period of time, determining that the patient has not left the room and that no further signals have been received from the sensor configured to determine the presence of the patient in the room.
Determining to capture one or more images of the room comprises determining to capture one or more images of the room based on determining that the patient has not left the room and that no further signals have been received from the sensor configured to determine the presence of the patient in the room.
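A minimal sketch of this presence-based trigger, with an assumed ten-minute threshold, could look like the following; the class and method names are hypothetical:

    import time

    PRESENCE_TIMEOUT_SECONDS = 600  # assumed threshold period

    class PresenceWatch:
        def __init__(self):
            self.last_signal = None
        def on_presence_signal(self):
            # Each new signal confirms the patient is present and active.
            self.last_signal = time.time()
        def on_exit_detected(self):
            self.last_signal = None
        def should_capture_images(self):
            # Present, but no further signals within the threshold: capture images.
            return (self.last_signal is not None and
                    time.time() - self.last_signal > PRESENCE_TIMEOUT_SECONDS)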
In some implementations, there is a method comprising: monitoring output from at least one sensor configured to sense, in a room of a building, activity associated with a patient falling; based on the monitoring of output from the at least one sensor, determining to capture one or more images of the room; based on the determination to capture one or more images of the room, capturing, with a camera positioned to include the patient within a field of view of the camera, an image of the room;
analyzing the captured image of the room to detect a state of the patient at a time of capturing the image; determining, based on the detected state of the patient, a potential fall event for the patient; and based on the determination of the potential fall event for the patient, sending, by a communication device, a message indicating the potential fall event for the patient. Analyzing the captured image of the room to detect a state of the patient at a time of capturing the image comprises: performing image foreground segmentation on the captured image to create a segmented image; performing template matching on the segmented image to identify a human shape in the segmented image; and calculating a position and an orientation associated with the identified human shape in the segmented image. Determining the potential fall event for the patient comprises determining a potential fall event for the patient based on the calculated position and the calculated orientation.
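The segmentation, template-matching, and orientation steps could be approximated with standard computer-vision primitives. The following Python/OpenCV fragment is one possible realization, not the disclosed implementation; the match threshold and the lying-down heuristic are assumptions:

    import cv2
    import numpy as np

    # The subtractor needs a few frames of the empty room to learn the background.
    subtractor = cv2.createBackgroundSubtractorMOG2(history=200, detectShadows=False)

    def analyze_frame(frame, body_template):
        # Foreground segmentation: separate the moving person from the static room.
        mask = subtractor.apply(frame)
        mask = cv2.morphologyEx(mask, cv2.MORPH_OPEN, np.ones((5, 5), np.uint8))
        # Template matching: look for a human shape (a binary silhouette template).
        scores = cv2.matchTemplate(mask, body_template, cv2.TM_CCOEFF_NORMED)
        _, best_score, _, _ = cv2.minMaxLoc(scores)
        if best_score < 0.5:            # assumed acceptance threshold
            return None
        # Position and orientation of the matched shape from its fitted box.
        contours, _ = cv2.findContours(mask, cv2.RETR_EXTERNAL, cv2.CHAIN_APPROX_SIMPLE)
        if not contours:
            return None
        largest = max(contours, key=cv2.contourArea)
        (cx, cy), (w, h), angle = cv2.minAreaRect(largest)
        x, y, bw, bh = cv2.boundingRect(largest)
        lying = bw > 1.5 * bh           # wider than tall suggests a person on the floor
        return {"position": (cx, cy), "orientation_deg": angle, "possible_fall": lying}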
In some implementations, there is a method comprising: monitoring output from at least one sensor configured to sense, in a room of a building, activity associated with a patient falling; based on the monitoring of output from the at least one sensor, determining to capture one or more images of the room; based on the determination to capture one or more images of the room, capturing, with a camera positioned to include the patient within a field of view of the camera, an image of the room;
analyzing the captured image of the room to detect a state of the patient at a time of capturing the image; determining, based on the detected state of the patient, a potential fall event for the patient; based on the determination of the potential fall event for the patient, sending, by a communication device, a message indicating the potential fall event for the patient;
analyzing the monitored output from the at least one sensor over a period of time to determine activities of the patient over the period of time; accessing information indicative of expected activities of the patient over the period of time;
comparing the determined activities of the patient over the period of time to the expected activities of the patient over the period of time; and based on the comparison revealing that the determined activities of the patient over the period of time do not match the expected activities of the patient over the period of time, determining a level of fall risk associated with the patient.
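As a sketch of the comparison step, observed activity counts over the period can be checked against the expected profile and the mismatch mapped to a coarse risk level. The 50% deviation rule and the level names below are illustrative assumptions:

    def fall_risk_level(observed, expected):
        deviations = 0
        for activity, expected_count in expected.items():
            seen = observed.get(activity, 0)
            if expected_count and abs(seen - expected_count) / expected_count > 0.5:
                deviations += 1
        if deviations == 0:
            return "baseline"
        return "elevated" if deviations == 1 else "high"

    # Example: far fewer bathroom visits than expected raises the assessed risk.
    risk = fall_risk_level({"bathroom_visits": 1, "meals": 3},
                           {"bathroom_visits": 6, "meals": 3})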
In some implementations, there is a method comprising: monitoring output from at least one sensor configured to sense, in a room of a building, activity associated with a patient falling; based on the monitoring of output from the at least one sensor, determining to capture one or more images of the room; based on the determination to capture one or more images of the room, capturing, with a camera positioned to include the patient within a field of view of the camera, an image of the room;
analyzing the captured image of the room to detect a state of the patient at a time of capturing the image; determining, based on the detected state of the patient, a potential fall event for the patient; and based on the determination of the potential fall event for the patient, sending, by a communication device, a message indicating the potential fall event for the patient, wherein sending the message indicating the potential fall event for the patient comprises sending, to the patient, the message indicating the potential fall event and providing the patient with an opportunity to cancel the potential fall event.
In some implementations, there is a system comprising at least one processor.
The system further comprises at least one memory coupled to the at least one processor having stored thereon instructions which, when executed by the at least one processor, cause the at least one processor to perform operations comprising: monitoring output from at least one sensor configured to sense, in a room of a building, activity associated with a patient falling; based on the monitoring of output from the at least one sensor, determining to capture one or more images of the room; based on the determination to capture one or more images of the room, capturing, with a camera positioned to include the patient within a field of view of the camera, an image of the room;
analyzing the captured image of the room to detect a state of the patient at a time of capturing the image; determining, based on the detected state of the patient, a potential fall event for the patient; and based on the determination of the potential fall event for the patient, sending, by a communication device, a message indicating the potential fall event for the patient. The at least one sensor configured to sense activity associated with the patient falling is an on-body sensor configured to detect an impact and determine a change in an orientation of the patient. The monitoring output from the at least one sensor comprises:
receiving data indicating a detected change in an orientation of the patient and an amount of the orientation change; receiving data indicating a detected impact and a severity of the detected impact; determining, based on the received amount of orientation change and the received data indicating the severity of the impact of the patient, a threshold for inactivity of the patient; and determining, based on output from the on-body sensor, that the patient has been inactive for a period of time greater than the determined threshold. Determining to capture one or more images of the room comprises determining to capture one or more images of the room based on the determination that the patient has been inactive for a period of time greater than the determined threshold.
In some implementations, there is a system comprising at least one processor.
The system further comprises at least one memory coupled to the at least one processor having stored thereon instructions which, when executed by the at least one processor, cause the at least one processor to perform operations comprising: monitoring output from at least one sensor configured to sense, in a room of a building, activity associated with a patient falling; based on the monitoring of output from the at least one sensor, determining to capture one or more images of the room; based on the determination to capture one or more images of the room, capturing, with a camera positioned to include the patient within a field of view of the camera, an image of the room;
analyzing the captured image of the room to detect a state of the patient at a time of capturing the image; determining, based on the detected state of the patient, a potential fall event for the patient; and based on the determination of the potential fall event for the patient, sending, by a communication device, a message indicating the potential fall event for the patient. The at least one sensor configured to sense activity associated with the patient falling is a sensor configured to determine a presence of the patient in the room. The monitoring output from the at least one sensor comprises: receiving, from the sensor configured to determine the presence of the patient in the room, a signal indicating that the patient is present in the room; and after a threshold period of time, determining that the patient has not left the room and that no further signals have been received from the sensor configured to determine the presence of the patient in the room.
Determining to capture one or more images of the room comprises determining to capture one or more images of the room based on determining that the patient has not left the room and that no further signals have been received from the sensor configured to determine the presence of the patient in the room.
In some implementations, there is a system comprising at least one processor.
The system further comprises at least one memory coupled to the at least one processor having stored thereon instructions which, when executed by the at least one processor, cause the at least one processor to perform operations comprising: monitoring output from at least one sensor configured to sense, in a room of a building, activity associated with a patient falling; based on the monitoring of output from the at least one sensor, determining to capture one or more images of the room; based on the determination to capture one or more images of the room, capturing, with a camera positioned to include the patient within a field of view of the camera, an image of the room;
analyzing the captured image of the room to detect a state of the patient at a time of capturing the image; determining, based on the detected state of the patient, a potential fall event for the patient; and based on the determination of the potential fall event for the patient, sending, by a communication device, a message indicating the potential fall event for the patient. Analyzing the captured image of the room to detect a state of the patient at a time of capturing the image comprises: performing image foreground segmentation on the captured image to create a segmented image; performing template matching on the segmented image to identify a human shape in the segmented image; and calculating a position and an orientation associated with the identified human shape in the segmented image. Determining the potential fall event for the patient comprises determining a potential fall event for the patient based on the calculated position and the calculated orientation.
In some implementations, there is a system comprising at least one processor.
The system further comprises at least one memory coupled to the at least one processor having stored thereon instructions which, when executed by the at least one processor, cause the at least one processor to perform operations comprising: monitoring output from at least one sensor configured to sense, in a room of a building, activity associated with a patient falling; based on the monitoring of output from the at least one sensor, determining to capture one or more images of the room; based on the determination to capture one or more images of the room, capturing, with a camera positioned to include the patient within a field of view of the camera, an image of the room;
analyzing the captured image of the room to detect a state of the patient at a time of capturing the image; determining, based on the detected state of the patient, a potential fall event for the patient; based on the determination of the potential fall event for the patient, sending, by a communication device, a message indicating the potential fall event for the patient;
analyzing the monitored output from the at least one sensor over a period of time to determine activities of the patient over the period of time; accessing information indicative of expected activities of the patient over the period of time;
comparing the determined activities of the patient over the period of time to the expected activities of the patient over the period of time; and based on the comparison revealing that the determined activities of the patient over the period of time do not match the expected activities of the patient over the period of time, determining a level of fall risk associated with the patient.
In some implementations, there is a system comprising at least one processor.
The system further comprises at least one memory coupled to the at least one processor having stored thereon instructions which, when executed by the at least one processor, cause the at least one processor to perform operations comprising: monitoring output from at least one sensor configured to sense, in a room of a building, activity associated with a patient falling; based on the monitoring of output from the at least one sensor, determining to capture one or more images of the room; based on the determination to capture one or more images of the room, capturing, with a camera positioned to include the patient within a field of view of the camera, an image of the room;
analyzing the captured image of the room to detect a state of the patient at a time of capturing the image; determining, based on the detected state of the patient, a potential fall event for the patient; and based on the determination of the potential fall event for the patient, sending, by a communication device, a message indicating the potential fall event for the patient. Sending the message indicating the potential fall event for the patient comprises sending, to the patient, the message indicating the potential fall event and providing the patient with an opportunity to cancel the potential fall event.
In some implementations, there is a method comprising: monitoring output from at least one sensor configured to sense activity associated with a patient falling;
analyzing the monitored output from the at least one sensor over a period of time to determine activities of the patient over the period of time; accessing information indicative of expected activities of the patient over the period of time;
comparing the determined activities of the patient over the period of time to the expected activities of the patient over the period of time; based on the comparison revealing that the determined activities of the patient over the period of time do not match the expected activities of the patient over the period of time, determining, by at least one processor, a level of fall risk associated with the patient; and performing an operation directed to assisting the patient with a fall event based on the determined level of fall risk associated with the patient.
In some implementations, there is a system comprising at least one processor.
The system further comprises at least one memory coupled to the at least one processor having stored thereon instructions which, when executed by the at least one processor, cause the at least one processor to perform operations comprising: monitoring output from at least one sensor configured to sense activity associated with a patient falling;
analyzing the monitored output from the at least one sensor over a period of time to determine activities of the patient over the period of time; accessing information indicative of expected activities of the patient over the period of time;
comparing the determined activities of the patient over the period of time to the expected activities of the patient over the period of time; based on the comparison revealing that the determined activities of the patient over the period of time do not match the expected activities of the patient over the period of time, determining a level of fall risk associated with the patient; and performing an operation directed to assisting the patient with a fall event based on the determined level of fall risk associated with the patient.
In some implementations, there is a system comprising at least one processor.
The system further comprises at least one memory coupled to the at least one processor having stored thereon instructions which, when executed by the at least one processor, cause the at least one processor to perform operations comprising: triggering fall detection processing based on detection of a fall-related signature;
calculating orientation change based on triggering fall detection processing; determining a minimum required inactivity period based on the fall-related signature and the orientation change;
and detecting a potential fall event based on monitoring activity during the minimum required inactivity period.
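The signature-triggered flow can be summarized in a short sketch. The mapping from signature strength and orientation change to the minimum required inactivity period, and the activity threshold, are assumptions made for illustration:

    import time

    def minimum_inactivity_period(impact_g, orientation_change_deg):
        # Stronger fall evidence -> a shorter quiet period suffices.
        confidence = (min(impact_g / 3.0, 1.0) *
                      min(orientation_change_deg / 90.0, 1.0))
        return 10.0 + 50.0 * (1.0 - confidence)   # 10 s to 60 s

    def detect_potential_fall(impact_g, orientation_change_deg, read_activity):
        # Called when a fall-related signature triggers fall detection processing;
        # read_activity() returns a recent activity level from the sensors.
        quiet_needed = minimum_inactivity_period(impact_g, orientation_change_deg)
        start = time.time()
        while time.time() - start < quiet_needed:
            if read_activity() > 0.1:   # assumed activity threshold
                return False            # movement resumed; not a fall
            time.sleep(0.5)
        return True                     # inactive throughout: potential fall event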
The details of one or more implementations are set forth in the accompanying drawings and the description below. Other features will be apparent from the description and drawings, and from the claims.
DESCRIPTION OF DRAWINGS
FIGS. 1, 2, and 4 to 6 illustrate example systems.
FIGS. 3, 7, 8, 10, and 11 are flow charts illustrating example processes.
FIG. 9 illustrates example fall detection criteria.
FIG. 12 is a diagram illustrating fall detection examples.
DETAILED DESCRIPTION
Techniques are described for addressing the aforementioned fall detection and reporting challenges. For example, a monitoring system at a premises performs fall detection and reporting operations based on output from a sensor (e.g., an image sensor).
When the monitoring system detects that a person has fallen at the premises, actions are taken to assist the fallen person.
FIG. 1 illustrates an image sensing device 110 that may be installed within a monitored home or facility. The image sensing device 110 combines multi-modal sensing (e.g., passive infrared motion sensor, triaxial inertial sensor, illumination sensor), an infrared illumination source, camera, processor, memory, battery, and input/output capabilities. The image sensing device 110 detects events indicative of potential falls proximal to its installation location. A plurality of image sensing devices can be installed throughout a home or facility, and used in conjunction with other sensors, to increase the fall detection coverage area and provide specific location information for fall reporting and response.
The image sensing device 110 includes a processor 111, a memory 112, a camera 113, an illumination source 114, a motion sensor 115, an illumination sensor 116, a battery 117, and an input/output port 118. The processor 111 controls operations of the image sensing device 110 and may be any suitable processor. The memory 112 stores instructions that are executed by the processor 111 and also stores images captured by the camera 113. The memory 112 may be any type of memory that is capable of storing data and may include a combination of multiple memory units. For example, the memory 112 may be a Flash memory component that stores both instructions that are executed by the processor and images captured by the camera 113.
The camera 113 captures images of an area proximate to where the image sensing device is located. For instance, the camera 113 may be placed at an upper corner of a room in a building and, in this instance, the camera 113 captures images of the room. The camera 113 may be a photographic camera or other type of optical sensing device configured to capture images. In some implementations, the camera 113 is a CMOS camera sensor (or other CCD sensor) that captures images at various, different resolutions (e.g., low and/or high resolutions). For instance, the CMOS camera sensor may capture 640x480 pixels (e.g., VGA resolution) or higher resolutions. The camera 113 also may capture a lower resolution image (e.g., Quarter VGA = QVGA = 320x240 pixels).
The illumination source 114 may be any source of illumination that improves capturing of images in a dark area. For example, the illumination source 114 may include one or more Infra Red LEDs that emit Infra Red light over an area within a field of view of the camera 113 to illuminate objects within the area. The processor 111 may control the illumination source 114 to emit light when the illumination sensor detects a level of light that is below a threshold level.
The motion sensor 115 may be a Passive Infra Red (PIR) motion sensor, a microwave motion sensor, or any type of sensor that detects motion in an area corresponding to a field of view of the camera 113. The processor 111 may monitor output of the motion sensor 115 and trigger the camera 113 to capture images in response to the motion sensor 115 detecting motion in the area corresponding to the field of view of the camera 113.
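A hypothetical trigger path combining the motion sensor, illumination sensor, and illumination source might look like this; the object interfaces are stubs invented for the sketch:

    DARK_THRESHOLD = 50  # assumed illumination level below which IR light is needed

    class _Stub:  # minimal stand-ins so the sketch runs
        def read(self): return 10
        def on(self): print("IR LEDs on")
        def off(self): print("IR LEDs off")
        def capture(self): return b"image-bytes"

    def on_motion_detected(camera, light_sensor, ir_leds):
        # When motion is sensed, light the scene if it is dark, then capture.
        if light_sensor.read() < DARK_THRESHOLD:
            ir_leds.on()
        image = camera.capture()
        ir_leds.off()
        return image

    on_motion_detected(_Stub(), _Stub(), _Stub())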
The battery 117 is the power source of the image sensing device 110 and may be any type of battery capable of delivering power to the image sensing device 110. The battery 117 may have a relatively small size and may be a standard type of battery available for purchase at retail stores. The battery 117 may be located in a compartment that is easily accessible to a user of the image sensing device 110 to facilitate changing of the battery 117, which may occur relatively frequently (e.g., every couple of months) depending on the power consumption and image capture settings of the image sensing device 110.
The input/output port 118 is a communication interface through which the image sensing device may send and receive wireless communications. The input/output port 118 may, using a short range wireless protocol (e.g., BluetoothTM, Z-WaveTM, ZigBeeTM, local wireless 900 MHz communication band, etc.), receive and send short range wireless communications with other devices. The input/output port 118 may include a
6 "normally open" or "normally closed" digital input that can trigger capture of images using the camera 113.
To reduce processing power needed and to conserve battery life, the processor 111 may control components of the image sensing device 110 to periodically enter sleep mode operation. For example, the processor 111 may awaken every second to determine whether any communications have been received at the input/output port 118. If no communications have been received, the processor 111 may place itself and other components (e.g., the memory 112, the camera 113, etc.) in a sleep mode for another second before awaking again to determine whether any communications have been received at the input/output port 118. The processor 111 also may awaken from a sleep mode state based on output from the motion sensor 115 indicating that motion has been detected and/or based on output from an "inertial sensor" that detects impacts to the image sensing device 110.
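The duty cycle described above could be sketched as follows; time.sleep stands in for the hardware low-power state, and the callback names are assumptions:

    import time

    WAKE_INTERVAL_SECONDS = 1.0

    def duty_cycle(poll_io, motion_pending, impact_pending, handle):
        # Wake once per second, service any pending communication or sensor
        # interrupt, then return to sleep.
        while True:
            message = poll_io()
            if message is not None:
                handle(message)
            elif motion_pending() or impact_pending():
                handle("wake: motion or impact detected")
            time.sleep(WAKE_INTERVAL_SECONDS)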
FIG. 2 illustrates an example of an electronic system 200 configured to provide fall detection and reporting. The system 200 includes the image sensing device 110, a gateway 120, one or more remote monitoring servers 130, one or more user devices 140, and a central monitoring station 150. The image sensing device 110 is a relatively small and affordable unit that captures still images of an area that corresponds to a location of the image sensing device. Because the image sensing device 110 is relatively small, runs off of battery power, and communicates via a wireless communication protocol, the image sensing device 110 may be easily placed at any location within a monitored property (or just outside of a monitored property) to provide image surveillance of an area of the monitored property (or an area just outside of the monitored property).
The gateway 120 is a communication device configured to exchange short range wireless communications with the image sensing device 110 and long range wireless or wired communications with the remote monitoring server 130 over the network 135.
Because the gateway 120 exchanges short range wireless communications with the image sensing device 110, the gateway 120 is positioned nearby the image sensing device 110. As shown in FIG. 2, the gateway 120 and the image sensing device 110 are both located within a monitored property that is remote from (and may be very far away from) the remote monitoring server 130.
In some examples, the gateway 120 may include a wireless communication device configured to exchange long range communications over a wireless data channel.
In this example, the gateway 120 may transmit header data and image data over a wireless data channel. The gateway 120 may include one or more of a GSM module, a radio modem, a cellular transmission module, or any type of module configured to exchange communications in one of the following formats: GSM or GPRS, CDMA, EDGE or EGPRS, EV-DO or EVDO, or UMTS.
The gateway 120 includes a buffer memory 122 that stores image data captured by the image sensing device 110. The buffer memory 122 may temporarily store image data captured by the image sensing device 110 to delay a decision of whether the image data (or a subset of the image data) is worthwhile to send to the remote monitoring server 130. The buffer memory 122 may be larger than the memory 112 of the image sensing device 110 and,
because the gateway 120 operates using an AC power source, using the buffer memory 122 to store images captured by the image sensing device 110 may be more efficient. The gateway 120 also may include a display with which the stored images may be displayed to a user.
The long range wireless network 135 enables wireless communication between the gateway 120 and the remote monitoring server 130. The long range wireless network 135 may be any type of cellular network and may support any one or more of the following
protocols: GSM or GPRS, CDMA, EDGE or EGPRS, EV-DO or EVDO, or UMTS. It may be relatively expensive to transmit data over the long range wireless network 135 and, therefore, the image sensing device 110 and the gateway 120 may be selective in the image data transmitted to the remote monitoring server 130.
The remote monitoring server 130 receives image data from the gateway 120 over the long range wireless or wired network 135. The remote monitoring server 130 stores the received image data and makes the image data available to one or more user devices 140 and/or the central monitoring station 150 over the IP-based network 145. For instance, the remote monitoring server 130 may make the image data available to the one or more user devices 140 and/or the central monitoring station 150 at a web site accessible by the one or more user devices 140 and/or the central monitoring station 150 over the Internet. The remote monitoring server 130 also may make the image data available to the one or more user devices 140 and/or the central monitoring station 150 in an electronic message, such as an electronic mail message.
In some implementations, the remote monitoring server 130 receives the image data from the gateway 120 as a reference image and a series of differential images that indicate the difference between the corresponding image and the reference image. In these implementations, header information sent with the image data indicates which images are reference images, which images are differential images, and which reference image each differential image corresponds to. The remote monitoring server 130 processes the reference image and the differential images and converts each image into a standard image format, such as JPEG. The remote monitoring server 130 then stores the converted images in a
database or a file system and makes the converted images available to the one or more user devices 140 and/or the central monitoring station 150.
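For illustration, reconstructing frames from a reference image plus differentials and storing them in a standard format could be done as below; the sketch assumes the differential was formed by saturating subtraction from the reference, which the disclosure does not specify:

    import cv2
    import numpy as np

    def reconstruct_and_store(reference, differentials, out_prefix="frame"):
        # Rebuild each frame by adding its differential back onto the
        # reference image, then write it out in a standard format (JPEG).
        cv2.imwrite(out_prefix + "_ref.jpg", reference)
        for i, diff in enumerate(differentials):
            frame = cv2.add(reference, diff)   # saturating add undoes the encoding
            cv2.imwrite("%s_%04d.jpg" % (out_prefix, i), frame)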
The central monitoring station 150 includes an electronic device (e.g., a server) configured to provide alarm monitoring service by exchanging communications with the remote monitoring server 130 over the network 145. For example, the central monitoring station 150 may be configured to monitor alarm events generated by a monitoring or alarm system that monitors the home or facility where the image sensing device 110 is located. In this example, the central monitoring station 150 may exchange communications with the remote monitoring server 130 to receive information regarding alarm events detected by the monitoring or alarm system. The central monitoring station 150 also may receive information regarding alarm events from the one or more user devices 140. The central monitoring station 150 may receive images captured by the image sensing device 110 to enable verification of potential fall events.
The central monitoring station 150 may be connected to multiple terminals. The terminals may be used by operators to process alarm events. For example, the central monitoring station 150 may route alarm data to the terminals to enable an operator to process the alarm data. The terminals may include general-purpose computers (e.g., desktop personal computers, workstations, or laptop computers) that are configured to receive alarm data from a server in the central monitoring station 150 and render a display of information based on the alarm data. For example, the central monitoring station 150 may receive alarm data and route the alarm data to a terminal for processing by an operator associated with the terminal.
The terminal may render a display to the operator that includes information associated with the alarm event (e.g., the name of the user of the alarm system, the address of the building the alarm system is monitoring, the type of alarm event, images of fall events taken by the image sensing device 110, etc.) and the operator may handle the alarm event based on the displayed information.
The one or more user devices 140 include devices that host user interfaces.
For instance, the user devices 140 may include a mobile device that hosts one or more native applications (e.g., a fall detection and reporting application). The user devices 140 may include a cellular phone or a non-cellular locally networked device with a display. The user devices 140 may include a smart phone, a tablet PC, a personal digital assistant ("PDA"), or any other portable device configured to communicate over a network and display information. For example, implementations may also include BlackberryTM-type devices (e.g., as provided by Research in Motion), electronic organizers, iPhoneTM-type devices (e.g., as provided by Apple), iPodTM devices (e.g., as provided by Apple) or other portable music players, other communication devices, and handheld or portable electronic devices for gaming, communications, and/or data organization. The user devices 140 may perform functions unrelated to the monitoring system, such as placing personal telephone calls, playing music, playing video, displaying pictures, browsing the Internet, maintaining an electronic calendar, etc.
The user devices 140 may include a native fall detection and reporting application. The native fall detection and reporting application refers to a software/firmware program running on the corresponding mobile device that enables the user interface and features described throughout. The user devices 140 may load or install the native fall detection and reporting application based on data received over a network or data received from local media. The native fall detection and reporting application runs on mobile device platforms, such as iPhoneTM, iPod touchTM, BlackberryTM, GoogleTM AndroidTM, WindowsTM MobileTM, etc. The native fall detection and reporting application enables the user devices 140 to receive and process image and sensor data from the monitoring system.
The user devices 140 also may include a general-purpose computer (e.g., a desktop personal computer, a workstation, or a laptop computer) that is configured to communicate with the remote monitoring server 130 over the network 145. The user devices 140 may be configured to display a fall detection and reporting user interface that is generated by the user devices 140 or generated by the remote monitoring server 130. For example, the user devices 140 may be configured to display a user interface (e.g., a web page) provided by the remote monitoring server 130 that enables a user to perceive images captured by the image sensing device 110 and/or reports related to the monitoring system.
The system 200 further includes one or more trigger sources 128. The trigger sources 128 may include devices that assist in detecting fall events. For example, the trigger sources 128 may include contact or pressure sensors that are positioned at a lower part of a building (e.g., at or near the floor). In this example, when a person falls, the person may touch one of the trigger sources 128 to alert the system 200 to the fall. In this regard, the system 200 may use output of the trigger sources 128 to identify a possible fall location and begin capturing and processing images near that location to determine whether the trigger relates to an actual fall event or a false alarm, such as inadvertent contact with a trigger source.
In some examples, the system 200 may include inertial sensors (e.g., accelerometers) to detect an impact potentially generated from a fall. In these examples, when a person falls, the inertial sensors may detect an impact and the system 200 may use the detected impact to infer a potential fall. In this regard, the system 200 may use output of the inertial sensors to identify a possible fall location and begin capturing and processing images near that location to determine whether the detected impact relates to an actual fall event or a false alarm, such as dropping of an object that resulted in the detected impact.
In some implementations, the image sensing device 110 and the gateway 120 may be part of a home or facility monitoring system (e.g., a home security system).
In these implementations, the home or facility monitoring system may sense many types of events or activities associated with the home or facility and the sensed events or activities may be leveraged in performing fall detection and reporting features. The home or facility monitoring system may include a controller that communicates with the gateway 120. The controller may be configured to control the home or facility monitoring system (e.g., a home alarm or security system). In some examples, the controller may include a processor or other control circuitry configured to execute instructions of a program that controls operation of an alarm system. In these examples, the controller may be configured to receive input from sensors, detectors, or other devices included in the home or facility monitoring system and control operations of devices included in the home or facility monitoring system or other household devices (e.g., a thermostat, an appliance, lights, etc.).
The home or facility monitoring system also includes one or more sensors or detectors. For example, the home or facility monitoring system may include multiple sensors, including a contact sensor, a motion sensor, a glass break sensor, or any other type of sensor included in an alarm system or security system. The sensors also may include an environmental sensor, such as a temperature sensor, a water sensor, a rain sensor, a wind sensor, a light sensor, a smoke detector, a carbon monoxide detector, an air quality sensor, etc. The sensors further may include a health monitoring sensor, such as a prescription bottle sensor that monitors taking of prescriptions, a blood pressure sensor, a blood sugar sensor, a bed mat configured to sense presence of liquid (e.g., bodily fluids) on the bed mat, bathroom usage sensors, food consumption sensors, etc. In some examples, the sensors may include a radio-frequency identification (RFID) sensor that identifies a particular article that includes a pre-assigned RFID tag.
The system 200 shown in FIG. 2 may be used for the two example processes 300 and 400 of fall detection and reporting described with respect to FIGS. 3 and 4.
The example processes 300 and 400 are independent; however, they can be staged so that first level fall detection triggers further (e.g., second level) analysis and classification of potential fall events. Both processes 300 and 400 have multiple steps, although a subset of steps may be employed to still meet practical requirements of fall detection and reporting.
FIG. 3 illustrates an example process 300 for fall detection and reporting.
The operations of the example process 300 are described generally as being performed by the system 200. The operations of the example process 300 may be performed by one of the components of the system 200 (e.g., the image sensing device 110, the gateway 120, the remote monitoring server 130, etc.) or may be performed by any combination of the components of the system 200. In some implementations, operations of the example process 300 may be performed by one or more processors included in one or more electronic devices.
In general, the process 300 enables fall detection and reporting based on room occupancy analysis. The system 200 detects room occupancy (310). For example, movement events may be detected by the image sensing device or other external sensors (e.g., perceived motion by passive infrared motion sensor of the image sensing devices, door openings and closings detected by door/window contact sensors of a home security system).
In this example, the movement events signal possible human entrance into a room where the image sensing device is located and are used to detect room occupancy. The system 200 may capture camera image(s) and analyze the camera image(s) to verify that the room is occupied.
After detecting room occupancy, the system 200 detects a lack of room vacation (320). For example, the system 200 monitors output of the image sensing device or other external sensors for movement events in the occupied room and other rooms in the property.
In this example, the system 200 detects successive movement events based on sensors of the image sensing device or other external sensors (even in other rooms). The successive movement events signal human vacation of the room and the system 200 analyzes the successive movement events to determine whether the room has been vacated. For instance, the system 200 may determine that the room has been vacated when no successive movement events are detected in the room and successive movement events are detected in other rooms of the property. The system 200 may determine that the room has not been vacated when successive movement events are detected in the room and/or no successive movement events are detected in other rooms of the property. Based on a determination that the room has been vacated, the system 200 ceases further analysis and does not perform fall detection processing for the room until the room is detected as being occupied again.
Based on a determination that the room remains occupied, the system 200 captures one or more images for analysis and/or reporting (330). For instance, if sensors indicate that the room remains occupied, but further movement has ceased over a prescribed and configurable interval of time, the system 200 initiates image capture for reporting, further assessment, and/or validation of the possible fall event.
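For illustration, the occupancy logic of process 300 can be reduced to a small state machine. The following Python sketch is an editorial example rather than the patented implementation; the class name and the stillness interval are assumptions.

    import time

    STILLNESS_INTERVAL_S = 120  # configurable interval of no movement (assumed value)

    class RoomOccupancyMonitor:
        # Tracks one room for process 300: occupancy (310), vacation (320), capture (330).

        def __init__(self):
            self.occupied = False
            self.last_movement = None

        def on_sensor_events(self, movement_in_room, movement_elsewhere):
            now = time.time()
            if movement_in_room:
                self.occupied = True           # step 310: room occupancy detected
                self.last_movement = now
            elif self.occupied and movement_elsewhere:
                self.occupied = False          # room vacated: cease fall processing

        def should_capture_images(self):
            # Step 330: still occupied, but no movement for the configured interval.
            return (self.occupied and self.last_movement is not None
                    and time.time() - self.last_movement > STILLNESS_INTERVAL_S)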
FIG. 4 illustrates another example of an electronic system 400 configured to provide fall detection and reporting. The system 400 includes one or more passive sensors 410, one or more assistance devices 420, one or more imaging sensors 430, one or more user interface devices 440, a gateway device 450, one or more remote servers 460, and a monitoring center 470. The one or more user interface devices 440, the gateway device 450, the one or more remote servers 460, and the monitoring center 470 may exchange communications over a communication network 480.
Passive sensors 410 may be employed to measure activity or inactivity within a monitored residence. The activity or inactivity can be associated with a fall (e.g., impact, period of inactivity, location, time, etc.) or it can measure aspects of behavior related to fall risk (e.g., general activity level, sleeping, eating, bathroom use, medication use, gait speed, etc.). The behavior profiling can help to promote fall risk reduction via automated assistance devices 420 or through behavior change suggestions via user interface device(s) 440.
Assistance devices 420 are capable of performing automated tasks based on inputs from sensors 410, a gateway device 450, user interface device(s) 440, or remote servers 460.
Assistance devices 420 can be programmed to respond based on rules specified by users, by caregivers, or by default. For example, a light can be illuminated in response to a bed sensor being vacated during the evening. Assistance devices 420 can also report their state to other devices, systems, or stakeholders.
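As a hedged illustration of such a rule, the snippet below turns on a light when a bed sensor is vacated during assumed evening hours; the device interface and the hour window are hypothetical.

    from datetime import datetime

    def on_bed_sensor_vacated(light):
        hour = datetime.now().hour
        if hour >= 21 or hour < 6:    # assumed definition of "evening"
            light.turn_on()           # hypothetical assistance-device call
            light.report_state()      # devices can report state to other stakeholders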
Imaging sensors 430 (e.g., still frame or video) are capable of detecting possible falls.
Furthermore, imaging sensors 430 can forward images of possible falls to remote servers 460, caregivers, or monitoring centers 470 for automated or human verification. Imaging sensors 430 may also have other modes of sensing (e.g., motion, acceleration, etc.) to trigger or augment native imaging and sensing capabilities. For example, impact sensed by the image sensor 430 could be used to trigger image capture. Captured images, sensed data, or other information (e.g., location, time, etc.) may be communicated to other devices, systems, or stakeholders.
In some implementations, the image sensing device 110 described above with respect to FIG. 1 may be used as the imaging sensors 430. The image sensing device 110 may be installed within a monitored home or facility. The device 110 combines multi-modal sensing (e.g., passive infrared motion sensor, biaxial inertial sensor, illumination sensor), an infrared illumination source, camera, processor, memory, battery, input/output, and radio (e.g., via input/output) capabilities. The device 110 detects events indicative of potential falls proximal to its installation location. A plurality of devices 110 may be installed throughout a home or facility, and used in conjunction with other sensors, to increase the fall detection coverage area and provide specific location information for fall reporting and response.
A user interface device 440 may be used to communicate information to or gather information from a user about activity related to fall prevention, fall detection, or daily living. Possible physical incarnations of user interface devices 440 may include light or audio sources, displays, push buttons, or mobile devices (e.g., mobile phones or mobile phone applications). A user interface device 440 may also act as a sensing device and relay data to a gateway device 450 or directly to remote servers 460 through the communication network 480. FIG. 5 illustrates an example of a user interface and sensing device.
Specifically, FIG. 5 illustrates an on-body sensor 510. The on-body sensor 510 may be a fall and movement sensor with an emergency button. The on-body sensor 510 is intended to be worn and easily attached to many articles of clothing on the trunk (e.g., belt, lapel, brassiere, lanyard, etc.).
FIG. 6 illustrates a device 600 that represents an example of the on-body sensor 510.
In order to facilitate wearability, the device 600 embodies a clip form factor. The clip is fastened closed through tension when no force is applied, but can be opened upon demand (e.g., similar to a clothes pin), thereby ensuring that it remains connected to an article of clothing.
When no force is applied to the clip, both sides of the device 600 are in contact with one another. The device 600 includes compliance contacts (e.g., an electrical switch) comprising a conductive contact on each side of the clip. When the clip is forced open or clipped around a piece of fabric, the switch is opened. Otherwise, the switch is closed and the circuit loop completed. Using the compliance contacts, the system 400 can identify whether the sensor is being worn. This information can be used to identify false falls created from dropping or otherwise handling the device 600 when not worn. The user can also be reminded via audible or visual interfaces based on the system 400 detecting that the device 600 is not being worn as a result of the output of the compliance contacts. In determining whether to provide the reminder, the system 400 may consider other sensors within the monitored premise. For instance, the system 400 may detect motion within the monitored premise based on output of one or more motion sensors, determine that the device 600 is not being worn based on output from the compliance contacts, and provide a reminder to wear the device 600 based on the determination that the device 600 is not being worn at a time when motion has been detected in the monitored premise.
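A minimal sketch of that reminder logic follows, assuming the switch semantics described above (contacts closed means the empty clip is shut, i.e., the device is not worn); function names are illustrative.

    def maybe_remind_to_wear(motion_detected: bool, contacts_closed: bool, remind) -> None:
        worn = not contacts_closed      # open switch: clip holds fabric or is held open
        if motion_detected and not worn:
            remind()                    # e.g., audible or visual reminder via a user interface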
Referring again to FIG. 5, the on-body sensor 510 comprises multi-modal sensing (e.g., triaxial inertial sensor, angular rate sensor, magnetometer, barometric pressure sensor, etc.), input/output, radio (e.g., via input/output), a processor, memory, battery, and user interface capabilities for human interaction (e.g., a button, LED/LCD, buzzer, etc.). The on-body sensor 510 may be used to measure gross human motion and activity, detect specific events or behaviors (e.g., falls, walking, running, sleeping, etc.), communicate to the user (e.g., reminders, notifications, etc.), or capture user input (e.g., panic button press, verification of event, etc.). Detecting falls with on-body sensing is described in further detail below.
Referring again to FIG. 4, a gateway device 450 can be used to relay information between remote servers 460 (e.g., over a public or private communication network) and systems at the user location. The gateway device 450 can also allow systems within a user's location to communicate without involvement from remote servers 460. Certain incarnations of the system 400 may not include a gateway device 450; in such cases, passive sensors 410, assistance devices 420, imaging sensors 430, and/or user interface devices 440 may be connected directly to the communication network 480.
Remote servers 460 may be employed to store, process, and initiate actions based upon fall, fall-related, or other data collected about each monitored user and location.
A monitoring center 470 may employ automated or human agents to observe users' fall-related events and contact users or caregivers based on defined protocols, quantitative or qualitative assessments. Monitoring center agents can also annotate user records stored on the remote server 460.
FIG. 7 illustrates an example process 700 for fall management. The operations of the example process 700 are described generally as being performed by the system 400. The operations of the example process 700 may be performed by one of the components of the system 400 or may be performed by any combination of the components of the system 400.
The operations of the example process 700 also may be performed by one of the components of the system 200 or may be performed by any combination of the components of the system 200. In some implementations, operations of the example process 700 may be performed by one or more processors included in one or more electronic devices.
The fall management process 700 includes data capture (710), fall detection (720), fall verification (730), fall risk assessment (740), fall risk reduction (750), and reporting (760). Although several steps are illustrated as part of the fall management process 700, some fall management implementations may only employ a subset of these steps.
The system 400 performs data capture (710). Data can be captured from one or more passive sensors, imaging sensors, assistance devices, and user interface devices. Data can be unprocessed sensor readings or sensor-processed readings. Data capture can be triggered by the passive sensors, imaging sensors, user interface devices, remote servers, or monitoring center. Data capture can consist of instantaneous or continuously sampled readings. Data can be forwarded directly from devices to remote servers or to remote servers via a gateway device. Remote servers, a gateway device, or sensors may coordinate the capture of data or buffer data to facilitate on-sensor, on-gateway, or remote processing. In addition to raw sensor readings, meta-data encompassing sensor location, timestamp, etc. can be forwarded to other devices, sensors, gateways, or remote servers.
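For illustration, a captured reading might carry the meta-data mentioned above before being forwarded; this record layout is an assumption, not a format defined by the disclosure.

    from dataclasses import dataclass
    import time

    @dataclass
    class SensorReading:
        sensor_id: str        # which sensor produced the reading
        location: str         # sensor location meta-data, e.g., "hallway"
        timestamp: float      # capture time
        values: tuple         # raw or sensor-processed readings

    reading = SensorReading("onbody-01", "hallway", time.time(), (0.1, 9.8, 0.3))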
The system 400 performs fall detection (720). Falls can be detected independently by passive sensors, imaging sensors, or user interface devices (e.g., an on-body sensor). Each device can classify a possible fall and communicate fall events or quantitative metrics related to the possibility of a fall (e.g., a fall classification score). For example, an on-body sensor can capture human motion and detect motion characteristics indicative of a fall (described in more detail below). Furthermore, an image sensor can detect the likelihood of a fall through analysis of images and other in-device sensors (described in more detail below).
Fall detection may also be accomplished through the use of multiple sensors in parallel (e.g., hierarchical) or sequentially to improve sensitivity and specificity of fall detection. Numerous examples of combined sequential and parallel fall detection may be used, and data from any combination of the sensors described throughout this disclosure may be fused and considered in combination to detect a potential fall event. For example, the system 400 may detect entry into a room based on output from a motion sensor and/or a door sensor. In this example, the system 400 detects that the room has not been exited after a threshold period of time has passed since the room entry was detected and detects sensor inactivity across all sensors after the room entry was detected. Based on the detections made and consideration of output of all of the sensors within the system 400, the system 400 determines that a potential fall event may have occurred in the room and, in response to the determination that a potential fall event may have occurred in the room, initiates further processing to verify whether a potential fall event has occurred in the room.
In another example, the system 400 detects a potential fall event based on output from an on-body sensor. In this example, the system 400 controls an imaging sensor to capture one or more images in a room where the potential fall event is expected to have occurred, performs analysis of the captured images, and detects possible presence of a prone individual on the ground in the room. The system 400 also detects sensor inactivity across all sensors after detecting the potential fall event based on output from the on-body sensor. Based on the detections made and consideration of output of all of the sensors within the system 400, the system 400 determines that a potential fall event may have occurred in the room and, in response to the determination that a potential fall event may have occurred in the room, initiates further processing to verify whether a potential fall event has occurred in the room.
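The sequential room-based example above can be sketched as a simple predicate; the threshold values and parameter names here are assumptions for illustration.

    def potential_fall_in_room(entry_time, exit_seen, last_activity_time, now,
                               exit_threshold_s=600, inactivity_threshold_s=300):
        # Room entered, not exited within the threshold, and all sensors quiet.
        no_exit = (not exit_seen) and (now - entry_time > exit_threshold_s)
        all_quiet = (now - last_activity_time) > inactivity_threshold_s
        return no_exit and all_quiet    # True: initiate further verification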
Independent fall detection processes on single devices or groups of devices also may be weighted (e.g., based on confidence or accuracy of fall detection efficacy). Such weighting may be used to compute an aggregate score indicative of the confidence of a possible fall. Weights may be assigned based on currently observed data and conditions, historic data from the monitored individual, or population data. Fall detection sensitivity may be configured by the user based on manipulation of weights associated with any of the aforementioned steps. For example, fall sensitivity could be set by adjusting the interval of sensed inactivity or the threshold for decreased activity. The system 400 may consider output from any of the sensors in the system 400 in computing the aggregate score. The system 400 may use the aggregate score to detect a potential fall event by comparing the aggregate score to a threshold. For instance, the system 400 detects a potential fall event based on the comparison of the aggregate score to the threshold revealing that the aggregate score meets the threshold and determines that a potential fall event has not occurred based on the comparison of the aggregate score to the threshold revealing that the aggregate score does not meet the threshold. By considering weighted output from many different sensors and fall detection processes in computing the aggregate score, the system 400 may provide more accurate fall detection with a lower false positive rate because detection of a fall only occurs when several sensors sense potential fall criteria or a single sensor detects a very high likelihood of a potential fall.
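A minimal sketch of the weighted aggregate score, assuming per-process confidences in [0, 1] and weights tuned from individual or population data (the threshold of 1.0 is illustrative):

    def aggregate_fall_score(detections):
        # detections: iterable of (confidence, weight) pairs from independent
        # fall detection processes on single devices or device groups
        return sum(confidence * weight for confidence, weight in detections)

    def is_potential_fall(detections, threshold=1.0):
        return aggregate_fall_score(detections) >= threshold

Under this scheme, one high-confidence, heavily weighted sensor can meet the threshold alone, while several weaker signals must agree to do so, matching the false-positive behavior described above.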
The system 400 performs fall verification (730). If a likely fall is detected, the detecting device, gateway, remote server, or monitoring center can initiate fall verification.
The process can include an automated or human-prompted user response. For example, a user may be alerted (e.g., by audible tone, vibration, human operator, automated operator, or visual indicator) to verify their need for help (e.g., a button press or vocal response) or may be alerted to respond within a period of time to cancel a potential fall event. A human operator may also speak and listen to a user over a two-way communication link.
Fall verification also may be made by human inspection of captured images. For example, following the detection of a potential fall event, an image or successive images captured proximal to the fall may be sent to the monitoring center for human verification.
Image capture also may be triggered post fall (e.g., by a monitoring center or by other caregivers) to verify a fall event. Other contextual sensor or meta-data may be forwarded to human responders to assist in the verification of fall.
Fall verification procedures may be staged sequentially or paired with fall detection mechanisms to create a hierarchical fall escalation process. For example, less accurate fall detection methods may trigger less invasive user verification (e.g., a prompted user button press). If no user response is given within a threshold period of time, then more accurate fall detection methods may be employed alongside more invasive fall verification (e.g., two-way communications with a monitoring center).
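One way to express that escalation ladder in code, with stage contents and timeouts as assumptions rather than a protocol defined by the disclosure:

    ESCALATION_STAGES = [
        {"verify": "prompted_button_press", "timeout_s": 30},
        {"verify": "audible_prompt_vocal_response", "timeout_s": 30},
        {"verify": "two_way_call_with_monitoring_center", "timeout_s": None},
    ]

    def escalate(run_stage):
        # run_stage(stage) returns True if the user responded/cancelled at that stage.
        for stage in ESCALATION_STAGES:
            if run_stage(stage):
                return "cancelled"
        return "fall_signal"    # no response at any stage: report the fall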
The system 400 performs fall risk assessment (740). Assessment of fall risk may be made on the basis of data captured by sensors, user interface devices, or historic and stored data. Measures such as gait speed and balance can be directly assessed via passive and user interface devices. For example, two motion sensors placed in a hallway can measure gait speed, and balance can be assessed via an on-body user interface device (e.g., via on-board inertial sensor and angular rate sensor). Other behavioral data such as medication adherence, sleep patterns, or kitchen or restroom use can be used to augment mobility metrics. Data can be combined with prior knowledge of fall incidents or previously verified fall events. In addition, users may be prompted to submit responses to questions or requests for information (e.g., via a user interface device or website, electronic medical records, residence layout, etc.) to form an aggregate fall risk assessment score. Scores can be computed, compared, or modified against individual or population scores and histories. Scores can also be computed for various timescales and locations. Fall risk assessment may also take into consideration trending of scores for an individual.
The system 400 performs fall risk reduction (750). Various assistive approaches may be employed with or without prior fall risk assessment scoring to help reduce fall risk.
Assistance devices such as automated lighting or medication dispensers can be used to reduce environmental hazards or behaviors related to increased fall risk, respectively.
Assistance devices may be triggered by fall assessment scores, other sensors, user interface devices, or remote servers. For example, automated lighting can be turned on when a user gets out of bed.
Furthermore, notifications or educational material can be delivered (e.g., by default, for certain fall risk assessment scores, for certain events, etc.) to the user (e.g., via a user interface device or other output device) to help the user better understand and correct fall risk factors. Tips or behavior change techniques can help the user set up a safer environment or promote behaviors associated with decreased fall risk. Notifications may be combined with sensing or other user interface prompts (e.g., prompts to answer questionnaires) to assess adherence to fall risk reduction techniques in real-time or across a period of time. Users may be scored on their ability to reduce fall risk at various timescales or in various locations. Fall risk reduction scores may be compared to individual or population historic data.
The system 400 performs reporting (760). Fall risk, detection, and prevention data, scores, annotations, or observations can be stored at the remote server. Data can be compiled and reported to users, caregivers, monitoring centers, or other trusted parties. Data (including timestamps, scores, locations, confidence, etc.) can be used for the purposes of response to events, for preventative fall risk reduction strategies, or by professional caregivers for general health assessment. Data or scores can be compared to individual or population data and reported to all aforementioned parties when appropriate.
Data reporting may be combined with prompts for data entry. For example, a user could receive a notification that bathroom habits are abnormal and be asked whether they are feeling well.
Access to reported data can be restricted based on preferences of the user or caregivers.
Notifications, reminders, user prompts, questionnaires, monitored responses, and other user interface modes can be configured by rules with associated parameters. Rules can be stored and executed at the remote server, gateway device, sensors, or user interface devices.
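An illustrative rule record of the kind described, echoing the bathroom-habits example above; the schema and field names are assumptions.

    BATHROOM_HABIT_RULE = {
        "trigger": "bathroom_use_anomaly",        # sensed condition
        "action": "notify_and_prompt",            # notification plus questionnaire
        "params": {
            "message": "Bathroom habits look unusual. Are you feeling well?",
            "response_timeout_s": 3600,
            "escalate_to": "caregiver",
        },
        "execute_at": "gateway",                  # could also be server or device
    }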
FIG. 8 illustrates an example process 800 for fall detection using an on-body user interface device. The operations of the example process 800 are described generally as being performed by the system 400. The operations of the example process 800 may be performed by one of the components of the system 400 or may be performed by any combination of the components of the system 400. The operations of the example process 800 also may be performed by one of the components of the system 200 or may be performed by any combination of the components of the system 200. In some implementations, operations of the example process 800 may be performed by one or more processors included in one or more electronic devices.
In order to accurately detect a fall event, the on-body user interface device identifies the various characteristics of a fall comprised of the user starting from a standing or sitting position, falling through to the ground, impacting a surface, and remaining inactive after the fall. The user's trunk may transition from a vertical to horizontal position.
This may result in a ninety degree change in trunk orientation, but since the user may not be standing straight before the fall, or may not be prone or supine after the fall, the angle may not reach ninety degrees. FIG. 8 illustrates a fall detection process 800 for users wearing an on-body sensor with continuous sensing and detection.
The system 400 triggers fall detection processing based on detection of a fall-related signature (810). The fall detection process may be triggered by an impact metric (e.g., measured from inertial sensing) or a similar fall-related signature (e.g., free fall) crossing a minimum threshold. The fall-related signature may be quantified and stratified into defined ranges indicative of fall detection confidence.
For instance, FIG. 9 illustrates example fall detection criteria. The fall detection criteria include a range of impact metrics 910 used to quantify a measured impact metric. As shown, the ranges of impact metrics may include less than two, two to five, five to ten, ten to fifteen, and greater than fifteen. The system 400 may use the impact metric of two as a threshold for triggering fall detection processing.
For instance, the system 400 quantifies a measured impact within the ranges of impact metrics and determines not to trigger fall detection processing based on the measured impact falling within the range of less than two. For any of the other ranges, the system 400 triggers fall detection processing and records the range in which the measured impact falls for later processing.
Referring again to FIG. 8, the system 400 calculates orientation change based on triggering fall detection processing (820). Based on the system 400 detecting that a measured impact or similar metric crosses the previously mentioned minimum threshold, the system 400 calculates an orientation change using inertial or angular rate measures from before and after the detected impact or other event. The orientation value may be quantified and stratified into defined ranges.
For example, the fall detection criteria shown in FIG. 9 include a range of orientation changes 920 used to quantify an orientation change. As shown, the ranges of orientation changes may include less than fifty, fifty to sixty, sixty to seventy-five, seventy-five to eighty-five, and greater than eighty-five. The system 400 may use the orientation change of fifty as a threshold for continuing fall detection processing. For instance, the system 400 quantifies a calculated orientation change within the ranges of orientation changes and determines not to continue fall detection processing based on the calculated orientation change falling within the range of less than fifty. For any of the other ranges, the system 400 continues fall detection processing and records the range in which the calculated orientation change falls for later processing.
Referring again to FIG. 8, the system 400 determines a minimum required inactivity period based on the fall-related signature and the orientation change (830).
Based on the defined ranges derived from impact/signature scoring and orientation scoring, a minimum required inactivity period can be determined by a lookup table or functional relationship. The fall detection criteria shown in FIG. 9 include an inactivity period lookup table 930. The system 400 references the lookup table 930 using the range of the measured impact and the range of the calculated orientation change and sets the minimum required inactivity period as the period of time defined by the appropriate entry in the lookup table 930. For example, with an impact metric greater than ten, but less than fifteen, and an orientation change greater than eighty-five, the inactivity period is set as low as thirty seconds to signal a likely fall.
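The range quantization and lookup of steps 810 through 830 can be sketched as follows. Only the thirty-second entry (impact between ten and fifteen, orientation change greater than eighty-five) is given in the text; the other table values below are placeholders for illustration.

    import bisect

    IMPACT_EDGES = [2, 5, 10, 15]         # ranges: <2, 2-5, 5-10, 10-15, >15
    ORIENTATION_EDGES = [50, 60, 75, 85]  # ranges: <50, 50-60, 60-75, 75-85, >85

    NO_TRIGGER = None  # range 0: do not trigger (impact) or do not continue (orientation)
    INACTIVITY_TABLE = [                  # [impact range][orientation range] -> seconds
        [NO_TRIGGER] * 5,
        [NO_TRIGGER, 300, 240, 180, 120],
        [NO_TRIGGER, 240, 180, 120, 90],
        [NO_TRIGGER, 180, 120, 60, 30],   # 30 s matches the example in the text
        [NO_TRIGGER, 120, 90, 45, 30],
    ]

    def required_inactivity_period(impact_metric, orientation_change_deg):
        i = bisect.bisect(IMPACT_EDGES, impact_metric)             # 0..4
        j = bisect.bisect(ORIENTATION_EDGES, orientation_change_deg)
        return INACTIVITY_TABLE[i][j]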
Referring again to FIG. 8, the system 400 detects a potential fall event based on monitoring activity during the minimum required inactivity period (840). The system 400 may monitor output of the on-body sensor and output from any of the other sensors in the system 400 and determine whether any of the sensors signal activity. The system 400 continues to monitor the sensor output until the set period of inactivity has been reached and the system 400 detects a potential fall event based on determining that the set period of inactivity has passed without detecting sensed activity from any of the sensors in the system 400.
FIG. 10 illustrates an example process 1000 for tuning sensitivity and specificity of fall detection. The operations of the example process 1000 are described generally as being performed by the system 400. The operations of the example process 1000 may be performed by one of the components of the system 400 or may be performed by any combination of the components of the system 400. The operations of the example process 1000 also may be performed by one of the components of the system 200 or may be performed by any combination of the components of the system 200. In some implementations, operations of the example process 1000 may be performed by one or more processors included in one or more electronic devices.
To tune sensitivity and specificity of fall detection (e.g., the on-body fall detection process 800), the process 1000 uses user feedback. The process 1000 may produce more granular fall reporting (e.g., true, false, minor, canceled falls) and may help to reduce and report incidence of false positives or false negatives.
The system 400 detects a potential fall event (1005). A possible fall is detected by the on-body device, by other sensors, or by user interface devices. Any of the techniques described throughout this disclosure may be used to detect a potential fall event.
The system 400 prompts the user for cancellation of the potential fall event (1010). A
user prompt may be initiated (e.g., audible or visual). The user can respond (e.g., by a button press or vocalization) to the user prompt at the device to cancel the detected potential fall event.
The system 400 determines whether the user cancels the potential fall event within a defined period of time (1015). For instance, the system 400 monitors for input cancelling the potential fall event until the defined period of time has been reached and the system 400 determines whether the user cancelled the potential fall event within the defined period of time based on the monitoring. Based on a determination that the potential fall event was not cancelled within the defined period of time, the system 400 generates a fall signal (e.g., a fall signal from the body-worn device).
Based on a determination that the potential fall event was cancelled within the defined period of time, the system 400 makes a measurement of overall activity over the minimum inactivity period previously mentioned (1020). For example, the system measures the activity detected by the on-body sensor after detection of the potential fall event until the input cancelling the potential fall event was received. The system 400 determines whether the measurement of overall activity meets an expected maximum activity (1025). For instance, the system 400 compares the measurement of overall activity to the expected maximum activity and determines whether the measurement of overall activity meets the expected maximum activity based on the comparison.
Based on a determination that the measurement of overall activity meets the expected maximum activity, the system 400 signals a false fall detection (1030). For example, the system 400 classifies the sensor data used to detect the potential fall event as being sensor data associated with a false detection of a potential fall event. In this example, the system 400 may tune the potential fall detection process such that sensor data similar to the sensor data associated with the false detection of the potential fall event does not result in detection of a potential fall event in the future.
Based on a determination that the measurement of overall activity does not meet the expected maximum activity, the system 400 measures posture or orientation (1035) and determines whether the subject recovered from the suspected fall based on the measured posture or orientation (1040). For instance, the system 400 analyzes the measured posture or orientation and determines whether the subject has returned to an upright position.
Based on a determination that the subject recovered from the suspected fall, the system 400 triggers a minor fall (1045). For example, the system 400 classifies the sensor data used to detect the potential fall event as being sensor data associated with a minor fall.
In this example, the system 400 may tune the potential fall detection process such that sensor data similar to the sensor data associated with the minor fall results in detection of a minor fall event in the future. The system 400 may handle minor fall events differently than regular fall events. For instance, the system 400 may wait longer to see if a patient recovers from a minor fall prior to alerting a remote caregiver or monitoring station.
Based on a determination that the subject did not recover from the suspected fall, the system 400 performs another user prompt for cancellation (1050) and determines whether the user cancels the potential fall event within a defined period of time from the additional prompt for cancellation (1055). Based on a determination that the potential fall event was cancelled within the defined period of time, the system 400 signals a cancelled fall (1060).
For instance, the system 400 does not provide an alert for the potential fall event, but does classify the sensor data used to detect the potential fall event as being sensor data associated with a fall that was ultimately cancelled.
Based on a determination that the potential fall event was not cancelled within the defined period of time, the system 400 generates a fall signal (1065). For instance, the system 400 may generate a fall signal from the body-worn device. The fall signal may be sent to a remote caregiver or monitoring station to alert the remote caregiver or monitoring station to provide assistance to the patient who experienced the potential fall event.
Granular fall detection classes such as true fall, false fall, minor fall, and cancelled fall can be used to tune system parameters for each individual user, provide caregivers or trusted individuals with fall data, and provide automated mechanisms for fall verification.
Furthermore, the data can be stored at the remote servers.
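The outcome classes of process 1000 reduce to a small decision function; this condensed sketch uses boolean and numeric inputs standing in for the measurements described above.

    def classify_potential_fall(cancelled_in_time, activity_level, expected_max_activity,
                                recovered_upright, cancelled_after_second_prompt):
        if not cancelled_in_time:
            return "fall_signal"          # 1065: alert caregiver or monitoring station
        if activity_level >= expected_max_activity:
            return "false_fall"           # 1030: use to tune the detector
        if recovered_upright:
            return "minor_fall"           # 1045: may be handled less urgently
        if cancelled_after_second_prompt:
            return "cancelled_fall"       # 1060: log without alerting
        return "fall_signal"              # 1065: second prompt also unanswered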
FIG. 11 illustrates an example process 1100 for fall detection and reporting.
The operations of the example process 1100 are described generally as being performed by the system 400. The operations of the example process 1100 may be performed by one of the components of the system 400 or may be performed by any combination of the components of the system 400. The operations of the example process 1100 also may be performed by one of the components of the system 200 or may be performed by any combination of the components of the system 200. In some implementations, operations of the example process 1100 may be performed by one or more processors included in one or more electronic devices.
In general, the process 1100 enables fall detection and reporting based on human movement analysis. The system 400 performs a triggered or scheduled image capture (1110). For example, the system 400 may trigger a camera on an image sensing device to capture an image based on events detected by one or more of the image sensing device's sensors (e.g., perceived motion by a passive infrared motion sensor, a triaxial inertial sensor). In this example, movement or impact detected proximal to the image sensing device may initiate the capture of an image. Furthermore, the system 400 may trigger the camera by one or more external sensors interfaced via a gateway device. For instance, the press of a panic button or the opening of a door sensor may trigger one or more image sensing devices to capture an image. Finally, image capture may be scheduled (e.g., capture an image every one minute during the hours of six in the morning through ten in the evening). In lower light conditions (e.g., characterized by the illumination sensor), the system 400 may employ infrared illumination to increase image detail and quality.
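The scheduled branch of step 1110 is simple to illustrate; the one-minute cadence and the 06:00 to 22:00 window come from the example above.

    from datetime import datetime

    def scheduled_capture_due(last_capture: datetime, now: datetime) -> bool:
        in_window = 6 <= now.hour < 22                      # six a.m. through ten p.m.
        due = (now - last_capture).total_seconds() >= 60    # one image per minute
        return in_window and due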
After image capture, the system 400 performs image foreground segmentation and filtering (1120). The system 400 (e.g., the image sensing device) may perform image foreground segmentation via background subtraction or other averaging approaches. The system 400 may filter captured images to help reduce foreground noise and isolate large regions of change. The process may identify changed pixels from previous images, including those morphologically likely to represent human forms or shapes.
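As one concrete (but assumed) realization of step 1120, the sketch below uses running-average background subtraction with morphological filtering, here via OpenCV; the disclosure does not mandate any particular library or algorithm.

    import cv2
    import numpy as np

    class ForegroundSegmenter:
        def __init__(self, alpha=0.05, diff_threshold=25):
            self.background = None        # running-average background model
            self.alpha = alpha            # background update rate (assumed)
            self.diff_threshold = diff_threshold

        def apply(self, gray_frame):
            frame = gray_frame.astype(np.float32)
            if self.background is None:
                self.background = frame.copy()
            # Identify changed pixels against the averaged background.
            diff = cv2.absdiff(frame, self.background)
            mask = (diff > self.diff_threshold).astype(np.uint8) * 255
            # Opening removes small foreground noise; closing isolates
            # large regions of change likely to represent human forms.
            kernel = np.ones((5, 5), np.uint8)
            mask = cv2.morphologyEx(mask, cv2.MORPH_OPEN, kernel)
            mask = cv2.morphologyEx(mask, cv2.MORPH_CLOSE, kernel)
            cv2.accumulateWeighted(frame, self.background, self.alpha)
            return mask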
After image foreground segmentation and filtering, the system 400 performs human segmentation (1130). The system 400 segments possible human shapes via template matches, shape fitting, or similar methods. For example, the system 400 may segment a foreground shape falling within an approximate elliptical boundary over a size threshold.
Such segmentation may reduce incidence of false detection and reporting (e.g., small pet activity). To further reduce incidence of false detection and reporting, the system 400 may remove regions of the camera's field of view from analysis. For instance, if a bed were present in the field of view, the bed may be marked as a non-detection region and the system 400 would not analyze that portion of images captured by the image sensing device.
After human segmentation, the system 400 performs human orientation and position estimation (1140). For example, the system 400 calculates orientation (e.g., human shape upright, angled, prone, etc.) and position (e.g., human shape above floor, near floor, etc.) by template or boundary shape proportion and rotation relative to a horizontal image plane.
This estimation enables identification of postures and resting positions indicative of a fall.
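A hedged sketch of step 1140: the proportions and centroid test below are illustrative stand-ins for the template or boundary-shape methods the text describes (a fitted ellipse's rotation could refine the orientation estimate).

    import cv2

    def estimate_orientation_and_position(contour, floor_boundary_y):
        # Orientation proxy: an upright shape is taller than wide; a prone
        # shape is wider than tall. The 1.5 ratio is an assumption.
        x, y, w, h = cv2.boundingRect(contour)
        upright = h > 1.5 * w
        # Position: centroid below the floor-proximal boundary (image y
        # grows downward) suggests a shape at or near the floor.
        m = cv2.moments(contour)
        centroid_y = m["m01"] / m["m00"] if m["m00"] else y + h / 2
        near_floor = centroid_y > floor_boundary_y
        return upright, near_floor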
The floor proximal planar boundary can be specifically defined and moved to fit the unique geometries of different rooms. After human orientation and position estimation, the system 400 performs successive image and/or sensor data comparison (1150). For example, the system 400 stores, either on or off the image sensing device, the orientation and position information calculated previously and compares the prior orientation and position information with successive image orientations and positions. The system 400 repeats this process and isolates changes in position and orientation indicative of a fall (e.g., movement towards the ground), or relative stasis of position and orientation indicative of a fall (e.g., incapacitation after a fall).
Furthermore, the system 400 may combine motion sensor information with, or use it independently of, image-based analysis to ascertain movement through horizontal planes of motion (e.g., a human falling from an upright position).
The system 400 performs inactivity detection (1160). For example, the system detects periods of relative inactivity, such as those following a potential fall, from lack of or decreased motion, inertial measures, image-derived orientation and position information, external sensors, or even a combination thereof. The system 400 may classify longer periods of relative inactivity as being indicative of a fall, and classify shorter periods of relative inactivity as being indicative of a non-fall event or recovery from a fall.
After inactivity detection, the system 400 performs fall classification (1170). The system 400 may combine (e.g., logically or algebraically) the data and information compiled in previous operations of the process 1100 and use the combined data in several ways to classify possible falls. For example, if an impact is detected, orientation and position are indicative of a human in a fallen state, and a period of inactivity has exceeded a defined threshold, then the system 400 classifies the event as a fall. Classification sensitivity may be configured by the user based on manipulation of variables associated with any of the aforementioned steps. For example, fall sensitivity could be set by adjusting the interval of sensed inactivity or the threshold for decreased activity. Not all prior conditions must be met, nor all prior steps completed, for fall classification. The system 400 may report classification confidence based on the quality of inputs or classifier performance.
Furthermore, the system 400 may implement the classifier in a variety of ways such as, but not limited to, an expert system, naive Bayes, a decision tree, a neural network, etc.
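The example combination above amounts to an expert-system-style rule; the threshold is illustrative.

    def classify_fall(impact_detected, fallen_orientation_and_position, inactivity_s,
                      inactivity_threshold_s=60):
        # Logical (AND) combination; weights or partial matches could be
        # substituted, since not all conditions must be met in every case.
        return (impact_detected and fallen_orientation_and_position
                and inactivity_s > inactivity_threshold_s)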
After fall classification, the system 400 performs fall reporting (1180). For example, potential fall events are forwarded to a gateway device, remote monitoring servers, and ultimately to users or central monitoring station(s) if appropriate rules and preferences are met. Images (e.g., past and present), data, and location information can be sent for purposes of reporting and verification. Moreover, potential non-fall events, images, data, and location can be forwarded to users or central monitoring station(s) for verification.
Verification of fall events is not a requisite function of the system, rather an additional feature. Fall detection can be performed with or without image or other human-based verification.
FIG. 12 shows fall detection examples with three possible scenarios that illustrate aspects of the fall detection process 1100 discussed above. In the first scenario (a), a person stands upright in a room. In the second scenario (b), a person has fallen and is prone on the floor. In the third scenario (c), a person has fallen to a slumped position on the floor.
Illustrations (d), (e), and (f) represent the results of foreground separation and filtering of illustrations (a), (b), and (c), respectively. Illustrations (g), (h), and (i) represent the results of human orientation and position estimation and inactivity detection (as denoted by a clock) of the previous illustrations, respectively. Notice in illustration (g) that the human shape estimator, illustrated as an ellipse, but not limited to ellipses, extends beyond a floor proximal planar boundary; whereas in illustrations (h) and (i), the human shape estimators are below the plane and their orientations are not vertical, hence, inactivity detection has commenced.
Analysis of room geometry within captured images may be used to project a virtual plane below which a person should be oriented in a fall event. The system 400 may analyze floor geometry and then perform centroid-based processing to determine where the floor is located in the captured images. After determining the location of the floor, the system 400 projects the virtual plane within the captured images at a particular distance above the floor.
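A minimal sketch of deriving that virtual plane, assuming the floor has already been classified in the image (the pixel offset is an arbitrary illustration):

    import numpy as np

    def virtual_plane_y(floor_mask: np.ndarray, offset_px: int = 40) -> float:
        # floor_mask: boolean image marking pixels classified as floor.
        ys, _ = np.nonzero(floor_mask)
        floor_y = float(ys.mean())      # centroid-based floor estimate
        return floor_y - offset_px      # plane projected above the floor (image y grows downward)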
In some implementations, the image sensing device and optional trigger sources (e.g., other sensors) communicate to a gateway (e.g., home monitoring panel) within a home or facility. The gateway's memory enables buffering of images and data from the image sensor and other sensors. Data is forwarded to remote monitoring servers over a long range wireless network (e.g., cellular link). Rules and preferences set at the remote monitoring server enable potential fall information (e.g., captured images and data) to be forwarded via an IP
network to users or a central monitoring station for fall verification. If a fall is verified by human inspection of captured images and data, a response can be initiated (e.g., a two-way voice call may be established with the gateway device, emergency responders may be dispatched, etc.) and location information from the system can be communicated to those providing assistance.
In some implementations, the system (e.g., the system 200 or the system 400) may evaluate context in determining how to handle a fall detection event. In these implementations, the system may check other activity in the property and determine how to handle the fall detection event based on the other activity. For instance, when the system detects other activity in the property, the system may attempt to alert someone in the property to the potential fall event (e.g., by providing an audible alert in the home that indicates the fall detection event). When the system does not detect other activity in the property, the system may, based on the fall detection event, send electronic messages to a caregiver associated with the property to alert the caregiver to the fall detection event, establish a two-way voice communication session with a monitoring system at the property, and/or dispatch emergency services.
In some examples, the system may tune sensitivity of one or more sensors/contexts used in fall detection and may determine a score as part of fall classification. In these examples, the system may determine the score based on a number of sensors that indicate a potential fall. For instance, the system may determine a relatively high score when the system detects a thud based on an accelerometer sensor, detects multiple motion sensors indicating motion consistent with a fall, and performs image analysis that suggests that a person has moved from a vertical orientation to a horizontal orientation below a plane near the floor. The system may determine a relatively low score when the system only performs image analysis that suggests that a person is horizontally oriented below a plane near the floor. The system may consider the number of motion sensors detecting motion and leverage all sensor data. The system may typically operate using a subset of sensors and move to a process that leverages all sensors when a potential fall is detected by the subset of sensors.
The system may consider historic data (e.g., classification by caregivers of whether a fall detection event was actually a fall or a mistake) and tune fall detection based on the historic data.
In some implementations, the location in the home where the fall occurred may be determined and communicated to an emergency response team. In addition, the location in the home where the fall occurred may be used to pick the other sensors the system reviews in confirming a potential fall event. For instance, when the system determines that the potential fall occurs in the basement, the system determines not to consider sensors in the upstairs bedroom, as the sensors in the upstairs bedroom are unlikely to be relevant to the potential fall event in the basement.
The described systems, methods, and techniques may be implemented in digital electronic circuitry, computer hardware, firmware, software, or in combinations of these elements. Apparatus implementing these techniques may include appropriate input and output devices, a computer processor, and a computer program product tangibly embodied in a machine-readable storage device for execution by a programmable processor. A
process implementing these techniques may be performed by a programmable processor executing a program of instructions to perform desired functions by operating on input data and generating appropriate output. The techniques may be implemented in one or more computer programs that are executable on a programmable system including at least one programmable processor coupled to receive data and instructions from, and to transmit data and instructions to, a data storage system, at least one input device, and at least one output device. Each computer program may be implemented in a high-level procedural or object-oriented programming language, or in assembly or machine language if desired; and in any case, the language may be a compiled or interpreted language. Suitable processors include, by way of example, both general and special purpose microprocessors. Generally, a processor will receive instructions and data from a read-only memory and/or a random access memory.
Storage devices suitable for tangibly embodying computer program instructions and data include all forms of non-volatile memory, including by way of example semiconductor memory devices, such as Erasable Programmable Read-Only Memory (EPROM), Electrically Erasable Programmable Read-Only Memory (EEPROM), and flash memory devices; magnetic disks such as internal hard disks and removable disks;
magneto-optical disks; and Compact Disc Read-Only Memory (CD-ROM). Any of the foregoing may be supplemented by, or incorporated in, specially-designed ASICs (application-specific integrated circuits).
It will be understood that various modifications may be made. For example, other useful implementations could be achieved if steps of the disclosed techniques were performed in a different order and/or if components in the disclosed systems were combined in a different manner and/or replaced or supplemented by other components.
Accordingly, other implementations are within the scope of the disclosure.
The home or facility monitoring system also includes one or more sensors or detectors. For example, the home Or facility monitoring system may include multiple SODOM'S, including a contact sensor, a motion sensor, a glass break sensor, or any other type of sensor included in an alarm system or security system. The sensors also may include an environmental sensor, such as a temperature sensor, a water sensor, a rain sensor, a wind sensor, a light sensor, a smoke detector, a carbon monoxide detector, an air quality sensor, etc. The sensors further may include a health monitoring sensor, such as a prescription bottle sensor that monitors taking of prescriptions, a blood pressure sensor, a blood sugar sensor, a bed mat configured to sense presence of liquid (e.g., bodily fluids) on the bed mat, bathroom usage sensors, food consumption sensors, etc. In some examples, the sensors 120 may include a radio-frequency identification (RFID) sensor that identifies a particular article that includes a pre-assigned RFD tag.
The system 200 shown in FIG. 2 may be used for the two example processes 300 and 400 of fall detection and reporting described with respeet to FIGS. 3 and 4.
The example .. processes 300 and 400 are independent; however, they can be staged so that first level fall detection triggers further (e.g., second level) analysis and classification of potential fall events. Both processes 300 and 400 have multiple steps, although a subset of steps may be employed to still meet practical requirements of fall detection and reporting.
PIG. 3 illustrates an example process 300 for fall detection and reporting.
The operations of the example process 300 are described generally as being performed by the system 200. The operations of the example process 300 may be performed by one of the components of the system 200 (e.g., the image sensing device 110, the gateway 120, the remote monitoring server 130, etc.) or may be performed by any combination of the .
components of the system 200. In some implementations, operations of the example process 300 may be performed by one or more processors included in one or more electronic devices.
In general, the process 300 enables fall detection and reporting based on room occupancy analysis, The system 200 detects room occupancy (314 For example, movement events may be detected by the image sensing device or other external sensors (e.g., perceived motion by passive infrared motion sensor of the image sensing devices, door openings and closings detected by door/window contact sensors of a home security system).
In this example, the movement events signal possible human entrance into a room where the image sensing device is located and are used to detect room occupancy. The system 200 may capture camera image(s) and analyze the camera image(s) to verify that the room is occupied.
After detecting room occupancy, the system 200 detects a lack of room vacation (320). For example, the system 200 monitors output of the image sensing device or other external sensors for movement events in the occupied room and other rooms in the property.
In this example, the system 200 detects successive movement events based on sensors of the image device or other external sensors (even in other moms). The successive movement events signal human vacation of the room and the system 200 analyzes the successive movement events to determine whether the room has been vacated. For instance, the system 200 may determine that the room has been vacated when no successive movement events are detected in the room and successive movement events are detected in other rooms of the property. The system 200 may detennine that the room has not been vacated when successive movement events are detected in the room and/or no successive movement events are detected in other rooms of the property. Based on a determination that the room has been vacated, the system 200 ceases further analysis and does not perform fall detection processing for the room until the room is detected as being occupied again.
Based on a determination that the room remains occupied, the system 200 captures one or more images for analysis and/or reporting (330). For instance, if sensors indicate that the room remains occupied, but further movement has ceased over a prescribed and configurable interval of time, the system 200 initiates image capture for reporting, further assessment, and/or validation of the possible fall event.
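As a purely illustrative sketch of this occupancy-based flow (the event interface, the polling loop, and the 120-second interval below are assumptions for exposition, not part of this disclosure), the logic of process 300 might look like:

```python
import time

# Illustrative sketch of process 300: detect occupancy from movement events,
# watch for room vacation, and trigger image capture when the room stays
# occupied but movement ceases. All names/values here are assumptions.

INACTIVITY_INTERVAL_SECS = 120   # the prescribed, configurable interval

def monitor_room(next_event, capture_images):
    """next_event() -> 'this_room', 'other_room', or None (no movement)."""
    occupied = False
    last_movement_here = None
    while True:
        event = next_event()
        now = time.time()
        if event == 'this_room':
            occupied = True               # movement signals possible entrance
            last_movement_here = now
        elif event == 'other_room' and occupied:
            occupied = False              # movement elsewhere: room vacated
        if occupied and now - last_movement_here > INACTIVITY_INTERVAL_SECS:
            capture_images()              # step 330: assess possible fall
            occupied = False              # resume once re-occupancy detected
        time.sleep(1)
```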
FIG. 4 illustrates another example of an electronic system 400 configured to provide fall detection and reporting. The system 400 includes one or more passive sensors 410, one or more assistance devices 420, one or more imaging sensors 430, one or more user interface devices 440, a gateway device 450, one or more remote servers 460, and a monitoring center 470. The one or more user interface devices 440, the gateway device 450, the one or more remote servers 460, and the monitoring center 470 may exchange communications over a communication network 480.
Passive sensors 410 may be employed to measure activity or inactivity within a monitored residence. The activity or inactivity can be associated with a fall (e.g., impact, period of inactivity, location, time, etc.) or it can measure aspects of behavior related to fall risk (e.g., general activity level, sleeping, eating, bathroom use, medication use, gait speed, etc.). The behavior profiling can help to promote fall risk reduction via automated assistance devices 420 or through behavior change suggestions via user interface device(s) 440.
Assistance devices 420 are capable of performing automated tasks based on inputs from sensors 410, a gateway device 450, user interface device(s) 440, or remote servers 460.
Assistance devices 420 can be programmed to respond based on rules specified by users, by caregivers, or by default. For example, a light can be illuminated in response to a bed sensor being vacated during the evening. Assistance devices 420 can also report their state to other devices, systems, or stakeholders.
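A rule of the kind just described might be expressed as in the following sketch; the Light interface and the evening window are hypothetical stand-ins, not a defined device API:

```python
from datetime import datetime

# Hypothetical assistance-device rule: illuminate a light when the bed
# sensor is vacated during the evening. Hours and interface are assumed.

EVENING_START_HOUR, EVENING_END_HOUR = 21, 6

def on_bed_vacated(light, now=None):
    hour = (now or datetime.now()).hour
    if hour >= EVENING_START_HOUR or hour < EVENING_END_HOUR:
        light.turn_on()

class Light:
    def turn_on(self):
        print("light on")   # stand-in for reporting state to other devices

on_bed_vacated(Light())
```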
Imaging sensors 430 (e.g., still frame or video) are capable of detecting possible falls.
Furthermore, imaging sensors 430 can forward images of possible falls to remote servers 460, caregivers, or monitoring centers 470 for automated or human verification. Imaging sensors 430 may also have other modes of sensing (e.g., motion, acceleration, etc.) to trigger or augment native imaging and sensing capabilities. For example, impact sensed by the image sensor 430 could be used to trigger image capture. Captured images, sensed data, or other information (e.g., location, time, etc.) may be communicated to other devices, systems, or stakeholders.
In some implementations, the image sensing device 110 described above with respect to FIG. 1 may be used as the imaging sensors 430. The image sensing device 110 may be installed within a monitored home or facility. The device 110 combines multi-modal sensing (e.g., passive infrared motion sensor, biaxial inertial sensor, illumination sensor), an infrared illumination source, camera, processor, memory, battery, input/output, and radio (e.g., via input/output) capabilities. The device 110 detects events indicative of potential falls proximal to its installation location. A plurality of devices 110 may be installed throughout a home or facility, and used in conjunction with other sensors, to increase the fall detection coverage area and provide specific location information for fall reporting and response.
A user interface device 440 may be used to communicate information to or gather information from a user about activity related to fall prevention, fall detection, or daily living. Possible physical incarnations of user interface devices 440 may include light or audio sources, displays, push buttons, or mobile devices (e.g., mobile phones or mobile phone applications). A user interface device 440 may also act as a sensing device and relay data to a gateway device 450 or directly to remote servers 460 through the communication network 480. FIG. 5 illustrates an example of a user interface and sensing device.
Specifically, FIG. 5 illustrates an on-body sensor 510. The on-body sensor 510 may be a fall and movement sensor with an emergency button. The on-body sensor 510 is intended to be worn and easily attached to many articles of clothing on the trunk (e.g., belt, lapel, brassiere, lanyard, etc.).
FIG. 6 illustrates a device 600 that represents an example of the on-body sensor 510.
In order to facilitate wearability, the device 600 embodies a clip form factor. The clip is fastened closed through tension when no force is applied, but can be opened upon demand (e.g., similar to a clothes pin), thereby ensuring that it remains connected to an article of clothing.
When no force is applied to the clip, both sides of the device 600 are in contact with one another. The device 600 includes compliance contacts (e.g., an electrical switch) comprising a conductive contact on each side of the clip. When the clip is forced open or clipped around a piece of fabric, the switch is opened. Otherwise, the switch is closed and the circuit loop completed. Using the compliance contacts, the system 400 can identify whether the sensor is being worn. This information can be used to identify false falls created from dropping or otherwise handling the device 600 when not worn. The user can also be reminded via audible or visual interfaces based on the system 400 detecting that the device 600 is not being worn as a result of the output of the compliance contacts. In determining whether to provide the reminder, the system 400 may consider other sensors within the monitored premise. For instance, the system 400 may detect motion within the monitored premise based on output of one or more motion sensors, determine that the device 600 is not being worn based on output from the compliance contacts, and provide a reminder to wear the device 600 based on the determination that the device 600 is not being worn at a time when motion has been detected in the monitored premise.
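A minimal sketch of this worn/not-worn logic, assuming the switch convention described above (closed when the clip is empty, open when clipped to fabric) and illustrative function names:

```python
# Sketch of the compliance-contact checks. Names are assumptions.

def is_worn(compliance_switch_closed):
    # Closed switch means both clip sides touch, i.e., the device is unworn.
    return not compliance_switch_closed

def should_remind_to_wear(compliance_switch_closed, motion_in_premise):
    # Remind only when someone appears to be home but the device is unworn.
    return motion_in_premise and not is_worn(compliance_switch_closed)

def is_false_fall(impact_detected, compliance_switch_closed):
    # An impact while unworn (e.g., the device was dropped) is not a fall.
    return impact_detected and not is_worn(compliance_switch_closed)
```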
Referring again to FIG. 5, the on-body sensor 510 comprises multi-modal sensing (e.g., triaxial inertial sensor, angular rate sensor, magnetometer, barometric pressure sensor, etc.), input/output, radio (e.g., via input/output), a processor, memory, battery, and user interface capabilities for human interaction (e.g., a button, LED/LCD, buzzer, etc.). The on-body sensor 510 may be used to measure gross human motion and activity, detect specific events or behaviors (e.g., falls, walking, running, sleeping, etc.), communicate to the user (e.g., reminders, notifications, etc.), or capture user input (e.g., panic button press, verification of event, etc.). Detecting falls with on-body sensing is described in further detail below.
Referring again to FIG. 4, a gateway device 450 can be used to relay information between remote servers 460 (e.g., over public or private communication network) and systems at the user location. The gateway device 450 can also allow systems within a user's location to communicate without involvement from remote servers 460. Certain incarnations of the system 400 may not include a gateway device 450. Therefore, passive sensors 410, assistance devices 420, imaging sensors 430, and/or user interface devices 440 may be connected directly to the communication network 480.
Remote servers 460 may be employed to store, process, and initiate actions based upon fall, fall-related, or other data collected about each monitored user and location.
A monitoring center 470 may employ automated or human agents to observe users' fall-related events and contact users or caregivers based on defined protocols, quantitative or qualitative assessments. Monitoring center agents can also annotate user records stored on the remote server 460.
FIG. 7 illustrates an example process 700 for fall management. The operations of the example process 700 are described generally as being performed by the system 400. The operations of the example process 700 may be performed by one of the components of the system 400 or may be performed by any combination of the components of the system 400.
The operations of the example process 700 also may be performed by one of the components of the system 200 or may be performed by any combination of the components of the system 200. In some implementations, operations of the example process 700 may be performed by one or more processors included in one or more electronic devices.
The fall management process 700 includes data capture (710), fall detection (720), fall verification (730), fall risk assessment (740), fall risk reduction (750), and reporting (760). Although several steps are illustrated as part of the fall management process 700, some fall management implementations may only employ a subset of these steps.
The system 400 performs data capture (710). Data can be captured from one or more passive sensors, imaging sensors, assistance devices, and user interface devices. Data can be unprocessed sensor readings or sensor-processed readings. Data capture can be triggered by the passive sensors, imaging sensors, user interface devices, remote servers, or monitoring center. Data capture can consist of instantaneous or continuously sampled readings. Data can be forwarded directly from devices to remote servers or to remote servers via a gateway device. Remote servers, a gateway device, or sensors may coordinate the capture of data or buffer data to facilitate on-sensor, on-gateway, or remote processing. In addition to raw sensor readings, meta-data encompassing sensor location, timestamp, etc. can be forwarded to other devices, sensors, gateways, or remote servers.
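One plausible shape for a captured reading plus its meta-data, as forwarded to a gateway or remote server, is sketched below; every field name is an illustrative assumption rather than a defined wire format:

```python
import json
import time

# Hypothetical packaging of a sensor reading with meta-data for forwarding.

def package_reading(sensor_id, location, values, processed=False):
    return json.dumps({
        "sensor_id": sensor_id,
        "location": location,      # meta-data: sensor installation location
        "timestamp": time.time(),  # meta-data: when the sample was taken
        "processed": processed,    # raw reading vs. sensor-processed reading
        "values": values,
    })

print(package_reading("onbody-01", "bedroom", {"accel_g": 2.4}))
```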
The system 400 performs fall detection (720). Falls can be detected independently by passive sensors, imaging sensors, or user interface devices (e.g., on-body sensor). Each device can classify a possible fall and communicate fall events or quantitative metrics related to the possibility of a fall (e.g., fall classification score). For example, an on-body sensor can
capture human motion and detect motion characteristics indicative of a fall (described in more detail below). Furthermore, an image sensor can detect the likelihood of a fall through analysis of images and other in-device sensors (described in more detail below).
Fall detection may also be accomplished through the use of multiple sensors in parallel (e.g., hierarchical) or sequentially to improve sensitivity and specificity of fall detection. Numerous examples of combined sequential and parallel fall detection may be used, and data from any combination of the sensors described throughout this disclosure may be fused and considered in combination to detect a potential fall event. For example, the system 400 may detect entry into a room based on output from a motion sensor and/or a door sensor. In this example, the system 400 detects that the room has not been exited after a threshold period of time has passed since the room entry was detected and detects sensor inactivity across all sensors after the room entry was detected. Based on the detections made and consideration of output of all of the sensors within the system 400, the system 400 determines that a potential fall event may have occurred in the room and, in response to the determination that a potential fall event may have occurred in the room, initiates further processing to verify whether a potential fall event has occurred in the room.
In another example, the system 400 detects a potential fall event based on output from an on-body sensor. In this example, the system 400 controls an imaging sensor to capture one or more images in a room where the potential fall event is expected to have occurred, performs analysis of the captured images, and detects possible presence of a prone individual on the ground in the room. The system 400 also detects sensor inactivity across all sensors after detecting the potential fall event based on output from the on-body sensor. Based on the detections made and consideration of output of all of the sensors within the system 400, the system 400 determines that a potential fall event may have occurred in the room and, in response to the determination that a potential fall event may have occurred in the room, initiates further processing to verify whether a potential fall event has occurred in the room.
Independent fall detection processes on single devices or groups of devices also may be weighted (e.g., based on confidence or accuracy of fall detection efficacy). Such weighting may be used to compute an aggregate score indicative of the confidence of a possible fall. Weights may be assigned based on currently observed data and conditions, historic data from the monitored individual, or population data. Fall detection sensitivity may be configured by the user based on manipulation of weights associated with any of the aforementioned steps. For example, fall sensitivity could be set by adjusting the interval of sensed inactivity or the threshold for decreased activity. The system 400 may consider output from any of the sensors in the system 400 in computing the aggregate score. The system 400 may use the aggregate score to detect a potential fall event by comparing the aggregate score to a threshold. For instance, the system 400 detects a potential fall event based on the comparison of the aggregate score to the threshold revealing that the aggregate score meets the threshold and determines that a potential fall event has not occurred based on the comparison of the aggregate score to the threshold revealing that the aggregate score does not meet the threshold. By considering weighted output from many different sensors and fall detection processes in computing the aggregate score, the system 400 may provide more accurate fall detection with a lower false positive rate because detection of a fall only occurs when several sensors sense potential fall criteria or a single sensor detects a very high likelihood of a potential fall.
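A minimal sketch of such weighted aggregation, assuming illustrative detector names, weights, and threshold (none of which are prescribed by the disclosure):

```python
# Weighted aggregate fall score compared against a detection threshold.

def aggregate_fall_score(detector_scores, weights):
    """detector_scores/weights: dicts keyed by detector name, scores in [0, 1]."""
    return sum(weights[name] * score for name, score in detector_scores.items())

scores = {"on_body": 0.9, "image": 0.4, "motion_inactivity": 0.7}
weights = {"on_body": 0.5, "image": 0.3, "motion_inactivity": 0.2}

THRESHOLD = 0.6   # illustrative; tunable per user, like the weights
if aggregate_fall_score(scores, weights) >= THRESHOLD:
    print("potential fall event detected")    # score here is 0.71
else:
    print("no potential fall event")
```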
The system 400 performs fall verification (730). If a likely fall is detected, the detecting device, gateway, remote server, or monitoring center can initiate fall verification.
The process can include an automated or human-prompted user response. For example, a user may be alerted (e.g., by audible tone, vibration, human operator, automated operator, or visual indicator) to verify their need for help (e.g., a button press or vocal response) or may be alerted to respond within a period of time to cancel a potential fall event. A human operator may also speak and listen to a user over a two-way communication link.
Fall verification also may be made by human inspection of captured images. For example, following the detection of a potential fall event, an image or successive images captured proximal to the fall may be sent to the monitoring center for human verification.
Image capture also may be triggered post fall (e.g., by a monitoring center or by other caregivers) to verify a fall event. Other contextual sensor or meta-data may be forwarded to human responders to assist in the verification of fall.
Fall verification procedures may be staged sequentially or paired with fall detection mechanisms to create a hierarchical fall escalation process. For example, less accurate fall detection methods may trigger less invasive user verification (e.g., prompted user button press). If no user response is given within a threshold period of time, then more accurate fall detection methods may be employed alongside more invasive fall verification (e.g., two-way communications with a monitoring center).
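A hedged sketch of that staged escalation, assuming hypothetical interfaces for the prompt and the audio session:

```python
# Staged verification: non-invasive prompt first, escalate on no response.

def staged_verification(prompt_button_press, open_two_way_audio,
                        response_timeout_secs=30):
    # Stage 1: less invasive verification paired with the first detection.
    if prompt_button_press(timeout=response_timeout_secs):
        return "cancelled_by_user"
    # Stage 2: no response in time, so escalate to invasive verification.
    open_two_way_audio()   # e.g., a monitoring center operator speaks/listens
    return "escalated_to_monitoring_center"
```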
The system 400 performs fall risk assessment (740). Assessment of fall risk may be made on the basis of data captured by sensors, user interface devices, or historic and stored data. Measures such as gait speed and balance can be directly assessed via passive and user interface devices. For example, two motion sensors placed in a hallway can measure gait speed, and balance can be assessed via an on-body user interface device (e.g., via on-board inertial sensor and angular rate sensor). Other behavioral data such as medication adherence, sleep patterns, kitchen or restroom use can be used to augment mobility metrics. Data can be combined with prior knowledge of fall incidents or previously verified fall events. In addition, users may be prompted to submit responses to questions or requests for information (e.g., via a user interface device or website, electronic medical records, residence layout, etc.) to form an aggregate fall risk assessment score. Scores can be computed, compared, or modified against individual or population scores and histories. Scores can also be computed for various timescales and locations. Fall risk assessment may also take into consideration trending of scores for an individual.
The system 400 performs fall risk reduction (750). Various assistive approaches may be employed with or without prior fall risk assessment scoring to help reduce fall risk.
Assistance devices such as automated lighting or medication dispensers can be used to reduce environmental hazards or behaviors associated with increased fall risk, respectively.
Assistance devices may be triggered by fall assessment scores, other sensors, user interface devices, or remote servers. For example, automated lighting can be turned-on when a user gets out of bed.
Furthermore, notifications or educational material can be delivered (e.g., by default,
for certain fall risk assessment scores, for certain events, etc.) to the user (e.g., via a user interface device or other output device) to help the user better understand and correct fall risk factors. Tips or behavior change techniques can help the user set up a safer environment or promote behaviors associated with decreased fall risk. Notifications may be combined with sensing or other user interface prompts (e.g., prompts to answer questionnaires) to assess adherence to fall risk reduction techniques in real-time or across a period of time. Users may be scored on their ability to reduce fall risk at various timescales or in various locations. Fall risk reduction scores may be compared to individual or population historic data.
The system 400 performs reporting (760). Fall risk, detection, and prevention data, scores, annotations, or observations can be stored at the remote server. Data can be compiled and reported to users, caregivers, monitoring centers, or other trusted parties. Data (including timestamps, scores, locations, confidence, etc.) can be used for the purposes of response to events, for preventative fall risk reduction strategies, or by professional caregivers for general health assessment. Data or scores can be compared to individual or population data and reported to all aforementioned parties when appropriate.
Data reporting may be combined with prompts for data entry. For example, a user could receive a notification that bathroom habits are abnormal and be asked whether they are feeling well.
Access to reported data can be restricted based on preferences of the user or caregivers.
Notifications, reminders, user prompts, questionnaires, monitored responses, and other user interface modes can be configured by rules with associated parameters. Rules can be stored and executed at the remote server, gateway device, sensors, or user interface devices.
FIG. 8 illustrates an example process 800 for fall detection using an on-body user interface device. The operations of the example process 800 are described generally as being performed by the system 400. The operations of the example process 800 may be performed by one of the components of the system 400 or may be performed by any combination of the components of the system 400. The operations of the example process 800 also may be performed by one of the components of the system 200 or may be performed by any combination of the components of the system 200. In some implementations, operations of the example process 800 may be performed by one or more processors included in one or more electronic devices.
In order to accurately detect a fall event, the on-body user interface device identifies the various characteristics of a fall comprised of the user starting from a standing or sitting position, falling through to the ground, impacting a surface, and remaining inactive after the fall. The user's trunk may transition from a vertical to horizontal position.
This may result in a ninety degree change in trunk orientation, but since the user may not be standing straight before the fall, or may not be prone or supine after the fall, the angle may not reach ninety degrees. FIG. 8 illustrates a fall detection process 800 for users wearing an on-body sensor with continuous sensing and detection.
The system 400 triggers fall detection processing based on detection of a fall-related signature (810). The fall detection process may be triggered by an impact metric (e.g., measured from inertial sensing) or a similar fall-related signature (e.g., free fall) crossing a minimum threshold. The fall-related signature may be quantified and stratified into defined ranges indicative of fall detection confidence.
For instance, FIG. 9 illustrates example fall detection criteria. The fall detection criteria include a range of impact metrics 910 used to quantify a measured impact metric. As shown, the range of impact metrics may include less than two, between two to five, between five to ten, between ten to fifteen, and greater than fifteen. The system 400 may use the impact metric of two as a threshold for triggering fall detection processing.
For instance, the system 400 quantifies a measured impact within the ranges of impact metrics and determines not to trigger fall detection processing based on the measured impact falling within the range of less than two. For any of the other ranges, the system 400 triggers fall detection processing and records the range in which the measured impact falls for later processing.
Referring again to FIG. 8, the system 400 calculates orientation change based on triggering fall detection processing (820). Based on the system 400 detecting that a measured impact or similar metric crosses the previously mentioned minimum threshold, the system 400 calculates an orientation change using inertial or angular rate measures from before and after the detected impact or other event. The orientation value may be quantified and stratified into defined ranges.
For example, the fall detection criteria shown in FIG. 9 include a range of orientation changes 920 used to quantify an orientation change. As shown, the range of orientation changes may include less than fifty, between fifty to sixty, between sixty to seventy-five, between seventy-five to eighty-five, and greater than eighty-five. The system 400 may use the orientation change of fifty as a threshold for continuing fall detection processing. For instance, the system 400 quantifies a calculated orientation change within the ranges of orientation changes and determines not to continue fall detection processing based on the calculated orientation change falling within the range of less than fifty. For any of the other ranges, the system 400 continues fall detection processing and records the range in which the calculated orientation change falls for later processing.
Referring again to FIG. 8, the system 400 determines a minimum required inactivity period based on the fall-related signature and the orientation change (830).
Based on the defined ranges derived from impact/signature scoring and orientation scoring, a minimum required inactivity period can be determined by a lookup table or functional relationship. The fall detection criteria shown in FIG. 9 include an inactivity period lookup table 930. The system 400 references the lookup table 930 using the range of the measured impact and the range of the calculated orientation change and sets the minimum required inactivity period as the period of time defined by the appropriate entry in the lookup table 930. For example, with an impact metric greater than ten, but less than fifteen, and an orientation change greater than eighty-five, the inactivity period is set as low as thirty seconds to signal a likely fall.
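The stratification and lookup of steps 810 through 830 might be sketched as follows. Only the thirty-second entry for an impact of ten to fifteen with an orientation change above eighty-five comes from the text; every other period in the table below is an invented placeholder for illustration:

```python
import bisect

# Stratify impact and orientation change into the FIG. 9 ranges, then look
# up a minimum required inactivity period. Placeholder values throughout,
# except the documented (10-15, >85) -> 30 s entry.

IMPACT_EDGES = [2, 5, 10, 15]    # bins: <2, 2-5, 5-10, 10-15, >15
ORIENT_EDGES = [50, 60, 75, 85]  # bins: <50, 50-60, 60-75, 75-85, >85

# inactivity_secs[impact_bin][orientation_bin]; None means "do not trigger".
INACTIVITY_SECS = [
    [None, None, None, None, None],  # impact < 2: detection not triggered
    [None, 600,  480,  360,  240],
    [None, 480,  360,  240,  120],
    [None, 360,  240,  120,   30],   # impact 10-15, orientation > 85 -> 30 s
    [None, 240,  120,   60,   30],
]

def min_inactivity_period(impact_metric, orientation_change_deg):
    i = bisect.bisect_right(IMPACT_EDGES, impact_metric)
    o = bisect.bisect_right(ORIENT_EDGES, orientation_change_deg)
    if i == 0 or o == 0:
        return None                  # below a threshold: no fall processing
    return INACTIVITY_SECS[i][o]

print(min_inactivity_period(12, 90))  # -> 30
```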
Referring again to FIG. 8, the system 400 detects a potential fall event based on monitoring activity during the minimum required inactivity period (840). The system 400 may monitor output of the on-body sensor and output from any of the other sensors in the
system 400 and determine whether any of the sensors signal activity. The system 400 continues to monitor the sensor output until the set period of inactivity has been reached and the system 400 detects a potential fall event based on determining that the set period of inactivity has passed without detecting sensed activity from any of the sensors in the system 400.
FIG. 10 illustrates an example process 1000 for tuning sensitivity and specificity of fall detection. The operations of the example process 1000 are described generally as being performed by the system 400. The operations of the example process 1000 may be performed by one of the components of the system 400 or may be performed by any combination of the components of the system 400. The operations of the example process 1000 also may be performed by one of the components of the system 200 or may be performed by any combination of the components of the system 200. In some implementations, operations of the example process 1000 may be performed by one or more processors included in one or more electronic devices.
To tune sensitivity and specificity of fall detection (e.g., the on-body fall detection process 800), the process 1000 uses user feedback. The process 1000 may produce more granular fall reporting (e.g., true, false, minor, canceled falls) and may help to reduce and report incidence of false positives or false negatives.
The system 400 detects a potential fall event (1005). A possible fall is detected by the on-body device, by other sensors, or by user interface devices. Any of the techniques described throughout this disclosure may be used to detect a potential fall event.
The system 400 prompts the user for cancellation of the potential fall event (1010). A
user prompt may be initiated (e.g., audible or visual). The user can respond (e.g., by a button press or vocalization) to the user prompt at the device to cancel the detected potential fall event.
The system 400 determines whether the user cancels the potential fall event within a defined period of time (1015). For instance, the system 400 monitors for input cancelling the potential fall event until the defined period of time has been reached and the system 400 determines whether the user cancelled the potential fall event within the defined period of time based on the monitoring. Based on a determination that the potential fall event was not cancelled within the defined period of time, the system 400 generates a fall signal (e.g., a fall signal from the body-worn device).
Based on a determination that the potential fall event was cancelled within the defined period of time, the system 400 makes a measurement of overall activity over the minimum inactivity period previously mentioned (1020). For example, the system measures the activity detected by the on-body sensor after detection of the potential fall event until the input cancelling the potential fall event was received. The system 400 determines whether the measurement of overall activity meets an expected maximum activity (1025). For instance, the system 400 compares the measurement of overall activity to the expected maximum activity and determines whether the measurement of overall activity meets the expected maximum activity based on the comparison.
Based on a determination that the measurement of overall activity meets the expected maximum activity, the system 400 signals a false fall detection (1030). For example, the system 400 classifies the sensor data used to detect the potential fall event as being sensor data associated with a false detection of a potential fall event. In this example, the system 400 may tune the potential fall detection process such that sensor data similar to the sensor data associated with the false detection of the potential fall event does not result in detection of a potential fall event in the future.
Based on a determination that the measurement of overall activity does not meet the expected maximum activity, the system 400 measures posture or orientation (1035) and determines whether the subject recovered from the suspected fall based on the measured posture or orientation (1040). For instance, the system 400 analyzes the measured posture or orientation and determines whether the subject has returned to an upright position.
Based on a determination that the subject recovered from the suspected fall, the system 400 triggers a minor fall (1045). For example, the system 400 classifies the sensor data used to detect the potential fall event as being sensor data associated with a minor fall.
In this example, the system 400 may tune the potential fall detection process such that sensor data similar to the sensor data associated with the minor fall results in detection of a minor fall event in the future. The system 400 may handle minor fall events differently than regular fall events. For instance, the system 400 may wait longer to see if a patient recovers from a minor fall prior to alerting a remote caregiver or monitoring station.
Based on a determination that the subject did not recover from the suspected fall, the system 400 performs another user prompt for cancellation (1050) and determines whether the user cancels the potential fall event within a defined period of time from the additional prompt for cancellation (1055). Based on a determination that the potential fall event was
cancelled within the defined period of time, the system 400 signals a cancelled fall (1060).
For instance, the system 400 does not provide an alert for the potential fall event, but does classify the sensor data used to detect the potential fall event as being sensor data associated with a fall that was ultimately cancelled.
Based on a determination that the potential fall event was not cancelled within the defined period of time, the system 400 generates a fall signal (1065). For instance, the system 400 may generate a fall signal from the body-worn device. The fall signal may be sent to a remote caregiver or monitoring station to alert the remote caregiver or monitoring station to provide assistance to the patient who experienced the potential fall event.
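Pulling the branches of process 1000 together, a hedged sketch of the outcome classification (the parameter interfaces and the "meets" comparison as greater-or-equal are assumptions):

```python
# Outcome classification mirroring steps 1005-1065 of process 1000.

def classify_potential_fall(cancelled_in_time, overall_activity,
                            expected_max_activity, recovered_upright,
                            cancelled_on_reprompt):
    if not cancelled_in_time:
        return "fall_signal"      # 1065: alert caregiver/monitoring station
    if overall_activity >= expected_max_activity:
        return "false_fall"       # 1030: e.g., device handled while unworn
    if recovered_upright:
        return "minor_fall"       # 1045: handled less urgently
    if cancelled_on_reprompt:
        return "cancelled_fall"   # 1060: logged, no alert sent
    return "fall_signal"          # 1065

print(classify_potential_fall(True, 0.1, 0.5, True, False))  # -> minor_fall
```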
Granular fall detection classes such as true fall, false fall, minor fall, and cancelled fall can be used to tune system parameters for each individual user, provide caregivers or trusted individuals with fall data, and provide automated mechanisms for fall verification.
Furthermore, the data can be stored at the remote servers.
FIG. 11 illustrates an example process 1100 for fall detection and reporting.
The operations of the example process 1100 are described generally as being performed by the system 400. The operations of the example process 1100 may be performed by one of the components of the system 400 or may be performed by any combination of the components of the system 400. The operations of the example process 1100 also may be performed by one of the components of the system 200 or may be performed by any combination of the components of the system 200. In some implementations, operations of the example process 1100 may be performed by one or more processors included in one or more electronic devices.
In general, the process 1100 enables fall detection and reporting based on human movement analysis. The system 400 performs a triggered or scheduled image capture (1110). For example, the system 400 may trigger a camera on an image sensing device to capture an image based on events detected by one or more of the image sensing device's sensors (e.g., perceived motion by a passive infrared motion sensor, a triaxial inertial sensor). In this example, movement or impact detected proximal to the image sensing device may initiate the capture of an image. Furthermore, the system 400 may trigger the camera by one or more external sensors interfaced via a gateway device. For instance, the press of a panic button or the opening of a door sensor may trigger one or more image sensing devices to capture an image. Finally, image capture may be scheduled (e.g., capture an image every one minute during the hours of six in the morning through ten in the evening). In lower light conditions (e.g., characterized by the illumination sensor), the system 400 may employ infrared illumination to increase image detail and quality.
After image capture, the system 400 performs image foreground segmentation and filtering (1120). The system 400 (e.g., the image sensing device) may perform image foreground segmentation via background subtraction or other averaging approaches. The system 400 may filter captured images to help reduce foreground noise and isolate large regions of change. The process may identify changed pixels from previous images, including those morphologically likely to represent human forms or shapes.
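One possible realization of this segmentation step uses the OpenCV library, as in the sketch below; the subtractor parameters and kernel size are assumptions, and the disclosure does not mandate OpenCV or this particular background model:

```python
import cv2
import numpy as np

# Background subtraction plus morphological filtering to suppress
# foreground noise and keep large regions of change (step 1120).

subtractor = cv2.createBackgroundSubtractorMOG2(history=200,
                                                detectShadows=False)
kernel = np.ones((5, 5), np.uint8)

def foreground_mask(frame_bgr):
    mask = subtractor.apply(frame_bgr)
    mask = cv2.morphologyEx(mask, cv2.MORPH_OPEN, kernel)   # drop specks
    mask = cv2.morphologyEx(mask, cv2.MORPH_CLOSE, kernel)  # fill holes
    return mask
```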
After image foreground segmentation and filtering, the system 400 performs human segmentation (1130). The system 400 segments possible human shapes via template matches, shape fitting, or similar methods. For example, the system 400 may segment a foreground shape falling within an approximate elliptical boundary over a size threshold.
Such segmentation may reduce incidence of false detection and reporting (e.g., small pet activity). To further reduce incidence of false detection and reporting, the system 400 may remove regions of the camera's field of view from analysis. For instance, if a bed were present in the field of view, the bed may be marked as a non-detection region and the system 400 would not analyze that portion of images captured by the image sensing device.
After human segmentation, the system 400 performs human orientation and position estimation (1140). For example, the system 400 calculates orientation (e.g., human shape upright, angled, prone, etc.) and position (e.g., human shape above floor, near floor, etc.) by template or boundary shape proportion and rotation relative to a horizontal image plane.
This estimation enables identification of postures and resting positions indicative of a fall.
The floor proximal planar boundary can be specifically defined and moved to fit the unique geometries of different rooms. After human orientation and position estimation, the system 400 performs successive image and/or sensor data comparison (1150). For example, the system 400 stores, either on or off the image sensing device, the orientation and position information calculated previously and compares the prior orientation and position information with successive image orientations and positions. The system 400 repeats this process and isolates changes in position and orientation indicative of a fall (e.g., movement towards the ground), or relative stasis of position and orientation indicative of a fall (e.g., incapacitation after a fall).
Furthermore, the system 400 may combine motion sensor information with, or use it independently of, image-based analysis to ascertain movement through horizontal planes of motion (e.g., a human falling from an upright position).
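A simplified sketch of the segmentation and orientation/position estimation of steps 1130 through 1150, using image moments and a horizontal image row as the floor-proximal boundary; MIN_AREA and FLOOR_BOUNDARY_Y are assumed, per-room configurable values:

```python
import math
import cv2

# Gate foreground shapes by size (reducing, e.g., pet-triggered false
# detections), estimate orientation from image moments, and test the
# centroid against a floor-proximal boundary line.

MIN_AREA = 2000
FLOOR_BOUNDARY_Y = 400

def estimate_human_state(foreground_mask):
    contours, _ = cv2.findContours(foreground_mask, cv2.RETR_EXTERNAL,
                                   cv2.CHAIN_APPROX_SIMPLE)
    for contour in contours:
        m = cv2.moments(contour)
        if m["m00"] < MIN_AREA:
            continue                       # too small to be a person
        cy = m["m01"] / m["m00"]           # centroid row in the image
        # Major-axis angle from central moments, measured from horizontal.
        theta = 0.5 * math.atan2(2 * m["mu11"], m["mu20"] - m["mu02"])
        upright = abs(abs(theta) - math.pi / 2) < math.pi / 6
        return {"upright": upright, "below_plane": cy > FLOOR_BOUNDARY_Y}
    return None

# Comparing successive estimates isolates movement toward the ground (a
# fall) or stasis in a fallen posture (incapacitation), per step 1150.
```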
The system 400 performs inactivity detection (1160). For example, the system detects periods of relative inactivity, such as those following a potential fall, from lack of or decreased motion, inertial measures, image-derived orientation and position information, external sensors, or a combination thereof. The system 400 may classify longer periods of relative inactivity as being indicative of a fall, and classify shorter periods of relative inactivity as being indicative of a non-fall event or recovery from a fall.
After inactivity detection, the system 400 performs fall classification (1170). The system 400 may combine (e.g., logically or algebraically) the data and information compiled in previous operations of the process 1100 and use the combined data in several ways to classify possible falls. For example, if an impact is detected, orientation and position are indicative of a human in a fallen state, and a period of inactivity has exceeded a defined threshold, then the system 400 classifies the event as a fall. Classification sensitivity may be configured by the user based on manipulation of variables associated with any of the aforementioned steps. For example, fall sensitivity could be set by adjusting the interval of sensed inactivity or the threshold for decreased activity. Not all prior conditions must be met, nor all prior steps completed, for fall classification. The system 400 may report classification confidence based on the quality of inputs or classifier performance.
Furthermore, the system 400 may implement the classifier in a variety of ways such as, but not limited to, an expert system, naive Bayes, decision tree, neural network, etc.
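For instance, a toy expert-system style classifier over the three conditions above might look like the following; the two-of-three rule, the inactivity threshold, and the confidence measure are illustrative assumptions:

```python
# Combine impact, fallen-state, and inactivity evidence (step 1170).

def classify_fall(impact_detected, fallen_state, inactivity_secs,
                  inactivity_threshold_secs=60):
    evidence = [impact_detected, fallen_state,
                inactivity_secs >= inactivity_threshold_secs]
    confidence = sum(evidence) / len(evidence)  # reported with the class
    # Per the text, not every condition must hold; two of three suffice here.
    return ("fall" if confidence >= 2 / 3 else "no_fall"), confidence

print(classify_fall(True, True, 90))  # -> ('fall', 1.0)
```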
After fall classification, the system 400 performs fall reporting (1180). For example, potential fall events are forwarded to a gateway device, remote monitoring servers, and ultimately to users or central monitoring station(s) if appropriate rules and preferences are met. Images (e.g., past and present), data, and location information can be sent for purposes of reporting and verification. Moreover, potential non-fall events, images, data, and location can be forwarded to users or central monitoring station(s) for verification.
Verification of fall events is not a requisite function of the system, but rather an additional feature. Fall detection can be performed with or without image or other human-based verification.
FIG. 12 shows fall detection examples with three possible scenarios that illustrate aspects of the image-based fall detection process 1100 discussed above. In the first scenario (a), a person stands upright in a room. In the second scenario (b), a person has fallen and is prone on the floor. In the third scenario (c), a person has fallen to a slumped position on the floor.
Illustrations (d), (e), and (f) represent the results of foreground separation and filtering of illustrations (a), (b), and (c), respectively. Illustrations (g), (h), and (i) represent the results of human orientation and position estimation and inactivity detection (as denoted by a clock) of the previous illustrations, respectively. Notice in illustration (g) that the human shape estimator, illustrated as an ellipse, but not limited to ellipses, extends beyond a floor proximal planar boundary; whereas in illustrations (h) and (i), the human shape estimators are below the plane and their orientations are not vertical; hence, inactivity detection has commenced.
Analysis of room geometry within captured images may be used to project a virtual plane below which a person should be oriented in a fall event. The system 400 may analyze floor geometry and then perform centroid-based processing to determine where the floor is located in the captured images. After determining the location of the floor, the system 400 projects the virtual plane within the captured images at a particular distance above the floor.
In some implementations, the image sensing device and optional trigger sources (e.g., other sensors) communicate to a gateway (e.g., home monitoring panel) within a home or facility. The gateway's memory enables buffering of images and data from the image sensor and other sensors. Data is forwarded to remote monitoring servers over a long range wireless network (e.g., cellular link). Rules and preferences set at the remote monitoring server enable potential fall information (e.g., captured images and data) to be forwarded via an IP
network to users or a central monitoring station for fall verification. If a fall is verified by human inspection of captured images and data, a response can be initiated (e.g., a two-way voice call may be established with the gateway device, emergency responders may be dispatched, etc.) and location information from the system can be communicated to those providing assistance.
In some implementations, the system (e.g., the system 200 or the system 400) may evaluate context in determining how to handle a fall detection event. In these implementations, the system may check other activity in the property and determine how to handle the fall detection event based on the other activity. For instance, when the system detects other activity in the property, the system may attempt to alert someone in the property to the potential fall event (e.g., by providing an audible alert in the home that indicates the fall detection event). When the system does not detect other activity in the property, the system may, based on the fall detection event, send electronic messages to a caregiver associated with the property to alert the caregiver to the fall detection event, establish a two-way voice communication session with a monitoring system at the property, and/or dispatch emergency services.
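A hedged sketch of that context-based routing, with the action interfaces assumed for illustration:

```python
# Route the alert based on whether other activity is present on-site.

def handle_fall_detection(other_activity_in_property, sound_local_alert,
                          notify_caregiver, open_two_way_session,
                          dispatch_emergency_services):
    if other_activity_in_property:
        sound_local_alert()      # someone on-site may be able to help
    else:
        notify_caregiver()       # escalate off-premises, and/or, per rules:
        open_two_way_session()
        dispatch_emergency_services()
```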
In some examples, the system may tune sensitivity of one or more sensors/contexts used in fall detection and may determine a score as part of fall classification. In these examples, the system may determine the score based on a number of sensors that indicate a potential fall. For instance, the system may determine a relatively high score when the system detects a thud based on an accelerometer sensor, detects multiple motion sensors indicating motion consistent with a fall, and performs image analysis that suggests that a person has moved from a vertical orientation to a horizontal orientation below a plane near the floor. The system may determine a relatively low score when the system only performs image analysis that suggests that a person is horizontally oriented below a plane near the floor. The system may consider the number of motion sensors detecting motion and leverage all sensor data. The system may typically operate using a subset of sensors and move to a process that leverages all sensors when a potential fall is detected by the subset of sensors.
The system may consider historic data (e.g., classification by caregivers of whether a fall detection event was actually a fall or a mistake) and tune fall detection based on the historic data.
In some implementations, the location in the home where the fall occurred may be determined and communicated to an emergency response team. In addition, the location in the home where the fall occurred may be used to pick the other sensors the system reviews in confirming a potential fall event. For instance, when the system determines that the potential fall occurs in the basement, the system determines not to consider sensors in the upstairs bedroom, as the sensors in the upstairs bedroom are unlikely to be relevant to the potential fall event in the basement.
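The location-scoped confirmation just described might be sketched as follows; the room-adjacency map and sensor records are illustrative assumptions:

```python
# Consult only sensors plausibly relevant to the fall location.

ADJACENT_ROOMS = {
    "basement": {"basement", "basement_stairs"},
    "kitchen": {"kitchen", "hallway"},
}

def sensors_to_review(all_sensors, fall_room):
    relevant = ADJACENT_ROOMS.get(fall_room, {fall_room})
    return [s for s in all_sensors if s["room"] in relevant]

sensors = [{"id": "pir-1", "room": "basement"},
           {"id": "pir-2", "room": "upstairs_bedroom"}]
print([s["id"] for s in sensors_to_review(sensors, "basement")])  # ['pir-1']
```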
The described systems, methods, and techniques may be implemented in digital electronic circuitry, computer hardware, firmware, software, or in combinations of these elements. Apparatus implementing these techniques may include appropriate input and output devices, a computer processor, and a computer program product tangibly embodied in a machine-readable storage device for execution by a programmable processor. A
process implementing these techniques may be performed by a programmable processor executing a program of instructions to perform desired functions by operating on input data and generating appropriate output. The techniques may be implemented in one or more computer programs that are executable on a programmable system including at least one programmable processor coupled to receive data and instructions from, and to transmit data and instructions to, a data storage system, at least one input device, and at least one output device. Each computer program may be implemented in a high-level procedural or object-oriented programming language, or in assembly or machine language if desired; and in any case, the language may be a compiled or interpreted language. Suitable processors include, by way of example, both general and special purpose microprocessors. Generally, a processor will receive instructions and data from a read-only memory and/or a random access memory.
Storage devices suitable for tangibly embodying computer program instructions and data include all forms of non-volatile memory, including by way of example semiconductor memory devices, such as Erasable Programmable Read-Only Memory (EPROM), Electrically Erasable Programmable Read-Only Memory (EEPROM), and flash memory devices; magnetic disks such as internal hard disks and removable disks;
magneto-optical disks; and Compact Disc Read-Only Memory (CD-ROM). Any of the foregoing may be supplemented by, or incorporated in, specially-designed ASICs (application-specific integrated circuits).
It will be understood that various modifications may be made. For example, other useful implementations could be achieved if steps of the disclosed techniques were performed in a different order and/or if components in the disclosed systems were combined in a different manner and/or replaced or supplemented by other components.
Accordingly, other implementations are within the scope of the disclosure.
Claims (38)
1. A method comprising:
monitoring output from at least one sensor configured to sense, in a room of a building, activity associated with a patient falling;
based on the monitoring of output from the at least one sensor, determining to capture one or more images of the room;
based on the determination to capture one or more images of the room, capturing, with a camera positioned to include the patient within a field of view of the camera, an image of the room;
analyzing the captured image of the room to detect a state of the patient at a time of capturing the image;
determining, based on the detected state of the patient, a potential fall event for the patient; and based on the determination of the potential fall event for the patient, sending, by a communication device, a message indicating the potential fall event for the patient, wherein sending the message indicating the potential fall event for the patient comprises sending, to the patient, the message indicating the potential fall event and providing the patient with an opportunity to cancel the potential fall event.
2. The method of claim 1, further comprising:
determining that the patient has not cancelled the potential fall event within a threshold period of time; and based on determining that the patient has not cancelled the potential fall event within the threshold period of time, sending a message to a monitoring server indicating the potential fall event.
3. The method of claim 1, further comprising:
receiving, from the patient, an indication to cancel the potential fall event;
and based on receiving the indication to cancel the potential fall event, determining an overall activity of the patient between detecting the potential fall event and receiving the indication to cancel the potential fall event.
4. The method of claim 3, further comprising:
determining that the overall activity of the patient is above a threshold of activity; and based on determining that the overall activity of the patient is above the threshold of activity, signaling that the potential fall event was detection of a false fall.
5. The method of claim 3, further comprising:
determining that the overall activity of the patient is below a threshold of activity; and based on determining that the overall activity of the patient is below the threshold of activity, determining an orientation of the patient.
6. The method of claim 5, further comprising:
determining that the determined orientation of the patient is upright; and based on determining that the determined orientation of the patient is upright, signaling that the potential fall event was detection of a minor fall.
7. The method of claim 5, further comprising:
determining that the determined orientation of the patient is not upright; and based on determining that the determined orientation of the patient is not upright, sending another message to the patient that provides the patient with another opportunity to cancel the potential fall event.
8. The method of claim 7, further comprising:
determining that the patient has not cancelled the potential fall event within a threshold period of time after sending another message to the patient that provides the patient with another opportunity to cancel the potential fall event; and based on determining that the patient has not cancelled the potential fall event within the threshold period of time after sending another message to the patient that provides the patient with another opportunity to cancel the potential fall event, sending a message to a monitoring server indicating the potential fall event.
9. The method of claim 7, further comprising:
after sending another message to the patient that provides the patient with another opportunity to cancel the potential fall event, receiving, from the patient, an indication to cancel the potential fall event; and based on receiving the indication to cancel the potential fall event, signaling that the potential fall event was a cancelled fall event.
10. A system comprising:
at least one processor; and at least one memory coupled to the at least one processor having stored thereon instructions which, when executed by the at least one processor, causes the at least one processor to perform operations comprising:
monitoring output from at least one sensor configured to sense, in a room of a building, activity associated with a patient falling;
based on the monitoring of output from the at least one sensor, determining to capture one or more images of the room;
based on the determination to capture one or more images of the room, capturing, with a camera positioned to include the patient within a field of view of the camera, an image of the room;
analyzing the captured image of the room to detect a state of the patient at a time of capturing the image;
determining, based on the detected state of the patient, a potential fall event for the patient; and based on the determination of the potential fall event for the patient, sending, by a communication device, a message indicating the potential fall event for the patient, wherein sending the message indicating the potential fall event for the patient comprises sending, to the patient, the message indicating the potential fall event and providing the patient with an opportunity to cancel the potential fall event.
11. The system of claim 10, wherein the operations comprise:
determining that the patient has not cancelled the potential fall event within a threshold period of time; and based on determining that the patient has not cancelled the potential fall event within the threshold period of time, sending a message to a monitoring server indicating the potential fall event.
12. The system of claim 10, wherein the operations comprise:
receiving, from the patient, an indication to cancel the potential fall event;
and based on receiving the indication to cancel the potential fall event, determining an overall activity of the patient between detecting the potential fall event and receiving the indication to cancel the potential fall event.
13. The system of claim 12, wherein the operations comprise:
determining that the overall activity of the patient is above a threshold of activity; and based on determining that the overall activity of the patient is above the threshold of activity, signaling that the potential fall event was detection of a false fall.
14. The system of claim 12, wherein the operations comprise:
determining that the overall activity of the patient is below a threshold of activity; and based on determining that the overall activity of the patient is below the threshold of activity, determining an orientation of the patient.
15. The system of claim 14, wherein the operations comprise:
determining that the determined orientation of the patient is upright; and based on determining that the determined orientation of the patient is upright, signaling that the potential fall event was detection of a minor fall.
16. The system of claim 14, wherein the operations comprise:
determining that the determined orientation of the patient is not upright; and based on determining that the determined orientation of the patient is not upright, sending another message to the patient that provides the patient with another opportunity to cancel the potential fall event.
17. The system of claim 16, wherein the operations comprise:
determining that the patient has not cancelled the potential fall event within a threshold period of time after sending another message to the patient that provides the patient with another opportunity to cancel the potential fall event; and based on determining that the patient has not cancelled the potential fall event within the threshold period of time after sending another message to the patient that provides the patient with another opportunity to cancel the potential fall event, sending a message to a monitoring server indicating the potential fall event.
18. The system of claim 16, wherein the operations comprise:
after sending another message to the patient that provides the patient with another opportunity to cancel the potential fall event, receiving, from the patient, an indication to cancel the potential fall event; and based on receiving the indication to cancel the potential fall event, signaling that the potential fall event was a cancelled fall event.
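Claims 11 through 18 layer a verification ladder on top of that first alert: no response escalates to the monitoring server, while a cancellation is sanity-checked against the patient's subsequent activity and orientation before the event is classified. One way to sketch that decision tree in Python, with the activity threshold, the helper names, and the `Outcome` labels all assumed for illustration:

```python
# Hypothetical sketch of the claim 11-18 verification ladder. The activity
# threshold, helper signatures, and outcome labels are assumptions.
from enum import Enum, auto

class Outcome(Enum):
    REPORTED = auto()        # escalated to the monitoring server
    FALSE_FALL = auto()      # cancelled and still highly active (claim 13)
    MINOR_FALL = auto()      # cancelled, low activity, upright (claim 15)
    CANCELLED_FALL = auto()  # cancelled on the second prompt (claim 18)

ACTIVITY_THRESHOLD = 0.3  # assumed "overall activity" cut-off

def classify_event(cancelled_first: bool, overall_activity: float,
                   upright: bool, cancelled_second: bool,
                   notify_server) -> Outcome:
    if not cancelled_first:
        # Claim 11: no cancellation within the timeout -> report.
        notify_server("potential fall: no cancellation within timeout")
        return Outcome.REPORTED
    if overall_activity > ACTIVITY_THRESHOLD:
        return Outcome.FALSE_FALL
    if upright:
        return Outcome.MINOR_FALL
    # Claims 16-18: low activity and not upright -> second prompt.
    if cancelled_second:
        return Outcome.CANCELLED_FALL
    notify_server("potential fall: second prompt unanswered")
    return Outcome.REPORTED

# Example: the patient cancels, is nearly inactive and not upright, but
# answers the second prompt -> treated as a cancelled fall event.
print(classify_event(True, 0.1, False, True, notify_server=print))
```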
19. A method comprising:
monitoring output from at least one sensor configured to sense activity associated with a fall;
based on the monitoring of output from the at least one sensor, determining to capture one or more images of an area associated with the at least one sensor;
based on the determination to capture one or more images of the area associated with the at least one sensor, capturing, with a camera, an image of the area associated with the at least one sensor;
detecting, in the captured image, a state of a person included in the captured image;
determining, based on the detected state of the person, a potential fall event in the area associated with the at least one sensor; and based on the determination of the potential fall event, sending, by a communication device and to a device associated with the person involved in the potential fall event, a message that indicates the potential fall event and provides the person with an opportunity to cancel the potential fall event.
20. The method of claim 19, further comprising:
determining that the person has not cancelled the potential fall event within a threshold period of time; and based on determining that the person has not cancelled the potential fall event within the threshold period of time, sending a message to a monitoring server indicating the potential fall event.
21. The method of claim 19, further comprising:
receiving, from the person, an indication to cancel the potential fall event; and based on receiving the indication to cancel the potential fall event, determining an overall activity of the person between detecting the potential fall event and receiving the indication to cancel the potential fall event.
22. The method of claim 21, further comprising:
determining that the overall activity of the person is above a threshold level of activity; and based on determining that the overall activity of the person is above the threshold level of activity, signaling that the potential fall event was detection of a false fall.
23. The method of claim 21, further comprising:
determining that the overall activity of the person is below a threshold level of activity; and based on determining that the overall activity of the person is below the threshold level of activity, determining an orientation of the person.
24. The method of claim 23, further comprising:
determining that the determined orientation of the person is upright; and based on determining that the determined orientation of the person is upright, signaling that the potential fall event was detection of a minor fall.
25. The method of claim 23, further comprising:
determining that the determined orientation of the person is not upright; and based on determining that the determined orientation of the person is not upright, sending another message to the person that provides the person with another opportunity to cancel the potential fall event.
26. The method of claim 25, further comprising:
determining that the person has not cancelled the potential fall event within a threshold period of time after sending another message to the person that provides the person with another opportunity to cancel the potential fall event; and based on determining that the person has not cancelled the potential fall event within the threshold period of time after sending another message to the person that provides the person with another opportunity to cancel the potential fall event, sending a message to a monitoring server indicating the potential fall event.
27. The method of claim 25, further comprising:
after sending another message to the person that provides the person with another opportunity to cancel the potential fall event, receiving, from the person, an indication to cancel the potential fall event; and based on receiving the indication to cancel the potential fall event, signaling that the potential fall event was a cancelled fall event.
28. The method of claim 19:
wherein detecting, in the captured image, the state of the person included in the captured image comprises:
performing image foreground segmentation on the captured image to create a segmented image, performing template matching on the segmented image to identify a human shape in the segmented image, and calculating a position and an orientation associated with the identified human shape in the segmented image, and wherein determining the potential fall event in the area associated with the at least one sensor comprises determining a potential fall event for the person based on the calculated position and the calculated orientation.
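Claim 28 (mirrored by claim 38 below) spells out one concrete image-analysis path: foreground segmentation, template matching for a human shape, and a position/orientation estimate. The following is a rough sketch of that path using OpenCV, where the choice of background subtractor, the 0.5 match threshold, and the human-shape template are assumptions of this sketch rather than anything prescribed by the claim.

```python
# Rough OpenCV sketch of claim 28's path: foreground segmentation,
# template matching for a human shape, then position and orientation.
# The subtractor choice, match threshold, and template are assumed.
import cv2
import numpy as np

subtractor = cv2.createBackgroundSubtractorMOG2()

def detect_person_pose(frame: np.ndarray, human_template: np.ndarray):
    """Return ((x, y), angle_degrees) for a matched human shape, or None."""
    mask = subtractor.apply(frame)  # foreground-segmented image
    # Drop MOG2 shadow pixels (value 127) to get a clean binary mask.
    _, mask = cv2.threshold(mask, 127, 255, cv2.THRESH_BINARY)
    # Template matching on the segmented image to find a human shape.
    scores = cv2.matchTemplate(mask, human_template, cv2.TM_CCOEFF_NORMED)
    _, best_score, _, _ = cv2.minMaxLoc(scores)
    if best_score < 0.5:
        return None  # no convincing human shape in this frame
    # Position and orientation from the dominant foreground blob.
    contours, _ = cv2.findContours(mask, cv2.RETR_EXTERNAL,
                                   cv2.CHAIN_APPROX_SIMPLE)
    if not contours:
        return None
    blob = max(contours, key=cv2.contourArea)
    (cx, cy), _, angle = cv2.minAreaRect(blob)
    # A near-horizontal long axis suggests a lying, non-upright posture,
    # which feeds the potential-fall determination.
    return (int(cx), int(cy)), angle
```

Matching on the binary mask rather than the raw frame keeps the comparison insensitive to clothing and lighting, at the cost of the background model needing a few frames to warm up.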
29. A system comprising:
at least one processor; and at least one memory coupled to the at least one processor having stored thereon instructions which, when executed by the at least one processor, cause the at least one processor to perform operations comprising:
monitoring output from at least one sensor configured to sense activity associated with a fall;
based on the monitoring of output from the at least one sensor, determining to capture one or more images of an area associated with the at least one sensor;
based on the determination to capture one or more images of the area associated with the at least one sensor, capturing, with a camera, an image of the area associated with the at least one sensor;
detecting, in the captured image, a state of a person included in the captured image;
determining, based on the detected state of the person, a potential fall event in the area associated with the at least one sensor; and based on the determination of the potential fall event, sending, by a communication device and to a device associated with the person involved in the potential fall event, a message that indicates the potential fall event and provides the person with an opportunity to cancel the potential fall event.
30. The system of claim 29, wherein the operations further comprise:
determining that the person has not cancelled the potential fall event within a threshold period of time; and based on determining that the person has not cancelled the potential fall event within the threshold period of time, sending a message to a monitoring server indicating the potential fall event.
31. The system of claim 29, wherein the operations further comprise:
receiving, from the person, an indication to cancel the potential fall event; and based on receiving the indication to cancel the potential fall event, determining an overall activity of the person between detecting the potential fall event and receiving the indication to cancel the potential fall event.
32. The system of claim 31, wherein the operations further comprise:
determining that the overall activity of the person is above a threshold level of activity; and based on determining that the overall activity of the person is above the threshold level of activity, signaling that the potential fall event was detection of a false fall.
33. The system of claim 31, wherein the operations further comprise:
determining that the overall activity of the person is below a threshold level of activity; and based on determining that the overall activity of the person is below the threshold level of activity, determining an orientation of the person.
34. The system of claim 33, wherein the operations further comprise:
determining that the determined orientation of the person is upright; and based on determining that the determined orientation of the person is upright, signaling that the potential fall event was detection of a minor fall.
35. The system of claim 33, wherein the operations further comprise:
determining that the determined orientation of the person is not upright; and based on determining that the determined orientation of the person is not upright, sending another message to the person that provides the person with another opportunity to cancel the potential fall event.
36. The system of claim 35, wherein the operations further comprise:
determining that the person has not cancelled the potential fall event within a threshold period of time after sending another message to the person that provides the person with another opportunity to cancel the potential fall event; and based on determining that the person has not cancelled the potential fall event within the threshold period of time after sending another message to the person that provides the person with another opportunity to cancel the potential fall event, sending a message to a monitoring server indicating the potential fall event.
37. The system of claim 35, wherein the operations further comprise:
after sending another message to the person that provides the person with another opportunity to cancel the potential fall event, receiving, from the person, an indication to cancel the potential fall event; and based on receiving the indication to cancel the potential fall event, signaling that the potential fall event was a cancelled fall event.
38. The system of claim 29:
wherein detecting, in the captured image, the state of the person included in the captured image comprises:
performing image foreground segmentation on the captured image to create a segmented image, performing template matching on the segmented image to identify a human shape in the segmented image, and calculating a position and an orientation associated with the identified human shape in the segmented image, and wherein determining the potential fall event in the area associated with the at least one sensor comprises determining a potential fall event for the person based on the calculated position and the calculated orientation.
Priority Applications (1)
Application Number | Priority Date | Filing Date | Title |
---|---|---|---|
CA3090537A CA3090537A1 (en) | 2011-04-04 | 2012-04-04 | Fall detection and reporting technology |
Applications Claiming Priority (2)
Application Number | Priority Date | Filing Date | Title |
---|---|---|---|
US201161471495P | 2011-04-04 | 2011-04-04 | |
US61/471,495 | 2011-04-04 |
Related Child Applications (1)
Application Number | Priority Date | Filing Date | Title |
---|---|---|---|
CA3090537A Division CA3090537A1 (en) | 2011-04-04 | 2012-04-04 | Fall detection and reporting technology |
Publications (2)
Publication Number | Publication Date |
---|---|
CA2773507A1 CA2773507A1 (en) | 2012-10-04 |
CA2773507C true CA2773507C (en) | 2020-10-13 |
Family
ID=46964797
Family Applications (3)
Application Number | Priority Date | Filing Date | Title |
---|---|---|---|
CA3177719A Pending CA3177719A1 (en) | 2011-04-04 | 2012-04-04 | Fall detection and reporting technology |
CA2773507A Active CA2773507C (en) | 2011-04-04 | 2012-04-04 | Fall detection and reporting technology |
CA3090537A Pending CA3090537A1 (en) | 2011-04-04 | 2012-04-04 | Fall detection and reporting technology |
Family Applications Before (1)
Application Number | Priority Date | Filing Date | Title |
---|---|---|---|
CA3177719A Pending CA3177719A1 (en) | 2011-04-04 | 2012-04-04 | Fall detection and reporting technology |
Family Applications After (1)
Application Number | Priority Date | Filing Date | Title |
---|---|---|---|
CA3090537A Pending CA3090537A1 (en) | 2011-04-04 | 2012-04-04 | Fall detection and reporting technology |
Country Status (2)
Country | Link |
---|---|
US (7) | US8675920B2 (en) |
CA (3) | CA3177719A1 (en) |
Families Citing this family (261)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
US9318012B2 (en) * | 2003-12-12 | 2016-04-19 | Steve Gail Johnson | Noise correcting patient fall risk state system and method for predicting patient falls |
US10339791B2 (en) | 2007-06-12 | 2019-07-02 | Icontrol Networks, Inc. | Security network integrated with premise security system |
US10522026B2 (en) | 2008-08-11 | 2019-12-31 | Icontrol Networks, Inc. | Automation system user interface with three-dimensional display |
US10237237B2 (en) | 2007-06-12 | 2019-03-19 | Icontrol Networks, Inc. | Communication protocols in integrated systems |
GB2428821B (en) | 2004-03-16 | 2008-06-04 | Icontrol Networks Inc | Premises management system |
US20160065414A1 (en) | 2013-06-27 | 2016-03-03 | Ken Sundermeyer | Control system user interface |
US11244545B2 (en) | 2004-03-16 | 2022-02-08 | Icontrol Networks, Inc. | Cross-client sensor user interface in an integrated security network |
US9141276B2 (en) | 2005-03-16 | 2015-09-22 | Icontrol Networks, Inc. | Integrated interface for mobile device |
US11489812B2 (en) | 2004-03-16 | 2022-11-01 | Icontrol Networks, Inc. | Forming a security network including integrated security system components and network devices |
US9531593B2 (en) | 2007-06-12 | 2016-12-27 | Icontrol Networks, Inc. | Takeover processes in security network integrated with premise security system |
US11811845B2 (en) | 2004-03-16 | 2023-11-07 | Icontrol Networks, Inc. | Communication protocols over internet protocol (IP) networks |
US10200504B2 (en) | 2007-06-12 | 2019-02-05 | Icontrol Networks, Inc. | Communication protocols over internet protocol (IP) networks |
US8635350B2 (en) | 2006-06-12 | 2014-01-21 | Icontrol Networks, Inc. | IP device discovery systems and methods |
US10142392B2 (en) | 2007-01-24 | 2018-11-27 | Icontrol Networks, Inc. | Methods and systems for improved system performance |
US11159484B2 (en) | 2004-03-16 | 2021-10-26 | Icontrol Networks, Inc. | Forming a security network including integrated security system components and network devices |
US7711796B2 (en) | 2006-06-12 | 2010-05-04 | Icontrol Networks, Inc. | Gateway registry methods and systems |
US11916870B2 (en) | 2004-03-16 | 2024-02-27 | Icontrol Networks, Inc. | Gateway registry methods and systems |
US10156959B2 (en) | 2005-03-16 | 2018-12-18 | Icontrol Networks, Inc. | Cross-client sensor user interface in an integrated security network |
US11582065B2 (en) | 2007-06-12 | 2023-02-14 | Icontrol Networks, Inc. | Systems and methods for device communication |
US10721087B2 (en) | 2005-03-16 | 2020-07-21 | Icontrol Networks, Inc. | Method for networked touchscreen with integrated interfaces |
US9729342B2 (en) | 2010-12-20 | 2017-08-08 | Icontrol Networks, Inc. | Defining and implementing sensor triggered response rules |
US20090077623A1 (en) | 2005-03-16 | 2009-03-19 | Marc Baum | Security Network Integrating Security System and Network Devices |
US11677577B2 (en) | 2004-03-16 | 2023-06-13 | Icontrol Networks, Inc. | Premises system management using status signal |
US11277465B2 (en) | 2004-03-16 | 2022-03-15 | Icontrol Networks, Inc. | Generating risk profile using data of home monitoring and security system |
US11343380B2 (en) | 2004-03-16 | 2022-05-24 | Icontrol Networks, Inc. | Premises system automation |
US11113950B2 (en) | 2005-03-16 | 2021-09-07 | Icontrol Networks, Inc. | Gateway integrated with premises security system |
US12063220B2 (en) | 2004-03-16 | 2024-08-13 | Icontrol Networks, Inc. | Communication protocols in integrated systems |
US20170118037A1 (en) | 2008-08-11 | 2017-04-27 | Icontrol Networks, Inc. | Integrated cloud system for premises automation |
US11368429B2 (en) | 2004-03-16 | 2022-06-21 | Icontrol Networks, Inc. | Premises management configuration and control |
US11316958B2 (en) | 2008-08-11 | 2022-04-26 | Icontrol Networks, Inc. | Virtual device systems and methods |
US11201755B2 (en) | 2004-03-16 | 2021-12-14 | Icontrol Networks, Inc. | Premises system management using status signal |
US20110128378A1 (en) | 2005-03-16 | 2011-06-02 | Reza Raji | Modular Electronic Display Platform |
US11496568B2 (en) | 2005-03-16 | 2022-11-08 | Icontrol Networks, Inc. | Security system with networked touchscreen |
US10999254B2 (en) | 2005-03-16 | 2021-05-04 | Icontrol Networks, Inc. | System for data routing in networks |
US9306809B2 (en) | 2007-06-12 | 2016-04-05 | Icontrol Networks, Inc. | Security system with networked touchscreen |
US11700142B2 (en) | 2005-03-16 | 2023-07-11 | Icontrol Networks, Inc. | Security network integrating security system and network devices |
US11615697B2 (en) | 2005-03-16 | 2023-03-28 | Icontrol Networks, Inc. | Premise management systems and methods |
US20170180198A1 (en) | 2008-08-11 | 2017-06-22 | Marc Baum | Forming a security network including integrated security system components |
US9189934B2 (en) * | 2005-09-22 | 2015-11-17 | Rsi Video Technologies, Inc. | Security monitoring with programmable mapping |
US12063221B2 (en) | 2006-06-12 | 2024-08-13 | Icontrol Networks, Inc. | Activation of gateway device |
US10079839B1 (en) | 2007-06-12 | 2018-09-18 | Icontrol Networks, Inc. | Activation of gateway device |
US11706279B2 (en) | 2007-01-24 | 2023-07-18 | Icontrol Networks, Inc. | Methods and systems for data communication |
US7633385B2 (en) | 2007-02-28 | 2009-12-15 | Ucontrol, Inc. | Method and system for communicating with and controlling an alarm system from a remote server |
US8451986B2 (en) | 2007-04-23 | 2013-05-28 | Icontrol Networks, Inc. | Method and system for automatically providing alternate network access for telecommunications |
US11646907B2 (en) | 2007-06-12 | 2023-05-09 | Icontrol Networks, Inc. | Communication protocols in integrated systems |
US11601810B2 (en) | 2007-06-12 | 2023-03-07 | Icontrol Networks, Inc. | Communication protocols in integrated systems |
US11218878B2 (en) | 2007-06-12 | 2022-01-04 | Icontrol Networks, Inc. | Communication protocols in integrated systems |
US11423756B2 (en) | 2007-06-12 | 2022-08-23 | Icontrol Networks, Inc. | Communication protocols in integrated systems |
US11237714B2 (en) | 2007-06-12 | 2022-02-01 | Icontrol Networks, Inc. | Control system user interface |
US10523689B2 (en) | 2007-06-12 | 2019-12-31 | Icontrol Networks, Inc. | Communication protocols over internet protocol (IP) networks |
US11089122B2 (en) | 2007-06-12 | 2021-08-10 | Icontrol Networks, Inc. | Controlling data routing among networks |
US11316753B2 (en) | 2007-06-12 | 2022-04-26 | Icontrol Networks, Inc. | Communication protocols in integrated systems |
US11212192B2 (en) | 2007-06-12 | 2021-12-28 | Icontrol Networks, Inc. | Communication protocols in integrated systems |
US12003387B2 (en) | 2012-06-27 | 2024-06-04 | Comcast Cable Communications, Llc | Control system user interface |
US10223903B2 (en) | 2010-09-28 | 2019-03-05 | Icontrol Networks, Inc. | Integrated security system with parallel processing architecture |
US11831462B2 (en) | 2007-08-24 | 2023-11-28 | Icontrol Networks, Inc. | Controlling data routing in premises management systems |
US9936143B2 (en) | 2007-10-31 | 2018-04-03 | Google Technology Holdings LLC | Imager module with electronic shutter |
US11916928B2 (en) | 2008-01-24 | 2024-02-27 | Icontrol Networks, Inc. | Communication protocols over internet protocol (IP) networks |
US20170185278A1 (en) | 2008-08-11 | 2017-06-29 | Icontrol Networks, Inc. | Automation system user interface |
US11792036B2 (en) | 2008-08-11 | 2023-10-17 | Icontrol Networks, Inc. | Mobile premises automation platform |
US11258625B2 (en) | 2008-08-11 | 2022-02-22 | Icontrol Networks, Inc. | Mobile premises automation platform |
US11758026B2 (en) | 2008-08-11 | 2023-09-12 | Icontrol Networks, Inc. | Virtual device systems and methods |
US10530839B2 (en) | 2008-08-11 | 2020-01-07 | Icontrol Networks, Inc. | Integrated cloud system with lightweight gateway for premises automation |
US11729255B2 (en) | 2008-08-11 | 2023-08-15 | Icontrol Networks, Inc. | Integrated cloud system with lightweight gateway for premises automation |
US8638211B2 (en) | 2009-04-30 | 2014-01-28 | Icontrol Networks, Inc. | Configurable controller and interface for home SMA, phone and multimedia |
AU2011250886A1 (en) | 2010-05-10 | 2013-01-10 | Icontrol Networks, Inc | Control system user interface |
US8836467B1 (en) | 2010-09-28 | 2014-09-16 | Icontrol Networks, Inc. | Method, system and apparatus for automated reporting of account and sensor zone information to a central station |
US11750414B2 (en) | 2010-12-16 | 2023-09-05 | Icontrol Networks, Inc. | Bidirectional security sensor communication for a premises security system |
US9147337B2 (en) | 2010-12-17 | 2015-09-29 | Icontrol Networks, Inc. | Method and system for logging security event data |
CA3177719A1 (en) | 2011-04-04 | 2012-10-04 | Alarm.Com Incorporated | Fall detection and reporting technology |
US20130127620A1 (en) | 2011-06-20 | 2013-05-23 | Cerner Innovation, Inc. | Management of patient fall risk |
US10546481B2 (en) | 2011-07-12 | 2020-01-28 | Cerner Innovation, Inc. | Method for determining whether an individual leaves a prescribed virtual perimeter |
US9367770B2 (en) | 2011-08-30 | 2016-06-14 | Digimarc Corporation | Methods and arrangements for identifying objects |
US9571723B2 (en) * | 2011-11-18 | 2017-02-14 | National Science Foundation | Automatic detection by a wearable camera |
BR112014017784A8 (en) * | 2012-01-18 | 2017-07-11 | Nike Innovate Cv | ACTIVITY AND INACTIVITY MONITORING |
US9652960B2 (en) * | 2012-05-02 | 2017-05-16 | Koninklijke Philips N.V. | Device and method for routing a medical alert to a selected staff member |
US9392322B2 (en) | 2012-05-10 | 2016-07-12 | Google Technology Holdings LLC | Method of visually synchronizing differing camera feeds with common subject |
US9881474B2 (en) | 2012-09-21 | 2018-01-30 | Google Llc | Initially detecting a visitor at a smart-home |
US9960929B2 (en) | 2012-09-21 | 2018-05-01 | Google Llc | Environmental sensing with a doorbell at a smart-home |
US9978238B2 (en) | 2012-09-21 | 2018-05-22 | Google Llc | Visitor options at an entryway to a smart-home |
US9959727B2 (en) | 2012-09-21 | 2018-05-01 | Google Llc | Handling visitor interaction at a smart-home in a do not disturb mode |
US10332059B2 (en) | 2013-03-14 | 2019-06-25 | Google Llc | Security scoring in a smart-sensored home |
US9953514B2 (en) | 2012-09-21 | 2018-04-24 | Google Llc | Visitor feedback to visitor interaction with a doorbell at a smart-home |
US10735216B2 (en) | 2012-09-21 | 2020-08-04 | Google Llc | Handling security services visitor at a smart-home |
GB2506885B (en) * | 2012-10-10 | 2017-04-12 | Read Dale | Occupancy sensor |
US9538158B1 (en) | 2012-10-16 | 2017-01-03 | Ocuvera LLC | Medical environment monitoring system |
US11570421B1 (en) | 2012-10-16 | 2023-01-31 | Ocuvera, LLC | Medical environment monitoring system |
US10229491B1 (en) | 2012-10-16 | 2019-03-12 | Ocuvera LLC | Medical environment monitoring system |
US10229489B1 (en) | 2012-10-16 | 2019-03-12 | Ocuvera LLC | Medical environment monitoring system |
US9798302B2 (en) | 2013-02-27 | 2017-10-24 | Rockwell Automation Technologies, Inc. | Recognition-based industrial automation control with redundant system input support |
US9498885B2 (en) | 2013-02-27 | 2016-11-22 | Rockwell Automation Technologies, Inc. | Recognition-based industrial automation control with confidence-based decision support |
US9804576B2 (en) * | 2013-02-27 | 2017-10-31 | Rockwell Automation Technologies, Inc. | Recognition-based industrial automation control with position and derivative decision reference |
US9393695B2 (en) | 2013-02-27 | 2016-07-19 | Rockwell Automation Technologies, Inc. | Recognition-based industrial automation control with person and object discrimination |
JP6171415B2 (en) * | 2013-03-06 | 2017-08-02 | ノーリツプレシジョン株式会社 | Information processing apparatus, information processing method, and program |
JP6448626B2 (en) | 2013-06-06 | 2019-01-09 | コーニンクレッカ フィリップス エヌ ヴェKoninklijke Philips N.V. | Fall detection system and method |
US10182757B2 (en) * | 2013-07-22 | 2019-01-22 | The Rockefeller University | System and method for optical detection of skin disease |
US9472067B1 (en) | 2013-07-23 | 2016-10-18 | Rsi Video Technologies, Inc. | Security devices and related features |
WO2015020975A1 (en) | 2013-08-05 | 2015-02-12 | Ameer Sami | System and method for automating electrical devices at a building structure |
US10515309B1 (en) * | 2013-09-20 | 2019-12-24 | Amazon Technologies, Inc. | Weight based assistance determination |
US10664795B1 (en) | 2013-09-20 | 2020-05-26 | Amazon Technologies, Inc. | Weight based item tracking |
WO2015069124A1 (en) * | 2013-11-08 | 2015-05-14 | Performance Lab Technologies Limited | Automated prescription of activity based on physical activity data |
US10037821B2 (en) * | 2013-12-27 | 2018-07-31 | General Electric Company | System for integrated protocol and decision support |
US9729833B1 (en) | 2014-01-17 | 2017-08-08 | Cerner Innovation, Inc. | Method and system for determining whether an individual takes appropriate measures to prevent the spread of healthcare-associated infections along with centralized monitoring |
US10225522B1 (en) | 2014-01-17 | 2019-03-05 | Cerner Innovation, Inc. | Method and system for determining whether an individual takes appropriate measures to prevent the spread of healthcare-associated infections |
US9651656B2 (en) | 2014-02-28 | 2017-05-16 | Tyco Fire & Security Gmbh | Real-time location system in wireless sensor network |
US10268485B2 (en) | 2014-02-28 | 2019-04-23 | Tyco Fire & Security Gmbh | Constrained device and supporting operating system |
US10878323B2 (en) | 2014-02-28 | 2020-12-29 | Tyco Fire & Security Gmbh | Rules engine combined with message routing |
US11146637B2 (en) * | 2014-03-03 | 2021-10-12 | Icontrol Networks, Inc. | Media content management |
US11405463B2 (en) * | 2014-03-03 | 2022-08-02 | Icontrol Networks, Inc. | Media content management |
US9405770B2 (en) * | 2014-03-10 | 2016-08-02 | Google Inc. | Three dimensional navigation among photos |
US9357127B2 (en) | 2014-03-18 | 2016-05-31 | Google Technology Holdings LLC | System for auto-HDR capture decision making |
US10043369B2 (en) * | 2014-03-20 | 2018-08-07 | Better Alerts, LLC | System and method for sending medical emergency alerts |
US10657411B1 (en) | 2014-03-25 | 2020-05-19 | Amazon Technologies, Inc. | Item identification |
US10713614B1 (en) | 2014-03-25 | 2020-07-14 | Amazon Technologies, Inc. | Weight and vision based item tracking |
US9729784B2 (en) | 2014-05-21 | 2017-08-08 | Google Technology Holdings LLC | Enhanced image capture |
US9412255B1 (en) * | 2014-05-21 | 2016-08-09 | West Corporation | Remote monitoring of activity triggered sensors and a customized updating application |
US9813611B2 (en) | 2014-05-21 | 2017-11-07 | Google Technology Holdings LLC | Enhanced image capture |
US9774779B2 (en) | 2014-05-21 | 2017-09-26 | Google Technology Holdings LLC | Enhanced image capture |
US9628702B2 (en) | 2014-05-21 | 2017-04-18 | Google Technology Holdings LLC | Enhanced image capture |
US9349268B2 (en) * | 2014-06-08 | 2016-05-24 | Cornelius Tillman | TNT-medical alert system |
US9413947B2 (en) * | 2014-07-31 | 2016-08-09 | Google Technology Holdings LLC | Capturing images of active subjects according to activity profiles |
US20160165387A1 (en) * | 2014-08-26 | 2016-06-09 | Hoang Nhu | Smart home platform with data analytics for monitoring and related methods |
US11580439B1 (en) * | 2014-09-10 | 2023-02-14 | Dp Technologies, Inc. | Fall identification system |
US9654700B2 (en) | 2014-09-16 | 2017-05-16 | Google Technology Holdings LLC | Computational camera using fusion of image sensors |
WO2016042498A1 (en) * | 2014-09-16 | 2016-03-24 | Hip Hope Technologies Ltd. | Fall detection device and method |
US10356303B1 (en) | 2014-10-07 | 2019-07-16 | State Farm Mutual Automobile Insurance Company | Systems and methods for controlling smart devices based upon image data from image sensors |
JP5866551B1 (en) * | 2014-11-20 | 2016-02-17 | パナソニックIpマネジメント株式会社 | Monitoring system and monitoring method in monitoring system |
JP5866540B1 (en) * | 2014-11-21 | 2016-02-17 | パナソニックIpマネジメント株式会社 | Monitoring system and monitoring method in monitoring system |
EP3230970A1 (en) * | 2014-12-10 | 2017-10-18 | Koninklijke Philips N.V. | System and method for fall detection |
US10090068B2 (en) | 2014-12-23 | 2018-10-02 | Cerner Innovation, Inc. | Method and system for determining whether a monitored individual's hand(s) have entered a virtual safety zone |
WO2016101065A1 (en) * | 2014-12-23 | 2016-06-30 | Q-Links Home Automation Inc. | Method and system for determination of false alarm |
US10524722B2 (en) | 2014-12-26 | 2020-01-07 | Cerner Innovation, Inc. | Method and system for determining whether a caregiver takes appropriate measures to prevent patient bedsores |
DK3041321T3 (en) * | 2015-01-05 | 2018-07-09 | Schreder | Method for marking illuminators, control device and illuminators |
US10586114B2 (en) | 2015-01-13 | 2020-03-10 | Vivint, Inc. | Enhanced doorbell camera interactions |
US10635907B2 (en) * | 2015-01-13 | 2020-04-28 | Vivint, Inc. | Enhanced doorbell camera interactions |
US10133935B2 (en) | 2015-01-13 | 2018-11-20 | Vivint, Inc. | Doorbell camera early detection |
US10347108B2 (en) * | 2015-01-16 | 2019-07-09 | City University Of Hong Kong | Monitoring user activity using wearable motion sensing device |
US20180122209A1 (en) * | 2015-04-01 | 2018-05-03 | Smartcare Consultants, Llc | System for determining behavioral patterns and deviations from determined behavioral patterns |
TW201636961A (en) * | 2015-04-07 | 2016-10-16 | Amaryllo International Inc | Emergency reporting device and system thereof |
US10342478B2 (en) | 2015-05-07 | 2019-07-09 | Cerner Innovation, Inc. | Method and system for determining whether a caretaker takes appropriate measures to prevent patient bedsores |
US9892611B1 (en) | 2015-06-01 | 2018-02-13 | Cerner Innovation, Inc. | Method for determining whether an individual enters a prescribed virtual zone using skeletal tracking and 3D blob detection |
EP3309748A4 (en) * | 2015-06-10 | 2018-06-06 | Konica Minolta, Inc. | Image processing system, image processing device, image processing method, and image processing program |
US11864926B2 (en) | 2015-08-28 | 2024-01-09 | Foresite Healthcare, Llc | Systems and methods for detecting attempted bed exit |
CN113367671A (en) * | 2015-08-31 | 2021-09-10 | 梅西莫股份有限公司 | Wireless patient monitoring system and method |
WO2017049188A1 (en) * | 2015-09-17 | 2017-03-23 | Luvozo Pbc | Automated environment hazard detection |
CN105118236B (en) * | 2015-09-25 | 2018-08-28 | 广东乐源数字技术有限公司 | Paralysis falls to monitor and preventing mean and its processing method |
GB201518050D0 (en) * | 2015-10-12 | 2015-11-25 | Binatone Electronics Internat Ltd | Home monitoring and control systems |
US9922524B2 (en) | 2015-10-30 | 2018-03-20 | Blue Willow Systems, Inc. | Methods for detecting and handling fall and perimeter breach events for residents of an assisted living facility |
US10096383B2 (en) | 2015-11-24 | 2018-10-09 | International Business Machines Corporation | Performing a health analysis using a smart floor mat |
KR102638748B1 (en) * | 2015-12-04 | 2024-02-20 | 삼성전자 주식회사 | Apparatus and method for managing device using at least one sensor |
CN108475461A (en) * | 2015-12-30 | 2018-08-31 | 3M创新有限公司 | Electronics fall event communication system |
US10878220B2 (en) | 2015-12-31 | 2020-12-29 | Cerner Innovation, Inc. | Methods and systems for assigning locations to devices |
US10147296B2 (en) * | 2016-01-12 | 2018-12-04 | Fallcall Solutions, Llc | System for detecting falls and discriminating the severity of falls |
US11064912B2 (en) * | 2016-01-26 | 2021-07-20 | Climax Technology Co., Ltd. | Fall sensor |
FR3047342B1 (en) * | 2016-02-03 | 2020-12-11 | Metaleo | PERSONAL BED EXIT DETECTION DEVICE |
US10641013B2 (en) | 2016-02-16 | 2020-05-05 | Go Lock Technology, Inc. | Portable lock with integrity sensors |
WO2017146643A1 (en) * | 2016-02-23 | 2017-08-31 | Apeiron Technology Pte Ltd | A patient monitoring system |
US10489661B1 (en) | 2016-03-08 | 2019-11-26 | Ocuvera LLC | Medical environment monitoring system |
US20170270462A1 (en) | 2016-03-16 | 2017-09-21 | Triax Technologies, Inc. | System and interfaces for managing workplace events |
US11170616B2 (en) | 2016-03-16 | 2021-11-09 | Triax Technologies, Inc. | System and interfaces for managing workplace events |
WO2017160812A1 (en) * | 2016-03-16 | 2017-09-21 | Triax Technologies, Inc. | System and interfaces for managing workplace events |
US10769562B2 (en) | 2016-03-16 | 2020-09-08 | Triax Technologies, Inc. | Sensor based system and method for authorizing operation of worksite equipment using a locally stored access control list |
US11810032B2 (en) | 2016-03-16 | 2023-11-07 | Triax Technologies, Inc. | Systems and methods for low-energy wireless applications using networked wearable sensors |
US10430817B2 (en) | 2016-04-15 | 2019-10-01 | Walmart Apollo, Llc | Partiality vector refinement systems and methods through sample probing |
US10614504B2 (en) | 2016-04-15 | 2020-04-07 | Walmart Apollo, Llc | Systems and methods for providing content-based product recommendations |
WO2017180977A1 (en) | 2016-04-15 | 2017-10-19 | Wal-Mart Stores, Inc. | Systems and methods for facilitating shopping in a physical retail facility |
US10674953B2 (en) * | 2016-04-20 | 2020-06-09 | Welch Allyn, Inc. | Skin feature imaging system |
WO2017223339A1 (en) * | 2016-06-23 | 2017-12-28 | Mayo Foundation For Medical Education And Research | Proximity based fall and distress detection systems and methods |
TWI585283B * | 2016-06-24 | 2017-06-01 | | Blue door lock system with emergency notification function and its operation method |
US10373464B2 (en) | 2016-07-07 | 2019-08-06 | Walmart Apollo, Llc | Apparatus and method for updating partiality vectors based on monitoring of person and his or her home |
CA3029996A1 (en) | 2016-07-07 | 2018-01-11 | Walmart Apollo, Llc | Method and apparatus for monitoring person and home |
JP6983866B2 (en) | 2016-08-08 | 2021-12-17 | コーニンクレッカ フィリップス エヌ ヴェKoninklijke Philips N.V. | Devices, systems, and methods for fall detection |
CA3039828A1 (en) | 2016-10-12 | 2018-04-19 | Koninklijke Philips N.V. | Method and apparatus for determining a fall risk |
JP6725411B2 (en) * | 2016-12-27 | 2020-07-15 | 積水化学工業株式会社 | Behavior evaluation device, behavior evaluation method |
US10600204B1 (en) | 2016-12-28 | 2020-03-24 | Ocuvera | Medical environment bedsore detection and prevention system |
US10671925B2 (en) * | 2016-12-28 | 2020-06-02 | Intel Corporation | Cloud-assisted perceptual computing analytics |
US10147184B2 (en) | 2016-12-30 | 2018-12-04 | Cerner Innovation, Inc. | Seizure detection |
KR101860062B1 (en) * | 2017-01-06 | 2018-05-23 | 한국과학기술원 | Fall detection system and method |
SE541712C2 (en) * | 2017-02-22 | 2019-12-03 | Next Step Dynamics Ab | Method and apparatus for health prediction |
CN106875629B (en) * | 2017-03-07 | 2023-02-10 | 吉林省家人帮信息服务有限公司 | Home-based endowment system based on somatosensory network and control method thereof |
US10878342B2 (en) * | 2017-03-30 | 2020-12-29 | Intel Corporation | Cloud assisted machine learning |
KR102391683B1 (en) * | 2017-04-24 | 2022-04-28 | 엘지전자 주식회사 | An audio device and method for controlling the same |
US10325471B1 (en) | 2017-04-28 | 2019-06-18 | BlueOwl, LLC | Systems and methods for detecting a medical emergency event |
US10304302B2 (en) | 2017-04-28 | 2019-05-28 | Arlo Technologies, Inc. | Electronic monitoring system using push notifications |
US11044445B2 (en) | 2017-05-05 | 2021-06-22 | VergeSense, Inc. | Method for monitoring occupancy in a work area |
US10742940B2 (en) | 2017-05-05 | 2020-08-11 | VergeSense, Inc. | Method for monitoring occupancy in a work area |
US11810664B2 (en) * | 2017-05-09 | 2023-11-07 | Reliant Mso, Llc | Patient treatment status notification system |
KR102013935B1 (en) * | 2017-05-25 | 2019-08-23 | 삼성전자주식회사 | Method and system for detecting a dangerous situation |
EP3422315B1 (en) | 2017-06-28 | 2019-08-14 | Koninklijke Philips N.V. | Method and apparatus for providing feedback to a user about a fall risk |
US11114200B2 (en) * | 2017-07-07 | 2021-09-07 | Careview Communications, Inc. | Smart monitoring safety system using sensors |
US10055961B1 (en) * | 2017-07-10 | 2018-08-21 | Careview Communications, Inc. | Surveillance system and method for predicting patient falls using motion feature patterns |
US20190057190A1 (en) * | 2017-08-16 | 2019-02-21 | Wipro Limited | Method and system for providing context based medical instructions to a patient |
KR102481883B1 (en) * | 2017-09-27 | 2022-12-27 | 삼성전자주식회사 | Method and apparatus for detecting a dangerous situation |
US10629048B2 (en) | 2017-09-29 | 2020-04-21 | Apple Inc. | Detecting falls using a mobile device |
US11282361B2 (en) | 2017-09-29 | 2022-03-22 | Apple Inc. | Detecting falls using a mobile device |
US11527140B2 (en) * | 2017-09-29 | 2022-12-13 | Apple Inc. | Detecting falls using a mobile device |
US11282362B2 (en) | 2017-09-29 | 2022-03-22 | Apple Inc. | Detecting falls using a mobile device |
US11282363B2 (en) | 2017-09-29 | 2022-03-22 | Apple Inc. | Detecting falls using a mobile device |
US11039084B2 (en) * | 2017-11-14 | 2021-06-15 | VergeSense, Inc. | Method for commissioning a network of optical sensors across a floor space |
JP6878260B2 (en) * | 2017-11-30 | 2021-05-26 | パラマウントベッド株式会社 | Abnormality judgment device, program |
US10643446B2 (en) | 2017-12-28 | 2020-05-05 | Cerner Innovation, Inc. | Utilizing artificial intelligence to detect objects or patient safety events in a patient room |
US10482321B2 (en) | 2017-12-29 | 2019-11-19 | Cerner Innovation, Inc. | Methods and systems for identifying the crossing of a virtual barrier |
WO2019152447A1 (en) * | 2018-01-31 | 2019-08-08 | David Hold | Preventive care platform for interactive patient monitoring |
US11232694B2 (en) * | 2018-03-14 | 2022-01-25 | Safely You Inc. | System and method for detecting, recording and communicating events in the care and treatment of cognitively impaired persons |
GB2572412B (en) * | 2018-03-29 | 2021-03-10 | 270 Vision Ltd | Sensor apparatus |
US10825318B1 (en) | 2018-04-09 | 2020-11-03 | State Farm Mutual Automobile Insurance Company | Sensing peripheral heuristic evidence, reinforcement, and engagement system |
US11908581B2 (en) | 2018-04-10 | 2024-02-20 | Hill-Rom Services, Inc. | Patient risk assessment based on data from multiple sources in a healthcare facility |
US11504071B2 (en) | 2018-04-10 | 2022-11-22 | Hill-Rom Services, Inc. | Patient risk assessment based on data from multiple sources in a healthcare facility |
US10827951B2 (en) * | 2018-04-19 | 2020-11-10 | Careview Communications, Inc. | Fall detection using sensors in a smart monitoring safety system |
KR102449905B1 (en) * | 2018-05-11 | 2022-10-04 | 삼성전자주식회사 | Electronic device and method for controlling the electronic device thereof |
WO2020003705A1 (en) | 2018-06-26 | 2020-01-02 | コニカミノルタ株式会社 | Control program, report output method, and report output device |
EP3588458A1 (en) | 2018-06-29 | 2020-01-01 | Koninklijke Philips N.V. | A fall detection apparatus, a method of detecting a fall by a subject and a computer program product for implementing the method |
TWI679613B (en) * | 2018-07-02 | 2019-12-11 | 瀚誼世界科技股份有限公司 | Method for avoiding false alarm by non-fall detection, an apparatus for human fall detection thereof |
US10225492B1 (en) | 2018-07-23 | 2019-03-05 | Mp High Tech Solutions Pty Ltd. | User interfaces to configure a thermal imaging system |
CN108813834B (en) * | 2018-08-03 | 2020-09-29 | 歌尔科技有限公司 | Intelligent bracelet and method for judging motion state |
US10932970B2 (en) | 2018-08-27 | 2021-03-02 | Careview Communications, Inc. | Systems and methods for monitoring and controlling bed functions |
BR102018070596A2 (en) * | 2018-10-05 | 2020-04-22 | Techbalance Solucao Digital Para Reabilitacao Ltda | system and method for preventing and predicting the risk of postural fall |
US11210922B2 (en) | 2018-10-22 | 2021-12-28 | Tidi Products, Llc | Electronic fall monitoring system |
US10692346B2 (en) | 2018-10-22 | 2020-06-23 | Tidi Products, Llc | Electronic fall monitoring system |
US10878683B2 (en) * | 2018-11-01 | 2020-12-29 | Apple Inc. | Fall detection-audio looping |
US10922936B2 (en) | 2018-11-06 | 2021-02-16 | Cerner Innovation, Inc. | Methods and systems for detecting prohibited objects |
FR3090973B1 (en) * | 2018-12-21 | 2021-09-10 | Kapelse | Device and method for monitoring a situation within a volume |
US10916119B2 (en) | 2018-12-27 | 2021-02-09 | Hill-Rom Services, Inc. | System and method for caregiver availability determination |
US11179064B2 (en) * | 2018-12-30 | 2021-11-23 | Altum View Systems Inc. | Method and system for privacy-preserving fall detection |
US11751813B2 (en) | 2019-03-11 | 2023-09-12 | Celloscope Ltd. | System, method and computer program product for detecting a mobile phone user's risky medical condition |
EP3938975A4 (en) | 2019-03-15 | 2022-12-14 | Vergesense, Inc. | Arrival detection for battery-powered optical sensors |
WO2020201969A1 (en) * | 2019-03-29 | 2020-10-08 | University Health Network | System and method for remote patient monitoring |
US20220248970A1 (en) * | 2019-04-03 | 2022-08-11 | Starkey Laboratories, Inc. | Monitoring system and method of using same |
CN110084932B (en) * | 2019-04-23 | 2022-03-18 | 上海救要救信息科技有限公司 | Rescue method and device |
US11699528B2 (en) * | 2019-06-28 | 2023-07-11 | Hill-Rom Services, Inc. | Falls risk management |
US11894129B1 (en) | 2019-07-03 | 2024-02-06 | State Farm Mutual Automobile Insurance Company | Senior living care coordination platforms |
US11367527B1 (en) | 2019-08-19 | 2022-06-21 | State Farm Mutual Automobile Insurance Company | Senior living engagement and care support platforms |
CN112447023A (en) * | 2019-08-29 | 2021-03-05 | 奇酷互联网络科技(深圳)有限公司 | Abnormal condition reminding method and device and storage device |
US11620808B2 (en) | 2019-09-25 | 2023-04-04 | VergeSense, Inc. | Method for detecting human occupancy and activity in a work area |
US20220354387A1 (en) * | 2019-11-28 | 2022-11-10 | Nippon Telegraph And Telephone Corporation | Monitoring System, Monitoring Method, and Monitoring Program |
EP3828854A1 (en) | 2019-11-29 | 2021-06-02 | Koninklijke Philips N.V. | Fall detection method and system |
EP3828855A1 (en) * | 2019-11-29 | 2021-06-02 | Koninklijke Philips N.V. | Personalized fall detector |
EP3836105A1 (en) * | 2019-12-11 | 2021-06-16 | Koninklijke Philips N.V. | Fall detection |
US10916115B1 (en) | 2019-12-19 | 2021-02-09 | Jeff Daniel Grammer | Wearable device adapted for fall detection and transmission of automated notifications for emergency assistance |
CN111160179A (en) * | 2019-12-20 | 2020-05-15 | 南昌大学 | Tumble detection method based on head segmentation and convolutional neural network |
US11450192B2 (en) * | 2020-01-06 | 2022-09-20 | National Cheng Kung University | Fall detection system |
US11022495B1 (en) | 2020-03-06 | 2021-06-01 | Butlr Technologies, Inc. | Monitoring human location, trajectory and behavior using thermal data |
US12050133B2 (en) | 2020-03-06 | 2024-07-30 | Butlr Technologies, Inc. | Pose detection using thermal data |
CN111466903A (en) * | 2020-04-23 | 2020-07-31 | 杭州微萤科技有限公司 | Fall detection method and device |
US11436906B1 (en) * | 2020-05-18 | 2022-09-06 | Sidhya V Peddinti | Visitor detection, facial recognition, and alert system and processes for assisting memory-challenged patients to recognize entryway visitors |
CN111601088B (en) * | 2020-05-27 | 2021-12-21 | 大连成者科技有限公司 | Sitting posture monitoring system based on monocular camera sitting posture identification technology |
CN111739248B (en) * | 2020-06-11 | 2022-04-01 | 湖北美和易思教育科技有限公司 | Artificial intelligent Internet of things security system and control method |
US11282367B1 (en) * | 2020-08-16 | 2022-03-22 | Vuetech Health Innovations LLC | System and methods for safety, security, and well-being of individuals |
US11477614B2 (en) | 2020-10-26 | 2022-10-18 | Motorola Solutions, Inc. | Device, system and method for vertical location change notifications based on periodic location reports |
US11688516B2 (en) | 2021-01-19 | 2023-06-27 | State Farm Mutual Automobile Insurance Company | Alert systems for senior living engagement and care support platforms |
DE202021102454U1 (en) | 2021-04-22 | 2022-07-25 | Ion Cristea | Stationary emergency call unit, monitoring system and electronic data processing system |
US20230008703A1 (en) * | 2021-07-08 | 2023-01-12 | At&T Intellectual Property I, L.P. | Methods, systems, and devices for collaborative design of an equipment site |
US20230141862A1 (en) * | 2021-11-09 | 2023-05-11 | AvaSure, LLC | Predictive system for elopement detection |
CN114038162A (en) * | 2021-12-29 | 2022-02-11 | Shensi Electronic Technology Co., Ltd. | Vulnerable-user care and alarm method, device, and medium |
KR20240111793A (en) * | 2021-12-30 | 2024-07-17 | Google LLC | Fall risk assessment for users |
AU2023241553A1 (en) * | 2022-03-30 | 2024-10-17 | Butlr Technologies, Inc. | Pose detection using thermal data |
KR20240047778A (en) * | 2022-10-05 | 2024-04-12 | Jang Min-seok | Smart home emergency treatment device based on IoT camera |
US11922642B1 (en) * | 2023-01-30 | 2024-03-05 | SimpliSafe, Inc. | Methods and apparatus for detecting unrecognized moving objects |
DE102024104805A1 (en) | 2023-02-22 | 2024-08-22 | howRyou GmbH | User sensor monitoring data processing communication system and user sensor monitoring data processing communication method |
US11922669B1 (en) | 2023-07-31 | 2024-03-05 | SimpliSafe, Inc. | Object detection via regions of interest |
CN117423210B (en) * | 2023-12-19 | 2024-02-13 | The Affiliated Hospital of Southwest Medical University | Intelligent sensing alarm system for preventing patient falls in nursing care |
Family Cites Families (30)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
US5331990A (en) * | 1992-10-06 | 1994-07-26 | Hall H Eugene | Safety cane |
CA2495168A1 (en) * | 2002-08-08 | 2004-02-19 | Claire-Lise Boujon | Rescue and security device for swimming pools and amusement parks |
SE0203483D0 (en) * | 2002-11-21 | 2002-11-21 | Wespot Ab | Method and device for fall detection |
US7857771B2 (en) * | 2003-04-03 | 2010-12-28 | University Of Virginia Patent Foundation | Method and system for the derivation of human gait characteristics and detecting falls passively from floor vibrations |
US9318012B2 (en) * | 2003-12-12 | 2016-04-19 | Steve Gail Johnson | Noise correcting patient fall risk state system and method for predicting patient falls |
US9311540B2 (en) * | 2003-12-12 | 2016-04-12 | Careview Communications, Inc. | System and method for predicting patient falls |
US20100231506A1 (en) * | 2004-09-07 | 2010-09-16 | Timothy Pryor | Control of appliances, kitchen and home |
US7420472B2 (en) * | 2005-10-16 | 2008-09-02 | Bao Tran | Patient monitoring apparatus |
US7589637B2 (en) * | 2005-12-30 | 2009-09-15 | Healthsense, Inc. | Monitoring activity of an individual |
JP2008047097A (en) | 2006-06-28 | 2008-02-28 | Sysmex Corp | Patient abnormality notification system |
FR2906629B1 (en) * | 2006-09-29 | 2010-01-08 | Vigilio | Method and system for detecting abnormal situations of a person in a living environment |
US7961109B2 (en) * | 2006-12-04 | 2011-06-14 | Electronics And Telecommunications Research Institute | Fall detecting apparatus and method, and emergency aid system and method using the same |
US8217795B2 (en) * | 2006-12-05 | 2012-07-10 | John Carlton-Foss | Method and system for fall detection |
US7987069B2 (en) * | 2007-11-12 | 2011-07-26 | Bee Cave, Llc | Monitoring patient support exiting and initiating response |
US20090174565A1 (en) * | 2008-01-04 | 2009-07-09 | Aviton Care Limited | Fall detection system |
US8773269B2 (en) * | 2008-06-27 | 2014-07-08 | Neal T. RICHARDSON | Autonomous fall monitor |
ES2381712T3 (en) * | 2008-10-16 | 2012-05-30 | Koninklijke Philips Electronics N.V. | Fall detection system |
EP2347397B1 (en) * | 2008-10-17 | 2012-09-05 | Koninklijke Philips Electronics N.V. | A fall detection system and a method of operating a fall detection system |
US8972197B2 (en) * | 2009-09-15 | 2015-03-03 | Numera, Inc. | Method and system for analyzing breathing of a user |
EP2224706B1 (en) * | 2009-02-27 | 2013-11-06 | BlackBerry Limited | Mobile wireless communications device with orientation sensing and corresponding method for alerting a user of an impending call |
DE102009001565A1 (en) * | 2009-03-16 | 2010-09-23 | Robert Bosch Gmbh | Condition detection device for attachment to a living being |
US20100286567A1 (en) * | 2009-05-06 | 2010-11-11 | Andrew Wolfe | Elderly fall detection |
CA2765782C (en) * | 2009-06-24 | 2018-11-27 | The Medical Research, Infrastructure, And Health Services Fund Of The Tel Aviv Medical Center | Automated near-fall detector |
NZ580201A (en) * | 2009-10-06 | 2010-10-29 | Delloch Ltd | A protective device for protecting the hip area including an impact sensor and alarm |
US9058732B2 (en) * | 2010-02-25 | 2015-06-16 | Qualcomm Incorporated | Method and apparatus for enhanced indoor position location with assisted user profiles |
US20130135097A1 (en) * | 2010-07-29 | 2013-05-30 | J&M I.P. Holding Company, Llc | Fall-Responsive Emergency Device |
CA3177719A1 (en) | 2011-04-04 | 2012-10-04 | Alarm.Com Incorporated | Fall detection and reporting technology |
EP3030879A4 (en) * | 2013-08-09 | 2018-01-03 | CNRY Inc. | System and methods for monitoring an environment |
US11678144B2 (en) * | 2020-03-27 | 2023-06-13 | TraKid LLC | Real-time location and alert system |
US11570539B2 (en) * | 2021-03-20 | 2023-01-31 | International Business Machines Corporation | Safeguarding audio device based on detection of freefall or lost scenarios |
- 2012
  - 2012-04-04 CA CA3177719A patent/CA3177719A1/en active Pending
  - 2012-04-04 CA CA2773507A patent/CA2773507C/en active Active
  - 2012-04-04 CA CA3090537A patent/CA3090537A1/en active Pending
  - 2012-04-04 US US13/439,690 patent/US8675920B2/en active Active
- 2014
  - 2014-03-07 US US14/200,407 patent/US9036019B2/en active Active
- 2015
  - 2015-05-18 US US14/714,485 patent/US9495855B2/en active Active
- 2016
  - 2016-11-10 US US15/348,550 patent/US10037669B2/en active Active
- 2018
  - 2018-07-30 US US16/049,063 patent/US10825315B2/en active Active
- 2020
  - 2020-10-30 US US17/085,683 patent/US11328571B2/en active Active
- 2022
  - 2022-05-04 US US17/736,147 patent/US20220262224A1/en active Pending
Also Published As
Publication number | Publication date |
---|---|
CA3090537A1 (en) | 2012-10-04 |
US20210049887A1 (en) | 2021-02-18 |
CA2773507A1 (en) | 2012-10-04 |
US20140247335A1 (en) | 2014-09-04 |
US8675920B2 (en) | 2014-03-18 |
US11328571B2 (en) | 2022-05-10 |
US10037669B2 (en) | 2018-07-31 |
CA3177719A1 (en) | 2012-10-04 |
US20170061763A1 (en) | 2017-03-02 |
US9036019B2 (en) | 2015-05-19 |
US20220262224A1 (en) | 2022-08-18 |
US9495855B2 (en) | 2016-11-15 |
US10825315B2 (en) | 2020-11-03 |
US20120314901A1 (en) | 2012-12-13 |
US20150248825A1 (en) | 2015-09-03 |
US20180336773A1 (en) | 2018-11-22 |
Similar Documents
Publication | Title |
---|---|
US11328571B2 (en) | Fall detection and reporting technology |
US11881295B2 (en) | Medication management and reporting technology |
US10736582B2 (en) | Monitoring and tracking system, method, article and device |
US20210007631A1 (en) | Systems and methods for fall detection |
US20160307428A1 (en) | Remote monitoring system and related methods |
US11688265B1 (en) | System and methods for safety, security, and well-being of individuals |
US9710761B2 (en) | Method and apparatus for detection and prediction of events based on changes in behavior |
US20150302310A1 (en) | Methods for data collection and analysis for event detection |
US20210306797A1 (en) | Systems, methods and devices for determining social distancing compliance and exposure risks and for generating contagion alerts |
Sukreep et al. | Recognizing Falls, Daily Activities, and Health Monitoring by Smart Devices |
Kaluža et al. | A multi-agent system for remote eldercare |
Kutzik et al. | Technological tools of the future |
Ranjan et al. | Human Context Sensing in Smart Cities |
JP2023105966A (en) | Computer-implemented method and program for detecting a change in a resident's state, and resident state-change detection device |
KR20240154859A (en) | Apparatus and method for checking the safety of a ward |
JP2023012291A (en) | Computer-implemented method for providing nursing-care support information, program for implementing the same, and nursing-care support information provision device |
O'Neill et al. | Assessing task compliance following reminders |
Pogorelc | An intelligent system for prolonging independent living of elderly |
Legal Events
Date | Code | Title | Description |
---|---|---|---|
| EEER | Examination request | Effective date: 20170404 |