WO2018194671A1 - Assistance notifications in response to assistance events - Google Patents

Assistance notifications in response to assistance events

Info

Publication number
WO2018194671A1
Authority
WO
WIPO (PCT)
Prior art keywords
person
assistance
response
biometric
prompt
Prior art date
Application number
PCT/US2017/028910
Other languages
French (fr)
Inventor
Christine I. HARPER
Michael C. Bartha
Maria Natalia RUSSI-VIGOYA
Original Assignee
Hewlett-Packard Development Company, L.P.
Priority date
Filing date
Publication date
Application filed by Hewlett-Packard Development Company, L.P.
Priority to PCT/US2017/028910
Publication of WO2018194671A1

Classifications

    • A HUMAN NECESSITIES
    • A61 MEDICAL OR VETERINARY SCIENCE; HYGIENE
    • A61B DIAGNOSIS; SURGERY; IDENTIFICATION
    • A61B5/00 Measuring for diagnostic purposes; Identification of persons
    • A61B5/74 Details of notification to user or communication with user or patient; user input means
    • A61B5/746 Alarms related to a physiological condition, e.g. details of setting alarm thresholds or avoiding false alarms
    • A HUMAN NECESSITIES
    • A61 MEDICAL OR VETERINARY SCIENCE; HYGIENE
    • A61B DIAGNOSIS; SURGERY; IDENTIFICATION
    • A61B5/00 Measuring for diagnostic purposes; Identification of persons
    • A HUMAN NECESSITIES
    • A61 MEDICAL OR VETERINARY SCIENCE; HYGIENE
    • A61B DIAGNOSIS; SURGERY; IDENTIFICATION
    • A61B5/00 Measuring for diagnostic purposes; Identification of persons
    • A61B5/0002 Remote monitoring of patients using telemetry, e.g. transmission of vital signals via a communication network
    • A61B5/0015 Remote monitoring of patients using telemetry, e.g. transmission of vital signals via a communication network characterised by features of the telemetry system
    • A61B5/0022 Monitoring a patient using a global network, e.g. telephone networks, internet
    • A HUMAN NECESSITIES
    • A61 MEDICAL OR VETERINARY SCIENCE; HYGIENE
    • A61B DIAGNOSIS; SURGERY; IDENTIFICATION
    • A61B5/00 Measuring for diagnostic purposes; Identification of persons
    • A61B5/74 Details of notification to user or communication with user or patient; user input means
    • A61B5/7465 Arrangements for interactive communication between patient and care services, e.g. by using a telephone network
    • A61B5/747 Arrangements for interactive communication between patient and care services, e.g. by using a telephone network in case of emergency, i.e. alerting emergency services
    • G PHYSICS
    • G08 SIGNALLING
    • G08B SIGNALLING OR CALLING SYSTEMS; ORDER TELEGRAPHS; ALARM SYSTEMS
    • G08B21/00 Alarms responsive to a single specified undesired or abnormal condition and not otherwise provided for
    • G08B21/02 Alarms for ensuring the safety of persons
    • G08B21/04 Alarms for ensuring the safety of persons responsive to non-activity, e.g. of elderly persons
    • G08B21/0407 Alarms for ensuring the safety of persons responsive to non-activity, e.g. of elderly persons based on behaviour analysis
    • G08B21/043 Alarms for ensuring the safety of persons responsive to non-activity, e.g. of elderly persons based on behaviour analysis detecting an emergency event, e.g. a fall
    • G PHYSICS
    • G08 SIGNALLING
    • G08B SIGNALLING OR CALLING SYSTEMS; ORDER TELEGRAPHS; ALARM SYSTEMS
    • G08B21/00 Alarms responsive to a single specified undesired or abnormal condition and not otherwise provided for
    • G08B21/02 Alarms for ensuring the safety of persons
    • G08B21/04 Alarms for ensuring the safety of persons responsive to non-activity, e.g. of elderly persons
    • G08B21/0438 Sensor means for detecting
    • G08B21/0453 Sensor means for detecting worn on the body to detect health condition by physiological monitoring, e.g. electrocardiogram, temperature, breathing
    • G PHYSICS
    • G08 SIGNALLING
    • G08B SIGNALLING OR CALLING SYSTEMS; ORDER TELEGRAPHS; ALARM SYSTEMS
    • G08B21/00 Alarms responsive to a single specified undesired or abnormal condition and not otherwise provided for
    • G08B21/02 Alarms for ensuring the safety of persons
    • G08B21/04 Alarms for ensuring the safety of persons responsive to non-activity, e.g. of elderly persons
    • G08B21/0438 Sensor means for detecting
    • G08B21/0476 Cameras to detect unsafe condition, e.g. video cameras
    • G PHYSICS
    • G16 INFORMATION AND COMMUNICATION TECHNOLOGY [ICT] SPECIALLY ADAPTED FOR SPECIFIC APPLICATION FIELDS
    • G16H HEALTHCARE INFORMATICS, i.e. INFORMATION AND COMMUNICATION TECHNOLOGY [ICT] SPECIALLY ADAPTED FOR THE HANDLING OR PROCESSING OF MEDICAL OR HEALTHCARE DATA
    • G16H40/00 ICT specially adapted for the management or administration of healthcare resources or facilities; ICT specially adapted for the management or operation of medical equipment or devices
    • G16H40/60 ICT specially adapted for the management or administration of healthcare resources or facilities; ICT specially adapted for the management or operation of medical equipment or devices for the operation of medical equipment or devices
    • G16H40/67 ICT specially adapted for the management or administration of healthcare resources or facilities; ICT specially adapted for the management or operation of medical equipment or devices for the operation of medical equipment or devices for remote operation
    • G PHYSICS
    • G16 INFORMATION AND COMMUNICATION TECHNOLOGY [ICT] SPECIALLY ADAPTED FOR SPECIFIC APPLICATION FIELDS
    • G16H HEALTHCARE INFORMATICS, i.e. INFORMATION AND COMMUNICATION TECHNOLOGY [ICT] SPECIALLY ADAPTED FOR THE HANDLING OR PROCESSING OF MEDICAL OR HEALTHCARE DATA
    • G16H80/00 ICT specially adapted for facilitating communication between medical practitioners or patients, e.g. for collaborative diagnosis, therapy or health monitoring

Definitions

  • a person who has suffered an accident or is experiencing a health issue may not be able to contact others for assistance, such as when the person is immobilized and unable to move to a phone, or the person is unconscious or incapacitated.
  • Devices with panic buttons may be used to contact a monitoring service or emergency personnel for assistance.
  • a person can press the panic button of the device when the person wants help.
  • the person may be in a state such that pressing the panic button is not possible, such as when the person is suffering from a severe health issue (e.g., a stroke, a heart attack, etc.) or the person is unconscious or otherwise incapacitated.
  • the person may not be wearing or carrying the device with the panic button, such that pressing the panic button on the device is not possible.
  • an assistance system 102 does not have to rely on a person initiating a request for assistance. Rather, the assistance system 102 is able to initiate an action to render assistance.
  • the assistance system 102 can be implemented with any of various types of devices.
  • the assistance system 102 can be implemented with any or some combination of various different electronic devices, such as a computer (e.g., desktop computer, notebook computer, tablet computer, server computer, etc.), a smartphone, a wearable electronic device, an appliance, and so forth.
  • An example of an appliance that can be used to implement the assistance system 102 includes a voice-enabled digital assistant device, which is able to recognize voice commands from users and to perform tasks in response to such voice commands. For example, a user can issue a voice command to perform an Internet search, a voice command to turn on a household appliance or a light fixture, a voice command to open a door or a window, and so forth.
  • the voice-enabled digital assistant device can be operatively coupled to the various household appliances, the Internet or other communication network (e.g., a telephone network, a private home or enterprise network, etc.), and/or other elements to allow the voice- enabled digital assistant device to interact with such elements.
  • the assistance system 102 can include multiple voice-enabled digital assistant devices dispersed at different locations of a facility to allow a person in any of such locations to interact with a corresponding voice-enabled digital assistant device.
  • the assistance system 102 uses image data from a camera 104 to determine whether an assistance event has occurred.
  • the camera 104 is able to capture images of a person 106.
  • the captured images can include still images or video images.
  • the camera 104 can have a relatively wide view, e.g., a view angle of greater than 90°, or 120°, or 180°, or 270°. In some cases, the camera 104 may even have a 360° view angle.
  • a camera can refer to a camera that is able to capture a picture or a video (in color or in monochrome) of an environment in the view of the camera.
  • a camera can refer to any other type of optical sensor that is able to capture an optical image of an environment, such as an infrared sensor to capture an infrared image, an optical depth sensor (e.g., a time-of-flight sensor) that can determine a depth, or more generally a distance, of an object in the view of the optical depth sensor, or any other type of optical sensor.
  • Although Fig. 1 shows an example with just one camera, it is noted that in other examples, there can be multiple cameras that send image data to the assistance system 102.
  • captured image data 110 can include image data captured by just one camera, or by multiple cameras.
  • a facility can include multiple areas (e.g., multiple rooms or other defined locations), and each area can include a respective camera (or a respective set of cameras).
  • facilities can include any or some combination of the following: a home, a work facility (an office, an industrial facility, etc.), an assisted-care facility, a health-care facility (e.g., a hospital, an urgent care center, a doctor's office, etc.), an entertainment park, or any other facility or setting where monitoring persons for assistance may be desired.
  • the captured image data 110 including the captured images is transmitted by the camera 104 over a network 108 to the assistance system 102.
  • the network 108 can include a wired network or a wireless network.
  • a wireless network can include a cellular network or a wireless local area network (WLAN).
  • a WLAN can perform Wi-Fi communications.
  • the wireless network 108 can include a Bluetooth link, a Wi-Fi Direct link, or any other type of wireless network.
  • An assistance event can refer to an event relating to a condition of the person that indicates that the person may have to be assisted by another.
  • the person may have fallen to the ground, may have collided with another object or person, or otherwise has suffered an accident or is experiencing a health issue (e.g., a heart attack, a stroke, etc.) that causes the person to exhibit unnatural or unexpected movement.
  • the person may be too still for greater than a specified time duration. Even while sleeping, a person is expected to move around in bed every once in a while. If the person is totally still for greater than the specified time duration, then that is an indication that the person may be experiencing a health issue, and thus an assistance event is indicated.
  • a sensor can be used to determine a breathing rate of the person, to determine whether or not the person is breathing. If the person is still in bed and the person's breathing has stopped or slowed, then an assistance event has occurred.
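The stillness and breathing checks described above can be sketched as a simple decision function. This is a minimal, illustrative sketch: the thresholds and function names are assumptions, not values from this publication.

```python
# Hypothetical thresholds; a real system would tune or personalize these.
STILLNESS_LIMIT_S = 600.0     # assumed: flag if motionless for over 10 minutes
MIN_BREATHS_PER_MIN = 8.0     # assumed lower bound for a normal breathing rate

def assistance_event(seconds_still: float, breaths_per_min=None) -> bool:
    """Return True when stillness or breathing suggests an assistance event."""
    if seconds_still > STILLNESS_LIMIT_S:
        return True   # totally still for longer than the specified duration
    if breaths_per_min is not None and breaths_per_min < MIN_BREATHS_PER_MIN:
        return True   # breathing has stopped or slowed
    return False

print(assistance_event(30.0, 14.0))   # normal movement and breathing -> False
print(assistance_event(900.0, 14.0))  # too still for too long -> True
print(assistance_event(30.0, 4.0))    # breathing slowed -> True
```

If no breathing-rate sensor is available, the function falls back to the stillness check alone, matching the image-only operation described earlier.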
  • the assistance system 102 includes various engines, including an assistance event detection engine 112, a biometrics data analysis engine 114, a prompt generation engine 116, and an assistance notification generation engine 118.
  • the term "engine” can refer to a hardware processing circuit, such as a microprocessor, a core of a multi-core microprocessor, a programmable integrated circuit device, a programmable gate array, a microcontroller, or any other type of hardware processing circuit.
  • the assistance event detection engine 112 receives the image data 110 sent by the camera 104.
  • the assistance event detection engine 112 analyzes the image data 110 to determine whether an assistance event has occurred.
  • the assistance event detection engine 112 analyzes the image data 110 to determine whether the image data contains a person, such as the person 106.
  • the assistance event detection engine 112 determines, based on the image data 110, whether the person 106 is in an unexpected position or location (e.g., lying down on a kitchen floor), or has experienced an unexpected movement.
  • an unexpected position may involve the person 106 lying on the floor for greater than some specified time duration.
  • an unexpected location of the person 106 may be a location where the person 106 is not expected to be at.
  • the person 106 may be expected to be at a given location (in the person's room or in a specific common area) at a specified time. If that person is located outside of that location at the specified time, then the assistance event detection engine 112 can indicate an assistance event. As a further example, the assistance event detection engine 112 may detect whether the person 106 has been still for greater than some specified time duration. Thus, even if the person 106 is lying in bed (presumably sleeping) or sitting on a sofa, the person 106 is still expected to move a little every once in a while. If the person 106 is totally still for greater than a specified time duration, then that can indicate an assistance event.
  • An unexpected movement may include the person falling down from an upright position (standing position or sitting position) to a position where the person is lying on the floor. This analysis involves looking at multiple captured images to determine the movement of the person 106 over time. Another example of unexpected movement can be rapid movement of a part of the person 106, such as when the person 106 is suffering a seizure, a heart attack, a stroke, or other acute health problem.
  • the assistance event detection engine 112 can determine whether an assistance event has occurred based on any one or some combination of the following factors: a position of the person 106 (e.g., an orientation of the person 106 such as whether the person is standing up, sitting, or lying down), a location of the person 106, a stillness of the person 106 over a time duration, or a movement of the person 106.
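The upright-to-floor transition described above can be sketched over a window of frames. The per-frame head-height estimates (in meters) are a hypothetical output of the image analysis; the 1.0 m and 0.3 m cutoffs are assumed placeholders.

```python
def detected_fall(head_heights_m: list) -> bool:
    """Flag a fall: the person was upright in earlier frames and is now
    near the floor in the most recent frame."""
    if len(head_heights_m) < 2:
        return False                                     # need movement over time
    was_upright = any(h > 1.0 for h in head_heights_m[:-1])   # standing or sitting
    now_on_floor = head_heights_m[-1] < 0.3                   # lying on the floor
    return was_upright and now_on_floor

print(detected_fall([1.7, 1.7, 1.6, 0.2]))  # standing, then on the floor -> True
print(detected_fall([1.7, 1.7, 1.7, 1.7]))  # still standing -> False
```

As the text notes, this kind of analysis necessarily looks at multiple captured images rather than a single frame.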
  • the decision to indicate an assistance event by the assistance event detection engine 112 can be based just on the captured image data 110, or can also be based on an additional factor (or multiple additional factors).
  • the assistance event detection engine 112 can determine whether an assistance event has occurred based on biometrics data.
  • the person 106 may wear a wearable electronic device 120, such as a smart watch.
  • the wearable electronic device 120 can include smart eyeglasses, a head-mounted device, or a handheld device such as a smartphone.
  • the wearable electronic device 120 can include a biometric sensor to sense a biometric condition of the person 106.
  • the biometric sensor can be used to detect any one or some combination of the following biometric conditions: a heartrate of the person 106, a skin temperature of the person 106, a respiratory pattern of the person 106 (to determine whether the person 106 is breathing or a rate of breathing), a blood pressure of the person 106, a dilation of a pupil of the person 106, or any other condition relating to the body of the person 106.
  • the wearable electronic device 120 can include multiple biometric sensors to collect biometric data of the person 106.
  • Although Fig. 1 shows a biometric sensor included on the wearable electronic device worn by the person 106, it is noted that in other examples, a biometric sensor can be located away from the person 106, but can be in the proximity of the person 106 to acquire certain biometric data of the person 106.
  • the biometric sensor sends biometric data 122 over the network 108 to the assistance system 102, such as by using any of various wireless links, such as a Bluetooth link, a Near Field Communication (NFC) link, a Wi-Fi link, or any other wireless link.
  • the biometric data 122 can be received by the biometric data analysis engine 114, which is able to analyze the biometric data 122 to determine whether the biometric data is outside an expected range.
  • the heartrate of the person 106 should be within a specified heartrate range
  • the skin temperature of the person 106 should be within a specified temperature range
  • the respiratory pattern of the person 106 should be consistent with a specified respiratory pattern
  • the blood pressure of the person 106 should be within a specified pressure range
  • the dilation of the pupil of the person 106 should be within a specified size range.
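The range checks listed above can be sketched as a lookup of each reading against its expected range. The numeric limits and field names below are assumed placeholders; a deployed system would use clinically appropriate, possibly personalized, limits.

```python
# Assumed expected ranges (lower bound, upper bound) per biometric condition.
EXPECTED_RANGES = {
    "heartrate_bpm": (50.0, 110.0),
    "skin_temp_c": (33.0, 37.5),
    "systolic_mmhg": (90.0, 140.0),
}

def out_of_range(readings: dict) -> list:
    """Return the names of biometric conditions outside their expected range."""
    flagged = []
    for name, value in readings.items():
        lo, hi = EXPECTED_RANGES[name]
        if not lo <= value <= hi:
            flagged.append(name)
    return flagged

print(out_of_range({"heartrate_bpm": 72.0, "skin_temp_c": 36.5}))  # []
print(out_of_range({"heartrate_bpm": 150.0}))  # ['heartrate_bpm']
```

A non-empty result is the kind of indication the biometric data analysis engine 114 could pass to the assistance event detection engine 112.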
  • the assistance event detection engine 112 can receive an indication from the biometric data analysis engine 114 to indicate whether any biometric condition of the person 106 is outside an expected range or inconsistent with an expected pattern. If so, then that can be used by the assistance event detection engine 112 to combine with analysis of the image data 110 to determine whether or not an assistance event has occurred.
  • the assistance event detection engine 112 can indicate an assistance event has occurred in response to detecting an unexpected position and/or location of the person 106, and further in response to a biometric condition of the person 106 being outside an expected range or inconsistent with an expected pattern.
  • the prompt generation engine 116 can receive an output from the assistance event detection engine 112 and the biometric data analysis engine 114.
  • the prompt generation engine 116 can generate and output a prompt to the person 106 in response to either one or both of the following: (1) an assistance event has occurred, or (2) the biometric data 122 of the person 106 is outside an expected range or inconsistent with an expected pattern.
  • the prompt that is generated by the prompt generation engine 1 16 can include an audio prompt and/or a visual prompt.
  • An audio prompt can be output by a speaker 124 of the assistance system 102.
  • An example of an audio prompt includes speech produced by the prompt generation engine 116, where the speech can ask the person 106 whether the person 106 is seeking assistance.
  • the prompt produced by the prompt generation engine 116 can be a visual prompt that the person 106 can see, such as on a display device 126 of the assistance system 102.
  • the person 106 can respond to either the audio prompt or the visual prompt, such as by speaking or producing a different sound (e.g., banging on the wall or floor), activating an input device, and so forth.
  • expiration of a specified time duration from when the audio and/or visual prompt was presented can be an indication that an assistance notification should be sent to seek assistance for the person 106.
  • Fig. 1 shows just one speaker 124 and one display 126
  • the assistance system 102 can include multiple speakers and/or multiple displays dispersed throughout a facility, such as in different areas or rooms of a house or work facility, different areas of an assisted-care facility or health-care facility, and so forth.
  • the assistance system 102 also includes a microphone 128, which is able to receive voice spoken by the person 106 or other audio input. Although just one microphone 128 is shown in Fig. 1, it is noted that in some examples, the assistance system 102 can include multiple microphones dispersed at different locations. In response to an audio prompt and/or a visual prompt, the person 106 may state that the person is seeking assistance, which is picked up by the microphone 128. The detected voice from the person 106 is sent by the microphone 128 to an assistance notification generation engine 118.
  • the response from the person 106 (or lack of a response from the person 106 for greater than a specified time duration) can be processed by the assistance notification generation engine 118 to determine whether to produce an assistance notification.
  • the response from the person 106 can include a statement that confirms that the person 106 is seeking assistance.
  • the response from the person 106 can include a specific request, such as a request to call an emergency center, a request to call a family member or friend, a request to call a doctor, and so forth.
  • the assistance notification generation engine 118 can detect that there has been no response from the person 106 for longer than a specified time duration.
  • the assistance notification generation engine 118 can generate the assistance notification, which can be sent to a target entity to seek assistance on behalf of the person 106.
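The decision described above (notify when the person confirms they need help, or when no response arrives within a timeout after the prompt) can be sketched as follows. The timeout value and the keyword matching are assumptions for illustration; a real system would use speech recognition rather than substring checks.

```python
PROMPT_TIMEOUT_S = 30.0   # assumed: how long to wait for a response

def should_notify(response, seconds_since_prompt: float) -> bool:
    """Decide whether to generate an assistance notification."""
    if response is None:
        # No response: notify only once the specified duration has expired.
        return seconds_since_prompt >= PROMPT_TIMEOUT_S
    text = response.lower()
    return "help" in text or "yes" in text   # explicit confirmation

print(should_notify(None, 45.0))             # silence past the timeout -> True
print(should_notify("I'm fine", 5.0))        # declined -> False
print(should_notify("yes, call my doctor", 5.0))  # confirmed -> True
```

Non-speech responses the text mentions (banging on a wall or floor, activating an input device) would map onto the same confirmation path.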
  • the target entity can include an emergency center (e.g., a 911 call center), personnel at an assisted-care facility or health-care facility, the person's relative or friend, or any other target entity.
  • the assistance notification generation engine 118 can also provide a notification to a target entity in a situation where there is not a lack of response, but a noticeable degradation of a human body condition, e.g., an increase in body temperature beyond a specified threshold, a person vomiting while sleeping, and so forth.
  • the term "assistance notification” can refer to any notification, such as in the form of a message, an audio output, a visual output, and so forth, that provides an indication that someone is seeking assistance.
  • the assistance notification can include information identifying the location of the person 106, information indicating a biometric condition of the person 106, the name of the person 106, a telephone number of the person 106, and so forth.
  • the assistance notification can include the foregoing information, or alternatively the assistance notification can include references (e.g., addresses, storage locations, uniform resource locators, etc.) to such information. The target entity that receives the assistance notification can use such references to retrieve the information.
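The notification content described above, carrying either the information itself or references the target entity can use to retrieve it, can be sketched as a small message builder. The JSON structure and field names are assumptions for illustration only.

```python
import json

def build_notification(name, location, biometrics=None, biometrics_ref=None):
    """Build an assistance notification carrying either inline biometric
    information or a reference (e.g., a URL) for later retrieval."""
    note = {"type": "assistance", "person": name, "location": location}
    if biometrics is not None:
        note["biometrics"] = biometrics          # information included inline
    elif biometrics_ref is not None:
        note["biometrics_ref"] = biometrics_ref  # target entity retrieves it
    return json.dumps(note)

print(build_notification("J. Doe", "kitchen", biometrics={"heartrate_bpm": 45}))
print(build_notification("J. Doe", "kitchen",
                         biometrics_ref="https://example.invalid/bio/123"))
```

The reference form keeps the notification small and lets the target entity fetch current data at the time it responds.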
  • the assistance system 102 includes a communication transceiver 130, which communicates over the network 108.
  • the communication transceiver 130, or a different communication transceiver, can also communicate over another network, such as a telephone network, the Internet, and so forth, to allow the assistance system 102 to communicate with a remote target entity to send an assistance notification.
  • the communication transceiver 130 can include a wired transceiver to communicate over a wired network, or a wireless transceiver to communicate over a wireless network.
  • the assistance system 102 can be implemented with a voice-enabled digital assistant device.
  • An example of such an assistance system 202 is shown in Fig. 2, which includes multiple voice-enabled digital assistant devices 204.
  • each voice-enabled digital assistant device 204 includes the assistance event detection engine 112, the biometrics data analysis engine 114, the prompt generation engine 116, and the assistance notification generation engine 118.
  • in other examples, a subset of the engines 112, 114, 116, and 118 is provided in each voice-enabled digital assistant device 204, while the remainder of the engines is provided in a separate system, such as a server computer or another electronic device.
  • Fig. 3 is a flow diagram of an example process that can be performed by the assistance system 102 or 202, according to some examples.
  • the process of Fig. 3 includes analyzing (at 302) image data (e.g., 110 in Fig. 1) collected by a camera (e.g., 104 in Fig. 1) to determine whether an assistance event associated with a person has occurred.
  • the process of Fig. 3 further includes collecting (at 304) biometric data (e.g., 122) from a biometric sensor.
  • the biometric sensor can be worn on the person, or can be located away from but in sufficient proximity to the person to detect the biometric condition of the person.
  • the process of Fig. 3 further includes determining (at 306), in response to the assistance event and based on the collected biometric data, whether to send an assistance notification to a target entity.
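The three steps of the Fig. 3 process can be sketched as one decision function: analyze image data for an assistance event (302), collect biometric data (304), and determine whether to notify (306). The boolean predicates are hypothetical stand-ins for the engines described earlier.

```python
def figure3_flow(image_shows_event: bool, biometric_abnormal: bool) -> bool:
    """Return True when an assistance notification should be sent."""
    if not image_shows_event:    # step 302: no assistance event detected
        return False
    return biometric_abnormal    # steps 304/306: biometric data decides

print(figure3_flow(True, True))    # event + abnormal biometrics -> True
print(figure3_flow(True, False))   # event but biometrics normal -> False
print(figure3_flow(False, True))   # no event -> False
```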
  • Fig. 4 is a block diagram of an example system 400, which can include a computer system or multiple computer systems, or other electronic device(s).
  • the system 400 is an example implementation of the assistance system 102.
  • the system 400 includes a processor (or multiple processors) 402.
  • a processor can include a microprocessor, a core of a multi-core microprocessor, a microcontroller, a programmable integrated circuit device, a programmable gate array, or any other hardware processing circuit.
  • the processor 402 is to perform certain tasks, such as based on machine-readable instructions executed on the processor 402.
  • Machine-readable instructions executed on a processor can refer to the machine-readable instructions executed on one processor or on multiple processors.
  • a processor performing a task can refer to one processor performing the task, or multiple processors performing the task.
  • the processor 402 can perform an assistance event determining task 404 to determine, based on data from a camera, whether an assistance event associated with a person has occurred.
  • the processor 402 further performs a biometric condition determining task 406 to determine, in response to the assistance event and based on biometric data from a biometric sensor, a biometric condition of the person.
  • the processor 402 further performs an assistance notification transmission task 408, to cause transmission of an assistance notification to a target entity in response to the biometric condition satisfying a specified criterion.
  • Fig. 5 is a block diagram of a non-transitory machine-readable or computer-readable storage medium 500 that stores machine-readable instructions that upon execution cause a system to perform various tasks.
  • the machine-readable instructions include assistance event determining instructions 502 to determine, based on data from cameras arranged at a plurality of locations in a facility, whether an assistance event associated with a person has occurred.
  • the machine-readable instructions also include assistance event responding instructions 504 to determine, based on biometric data from a biometric sensor, a biometric condition of the person, and to present a prompt to the person asking if assistance is requested.
  • the machine-readable instructions further include assistance notification transmission instructions 506 to cause transmission of an assistance notification to a target entity in response to one or both of the following: the biometric condition satisfying a specified criterion, or a response of the person to the prompt.
  • the storage medium 500 can include any or some combination of the following: a semiconductor memory device such as a dynamic or static random access memory (a DRAM or SRAM), an erasable and programmable read-only memory (EPROM), an electrically erasable and programmable read-only memory (EEPROM) and flash memory; a magnetic disk such as a fixed, floppy and removable disk; another magnetic medium including tape; an optical medium such as a compact disk (CD) or a digital video disk (DVD); a cloud storage; or another type of storage device.
  • Such computer-readable or machine-readable storage medium or media is (are) considered to be part of an article (or article of manufacture).
  • An article or article of manufacture can refer to any manufactured single component or multiple components.
  • the storage medium or media can be located either in the machine running the machine-readable instructions, or located at a remote site from which machine-readable instructions can be downloaded over a network for execution.

Landscapes

  • Health & Medical Sciences (AREA)
  • Life Sciences & Earth Sciences (AREA)
  • Engineering & Computer Science (AREA)
  • General Health & Medical Sciences (AREA)
  • Biomedical Technology (AREA)
  • Medical Informatics (AREA)
  • Physics & Mathematics (AREA)
  • Public Health (AREA)
  • Pathology (AREA)
  • Business, Economics & Management (AREA)
  • Heart & Thoracic Surgery (AREA)
  • Biophysics (AREA)
  • Animal Behavior & Ethology (AREA)
  • Emergency Management (AREA)
  • Molecular Biology (AREA)
  • Surgery (AREA)
  • Veterinary Medicine (AREA)
  • General Physics & Mathematics (AREA)
  • Gerontology & Geriatric Medicine (AREA)
  • Epidemiology (AREA)
  • Primary Health Care (AREA)
  • Physiology (AREA)
  • Physical Education & Sports Medicine (AREA)
  • Psychiatry (AREA)
  • Psychology (AREA)
  • Social Psychology (AREA)
  • Critical Care (AREA)
  • Emergency Medicine (AREA)
  • Nursing (AREA)
  • Pulmonology (AREA)
  • Cardiology (AREA)
  • Multimedia (AREA)
  • General Business, Economics & Management (AREA)
  • Computer Networks & Wireless Communication (AREA)
  • Alarm Systems (AREA)

Abstract

In some examples, a system determines, based on data from a camera, whether an assistance event associated with a person has occurred, and in response to the assistance event, determines, based on biometric data from a biometric sensor, a biometric condition of the person. In response to the biometric condition satisfying a specified criterion, the system causes transmission of an assistance notification to a target entity.

Description

ASSISTANCE NOTIFICATIONS IN RESPONSE TO ASSISTANCE EVENTS

Background
[0001 ] People, particularly the elderly or disabled persons, can suffer accidents while at home, in an assisted-care facility, in a health-care facility, at work, or in any other facility or setting. Once an accident occurs, a person may not be able to contact emergency personnel to seek assistance.
Brief Description of the Drawings
[0002] Some implementations of the present disclosure are described with respect to the following figures.
[0003] Fig. 1 is a block diagram of an example arrangement including an assistance system according to some examples.
[0004] Fig. 2 is a block diagram of an assistance system according to alternative examples.
[0005] Fig. 3 is a flow diagram of an assistance process according to some examples.
[0006] Fig. 4 is a block diagram of a system according to some examples.
[0007] Fig. 5 is a block diagram of a storage medium storing machine-readable instructions to perform tasks according to some examples.
[0008] Throughout the drawings, identical reference numbers designate similar, but not necessarily identical, elements. The figures are not necessarily to scale, and the size of some parts may be exaggerated to more clearly illustrate the example shown. Moreover, the drawings provide examples and/or implementations consistent with the description; however, the description is not limited to the examples and/or implementations provided in the drawings.

Detailed Description
[0009] In the present disclosure, use of the term "a," "an," or "the" is intended to include the plural forms as well, unless the context clearly indicates otherwise. Also, the term "includes," "including," "comprises," "comprising," "have," or "having" when used in this disclosure specifies the presence of the stated elements, but does not preclude the presence or addition of other elements.
[0010] A person who has suffered an accident or is experiencing a health issue may not be able to contact others for assistance, such as when the person is immobilized and unable to move to a phone, or the person is unconscious or incapacitated. Devices with panic buttons may be used to contact a monitoring service or emergency personnel for assistance. A person can press the panic button of the device when the person wants help. However, in some cases, the person may be in a state such that pressing the panic button is not possible, such as when the person is suffering from a severe health issue (e.g., a stroke, a heart attack, etc.) or the person is unconscious or otherwise incapacitated.
Alternatively, the person may not be wearing or carrying the device with the panic button, such that pressing the panic button on the device is not possible.
[0011] In accordance with some implementations of the present disclosure, as shown in Fig. 1, an assistance system 102 does not have to rely on a person initiating a request for assistance. Rather, the assistance system 102 is able to initiate an action to render assistance.
[0012] The assistance system 102 can be implemented with any of various types of devices. For example, the assistance system 102 can be implemented with any or some combination of various different electronic devices, such as a computer (e.g., desktop computer, notebook computer, tablet computer, server computer, etc.), a smartphone, a wearable electronic device, an appliance, and so forth.
[0013] An example of an appliance that can be used to implement the assistance system 102 includes a voice-enabled digital assistant device, which is able to recognize voice commands from users and to perform tasks in response to such voice commands. For example, a user can issue a voice command to perform an Internet search, a voice command to turn on a household appliance or a light fixture, a voice command to open a door or a window, and so forth. The voice-enabled digital assistant device can be operatively coupled to the various household appliances, the Internet or other communication network (e.g., a telephone network, a private home or enterprise network, etc.), and/or other elements to allow the voice-enabled digital assistant device to interact with such elements. In some examples, the assistance system 102 can include multiple voice-enabled digital assistant devices dispersed at different locations of a facility to allow a person in any of such locations to interact with a corresponding voice-enabled digital assistant device.
[0014] In some examples, the assistance system 102 uses image data from a camera 104 to determine whether an assistance event has occurred. The camera 104 is able to capture images of a person 106. The captured images can include still images or video images. In some examples, the camera 104 can have a relatively wide view, e.g., a view angle of greater than 90°, or 120°, or 180°, or 270°. In some cases, the camera 104 may even have a 360° view angle.
[0015] In some examples, a camera can refer to a camera that is able to capture a picture or a video (in color or in monochrome) of an environment in the view of the camera. In further examples, a camera can refer to any other type of optical sensor that is able to capture an optical image of an environment, such as an infrared sensor to capture an infrared image, an optical depth sensor (e.g., a time-of-flight sensor) that can determine a depth, or more generally a distance, of an object in the view of the optical depth sensor, or any other type of optical sensor.
[0016] Although Fig. 1 shows an example with just one camera, it is noted that in other examples, there can be multiple cameras that send image data to the assistance system 102. Thus, captured image data 110 can include image data captured by just one camera, or by multiple cameras. In the ensuing discussion, reference is made to captured image data 110 sent by the camera 104. It is noted that such reference includes captured image data 110 sent by multiple cameras.
[0017] In some examples, a facility can include multiple areas (e.g., multiple rooms or other defined locations), and each area can include a respective camera (or a respective set of cameras). Examples of facilities can include any or some combination of the following: a home, a work facility (an office, an industrial facility, etc.), an assisted-care facility, a health-care facility (e.g., a hospital, an urgent care center, a doctor's office, etc.), an entertainment park, or any other facility or setting where monitoring persons for assistance may be desired.
[0018] The captured image data 110 including the captured images is transmitted by the camera 104 over a network 108 to the assistance system 102. The network 108 can include a wired network or a wireless network. A wireless network can include a cellular network or a wireless local area network (WLAN). In some examples, a WLAN can perform Wi-Fi communications. In further examples, the wireless network 108 can include a Bluetooth link, a Wi-Fi Direct link, or any other type of wireless network.
[0019] An assistance event can refer to an event relating to a condition of the person that indicates that the person may have to be assisted by another. For example, the person may have fallen to the ground, may have collided with another object or person, or otherwise has suffered an accident or is experiencing a health issue (e.g., a heart attack, a stroke, etc.) that causes the person to exhibit unnatural or unexpected movement. Alternatively, the person may be too still for greater than a specified time duration. Even while sleeping, a person is expected to move around in bed every once in a while. If the person is totally still for greater than the specified time duration, then that is an indication that the person may be experiencing a health issue, and thus an assistance event is indicated. In further examples, a sensor can be used to determine a breathing rate of the person, to determine whether or not the person is breathing. If the person is still in bed and the person's breathing has stopped or slowed, then an assistance event has occurred.
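The stillness criterion described above (a person being totally still for greater than a specified time duration) can be sketched in code. This is an illustration only: the thresholds, the fixed sampling interval, and the use of inter-frame differences as a motion proxy are assumptions, not values or methods specified by this disclosure.

```python
# Illustrative sketch: threshold values are assumptions, not from the disclosure.
STILLNESS_LIMIT_S = 120.0   # assumed "specified time duration"
MOTION_EPSILON = 0.01       # frame-to-frame change below this counts as "still"

def detect_stillness_event(frame_diffs, frame_interval_s=1.0,
                           limit_s=STILLNESS_LIMIT_S):
    """Return True if near-zero inter-frame differences persist for
    longer than the stillness limit (an indicated assistance event)."""
    still_run = 0.0
    for diff in frame_diffs:
        if diff < MOTION_EPSILON:
            still_run += frame_interval_s
            if still_run >= limit_s:
                return True
        else:
            # Any detectable motion resets the stillness timer.
            still_run = 0.0
    return False
```

Even small periodic movements (as expected during sleep) reset the timer, so only sustained total stillness triggers the event.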
[0020] Whether an assistance event has occurred can be based on the captured image data 110 and possibly on other factors, including biometric information of the person 106 (discussed further below).
[0021] The assistance system 102 includes various engines, including an assistance event detection engine 112, a biometrics data analysis engine 114, a prompt generation engine 116, and an assistance notification generation engine 118. As used here, the term "engine" can refer to a hardware processing circuit, such as a microprocessor, a core of a multi-core microprocessor, a programmable integrated circuit device, a programmable gate array, a microcontroller, or any other type of hardware processing circuit. Alternatively, the term "engine" can refer to a combination of a hardware processing circuit and machine-readable instructions (software and/or firmware) executable on the hardware processing circuit.
[0022] The assistance event detection engine 112 receives the image data 110 sent by the camera 104. The assistance event detection engine 112 analyzes the image data 110 to determine whether an assistance event has occurred. The assistance event detection engine 112 analyzes the image data 110 to determine whether the image data contains a person, such as the person 106. Next, the assistance event detection engine 112 determines, based on the image data 110, whether the person 106 is in an unexpected position or location (e.g., lying down on a kitchen floor), or has experienced an unexpected movement. As another example, an unexpected position may involve the person 106 lying on the floor for greater than some specified time duration. As a further example, an unexpected location of the person 106 may be a location where the person 106 is not expected to be. For example, in an assisted-care facility or health-care facility, the person 106 may be expected to be at a given location (in the person's room or in a specific common area) at a specified time. If that person is located outside of that location at the specified time, then the assistance event detection engine 112 can indicate an assistance event. As a further example, the assistance event detection engine 112 may detect whether the person 106 has been still for greater than some specified time duration. Thus, even if the person 106 is lying in bed (presumably sleeping) or sitting on a sofa, the person 106 is still expected to move a little every once in a while. If the person 106 is totally still for greater than a specified time duration, then that can indicate an assistance event.
[0023] An unexpected movement may include the person falling down from an upright position (standing position or sitting position) to a position where the person is lying on the floor. This analysis involves looking at multiple captured images to determine the movement of the person 106 over time.
Another example of unexpected movement can be rapid movement of a part of the person 106, such as when the person 106 is suffering a seizure, a heart attack, a stroke, or other acute health problem.
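The fall analysis over multiple captured images might be sketched as follows. Using the height of a tracked person bounding box as a stand-in for full pose analysis, and the specific window and ratio values, are assumptions for illustration rather than methods specified by the disclosure.

```python
def detect_fall(box_heights, window=3, drop_ratio=0.5):
    """Flag a fall when the person's tracked bounding-box height drops
    below drop_ratio of an earlier value within `window` frames,
    i.e., an upright-to-lying transition that happens quickly."""
    for i in range(len(box_heights) - window):
        before = box_heights[i]
        after = box_heights[i + window]
        if before > 0 and after < before * drop_ratio:
            return True
    return False
```

A slow, deliberate change such as sitting down produces a smaller height drop over the same window and is not flagged.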
[0024] Thus, more generally, the assistance event detection engine 112 can determine whether an assistance event has occurred based on any one or some combination of the following factors: a position of the person 106 (e.g., an
orientation of the person 106, such as whether the person is standing up, sitting, or lying down), a location of the person 106, a stillness of the person 106 over a time duration, or a movement of the person 106.
[0025] The decision to indicate an assistance event by the assistance event detection engine 112 can be based just on the captured image data 110, or can also be based on an additional factor (or multiple additional factors). For example, the assistance event detection engine 112 can determine whether an assistance event has occurred based on biometrics data. The person 106 may wear a wearable electronic device 120, such as a smart watch. Alternatively, the wearable electronic device 120 can include smart eyeglasses, a head-mounted device, or a handheld device such as a smartphone. The wearable electronic device 120 can include a biometric sensor to sense a biometric condition of the person 106. For example, the biometric sensor can be used to detect any one or some combination of the following biometric conditions: a heartrate of the person 106, a skin temperature of the person 106, a respiratory pattern of the person 106 (to determine whether the person 106 is breathing or a rate of breathing), a blood pressure of the person 106, a dilation of a pupil of the person 106, or any other condition relating to the body of the person 106.
[0026] In further examples, the wearable electronic device 120 can include multiple biometric sensors to collect biometric data of the person 106. Although reference is made to a biometric sensor included on the wearable electronic device worn by the person 106, it is noted that in other examples, a biometric sensor can be located away from the person 106, but can be in the proximity of the person 106 to acquire certain biometric data of the person 106.
[0027] The biometric sensor sends biometric data 122 over the network 108 to the assistance system 102, such as by using any of various wireless links, such as a Bluetooth link, a Near Field Communication (NFC) link, a Wi-Fi link, or any other wireless link. The biometric data 122 can be received by the biometric data analysis engine 114, which is able to analyze the biometric data 122 to determine whether the biometric data is outside an expected range. For example, the heartrate of the person 106 should be within a specified heartrate range, the skin temperature of the person 106 should be within a specified temperature range, the respiratory pattern of the person 106 should be consistent with a specified respiratory pattern, the blood pressure of the person 106 should be within a specified pressure range, or the dilation of the pupil of the person 106 should be within a specified size range.
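A minimal sketch of these range checks follows. The range values are illustrative assumptions only (not clinically validated, and not specified by the disclosure), and the field names are hypothetical.

```python
# Assumed illustrative ranges; a real system would use clinically chosen,
# possibly personalized thresholds.
EXPECTED_RANGES = {
    "heartrate_bpm": (50, 110),
    "skin_temp_c":   (35.0, 38.0),
    "resp_rate_bpm": (10, 25),
    "systolic_mmhg": (90, 150),
}

def out_of_range_conditions(biometric_data):
    """Return the names of biometric readings outside their expected range."""
    flagged = []
    for name, value in biometric_data.items():
        # Readings with no configured range are treated as unbounded.
        lo, hi = EXPECTED_RANGES.get(name, (float("-inf"), float("inf")))
        if not lo <= value <= hi:
            flagged.append(name)
    return flagged
```

A non-empty result could serve as the "outside an expected range" indication that the biometric data analysis engine passes to the other engines.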
[0028] In examples where the assistance event detection engine 112 relies on biometric data to determine whether an assistance event has occurred, the assistance event detection engine 112 can receive an indication from the biometric data analysis engine 114 to indicate whether any biometric condition of the person 106 is outside an expected range or inconsistent with an expected pattern. If so, then that can be used by the assistance event detection engine 112 to combine with analysis of the image data 110 to determine whether or not an assistance event has occurred.
[0029] For example, the assistance event detection engine 112 can indicate an assistance event has occurred in response to detecting an unexpected position and/or location of the person 106, and further in response to a biometric condition of the person 106 being outside an expected range or inconsistent with an expected pattern.
[0030] The prompt generation engine 116 can receive an output from the assistance event detection engine 112 and the biometric data analysis engine 114. The prompt generation engine 116 can generate and output a prompt to the person 106 in response to either one or both of the following: (1) an assistance event has occurred, or (2) the biometric data 122 of the person 106 is outside an expected range or inconsistent with an expected pattern.
[0031] The prompt that is generated by the prompt generation engine 116 can include an audio prompt and/or a visual prompt. An audio prompt can be output by a speaker 124 of the assistance system 102. An example of an audio prompt includes speech produced by the prompt generation engine 116, where the speech can ask the person 106 whether the person 106 is seeking assistance. Alternatively, the prompt produced by the prompt generation engine 116 can be a visual prompt that the person 106 can see, such as on a display device 126 of the assistance system 102. The person 106 can respond to either the audio prompt or the visual prompt, such as by speaking or producing a different sound (e.g., banging on the wall or floor), activating an input device, and so forth.
[0032] In some cases, such as when the person 106 is unconscious or incapacitated, the person 106 may not be able to respond to the prompt. In such cases, expiration of a specified time duration from when the audio and/or visual prompt was presented can be an indication that an assistance notification should be sent to seek assistance for the person 106.
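The prompt-timeout behavior could be sketched as below. The timeout value and the `poll_response` callback (which would wrap the microphone or input device) are assumptions for illustration.

```python
import time

def await_prompt_response(poll_response, timeout_s=30.0, poll_interval_s=0.5):
    """Wait up to timeout_s for a response to the prompt.

    poll_response() is assumed to return the response (e.g., recognized
    speech) or None if nothing has been received yet. Returning None from
    this function means the specified time duration expired with no
    response, which the caller can treat as grounds to send an
    assistance notification.
    """
    deadline = time.monotonic() + timeout_s
    while time.monotonic() < deadline:
        response = poll_response()
        if response is not None:
            return response
        time.sleep(poll_interval_s)
    return None
```

`time.monotonic()` is used rather than wall-clock time so that system clock adjustments cannot shorten or extend the response window.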
[0033] Although Fig. 1 shows just one speaker 124 and one display 126, it is noted that in other examples, the assistance system 102 can include multiple speakers and/or multiple displays dispersed throughout a facility, such as in different areas or rooms of a house or work facility, different areas of an assisted-care facility or health-care facility, and so forth.
[0034] The assistance system 102 also includes a microphone 128, which is able to receive voice spoken by the person 106 or other audio input. Although just one microphone 128 is shown in Fig. 1, it is noted that in some examples, the assistance system 102 can include multiple microphones dispersed at different locations. In response to an audio prompt and/or a visual prompt, the person 106 may state that the person is seeking assistance, which is picked up by the microphone 128. The detected voice from the person 106 is sent by the microphone 128 to an assistance notification generation engine 118.
[0035] The response from the person 106 (or lack of a response from the person 106 for greater than a specified time duration) can be processed by the assistance notification generation engine 118 to determine whether to produce an assistance notification. For example, the response from the person 106 can include a statement that confirms that the person 106 is seeking assistance. As another example, the response from the person 106 can include a specific request, such as a request to call an emergency center, a request to call a family member or friend, a request to call a doctor, and so forth. Alternatively, the assistance notification generation engine 118 can detect that there has been no response from the person 106 for longer than a specified time duration. In either case, the assistance notification generation engine 118 can generate the assistance notification, which can be sent to a target entity to seek assistance on behalf of the person 106. The target entity can include an emergency center (e.g., a 911 call center), personnel at an assisted-care facility or health-care facility, the person's relative or friend, or any other target entity.
[0036] The assistance notification generation engine 118 can also provide a notification to a target entity in a situation where there is not a lack of response, but a noticeable degradation of a human body condition, e.g., an increase in body temperature beyond a specified threshold, a person vomiting while sleeping, and so forth.
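Interpreting the response (or its absence) might look like the following sketch. The keyword vocabulary and the action names are hypothetical; the disclosure does not specify how responses are parsed.

```python
def decide_notification(response_text):
    """Map a (possibly absent) spoken response to an action.

    response_text is None when the prompt timed out with no reply,
    which is treated the same as an explicit request for help.
    """
    if response_text is None:
        return "notify_emergency"          # no response within the window
    text = response_text.lower()
    if "doctor" in text:
        return "call_doctor"
    if "family" in text or any(w in text for w in ("son", "daughter")):
        return "call_family"
    if any(w in text for w in ("help", "assist", "yes")):
        return "notify_emergency"
    return "no_action"                      # e.g., "I'm fine"
```

A production system would use real speech understanding rather than keyword matching; the point here is only that both a specific request and a lack of response lead to a notification.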
[0037] The term "assistance notification" can refer to any notification, such as in the form of a message, an audio output, a visual output, and so forth, that provides an indication that someone is seeking assistance. The assistance notification can include information identifying the location of the person 106, information indicating a biometric condition of the person 106, the name of the person 106, a telephone number of the person 106, and so forth.
[0038] In some examples, the assistance notification can include the foregoing information, or alternatively the assistance notification can include references (e.g., addresses, storage locations, uniform resource locators, etc.) to such information. The target entity that receives the assistance notification can use such references to retrieve the information.
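One possible shape for such a notification payload, carrying the fields listed above, is sketched here. The JSON encoding and every field name are assumptions for illustration; the disclosure does not define a message format.

```python
import json

def build_assistance_notification(person, biometric_condition, location):
    """Assemble a notification message with the information the text
    describes: identity, contact number, location, and biometric state.
    person is assumed to be a dict with "name" and "phone" keys."""
    return json.dumps({
        "type": "assistance_notification",
        "name": person.get("name"),
        "phone": person.get("phone"),
        "location": location,
        "biometric_condition": biometric_condition,
    })
```

Per paragraph [0038], the location or biometric fields could instead hold references (e.g., URLs) that the target entity resolves to retrieve the underlying data.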
[0039] The assistance system 102 includes a communication transceiver 130, which communicates over the network 108. The communication transceiver 130, or a different communication transceiver, can also communicate over another network, such as a telephone network, the Internet, and so forth, to allow the assistance system 102 to communicate with a remote target entity to send an assistance notification. The communication transceiver 130 can include a wired transceiver to communicate over a wired network, or a wireless transceiver to communicate over a wireless network.
[0040] As noted above, the assistance system 102 can be implemented with a voice-enabled digital assistant device. An example of such an assistance system 202 is shown in Fig. 2, which includes multiple voice-enabled digital assistant devices 204. In some examples, each voice-enabled digital assistant device 204 includes the assistance event detection engine 112, the biometrics data analysis engine 114, the prompt generation engine 116, and the assistance notification generation engine 118.
[0041] In other examples, a subset of the engines 112, 114, 116, and 118 is provided in each voice-enabled digital assistant device 204, while the remainder of the engines 112, 114, 116, and 118 is provided in a separate system, such as a server computer or another electronic device.
[0042] Fig. 3 is a flow diagram of an example process that can be performed by the assistance system 102 or 202, according to some examples. The process of Fig. 3 includes analyzing (at 302) image data (e.g., 110 in Fig. 1) collected by a camera (e.g., 104 in Fig. 1) to determine whether an assistance event associated with a person has occurred.
[0043] The process of Fig. 3 further includes collecting (at 304) biometric data (e.g., 122) from a biometric sensor. The biometric sensor can be worn on the person, or can be located away from but in sufficient proximity to the person to detect the biometric condition of the person.
[0044] The process of Fig. 3 further includes determining (at 306), in response to the assistance event and based on the collected biometric data, whether to send an assistance notification to a target entity.
[0045] Fig. 4 is a block diagram of an example system 400, which can include a computer system or multiple computer systems, or other electronic device(s). The system 400 is an example implementation of the assistance system 102. The system 400 includes a processor (or multiple processors) 402. A processor can include a microprocessor, a core of a multi-core microprocessor, a microcontroller, a programmable integrated circuit device, a programmable gate array, or any other hardware processing circuit. The processor 402 is to perform certain tasks, such as based on machine-readable instructions executed on the processor 402. Machine-readable instructions executed on a processor can refer to the machine-readable instructions executed on one processor or on multiple processors. A processor performing a task can refer to one processor performing the task, or multiple processors performing the task.
[0046] The processor 402 can perform an assistance event determining task 404 to determine, based on data from a camera, whether an assistance event associated with a person has occurred. The processor 402 further performs a biometric condition determining task 406 to determine, in response to the assistance event and based on biometric data from a biometric sensor, a biometric condition of the person. The processor 402 further performs an assistance notification transmission task 408, to cause transmission of an assistance notification to a target entity in response to the biometric condition satisfying a specified criterion.
[0047] Fig. 5 is a block diagram of a non-transitory machine-readable or computer-readable storage medium 500 that stores machine-readable instructions that upon execution cause a system to perform various tasks. The machine-readable instructions include assistance event determining instructions 502 to determine, based on data from cameras arranged at a plurality of locations in a facility, whether an assistance event associated with a person has occurred. The machine-readable instructions also include assistance event responding instructions 504 to determine, based on biometric data from a biometric sensor, a biometric condition of the person, and to present a prompt to the person asking if assistance is requested. The machine-readable instructions further include assistance notification transmission instructions 506 to cause transmission of an assistance notification to a target entity in response to one or both of the following: the biometric condition satisfying a specified criterion, or a response of the person to the prompt.
[0048] The storage medium 500 can include any or some combination of the following: a semiconductor memory device such as a dynamic or static random access memory (a DRAM or SRAM), an erasable and programmable read-only memory (EPROM), an electrically erasable and programmable read-only memory (EEPROM) and flash memory; a magnetic disk such as a fixed, floppy and removable disk; another magnetic medium including tape; an optical medium such as a compact disk (CD) or a digital video disk (DVD); a cloud storage; or another type of storage device. Note that the instructions discussed above can be provided on one computer-readable or machine-readable storage medium, or alternatively, can be provided on multiple computer-readable or machine-readable storage media distributed in a large system having possibly plural nodes. Such computer-readable or machine-readable storage medium or media is (are) considered to be part of an article (or article of manufacture). An article or article of manufacture can refer to any manufactured single component or multiple components. The storage medium or media can be located either in the machine running the machine-readable instructions, or located at a remote site from which machine-readable instructions can be downloaded over a network for execution.
[0049] In the foregoing description, numerous details are set forth to provide an understanding of the subject disclosed herein. However, implementations may be practiced without some of these details. Other implementations may include modifications and variations from the details discussed above. It is intended that the appended claims cover such modifications and variations.

Claims

What is claimed is:

1. A system comprising:
a processor to:
determine, based on data from a camera, whether an assistance event associated with a person has occurred;
in response to the assistance event, determine, based on biometric data from a biometric sensor, a biometric condition of the person; and
in response to the biometric condition satisfying a specified criterion, cause transmission of an assistance notification to a target entity.
2. The system of claim 1, wherein the processor is to further:
in response to the assistance event, provide a prompt asking the person if assistance is requested.
3. The system of claim 2, wherein the processor is to further:
responsive to a response to the prompt seeking assistance, cause
transmission of an assistance notification to the target entity.
4. The system of claim 2, wherein the processor is to further:
responsive to a lack of a response to the prompt, cause transmission of an assistance notification to the target entity.
5. The system of claim 2, wherein the prompt comprises one or both of an audio prompt and a visual prompt.
6. The system of claim 2, wherein the processor is to provide the prompt further according to an analysis of the biometric data.
7. The system of claim 1, wherein the biometric data is from the biometric sensor worn by the person.
8. The system of claim 7, wherein the biometric sensor is part of a wearable electronic device worn by the person.
9. The system of claim 1, comprising a voice-enabled digital assistant device, wherein the processor is part of the voice-enabled digital assistant device, the voice-enabled digital assistant device to recognize voice inputs and to perform tasks in response to the voice inputs.
10. A non-transitory machine-readable storage medium storing instructions that upon execution cause a system to:
determine, based on data from cameras arranged at a plurality of locations in a facility, whether an assistance event associated with a person has occurred;
in response to the assistance event,
determine, based on biometric data from a biometric sensor, a biometric condition of the person, and
present a prompt to the person asking if assistance is requested; and
in response to at least one selected from among the biometric condition satisfying a specified criterion or a response of the person to the prompt, cause transmission of an assistance notification to a target entity.
11. The non-transitory machine-readable storage medium of claim 10, wherein the biometric data is selected from among a heartrate of the person, a skin temperature of the person, a dilation of a pupil of the person, a respiratory pattern of the person, and a blood pressure of the person.
12. The non-transitory machine-readable storage medium of claim 10, wherein the instructions upon execution cause the system to recognize a voice response from the person in response to the prompt, and perform an action responsive to a request made by the person in the voice response.
13. The non-transitory machine-readable storage medium of claim 10, wherein the biometric sensor is part of a wearable electronic device worn by the person.
14. A method of a system comprising a processor, comprising:
analyzing image data collected by a camera to determine whether an assistance event associated with a person has occurred;
collecting biometric data of the person from a biometric sensor; and
determining, in response to the assistance event and based on the collected biometric data, whether to send an assistance notification to a target entity.
15. The method of claim 14, wherein the system comprises a plurality of voice-enabled digital assistant devices, the method further comprising:
issuing, by a voice-enabled digital assistant device of the plurality of voice-enabled digital assistant devices, an audio prompt in response to the assistance event; and
processing, by the voice-enabled digital assistant device, a voice response to the audio prompt to determine whether the assistance notification is to be sent to the target entity.
PCT/US2017/028910 2017-04-21 2017-04-21 Assistance notifications in response to assistance events WO2018194671A1 (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
PCT/US2017/028910 WO2018194671A1 (en) 2017-04-21 2017-04-21 Assistance notifications in response to assistance events

Publications (1)

Publication Number Publication Date
WO2018194671A1 true WO2018194671A1 (en) 2018-10-25

Family

ID=63855995

Family Applications (1)

Application Number Title Priority Date Filing Date
PCT/US2017/028910 WO2018194671A1 (en) 2017-04-21 2017-04-21 Assistance notifications in response to assistance events

Country Status (1)

Country Link
WO (1) WO2018194671A1 (en)

Citations (2)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US20160042637A1 (en) * 2014-08-11 2016-02-11 Clandestine Development, Llc Drone Safety Alert Monitoring System and Method
WO2016205246A1 (en) * 2015-06-15 2016-12-22 Knit Health, Inc. Remote biometric monitoring system


Similar Documents

Publication Publication Date Title
US11024142B2 (en) Event detector for issuing a notification responsive to occurrence of an event
US20220012470A1 (en) Multi-user intelligent assistance
US11632661B2 (en) Systems and methods for health monitoring and providing emergency support
US11158179B2 (en) Method and system to improve accuracy of fall detection using multi-sensor fusion
US11382511B2 (en) Method and system to reduce infrastructure costs with simplified indoor location and reliable communications
TWI745930B (en) Computer-implemented method, computer program product, and system for emergency event detection and response
US20150194034A1 (en) Systems and methods for detecting and/or responding to incapacitated person using video motion analytics
US11663888B2 (en) Home security response using biometric and environmental observations
JP7325651B2 (en) System to ensure health safety when charging wearable health
AU2017338619B2 (en) Alert system
US20230326318A1 (en) Environment sensing for care systems
JP2021033677A (en) Information processing apparatus and program
WO2018194671A1 (en) Assistance notifications in response to assistance events
US20230260134A1 (en) Systems and methods for monitoring subjects
US11538125B1 (en) Automated event detection in controlled-environment areas using biometric identification
CN111722535A (en) Security monitoring method and device, computer equipment and storage medium
JP7354549B2 (en) Monitoring device and monitoring program
US20230360507A1 (en) In-home event intercom and notifications
US20230210372A1 (en) Passive assistive alerts using artificial intelligence assistants
JP7268387B2 (en) Monitoring device and program for monitoring device
JP2020135063A (en) Surveillance device and program for surveillance device
JP2021009549A (en) Watching system and watching method
JP2021009070A (en) Watching system and watching method
JP2020030749A (en) Resident watching system
JP2017183848A (en) Terminal device

Legal Events

Code Description
121 EP: the EPO has been informed by WIPO that EP was designated in this application (Ref document number: 17906740; Country of ref document: EP; Kind code of ref document: A1)
NENP Non-entry into the national phase (Ref country code: DE)
122 EP: PCT application non-entry in European phase (Ref document number: 17906740; Country of ref document: EP; Kind code of ref document: A1)