AU2021101323A4 - Method for fall prevention, fall detection and electronic fall event alert system for aged care facilities - Google Patents

Method for fall prevention, fall detection and electronic fall event alert system for aged care facilities Download PDF

Info

Publication number
AU2021101323A4
AU2021101323A4
Authority
AU
Australia
Prior art keywords
fall
data
patient
depth
care receiver
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Active
Application number
AU2021101323A
Inventor
Zejun Hu
Yifei Wang
Jun Yi
Chaorong Zhang
Simon Chaoxin Zhang
Taicheng Zhou
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
Top Ai Research Centre Pty Ltd
Original Assignee
Top Ai Res Centre Pty Ltd
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Priority claimed from AU2021900262A external-priority patent/AU2021900262A0/en
Application filed by Top Ai Res Centre Pty Ltd filed Critical Top Ai Res Centre Pty Ltd
Application granted granted Critical
Publication of AU2021101323A4 publication Critical patent/AU2021101323A4/en
Assigned to TOP AI RESEARCH CENTRE PTY LTD reassignment TOP AI RESEARCH CENTRE PTY LTD Request for Assignment Assignors: AIBUILD PTY LTD
Active legal-status Critical Current
Anticipated expiration legal-status Critical

Classifications

    • G - PHYSICS
    • G16 - INFORMATION AND COMMUNICATION TECHNOLOGY [ICT] SPECIALLY ADAPTED FOR SPECIFIC APPLICATION FIELDS
    • G16H - HEALTHCARE INFORMATICS, i.e. INFORMATION AND COMMUNICATION TECHNOLOGY [ICT] SPECIALLY ADAPTED FOR THE HANDLING OR PROCESSING OF MEDICAL OR HEALTHCARE DATA
    • G16H50/00 - ICT specially adapted for medical diagnosis, medical simulation or medical data mining; ICT specially adapted for detecting, monitoring or modelling epidemics or pandemics
    • A - HUMAN NECESSITIES
    • A61 - MEDICAL OR VETERINARY SCIENCE; HYGIENE
    • A61B - DIAGNOSIS; SURGERY; IDENTIFICATION
    • A61B5/00 - Measuring for diagnostic purposes; Identification of persons
    • A61B5/0002 - Remote monitoring of patients using telemetry, e.g. transmission of vital signals via a communication network
    • A61B5/0015 - Remote monitoring of patients using telemetry, e.g. transmission of vital signals via a communication network characterised by features of the telemetry system
    • A - HUMAN NECESSITIES
    • A61 - MEDICAL OR VETERINARY SCIENCE; HYGIENE
    • A61B - DIAGNOSIS; SURGERY; IDENTIFICATION
    • A61B5/00 - Measuring for diagnostic purposes; Identification of persons
    • A61B5/103 - Detecting, measuring or recording devices for testing the shape, pattern, colour, size or movement of the body or parts thereof, for diagnostic purposes
    • A61B5/11 - Measuring movement of the entire body or parts thereof, e.g. head or hand tremor, mobility of a limb
    • A61B5/1116 - Determining posture transitions
    • A61B5/1117 - Fall detection
    • A - HUMAN NECESSITIES
    • A61 - MEDICAL OR VETERINARY SCIENCE; HYGIENE
    • A61B - DIAGNOSIS; SURGERY; IDENTIFICATION
    • A61B5/00 - Measuring for diagnostic purposes; Identification of persons
    • A61B5/103 - Detecting, measuring or recording devices for testing the shape, pattern, colour, size or movement of the body or parts thereof, for diagnostic purposes
    • A61B5/11 - Measuring movement of the entire body or parts thereof, e.g. head or hand tremor, mobility of a limb
    • A61B5/1126 - Measuring movement of the entire body or parts thereof, e.g. head or hand tremor, mobility of a limb using a particular sensing technique
    • A61B5/1128 - Measuring movement of the entire body or parts thereof, e.g. head or hand tremor, mobility of a limb using a particular sensing technique using image analysis
    • A - HUMAN NECESSITIES
    • A61 - MEDICAL OR VETERINARY SCIENCE; HYGIENE
    • A61B - DIAGNOSIS; SURGERY; IDENTIFICATION
    • A61B5/00 - Measuring for diagnostic purposes; Identification of persons
    • A61B5/72 - Signal processing specially adapted for physiological signals or for diagnostic purposes
    • A61B5/7235 - Details of waveform analysis
    • A61B5/7264 - Classification of physiological signals or data, e.g. using neural networks, statistical classifiers, expert systems or fuzzy systems
    • A61B5/7267 - Classification of physiological signals or data, e.g. using neural networks, statistical classifiers, expert systems or fuzzy systems involving training the classification device
    • A - HUMAN NECESSITIES
    • A61 - MEDICAL OR VETERINARY SCIENCE; HYGIENE
    • A61B - DIAGNOSIS; SURGERY; IDENTIFICATION
    • A61B5/00 - Measuring for diagnostic purposes; Identification of persons
    • A61B5/72 - Signal processing specially adapted for physiological signals or for diagnostic purposes
    • A61B5/7271 - Specific aspects of physiological measurement analysis
    • A61B5/7275 - Determining trends in physiological measurement data; Predicting development of a medical condition based on physiological measurements, e.g. determining a risk factor
    • G - PHYSICS
    • G08 - SIGNALLING
    • G08B - SIGNALLING OR CALLING SYSTEMS; ORDER TELEGRAPHS; ALARM SYSTEMS
    • G08B21/00 - Alarms responsive to a single specified undesired or abnormal condition and not otherwise provided for
    • G08B21/02 - Alarms for ensuring the safety of persons
    • G08B21/04 - Alarms for ensuring the safety of persons responsive to non-activity, e.g. of elderly persons
    • G08B21/0407 - Alarms for ensuring the safety of persons responsive to non-activity, e.g. of elderly persons based on behaviour analysis
    • G08B21/043 - Alarms for ensuring the safety of persons responsive to non-activity, e.g. of elderly persons based on behaviour analysis detecting an emergency event, e.g. a fall
    • A - HUMAN NECESSITIES
    • A61 - MEDICAL OR VETERINARY SCIENCE; HYGIENE
    • A61B - DIAGNOSIS; SURGERY; IDENTIFICATION
    • A61B2503/00 - Evaluating a particular growth phase or type of persons or animals
    • A61B2503/08 - Elderly
    • A - HUMAN NECESSITIES
    • A61 - MEDICAL OR VETERINARY SCIENCE; HYGIENE
    • A61B - DIAGNOSIS; SURGERY; IDENTIFICATION
    • A61B5/00 - Measuring for diagnostic purposes; Identification of persons
    • A61B5/74 - Details of notification to user or communication with user or patient; user input means
    • A61B5/746 - Alarms related to a physiological condition, e.g. details of setting alarm thresholds or avoiding false alarms
    • G - PHYSICS
    • G06 - COMPUTING; CALCULATING OR COUNTING
    • G06N - COMPUTING ARRANGEMENTS BASED ON SPECIFIC COMPUTATIONAL MODELS
    • G06N20/00 - Machine learning
    • G06N20/10 - Machine learning using kernel methods, e.g. support vector machines [SVM]
    • G - PHYSICS
    • G06 - COMPUTING; CALCULATING OR COUNTING
    • G06Q - INFORMATION AND COMMUNICATION TECHNOLOGY [ICT] SPECIALLY ADAPTED FOR ADMINISTRATIVE, COMMERCIAL, FINANCIAL, MANAGERIAL OR SUPERVISORY PURPOSES; SYSTEMS OR METHODS SPECIALLY ADAPTED FOR ADMINISTRATIVE, COMMERCIAL, FINANCIAL, MANAGERIAL OR SUPERVISORY PURPOSES, NOT OTHERWISE PROVIDED FOR
    • G06Q50/00 - Systems or methods specially adapted for specific business sectors, e.g. utilities or tourism
    • G06Q50/10 - Services
    • G06Q50/22 - Social work

Abstract

AIBUILD has developed an innovative machine-learning fall prevention system which can be used to effectively predict, detect and reduce patient falls in both aged-care facilities and hospitals. The invented fall prevention system uses a Microsoft Azure Kinect depth camera as the primary sensor, allowing the system to extract human body skeleton data and hence track human movements in the camera's field of vision. With the extracted real-time data, the developed machine learning method is able to detect patient falls with an accuracy of at least 90%. With regard to personal security, the privacy of both patients and hospital workers can be protected by transforming the appearance of those monitored into a simple stick-figure format. When a fall is detected or is predicted to occur, the developed mobile or computer app will automatically send a notification to the doctors and nurses associated with the patient. This allows hospital staff to quickly intervene and stop the fall before it occurs, or provide immediate assistance to the patient after the fall has occurred to mitigate the severity of any fall-inflicted injury.

Description

[Drawing sheet 2/3: Fig. 2 (flow chart of the alert process; reference numerals 102-104 and 205-209)]
AUSTRALIA Patents Act 1990
INNOVATION PATENT SPECIFICATION
METHOD FOR FALL PREVENTION, FALL DETECTION AND ELECTRONIC FALL EVENT ALERT SYSTEM FOR AGED CARE FACILITIES
The invention is described in the following statement.
METHOD FOR FALL PREVENTION, FALL DETECTION AND ELECTRONIC FALL EVENT ALERT SYSTEM FOR AGED CARE FACILITIES
BACKGROUND
[0001] The World Health Organisation (WHO) defines a fall as an event which results in a person coming to rest inadvertently on the ground, floor or other lower level (1). Falls are a major cause of morbidity and mortality in elderly patients, accounting for 75% of injury-related hospitalisations in older people (2). Tsai and Hsu (2019) (3) report that one in five elderly people experiences a fall every year. Gasparrini et al. (2015) (4) state that approximately 28-35% of people aged over 65 fall each year, and this proportion rises to 42% for those aged over 70. These statistics illustrate a clear pattern: the risk of an elderly person falling rises strongly with age. Australia is facing an ageing population. According to 2017 statistics from the Australian Institute of Health and Welfare (AIHW) (5), there were 3.8 million Australians aged 65 and over, representing 15% of the total Australian population, and this proportion is projected to increase to 18% by 2027 and 22% by 2057. Consequently, the number of fall incidents among elderly Australians can be expected to increase dramatically in the future.
[0002] Falls experienced by elderly people can be catastrophic, leading to consequences such as tissue damage and bone fractures. The repercussions are exacerbated when elderly people remain on the ground for a long period of time after a fall as a result of late discovery, a situation defined as a 'long-lie'. Bourke et al. (2007) (6) state that nearly 50% of elderly people who experience a 'long-lie' die within six months; mortality due to falls in elderly people is therefore largely a result of late discovery and treatment. Conversely, Tsai and Hsu (2019) (3) show that the risk of death can be reduced by 80% if an elderly patient who has fallen is discovered and cared for quickly.
[0003] During the COVID-19 pandemic, the risk of falls in aged care facilities has increased due to staff shortages and lack of experience. Furthermore, with fewer visits from friends and family, there is an increased risk of undetected falls in private homes, especially for elderly people who live alone. There is therefore an urgent demand for a fall prevention and detection system that can both predict falls in the elderly and raise alerts when they occur. We present our invented fall prevention and detection system, which offers high fall-detection accuracy and is easy to deploy.
EXISTING TECHNOLOGY
[0004] In Australia, a variety of different technological devices are used to prevent potentially dangerous falls in both hospital and household settings. However, current fall-prevention devices may be inconvenient for both nurses and patients, or may be ineffective at protecting them from fall-inflicted harm.
[0005] Fall detection technologies can broadly be split into three classifications:
    • Wearable sensors
    • Ambient sensors
    • Vision-based technologies
[0006] Wearable sensors are usually worn around the wrist of the at-risk individual and typically use accelerometers to measure body inclination and gyroscopes to estimate rotational acceleration. Although these devices can be used in any setting, their application is limited by the inconvenience and discomfort of wearing them, the tendency of users to forget to wear or carry them, and the need to recharge or replace batteries for the device to function. Many such devices also rely on the user manually activating the alarm after a fall; if a serious accident were to occur, such as a broken bone or loss of consciousness following the fall, the user would be unable to call for help and obtain assistance quickly. To counter this, some wearable sensors, such as the MePACS medical watch, can automatically detect that a patient has fallen. However, as stated on the product's website, not all falls can be detected automatically, since the sensors are only triggered by significant falls (falls from approximately waist height). Moreover, if the fall is broken by an obstacle, or if the user braces themselves, slumps or rolls onto the floor, the device is unlikely to trigger and manual activation will be required. Since bracing and similar movements are common responses while falling, this limits the device's ability to accurately perceive a fall. In addition, these devices tend to be overly sensitive, producing many false alarms because real falls cannot be discriminated from abrupt movements, and the sensors usually require frequent manual calibration as a result of fluctuations in humidity and temperature, which further increases the likelihood of false alarms (7).
[0007] Ambient sensors detect changes in the environment around the patient as a means of detecting falls. These devices include acoustic sensors that measure the sound of a fall, infrared sensors that map the patient's heat signature, and pressure sensors that detect weight changes on the floor. While these devices report very high fall-detection accuracy, they require substantial modifications to the installation environment and therefore high financial investment (8). Every room, home or hospital environment has a different configuration, and ambient sensors are too expensive and impractical to install in most circumstances. Another challenge with ambient sensors is their inability to differentiate between animals, humans and objects. For example, an excessive weight change recorded by floor pressure sensors may be the result of falling furniture rather than a patient fall, producing false alarms that become a constant inconvenience for hospital staff.
[0008] Vision-based devices use a single- or multi-camera system to analyse the volume distribution of the target patient along the vertical axis, activating an alarm when most of the volume is close to the ground for a given period of time. This type of technology tends to be unable to detect falls when more than one person is in the target room or when the patient is partially occluded by furniture. The greatest concern surrounding vision-based devices is the breach of privacy for both patients and hospital staff, who are under continuous surveillance in rooms where such devices are installed. If the video surveillance or storage system were compromised, this could be detrimental to both staff and patients and could severely undermine their trust and confidence in the technology.
[0009] Another type of fall-prevention technology outside the three main categories is the bed or chair alarm, which alerts personnel when an at-risk patient leaves a chair or bed without assistance. However, current evidence suggests that the use of these alarms as a single fall prevention strategy has no effect on the rate of falls (9). Their effectiveness is further reduced by the frequent false alarms they generate, which are a constant inconvenience to hospital staff, and by alarm fatigue, in which hospital personnel become desensitised to the alarms.
SUMMARY
[0010] The proposed fall detection system, named CaptureFall, consists of a server, a mobile application and one or more depth sensors, each connected to a computer that executes the fall detection and risk analysis algorithms. When a fall is detected or is predicted to occur, the computer sends a notification to the server along with a short recording captured by the depth sensor, and the developed application automatically notifies the doctors and nurses associated with the patient. This allows hospital staff to quickly intervene and stop the fall before it occurs, or to provide immediate assistance to the patient after the fall has occurred to mitigate the severity of any fall-inflicted injury.
[0011] The proposed fall detection algorithm takes the depth frame captured by the sensor as an input, then retrieves the spatial coordinates of the body joints of a person and other information from the depth frame. With the extracted real-time data, the developed machine learning method is able to successfully detect patient falls with an accuracy of at least 90%. With regards to personal security, the privacy of both patients and hospital workers can be protected by transforming the appearance of those monitored into a simple stick-figure format.
BRIEF DESCRIPTION OF THE DRAWINGS
Fig. 1 shows the overall architecture of the fall detection system.
Fig. 2 shows the flow chart of the process of the alert system when a fall is detected.
Fig. 3 shows the sequence diagram of the process when a fall is detected.
DESCRIPTION OF EMBODIMENTS
FALL PREVENTION ALGORITHM
[0012] Fall prediction is a method intended to estimate fall risk and send alerts before a fall occurs. Current methods mostly rely on wearable devices to collect data on user behaviour and estimate fall risk. Majumder et al. (2013) (10) developed a fall prediction system based on smartphones and smart shoes with pressure sensors. The inertial measurement unit (IMU) data and the pressures from the smart shoes provide gait and balance information, and when an abnormal gait is detected the mobile app sends a notification.
[0013] Compared to wearable devices, depth cameras are less intrusive and avoid the inconvenience of wearing a device and maintaining its battery. Staranowicz et al. (2013) (11) developed a fall prediction system using Kinect devices, extracting gait information from the skeleton data output by the Kinect. Their features include stride length, stride duration and centre-of-mass motion, and the system achieves reasonable measurement accuracy. Tao and Yun (2017) (12) proposed a fall prediction method based on a Recurrent Neural Network (RNN) using Kinect devices, trained to classify activities of daily living and falls. This method achieves an accuracy of 91.7% and can provide a 333 ms warning before the person hits the ground.
[0014] Our fall prediction method is based on recognising the patterns of activity that place patients at risk of a fall, so the chosen features must capture actions that may lead to falling. Changing posture can be a high-risk action for elderly people, as it can lead to postural hypotension (Nyberg, 1997) (13). In research conducted by the Canadian Institutes of Health Research (CIHR) team between 2007 and 2010 (14), 227 falls by elderly people were captured and analysed. The most common cause of falls was the incorrect transfer or shifting of body weight, which accounted for 41% of all captured falls and can occur during ordinary activities such as walking forward, turning, getting up and sitting down. Therefore, for the purpose of fall prediction, we identified features that are closely related to the balance of human body posture.
[0015] Considering the main scenario in which elderly people need help when attempting to sit down or stand up, one feature should capture the transition between sitting and standing postures. This feature is the horizontal distance between the pelvis joint and the centre of the feet joints, which changes significantly if the person is about to lose balance or fall. We therefore choose the speed at which this distance changes as one of the features used to predict fall risk.
[0016] Another important feature is the vertical speed of the pelvis joint. A high vertical speed can occur when the person is about to lose balance or is beginning to fall, and the movements involved may demand more strength than an elderly person possesses. This is therefore also a factor in calculating the risk of falling.
[0017] Finally, the time that a person lies or sits on the floor should be taken into consideration. If someone is lying on the floor for an extended period, the underlying reason for their inability to get up could be a serious health condition such as a heart attack. According to the CIHR team's research, 11% of falls in elderly people lead to collapse or loss of consciousness (14). Our fall prediction algorithm therefore takes these scenarios into account: a pelvis joint located below a specific height is regarded as an indication that the person is on the floor, and the time for which the pelvis joint remains below that height is used as a feature to predict fall risk.
[0018] The three features mentioned above are chosen to predict the fall risk:
    • The speed of change of the horizontal distance between the pelvis joint and the centre of the feet joints
    • The vertical speed of the pelvis joint
    • The length of time for which the pelvis joint is below a specific height
[0019] These three features are weighted to calculate the risk of a fall. When the risk exceeds a preset limit, a low-level alarm is sent to the healthcare professionals involved in the patient's care, as well as the patient's relatives, to alert them to the potential risk.
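The following Python sketch illustrates one way of combining the three prediction features into a weighted risk score. The joint coordinate convention (y up, metres), the weights, the normalisation scales and the alarm threshold are illustrative assumptions only; the specification does not disclose numeric values.

import numpy as np

# Illustrative weights, threshold and "on the floor" height; not taken from the specification.
WEIGHTS = np.array([0.4, 0.4, 0.2])
RISK_THRESHOLD = 0.8
FLOOR_HEIGHT_M = 0.3          # pelvis below this height => treated as "on the floor"

def fall_risk(pelvis_prev, pelvis_curr, feet_centre_prev, feet_centre_curr,
              dt, seconds_on_floor):
    """Weighted fall-risk score in [0, 1] from the three prediction features."""
    # Feature 1: speed of change of the horizontal pelvis-to-feet-centre distance (x, z plane).
    d_prev = np.linalg.norm((pelvis_prev - feet_centre_prev)[[0, 2]])
    d_curr = np.linalg.norm((pelvis_curr - feet_centre_curr)[[0, 2]])
    horizontal_speed = abs(d_curr - d_prev) / dt

    # Feature 2: vertical (y-axis) speed of the pelvis joint.
    vertical_speed = abs(pelvis_curr[1] - pelvis_prev[1]) / dt

    # Feature 3: how long the pelvis has stayed below the floor-height threshold.
    on_floor_time = seconds_on_floor if pelvis_curr[1] < FLOOR_HEIGHT_M else 0.0

    # Normalise each feature into [0, 1] with illustrative scales, then apply the weights.
    features = np.clip([horizontal_speed / 1.0, vertical_speed / 1.5, on_floor_time / 30.0],
                       0.0, 1.0)
    return float(WEIGHTS @ features)

if __name__ == "__main__":
    risk = fall_risk(np.array([0.0, 0.9, 2.0]), np.array([0.15, 0.6, 2.0]),
                     np.array([0.0, 0.1, 2.0]), np.array([0.0, 0.1, 2.0]),
                     dt=0.033, seconds_on_floor=0.0)
    print(f"fall risk: {risk:.2f}, alert: {risk >= RISK_THRESHOLD}")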
[0020] The CIHR team's research (14) also found that trips, hits and slips are common causes of falls in elderly people, accounting for 21%, 11% and 3% respectively of all captured falls. Falls caused by these factors are generally unpredictable and should instead be prevented through attentive care and the elimination of potential hazards in aged care facilities.
DESCRIPTION OF FALL DETECTION ALGORITHM
[0021] The fall detection methods proposed by Gasparrini et al. (2015) (4) are based on heterogeneous data consisting of depth image data and inertial measurement unit (IMU) data, gathered by the joint use of a camera-based system and wearable IMU devices. The camera-based system used was the Microsoft Kinect, which allowed them to obtain the subjects' human body skeleton data. In addition, they mounted IMUs on the wrist and waist of the subject to acquire orientation and acceleration data at these points of interest.
In the first algorithm, the variation in the position of a skeleton joint (the base of the spine), the magnitude of the acceleration measured by the wrist IMU, and the relative angle between gravity and the arm (obtained from the waist IMU) are the three major features used for fall detection. The system recognises a fall when the difference in the vertical position of the spine base joint between two successive frames (about 33 ms) exceeds a threshold of 50 cm, the wrist IMU detects an acceleration greater than 3 g within a 2 s window centred on the instant the first feature is triggered, and the relative angle between the arm and gravity (detected by the waist IMU) remains between 70 and 110 degrees for at least 0.5 s. The algorithm has an average accuracy of 79%.
[0022] The second algorithm takes into consideration the magnitude of the acceleration of the waist IMU and the vertical height of the spine base joint above the floor. Vector mathematics is used to calculate, in real time, the vertical height of the subject's spine base joint above the floor. If the vertical height of the spine base joint is below 20 cm and the waist acceleration obtained by the IMU exceeds 3 g within a 2 s window centred on the instant the first feature is triggered, the system recognises a fall. This algorithm achieves an average accuracy of 99%.
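A minimal Python sketch of the threshold logic described for this second algorithm is shown below: flag a fall when the spine base joint is below 20 cm and the waist IMU reports more than 3 g within a 2 s window centred on that frame. The function and variable names, and the per-frame input format, are assumptions rather than the cited authors' code.

G = 9.81  # m/s^2

def detect_fall_threshold(spine_base_heights, waist_accel_magnitudes, fps=30):
    """spine_base_heights: per-frame height (m) of the spine base joint above the floor.
    waist_accel_magnitudes: per-frame acceleration magnitude (m/s^2) from the waist IMU.
    Returns the frame index at which a fall is confirmed, or None."""
    half_window = fps  # 1 s either side of the trigger frame => 2 s window
    for i, height in enumerate(spine_base_heights):
        if height >= 0.20:                 # first feature: spine base joint below 20 cm
            continue
        lo = max(0, i - half_window)
        hi = min(len(waist_accel_magnitudes), i + half_window + 1)
        if any(a > 3 * G for a in waist_accel_magnitudes[lo:hi]):
            return i
    return None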
[0023] Nizam et al. (2018) (15), on the other hand, proposed a fall detection method based purely on the Microsoft Kinect. The method adopts two basic features for fall detection: the velocity and the height of the subject's skeleton. The algorithm uses the floor plane and the human body skeleton coordinates generated by the Kinect as the basis for computing the velocity and height.
[0024] The team also introduced a fall risk factor into the detection process. Different fall risk levels lead to different orders in which the two basic features (height and velocity) are applied in the fall detection procedure. The fall risk factor is computed from three parameters, step symmetry, trunk sway and arm spread, which are widely used in fall risk assessment. These parameters are determined from the subject's skeleton data obtained from the Kinect sensor.
[0025] In the overall fall detection algorithm, the system calculates the fall risk factor and the height and velocity of the subject. If a high velocity is detected, a potential fall alert is triggered directly for fall confirmation. If the fall risk factor is high, the system recalculates the velocity and height for fall detection. If both the fall risk factor and the velocity are low, the system uses the height and acceleration (the derivative of velocity) to identify a potential fall event. The accuracy of the proposed method is 88.57% on the University of Rzeszow Fall Detection (URFD) dataset.
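The decision flow described above can be paraphrased as the following Python sketch. The ordering follows the text; the threshold values are purely illustrative and are not taken from Nizam et al.'s paper.

def nizam_style_fall_check(risk_factor, velocity, height, acceleration,
                           v_high=1.5, risk_high=0.7, h_low=0.3, a_high=3.0):
    """Return the next action implied by the described decision flow."""
    if velocity > v_high:                              # high velocity -> raise a potential-fall alert
        return "potential fall: confirm"
    if risk_factor > risk_high:                        # high fall-risk factor -> re-evaluate
        return "recompute velocity and height"
    if height < h_low and acceleration > a_high:       # otherwise rely on height and acceleration
        return "potential fall: confirm"
    return "no fall"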
[0026] Our project is based on the Microsoft Azure Kinect depth camera, a device that captures depth frames. A depth frame records, for each pixel, the distance from the camera to the object. A Windows computer collects depth frames from the depth camera using the Microsoft Azure Kinect Software Development Kit in a C++ environment. Once depth frames are collected, skeleton data, including the coordinates of the joints of the human body, is extracted from them. Both the skeleton data and the depth frame are written into a named pipe (interprocess communication) whenever the Python program requests a new frame. The Python program reads the skeleton data and depth frames from the named pipe and sends the depth frames to the server via the Real-Time Messaging Protocol (RTMP). A limited number of depth frames is also stored in a buffer so that a video can be generated later.
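The sketch below illustrates the Python side of this pipeline under stated assumptions: the pipe path and the length-prefixed message framing (4-byte size, then payload) are not defined in the specification and are chosen here only for illustration, as is the 15-second rolling buffer.

import json
import struct
from collections import deque

PIPE_PATH = r"\\.\pipe\capturefall"                 # assumed pipe name on Windows
BUFFER_SECONDS, FPS = 15, 30
depth_buffer = deque(maxlen=BUFFER_SECONDS * FPS)   # rolling buffer of recent depth frames

def read_message(pipe):
    """Read one length-prefixed message: 4-byte little-endian size, then the payload."""
    header = pipe.read(4)
    if len(header) < 4:
        raise EOFError("capture process closed the pipe")
    (size,) = struct.unpack("<I", header)
    return pipe.read(size)

def read_frame(pipe):
    """Return (skeleton, depth_bytes): JSON joint coordinates, then raw depth data."""
    skeleton = json.loads(read_message(pipe).decode("utf-8"))
    depth = read_message(pipe)
    depth_buffer.append(depth)
    return skeleton, depth

if __name__ == "__main__":
    # Requires the C++ capture process to have created the named pipe.
    with open(PIPE_PATH, "rb") as pipe:
        skeleton, depth = read_frame(pipe)
        print(f"joints: {len(skeleton.get('joints', []))}, depth bytes: {len(depth)}")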
[0027] In our designed method, the skeleton data is analysed by the Python program and organised into several features:
    • The distance between the pelvis joint and the floor
    • The vertical speed of the pelvis joint
    • A step symmetry index, calculated from the difference in step length between the two feet
    • A trunk sway index, calculated from the angle by which the trunk rolls away from the floor's normal vector
    • A balance index, calculated from the horizontal distance between the pelvis joint and the centre of both feet
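One way these five features could be computed from two consecutive skeleton frames is sketched below. The joint names and the coordinate convention (y up, metres) are assumptions based on typical body-tracking output, not the patented implementation.

import numpy as np

def detection_features(prev, curr, dt, floor_y=0.0):
    """prev/curr: dicts mapping joint name -> np.array([x, y, z]) in metres."""
    pelvis_prev, pelvis = prev["PELVIS"], curr["PELVIS"]
    l_foot, r_foot = curr["FOOT_LEFT"], curr["FOOT_RIGHT"]
    neck = curr["NECK"]

    # 1. Distance between the pelvis joint and the floor plane.
    pelvis_height = pelvis[1] - floor_y

    # 2. Vertical speed of the pelvis joint.
    vertical_speed = (pelvis[1] - pelvis_prev[1]) / dt

    # 3. Step symmetry index: difference in forward (z) placement of the two feet.
    step_symmetry = abs(l_foot[2] - r_foot[2])

    # 4. Trunk sway index: angle between the pelvis-to-neck vector and the vertical axis.
    trunk = neck - pelvis
    trunk_sway = np.degrees(np.arccos(np.clip(trunk[1] / np.linalg.norm(trunk), -1.0, 1.0)))

    # 5. Balance index: horizontal distance between the pelvis and the centre of both feet.
    feet_centre = (l_foot + r_foot) / 2.0
    balance = np.linalg.norm((pelvis - feet_centre)[[0, 2]])

    return np.array([pelvis_height, vertical_speed, step_symmetry, trunk_sway, balance])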
[0028] Once these features are calculated, they are fed into a trained support vector machine (SVM) to determine whether they indicate a fall. The SVM is trained on the same features extracted from the TST Fall Detection Dataset, and the accuracy of this SVM model is around 90 per cent. When a fall is detected, the Python program sends a message to the server containing the device ID and a 15-second video recording of the fall captured by the camera.
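A hedged sketch of this classification step follows: train an SVM on labelled five-dimensional feature vectors (for example derived offline from the TST Fall Detection Dataset) and apply it per frame or per window at run time. The demo data, kernel choice and hyper-parameters are assumptions.

import numpy as np
from sklearn.pipeline import make_pipeline
from sklearn.preprocessing import StandardScaler
from sklearn.svm import SVC

def train_fall_svm(X, y):
    """X: (n_samples, 5) feature matrix as computed above; y: 1 = fall, 0 = no fall."""
    model = make_pipeline(StandardScaler(), SVC(kernel="rbf", C=1.0, gamma="scale"))
    model.fit(X, y)
    return model

def is_fall(model, features):
    """Classify a single five-dimensional feature vector."""
    return bool(model.predict(features.reshape(1, -1))[0])

if __name__ == "__main__":
    rng = np.random.default_rng(0)
    X_demo = rng.normal(size=(200, 5))           # stand-in for real labelled feature vectors
    y_demo = (X_demo[:, 1] < -1.0).astype(int)   # toy rule: large downward pelvis speed => fall
    model = train_fall_svm(X_demo, y_demo)
    print("fall detected:", is_fall(model, np.array([0.1, -2.0, 0.2, 15.0, 0.4])))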
DETAILED DESCRIPTION OF THE FUNCTIONALITIES
[0029] Fig. 1 shows the overall architecture of the fall detection and prevention system. The system consists of a depth sensor 102, which monitors the movement of the care receiver 101, a local computer 103, and a remote server 104 that connects to every local computer 103 over the network and to a mobile application 105. The main functionality of the system is to help medical staff 208 monitor patient 101 movements and detect falls.
[0030] The fall detection and prevention system provides a feature that allows healthcare workers 208 to monitor the actions of patients 101 in real time by watching the live stream captured by the depth sensors. With cameras installed in different locations (e.g. the bathroom), the mobile application provides a page showing one or more live stream sessions. This allows medical staff to keep an eye on their patients through this panel and possibly anticipate an accident before it happens. Even if all medical staff are too busy to sit in front of the screen, the system reacts automatically to emergency situations: once the camera detects a patient falling, the application sends a notification to the medical staff and the patient's family members so that they can respond immediately. The RTMP transmission protocol is used to send the captured data from the local computer 103 to the server 104.
[0031] Furthermore, the system also displays a real-time prediction of the patient's fall risk level, computed by the local computer 103 using AI and machine learning algorithms. If the predicted risk exceeds a preset threshold, the system automatically sends an alert to the relevant healthcare workers 208 so that they can quickly attend the patient's room to intervene and prevent a potential patient 101 fall.
[0032] For the depth sensor 102 and the local computer 103, the implemented system uses Microsoft Kinect technology; Intel RealSense is an alternative depth sensor option. The local computer connects to the depth sensor by a wired connection, and the fall detection and prevention algorithms run on the local computer.
[0033] The main users of the mobile application 105 are healthcare workers 208. The family members 209 of the care receiver can also use the mobile application, although less information is displayed to them. Once medical staff complete the registration process, they can manage care receivers by adding, modifying and deleting their patients' information, along with their family members' information. The records of the care receivers, medical staff and family members are stored in the database 207 of the server 104.
[0034] The main responsibilities of the server 104 are: 1. data recording; 2. request transferring; 3. video streaming. The server is built using a set of tools and services provided by Amazon Web Services (AWS), which are used to run the server program.
[0035] Preferably, because the data captured by the depth sensor 102 can be sent to the server 104, the captured data can be used as a source of training data to further improve the performance of the machine learning model. A de-identification process removes any sensitive information, which both reduces the risk of the machine learning model overfitting and prevents the system operator from deducing the identity of the person from the collected data. Alternatively, the server can be set up within a local area network (LAN) for security reasons.
[0036] The process of the alarm system, shown in Fig. 2 and Fig. 3, is as follows. When a care receiver 101 falls, the event is captured by the depth sensor 102. The local computer 103 evaluates the situation with the machine learning model and detects the care receiver's fall. After the evaluation, the computer 103 sends a request to the server 104 to notify it of the event. The server program 205 writes the information about the event into the database 207 and searches it for relevant information, such as the device that sent the request, the location of that device, and the care receiver associated with it. The server program 205 also looks up the responsible healthcare workers 208 and the family members 209 who should be notified. Push notifications are then sent to the devices of the healthcare workers 208 and family members 209 using the push notification module 206. The notification is confirmed by the healthcare worker or family member once they have viewed the video capturing the fall.
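The following self-contained Python sketch mirrors this alert flow: record the event, look up the device's care receiver and the people to notify, then push notifications and later mark the event as confirmed. The in-memory dictionaries and the send_push() stub are stand-ins for the real database 207 and push notification module 206; all identifiers and the example URL are hypothetical.

import datetime
import uuid

DEVICES = {"cam-12": {"location": "Room 12", "care_receiver": "A. Smith",
                      "staff": ["nurse-7"], "family": ["family-3"]}}
EVENTS = {}  # event id -> event record (stand-in for the server database)

def send_push(recipient_id, message):
    # Placeholder for the push notification module (e.g. APNs/FCM in a real deployment).
    print(f"push -> {recipient_id}: {message}")

def handle_fall_event(device_id, video_url):
    """Record a fall event reported by a local computer and notify staff and family."""
    device = DEVICES[device_id]
    event_id = str(uuid.uuid4())
    EVENTS[event_id] = {"device": device_id, "video": video_url, "confirmed": False,
                        "time": datetime.datetime.now(datetime.timezone.utc)}
    message = (f"Fall detected for {device['care_receiver']} in {device['location']}. "
               f"Recording: {video_url}")
    for recipient in device["staff"] + device["family"]:
        send_push(recipient, message)
    return event_id

def confirm_event(event_id, viewer_id):
    """Mark the event as confirmed after the recipient has viewed the recording."""
    EVENTS[event_id]["confirmed"] = True
    EVENTS[event_id]["confirmed_by"] = viewer_id

if __name__ == "__main__":
    eid = handle_fall_event("cam-12", "https://example.invalid/clips/abc.mp4")
    confirm_event(eid, "nurse-7")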
IMPROVEMENTS
[0037] Unlike wearable sensors, CaptureFall functions completely automatically. In this way, CaptureFall can provide immediate assistance to patients who may not have the capacity to manually activate an alarm (e.g. those who have been knocked unconscious or severely impaired). It also achieves higher accuracy, as its skeletal tracking system can recognise types of falls that wearable sensors cannot detect, such as braced falls and rolling onto the floor. Furthermore, CaptureFall has a useful function that wearable sensors do not possess: it can assess the risk associated with a fall and only alert nurses if the fall is serious. Using the skeletal tracking system, CaptureFall analyses the movements and positions of the patient's joints and limbs and makes a risk level assessment; the longer the patient remains on the floor after the fall, the higher the risk assessment. The CaptureFall system alerts nurses to the fall once it reaches a certain risk level. This reduces the potential for false positives and therefore the alarm fatigue associated with the devices mentioned previously.
[0038] Like other vision-based devices, CaptureFall uses a camera that streams the patient's room and therefore shares the privacy concerns surrounding such devices. However, footage of patients captured by CaptureFall can be reduced to a skeletal format, protecting the patient's identity. Additionally, unlike many other vision-based devices, the CaptureFall skeletal tracking system can track the patient's body and movements through furniture, allowing falls obscured by furniture to still be picked up and analysed by predicting patient positions. It can also detect falls when multiple people are in the same room, a feature absent from many vision-based devices.
[0039] In comparison to ambient sensors, CaptureFall uses only a single specialised camera in each room, whereas ambient sensors such as pressure sensors require extensive modifications and installations in the target room to be effective. Since both approaches report similar fall-detection accuracy, the CaptureFall system is by far the more cost-effective alternative. Its skeletal tracking also allows the CaptureFall camera to distinguish between humans and objects, unlike some types of ambient sensors.
USAGE
[0040] When implementing CaptureFall in medical institutions, four different stakeholder groups will need to be considered:
    • Hospital operators (hospitals, aged care centres and rehabilitation centres)
    • Hospital staff
    • Patients
    • Legal stakeholders
[0041] With regard to collaborating with hospital operators, several factors will need to be considered. First and foremost, CaptureFall will have to be trialled and its efficacy established before widespread implementation. In doing so, we will have to demonstrate that our product outperforms current fall prevention and detection strategies in terms of both safety and cost-benefit analysis. It is also important to consider where the cameras will be installed, whether in the rooms of patients at high risk of falling or in the corridors of high-risk wards. In a study by researchers looking to implement similar video technology in Germany, hospital operators reported that falls are most problematic in corridors between wards and in remote areas of the institution where patients may leave their rooms to go outside (16). In the late evening there are not enough nurses to patrol these areas, as qualified nursing staff are a scarce resource and are usually occupied with their own tasks around the wards. This can leave night emergencies such as falls in remote areas undetected for critical periods of time. Hence, in addition to installing CaptureFall cameras in patient wards, it may also be of great benefit to install them in remote areas, allowing nursing staff to detect and deal with such emergencies efficiently.
[0042] In addition, the CaptureFall system would need to be implemented in a way that does not hinder the nursing staff's ability to perform their daily tasks. This is especially critical as nurses may be attending to critical emergencies such as medical emergency calls, or performing other procedures such as collecting blood tests, changing catheters or administering medications; any hindrance poses a potential risk to patient safety and the standard of care. Moreover, nurses may be unaware of notifications and alerts regarding detected patient falls. To overcome this, CaptureFall provides a monitoring function that allows healthcare workers at staff stations to monitor patient rooms and receive fall alerts centrally. This allows potential falls to be assessed remotely and a decision to be made as soon as possible on whether action needs to be taken or a false alarm can be dismissed.
[0043] Inevitably, the implementation of such a system creates a high demand for privacy protection for both patients and hospital workers. If footage of a patient in an emergency situation were leaked, this could have severe repercussions for the patient's dignity and would be deleterious to the acceptance of the CaptureFall device. Mechanisms would therefore be put in place to ensure the safe storage of any video footage, and processing of the videos (such as reducing a person's appearance to a stick-figure format) would be undertaken to protect identities and relieve the concerns of patients and hospital staff under this ongoing surveillance. These privacy protection measures would need to be approved by ethics committees to ensure the safe installation of the CaptureFall system in Australian medical institutions. Finally, informed consent would have to be obtained from both hospital workers and patients to ensure that they understand the role of CaptureFall in their responsibilities and safety respectively, the risks and limitations of the product, and the safeguards put in place to mitigate those risks. Staff training and policy would need to be implemented to ensure safe and well-understood use and to maximise patient outcomes.
FUTURE POTENTIAL USES
[0044] In addition to the main goal of fall prevention in hospitals, nursing facilities and private homes, CaptureFall also has the potential to branch out its use in other medical fields. For instance, CaptureFall could prove to be a valuable device in physiotherapy and rehabilitation. Physiotherapists can create specific exercises for their patient and input these into the CaptureFall system. The CaptureFall camera would then capture the movement of skeletal points whilst the patient is conducting these exercises to ensure the correct execution of the training program and evaluate their rehabilitation progress.
[0045] The promise of this type of rehabilitation is demonstrated by a collaborative research project between Seoul National University and Microsoft Research Asia called "Stroke Recovery with Kinect", in which researchers used Microsoft Kinect technology to create a home-rehabilitation system to assist the motor recovery of patients who have experienced strokes. The system uses Microsoft Kinect's three-dimensional camera to evaluate and measure the patient's movements during therapy while also assessing their rehabilitation progress. To reduce the likelihood of patients becoming bored and losing motivation to perform the assigned exercises, the training program designed by the physiotherapist is recast in a game format. For example, one of the research group's three programs is the box-and-block test, which involves patients virtually picking up blocks and putting them into a box within a certain period of time, analysing their gross manual dexterity and motor skills as they perform the task. To encourage patients to continue training, their scores are displayed after they finish the program, motivating them to improve with each consecutive session.
[0046] This would be very efficient and convenient for patients as difficult exercises that would otherwise be best performed with the assistance of physiotherapists in hospitals can be conducted in the comfort of their own homes, empowering patients to become independent in the rehabilitation process. Physiotherapists will also be able to easily track the progress of their patients and alter the program during the training process to effectively aid in the full recovery of their patient.
[0047] Another application of CaptureFall lies in the detection of certain gaits, which can be indicative of particular conditions or pathologies. For example, a person who suffers an acute ischaemic stroke may exhibit a hemiplegic gait as a result of a unilateral brain lesion. Since the CaptureFall system uses skeletal tracking, walking patterns characteristic of certain medical conditions can be learned by the device via machine learning, allowing CaptureFall to analyse patients' walking patterns and flag potential diagnoses that may warrant medical attention. This enables a more rapid alert and call for help in emergencies where minutes and seconds can alter a patient's prognosis.
CITATION LIST
1. World Health Organization. Falls fact sheet. January 2018. Available: https://www.who.int/news-room/fact-sheets/detail/falls [Accessed 05.03.2021]
2. Australian Institute of Health and Welfare. Trends in hospitalised injury due to falls in older people 2007-08 to 2016-17. Canberra: AIHW; 2019.
3. Tsai T-H, Hsu C-W. Implementation of fall detection system based on 3D skeleton for deep learning technique. IEEE Access. 2019;7:153049-59.
4. Gasparrini S, Cippitelli E, Gambi E, Spinsante S, Wåhslén J, Orhan I, et al., editors. Proposal and experimental evaluation of fall detection solution based on wearable and depth data fusion. International Conference on ICT Innovations; 2015: Springer.
5. Australian Institute of Health and Welfare. Older Australia at a glance. Canberra: AIHW; 2018.
6. Bourke A, O'Brien J, Lyons G. Evaluation of a threshold-based tri-axial accelerometer fall detection algorithm. Gait & Posture. 2007;26(2):194-9.
7. Shu F, Shu J. An eight-camera fall detection system using human fall pattern recognition via machine learning by a low-cost android box. Scientific Reports. 2021;11(1):1-17.
8. De Miguel K, Brunete A, Hernando M, Gambao E. Home camera-based fall detection system for the elderly. Sensors. 2017;17(12):2864.
9. Mileski M, Brooks M, Topinka JB, Hamilton G, Land C, Mitchell T, et al., editors. Alarming and/or alerting device effectiveness in reducing falls in long-term care (LTC) facilities? A systematic review. Healthcare; 2019: Multidisciplinary Digital Publishing Institute.
10. Majumder AJA, Zerin I, Uddin M, Ahamed SI, Smith RO. SmartPrediction: A real-time smartphone-based fall risk prediction and prevention system. Proceedings of the 2013 Research in Adaptive and Convergent Systems; 2013. p. 434-9.
11. Staranowicz A, Brown GR, Mariottini G-L, editors. Evaluating the accuracy of a mobile Kinect-based gait-monitoring system for fall prediction. Proceedings of the 6th International Conference on PErvasive Technologies Related to Assistive Environments; 2013.
12. Tao X, Yun Z. Fall prediction based on biomechanics equilibrium using Kinect. International Journal of Distributed Sensor Networks. 2017;13(4):1550147717703257.
13. Nyberg L, Gustafson Y. Fall prediction index for patients in stroke rehabilitation. Stroke. 1997;28(4):716-21.
14. Robinovitch SN, Feldman F, Yang Y, Schonnop R, Leung PM, Sarraf T, et al. Video capture of the circumstances of falls in elderly people residing in long-term care: an observational study. The Lancet. 2013;381(9860):47-54.
15. Nizam Y, Mohd MNH, Jamil M. Development of a user-adaptable human fall detection based on fall risk levels using depth sensor. Sensors. 2018;18(7):2260.
16. Krempel E, Birnstill P, Beyerer J. A privacy-aware fall detection system for hospitals and nursing facilities. European Journal for Security Research. 2017;2(2):83-95.

Claims (5)

1. A method of fall detection and/or prevention that is able to analyse the potential fall risk of a care receiver and detect whether the care receiver has fallen to the ground. The body tracking algorithm comprises: data recorded from depth sensing devices as input; a method of processing depth data and extracting data on body joints; and a machine learning model which can analyse the data of a body skeleton sequence and output the risk of falling based on the movement of body joints.
2. A fall detection and/or prevention system, comprising:
one or more depth sensors, each sensor connected with an infrared camera that is configured to capture depth information and record video in low light environments;
one or more local computers, where each computer is connected with one or more depth sensors and runs a fall detection algorithm that uses data from the depth sensors to determine if a fall has occurred, the fall detection algorithm being configured to:
extract the care receiver's biometric data and the spatial coordinates of the care receiver's body joints from the data captured by the depth sensors;
analyse the care receiver's body movement from all joint coordinates of the care receiver;
analyse the fall risks of the care receiver, based on the care receiver's current actions;
detect whether a fall has occurred;
report the fall event, with a short video about the course of the event, to the central server;
a central server, which can be either cloud-based or run in a local area network, to store the captured video data and the data of the care receivers and hospital staff, the central server being configured to:
summarise all fall events sent from the local computers;
send an alarm alert to care givers together with a short video recording showing the fall event that has taken place;
a mobile application which is configured to:
receive notifications sent from the server;
allow the user of the application to check the video recording and send alerts to the relevant healthcare worker.
3. The fall detection and/or prevention system as claimed in claim 2, wherein the sensors are further configured with a speaker and a microphone, and local computers are further configured to allow hospital staff to communicate with the care receiver by telecom using the speaker and microphone.
4. The fall detection and/or prevention system as claimed in claim 2, wherein the server is further configured to process video live stream data for real time video transmission, and the mobile application is configured to show video live streams.
5. A method of training the machine learning model which is used in the fall detection algorithm claimed in claim 1, wherein the training method is based on a series of body joint data, and the video recorded by the depth sensor can be used as training data after de-identification.
AU2021101323A 2021-02-04 2021-03-14 Method for fall prevention, fall detection and electronic fall event alert system for aged care facilities Active AU2021101323A4 (en)

Applications Claiming Priority (2)

Application Number Priority Date Filing Date Title
AU2021900262A AU2021900262A0 (en) 2021-02-04 Method for fall prediction, fall detection and electronic fall event alert system for aged care facilities
AU2021900262 2021-02-04

Publications (1)

Publication Number Publication Date
AU2021101323A4 true AU2021101323A4 (en) 2021-05-06

Family

ID=75714292

Family Applications (1)

Application Number Title Priority Date Filing Date
AU2021101323A Active AU2021101323A4 (en) 2021-02-04 2021-03-14 Method for fall prevention, fall detection and electronic fall event alert system for aged care facilities

Country Status (1)

Country Link
AU (1) AU2021101323A4 (en)

Cited By (1)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN113537051A (en) * 2021-07-14 2021-10-22 安徽炬视科技有限公司 Marine personnel safety operation monitoring algorithm under complex large scene


Similar Documents

Publication Publication Date Title
US20210049887A1 (en) Fall detection and reporting technology
El-Bendary et al. Fall detection and prevention for the elderly: A review of trends and challenges
US11282367B1 (en) System and methods for safety, security, and well-being of individuals
Fu et al. Fall detection using an address-event temporal contrast vision sensor
US10262517B2 (en) Real-time awareness of environmental hazards for fall prevention
JP2015103116A (en) Automatic report system
KR102413893B1 (en) Non-face-to-face non-contact fall detection system based on skeleton vector and method therefor
Potter et al. Evaluation of sensor technology to detect fall risk and prevent falls in acute care
AU2022203004A1 (en) System for recording, analyzing risk(s) of accident(s) or need of assistance and providing real-time warning(s) based on continuous sensor signals
CN115116133A (en) Abnormal behavior detection system and method for monitoring solitary old people
AU2021101323A4 (en) Method for fall prevention, fall detection and electronic fall event alert system for aged care facilities
Amir et al. Real-time threshold-based fall detection system using wearable IoT
JP2020145595A (en) Viewing or monitoring system, or program
Chiu et al. A convolutional neural networks approach with infrared array sensor for bed-exit detection
Van Wieringen et al. Real-time signal processing of accelerometer data for wearable medical patient monitoring devices
Ojetola Detection of human falls using wearable sensors
KR102608941B1 (en) Abnormal behavior detecting system using artificial intelligence
JP2014092945A (en) Physical condition determination system and physical condition determination method
KR102364424B1 (en) Remote fall detection and determination system using an image collection device
Khawandi et al. Integrated monitoring system for fall detection in elderly
Silapasuphakornwong et al. A conceptual framework for an elder-supported smart home
Mendulkar et al. A survey on efficient human fall detection system
Huq et al. Evaluation of tri-axial accelerometery data of falls for elderly through smart phone
KR102558653B1 (en) Smart care notification and method performing theof
AU2021104545A4 (en) Smart IoT based Third Eye for Protection from Abnormal Activities

Legal Events

Date Code Title Description
FGI Letters patent sealed or granted (innovation patent)
PC Assignment registered

Owner name: TOP AI RESEARCH CENTRE PTY LTD

Free format text: FORMER OWNER(S): AIBUILD PTY LTD