GB2605623A - Methods and tools for remote frailty and gait analysis - Google Patents

Methods and tools for remote frailty and gait analysis

Info

Publication number
GB2605623A
Authority
GB
United Kingdom
Prior art keywords
frailty
analysis software
gait
pose estimation
human
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Pending
Application number
GB2104966.3A
Other versions
GB202104966D0 (en)
Inventor
Gkouzionis Ioannis
Mccarthy Ben
Boyd Stephen
Bhatta Devaki
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
Perseptive Ltd
Original Assignee
Perseptive Ltd
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by Perseptive Ltd filed Critical Perseptive Ltd
Priority to GB2104966.3A priority Critical patent/GB2605623A/en
Publication of GB202104966D0 publication Critical patent/GB202104966D0/en
Publication of GB2605623A publication Critical patent/GB2605623A/en


Classifications

    • A HUMAN NECESSITIES
    • A61 MEDICAL OR VETERINARY SCIENCE; HYGIENE
    • A61B DIAGNOSIS; SURGERY; IDENTIFICATION
    • A61B5/00 Measuring for diagnostic purposes; Identification of persons
    • A61B5/72 Signal processing specially adapted for physiological signals or for diagnostic purposes
    • A61B5/7235 Details of waveform analysis
    • A61B5/7264 Classification of physiological signals or data, e.g. using neural networks, statistical classifiers, expert systems or fuzzy systems
    • A HUMAN NECESSITIES
    • A61 MEDICAL OR VETERINARY SCIENCE; HYGIENE
    • A61B DIAGNOSIS; SURGERY; IDENTIFICATION
    • A61B5/00 Measuring for diagnostic purposes; Identification of persons
    • A61B5/103 Detecting, measuring or recording devices for testing the shape, pattern, colour, size or movement of the body or parts thereof, for diagnostic purposes
    • A61B5/11 Measuring movement of the entire body or parts thereof, e.g. head or hand tremor, mobility of a limb
    • A61B5/112 Gait analysis
    • A HUMAN NECESSITIES
    • A61 MEDICAL OR VETERINARY SCIENCE; HYGIENE
    • A61B DIAGNOSIS; SURGERY; IDENTIFICATION
    • A61B5/00 Measuring for diagnostic purposes; Identification of persons
    • A61B5/103 Detecting, measuring or recording devices for testing the shape, pattern, colour, size or movement of the body or parts thereof, for diagnostic purposes
    • A61B5/11 Measuring movement of the entire body or parts thereof, e.g. head or hand tremor, mobility of a limb
    • A61B5/1126 Measuring movement of the entire body or parts thereof, e.g. head or hand tremor, mobility of a limb using a particular sensing technique
    • A61B5/1128 Measuring movement of the entire body or parts thereof, e.g. head or hand tremor, mobility of a limb using a particular sensing technique using image analysis
    • G PHYSICS
    • G06 COMPUTING; CALCULATING OR COUNTING
    • G06T IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
    • G06T7/00 Image analysis
    • G06T7/0002 Inspection of images, e.g. flaw detection
    • G06T7/0012 Biomedical image inspection
    • G PHYSICS
    • G06 COMPUTING; CALCULATING OR COUNTING
    • G06V IMAGE OR VIDEO RECOGNITION OR UNDERSTANDING
    • G06V10/00 Arrangements for image or video recognition or understanding
    • G06V10/70 Arrangements for image or video recognition or understanding using pattern recognition or machine learning
    • G06V10/82 Arrangements for image or video recognition or understanding using pattern recognition or machine learning using neural networks
    • G PHYSICS
    • G06 COMPUTING; CALCULATING OR COUNTING
    • G06V IMAGE OR VIDEO RECOGNITION OR UNDERSTANDING
    • G06V40/00 Recognition of biometric, human-related or animal-related patterns in image or video data
    • G06V40/20 Movements or behaviour, e.g. gesture recognition
    • G06V40/23 Recognition of whole body movements, e.g. for sport training
    • G06V40/25 Recognition of walking or running movements, e.g. gait recognition
    • G PHYSICS
    • G16 INFORMATION AND COMMUNICATION TECHNOLOGY [ICT] SPECIALLY ADAPTED FOR SPECIFIC APPLICATION FIELDS
    • G16H HEALTHCARE INFORMATICS, i.e. INFORMATION AND COMMUNICATION TECHNOLOGY [ICT] SPECIALLY ADAPTED FOR THE HANDLING OR PROCESSING OF MEDICAL OR HEALTHCARE DATA
    • G16H40/00 ICT specially adapted for the management or administration of healthcare resources or facilities; ICT specially adapted for the management or operation of medical equipment or devices
    • G16H40/60 ICT specially adapted for the management or administration of healthcare resources or facilities; ICT specially adapted for the management or operation of medical equipment or devices for the operation of medical equipment or devices
    • G16H40/67 ICT specially adapted for the management or administration of healthcare resources or facilities; ICT specially adapted for the management or operation of medical equipment or devices for the operation of medical equipment or devices for remote operation
    • G PHYSICS
    • G16 INFORMATION AND COMMUNICATION TECHNOLOGY [ICT] SPECIALLY ADAPTED FOR SPECIFIC APPLICATION FIELDS
    • G16H HEALTHCARE INFORMATICS, i.e. INFORMATION AND COMMUNICATION TECHNOLOGY [ICT] SPECIALLY ADAPTED FOR THE HANDLING OR PROCESSING OF MEDICAL OR HEALTHCARE DATA
    • G16H50/00 ICT specially adapted for medical diagnosis, medical simulation or medical data mining; ICT specially adapted for detecting, monitoring or modelling epidemics or pandemics
    • G16H50/20 ICT specially adapted for medical diagnosis, medical simulation or medical data mining; ICT specially adapted for detecting, monitoring or modelling epidemics or pandemics for computer-aided diagnosis, e.g. based on medical expert systems

Abstract

A motion capture system comprising a camera and computer performs gait analysis of a person, using machine learning (artificial intelligence) to automatically diagnose frailty. Video of the patient is taken as input, and human pose estimation software retrieves the coordinates of body parts, which are then transformed into signals and movement histograms used as feature descriptors. The camera and software operate at a high frame rate, leading to greater accuracy. No marker-based motion capture (i.e., wearable markers/devices) is needed, making the method of diagnosing frailty less invasive.

Description

Methods and Tools for Remote Frailty and Gait Analysis
This invention relates to a software-based solution and workflow methodology for health monitoring and intervention in frailty, applying machine vision and indoor positioning technologies.
Figure 1 shows the complete frailty and gait analysis system tools.
Figure 2 shows the exemplary components of the frailty and gait analysis method structure.
Gait can be defined as a manner or style of walking. Studies assert that every individual has a unique gait pattern, which has led gait to be considered a new biometric feature. Gait analysis is the systematic study of human walking for recognizing gait pattern abnormalities, postulating their causes, and proposing suitable treatments. Gait analysis is commonly used in clinical applications to recognize a health problem or monitor a patient's recovery status. Traditional clinical gait analysis is performed by clinicians who observe a patient's gait characteristics while he or she is walking. However, this method is subjective and depends on the experience and judgment of the clinician. As a result, it can lead to confusion and negatively affect the diagnosis and treatment of pathologies.
The process of clinical gait analysis can be facilitated through the use of new technologies, which allow objective measurement and reduce the confusion and error margin of subjective methods.
There has been extensive research in the recent past to develop methodologies using different techniques for gait analysis. Previously, methods based on silhouettes, such as the Gait Energy Image (GEI), were employed to solve the gait analysis problem. Although these methods have shown some advancement, a proper gait analysis mechanism remains unavailable for subjects where problematic constraints such as view angle, walking speed, clothing, surface, carrying status, footwear and elapsed time are considered and minimized in an efficacious way.
To address these needs, the present invention proposes an automatic system for analysing key clinically validated aspects of physical frailty, including gait speed, gait instability, waddling gait, using props to stabilize gait, decreasing stride length, widened gait, reduced or abnormal cadence, decreased foot-ground clearance and the time taken to rise out of a chair. The invention analyses and classifies gait patterns using digital cameras as the only required equipment and provides a tool for constant and ubiquitous gait monitoring of patients and elderly people while living in their homes. Such a tool can additionally be used on patients with orthopedic problems or Parkinson's disease, as well as on post-stroke patients, to evaluate their gait and allow medical professionals to offer personalized treatment to individuals suffering from varying degrees of gait degeneration.
Human pose estimation is combined with a rule-based method for gait analysis and recognition. This information is processed to generate signals and movement histograms, which are used as features robust to the aforementioned constraints. The reason to use human pose estimation for gait analysis is that, in pose estimation, a deep learning-based Convolutional Neural Network (CNN) detects the body points of a person without concern for any problematic constraint. The algorithm used in this invention takes an RGB video as input and produces the two-dimensional (2D) locations of anatomical key points, such as knees, elbows and hips, for each person in every frame of the video. This provides skeletal images in which body points are joined to form a skeleton of a person. Only these skeletal images are further used for classification and gait analysis.
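By way of illustration only, a minimal sketch of this per-frame key-point extraction, assuming MediaPipe Pose as a stand-in estimator (the disclosure does not name a specific CNN):

    import cv2
    import mediapipe as mp

    def extract_keypoints(video_path):
        """Return per-frame pixel (x, y) coordinates of anatomical key points."""
        pose = mp.solutions.pose.Pose(static_image_mode=False)
        cap = cv2.VideoCapture(video_path)
        keypoints = []
        while True:
            ok, frame = cap.read()
            if not ok:
                break
            h, w = frame.shape[:2]
            result = pose.process(cv2.cvtColor(frame, cv2.COLOR_BGR2RGB))
            if result.pose_landmarks:  # skip frames where no person is detected
                keypoints.append([(lm.x * w, lm.y * h)
                                  for lm in result.pose_landmarks.landmark])
        cap.release()
        pose.close()
        return keypoints

Joining consecutive key points of each returned skeleton (hip to knee, knee to ankle, and so on) yields the skeletal images used downstream.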
In Figure 1: The Patient Frailty Analysis System may be configured to analyse input data (e.g., image data, pose metrics) from the Edge MV Device in order to identify and quantify a range of motion of a body part or limb and to assist a clinician with the diagnosis.
The Patient Frailty Analysis System may include a display or display device. The display may be a screen. The display screen may or may not be a touchscreen. The display may be configured to show a user interface (UI) or a graphical user interface (GUI) rendered through an application (e.g., via an application programming interface (API) executed on the Patient Frailty Analysis System). The GUI may show images, charts and analytics relating to the gait. The UI may be configured for representing and delivering analytics, sensor data (e.g., video) and processed data to a user (e.g., a clinician). A user may navigate within the GUI through the application. For example, the user may select a link by directly touching the screen (e.g., a touchscreen). Alternatively, the user may select a portion of an image with the aid of a user interactive device (e.g., mouse, joystick, keyboard, trackball, touchpad, or any other device). The Patient Frailty Analysis System would provide a front end to the API to review the patient's results in detail, compare them to previous tests, track improvements, etc.
Also in Figure 1: The Edge MV (Machine Vision) Device includes both hardware and software elements. The Edge MV Device may contain a camera or imaging sensor operably coupled to an embedded computing board. The Edge MV Device can be controlled by an application/software configured to take image(s) or video(s) of the patient. The camera is able to capture dynamic image data (e.g., video). The camera may comprise optical elements (e.g., lens, filters). The camera may capture colour (RGB) images. The camera may be a monocular camera, and images of the patient may be taken from a single view/angle.
The Edge MV Device is configured to obtain image data to track the motion and posture of a patient. As described herein, computer vision and deep learning techniques may be used to reconstruct the pose using 2D imaging data. The Edge MV Device may include a display. The display may be a screen. The display screen may or may not be a touchscreen. The display may be configured to show a user interface (UI) or a graphical user interface (GUI) rendered through an application (e.g., via an application programming interface (API) executed on the Edge MV Device). The GUI may assist the clinician with the management of the patient's gait test. The Edge MV Device may employ a machine learning and computer vision-based pose estimation algorithm to provide accurate quantification of range of motion based on image data. The pose estimation algorithm may be capable of quantifying range of motion using 2D image data with improved precision and accuracy. The pose estimation algorithm can be any type of machine learning network, such as a neural network. Examples of neural networks include a deep neural network, a convolutional neural network (CNN) and a recurrent neural network (RNN). The machine learning algorithm may comprise one or more of the following: a support vector machine (SVM), a naive Bayes classification, a linear regression, a quantile regression, a logistic regression, a random forest, a neural network, a CNN, an RNN, a gradient-boosted classifier or regressor, or another supervised or unsupervised machine learning algorithm (e.g., a generative adversarial network (GAN), Cycle-GAN, etc.).
As illustrated in Figure 1: The Network itself may be a network configured to provide communication between the various components illustrated in Figure 1, which include the illustrated Cloud API and DB. In this context, Cloud APIs are application programming interfaces used to build applications in the cloud computing market. Cloud APIs allow software to request data and computations from one or more services through a direct or indirect interface. The DB element illustrated in Figure 1 is a generic term for any interfacing database.
In Figure 1: The Cloud API & DB includes a server in a data network (e.g., a cloud computing network) and can be programmed to accept requests (e.g., HTTP, or other protocols that can initiate data transmission) from a computing device (e.g., the Edge MV Device and/or the Patient Frailty Analysis System) and to serve the computing device with the requested data.
The Cloud API & DB may include a data management system that may construct the database for fast and efficient data retrieval, query and delivery. The databases may store, for example, raw data collected by the camera located on the Edge MV Device. The databases may also store user information, historical data patterns, data relating to testing, medical records, analytics, user input (e.g., statements or comments indicative of how the user is feeling at different points in time, etc.), and so forth.
The Cloud API & DB essentially handles all of the data, accepting new data and allowing it to be retrieved at a subsequently required point in time.
The Network as illustrated in Figure 1 may be implemented as one or more networks that connect devices and/or components in the network layout for allowing communication between them.
For example, the Patient Frailty Analysis System, the Edge MV Device and the Cloud API & DB may be in operable communication with one another over the network. Direct communications may be provided between two or more of the above components; direct communications may occur without requiring any intermediary device or network. Indirect communications may be provided between two or more of the above components; indirect communications may occur with the aid of one or more intermediary devices or networks. For instance, indirect communications may utilize a telecommunications network. Indirect communications may be performed with the aid of one or more routers, communication towers, satellites, or any other intermediary device or network. Examples of types of communications include (but are not limited to) communications via the Internet, Local Area Networks (LANs), Wide Area Networks (WANs), Bluetooth, Near Field Communication (NFC) technologies, networks based on mobile data protocols such as General Packet Radio Services (GPRS), GSM, Enhanced Data GSM Environment (EDGE), 3G, 4G, 5G or Long Term Evolution (LTE) protocols, Infra-Red (IR) communication technologies, and/or Wi-Fi, and may be wireless, wired, or a combination thereof.
In essence, the Network itself may be wireless, wired, and/or a combination thereof.
Figure 2 shows the exemplary components and methodology of the Patient Frailty Analysis System.
The system may comprise a Video Capture module, a Pose Estimation module, a Pose Post-Processing module, an Activity Recognition module, a Gait Analysis module and an API Upload module.
The Video Capture module provides the video of the patient's gait during a test captured by the camera of the Edge MV Device.
The Pose Estimation module is configured to analyse the video stream from the Video Capture module to quantify a range of motion in real time using the algorithms described above.
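As a sketch of such quantification, a joint angle (e.g., knee flexion) can be computed geometrically from three 2D key points; the construction below is a standard one assumed for illustration, not taken from the disclosure:

    import numpy as np

    def joint_angle(a, b, c):
        """Angle in degrees at joint b formed by points a-b-c (e.g., hip-knee-ankle)."""
        ba = np.asarray(a, float) - np.asarray(b, float)
        bc = np.asarray(c, float) - np.asarray(b, float)
        cos = ba @ bc / (np.linalg.norm(ba) * np.linalg.norm(bc) + 1e-9)
        return float(np.degrees(np.arccos(np.clip(cos, -1.0, 1.0))))

    # Range of motion over a recording: max(angles) - min(angles), where
    # angles = [joint_angle(hip[t], knee[t], ankle[t]) for each frame t].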
The Pose Post-Processing module filters the output of the Pose Estimation module. For example, a Kalman filter is used to track the joint key points estimated by the previous module, and a simple filter is used to remove bad poses. Moreover, person tracking is applied to differentiate between people in a frame of the video stream.
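A minimal constant-velocity Kalman filter for smoothing one tracked key point is sketched below; the state model and noise levels are illustrative assumptions, as the disclosure does not specify them:

    import numpy as np

    class KeypointKalman:
        """Constant-velocity Kalman filter for one 2D key point; state = (x, y, vx, vy)."""
        def __init__(self, q=1e-2, r=1.0):
            self.x = np.zeros(4)                    # state estimate
            self.P = np.eye(4) * 10.0               # state covariance
            self.F = np.eye(4)                      # transition: position += velocity
            self.F[0, 2] = self.F[1, 3] = 1.0
            self.H = np.eye(2, 4)                   # we observe (x, y) only
            self.Q = np.eye(4) * q                  # process noise
            self.R = np.eye(2) * r                  # measurement noise

        def update(self, z):
            # Predict
            self.x = self.F @ self.x
            self.P = self.F @ self.P @ self.F.T + self.Q
            # Correct with the measured key-point position z = (x, y)
            innov = np.asarray(z, float) - self.H @ self.x
            S = self.H @ self.P @ self.H.T + self.R
            K = self.P @ self.H.T @ np.linalg.inv(S)
            self.x = self.x + K @ innov
            self.P = (np.eye(4) - K @ self.H) @ self.P
            return self.x[:2]                       # smoothed (x, y)

The "simple filter" for bad poses could, for instance, drop frames whose key-point confidence falls below a threshold.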
The Activity Recognition module is used for gait recognition and provides information about the gait activity of the patient (e.g., walking, standing, sitting).
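Purely as a sketch, a crude rule-based labelling of the activities named above from key-point trajectories might look as follows; the thresholds and the calibrated standing hip height are invented for illustration and are not part of the disclosure:

    import numpy as np

    def classify_activity(hip_y, ankle_x, standing_hip_y, fps):
        """Label each frame walking/standing/sitting from pixel trajectories.
        Note: a larger y coordinate means lower in the image."""
        speed = np.abs(np.gradient(np.asarray(ankle_x, float))) * fps  # px/s
        labels = []
        for t, hy in enumerate(hip_y):
            if hy > standing_hip_y * 1.25:      # hip well below standing height
                labels.append("sitting")
            elif speed[t] > 40:                 # sustained horizontal motion
                labels.append("walking")
            else:
                labels.append("standing")
        return labels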
The Gait Analysis module takes the pose data produced by the previous modules and turns it into useful metrics that describe a patient's gait.
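One plausible way to derive such metrics, assuming heel strikes are approximated by peaks in the inter-ankle distance and that a pixels-per-metre calibration is available (both assumptions for illustration only):

    import numpy as np
    from scipy.signal import find_peaks

    def gait_metrics(left_ankle_x, right_ankle_x, fps, px_per_metre):
        """Cadence, mean step length and gait speed from ankle x-trajectories."""
        sep = np.abs(np.asarray(left_ankle_x, float) - np.asarray(right_ankle_x, float))
        strikes, _ = find_peaks(sep, distance=int(0.3 * fps))  # >= 0.3 s between steps
        duration = len(sep) / fps                              # seconds
        steps = sep[strikes] / px_per_metre                    # metres per step
        return {
            "cadence_steps_per_min": 60.0 * len(strikes) / duration,
            "mean_step_length_m": float(steps.mean()) if len(strikes) else 0.0,
            "gait_speed_m_per_s": float(steps.sum()) / duration,
        }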
The API Upload module takes the patient data from a test (e.g., the calculated metrics) and sends it to the API, which stores it in a database.
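A sketch of what this upload might look like over HTTP; the endpoint URL, payload shape and bearer-token authentication are hypothetical, as the disclosure states only that the API stores the results:

    import requests

    API_URL = "https://cloud.example.com/api/v1/gait-tests"  # hypothetical endpoint

    def upload_results(patient_id, metrics, token):
        """POST the calculated gait metrics to the cloud API for storage."""
        resp = requests.post(
            API_URL,
            json={"patient_id": patient_id, "metrics": metrics},
            headers={"Authorization": f"Bearer {token}"},
            timeout=10,
        )
        resp.raise_for_status()                 # surface transport/server errors
        return resp.json()                      # stored record, as returned by the API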

Claims (6)

  1. Human gait analysis software that can automatically diagnose the onset of physical frailty in clinical patients using machine learning-based human pose estimation and biometric analysis.
  2. The frailty analysis software operates at a high frame rate; thus, issues like frame skipping, which reduce the accuracy of pose estimation and tracking, are avoided.
  3. The frailty analysis software does not comprise a marker-based motion capture method, in which a multitude of markers must be carefully positioned on an individual's body. Such marker-based methods are time-consuming and fatiguing, are not suitable for real-life non-invasive applications, and require expensive equipment.
  4. The frailty analysis software is not based on the human silhouette, as that method suffers from many factors such as movement in the scene, clothing and carrying conditions.
  5. The human pose estimation algorithm used in the frailty analysis software does not get trapped in local maxima, as happens with the Iterative Closest Point (ICP) algorithm, where the initial configuration is critical.
  6. The frailty analysis software does not make use of any Inertial Measurement Unit (IMU) sensors or wearable devices to measure patient motion, which increase the complexity of the processing and disturb the in-clinic patient experience.
GB2104966.3A 2021-04-07 2021-04-07 Methods and tools for remote frailty and gait analysis Pending GB2605623A (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
GB2104966.3A GB2605623A (en) 2021-04-07 2021-04-07 Methods and tools for remote frailty and gait analysis

Applications Claiming Priority (1)

Application Number Priority Date Filing Date Title
GB2104966.3A GB2605623A (en) 2021-04-07 2021-04-07 Methods and tools for remote frailty and gait analysis

Publications (2)

Publication Number Publication Date
GB202104966D0 GB202104966D0 (en) 2021-05-19
GB2605623A (en) 2022-10-12

Family

ID=75883705

Family Applications (1)

Application Number Title Priority Date Filing Date
GB2104966.3A Pending GB2605623A (en) 2021-04-07 2021-04-07 Methods and tools for remote frailty and gait analysis

Country Status (1)

Country Link
GB (1) GB2605623A (en)

Citations (4)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US20070015995A1 (en) * 1998-09-14 2007-01-18 Philipp Lang Joint and cartilage diagnosis, assessment and modeling
CN101807245A (en) * 2010-03-02 2010-08-18 天津大学 Artificial neural network-based multi-source gait feature extraction and identification method
US20180342329A1 (en) * 2017-05-24 2018-11-29 Happie Home, Inc. Happie home system
US20190156496A1 (en) * 2017-11-21 2019-05-23 Reliance Core Consulting LLC Methods, systems, apparatuses and devices for facilitating motion analysis in an environment


Also Published As

Publication number Publication date
GB202104966D0 (en) 2021-05-19

Similar Documents

Publication Publication Date Title
US20210059565A1 (en) Gait-based assessment of neurodegeneration
US9996739B2 (en) System and method for automatic gait cycle segmentation
Lawal et al. Deep human activity recognition using wearable sensors
CN108553081B (en) Diagnosis system based on tongue fur image
Chaaraoui et al. Abnormal gait detection with RGB-D devices using joint motion history features
CN111933275A (en) Depression evaluation system based on eye movement and facial expression
CN113728394A (en) Scoring metrics for physical activity performance and training
Loureiro et al. Using a skeleton gait energy image for pathological gait classification
Eichler et al. Non-invasive motion analysis for stroke rehabilitation using off the shelf 3d sensors
Kupryjanow et al. Updrs tests for diagnosis of parkinson's disease employing virtual-touchpad
Zhang et al. Comparison of OpenPose and HyperPose artificial intelligence models for analysis of hand-held smartphone videos
Romeo et al. Video based mobility monitoring of elderly people using deep learning models
Ivorra et al. Azure Kinect body tracking under review for the specific case of upper limb exercises
US11642046B2 (en) System and method for shoulder proprioceptive analysis
KR20220106026A (en) Apparatus and method for diagnosing disease
Ahmed et al. Kalman filter-based noise reduction framework for posture estimation using depth sensor
Soltaninejad et al. Body movement monitoring for Parkinson’s disease patients using a smart sensor based non-invasive technique
CN116543455A (en) Method, equipment and medium for establishing parkinsonism gait damage assessment model and using same
GB2605623A (en) Methods and tools for remote frailty and gait analysis
Sethi et al. Multi‐feature gait analysis approach using deep learning in constraint‐free environment
Albuquerque et al. Remote Pathological Gait Classification System
Alcaraz et al. Mobile quantification and therapy course tracking for gait rehabilitation
Khokhlova et al. Kinematic covariance based abnormal gait detection
JP7169213B2 (en) Physical health video analysis device, method and system
Katiyar et al. Clinical gait data analysis based on spatio-temporal features