GB2605654A - Remote physiotherapy application - Google Patents


Info

Publication number
GB2605654A
GB2605654A (application GB2105143.8A / GB202105143A)
Authority
GB
United Kingdom
Prior art keywords
exercise
data
user
patient
performance
Prior art date
Legal status
Pending
Application number
GB2105143.8A
Other versions
GB202105143D0 (en)
Inventor
Ainsworth Matthew
Mccarthy Ben
Boyd Stephen
Bhatta Devaki
Current Assignee
Perseptive Ltd
Original Assignee
Perseptive Ltd
Priority date
Filing date
Publication date
Application filed by Perseptive Ltd filed Critical Perseptive Ltd
Priority to GB2105143.8A
Publication of GB202105143D0
Publication of GB2605654A
Legal status: Pending


Classifications

    • G PHYSICS
    • G16 INFORMATION AND COMMUNICATION TECHNOLOGY [ICT] SPECIALLY ADAPTED FOR SPECIFIC APPLICATION FIELDS
    • G16H HEALTHCARE INFORMATICS, i.e. INFORMATION AND COMMUNICATION TECHNOLOGY [ICT] SPECIALLY ADAPTED FOR THE HANDLING OR PROCESSING OF MEDICAL OR HEALTHCARE DATA
    • G16H20/30 ICT specially adapted for therapies or health-improving plans relating to physical therapies or activities, e.g. physiotherapy, acupressure or exercising
    • G16H15/00 ICT specially adapted for medical reports, e.g. generation or transmission thereof
    • G16H30/20 ICT specially adapted for handling medical images, e.g. DICOM, HL7 or PACS
    • G16H30/40 ICT specially adapted for processing medical images, e.g. editing
    • G16H40/67 ICT specially adapted for the remote operation of medical equipment or devices
    • G16H50/20 ICT specially adapted for computer-aided diagnosis, e.g. based on medical expert systems
    • G16H50/70 ICT specially adapted for mining of medical data, e.g. analysing previous cases of other patients
    • G16H80/00 ICT specially adapted for facilitating communication between medical practitioners or patients, e.g. for collaborative diagnosis, therapy or health monitoring

Abstract

Method and system for remote physiotherapy comprising: providing a platform (100 Fig.1) having a user interface (102,103) through which a user may log on to their account and select an exercise to carry out 407; providing an animated avatar to guide the user through a chosen exercise (309, Fig.3); providing an imaging device (107A,107B, Fig.1) connected to a user machine (106A,106B) to obtain image data of the user performing an exercise 408; using the image data to estimate joint positions using a first image-processing model 410; calculating the angle of a relevant joint 411 dependent on the chosen exercise 404; and generating real-time feedback to correct movement using a second image-processing model 414. The platform may provide login and access for a user's clinician to prescribe an exercise regime 406, see recorded exercise data 403 such as image data or performance assessment results against chosen exercise parameters 415, and input comments on progress for the user. Some image processing may be carried out remotely (see Figs. 1 & 2). Real-time feedback may be provided through augmented video, audio and/or visual cues. Performance assessment may be categorised into three groups: reduced, fair and good (see Figs. 5-7). Cloud-based processing of live video data is also claimed.

Description

Title: Remote Physiotherapy Application
Introduction and Background:
The pandemic has had a pronounced effect on the health and social care sectors. The unprecedented surge in demand for COVID-related healthcare has led to a reduction in other essential services such as chronic disease and elderly care management, particularly face-to-face consultations, preventative interventions, and rehabilitation including physiotherapy. This has left vulnerable groups at an even greater risk of functional decline, in turn increasing the risk of falls, which are the leading cause of accidental death among older people.
New methods of delivering healthcare to vulnerable groups while minimising human contact and maintaining social distancing are required to help maintain care standards, especially for an increasingly aged population, and digitally based applications are a potential solution to the problem. This invention can be used remotely by patients in the comfort of their own homes, using a computer-based software application and a webcam to provide an avatar-led, personalised physiotherapy programme of clinician-prescribed exercises, with real-time on-screen feedback provided to the patient that is also available for further assessment by the clinician.
This invention also contributes towards the UK Government's Clean Growth Strategy, for reducing public-sector carbon emissions to net-zero by 2050. By providing a physiotherapy application that can be used by patients remotely, the requirement for both patient transport and the requirement for clinical space is reduced.
Methods of providing remote physiotherapy programmes do already exist; however, current methods have specific limitations which are addressed by our invention.
An existing method is for the patient to self-report their performance via post-exercise questionnaires or surveys, which means that data from these approaches is entirely subjective with no consistency of assessment. Our invention is based on wholly objective data, using consistent machine-learning-based methods to determine how the patient has performed and allowing customisable success criteria to be applied by a clinician on a per-patient level.
Another approach is to record the patient performing exercises and then submit these to the clinician for later review which means that real-time feedback is not provided to the patient. Our invention provides both avatar guidance and real-time on-screen feedback to the patient as they perform the exercise.
Another approach is for the patient to wear motion tracking devices on their limbs, where motion data is captured and used for performance assessment. This type of approach only looks at movement of specific body parts and is dependent on patients putting motion tracking devices on correctly and in the correct position which could be challenging for more elderly patients. Our invention assesses movement of the entire body and uses a basic webcam to achieve this without the need for any wearable motion tracking devices.
Summary of the Invention
The invention consists of an application that is downloaded and installed on a compatible device, which may be a laptop, desktop, or mobile device with either an internal or external imaging device such as a webcam. The application connects via the internet to secure servers for data storage and remote computing resources. The installed application contains the user interface elements of the invention, including security and login, exercise parameters and selection, avatar animations and guidance, performance assessment, and data visualisation.
The invention enables a clinician to set up and monitor a programme of personalised exercises for a patient, which may be the result of an initial or follow-up face-to-face consultation but can also be done remotely from the patient.
The patient can then access their programme of exercises from the comfort of their own home or elsewhere. A schedule of exercises is provided from which the patient chooses which to perform. An on-screen avatar then guides the patient in performing the exercise correctly, and a webcam or other imaging device is used to capture a video of the patient as they perform the exercise. Real-time on-screen feedback is provided to the patient by augmenting this video with a summary of their current performance, measured against the clinician-specified success criteria for that patient and calculated using pose estimation data from the captured images and ML-based techniques.
Both patients and clinicians can then view detailed performance assessment data for a single exercise or long-term data to determine whether improvement or deterioration has occurred.
Access to performance data may motivate patients to improve their results leading to better clinical outcomes in terms of both prevention and rehabilitation.
For the clinician, access to detailed performance data and pose estimation videos enables them to make more accurate diagnoses and consequently modify the programme of personalised exercises to best suit the patient's needs.
Figure 1 (Fig 1) shows an exemplary environment in which the remote physiotherapy application may be implemented.
Figure 2 (Fig 2) shows exemplary components of the remote physiotherapy application.
Figure 3 (Fig 3) shows a map of the user interface elements of the remote physiotherapy application.
Figure 4 (Fig 4) shows a flow chart of the methods implemented by the remote physiotherapy application.
Detailed Description regarding the Drawings
Fig 1
Figure 1 demonstrates an example use scenario. The rehabilitation system (100) is mainly implemented through a physiotherapy application (101). Through a login-based system, the application provides both a patient-based (102) and a clinician-based (103) interface for these persons respectively (104, 105).
This application is installed on a compatible user device (106A, 106B) that has a means of connecting to the internet so that a secure server (108) can be accessed, which in turn interfaces to several sub-services including cloud processing (109) and remote secure data storage (110).
The elements of the physiotherapy application (101) installed on a user device (106A, 106B) include the user interface, image capture, data visualisation, and elements of data processing. In addition to a working internet connection, the other requirement for user device hardware is that it is fitted with, or can interface to, a webcam or similar device (107A, B) in order to capture the video required to produce an assessment of the patient's performance.
In all applications, data related to patient performance, details, schedules, and anything else personally sensitive cannot be stored on any type of user device, to properly comply with data protection laws. In all cases this data must be stored within a secure remote database (110), with access regulated so that only correctly authorised medical personnel can access, or allow others to access, this data.
For users to access this data, or the service, they are required to use a unique login and secure password. The necessity to maintain security means that whenever the app is opened, a person must log in before starting, though an option may be available to log in automatically if the user device is adequately secured. This mandatory login means that the application is restricted to being online-only.
Each user of the system can be registered as either a patient (104) or a clinician (105), where in each case they are able to perform differing, but intersecting, sets of actions (see Fig 2 for more details).
The fundamental difference between these two account types is that for patients, it gives the options of viewing the regime, performance, and simple results outputs.
Clinicians can set the exercises for patients in addition to viewing their results. It may be the case that the data provided to the clinician is more detailed than that given to the patient, whether because it is of no use in the standard rehabilitation loop or because it is not easily interpretable by patients.
This login service, as well as all user-facing functionality within the application, is handled via a user interface. This allows viewing, selecting, and performing exercises to happen smoothly, and provides the basis for visually provided feedback mechanisms. This interface would have a variety of settings, including accessibility and audio options, among others.
The nature of the cloud processing segment (109) is that it provides methods of implementing computationally intensive methods, such as pose estimation, on lighter devices (particularly mobile phones). To accomplish this, the physiotherapy app (101) connects to the service via the internet and can then send requests to resources on the server. Once it is allocated those resources, they stay allocated until the session has ended or a specified period of inactivity has passed.
The cloud processing takes in the 2D video of the patient and outputs a set of vectors representing their 'pose skeleton' as well as the feedback of their performance. These elements are then both overlaid onto the original video data of the patient and fed back to them in real time.
As outlined above, the first step of calculating exercise performance is to extract the coordinates corresponding to each of the patient's key joints. This is achieved through ML-based pose estimation methods that return a set of coordinates corresponding to key bodily joints (e.g., elbows and knees). By drawing lines between related joints, a set of vectors representing the current pose can then be calculated.
These joint vectors can be further used to calculate the angle between any two given joints. This uses the vector dot product method, which is analogous to the use of a goniometer within clinical practice. This process can be done repeatedly to produce a set of angular measures for a single pose position.
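The dot-product angle calculation described above can be sketched as follows. The function name and the example coordinates are illustrative assumptions, not part of the patented system:

```python
import math

def joint_angle(a, b, c):
    """Angle at joint b (in degrees) formed by joints a-b-c.

    Each joint is an (x, y) coordinate of the kind a 2D pose
    estimator returns. The result is analogous to a goniometer
    reading in clinical practice.
    """
    # Joint vectors from the vertex b out to the neighbouring joints.
    v1 = (a[0] - b[0], a[1] - b[1])
    v2 = (c[0] - b[0], c[1] - b[1])
    dot = v1[0] * v2[0] + v1[1] * v2[1]
    mag = math.hypot(*v1) * math.hypot(*v2)
    # Clamp guards against floating-point drift just outside [-1, 1].
    return math.degrees(math.acos(max(-1.0, min(1.0, dot / mag))))

# Example: a knee at the origin, hip directly above, ankle out to
# the side gives a 90-degree knee angle.
print(joint_angle((0, 1), (0, 0), (1, 0)))  # 90.0
```

Repeating this for each pair of joint vectors of interest yields the set of angular measures for one pose position.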
Once the set of angles, vectors, and joint positions has been calculated, these can either be compared with a set of pre-defined success criteria or passed through a second ML-based method to derive a measure of performance. This is assessed on a percentile scoring system, where scores below 33% are classified as 'reduced', scores between 33% and 66% are classified as 'fair', and scores above 66% are classified as 'good'.
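The three-band scoring above can be expressed in a few lines. Treatment of the exact 33% and 66% boundaries is an assumption here, since the text only names the bands:

```python
def classify_score(score):
    """Map a percentile score (0-100) to one of the three
    performance bands: 'reduced', 'fair', or 'good'.

    Boundary handling (a score of exactly 33 or 66 falling into
    'fair') is an illustrative assumption.
    """
    if score < 33:
        return "reduced"
    if score <= 66:
        return "fair"
    return "good"

for s in (10, 33, 50, 66, 90):
    print(s, classify_score(s))
```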
Once calculated, the current classification is then output to the screen, providing the patient with real-time feedback. A repetition of the exercise is detected once the score falls to 0% (i.e., the patient is no longer classified as performing the exercise), and this is also fed back to the patient via the UI.
Fig 2
Figure 2 gives an overview of how the application functionalities will be distributed in a normal offline installation case.
The fundamental idea of the structure is to reduce the amount of data passed through the online portion of the application, to increase its accessibility and affordability. The offline side of the application (201) focuses on the implementation of the GUI and data visualisation, while the online portion (202) is responsible for computationally heavy processing and secure data storage.
The application offers a range of features to clinicians and patients (203, 204), some shared and some unique.
One example of these shared features is the data processing and visualisation element (207). In the patient's case, this relates to how performance and pose data is processed, both to produce visualisations such as bar or line graphs and for processes occurring immediately before or after an online task.
On the physiotherapist's side, all the data accessible to the patient can be viewed, along with more detailed information or additional parameters not accessible by the patient.
A second shared feature is the user interface module (208). This is at the heart of the application and offers the gateway through which all other system functionality can be accessed. As such, the interfaces for the patient and clinician are different, with the patient UI oriented around the exercising elements, while the clinician side is more oriented around the data elements, although exercises can still be viewed to allow clinicians to make their selections. The full layout and capabilities of the UI are laid out in Figure 3 and explained in the corresponding text section.
The avatar module (209) is mostly for the benefit of patients, providing demonstrations of the exercise methods to them as well as giving the user encouragement as they go, congratulating their performances and working as a friendly face for the application as a whole. The avatar may be of some use to clinicians, who may want to see exactly what each exercise entails to ensure maximum suitability.
The final shared attribute across patients and clinicians is the ability to monitor progress (210). This is a key part of the application and is useful for both parties. In the case of patients, the more personalised feedback that this invention offers can highlight specific problem areas or errors in exercise performance and help to rectify them quickly. The presence of scores and real-time feedback also gives more of an incentive both to continue engaging with the exercises and to improve in the necessary areas.
The availability of long-term progress reports helps to remind patients of how far they have already come and allows clinicians to spot any long-term trends of deterioration or improvement. In addition, the online-based features of the application will enable clinicians to access patient performance remotely and monitor their progress as they go, allowing for "on-the-fly" feedback, recommendations, and routine changes to be possible.
In addition to these shared features, clinicians are also provided with tools to help them monitor and set their patient's exercising routines. The first of these gives the ability to set and alter patient routines at any time. Routines can be prescribed on daily, weekly, or fully customised schedules, depending on the particular needs of the patient.
A second tool that is available to clinicians is the ability to define certain success criteria for specific exercises, such as angle reached or detected range of movement. This allows for the exercise to be customised to suit individual needs and rehabilitation schemes. For exercises that expect multiple limbs to be moved, this could account for any lack of limbs or lack of motor function in specific limbs for applicable cases.
As for the online components of the application (202), these relate to computationally intense or data-sensitive tasks. The first instance of online functionality is the processing of 2D data. Properly processing image data into pose positions for further processing requires a machine with a CUDA-compatible GPU, which is not a standard piece of computing kit for the average consumer. To alleviate this, these resources are accessed via an online client on demand, so that video data from the user device can be processed using the cloud-based GPU before the output is returned to the user device. On compatible devices, an option would be available to run the pose estimation locally, which would likely reduce the overall computation time.
The second important online service is data storage. As mentioned above, the law requires that all medical data is stored securely and cannot be on, or accessible through, users' devices. As such, all data relating to performance, details, and so on is stored in a secure remote server, ensuring the necessary security for personal data.
Fig 3
Fig 3 shows all of the screen states the user interface (300) can exist in for patients.
When loaded, the interface starts with a start screen (301), featuring the company logo and software title. From this screen, a 'start' button is used to move to the login screen (302). By default, login is always required; an option may be available to save details, though this has issues of privacy and of use by individuals other than the patient (which would reduce the utility of any recorded data).
Once the user has logged in successfully as a patient, they are then taken to the main patient menu screen (303). This presents them with all the available functionalities of the application, provided using separate buttons. These include viewing results (305) and assigned exercises (306), as well as a logout (307 A) feature as required. There is also a settings menu (304) available that manages accessibility features such as subtitles and text size.
The view results (305) menu is where the entire history of a patient's performance can be viewed. This has several functions and views that allow this data to be viewed seamlessly. When accessed, the results screen lets the records for the past week and month be viewed independently, with prior months also accessible. The visualisation also allows outputs to be viewed as either line or bar charts, as well as for performance across many exercises to be compared using stacked bar charts.
Other information, such as how close the patient is to goals/targets and whether there is an overall trend of improvement or deterioration, is also available. More targeted information, such as the movements of specific joints, may also be available, though this may not always be the case depending on the context of the exercise and its intended purpose.
As a patient, one of the key factors of the exercise application is the ability to view and complete exercise routines as assigned on either a daily, weekly, or other basis (306). This screen shows what they are assigned to do for that day and gives them the dual options of either watching a demonstration on how to perform the exercise (308 A), or the quick start option (308 B) if they are already familiar with the procedure.
In the case of the demo (308 A), the avatar will instruct the patient on the method of performing the exercise, as well as walk them through setting up their camera so that the pose estimation can occur optimally (309). Once this is done, the real time processing element (310) will begin, and this is also where the quick start takes the user.
While the exercise is running, the UI displays the avatar and user video side by side on the screen. At any time during this process, the user can press a 'return to menu' button that will cancel the current exercise. If cancelled, the data for this exercise will not be recorded.
After the exercise has finished, the results for that particular exercise (311) are then displayed to the user. From here, they can either return to their routine (306) to continue exercising or view a summary of all their results (305) if they so wish.
As for the clinician, they are presented with a main menu (312) featuring a select patient (313) or register new patient (314) option. In the first case, they are presented with a list of their registered patients (315), can select appropriately, and are brought to the options menu for that patient. When registering a patient, the clinician must enter the patient's details, and the patient then sets up their account (i.e., types in a password) (316). Once the account is set up, it can be accessed and modified in the same way as a pre-existing account.
These options consist of viewing and modifying the currently prescribed exercises (316), looking at the performance of the user for each exercise (317), and a 'patient view' button (318 A) that allows the physiotherapist to explore the menus that their patient can interact with. Once they have confirmed that everything is as expected, they can return to the clinician menu via a 'master view' button (318 B).
The avatar elements of the UI are handled automatically and are only used during the exercising portions of the interface cycle.
Fig 4
Fig 4 shows a flow diagram of the various events that occur when the application is in use and how they interconnect (400). The chart demonstrates how the high-level user interface elements of the application (401) interact with the low-level pose estimation functions and generalises the workings of the estimation loop (402).
The application events can be approximately divided into those related to human interactions, either by the clinician or the patient (401) and those related to the real time calculation and feedback of exercise performance (402).
As stated above, one of the main actions of the physiotherapist is viewing a patient's results and the parameters (including success criteria) for their assigned exercises (403, 404). After viewing these, the clinician can decide to set or alter the prescribed set of exercises for the patient (405), which will be updated (with notification) on the patient's side of the interface (406).
As for the patient, they can view their assigned exercises for the day (406) and, at any time, select an exercise to be performed (407). Once they have confirmed their selection and the optional introductory segment has ended, the real-time processing element of the exercise begins (402, 408).
In short, real time feedback is provided to the patient by showing them an augmented version of a real time video of themselves (408), with added flavour text to reflect their current performance of the exercise, as well as the line skeleton of the pose superimposed over their body to give a better idea of how measurement is occurring. To provide this output, each frame of video data goes through the same process, as detailed in the graph (402).
Firstly, the imaging device on the user machine captures a single frame of video data (409). The standard rate for this is 30 frames per second (fps), though this may be reduced to around 20 fps (still acceptable for our application) depending on processing speed and the quality of the online connection.
This two-dimensional image data is then passed through a machine-learning-based estimator that returns the coordinates for a set of major bodily joints (410). These points can then have lines drawn between them to form a skeleton, with each line between joints being a joint vector.
Once this pose has been estimated (411), this data can be saved to the secure server (412) for physician review if required. This is unlikely to always be the case, as this is a large dataset, but could still be useful in more niche or complex scenarios. Back in the main loop, the joint vectors can be used to calculate the angles of various joints. These can be calculated regardless of whether the vectors are immediately adjacent and can be used as simple metrics for assessing exercise performance.
The alternative method to using angular parameters is to pass the 2D pose or joint angle data through a pre-trained network, using hundreds of images to develop a method of classifying new image frames as they are produced. Through this method, it may even be possible for the pose estimation part of the system to be removed entirely if the network is sufficiently powerful.
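As a rough illustration of this alternative, a single linear layer over the joint angles could classify each frame directly. The weights here are random placeholders standing in for a genuinely pre-trained network, and the four-angle input size is an arbitrary assumption:

```python
import random

random.seed(0)
LABELS = ["reduced", "fair", "good"]

# Hypothetical 'pre-trained' weights: 4 input joint angles x 3 classes.
# In a real system these would be learned from hundreds of labelled
# exercise frames, as described above.
W = [[random.gauss(0, 1) for _ in range(3)] for _ in range(4)]

def classify_frame(joint_angles):
    """Score one frame's joint angles against each class and pick
    the highest-scoring performance band."""
    logits = [sum(a * row[k] for a, row in zip(joint_angles, W))
              for k in range(3)]
    return LABELS[logits.index(max(logits))]

result = classify_frame([90.0, 120.0, 45.0, 170.0])
print(result)  # one of LABELS; the exact band depends on the placeholder weights
```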
Once the performance assessment has been generated (413), it is fed back around to the patient's video screen (414). If on that particular frame a repetition is counted (by detecting a transition to the passive state), then this is also recorded in the exercise overview statistics.
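The repetition-detection step described above (a transition from active scoring back to the passive state) can be sketched as follows; the threshold value is an assumption for illustration:

```python
def count_repetitions(frame_scores, active_threshold=0.0):
    """Count exercise repetitions from a stream of per-frame scores.

    A repetition is recorded when the score transitions from an
    'active' state (above the threshold) back to the passive state,
    mirroring the transition detection described above.
    """
    reps = 0
    active = False
    for score in frame_scores:
        if score > active_threshold:
            active = True
        elif active:
            # Transition active -> passive: one repetition completed.
            reps += 1
            active = False
    return reps

# Two bursts of activity separated by passive frames -> 2 repetitions.
print(count_repetitions([0, 40, 80, 60, 0, 0, 50, 90, 0]))  # 2
```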
Once the necessary number of repetitions has been counted, the system moves on to visualising the performance data (415, 416). From the exercising screen this only shows an overview of the immediately prior exercise; long-term data is accessed by returning to the menu and selecting that option.
The exercise output data can include information on the quality of repetitions, their speed as well as points for improvement next time. If the patient views long term data, they can access information for all prior exercises, with the options to view individual overviews (the information given immediately after completion) or more comprehensive overviews.
These can focus on singular exercises, showing how elements have improved or otherwise, as well as multiple exercises, comparing and contrasting the performance measures across each case. For clinicians, additional data such as range of movement may also be available, as well as options to mark entries as anomalous and to view the skeletal data if recorded in that instance.
Fig 5 and Fig 6 give examples of how data is presented to both clinicians and patients.
Fig 5 is a line chart showing how a patient's performance across two exercises has changed over the current week; in the application, an option would be available to expand this to a month if required. The chart can also show more or fewer exercises depending on what is required.
Fig 6 is a bar chart showing the average performance for multiple exercises, allowing progress to be compared between different exercises and, potentially, their effectiveness for that particular person to be evaluated. Each bar is colour coded according to how the average score compares against the current goals set by the clinician. Similarly to Fig 5, this chart can cover the current month or day if that data is requested.
Whilst Figs 5 and 6 show data covering many exercises, Fig 7 demonstrates how data is displayed for a single exercise. A rep-by-rep rundown of performance is given to the patient, as well as an average score metric. As with Fig 6, each bar is colour coded to indicate how the performance of that rep compares against the currently set success criteria. As with all other forms of data, individual exercise metrics can be accessed at a later date as required.

Claims (20)

  1. A method for providing a platform for rehabilitation to a patient consisting of: a. use of a user interface to handle logon and exercise selection, as well as acting as a platform from which all other functionalities can be accessed; b. an animated avatar to guide users through their respective exercises; c. use of an imaging device connected to the user machine to obtain image data of the user performing an exercise; d. using the image data above to generate estimates of joint positions at any given time using a pre-trained model; e. subsequent calculation of the angles between a set of relevant joints, parametrised for each exercise; f. generating feedback in real time to correct the movement using a second trained model.
  2. The method of claim 1, wherein the image data is two-dimensional image data.
  3. The method of claim 1, wherein angle data can be measured as that between a joint vector and an axis, or as that between two joint vectors.
  4. The method of claims 1-3, wherein a joint's range of movement is defined as the difference between the maximum and minimum observed angular values in that case.
  5. The method of claims 1-4, wherein unsupervised learning is used to generate the pre-trained model.
  6. The method of claims 1-4, wherein a pre-existing model is used to estimate joint positions, as opposed to generation of this model as in claim 5.
  7. The method of all prior claims, wherein feedback, including measurement of performance, is provided through augmentation of video data, in addition to audio and visual cues.
  8. The method of all previous claims, wherein the calculated pose data is used to produce one or more metrics including completion time, rep time and rep count.
  9. The method of all previous claims, wherein a second trained algorithm is used to interpret the exercise performance using the estimated pose positions.
  10. The method of claims 1-8, wherein a set of pre-defined exercise parameters is used to classify the movement into one of three distinct categories of 'reduced', 'fair' and 'good'.
  11. The method of claim 10, wherein the set of parameters can be defined with a series of angle thresholds.
  12. The method of claim 10, wherein the set of parameters can be defined as a set of height criteria, where the relative metrics are calculated as the decrease in a joint's height relative to a secondary vector.
  13. A system for providing rehabilitation training to a user comprising: a. Unity-based interfacing software (claim 1) installed on both the user's and client's machines, containing the capabilities of: b. secure login functionality for both clients and clinicians; c. capturing raw video information and overlaying performance information; d. processing of video for devices with a compatible GPU; e. all handling of avatar-based functions including sounds and relevant animations; f. all handling of data visualisation such as graphs and charts; g. handling of generating performance assessment from the calculated pose positions; 13.1: a connection to a web-based service that handles: a. processing of video data for machines without a compatible GPU; b. secure storage of all patient data, including exercise schedules and historical exercise information.
  14. The system in claim 13, wherein the data parameters and models for providing real-time feedback are as described in claims 1-12.
  15. The system in claim 14, wherein a secure sign-in system is used to access a personal account.
  16. The system in claim 15, wherein patients can access their prescribed routines, perform exercises, and view their results through the interface (claim 13).
  17. The system in claim 15, wherein physiotherapists can view patient progress, set and review current exercise routines, and leave comments on progress for patients.
  18. The system in claims 13-17, wherein the functionality is instead handled through a website or web-based application.
  19. A cloud-based service for processing live video data to produce pose estimates.
  20. The service in claim 19, wherein the service is also used to estimate exercise performance.
GB2105143.8A 2021-04-11 2021-04-11 Remote physiotherapy application Pending GB2605654A (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
GB2105143.8A GB2605654A (en) 2021-04-11 2021-04-11 Remote physiotherapy application

Publications (2)

Publication Number Publication Date
GB202105143D0 GB202105143D0 (en) 2021-05-26
GB2605654A true GB2605654A (en) 2022-10-12

Family

ID=75949593

Family Applications (1)

Application Number Title Priority Date Filing Date
GB2105143.8A Pending GB2605654A (en) 2021-04-11 2021-04-11 Remote physiotherapy application

Country Status (1)

Country Link
GB (1) GB2605654A (en)

Citations (7)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US20130252216A1 (en) * 2012-03-20 2013-09-26 Microsoft Corporation Monitoring physical therapy via image sensor
US20140081661A1 (en) * 2012-07-05 2014-03-20 Home Team Therapy Method and system for physical therapy using three-dimensional sensing equipment
US20140287389A1 (en) * 2013-03-14 2014-09-25 The Regents Of The University Of California Systems and methods for real-time adaptive therapy and rehabilitation
US20170293742A1 (en) * 2016-04-07 2017-10-12 Javad Sadeghi Interactive mobile technology for guidance and monitoring of physical therapy exercises
US20200185097A1 (en) * 2017-08-17 2020-06-11 Xr Health Il Ltd Guiding user motion for physiotherapy in virtual or augmented reality
WO2020132110A1 (en) * 2018-12-18 2020-06-25 4D Health Science Llc Real-time, fully interactive, virtual sports and wellness trainer and physiotherapy system
WO2020249855A1 (en) * 2019-06-12 2020-12-17 Sanoste Oy An image processing arrangement for physiotherapy

Cited By (2)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US20230215532A1 (en) * 2012-08-31 2023-07-06 Blue Goji Llc Cloud - based healthcare diagnostics and treatment platform
US11791026B2 (en) * 2012-08-31 2023-10-17 Blue Goji Llc Cloud-based healthcare diagnostics and treatment platform
