GB2575299A - Method and system for directing and monitoring exercise - Google Patents

Method and system for directing and monitoring exercise

Info

Publication number
GB2575299A
GB2575299A (application GB1811032.0A)
Authority
GB
United Kingdom
Prior art keywords
user
exercise
data
trainer
biomechanical data
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Withdrawn
Application number
GB1811032.0A
Other versions
GB201811032D0 (en)
Inventor
Moises Moya Mendoza Guido
Parry Simon
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
Uplec Industries Ltd
Original Assignee
Uplec Industries Ltd
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by Uplec Industries Ltd filed Critical Uplec Industries Ltd
Priority to GB1811032.0A priority Critical patent/GB2575299A/en
Publication of GB201811032D0 publication Critical patent/GB201811032D0/en
Publication of GB2575299A publication Critical patent/GB2575299A/en
Withdrawn legal-status Critical Current

Classifications

    • A HUMAN NECESSITIES
    • A61 MEDICAL OR VETERINARY SCIENCE; HYGIENE
    • A61B DIAGNOSIS; SURGERY; IDENTIFICATION
    • A61B5/00 Measuring for diagnostic purposes; Identification of persons
    • A61B5/103 Detecting, measuring or recording devices for testing the shape, pattern, colour, size or movement of the body or parts thereof, for diagnostic purposes
    • A61B5/11 Measuring movement of the entire body or parts thereof, e.g. head or hand tremor, mobility of a limb
    • A61B5/1126 Measuring movement of the entire body or parts thereof, e.g. head or hand tremor, mobility of a limb using a particular sensing technique
    • A61B5/1128 Measuring movement of the entire body or parts thereof, e.g. head or hand tremor, mobility of a limb using a particular sensing technique using image analysis
    • A HUMAN NECESSITIES
    • A61 MEDICAL OR VETERINARY SCIENCE; HYGIENE
    • A61B DIAGNOSIS; SURGERY; IDENTIFICATION
    • A61B5/00 Measuring for diagnostic purposes; Identification of persons
    • A61B5/103 Detecting, measuring or recording devices for testing the shape, pattern, colour, size or movement of the body or parts thereof, for diagnostic purposes
    • A61B5/11 Measuring movement of the entire body or parts thereof, e.g. head or hand tremor, mobility of a limb
    • A61B5/1113 Local tracking of patients, e.g. in a hospital or private home
    • A61B5/1114 Tracking parts of the body
    • A HUMAN NECESSITIES
    • A61 MEDICAL OR VETERINARY SCIENCE; HYGIENE
    • A61B DIAGNOSIS; SURGERY; IDENTIFICATION
    • A61B5/00 Measuring for diagnostic purposes; Identification of persons
    • A61B5/103 Detecting, measuring or recording devices for testing the shape, pattern, colour, size or movement of the body or parts thereof, for diagnostic purposes
    • A61B5/11 Measuring movement of the entire body or parts thereof, e.g. head or hand tremor, mobility of a limb
    • A61B5/1121 Determining geometric values, e.g. centre of rotation or angular range of movement
    • A HUMAN NECESSITIES
    • A61 MEDICAL OR VETERINARY SCIENCE; HYGIENE
    • A61B DIAGNOSIS; SURGERY; IDENTIFICATION
    • A61B5/00 Measuring for diagnostic purposes; Identification of persons
    • A61B5/48 Other medical applications
    • A61B5/486 Bio-feedback
    • A HUMAN NECESSITIES
    • A61 MEDICAL OR VETERINARY SCIENCE; HYGIENE
    • A61B DIAGNOSIS; SURGERY; IDENTIFICATION
    • A61B5/00 Measuring for diagnostic purposes; Identification of persons
    • A61B5/74 Details of notification to user or communication with user or patient; user input means
    • A61B5/742 Details of notification to user or communication with user or patient; user input means using visual displays
    • A61B5/744 Displaying an avatar, e.g. an animated cartoon character
    • A HUMAN NECESSITIES
    • A63 SPORTS; GAMES; AMUSEMENTS
    • A63B APPARATUS FOR PHYSICAL TRAINING, GYMNASTICS, SWIMMING, CLIMBING, OR FENCING; BALL GAMES; TRAINING EQUIPMENT
    • A63B24/00 Electric or electronic controls for exercising apparatus of preceding groups; Controlling or monitoring of exercises, sportive games, training or athletic performances
    • A63B24/0003 Analysing the course of a movement or motion sequences during an exercise or trainings sequence, e.g. swing for golf or tennis
    • A63B24/0006 Computerised comparison for qualitative assessment of motion sequences or the course of a movement
    • G PHYSICS
    • G09 EDUCATION; CRYPTOGRAPHY; DISPLAY; ADVERTISING; SEALS
    • G09B EDUCATIONAL OR DEMONSTRATION APPLIANCES; APPLIANCES FOR TEACHING, OR COMMUNICATING WITH, THE BLIND, DEAF OR MUTE; MODELS; PLANETARIA; GLOBES; MAPS; DIAGRAMS
    • G09B19/00 Teaching not covered by other main groups of this subclass
    • G09B19/003 Repetitive work cycles; Sequence of movements
    • G09B19/0038 Sports
    • G PHYSICS
    • G16 INFORMATION AND COMMUNICATION TECHNOLOGY [ICT] SPECIALLY ADAPTED FOR SPECIFIC APPLICATION FIELDS
    • G16H HEALTHCARE INFORMATICS, i.e. INFORMATION AND COMMUNICATION TECHNOLOGY [ICT] SPECIALLY ADAPTED FOR THE HANDLING OR PROCESSING OF MEDICAL OR HEALTHCARE DATA
    • G16H20/00 ICT specially adapted for therapies or health-improving plans, e.g. for handling prescriptions, for steering therapy or for monitoring patient compliance
    • G16H20/30 ICT specially adapted for therapies or health-improving plans, e.g. for handling prescriptions, for steering therapy or for monitoring patient compliance relating to physical therapies or activities, e.g. physiotherapy, acupressure or exercising
    • A HUMAN NECESSITIES
    • A61 MEDICAL OR VETERINARY SCIENCE; HYGIENE
    • A61B DIAGNOSIS; SURGERY; IDENTIFICATION
    • A61B2505/00 Evaluating, monitoring or diagnosing in the context of a particular type of medical care
    • A61B2505/09 Rehabilitation or training

Landscapes

  • Health & Medical Sciences (AREA)
  • Life Sciences & Earth Sciences (AREA)
  • Engineering & Computer Science (AREA)
  • Physics & Mathematics (AREA)
  • General Health & Medical Sciences (AREA)
  • Public Health (AREA)
  • Medical Informatics (AREA)
  • Biophysics (AREA)
  • Molecular Biology (AREA)
  • Heart & Thoracic Surgery (AREA)
  • Animal Behavior & Ethology (AREA)
  • Biomedical Technology (AREA)
  • Pathology (AREA)
  • Surgery (AREA)
  • Veterinary Medicine (AREA)
  • Oral & Maxillofacial Surgery (AREA)
  • Physiology (AREA)
  • Dentistry (AREA)
  • Business, Economics & Management (AREA)
  • Physical Education & Sports Medicine (AREA)
  • Primary Health Care (AREA)
  • Theoretical Computer Science (AREA)
  • General Physics & Mathematics (AREA)
  • Educational Technology (AREA)
  • Educational Administration (AREA)
  • Entrepreneurship & Innovation (AREA)
  • Epidemiology (AREA)
  • Computer Vision & Pattern Recognition (AREA)
  • Nuclear Medicine, Radiotherapy & Molecular Imaging (AREA)
  • Radiology & Medical Imaging (AREA)
  • Geometry (AREA)
  • Biodiversity & Conservation Biology (AREA)
  • Processing Or Creating Images (AREA)

Abstract

A user's performance of an exercise, especially physiotherapeutic exercise, is directed and monitored. Body movements of a user (e.g. skeletal joint angles) are measured using a portable console having a range imaging sensor. The sensor images the user during performance of the exercise and outputs resultant three-dimensional image data. Processing of this data involves detecting the user in the image data and obtaining user biomechanical data representing skeletal movement. An on-screen user avatar is generated which moves in accordance with the measured data, thus representing the user's exercise performance. Trainer biomechanical data, representing body movement during proper performance of an exercise, is stored and used to generate an on-screen virtual trainer avatar. During performance of exercise by the user, the virtual trainer avatar and the user avatar are displayed concurrently. Trainer and user movements may be compared to provide feedback to the user on their exercise performance. A peripheral device (e.g. handheld, attached to the user or attached to an object) monitoring motion/orientation may also be provided.

Description

METHOD AND SYSTEM FOR DIRECTING AND MONITORING EXERCISE
The present invention relates to a system configured to direct a user in the performance of exercise and to monitor the user's performance of the exercise, and to a method of directing a user in the performance of exercise and monitoring the user's performance of the exercise.
The invention is applicable particularly in relation to physiotherapy. Physiotherapeutic treatment typically begins with assessment of the individual's condition by a trained practitioner, who may then make a treatment plan. A variety of treatments are routinely employed in physiotherapy but an element of the treatment plan is often a programme of exercises to be carried out by the individual over an extended period subsequent to the initial consultation. Practical problems relating to such exercise programmes include for example (a) poor or unknown levels of compliance by the individual, (b) imperfect performance of the exercises by the individual, and (c) a lack of ongoing assessment of the individual's progress and performance during the period of the exercise programme. These are all problems that can be at least partly alleviated by high levels of attention on the part of the practitioner - follow-up consultations etc. - but the professional time of trained practitioners is in itself a valuable resource.
There are known computer-based systems intended to assist individuals in the performance of rehabilitative exercise regimes.
WO2015/148676 (application number PCT/US2015/022504), applied for by Reflexion Health, Inc., describes a system in which motion tracking cameras are used to control an avatar in the context of instruction in a virtual world. The Reflexion Health website (www.reflexionhealth.com) describes the firm's commercially offered system Vera™, which gives users an on-screen animated trainer to emulate.
US2014371633 (application number 14/364351) applied for by Jintronix, Inc. describes a method for evaluating a user during a virtual-reality rehabilitation exercise involving use of a motion tracking unit or motion sensing device.
There are particular problems and challenges in this context.
Provision of a practical and effective computer-based system for conducting physiotherapeutic and/or rehabilitative exercise, and for guiding the user in the performance of such exercise, remains a challenging problem.
The present invention provides a system and a method according to the appended claims. Specifically, according to a first aspect of the present invention there is a system configured to direct a user in the performance of exercise and to monitor the user's performance of the exercise, the system comprising:
a range imaging sensor arrangement configured to sense the user during performance of the exercise and to output resultant three-dimensional image data;
image processing logic configured to detect the user in the image data and to obtain from the image data user biomechanical data representing skeletal movement of the user;
avatar generation logic for generating an on-screen avatar which moves in accordance with the user biomechanical data;
a memory storing trainer biomechanical data for at least one exercise, the trainer biomechanical data representing movement of a body during proper performance of the exercise;
virtual trainer generation logic for generating an on-screen virtual trainer which moves in accordance with the trainer biomechanical data;
so that during performance of exercise by the user the virtual trainer and the avatar are displayable concurrently to the user in real time or near real time.
According to a second aspect of the present invention there is a method of directing a user in the performance of exercise and monitoring the user's performance of the exercise, the method comprising:
imaging the user during performance of the exercise to obtain three-dimensional image data of the user;
processing the image data to obtain user biomechanical data representing skeletal movement of the user;
using the biomechanical data to generate an on-screen avatar which moves in accordance with the user biomechanical data;
storing trainer biomechanical data for at least one exercise, the trainer biomechanical data representing movement of a body during proper performance of the exercise;
using the trainer biomechanical data to generate an on-screen virtual trainer which moves in accordance with the trainer biomechanical data; and displaying the avatar and the virtual trainer on-screen to the user during performance of the exercise.
Specific embodiments of the present invention will now be described, by way of example only, with reference to the accompanying drawings, in which:
Figure 1 represents a console embodying the present invention;
Figure 2 shows a screen shot during use of the console;
Figure 3 is a conceptual diagram showing a user carrying out exercise, a skeletal model being overlaid on her;
Figure 4 represents a peripheral device for use with the console.
The present embodiment is an interactive training system which:-
- provides a user 8 with direction in the performance of exercises, specifically physiotherapeutic exercises. This direction typically comprises display of a virtual trainer to demonstrate the exercises,
- monitors bodily movements of the user 8 in three dimensions,
- models the user's skeletal movements during the exercise, and
- based on the modelling, provides the user 8 with feedback on his/her performance of the exercises, as well as providing data about the user's skeletal movements for other analysis.
The term exercise as used in the present description and in the appended claims is not limited to cardiovascular types of exercise intended to raise the heart rate, but encompasses any form of directed physical activity by the user 8 including for example stretches, mobility exercises, balances etc. The exercises undertaken using the system will often be selected by a skilled practitioner such as a physiotherapist.
The console 10 comprises a sensor arrangement 12 for sensing the user 8 and onboard processing capacity for carrying out functions including:-
- receiving sensor data,
- processing it to model the user's skeletal movements,
- generating an Avatar displayed on screen which mimics the user's skeletal movements and so provides the user with feedback on his/her performance of the exercises,
- implementing a graphical user interface for control of the system by the user, and so on. The console 10 is thus a self-contained unit, in the sense that it can be provided to a user who can then simply connect it to a power supply and a suitable display screen 13, providing all the hardware needed to carry out an exercise programme. The console 10 is nonetheless configured for connectivity to (a) a wide area network, which may be the internet, for exchange of data, (b) the aforementioned display screen 13, and (c) a peripheral 14, to be described below.
Various aspects of the present invention may in other embodiments be implemented using a non-dedicated processing device in place of the console 10. This may, by way of example and not limitation, take the form of a laptop, desktop computer, tablet or even a mobile computing device which may or may not operate in a thin client architecture. The processing device is in such instances communicatively coupled to a suitable sensor array, and runs an application (computer program) implementing the necessary functions of the training system.
The exterior of the console 10 is shaped to form a docking port 16 for the peripheral 14. In the present embodiment this takes the form of a shallow depression complementarily shaped to the underside of the peripheral 14, which can thus be laid in the docking port 16 and located by it in a fixed orientation when not in use. Only the edge of this depression is seen in Figure 1 since the peripheral 14 is shown in situ. In the present embodiment the docking port 16 and the peripheral 14 are provided with respective magnets (not shown) disposed within their respective housings and arranged such as to attract one another when the peripheral 14 is placed in the docking port 16, to provide positive location of the peripheral and mild resistance (easily overcome by moderate manual force) to its removal. The console 10 is configured to charge the peripheral 14 when it is docked. In the present embodiment this is achieved without need of a wired connection between the two through an inductive charging arrangement housed in the console 10 and a complementary inductive receiver in the peripheral 14. In other embodiments complementary contacts on the two parts may be arranged to form a charging circuit when the peripheral 14 is docked.
The console 10 has a housing 18 shaped to provide a carrying handle 20, and is of a size and weight that make it easily portable.
The sensor arrangement 12 is provided in a front face of the console 10 and serves in use to image the user 8 as exercises are performed. The image of the user 8 provided by the sensor arrangement is three-dimensional. That is, its elements (e.g. pixels) are not only characterised by their x and y coordinates in a two dimensional image space, but also by a third coordinate z, which is related to the distance of a sensed object from the sensor arrangement 12 (its depth). Sensors capable of this depth sensing function are often referred to as range imaging sensors. The present invention does not impose any particular limitation on the type of range imaging sensor that can be used. Suitable technologies, which may be employed singly or in combination, include the following:
stereoscopic range imaging cameras, in which images are obtained from two laterally separated viewpoints (that is, the viewpoints are separated in the x-y plane). The two images are typically obtained using two cameras. The images are fused to obtain depth data. The Kinect® line of motion controllers from Microsoft® is believed to operate in this manner. Image fusion relies on the capacity to identify corresponding elements in the two images. This may be facilitated by projecting markers onto surfaces in the sensor's field of view, so that image processing logic can recognise corresponding image elements in the two images because they carry corresponding markers. For example, these markers may take the form of an infra-red speckle pattern which is invisible to the naked eye but perceptible by the imaging cameras;
structured light techniques, in which a known structured pattern is projected onto objects in the sensor's field of view and the deformation of this pattern in the sensor image is used to obtain depth data;
time-of-flight techniques, in which the time taken for a projected light signal to return is measured for individual image pixels to provide depth data for those pixels.
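By way of illustration of the time-of-flight principle listed above, the depth of a pixel follows directly from the measured round-trip time of the projected light signal. This is a minimal sketch; the function name is hypothetical:

```python
# Speed of light in a vacuum, metres per second
C = 299_792_458.0

def tof_depth(round_trip_seconds):
    """Depth of a sensed surface from the round-trip time of a light pulse.

    The pulse travels to the surface and back, so the one-way distance
    (the depth, z) is half the distance light covers in that time.
    """
    return C * round_trip_seconds / 2.0

# A return after roughly 6.67 nanoseconds corresponds to about 1 metre
print(tof_depth(6.67e-9))
```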
This is not an exhaustive list of range imaging technologies that can be employed in the present invention. This is an area of technology that is evolving quickly at the present time and the present invention may, now or in the future, be implemented using other range imaging techniques. Optical range imaging sensors are currently favoured but the term optical in this context must be understood to include imaging using light frequencies outside the visible range, particularly in the infra-red. Non-optical range imaging technologies may be used in this context.
Prototypes of the present invention have used the aforementioned Kinect® system. Other current embodiments use a D400 Depth Camera supplied by Intel® in the sensor arrangement. As well as the range imaging sensor, this has a colour digital camera arranged to image a field of view which matches or at least overlaps that of the range imaging sensor arrangement.
In operation, the user 8 carries out exercises in front of the console 10, within the field of view of the sensor arrangement 12 (although the user 8 may sometimes move out of this field of view during an exercise programme). The sensor arrangement 12 provides real time image data including depth data for its field of view, and this image data includes image data relating to the user 8 (although of course it may also include image data relating to other features in the user's environment). The image data is processed to provide user biomechanical data, which in turn is used to control an avatar 22 displayed to the user 8 on the display screen 13.
The console 10 carries out the necessary processing steps which include:
detection of the user 8 in the image, which is a form of object-class detection. Suitable image processing techniques are well known;
processing of the user image data to obtain the user biomechanical data; and generation of the Avatar 22 based on the user biomechanical data.
Generation of the user biomechanical data from the image data may be carried out using currently available software. For example the Vitruvius® package from Lightbuzz Software is configured to receive image data from range imaging sensors and to output user biomechanical data including joint angles of a skeletal model of the user 8. This package is used in the present embodiment of the invention. The concept of the skeletal model can be appreciated from a study of Figure 3. Nodes 24 in this model are skeletal joints. Their positions in 3D space can be obtained from the image data. Nodes 24 of the model are connected by bones 26 in a manner that corresponds to the human skeleton, although the bones in this model are represented by straight lines connecting joints. Joint angles are the angles between these lines. Skeletal models suitable for use in the present invention can have between 17 and 26 joints, although other numbers of joints may be modelled in other embodiments. In the present embodiment the user biomechanical data comprises joint angle data for each of these joints sampled 30 times per second, although the sampling frequency can differ over a wide range without departing from the scope of the present invention.
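Since the model represents bones as straight lines between joint positions, the angle at a joint is simply the angle between the two vectors from that joint to its neighbouring nodes. The following sketch illustrates the geometry; the function and argument names are illustrative and not part of any particular software package:

```python
import math

def joint_angle(parent, joint, child):
    """Angle in degrees at `joint` between the bones joint->parent and
    joint->child, each node given as an (x, y, z) position in 3D space."""
    a = [p - j for p, j in zip(parent, joint)]
    b = [c - j for c, j in zip(child, joint)]
    dot = sum(x * y for x, y in zip(a, b))
    norm_a = math.sqrt(sum(x * x for x in a))
    norm_b = math.sqrt(sum(x * x for x in b))
    # Clamp to guard against floating-point drift just outside [-1, 1]
    cos_theta = max(-1.0, min(1.0, dot / (norm_a * norm_b)))
    return math.degrees(math.acos(cos_theta))

# A right-angled elbow: shoulder directly above, wrist directly in front
print(joint_angle((0, 1, 0), (0, 0, 0), (1, 0, 0)))  # 90.0
```

At 30 samples per second, a series of such angles per joint constitutes the user biomechanical data stream described above.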
In the present embodiment of the invention, the user biomechanical data is supplied to a game engine to construct, control and display the avatar 22, which thus moves in real time in a manner which reflects the movements of the user 8.
The user biomechanical data is also stored and used by the system to analyse user performance, as will be explained below. It may be compressed for storage. It may be uploaded to a cloud-based server.
The system can direct the user 8 in a variety of different ways during operation, but in the example depicted in Figure 2 the user 8 is provided with an on-screen virtual trainer 28 which demonstrates an exercise or programme of exercises in real time for the user 8 to emulate.
The virtual trainer 28 is a computer animation of a human or humanoid figure. In the present embodiment it is created using trainer biomechanical data obtained by imaging a real person - the demonstrator.
To create the trainer biomechanical data for an exercise, the demonstrator, who may be a suitably experienced physiotherapy practitioner, carries out the required exercise in front of a suitable sensor arrangement. In principle this sensor arrangement could be similar in form to the console 10. In practice, however, a more sophisticated sensor installation in a motion capture studio will typically be used. The resultant 3D demonstrator image data is processed to obtain corresponding demonstrator biomechanical data representing the demonstrator's performance of the exercise. This demonstrator biomechanical data may comprise a set of joint angles for a succession of time points during the exercise. Processing may be carried out on the demonstrator biomechanical data to improve it, e.g. to remove imperfections such as asymmetry (one side of the body moving differently from the other). Exercises are often repetitive and cyclical. Trainer biomechanical data for a given exercise may comprise data for a single repetition, which will be used in a loop. In this instance adjustment may be needed to ensure that there is no discontinuity between one repetition and the next.
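One simple way to remove the discontinuity between the end of one looped repetition and the start of the next is to cross-fade the tail of the joint-angle series toward its first sample. This is a sketch of the general idea only, not the system's actual adjustment algorithm:

```python
def blend_loop(angles, blend_frames=5):
    """Smooth the seam of a looped repetition by cross-fading the last
    `blend_frames` samples of a joint-angle series toward the first sample,
    so that playback of the loop has no visible jump at the repeat point."""
    out = list(angles)
    n = len(out)
    first = out[0]
    for i in range(blend_frames):
        idx = n - blend_frames + i
        # Weight rises toward the loop point, pulling the tail to the start
        w = (i + 1) / (blend_frames + 1)
        out[idx] = out[idx] * (1 - w) + first * w
    return out
```

The same blend would be applied per joint across the whole skeletal model.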
The result of this process is a set of trainer biomechanical data representing the skeletal movement of the virtual trainer during the exercise.
A library of exercises may be created in this manner, the resultant trainer biomechanical data for each exercise being stored so that a physiotherapist or other practitioner is able to select from the library an exercise or set of exercises suitable for the needs of the user 8.
A game engine is used to generate the on-screen virtual trainer 28 from the trainer biomechanical data. Looking again at Figure 2, a typical screen display provided by the system during an exercise session includes the Avatar 22 alongside the virtual trainer 28. The virtual trainer 28 demonstrates the exercises to be performed. The user 8 attempts to emulate these in real time. The juxtaposition of the avatar 22 and the virtual trainer 28 enables the user 8 to compare his/her performance to that of the virtual trainer, helping the user 8 to modify and improve their own performance of the exercises. The system is configured to provide further information and prompts to the user 8 during exercise. These may be on-screen, in the form of text or graphics, and/or they may be audible, e.g. in the form of spoken directions. In the example provided in Figure 2, on-screen information 32 comprises patient name, title of current exercise and a real time indication of the number of repetitions that have been performed.
The virtual trainer 28 and the Avatar 22 are depicted in a virtual environment 34 which may be user selectable.
The system carries out a comparison between the user biomechanical data and the trainer biomechanical data, and is able to provide the user with feedback based on the comparison. This can be done in real time, providing user feedback during performance of the exercise, and/or in non-real time, e.g. providing the user with summary information relating to his/her performance after completion of an exercise.
In the present embodiment the analysis of the user's performance is based in particular on biomechanical data in the form of skeletal joint angles. Angles of selected joints in the user biomechanical data are compared to the angles of the corresponding joints in the trainer biomechanical data. An average deviation may be determined, which may take the form of a root mean square deviation.
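The comparison just described, a root mean square deviation over corresponding joint angles, can be sketched as follows (names are illustrative):

```python
import math

def rms_deviation(user_angles, trainer_angles):
    """Root mean square deviation, in degrees, between corresponding joint
    angles sampled from the user and from the virtual trainer."""
    diffs = [u - t for u, t in zip(user_angles, trainer_angles)]
    return math.sqrt(sum(d * d for d in diffs) / len(diffs))

# Three monitored joints; the user's elbow and wrist deviate slightly
print(rms_deviation([90.0, 45.0, 30.0], [100.0, 40.0, 30.0]))
```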
If the performance of the user 8 falls outside a certain acceptable range then the system may be configured to suspend the exercise and provide the user 8 with additional direction, with a view to correcting his/her performance. In this context the system according to the present embodiment assesses both the tempo of the user's movements (whether he/she is keeping in time with the virtual trainer) and his/her range of joint movement. In particular, a current version of this algorithm suspends the exercise if the positions of a minimum of three joints deviate outside of a tolerable range. For example, if the exercise is a lateral raise, the algorithm may monitor the elbow, wrist and shoulder with reference to an adjacent joint or joint cluster (in this example the clavicle and the other shoulder). If the exercise is suspended then the user is advised of the issue and presented with a short on-screen video presentation to show the correct movement.
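A minimal sketch of such a suspension rule follows. The fixed tolerance of 15 degrees and the joint names are assumptions for illustration; the actual tolerances and monitored joints are chosen per exercise:

```python
def should_suspend(user_angles, trainer_angles, tolerance_deg=15.0, min_joints=3):
    """Return True when at least `min_joints` monitored joints deviate from
    the trainer's corresponding angle by more than `tolerance_deg` degrees.

    Both arguments map joint names to current angles in degrees. A joint
    missing from the user data is treated as matching the trainer.
    """
    out_of_range = sum(
        1 for joint, target in trainer_angles.items()
        if abs(user_angles.get(joint, target) - target) > tolerance_deg
    )
    return out_of_range >= min_joints

trainer = {"elbow": 90.0, "wrist": 10.0, "shoulder": 80.0, "clavicle": 5.0}
user = {"elbow": 60.0, "wrist": 40.0, "shoulder": 50.0, "clavicle": 6.0}
print(should_suspend(user, trainer))  # True: three joints out of tolerance
```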
After performance of an exercise, and/or after an exercise session, the system may provide the user with a numerical rating of his/her performance (e.g. a score out of one hundred) and/or with feedback, suggestions for improvement etc.
The system can be configured to calculate and output any of a wide range of performance metrics, for the benefit of the user and/or of a supervising clinician or other practitioner. The metrics may relate to a single performance of an exercise and/or to multiple temporally separated exercise sessions, so that improvement or deterioration of user performance is able to be tracked. Examples of performance metrics include joint mobility (the maxima/minima of the angle of a specific joint), changes of joint mobility over temporally separated exercise sessions (an indicator of the progress a user 8 is making), user compliance (does the user carry out the specified exercises with the intended frequency?) and others.
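The joint mobility metric mentioned above, and its change across temporally separated sessions, reduce to simple computations over a joint's angle series. A sketch with illustrative names:

```python
def joint_mobility(angle_series):
    """Joint mobility as (minimum, maximum, range) of a joint-angle
    series from one exercise session, all in degrees."""
    lo, hi = min(angle_series), max(angle_series)
    return lo, hi, hi - lo

def mobility_change(earlier_session, later_session):
    """Change in range of movement between two sessions: positive values
    indicate improvement, an indicator of the progress the user is making."""
    return joint_mobility(later_session)[2] - joint_mobility(earlier_session)[2]

# Shoulder flexion range improves from 30 to 45 degrees between sessions
print(mobility_change([0.0, 20.0, 30.0], [0.0, 30.0, 45.0]))  # 15.0
```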
The peripheral 14 is intended to be carried by or attached to the user 8 during performance of the exercise, to provide peripheral sensor data - additional to that from the sensor arrangement 12 - relating to movement of the user 8. The peripheral 14 has onboard sensors responsive to its motion and/or orientation. These may comprise any suitable combination of accelerometers, gyroscopes, inclinometers and magnetometers. The present embodiment uses a commercial nine-axis motion tracking sensor having a three-axis gyroscope, a three-axis accelerometer and a three-axis magnetometer (compass). Suitable PCB-mountable devices are widely commercially available.
The peripheral 14 outputs sensor data through a suitable peripheral interface to the console 10 and may in some embodiments receive instructions or other data from the console 10. In the present embodiment the peripheral interface is wireless. It may use a radio frequency link or may for example use a suitably modulated optical signal (which may be an infra-red signal) for data transfer. A radio frequency link conforming to the Bluetooth® protocol is used in the present embodiment.
The example of the peripheral 14 depicted in Figure 4 is of a size, shape and mass making it comfortable to carry in the hand. In the illustrated embodiment it has a cuboidal housing sized to be easily grippable against the palm. The peripheral 14 has multiple functions.
In the present embodiment the peripheral 14 serves as a controller for the console 10, enabling the user 8 to provide control inputs to the console 10 without needing to resort to a more cumbersome input device such as a keyboard during an exercise session. The peripheral 14 may be configured to serve as a pointing device to enable options to be selected by pointing the peripheral 14 at the display screen 13. It may control an on-screen cursor. The pointing function may be implemented using an optical sensor responsive to an optical beacon, which may be carried by the console 10. Additionally or alternatively the pointer function may make use of the peripheral's motion sensors. The peripheral device 14 comprises in the present embodiment a selector device in the form of a push button 30. Thus for example options may be selected from a menu or display in point and click fashion.
During performance of an exercise, the peripheral 14 transmits the peripheral data through the peripheral interface to the console 10, and the peripheral data can be used in combination with the image data in obtaining the user biomechanical data, which creates possibilities for improving the accuracy of the user biomechanical data (i.e. improving the correspondence between the user biomechanical data and the actual movements of the user).
The image data alone may not provide a good basis for determining the rotational position of the human arm. For example, if a straight arm is extended laterally, it may be difficult to determine from the image data alone whether the thumb is pointing upwards or downwards. The distinction may be a significant one in the context of certain physiotherapeutic exercises. Ambiguity in the image data in this respect can be resolved by making use of the peripheral data. The orientation of the peripheral 14 can be determined from the peripheral data. Once it is known in what orientation the peripheral is carried in the hand (which may be established by giving the user suitable direction and/or by analysis of the peripheral data along with the image data), the orientation of the hand can thus be determined from the peripheral data.
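One way the peripheral's sensed gravity vector could resolve the thumb-up/thumb-down ambiguity is sketched below. The axis convention (the peripheral's z axis lying along the thumb when gripped) and the 0.5 g decision threshold are assumptions for illustration, not the system's specified method:

```python
def thumb_direction(gravity_z):
    """Classify thumb orientation of a laterally extended arm from the
    gravity component (in g) along the peripheral's z axis, assuming the
    peripheral is gripped with its z axis pointing along the thumb."""
    if gravity_z > 0.5:
        return "down"   # z axis roughly aligned with gravity
    if gravity_z < -0.5:
        return "up"     # z axis roughly opposed to gravity
    return "indeterminate"

print(thumb_direction(-0.9))  # up
```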
The peripheral 14 can be used to monitor user performance whilst the user is out of the field of view of the sensor arrangement 12. For example a widely used assessment technique involves instructing the user 8 to walk a short distance, such as 10 metres. That may take the user 8 out of the field of view of the sensor arrangement 12, but the peripheral 14 continues to provide peripheral data which can be used to make inferences about stride, cadence etc. of the user's walking gait.
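A minimal sketch of such a gait inference follows. The threshold value, sample rate and peak-counting scheme are illustrative assumptions rather than the method of the disclosure; each crossing of an acceleration-magnitude threshold is counted as one step, from which cadence follows directly:

```python
def cadence_steps_per_minute(accel_magnitude, sample_rate_hz, threshold=1.2):
    """Estimate walking cadence from peripheral accelerometer magnitude
    samples (in g). Each upward crossing of the threshold counts as one
    step; the 1.2 g threshold is an illustrative assumption."""
    steps = 0
    above = False
    for a in accel_magnitude:
        if a > threshold and not above:
            steps += 1
            above = True
        elif a <= threshold:
            above = False
    duration_min = len(accel_magnitude) / sample_rate_hz / 60.0
    return steps / duration_min if duration_min > 0 else 0.0
```

If the walked distance is known (e.g. the 10 metre assessment mentioned above), mean stride length follows as distance divided by step count.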
The peripheral 14 need not always be hand held. It may be mounted to another part of the body of the user 8 whose movement is of particular interest. For example it may be provided with a strap or other attachment means in order to be carried on the leg, arm or trunk of the user. Alternatively it may be attachable to or incorporated in a piece of equipment, especially a piece of exercise equipment, to expand the scope of applications of the present system to include monitoring of usage of such equipment.
The console 10 of the present embodiment is configured to connect to a wide area network, and more specifically to the internet, making possible a range of advantageous functions.
The biomechanical data obtained by the system can be uploaded to a remote database for study and analysis. This can be useful both in relation to the treatment of individual users and in relation to statistical or other studies of a population of users. The system may be configured to request, obtain and manage user consent (or denial of consent) to the use of the biomechanical data for such purposes. The system may also be configured to obtain information about the user, e.g. by means of a form delivered on-screen. Such data, along with data provided by the managing practitioner, can then be linked to the user biomechanical data, providing a further resource for statistical study.
As concerns the individual, the biomechanical data and/or performance metrics derived from it may be made available through the wide area network to an authorised clinician or practitioner (or group of practitioners) to enable him/her/them to assist the user. For example, for a given user a treatment process may begin with a consultation with a physiotherapist, who makes a determination or diagnosis and selects a set of exercises to be carried out by the user to alleviate the problem. The console 10 can be configured, directly by the patient or remotely by the physiotherapist, to lead the user through the chosen exercises. The console 10 may be sold or lent to the user who can then take it home, or it may be made available to the user in a clinic, hospital etc. In either case, the user will then typically be required to carry out the exercises over a period of days or weeks, and the resultant biomechanical data representing the user's performance of the exercises is made available to the physiotherapist through the internet. This is hugely advantageous. It gives the physiotherapist a direct means of monitoring the user's compliance with the exercise programme and their progress, and the ability to tailor subsequent advice accordingly.
The practitioner may be provided with an on-screen dashboard or other form of interface for management of a group of users (patients) under the practitioner's care. The system may perform analysis of user performance to assist the practitioner in managing their workload. For example, filters may be provided to enable users failing to meet their goals to be identified, to enable the practitioner to prioritise those users. Machine learning techniques may be applied in the filtering process.
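Such a filter might be sketched as below. The `PatientSummary` fields and the 70% compliance threshold are illustrative assumptions introduced for this sketch, not details of the disclosure; the point is that patients falling behind their prescribed sessions surface first on the practitioner's dashboard:

```python
from dataclasses import dataclass

@dataclass
class PatientSummary:
    name: str
    sessions_completed: int
    sessions_prescribed: int

def needs_attention(p, compliance_threshold=0.7):
    """Flag patients whose compliance (completed / prescribed sessions)
    falls below a threshold; the 70% figure is an assumption."""
    return p.sessions_completed / p.sessions_prescribed < compliance_threshold

def prioritised(patients):
    """Return flagged patients, lowest compliance first, for display
    at the top of the practitioner's dashboard."""
    return sorted((p for p in patients if needs_attention(p)),
                  key=lambda p: p.sessions_completed / p.sessions_prescribed)
```

A machine-learned model could replace `needs_attention` as the flagging criterion, as the passage above contemplates.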
Transmission of user data through the wide area network will typically be made using a suitably secure communication process, for example by use of a virtual private network.
The console 10 may be configured to provide for real time communication between the user and a practitioner through a text link, an audio link, or a video (and audio) link. This communication may be conducted through the wide area network. In one such example the console 10 provides the user with the facility to communicate with their practitioner, reducing the need for face to face consultation. The practitioner can receive, through the wide area network and where desired in real time, the user's biomechanical data. The practitioner may view the Avatar and/or a video stream of the user, as well as listen to the user through an audio link. Thus, for example, the practitioner can, during an online consultation, monitor the user's performance of the exercises, ask questions and provide advice. Such communication may be facilitated by the use of a conventional camera on the console 10, for videoconferencing.
Online consultation need not necessarily be provided by means of a one to one relationship between user and practitioner. An alternative is to provide the user through the console 10 with a communication link to a shared resource, such as a group of advisers dealing with communications for a number of users. In such a system users may be allocated to advisers according to their availability, e.g. by means of some form of queueing system.
To enable functions such as prioritisation of users, artificial intelligence techniques may be applied to perform triage before contact is established with a human practitioner.
Communication may be initiated by the user, e.g. through a selectable on-screen prompt. In some instances some or all responses to the user's communication may be provided by a computer system,
e.g. in the form of a chatbot. The chatbot may serve to carry out the aforementioned triage before making a connection to a human adviser, to weed out nuisance contacts, identify and prioritise emergencies and so on.
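The triage stage might, in its simplest form, be rule-based. The keyword list and the three categories below are illustrative assumptions, not part of the disclosure; a production chatbot would of course use a richer model:

```python
# Illustrative emergency terms; a real system would use a clinically
# validated vocabulary or a trained classifier.
EMERGENCY_TERMS = {"chest pain", "breathless", "numbness", "fallen"}

def triage(message):
    """Rule-based triage of an incoming user message, as a stand-in for
    the chatbot described above. Categories are assumptions."""
    text = message.lower()
    if any(term in text for term in EMERGENCY_TERMS):
        return "emergency"   # route to a human adviser immediately
    if not text.strip():
        return "nuisance"    # weed out empty contacts
    return "routine"         # queue for the next available adviser
```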
Physiotherapists often request patients to complete evaluation questionnaires during or after a treatment programme, which can for example be used as a measure of the treatment's efficacy. These questionnaires can be administered through the console 10, their results being transmitted through the wide area network to the practitioner.
When one looks at the wider uses to which the user biomechanical data can be put, it must be appreciated that the data obtained in this manner from a population of users constitutes a potentially valuable resource. Since this data need not include any images of the user it is not of an intrusively personal type and it can straightforwardly be anonymised, reducing obstacles to sharing of the data. The system may nonetheless be configured to receive data relating to the user which may for example include age, lifestyle data, body weight, medical information and so on. The system can thus yield a valuable resource for studies addressing for example the efficacy of exercises or exercise regimes, trends in population mobility and fitness, the effects of lifestyle on aspects of physical performance, and so on. Machine learning techniques can be applied to the data, for example to identify key flags or precursors in a patient group.
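One common anonymisation approach, offered here purely as a sketch (the field names and keyed-hash scheme are assumptions, not details of the disclosure), is to replace the user identifier with a keyed hash before upload, so that records from the same user remain linkable for statistical study without exposing identity:

```python
import hashlib
import hmac

def pseudonymise(user_id, secret_key):
    """Replace a user identifier with an HMAC-SHA256 digest. A keyed
    HMAC (rather than a plain hash) prevents dictionary attacks on
    guessable identifiers such as patient numbers."""
    return hmac.new(secret_key, user_id.encode(), hashlib.sha256).hexdigest()

# Illustrative uploaded record: identity is pseudonymised and exact
# age is coarsened into a band before leaving the console.
record = {
    "user": pseudonymise("patient-0042", b"server-side-secret"),
    "age_band": "60-69",
    "exercise": "shoulder-abduction",
    "max_angle_deg": 143.5,
}
```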

Claims (26)

1. A system configured to direct a user in the performance of exercise and to monitor the user's performance of the exercise, the system comprising:
a range imaging sensor arrangement configured to sense the user during performance of the exercise and to output resultant three-dimensional image data;
image processing logic configured to detect the user in the image data and to obtain from the image data user biomechanical data representing skeletal movement of the user;
avatar generation logic for generating an on-screen avatar which moves in accordance with the user biomechanical data;
a memory storing trainer biomechanical data for at least one exercise, the trainer biomechanical data representing movement of a body during proper performance of the exercise;
virtual trainer generation logic for generating an on-screen virtual trainer which moves in accordance with the trainer biomechanical data;
so that during performance of exercise by the user the virtual trainer and the avatar are displayable concurrently to the user in real time or near real time.
2. A system as claimed in claim 1 which is configured to compare the user biomechanical data with the trainer biomechanical data and to provide any one or more of:
feedback to the user;
instruction to the user; and
one or more metric(s) relating to user performance;
based upon the said comparison.
3. A system as claimed in claim 1 which is configured to monitor in real time deviation of the user biomechanical data from the trainer biomechanical data and to provide the user with an indication if the difference between the two exceeds a predetermined threshold.
4. A system as claimed in claim 3 in which the said indication comprises suspension of the exercise and provision by the system of instructions for its proper performance.
5. A system as claimed in any preceding claim in which the biomechanical data comprises joint positions and/or joint angles of a skeletal model.
6. A system as claimed in any preceding claim in which the range imaging sensor arrangement comprises a structured light projector, an imaging sensor responsive to light from the structured light projector and processing logic for providing a range image by analysis of the imaging sensor's output.
7. A system as claimed in any of claims 1 to 5 in which the range imaging sensor arrangement comprises a stereoscopic arrangement of imaging sensors.
8. A system as claimed in any of claims 1 to 5 in which the range imaging sensor arrangement comprises a time of flight camera.
9. A system as claimed in any preceding claim which further comprises a peripheral configured to be carried on a body part of the user and to move along with it, the peripheral comprising a peripheral sensor arrangement responsive to motion and/or orientation of the peripheral.
10. A system as claimed in claim 9 in which peripheral sensor data output from the peripheral sensor arrangement is used along with the image data to determine the user biomechanical data.
11. A system as claimed in claim 10 in which the peripheral is configured to be held in the user's hand and the peripheral sensor data is used to determine rotational position of the user's hand and/or arm.
12. A system as claimed in any of claims 9 to 11 which is configured to receive user control input from the peripheral.
13. A system as claimed in any preceding claim which comprises a unitary console which comprises:
the range imaging sensor arrangement, the memory, and one or more processors configured to implement the image processing logic, the avatar generation logic and the virtual trainer generation logic.
14. A system as claimed in claim 13 in which the console is configured to connect to a wide area network.
15. A system as claimed in claim 14 in which the console is configured to transmit any of the following to a remote server through the wide area network:
the biomechanical data;
metrics obtained from the biomechanical data;
a compressed form of the biomechanical data.
16. A method of directing a user in the performance of exercise and monitoring the user's performance of the exercise, the method comprising:
imaging the user during performance of the exercise to obtain three-dimensional image data of the user;
processing the image data to obtain user biomechanical data representing skeletal movement of the user;
using the biomechanical data to generate an on-screen avatar which moves in accordance with the user biomechanical data;
storing trainer biomechanical data for at least one exercise, the trainer biomechanical data representing movement of a body during proper performance of the exercise;
using the trainer biomechanical data to generate an on-screen virtual trainer which moves in accordance with the trainer biomechanical data; and displaying the avatar and the virtual trainer on-screen to the user during performance of the exercise.
17. A method as claimed in claim 16 further comprising compiling the trainer biomechanical data by a process comprising imaging a demonstrator performing the exercise to provide demonstrator image data.
18. A method as claimed in claim 17 in which the demonstrator image data is processed to obtain demonstrator biomechanical data and then adjusted to form the trainer biomechanical data.
19. A method as claimed in claim 18 in which the adjustment to the demonstrator biomechanical data includes adjustments to remove asymmetry.
20. A method as claimed in any of claims 16 to 19 comprising monitoring in real time deviation of the user biomechanical data from the trainer biomechanical data and providing the user with an indication if the difference between the two exceeds a predetermined threshold.
21. A method as claimed in claim 20 in which the said indication comprises suspension of the exercise and provision of instructions for its proper performance.
22. A method as claimed in any of claims 16 to 21 in which the biomechanical data comprises joint positions and/or joint angles of a skeletal model.
23. A method as claimed in any of claims 16 to 22 in which the user carries or wears a peripheral comprising a peripheral sensor arrangement responsive to motion and/or orientation of the peripheral.
24. A method as claimed in claim 23 in which peripheral sensor data output from the peripheral sensor arrangement is used along with the image data to determine the user biomechanical data.
25. A method as claimed in claim 24 in which the peripheral is held in the user's hand and the peripheral sensor data is used to determine rotational position of the user's hand and/or arm.
26. A method as claimed in any of claims 16 to 25 further comprising transmitting to a remote server any of the following:
the biomechanical data;
metrics obtained from the biomechanical data;
a compressed form of the biomechanical data.
GB1811032.0A 2018-07-05 2018-07-05 Method and system for directing and monitoring exercise Withdrawn GB2575299A (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
GB1811032.0A GB2575299A (en) 2018-07-05 2018-07-05 Method and system for directing and monitoring exercise

Publications (2)

Publication Number Publication Date
GB201811032D0 GB201811032D0 (en) 2018-08-22
GB2575299A true GB2575299A (en) 2020-01-08

Family

ID=63170854

Country Status (1)

Country Link
GB (1) GB2575299A (en)

Cited By (1)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN113096761A (en) * 2020-01-08 2021-07-09 佛山市云米电器科技有限公司 Entertainment and fitness method, cloud server, entertainment and fitness system and storage medium

Families Citing this family (1)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN114822143B (en) * 2022-06-29 2022-09-02 深圳前海壹路科技有限公司 Military training intelligent examination management system and method

Citations (4)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US20110098109A1 (en) * 2009-10-23 2011-04-28 Disney Enterprises, Virtual game instructor
US20120021833A1 (en) * 2010-06-11 2012-01-26 Harmonic Music Systems, Inc. Prompting a player of a dance game
WO2012071548A1 (en) * 2010-11-24 2012-05-31 Nike International Ltd. Method and system for automated personal training that includes training programs
WO2015148676A1 (en) * 2014-03-26 2015-10-01 Reflexion Health, Inc. Systems and methods for teaching and instructing in a virtual world including multiple views


Legal Events

Date Code Title Description
WAP Application withdrawn, taken to be withdrawn or refused ** after publication under section 16(1)