WO2010049848A1 - Control unit for a system and method for providing feedback to a user - Google Patents


Info

Publication number
WO2010049848A1
Authority
WO
WIPO (PCT)
Prior art keywords
user
signals
motion
control unit
point
Prior art date
Application number
PCT/IB2009/054628
Other languages
French (fr)
Inventor
Stefan Winter
Richard D. Willmann
Juergen Te Vrugt
Original Assignee
Koninklijke Philips Electronics N.V.
Philips Intellectual Property & Standards GmbH
Priority date
Filing date
Publication date
Application filed by Koninklijke Philips Electronics N.V. and Philips Intellectual Property & Standards GmbH
Publication of WO2010049848A1

Classifications

    • G - PHYSICS
    • G16 - INFORMATION AND COMMUNICATION TECHNOLOGY [ICT] SPECIALLY ADAPTED FOR SPECIFIC APPLICATION FIELDS
    • G16H - HEALTHCARE INFORMATICS, i.e. INFORMATION AND COMMUNICATION TECHNOLOGY [ICT] SPECIALLY ADAPTED FOR THE HANDLING OR PROCESSING OF MEDICAL OR HEALTHCARE DATA
    • G16H50/00 - ICT specially adapted for medical diagnosis, medical simulation or medical data mining; ICT specially adapted for detecting, monitoring or modelling epidemics or pandemics
    • G16H50/50 - ICT specially adapted for medical diagnosis, medical simulation or medical data mining; ICT specially adapted for detecting, monitoring or modelling epidemics or pandemics for simulation or modelling of medical disorders

Definitions

  • Control unit for a system and method for providing feedback to a user
  • The invention relates to a control unit for a system and a method for providing feedback to a user, and in particular to a control unit for a system and a method that can provide feedback on the execution of an action by a user.
  • Strokes are one of the most significant causes of disability in the industrialized world. Prominent disabilities that stroke patients suffer from include half-sided paralysis of the upper limbs and the impairment of their internal feedback mechanisms. It has been proven that rehabilitation exercises are effective in allowing the patient to regain motor control, provided that the training is intense and the patient is guided in the therapy.
  • 'Augmented' or 'external' feedback can be given verbally through a coach (in the case of a healthy person learning, for example, a golf swing) or a physiotherapist (in the case of a stroke patient relearning, for example, to reach for an object).
  • This type of feedback is in contrast to 'internal' feedback where the person uses their own senses, such as vision, touch or proprioception.
  • An unsupervised home stroke rehabilitation system must automatically provide relevant external feedback (such as voice commands or visual cues) during or after the exercises.
  • The external feedback must be based on features that are extracted from movements of the patient. Ideally, the movements are directly measured, for example, by inertial sensors. Then, in order to extract meaningful features for feedback, relevant segments of the measured data must be identified.
  • The rehabilitation process should be supported by internal feedback as much as possible. In traditional therapy, this is facilitated by using real objects that the patient can interact with and that resemble objects that they use in daily life.
  • The technology-supported rehabilitation system developed in the SMART project (http://www.thesmartconsortium.org/) is based on a motion capture system that enables detailed qualitative feedback (i.e. it provides 'knowledge of performance').
  • A virtual tabletop workspace has been developed ("A virtual tabletop workspace for the assessment of upper limb function in Traumatic Brain Injury (TBI)" by Wilson, P. et al., Proceedings of Virtual Rehabilitation 2007, 2007, 14-19) that incorporates real objects in the context of technology-supported rehabilitation of stroke patients.
  • Another known system is AutoCITE (Automated Constraint-Induced Therapy Extension), an automated extension of constraint-induced therapy for movement deficits after stroke.
  • According to the invention, there is provided a control unit configured to: receive a first set of signals, the first set of signals representing motion of at least a part of the body of a user in response to an instruction indicating an action for the user to perform in relation to an object; receive a second set of signals, the second set of signals representing the monitored position of the object; analyze the first and second sets of signals to determine a quality of the motion of the user in performing the action; and determine feedback for the user based on the determined quality of the motion.
  • The present invention uses data from a motion tracking unit and data from an object tracking unit in order to provide the user with enhanced feedback on the action they have performed or are performing.
  • The data from the motion tracking unit provides knowledge of the specific movements made by the user's body in performing the action, and the data from the object tracking unit provides knowledge of the result of the action.
  • The invention is able to operate without restrictions on the types of movements or actions that can be undertaken by the user.
  • In some embodiments, the control unit is configured to analyze the second set of signals to identify a subset of the first set of signals relating to the motion of the user in performing the action.
  • In one embodiment, the control unit is configured to determine a start point of the subset as the point at which the second set of signals indicates that the object has been moved.
  • In an alternative embodiment, the control unit is configured to determine a start point of the subset as a predetermined period before or after the point at which the second set of signals indicates that the object has been moved.
  • In another embodiment, the control unit is configured to determine a start point of the subset by identifying a point in the second set of signals that indicates that the object has been moved, and identifying a point in the first set of signals that corresponds to the user starting to perform the action using said point in the second set of signals.
  • In one embodiment, the control unit is configured to analyze the second set of signals to determine an end point of the subset as the point at which the second set of signals indicates that the object has been placed in a position indicated by the instruction or another position. In an alternative embodiment, the control unit is configured to determine an end point of the subset as a predetermined period before or after the point at which the second set of signals indicates that the object has been placed in a position indicated by the instruction or another position.
  • In another embodiment, the control unit is configured to determine an end point of the subset by identifying a point in the second set of signals that indicates that the object has been placed in a position indicated by the instruction or another position, and identifying a point in the first set of signals that corresponds to the user completing the action using said point in the second set of signals.
  • In this way, the positions of the object are used to identify important portions in the continuous stream of motion data.
  • Features derived from data in these segments serve as the basis for the enhanced feedback in accordance with the invention.
  • In some embodiments, the control unit is configured to determine a quality of the motion of the user by determining the smoothness of the motion of the user, the presence of any compensatory motion in the motion of the user, and/or a measure of the deviation between the motion of the user and a predetermined template or pattern for the motion.
  • According to another aspect of the invention, there is provided a system for providing feedback to a user, comprising: a user interface for providing an instruction to a user, the instruction indicating an action for the user to perform in relation to an object; a motion tracking unit for monitoring the motion of at least a part of the body of the user in response to the instruction and for generating a first set of signals, the first set of signals representing the monitored motion; an object tracking unit for monitoring the position of the object and for generating a second set of signals, the second set of signals representing the monitored position of the object; and a control unit as described above.
  • In one embodiment, the motion tracking unit comprises at least one sensor unit for attachment to a part of the body of the user.
  • In one embodiment, the object tracking unit comprises a board divided into discrete fields, and the object tracking unit can determine the presence and/or absence of the object in each of the discrete fields.
  • In this embodiment, the object tracking unit can comprise an RFID tag reader in each field, and the object a corresponding RFID tag.
  • The action for the user to perform in relation to the object can comprise moving the object from a first field to a second field.
  • The user interface can provide the instruction to the user by illuminating at least one of the first field and the second field.
  • In some embodiments, the user interface is further configured to provide a plurality of options to the user, each option corresponding to a particular action in relation to the object, and the control unit is further configured to analyze the second set of signals to determine the selection of an option by the user.
  • According to a further aspect, there is provided a method of providing feedback to a user, comprising: receiving a first set of signals, the first set of signals representing motion of at least a part of the body of a user in response to an instruction indicating an action for the user to perform in relation to an object; receiving a second set of signals, the second set of signals representing the monitored position of the object; analyzing the first and second sets of signals to determine a quality of the motion of the user in performing the action; and determining feedback for the user based on the determined quality of the motion.
  • According to a further aspect, there is provided a computer program product comprising computer program code that, when executed on a suitable computer or processor, causes the computer or processor to perform the method as described above.
  • Fig. 1 is an illustration of a system according to an embodiment of the invention and a user.
  • Fig. 2 is a block diagram of a system according to the invention.
  • Fig. 3 is an illustration of an object tracking unit in accordance with the invention.
  • Fig. 4 is a method in accordance with the invention.
  • Although the invention will be described below with reference to providing feedback to a user to aid the rehabilitation of an injury or disability following a stroke or similar condition, it will be appreciated that the invention can also be used in any application that involves the interaction of a user and an object, such as, for example, controlling a computer game.
  • Figure 1 shows a rehabilitation system 2 in accordance with a first embodiment of the invention.
  • the rehabilitation system 2 is for use by a user 4, and comprises a motion tracking unit 6, an object tracking unit 8 and a control unit 10.
  • The motion tracking unit 6 monitors or tracks the motion of at least a part of the body of the user 4.
  • The motion tracking unit 6 can monitor a specific part of the body of the user 4, such as an arm (as shown in Figure 1), but in other embodiments, the motion tracking unit 6 can monitor the motion of the whole body of the user 4.
  • In the illustrated embodiment, the motion tracking unit 6 comprises a single sensor unit 12 that can measure acceleration and orientation in three dimensions, and that is attached to the wrist, upper arm or torso of the user 4 by a band 14, or other suitable attachment means.
  • Alternatively, the motion tracking unit 6 can comprise a plurality of sensor units 12 that are located on different parts of the body of the user, in order to provide information on the motion of different parts of the body relative to each other.
  • Different types of motion tracking unit 6 can be selected for use in the rehabilitation system 2, depending on the part of the body of the user 4 to be monitored and/or on the preference of the skilled person.
  • One alternative is a camera-based motion tracking unit that obtains images of the user 4 and determines the motion of the body of the user 4 from the images.
  • The motion tracking unit 6 generates a first set of signals representing the motion of the monitored part of the body of the user 4, and the first set of signals are sent to the control unit 10.
  • The first set of signals can comprise raw data from the sensors in the sensor unit 12 (such as data from an accelerometer along with timing information), or the motion tracking unit 6 can generate the first set of signals by processing the raw data from the sensor unit 12 into a format that is suitable for use by the control unit 10.
  • In this embodiment, the first set of signals are transmitted wirelessly (for example using Bluetooth or Wi-Fi) from the motion tracking unit 6 to the control unit 10.
  • It will be appreciated that, in other embodiments, the first set of signals can be provided to the control unit 10 through a wired connection.
  • The motion tracking unit 6 shown in Figure 1 and the alternative camera-based units are known to the person skilled in the art, and will not be described further herein.
  • Exemplary systems include the AquisGrain system developed by Philips Research, and the systems developed by Xsens (http://www.xsens.com) and Vicon (http://www.vicon.com).
  • The object tracking unit 8 monitors the position of an object 16 or a plurality of objects 16 that the user 4 can interact with.
  • In this embodiment, the object tracking unit 8 comprises a board that is divided into discrete fields, into which the object 16 can be placed.
  • The object tracking unit 8 can determine which field the object 16 has been placed into, for example by placing an RFID tag reader in each field and an RFID tag on the base of the object 16. If the object tracking unit 8 is to monitor a plurality of objects, each object can be provided with a respective RFID tag.
  • When an object 16 is placed into or removed from a field, the object tracking unit 8 can generate a corresponding signal (for example, indicating the particular object, its position (i.e. field) and a time).
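The per-field detection described above can be illustrated in code. The following is only a sketch, not the patent's implementation: the board is modelled as a mapping from field index to the RFID tag currently read there (or None), and comparing two successive scans yields placement and removal events carrying the object, field and time. The names `ObjectEvent` and `diff_scans` are hypothetical.

```python
from dataclasses import dataclass

@dataclass
class ObjectEvent:
    object_id: str    # identity read from the object's RFID tag
    field: int        # index of the discrete field on the board
    present: bool     # True = object placed, False = object removed
    timestamp: float  # time of the event in seconds

def diff_scans(prev, curr, t):
    """Compare two board scans (dicts: field -> tag or None) taken at
    successive times and emit an event for every field that changed."""
    events = []
    for field in curr:
        if prev.get(field) != curr[field]:
            if prev.get(field) is not None:
                events.append(ObjectEvent(prev[field], field, False, t))
            if curr[field] is not None:
                events.append(ObjectEvent(curr[field], field, True, t))
    return events
```

Moving a tagged cup from field 0 to field 1 between two scans would thus produce one removal event and one placement event, which together form the second set of signals sent to the control unit.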
  • The generated signals form a second set of signals representing the monitored position of the object 16 (and optionally the other objects) that the user 4 has been instructed to interact with, and the second set of signals are sent to the control unit 10.
  • The second set of signals can comprise raw data from the object tracking unit 8 (such as data from an object tracking board), or the object tracking unit 8 can generate the second set of signals by processing the raw data from the object tracking board into a format that is suitable for use by the control unit 10.
  • In this embodiment, the second set of signals are transmitted over a wired connection from the object tracking unit 8 to the control unit 10.
  • It will be appreciated that, in other embodiments, the second set of signals can be provided to the control unit 10 wirelessly.
  • Other types of object tracking unit 8 can be used in the rehabilitation system 2 according to the invention.
  • One alternative is the object tracking system described in US Patent No. 6,168,158 to J. Bulsink, in which objects to be tracked are equipped with object-specific coils.
  • Other object tracking units 8 can include cameras to visually record and extract the position of the object 16.
  • The object tracking unit 8 shown in Figure 1 and the alternatives are known to the skilled person, and will not be described further herein.
  • The first and second sets of signals from the motion tracking unit 6 and object tracking unit 8 respectively are provided to the control unit 10.
  • The control unit 10 analyzes the first and second sets of signals to determine if the user 4 has complied with an instruction to interact with an object 16 in a particular way (for example, move the object from position A to position B using your right hand), and whether the movement of the relevant part of the body of the user 4 was appropriate for the instruction (for example, whether the movement of the user's shoulder in moving the object 16 was acceptable).
  • The control unit 10 can also be responsible for generating the instruction for the user 4 to perform a specific action with the object 16.
  • Figure 2 shows a block diagram of the rehabilitation system 2.
  • The control unit 10 comprises a processor 18 which receives the first and second sets of signals from the motion tracking unit 6 and object tracking unit 8.
  • The control unit 10 includes a display 20 for providing written or graphical instructions and/or feedback to the user 4, and a speaker 22 for providing audible instructions and/or feedback to the user 4.
  • The system 2 also includes a user instruction generator 24 that generates the instructions for the user 4.
  • The instructions indicate an action for the user 4 to perform in relation to an object 16.
  • A set of possible instructions can be stored, and the user instruction generator 24 can select an instruction for the user 4 to follow when required.
  • Alternatively, the user instruction generator 24 can arbitrarily determine the instruction on request, within a set of parameters determined by the objects 16 and the object tracking unit 8. For example, if there is a set of three objects and the object tracking unit 8 can monitor objects in sixteen discrete fields, the instruction can comprise moving a specific one of the three objects from its current field to another of the sixteen available fields.
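As an illustration of this kind of parameterized instruction generation, the sketch below picks one of the objects currently on the board and a free destination field at random. This is an assumed design, not code from the patent; `generate_instruction` and the board-state dictionary are hypothetical.

```python
import random

def generate_instruction(board_state, num_fields=16):
    """Randomly choose an occupied field (the object to move) and a
    free destination field from the board's discrete fields.
    board_state maps field index -> object name, or None if empty."""
    occupied = [(f, obj) for f, obj in board_state.items() if obj is not None]
    src, obj = random.choice(occupied)
    free = [f for f in range(num_fields) if board_state.get(f) is None]
    dest = random.choice(free)
    return {"object": obj, "from": src, "to": dest}
```

With three objects on a sixteen-field board, each call yields an instruction of the form "move this object from its current field to that empty field", matching the example above.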
  • The user instruction generator 24 then provides the generated instruction to an instruction indicator 26 that indicates the instruction to the user 4.
  • The instruction indicator 26 can comprise a set of visual indicators, such as LEDs or other lights, in each field, indicating which object 16 should be moved, and where it should be moved to.
  • The user instruction generator 24 also provides an indication of the instruction to the processor 18 so that the processor 18 can determine whether the user 4 has complied with the instruction.
  • The display 20 and/or speaker 22 can also be used to provide the instruction to the user 4, in addition to, or instead of, the instruction indicator 26.
  • The object tracking unit 8, user instruction generator 24 and instruction indicator 26 can be integrated into a single unit (as indicated by dashed box 28 in Figure 2), and an exemplary embodiment is shown in Figure 3.
  • In this embodiment, the object tracking unit 8 and instruction indicator 26 comprise a grid 30, with each square in the grid having an associated RFID tag reader for reading an RFID tag in or on the objects 16a, 16b and an LED or other light source for indicating an object 16a, 16b to be moved and a destination.
  • For example, the instruction indicator 26 can illuminate a box in which one of the objects 16 is present (for example box 32 in which object 16b is located), and a destination box, indicated by reference numeral 34.
  • Alternatively, the instruction indicator 26 can comprise a projector which projects discrete fields onto the board of the object tracking unit 8, corresponding to those in the object tracking unit 8, and which can vary the color or content of particular ones of the projected fields.
  • In another embodiment, the object tracking unit 8 and instruction indicator 26 can make use of the Entertaible developed by the applicant.
  • FIG. 4 illustrates a method in accordance with the invention.
  • First, the rehabilitation system 2 is activated.
  • This step includes setting up and switching on the relevant apparatus, such as the motion tracking unit 6, object tracking unit 8 and control unit 10, and the user 4 preparing to use the system 2, including putting on any sensors required by the motion tracking unit 6.
  • Next, an instruction is generated that instructs the user 4 to perform an action with an object 16.
  • This instruction can comprise, for example, instructing the user 4 to pick up an object 16 from a first position and to put the object 16 down at a second position.
  • In step 105, the user 4 is instructed to perform the action with the object 16.
  • This instruction can be provided to the user by the instruction indicator 26 and/or by the display 20 and speaker 22.
  • In step 107, the motion of the arm (or arms) of the user 4 is monitored by the motion tracking unit 6, and the position of the object 16 is monitored by the object tracking unit 8.
  • The signals from the motion tracking unit 6 and object tracking unit 8 are provided to the processor 18 of the control unit 10, where they are analyzed to determine the compliance of the user 4 with the instruction (step 109).
  • In step 111, feedback is generated for the user 4 from the analysis of the signals, and this is provided to the user 4 via the display 20 and/or speaker 22.
  • The feedback can be generated after the user 4 has completed the action required by the instruction, or it can be generated during the movement of the user 4, by analyzing the signals from the motion tracking unit 6 and object tracking unit 8 as they are produced. In this way, the rehabilitation system 2 can provide feedback to the user 4 in real time (i.e. as the action is being performed).
  • In step 113, the instruction can be repeated (if, for example, the user 4 did not execute the required movement properly, or if it is deemed that repetition will aid the rehabilitation of the user 4), or a new instruction can be generated. The process then returns to step 105.
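The sequence of steps above can be summarized as a simple control loop. The sketch below is only a structural outline under assumed method names (`activate`, `generate_instruction`, `indicate`, `record`, `analyze`, `give_feedback`); it is not code from the patent.

```python
def run_session(system, num_trials=10):
    """Outline of the Fig. 4 method: activate once, then repeatedly
    instruct, monitor, analyze and give feedback (steps 105-113)."""
    system.activate()  # first step: set up units, user puts on sensors
    for _ in range(num_trials):
        instruction = system.generate_instruction()
        system.indicate(instruction)        # step 105: present the instruction
        motion, objects = system.record()   # step 107: motion + object signals
        quality = system.analyze(motion, objects, instruction)  # step 109
        system.give_feedback(quality)       # step 111
        # step 113: the loop itself repeats or generates a new instruction
```

A real implementation would also decide per trial whether to repeat the same instruction, as described in step 113.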
  • The object tracking unit 8 allows the control unit 10 to provide feedback on the end goal (the 'knowledge of results'), for example, whether the object 16 was moved to the correct location.
  • The processor 18 can further generate qualitative feedback ('knowledge of performance'), for example, based on the amount of compensation by the shoulder or the jerkiness of the movement.
  • The feedback can be provided online (i.e. during the execution of the instruction) or offline (after the execution).
  • In some embodiments, it is necessary for the rehabilitation system 2 to implement an algorithm to recognize a specific movement or movements in the motion data, in order to determine the point at which the user 4 starts to comply with the instruction and the point at which the user 4 completes the instruction.
  • For example, an algorithm may be required to identify when the user 4 reaches towards an object 16. Such algorithms are known in the art, and will not be described further herein.
  • In this embodiment, the processor 18 uses the signals from the motion tracking unit 6 and object tracking unit 8 to identify the start and end points of the motion of the user 4.
  • The signals from the object tracking unit 8 allow the signals collected from the motion tracking unit 6 to be segmented into a subset of signals that relate to the motion of the user 4 when complying with the instruction (i.e. picking up the object 16), and signals that do not relate to the motion of the user 4 in complying with the instruction.
  • From the signals from the object tracking unit 8, the processor 18 identifies the time at which the object 16 is first moved following the issuance of the instruction, and this time point is also used as the start point of the relevant motion of the user 4 (the "relevant" motion being that involved in complying with the instruction). Thus, when analyzing the signals from the motion tracking unit 6, the processor 18 considers the motion occurring after the identified time point.
  • Alternatively, the start point of the motion can be set to a predetermined time period before the point at which the object 16 is first moved, in order to capture the motion of the user 4 just before the object 16 is picked up (i.e. corresponding to the motion of the user 4 in reaching for the object 16).
  • In another embodiment, the start point of the motion can be set by an algorithm that uses the point at which the object 16 is first moved as an estimate (which is near to the actual point at which the motion started). For example, such an algorithm can check around the estimate to determine when the user 4 actually started to move, and thereby determine the real start point. Similarly, if the user 4 is already in motion at the point that they start to comply with the instruction, the algorithm can use the point at which the object 16 is first moved as an estimate, and can check around the estimate to determine when the user 4 actually started to perform movements in order to comply with the instruction.
  • The signals from the object tracking unit 8 can be used to determine the end point of the motion of the user 4 in a corresponding fashion.
  • That is, the processor 18 can identify the end point of the motion as the point at which the object 16 is placed in another position (which may be the destination specified in the instruction or another destination), as a predetermined time period after the point at which the object 16 is placed in another position, or as a point determined by using the point at which the object 16 is placed in another position as an estimate.
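A minimal sketch of this event-based segmentation, assuming time-stamped motion samples and the pick-up/put-down times reported by the board; the `margin` parameter stands in for the "predetermined time period" mentioned above and is an assumption:

```python
def segment_motion(motion_samples, pickup_time, putdown_time, margin=0.0):
    """Keep only the (timestamp, sample) pairs recorded between the
    object's pick-up and put-down events, optionally widened by a
    margin to capture the reach towards the object."""
    start = pickup_time - margin
    end = putdown_time + margin
    return [(t, s) for (t, s) in motion_samples if start <= t <= end]
```

Only this subset of the motion data is then analyzed for quality, which is what removes the need for motion-recognition algorithms to find the relevant segment.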
  • For example, where the object 16 is a cup, the start time of the action can be identified as the time when the cup is removed from the board in the object tracking unit 8 and the board issues a corresponding signal.
  • The end time is indicated by the time at which the cup is placed back on the board of the object tracking unit 8 and the board issues the corresponding signal.
  • In this way, the processor 18 is able to identify the start and end points of the motion without needing to use complex motion identification or recognition algorithms, whose performance (i.e. success in identifying particular motions correctly) depends on the types of actions and how well the actual execution matches a pre-recorded template.
  • Therefore, this embodiment of the invention is particularly suited to use in rehabilitating users 4 that have problems in controlling their movements.
  • The processor 18 can analyze the signals from the motion tracking unit 6 and object tracking unit 8 to determine if the object 16 was placed in the location specified by the instruction. Next, the processor 18 can determine the quality of the motion of the user 4 in executing the instruction. As described above, the relevant motion (i.e. that in performing the instruction) can be identified using the start and end points of the motion of the object 16.
  • The quality of the motion of the user 4 can be given by, for example, its smoothness, the presence of any "compensations" (for example, excessive use of the shoulder in the motion) and/or a measure of the spatial and/or temporal match between the actual movement of the user 4 and a prerecorded optimal movement.
  • The processor 18 can also determine the duration of the motion.
  • The processor 18 can determine the smoothness of the motion by examining the "jerk" (given by the third derivative of position with respect to time), for example as described in "Movement smoothness changes during stroke recovery" by Rohrer et al., The Journal of Neuroscience, 2002, 22, 8297-8304 and "The coordination of arm movements: An experimentally confirmed mathematical model", The Journal of Neuroscience, 1985, 5, 1688-1703.
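For evenly sampled position data, jerk can be approximated with a third-order finite difference. The sketch below is an illustration of that idea, not the cited papers' exact metric: it returns a mean-squared-jerk score, where larger values indicate a less smooth movement.

```python
def mean_squared_jerk(positions, dt):
    """Approximate jerk (the third derivative of position with respect
    to time) by third finite differences of an evenly sampled 1-D
    trajectory, and return the mean of its square."""
    if len(positions) < 4:
        raise ValueError("need at least 4 samples")
    sq = []
    for i in range(len(positions) - 3):
        # third finite difference: p[i+3] - 3*p[i+2] + 3*p[i+1] - p[i]
        d3 = positions[i + 3] - 3 * positions[i + 2] + 3 * positions[i + 1] - positions[i]
        sq.append((d3 / dt ** 3) ** 2)
    return sum(sq) / len(sq)
```

A constant-velocity trajectory has zero jerk under this measure, while a movement with abrupt speed changes scores higher.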
  • The processor 18 can determine the presence of any compensations, for example, by determining the deviation of the position of the trunk during the movement from the position of the trunk when sitting upright. The information required to determine this deviation can be provided by the motion tracking unit 6 (for example by placing an inertial or motion sensor on the trunk of the user 4).
  • The processor 18 can use pattern recognition techniques to check whether each extracted repetition corresponds to the desired movement as given by a prerecorded optimal movement. It will be appreciated that these pattern recognition techniques are relatively simple to implement once the start and end points of the motion are determined from the motion of the object 16, in comparison to pattern recognition techniques that must also identify the relevant motion from all of the motion tracking unit data.
  • The results of the analysis are used to provide feedback to the user 4.
  • The feedback can be as simple as stating the results (i.e. the number of completed repetitions or the success rate), or it can be more informative and provide the user 4 with advice on how to improve their technique (for example: 'try not to move your shoulder').
  • For example, the feedback can be generated based on the deviation of the trunk from the optimal position for the movement (e.g. sitting upright). If the deviation is within, say, 10 degrees, the feedback can indicate to the user 4 that the movement was acceptable. However, if the deviation is more than, say, 10 degrees, the feedback can indicate to the user 4 that they should move their shoulder less.
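This threshold rule can be sketched directly. The function name and the exact messages are illustrative; the 10-degree figure follows the example above.

```python
def trunk_feedback(trunk_deviation_deg, threshold_deg=10.0):
    """Return simple feedback based on the deviation of the trunk
    (in degrees) from the upright position during the movement."""
    if abs(trunk_deviation_deg) <= threshold_deg:
        return "Movement acceptable."
    return "Try to move your shoulder less."
```
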
  • Thus, the invention provides a system for providing feedback on the execution of an action by the user.
  • The invention finds particular application in rehabilitation and exercise. As the user is interacting with real objects, the user receives natural feedback on their actions.
  • The invention can also be used as a communication interface between the user and a control unit. Based on predefined fields and colors, say, the user can interact with the control unit by placing an object on a specific field, thereby indicating their input. Such a system can be used in controlling or interacting with a computer game.
  • The object tracking unit 8 and instruction indicator 26 can be used to implement this communication interface in a rehabilitation system 2.
  • For example, the instruction indicator 26 can light up various fields to indicate different user-selectable options, such as two fields, say, one in red associated with a "back to the previous step" function and one in green associated with a "move to the next step" function, and the user 4 can indicate their choice by placing an object 16 in the desired field.
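Interpreting such a placement as a menu selection is straightforward once the board reports placement events. The sketch below is an illustration under assumed data shapes: events are dictionaries with "field" and "present" keys, and the field-to-option map is hypothetical.

```python
def interpret_selection(events, option_fields):
    """Return the option associated with the first field in which the
    user places an object, or None if no option field was used.
    option_fields maps field index -> option name."""
    for ev in events:
        if ev["present"] and ev["field"] in option_fields:
            return option_fields[ev["field"]]
    return None
```

For the two-field example above, placing the object on the green field would select the "move to the next step" function.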
  • A computer program may be stored/distributed on a suitable medium, such as an optical storage medium or a solid-state medium supplied together with or as part of other hardware, but may also be distributed in other forms, such as via the Internet or other wired or wireless telecommunication systems.

Abstract

There is provided a control unit configured to receive a first set of signals, the first set of signals representing motion of at least a part of the body of a user in response to an instruction indicating an action for the user to perform in relation to an object; receive a second set of signals, the second set of signals representing the monitored position of the object; analyze the first and second sets of signals to determine a quality of the motion of the user in performing the action; and determine feedback for the user based on the determined quality of the motion.

Description

Control unit for a system and method for providing feedback to a user
FIELD OF THE INVENTION
The invention relates to a control unit for a system and a method for providing feedback to a user, and in particular to a control unit for a system and a method that can provide feedback on the execution of an action by a user.
BACKGROUND OF THE INVENTION
Strokes are one of the most significant causes of disability in the industrialized world. Prominent disabilities that stroke patients suffer from include half-sided paralysis of the upper limbs and the impairment of their internal feedback mechanisms. It has been proven that rehabilitation exercises are effective in allowing the patient to regain motor control, provided that the training is intense and the patient is guided in the therapy.
Motor skill acquisition in healthy people, as well as in stroke patients, is facilitated by so-called 'augmented' or 'external' feedback, which can be given verbally through a coach (in the case of a healthy person learning, for example, a golf swing) or a physiotherapist (in the case of a stroke patient relearning, for example, to reach for an object). This type of feedback is in contrast to 'internal' feedback where the person uses their own senses, such as vision, touch or proprioception.
Technical solutions for unsupervised home stroke rehabilitation require the use of appropriate feedback mechanisms to ensure proper exercising. Thus, an unsupervised home stroke rehabilitation system must automatically provide relevant external feedback (such as voice commands or visual cues) during or after the exercises. In the case of motor rehabilitation, the external feedback must be based on features that are extracted from movements of the patient. Ideally, the movements are directly measured, for example, by inertial sensors. Then, in order to extract meaningful features for feedback, relevant segments of the measured data must be identified.
Based on the understanding of neuro-learning, the rehabilitation process should be supported by internal feedback as much as possible. In traditional therapy, this is facilitated by using real objects that the patient can interact with and that resemble objects that they use in daily life.
The technology-supported rehabilitation system developed in the SMART project (http://www.thesmartconsortium.org/) is based on a motion capture system that enables detailed qualitative feedback (i.e. it provides 'knowledge of performance').
For example, it is known that stroke patients often compensate movements of their impaired upper limbs with movements of their shoulder. In traditional therapy, the therapist can advise the patient to watch his shoulder in order not to compensate. In a system such as SMART, this feedback is provided by analyzing movements measured by a sensor that is attached to the trunk of the patient.
However, such a system cannot monitor or track the use of real objects such as a cup, and thus cannot effectively provide feedback on the end goal (i.e. it cannot provide 'knowledge of results').
A virtual tabletop workspace has been developed ("A virtual tabletop workspace for the assessment of upper limb function in Traumatic Brain Injury (TBI)" by Wilson, P. et al, Proceedings of Virtual Rehabilitation 2007, 2007, 14-19) that incorporates real objects in the context of technology-supported rehabilitation of stroke patients.
Similarly, AutoCITE ("Automated constraint-induced therapy extension (AutoCITE) for movement deficits after stroke" by Lum, P. et al., Journal of Rehabilitation Research & Development, 2004, 41, 249-258) makes use of real objects.
In these systems, patients receive feedback on the end goal (the 'knowledge of results'). However, these systems cannot provide 'knowledge of performance'.
For example, patients can compensate their arm movements with their shoulder without influencing the result of the placement of an object at a specific location, but in doing so the patient might not learn the correct movements. With the virtual tabletop workspace and the AutoCITE systems, this kind of feedback is not possible since they only measure the results of the movements.
There is therefore a need for a system that is able to obtain knowledge of the performance of the user in addition to knowledge of the results of the user's actions in order to provide feedback to the user so that the correct movements are learnt.
SUMMARY OF THE INVENTION
Thus, according to a first aspect of the invention, there is provided a control unit, configured to receive a first set of signals, the first set of signals representing motion of at least a part of the body of a user in response to an instruction indicating an action for the user to perform in relation to an object; receive a second set of signals, the second set of signals representing the monitored position of the object; analyze the first and second sets of signals to determine a quality of the motion of the user in performing the action; and determine feedback for the user based on the determined quality of the motion.
Thus, the present invention uses data from a motion tracking unit and data from an object tracking unit in order to provide the user with enhanced feedback on the action they have performed or are performing. The data from the motion tracking unit provides knowledge of the specific movements made by the user's body in performing the action, and the data from the object tracking unit provides knowledge of the result of the action. The invention is able to operate without restrictions on the types of movements or actions that can be undertaken by the user.
In a preferred embodiment, the control unit is configured to analyze the second set of signals to identify a subset of the first set of signals relating to the motion of the user in performing the action.
In one embodiment, the control unit is configured to determine a start point of the subset as the point at which the second set of signals indicate that the object has been moved.
In an alternative embodiment, the control unit is configured to determine a start point of the subset as a predetermined period before or after the point at which the second set of signals indicate that the object has been moved.
In a further alternative embodiment, the control unit is configured to determine a start point of the subset by identifying a point in the second set of signals that indicates that the object has been moved; and identifying a point in the first set of signals that corresponds to the user starting to perform the action using said point in the second set of signals.
In one embodiment, the control unit is configured to analyze the second set of signals to determine an end point of the subset as the point at which the second set of signals indicate that the object has been placed in a position indicated by the instruction or another position.
In an alternative embodiment, the control unit is configured to determine an end point of the subset as a predetermined period before or after the point at which the second set of signals indicate that the object has been placed in a position indicated by the instruction or another position.
In a further alternative embodiment, the control unit is configured to determine an end point of the subset by identifying a point in the second set of signals that indicates that the object has been placed in a position indicated by the instruction or another position; and identifying a point in the first set of signals that corresponds to the user completing the action using said point in the second set of signals.
Thus, in preferred embodiments, the positions of the object are used to identify important portions in the continuous stream of motion data. Features derived from data in these segments serve as the basis for the enhanced feedback in accordance with the invention.
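The segmentation described in the preceding embodiments can be illustrated with a minimal sketch (not from the patent itself; the function name, the timestamped-sample format and the fixed margin parameter are all assumptions made for illustration):

```python
def segment_motion(motion_samples, t_moved, t_placed, margin=0.0):
    """Return the subset of motion samples between the two object-event
    timestamps, optionally widened by a fixed margin (in seconds).

    motion_samples: list of (timestamp, value) pairs from the motion
    tracking unit; t_moved / t_placed: times at which the object
    tracking unit reported the object picked up and set down.
    """
    start = t_moved - margin   # start a predetermined period before the pick-up
    end = t_placed + margin    # end a predetermined period after the put-down
    return [(t, v) for (t, v) in motion_samples if start <= t <= end]
```

A margin of zero corresponds to the first embodiment above (start and end points taken directly from the object events); a non-zero margin corresponds to the "predetermined period before or after" alternatives.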
In further embodiments, the control unit is configured to determine a quality of the motion of the user by determining the smoothness of the motion of the user, the presence of any compensatory motion in the motion of the user, and/or a measure of the deviation between the motion of the user and a predetermined template or pattern for the motion.
In accordance with a second aspect of the invention, there is provided a system for providing feedback to a user, the system comprising a user interface for providing an instruction to a user, the instruction indicating an action for the user to perform in relation to an object; a motion tracking unit for monitoring the motion of at least a part of the body of the user in response to the instruction and for generating a first set of signals, the first set of signals representing the monitored motion; an object tracking unit for monitoring the position of the object and for generating a second set of signals, the second set of signals representing the monitored position of the object; and a control unit as described above.
Preferably, the motion tracking unit comprises at least one sensor unit for attachment to a part of the body of the user.
In one embodiment, the object tracking unit comprises a board divided into discrete fields, and wherein the object tracking unit can determine the presence and/or absence of the object in each of the discrete fields.
In particular embodiments, the object tracking unit comprises an RFID tag reader in each field, and wherein the object comprises a corresponding RFID tag.
Preferably, the action for the user to perform in relation to the object comprises moving the object from a first field to a second field. Preferably, the user interface provides the instruction to the user by illuminating at least one of the first field and the second field.
In a further embodiment, the user interface is further configured to provide a plurality of options to the user, each option corresponding to a particular action in relation to the object, and wherein the control unit is further configured to analyze the second set of signals to determine the selection of an option by the user.
According to a third aspect of the invention, there is provided a method of providing feedback to a user, the method comprising receiving a first set of signals, the first set of signals representing motion of at least a part of the body of a user in response to an instruction indicating an action for the user to perform in relation to an object; receiving a second set of signals, the second set of signals representing the monitored position of the object; analyzing the first and second sets of signals to determine a quality of the motion of the user in performing the action; and determining feedback for the user based on the determined quality of the motion.
According to a fourth aspect of the invention, there is provided a computer program product comprising computer program code that, when executed on a suitable computer or processor, causes the computer or processor to perform the method as described above.
BRIEF DESCRIPTION OF THE DRAWINGS
The invention will now be described, by way of example only, with reference to the following drawings, in which:
Fig. 1 is an illustration of a system according to an embodiment of the invention and a user;
Fig. 2 is a block diagram of a system according to the invention;
Fig. 3 is an illustration of an object tracking unit in accordance with the invention; and
Fig. 4 is a method in accordance with the invention.
DETAILED DESCRIPTION OF THE EMBODIMENTS
Although the invention will be described below with reference to providing feedback to a user to aid the rehabilitation of an injury or disability following a stroke or similar condition, it will be appreciated that the invention can also be used in any application that involves the interaction of a user and an object, such as, for example, controlling a computer game.
Figure 1 shows a rehabilitation system 2 in accordance with a first embodiment of the invention. The rehabilitation system 2 is for use by a user 4, and comprises a motion tracking unit 6, an object tracking unit 8 and a control unit 10. The motion tracking unit 6 monitors or tracks the motion of at least a part of the body of the user 4. In some embodiments, the motion tracking unit 6 can monitor a specific part of the body of the user 4, such as an arm (as shown in Figure 1), but in other embodiments, the motion tracking unit 6 can monitor the motion of the whole body of the user 4.
In the illustrated embodiment, the motion tracking unit 6 comprises a single sensor unit 12 that can measure acceleration and orientation in three dimensions, and that is attached to the wrist, upper arm or torso of the user 4 by a band 14, or other suitable attachment means. In some embodiments, the motion tracking unit 6 can comprise a plurality of sensor units 12 that are located on different parts of the body of the user, in order to provide information on the motion of different parts of the body relative to each other.
It will be appreciated by a person skilled in the art that other types of motion tracking unit 6 can be selected for use in the rehabilitation system 2, depending on the part of the body of the user 4 to be monitored, and/or on the preference of the skilled person. For example, it is possible to use a camera-based motion tracking unit that obtains images of the user 4 and determines the motion of the body of the user 4 from the images.
The motion tracking unit 6 generates a first set of signals representing the motion of the monitored part of the body of the user 4, and the first set of signals are sent to the control unit 10. The first set of signals can comprise raw data from the sensors in the sensor unit 12 (such as data from an accelerometer along with timing information), or the motion tracking unit 6 can generate the first set of signals by processing the raw data from the sensor unit 12 into a format that is suitable for use by the control unit 10.
In a preferred embodiment, the first set of signals are transmitted wirelessly (for example using Bluetooth or Wi-Fi) from the motion tracking unit 6 to the control unit 10. However, it will be appreciated that, in other embodiments, it is possible for the first set of signals to be provided to the control unit 10 through a wired connection.
The motion tracking unit 6 shown in Figure 1 and the alternative camera-based units are known to the person skilled in the art, and will not be described further herein. For completeness, exemplary systems include the AquisGrain system developed by Philips Research, and the systems developed by Xsens (http://www.xsens.com) and Vicon (http://www.vicon.com).
The object tracking unit 8 monitors the position of an object 16 or a plurality of objects 16 that the user 4 can interact with. In the illustrated embodiment, the object tracking unit 8 comprises a board that is divided into discrete fields, into which the object 16 can be placed. The object tracking unit 8 can determine which field the object 16 has been placed into, for example by placing an RFID tag reader in each field and an RFID tag on the base of the object 16. If the object tracking unit 8 is to monitor a plurality of objects, each object can be provided with a respective RFID tag. When an object 16 is placed in a field or removed from a field, the object tracking unit 8 can generate a corresponding signal (for example, indicating the particular object, its position (i.e. field) and a time).
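The behaviour of the board described above can be modelled with a short sketch (the class and method names, and the tuple format of the generated events, are illustrative assumptions; real hardware would poll an RFID reader per field):

```python
class ObjectTrackingBoard:
    """Minimal model of the object tracking board: a grid of discrete
    fields, each of which can report the presence or absence of a
    tagged object, emitting an event on every placement or removal."""

    def __init__(self, rows, cols):
        self.fields = {(r, c): None for r in range(rows) for c in range(cols)}
        self.events = []   # the second set of signals

    def place(self, object_id, field, timestamp):
        """Record that a tagged object was placed in a field."""
        self.fields[field] = object_id
        self.events.append((object_id, field, 'placed', timestamp))

    def remove(self, field, timestamp):
        """Record that the object in a field was removed; return its id."""
        object_id = self.fields[field]
        self.fields[field] = None
        self.events.append((object_id, field, 'removed', timestamp))
        return object_id
```

Each event carries the particular object, its position (i.e. field) and a time, matching the signal content described above.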
The generated signals form a second set of signals representing the monitored position of the object 16 (and optionally the other objects) that the user 4 has been instructed to interact with, and the second set of signals are sent to the control unit 10. The second set of signals can comprise raw data from the object tracking unit 8 (such as data from an object tracking board), or the object tracking unit 8 can generate the second set of signals by processing the raw data from an object tracking board into a format that is suitable for use by the control unit 10.
In the illustrated embodiment, the second set of signals are transmitted over a wired connection from the object tracking unit 8 to the control unit 10. However, it will be appreciated that, in other embodiments, it is possible for the second set of signals to be provided to the control unit 10 wirelessly.
It will be appreciated that alternative types of object tracking units 8 can be used in the rehabilitation system 2 according to the invention. For example, it is possible to use an object tracking system as described in US Patent No. 6,168,158 to J. Bulsink, in which objects to be tracked are equipped with object-specific coils. Further alternative types of object tracking units 8 can include cameras to visually record and extract the position of the object 16. The object tracking unit 8 shown in Figure 1 and alternatives are known to the skilled person, and will not be described further herein.
The first and second sets of signals from the motion tracking unit 6 and object tracking unit 8 respectively are provided to the control unit 10. The control unit 10 analyzes the first and second sets of signals to determine whether the user 4 has complied with an instruction to interact with an object 16 in a particular way (for example, move the object from position A to position B using your right hand), and whether the movement of the relevant part of the body of the user 4 was appropriate for the instruction (for example, whether the movement of the user's shoulder in moving the object 16 was acceptable). In some embodiments, the control unit 10 can also be responsible for generating the instruction for the user 4 to perform a specific action with the object 16.
Figure 2 shows a block diagram of the rehabilitation system 2. It can be seen that the control unit 10 comprises a processor 18 which receives the first and second sets of signals from the motion tracking unit 6 and object tracking unit 8. In this embodiment, the control unit 10 includes a display 20 for providing written or graphical instructions and/or feedback to the user 4, and a speaker 22 for providing audible instructions and/or feedback to the user 4.
The system 2 also includes a user instruction generator 24 that generates the instructions for the user 4. As described, the instructions indicate an action for the user 4 to perform in relation to an object 16. A set of possible instructions can be stored, and the user instruction generator 24 can select an instruction for the user 4 to follow when required. Alternatively, the user instruction generator 24 can arbitrarily determine the instruction on request, within a set of parameters determined by the objects 16 and object tracking unit 8. For example, if there are three objects and the object tracking unit 8 can monitor objects in sixteen discrete fields, the instruction can comprise moving a specific one of the three objects from its current field to another of the sixteen available fields.
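The arbitrary instruction generation just described can be sketched as follows (a minimal illustration; the function name, the dictionary-based instruction format and the use of Python's random module are assumptions, not part of the patent):

```python
import random

def generate_instruction(board_fields, object_positions, rng=random):
    """Pick one tracked object and a free destination field at random,
    within the parameters set by the objects and the board.

    board_fields: list of all discrete fields on the board.
    object_positions: mapping from object id to its current field.
    """
    obj = rng.choice(sorted(object_positions))      # which object to move
    source = object_positions[obj]
    # destination must be a field not currently occupied by any object
    free = [f for f in board_fields if f not in object_positions.values()]
    dest = rng.choice(free)
    return {'object': obj, 'from': source, 'to': dest}
```

With three objects and sixteen fields, this yields an instruction of the form "move object X from its current field to field Y", as in the example above.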
The user instruction generator 24 then provides the generated instruction to an instruction indicator 26 that indicates the instruction to the user 4. In an embodiment in which the object tracking unit 8 monitors a plurality of discrete fields, the instruction indicator 26 can comprise a set of visual indicators, such as LEDs or other lights, in each field, indicating which object 16 should be moved, and where it should be moved to. The user instruction generator 24 also provides an indication of the instruction to the processor 18 so that the processor 18 can determine whether the user 4 has complied with the instruction.
It will be appreciated that, in some embodiments, the display 20 and/or speaker 22 can also be used to provide the instruction to the user 4, in addition to, or instead of, the instruction indicator 26.
It will also be appreciated that, in some embodiments, the object tracking unit 8, user instruction generator 24 and instruction indicator 26 can be integrated into a single unit (as indicated by dashed box 28 in Figure 2), and an exemplary embodiment is shown in Figure 3. Here, the object tracking unit 8 and instruction indicator 26 comprise a grid 30, with each square in the grid having an associated RFID tag reader for reading an RFID tag in or on the objects 16a, 16b and an LED or other light source for indicating an object 16a, 16b to be moved and a destination. Thus, in response to a particular instruction, the instruction indicator 26 can illuminate a box in which one of the objects 16 is present (for example box 32 in which object 16b is located), and a destination box, indicated by reference numeral 34.
In alternative embodiments, the instruction indicator 26 can comprise a projector which projects discrete fields on to the board of the object tracking unit 8 corresponding to those in the object tracking unit 8, and which can vary the color or content of particular ones of the projected fields. As a further alternative, the object tracking unit 8 and instruction indicator 26 can use the EnterTaible developed by the applicant.
Figure 4 illustrates a method in accordance with the invention. In step 101, the rehabilitation system 2 is activated. This step includes setting up and switching on the relevant apparatus, such as the motion tracking unit 6, object tracking unit 8 and control unit 10, and the user 4 preparing to use the system 2, including putting on any sensors required by the motion tracking unit 6.
In step 103, an instruction is generated that instructs the user 4 to perform an action with an object 16. As described above, this instruction can comprise, for example, instructing the user 4 to pick up an object 16 from a first position and to put the object 16 down at a second position.
In step 105, the user 4 is instructed to perform the action with the object 16. As described above, this instruction can be provided to the user by instruction indicator 26 and/or by the display 20 and speaker 22.
In step 107, the motion of the arm (or arms) of the user 4 is monitored by the motion tracking unit 6, along with the position of the object 16 by the object tracking unit 8.
The signals from the motion tracking unit 6 and object tracking unit 8 are provided to the processor 18 of the control unit 10 where they are analyzed to determine the compliance of the user 4 with the instruction (step 109).
In step 111, feedback is generated for the user 4 from the analysis of the signals, and this is provided to the user 4 via the display 20 and/or speakers 22. The feedback can be generated after the user 4 has completed the action required by the instruction, or it can be generated during the movement of the user 4, by analyzing the signals from the motion tracking unit 6 and object tracking unit 8 as they are produced. In this way, the rehabilitation system 2 can provide feedback to the user 4 in real time (i.e. as the action is being performed).
In step 113, the instruction can be repeated (if, for example, the user 4 did not execute the required movement properly, or if it is deemed that repetition will aid in the rehabilitation of the user 4), or a new instruction can be generated. The process then returns to step 105.
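The loop of Figure 4 (steps 103 to 113) can be sketched in outline as follows (the callables are placeholders for the components described above; their names, the dictionary-based quality result and the 'repeat' flag are illustrative assumptions):

```python
def run_session(generate, indicate, monitor, analyse, give_feedback,
                repetitions=10):
    """Sketch of the exercise loop: generate an instruction, indicate
    it to the user, monitor motion and object position, analyze
    compliance, give feedback, then repeat or move on."""
    instruction = generate()                             # step 103
    for _ in range(repetitions):
        indicate(instruction)                            # step 105
        motion, positions = monitor()                    # step 107
        quality = analyse(instruction, motion, positions)  # step 109
        give_feedback(quality)                           # step 111
        if quality.get('repeat'):                        # step 113
            continue            # repeat the same instruction
        instruction = generate()  # otherwise generate a new one
```

Each pass through the loop corresponds to one return to step 105 in Figure 4.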
Thus, as described above, the object tracking unit 8 allows the control unit 10 to provide feedback on the end goal (the 'knowledge of results'), for example, the object 16 was moved to the correct location. By combining the signals from the object tracking unit 8 with signals from the motion tracking unit 6, the processor 18 can further generate qualitative feedback ('knowledge of performance'), for example, based on the amount of compensation by the shoulder or the jerkiness of the movement. As described above, the feedback can be provided online (i.e. during the execution of the instruction) or offline (after the execution). It will be appreciated that, in some implementations of the invention, it is necessary for the rehabilitation system 2 to implement some algorithm to recognize a specific movement or movements in the motion data, in order to determine the point at which the user 4 starts to comply with the instruction and completes the instruction. For example, algorithms may be required to identify when the user 4 reaches towards an object 16. Such algorithms are known in the art, and will not be described further herein.
However, it will be appreciated that, particularly with users that have a disability or impairment, it is difficult to achieve the required reliability and accuracy with these algorithms due to uncontrolled or compensatory movements of the user's body. These algorithms produce a large variation in determining whether the instruction has been correctly executed, and this is inappropriate as the exact movement is not as important as the result.
Thus, in a preferred embodiment of the invention, the processor 18 uses the signals from the motion tracking unit 6 and object tracking unit 8 to identify the start and end points of the motion of the user 4. In particular, the signals from the object tracking unit 8 allow the signals collected from the motion tracking unit 6 to be segmented into a subset of signals that relate to motion when the user 4 is complying with the instruction (i.e. picking up the object 16), and signals that do not relate to motion of the user 4 in complying with the instruction.
From the signals from the object tracking unit 8, the processor 18 identifies the time at which the object 16 is first moved following the issuance of the instruction, and this time point is also used as the start point of the relevant motion of the user 4 (the "relevant" motion being that involved in complying with the instruction). Thus, when analyzing the signals from the motion tracking unit 6, the processor 18 considers the motion occurring after the identified time point. Alternatively, the start point of the motion can be set to a predetermined time period before the point at which the object 16 is first moved, in order to capture the motion of the user 4 just before the object 16 is picked up (i.e. corresponding to the motion of the user 4 in reaching for the object 16). In a further alternative embodiment, the start point of the motion can be set by an algorithm that uses the point at which object 16 is first moved as an estimate (which is near to the actual point that the motion started). For example, such an algorithm can check around the estimate to determine when the user 4 actually started to move, and thereby determine the real start point. Similarly, if the user 4 is already in motion at the point that they start to comply with the instruction, the algorithm can use the point at which the object 16 is first moved as an estimate, and can check around the estimate to determine when the user 4 actually started to perform movements in order to comply with the instruction.
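One plausible realization of "checking around the estimate" is to search a window around the object-movement time for the sample at which the measured speed first exceeds a small threshold. The following sketch is an assumption for illustration; the patent does not prescribe a particular rule:

```python
def refine_start(times, speeds, t_estimate, threshold=0.05, window=1.0):
    """Refine the start point of the relevant motion, using the time
    at which the object was first moved (t_estimate) as a seed.

    Searches the samples within +/- window seconds of the estimate and
    returns the time of the first sample whose speed exceeds the
    threshold; falls back to the object-event time if none is found.
    """
    candidates = [i for i, t in enumerate(times)
                  if t_estimate - window <= t <= t_estimate + window]
    for i in candidates:
        if speeds[i] > threshold:
            return times[i]
    return t_estimate
```

The same scheme, run backwards from the put-down event, could serve to refine the end point.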
The signals from the object tracking unit 8 can be used to determine the end point of the motion of the user 4 in a corresponding fashion. In particular, the processor 18 can identify the end point of the motion as the point at which the object 16 is placed in another position (which may be the destination specified in the instruction or another destination), as a predetermined time period after the point at which the object 16 is placed in another position, or as a point determined from using the point at which the object 16 is placed in another position as an estimate.
For example, where the instruction relates to drinking from a cup, the start time of the action can be identified as the time when the cup is removed from the board in the object tracking unit 8 and the board issues a corresponding signal. The end time is indicated by the time that the cup is placed back on the board of the object tracking unit 8 and the board issues the corresponding signal.
Thus, by using the signals from both the motion tracking unit 6 and object tracking unit 8, the processor 18 is able to identify the start and end points of the motion without needing to use complex motion identification or recognition algorithms, whose performance (i.e. success in identifying particular motions correctly) depends on the types of actions and how well the actual execution matches a pre-recorded template. It will therefore be appreciated that this embodiment of the invention is particularly suited to use in rehabilitating users 4 that have problems in controlling their movements.
The processor 18 can analyze the signals from the motion tracking unit 6 and object tracking unit 8 to determine if the object 16 was placed in the location specified by the instruction. Next, the processor 18 can determine the quality of the motion of the user 4 in executing the instruction. As described above, the relevant motion (i.e. that in performing the instruction) can be identified using the start and end points of the motion of the object 16.
The quality of the motion of the user 4 can be given by, for example, its smoothness, the presence of any "compensations" (for example excessive use of the shoulder in the motion) and/or a measure of the spatial and/or temporal match between the actual movement of the user 4 and a prerecorded optimal movement. The processor 18 can also determine the duration of the motion.
The processor 18 can determine the smoothness of the motion by examining the "jerk" (given by the third derivative of position with respect to time), for example as described in "Movement smoothness changes during stroke recovery" by Rohrer et al., The Journal of Neuroscience, 2002, 22, 8297-8304 and "The coordination of arm movements: An experimentally confirmed mathematical model", The Journal of Neuroscience, 1985, 5, 1688-1703.
The processor 18 can determine the presence of any compensations, for example, by determining the deviation of the position of the trunk during the movement from the position of the trunk when sitting upright. The information required to determine this deviation can be provided from the motion tracking unit 6 (for example by placing an inertial or motion sensor on the trunk of the user 4).
The processor 18 can use pattern recognition techniques to check whether each extracted repetition corresponds to the desired movement as given by a prerecorded optimal movement. It will be appreciated that these pattern recognition techniques are relatively simple to implement once the start and end points of the motion are determined from the motion of the object 16, in comparison to pattern recognition techniques that must also identify the relevant motion from all of the motion tracking unit data.
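A jerk-based smoothness measure can be approximated by finite differencing of a uniformly sampled position trace, as in the following sketch (published metrics, such as those of Rohrer et al., normalize the jerk differently; this unnormalized mean-squared jerk is an illustrative simplification):

```python
def mean_squared_jerk(positions, dt):
    """Approximate the jerk (third time derivative of position) by
    repeated finite differencing of a uniformly sampled 1-D position
    trace, and return its mean square.  Lower values indicate a
    smoother movement."""
    def diff(xs):
        return [(b - a) / dt for a, b in zip(xs, xs[1:])]
    jerk = diff(diff(diff(positions)))
    return sum(j * j for j in jerk) / len(jerk)
```

A perfectly straight, constant-velocity trace yields zero jerk, while abrupt accelerations and decelerations, characteristic of impaired movement, raise the measure.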
As described above, the results of the analysis (step 109) are used to provide feedback to the user 4. The feedback can be as simple as just stating the results, i.e. the number of completed repetitions, the success rate, or the feedback can be more informative, and can provide the user 4 with advice on how to improve their technique (for example: 'try not to move your shoulder').
Using the example above, the feedback can be generated based on the deviation of the trunk from the optimal position for the movement (e.g. sitting upright). If the deviation is within, say, 10 degrees, the feedback can indicate to the user 4 that the movement was acceptable. However, if the deviation is more than, say, 10 degrees, the feedback can indicate to the user 4 that they should move their shoulder less.
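The threshold-based feedback just described amounts to a simple mapping (the 10-degree limit is the illustrative figure given above; the message texts are assumptions):

```python
def trunk_feedback(deviation_degrees, limit=10.0):
    """Map the measured trunk deviation from the upright position to a
    feedback message, using the illustrative 10-degree limit."""
    if deviation_degrees <= limit:
        return "Movement acceptable."
    return "Try to move your shoulder less."
```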
Thus, the invention provides a system for providing feedback on the execution of an action by the user. As described, the invention finds particular application in rehabilitation and exercise. As the user is interacting with real objects, the user receives natural feedback on their actions.
The invention can also be used as a communication interface between the user and a control unit. Based on predefined fields and colors, say, the user can interact with the control unit by placing an object on a specific field, thereby indicating their input. Such a system can be used in controlling or interacting with a computer game.
In particular, the object tracking unit 8 and instruction indicator 26 can be used to implement this communication interface in a rehabilitation system 2. For example, where the object tracking unit 8 and instruction indicator 26 comprise a board that is divided into discrete fields, the instruction indicator 26 can light up various fields to indicate different user-selectable options, such as two fields, say, one in red associated with a "back to the previous step" function and one in green associated with a "move to the next step" function, and the user 4 can indicate their choice by placing an object 16 in the desired field.
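Reading the user's choice from the board's event stream can be sketched as follows (assuming the board reports (object, field, 'placed'/'removed', time) tuples; the function name and the mapping-based interface are assumptions for illustration):

```python
def selected_option(events, option_fields):
    """Return the option chosen by the most recent placement of an
    object in one of the lit option fields, or None if no option
    field has been used.

    option_fields: mapping from field coordinates to option labels,
    e.g. {(0, 0): 'back', (0, 3): 'next'}.
    """
    for object_id, field, kind, timestamp in reversed(events):
        if kind == 'placed' and field in option_fields:
            return option_fields[field]
    return None
```

The control unit could poll this after each board event to detect a "back" or "next" selection.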
While the invention has been illustrated and described in detail in the drawings and foregoing description, such illustration and description are to be considered illustrative or exemplary and not restrictive; the invention is not limited to the disclosed embodiments.
Variations to the disclosed embodiments can be understood and effected by those skilled in the art in practicing the claimed invention, from a study of the drawings, the disclosure, and the appended claims. In the claims, the word "comprising" does not exclude other elements or steps, and the indefinite article "a" or "an" does not exclude a plurality. A single processor or other unit may fulfill the functions of several items recited in the claims. The mere fact that certain measures are recited in mutually different dependent claims does not indicate that a combination of these measures cannot be used to advantage. Any reference signs in the claims should not be construed as limiting the scope. A computer program may be stored/distributed on a suitable medium, such as an optical storage medium or a solid-state medium supplied together with or as part of other hardware, but may also be distributed in other forms, such as via the Internet or other wired or wireless telecommunication systems.

Claims

CLAIMS:
1. A control unit, configured to: receive a first set of signals, the first set of signals representing motion of at least a part of the body of a user in response to an instruction indicating an action for the user to perform in relation to an object; receive a second set of signals, the second set of signals representing the monitored position of the object; analyze the first and second sets of signals to determine a quality of the motion of the user in performing the action; and determine feedback for the user based on the determined quality of the motion.
2. A control unit as claimed in claim 1, wherein the control unit is configured to analyze the second set of signals to identify a subset of the first set of signals relating to the motion of the user in performing the action.
3. A control unit as claimed in claim 2, wherein the control unit is configured to determine a start point of the subset as the point at which the second set of signals indicate that the object has been moved.
4. A control unit as claimed in claim 2, wherein the control unit is configured to determine a start point of the subset as a predetermined period before or after the point at which the second set of signals indicate that the object has been moved.
5. A control unit as claimed in claim 2, wherein the control unit is configured to determine a start point of the subset by: identifying a point in the second set of signals that indicates that the object has been moved; and identifying a point in the first set of signals that corresponds to the user starting to perform the action using said point in the second set of signals.
6. A control unit as claimed in any of claims 2 to 5, wherein the control unit is configured to analyze the second set of signals to determine an end point of the subset as the point at which the second set of signals indicate that the object has been placed in a position indicated by the instruction or another position.
7. A control unit as claimed in any of claims 2 to 5, wherein the control unit is configured to determine an end point of the subset as a predetermined period before or after the point at which the second set of signals indicate that the object has been placed in a position indicated by the instruction or another position.
8. A control unit as claimed in any of claims 2 to 5, wherein the control unit is configured to determine an end point of the subset by: identifying a point in the second set of signals that indicates that the object has been placed in a position indicated by the instruction or another position; and identifying a point in the first set of signals that corresponds to the user completing the action using said point in the second set of signals.
9. A control unit as claimed in any preceding claim, wherein the control unit is configured to determine a quality of the motion of the user by determining the smoothness of the motion of the user, the presence of any compensatory motion in the motion of the user, and/or a measure of the deviation between the motion of the user and a predetermined template or pattern for the motion.
10. A system for providing feedback to a user, the system comprising: a user interface for providing an instruction to a user, the instruction indicating an action for the user to perform in relation to an object; a motion tracking unit for monitoring the motion of at least a part of the body of the user in response to the instruction and for generating a first set of signals, the first set of signals representing the monitored motion; an object tracking unit for monitoring the position of the object and for generating a second set of signals, the second set of signals representing the monitored position of the object; and a control unit as claimed in any of claims 1 to 9.
11. A system as claimed in claim 10, wherein the motion tracking unit comprises at least one sensor unit for attachment to a part of the body of the user.
12. A system as claimed in claim 10 or 11, wherein the object tracking unit comprises a board divided into discrete fields, and wherein the object tracking unit can determine the presence and/or absence of the object in each of the discrete fields.
13. A system as claimed in claim 12, wherein the object tracking unit comprises an RFID tag reader in each field, and wherein the object comprises a corresponding RFID tag.
14. A system as claimed in claim 12 or 13, wherein the action for the user to perform in relation to the object comprises moving the object from a first field to a second field.
15. A system as claimed in claim 14, wherein the user interface provides the instruction to the user by illuminating at least one of the first field and the second field.
16. A system as claimed in any of claims 10 to 15, wherein the user interface is further configured to provide a plurality of options to the user, each option corresponding to a particular action in relation to the object, and wherein the control unit is further configured to analyze the second set of signals to determine the selection of an option by the user.
17. A method of providing feedback to a user, the method comprising: receiving a first set of signals, the first set of signals representing motion of at least a part of the body of a user in response to an instruction indicating an action for the user to perform in relation to an object; receiving a second set of signals, the second set of signals representing the monitored position of the object; analyzing the first and second sets of signals to determine a quality of the motion of the user in performing the action; and determining feedback for the user based on the determined quality of the motion.
18. A computer program product comprising computer program code that, when executed on a suitable computer or processor, causes the computer or processor to perform the method as claimed in claim 17.
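The signal-segmentation approach of claims 2 to 8 — identifying the subset of the motion signals covering the action from the object-position signals — can be sketched as below. This is an illustrative sketch under stated assumptions, not the claimed implementation: the motion signals are assumed to be a list of samples, the object events a pair of sample indices (object moved, object placed), and the margins model the optional predetermined period before/after those points (claims 4 and 7).

```python
def segment_motion(first_signals, object_events, pre_margin=0, post_margin=0):
    """Return the subset of the first set of signals (motion samples)
    corresponding to the user performing the action.

    `object_events` is assumed to be a pair (move_idx, place_idx): the
    sample indices at which the second set of signals indicates that the
    object was moved and then placed. `pre_margin`/`post_margin` widen
    the window by a predetermined period before the start point and
    after the end point, clamped to the signal bounds.
    """
    move_idx, place_idx = object_events
    start = max(0, move_idx - pre_margin)
    end = min(len(first_signals), place_idx + post_margin + 1)  # end inclusive
    return first_signals[start:end]
```

With zero margins the subset runs exactly from the object-moved sample to the object-placed sample; nonzero margins capture motion (e.g. reaching toward the object) just before and after those events.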
PCT/IB2009/054628 2008-10-29 2009-10-21 Control unit for a system and method for providing feedback to a user WO2010049848A1 (en)

Applications Claiming Priority (2)

Application Number Priority Date Filing Date Title
EP08167820 2008-10-29
EP08167820.3 2008-10-29

Publications (1)

Publication Number Publication Date
WO2010049848A1 true WO2010049848A1 (en) 2010-05-06

Family

ID=41349335

Family Applications (1)

Application Number Title Priority Date Filing Date
PCT/IB2009/054628 WO2010049848A1 (en) 2008-10-29 2009-10-21 Control unit for a system and method for providing feedback to a user

Country Status (1)

Country Link
WO (1) WO2010049848A1 (en)

Cited By (1)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US8831794B2 (en) * 2011-05-04 2014-09-09 Qualcomm Incorporated Gesture recognition via an ad-hoc proximity sensor mesh for remotely controlling objects

Citations (2)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
WO2005021102A2 (en) * 2003-08-21 2005-03-10 Ultimate Balance, Inc. Adjustable training system for athletics and physical rehabilitation including student unit and remote unit communicable therewith
WO2006014810A2 (en) * 2004-07-29 2006-02-09 Kevin Ferguson A human movement measurement system

Non-Patent Citations (2)

* Cited by examiner, † Cited by third party
Title
MOUNTAIN ET AL.: "The SMART Project: A user led approach to developing and testing technological applications for domiciliary stroke rehabilitation", DESIGNING ACCESSIBLE TECHNOLOGY, 21, 2006, XP002558299 *
PRIDMORE ET AL.: "Mixed reality environments in stroke rehabilitation: interfaces across the real/virtual divide", PROCEEDINGS OF THE 5TH INTERNATIONAL CONFERENCE ON DISABILITY, VIRTUAL REALITY AND ASSOCIATED TECHNIQUES, 2004, pages 11 - 18, XP002558298 *

Similar Documents

Publication Publication Date Title
US11037369B2 (en) Virtual or augmented reality rehabilitation
CN108463271B (en) System and method for motor skill analysis and skill enhancement and prompting
US20140371633A1 (en) Method and system for evaluating a patient during a rehabilitation exercise
US20170136296A1 (en) System and method for physical rehabilitation and motion training
JP5356690B2 (en) Method, system, and program for tracking a range of physical movement of a user
Melero et al. Upbeat: augmented reality-guided dancing for prosthetic rehabilitation of upper limb amputees
US20150004581A1 (en) Interactive physical therapy
JPH10151223A (en) Wellness system
US20140081432A1 (en) Method and Apparatus for Rehabilitation Using Adapted Video Games
US11786147B2 (en) Distributed sensor-actuator system for synchronized movement
EP3042360A1 (en) System and method for identifying and interpreting repetitive motions
Wang et al. Feature evaluation of upper limb exercise rehabilitation interactive system based on kinect
US20220019284A1 (en) Feedback from neuromuscular activation within various types of virtual and/or augmented reality environments
CN104484574A (en) Real-time human body gesture supervised training correction system based on quaternion
WO2020084351A1 (en) Systems and methods for assessment and measurement of reaction time in virtual/augmented reality
WO2016083826A1 (en) Facial exercise system
Vogiatzaki et al. Serious games for stroke rehabilitation employing immersive user interfaces in 3D virtual environment
CN111863198A (en) Rehabilitation robot interaction system and method based on virtual reality
Ruttkay et al. Towards a reactive virtual trainer
CN110237518B (en) Interactive training or treatment method and system
CN113257387B (en) Wearable device for rehabilitation training, rehabilitation training method and system
US11942206B2 (en) Systems and methods for evaluating environmental and entertaining elements of digital therapeutic content
EP2482935B1 (en) System for supporting a user to do exercises
US20230285806A1 (en) Systems and methods for intelligent fitness solutions
CN117148977A (en) Sports rehabilitation training method based on virtual reality

Legal Events

Date Code Title Description
121 Ep: the epo has been informed by wipo that ep was designated in this application

Ref document number: 09744466

Country of ref document: EP

Kind code of ref document: A1

NENP Non-entry into the national phase

Ref country code: DE

122 Ep: pct application non-entry in european phase

Ref document number: 09744466

Country of ref document: EP

Kind code of ref document: A1