WO2021014149A1 - Methods and systems for musculoskeletal rehabilitation - Google Patents


Info

Publication number
WO2021014149A1
Authority
WO
WIPO (PCT)
Prior art keywords
user
image data
trained model
rehabilitation
physio
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Ceased
Application number
PCT/GB2020/051746
Other languages
French (fr)
Inventor
Alexander Young
Nils HELLBERG
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
Virtihealth Ltd
Original Assignee
Virtihealth Ltd
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Application filed by Virtihealth Ltd filed Critical Virtihealth Ltd
Publication of WO2021014149A1


Classifications

    • G PHYSICS
    • G16 INFORMATION AND COMMUNICATION TECHNOLOGY [ICT] SPECIALLY ADAPTED FOR SPECIFIC APPLICATION FIELDS
    • G16H HEALTHCARE INFORMATICS, i.e. INFORMATION AND COMMUNICATION TECHNOLOGY [ICT] SPECIALLY ADAPTED FOR THE HANDLING OR PROCESSING OF MEDICAL OR HEALTHCARE DATA
    • G16H50/20 ICT specially adapted for medical diagnosis, medical simulation or medical data mining, for computer-aided diagnosis, e.g. based on medical expert systems
    • G16H20/30 ICT specially adapted for therapies or health-improving plans, relating to physical therapies or activities, e.g. physiotherapy, acupressure or exercising
    • G16H40/67 ICT specially adapted for the management or operation of medical equipment or devices, for the operation of medical equipment or devices for remote operation

Definitions

  • Musculoskeletal rehabilitation has long been challenging due to the high cost of personal training and of specialized hardware.
  • Studies and expert opinion suggest that, for most patients, intensive post-operative rehabilitation can be followed up with education and a well-structured home exercise program.
  • Unfortunately, the number of physiotherapists is insufficient to meet current demand.
  • the present disclosure provides methods and systems for providing physiotherapy, rehabilitation and/or training to a subject for treating disease, such as musculoskeletal disease, or providing exercise and education for trauma and orthopedic patients.
  • the provided systems and methods may also be used for other preventive training purposes, for example, ensuring that a patient with developing arthritis does not start favoring a diseased joint.
  • Systems and methods of the present disclosure may be used to achieve a specific rehabilitation goal, such as rehabilitation of a particular limb, or for non-medical training, such as wellness exercise.
  • Systems and methods of the present disclosure may utilize immersive technologies, such as virtual reality (VR) and augmented reality (AR) enabled systems, coupled with artificial intelligence techniques (e.g., natural language processing, computer vision).
  • systems and methods of the present disclosure may provide real-time motion quantification using a user device camera alone without additional hardware requirements.
  • the motion quantification can be provided based on camera image data alone without requiring additional sensors. This may beneficially allow patients/users to be engaged in a rehabilitation process in a variety of places with easy setup and reduced cost.
  • remote quantification of movement (e.g., joint movement) may be provided using a camera (e.g., a camera on a user device, webcam or phone camera).
  • Real-time feedback may be provided to the user through the user device and/or AR/VR system and a remote therapist in a range of communication modalities such as audio, haptic, or visual.
  • quantitative and qualitative analysis of the rehabilitation progress and real-time motion may be visually presented to users in user applications and to surgeons/operators via a web-based dashboard.
  • In an aspect, a method is provided for providing rehabilitation training to a user.
  • the method comprises: obtaining image data of the user performing a movement of a body part, wherein the image data is collected using an imaging device located on a user device; measuring a range of the movement of the body part based on the image data using a first trained model; and generating a feedback in real-time to correct the movement using a second trained model.
  • the image data is two-dimensional image data.
  • the range of the movement comprises a joint angle and displacement of the body part in a three-dimensional space.
  • the first trained model is obtained using an unsupervised learning algorithm.
  • the method further comprises providing the feedback using a virtual reality or augmented reality system.
  • the method further comprises generating one or more metrics including a contact force or counts of repetitions based on the image data.
  • the one or more metrics are derived from an estimated pose generated by the first trained model.
  • the second trained model processes the one or more metrics and the range of motion and outputs the feedback.
  • the first trained model or the second trained model is further improved using the image data.
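The claimed two-model pipeline (image capture, range measurement by a first trained model, feedback generation by a second trained model) can be sketched as follows. The interfaces, class names, and stub models here are illustrative assumptions, not architectures specified in the disclosure:

```python
from dataclasses import dataclass
from typing import Callable, List, Tuple

# Hypothetical interfaces for the two trained models named in the claims.
# A real system would load, e.g., a pose-estimation network and a feedback model.
PoseModel = Callable[[bytes], List[Tuple[float, float, float]]]  # image -> 3D keypoints
FeedbackModel = Callable[[dict], str]                            # metrics -> coaching text

@dataclass
class RehabPipeline:
    pose_model: PoseModel          # "first trained model": measures the movement
    feedback_model: FeedbackModel  # "second trained model": generates feedback

    def process_frame(self, image: bytes) -> str:
        keypoints = self.pose_model(image)
        # Toy "range of movement": vertical span of the detected keypoints.
        ys = [p[1] for p in keypoints]
        metrics = {"range_y": max(ys) - min(ys), "n_points": len(keypoints)}
        return self.feedback_model(metrics)

# Stub models standing in for the trained networks.
def stub_pose(_image: bytes):
    return [(0.0, 0.1, 0.0), (0.0, 0.9, 0.0)]

def stub_feedback(metrics: dict) -> str:
    return "good" if metrics["range_y"] > 0.5 else "extend further"

pipeline = RehabPipeline(stub_pose, stub_feedback)
print(pipeline.process_frame(b"frame"))  # -> good
```

Keeping the two models behind separate interfaces mirrors the claim structure: either model can be retrained or replaced (e.g., improved using collected image data) without touching the other.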
  • In another aspect, a system is provided for providing rehabilitation training to a user.
  • the system comprises a server in communication with a computing device associated with a user, wherein the server comprises a memory for storing interactive media and a set of software instructions, and one or more processors configured to execute the set of software instructions to: obtain image data of the user performing a movement of a body part, wherein the image data is collected using an imaging device located on the user device; measure a range of the movement of the body part based on the image data using a first trained model; and generate a feedback in real-time to correct the movement using a second trained model.
  • the image data is two-dimensional image data.
  • the range of the movement comprises a joint angle and displacement of the body part in a three-dimensional space.
  • the first trained model is developed using an unsupervised learning algorithm.
  • the system further comprises a virtual reality or augmented reality system for providing the feedback.
  • the one or more processors are configured to further generate one or more metrics including a contact force or counts of repetitions based on the image data.
  • the one or more metrics are derived from an estimated pose generated by the first trained model.
  • the second trained model processes the one or more metrics and the range of motion and outputs the feedback.
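One way the repetition-count metric could be derived from an estimated pose is by applying hysteresis thresholds to a joint-angle time series. The approach and the threshold values below are illustrative assumptions, not a method taken from the disclosure:

```python
def count_reps(angles, low=40.0, high=120.0):
    """Count repetitions in a joint-angle time series (degrees) using
    hysteresis: one rep is a full excursion above `high` and back below
    `low`. Thresholds are illustrative, not values from the disclosure."""
    reps, extended = 0, False
    for a in angles:
        if not extended and a >= high:
            extended = True   # limb reached the extended position
        elif extended and a <= low:
            extended = False  # returned to the flexed position
            reps += 1
    return reps

# Two simulated knee-extension repetitions:
trace = [30, 60, 125, 130, 90, 35, 50, 122, 80, 30]
print(count_reps(trace))  # -> 2
```

The hysteresis band (rather than a single threshold) prevents pose-estimation jitter near the threshold from being counted as extra repetitions.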
  • the first trained model or the second trained model is further improved using the image data.
  • In another aspect, a tangible computer readable medium is provided for storing instructions that, when executed by a server, cause the server to perform a computer-implemented method for providing rehabilitation training to a user.
  • the method comprises: obtaining image data of the user performing a movement of a body part, wherein the image data is collected using an imaging device located on a user device; measuring a range of the movement of the body part based on the image data using a first trained model; and generating a feedback in real-time to correct the movement using a second trained model.
  • the method further comprises providing the feedback using a virtual reality or augmented reality system.
  • FIG. 1 illustrates an exemplary environment in which the physio rehabilitation platform described herein may be implemented.
  • FIG. 2 illustrates an example of implementing the physio rehabilitation platform.
  • FIG. 3 shows an exemplary pose estimation algorithm.
  • FIG. 4 shows exemplary components of a physio rehabilitation system, in accordance with embodiments of the invention.
  • FIG. 5 shows an exemplary user interface provided by the physio rehabilitation system.
  • FIG. 6 shows an exemplary user interface provided by the physio rehabilitation system.
  • FIG. 7 shows a flow chart of a method implemented by the physio rehabilitation platform.
  • the physio rehabilitation platform may utilize computer vision and artificial intelligence techniques to enable accurate quantification of a range of motion.
  • the physio rehabilitation platform can enable real-time motion measurement and feedback generation applicable to certain healthcare areas (e.g., musculoskeletal disease, non-medical training, etc.).
  • the physio rehabilitation platform can be used to help users effectively engage in a rehabilitation program remotely and with reduced hardware requirements, using image data captured by a user device camera (e.g., phone camera, webcam, etc.).
  • VR and/or AR devices may be used in the physio rehabilitation platform for providing real-time feedback and training guidance.
  • a range of motion of a body part may comprise a movement of the body part in three-dimensional (3D) space.
  • the range of motion may be generated in terms of anthropomorphic constraints and joint angle limits of a human body articulation.
  • the range of motion may include displacement along an X-axis, a Y-axis, and a Z-axis and joint angles (e.g., elevation angle, azimuth angle).
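Given two 3D joint positions from an estimated pose, elevation and azimuth angles of the kind mentioned above could be computed as follows. The axis conventions (Y up, azimuth measured in the X-Z ground plane) and the shoulder-to-elbow example are assumptions for illustration:

```python
import math

def segment_angles(proximal, distal):
    """Elevation and azimuth (degrees) of a limb segment defined by two
    3D joint positions, e.g. shoulder -> elbow. Axis conventions here
    (Y up, azimuth in the X-Z plane) are assumed, not from the disclosure."""
    dx, dy, dz = (d - p for d, p in zip(distal, proximal))
    horiz = math.hypot(dx, dz)
    elevation = math.degrees(math.atan2(dy, horiz))  # angle above horizontal
    azimuth = math.degrees(math.atan2(dz, dx))       # heading in ground plane
    return elevation, azimuth

# A segment rising 45 degrees in the X-Y plane:
elev, azim = segment_angles((0.0, 0.0, 0.0), (1.0, 1.0, 0.0))
print(round(elev), round(azim))  # -> 45 0
```

Displacement along the X, Y, and Z axes falls out of the same subtraction (`dx`, `dy`, `dz`), so one pose pair yields both parts of the claimed range-of-motion measurement.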
  • FIG. 1 illustrates an exemplary environment in which the physio rehabilitation platform described herein may be implemented.
  • a physio rehabilitation platform 100 may include one or more user devices 101-1, 101-2, a server 120, a physio rehabilitation system 121, and databases 109, 123.
  • the physio rehabilitation platform 100 may optionally comprise one or more VR/AR systems 105-1, 105-2.
  • Each of the components 101-1, 101-2, 109, 123, 120, 105-1 and 105-2 may be operatively connected to one another via network 110 or any type of communication links that allows transmission of data from one component to another.
  • the physio rehabilitation system 121 may be configured to analyze input data (e.g., image data) from the user device in order to identify and quantify a range of motion of a body part or limb and to provide feedback information (e.g., guidance, quantification result, recommendation) to assist a user in correcting a pose or exercise.
  • the physio rehabilitation system 121 may also receive input data from the AR/VR system to supplement the data collected by the user device.
  • the physio rehabilitation system 121 may be implemented anywhere within the physio rehabilitation platform, and/or outside of the physio rehabilitation platform. In some embodiments, the physio rehabilitation system may be implemented on the server. In other embodiments, a portion of the physio rehabilitation system may be implemented on the user device. Additionally, a portion of the physio rehabilitation system may be implemented on the AR/VR system. Alternatively, the physio rehabilitation system may be implemented in one or more databases. The physio rehabilitation system may be implemented using software, hardware, or a combination of software and hardware in one or more of the above-mentioned components within the physio rehabilitation platform.
  • the user device 101-1, 101-2 may comprise an imaging sensor 107-1, 107-2 that serves as an imaging device.
  • the imaging device 107-1, 107-2 may be on-board the user device.
  • the imaging device can include hardware and/or software elements.
  • the imaging device may be a camera or imaging sensor operably coupled to the user device.
  • the imaging device may be located external to the user device, and image data of a body part or limbs of the user may be transmitted to the user device via communication means as described elsewhere herein.
  • the imaging device can be controlled by an application/software configured to take image or video of the user.
  • the camera may be configured to take a 2D image of at least a body part of the user.
  • the software and/or applications may be configured to control the camera on the user device to take image or video.
  • the imaging device 107-1, 107-2 may be a fixed lens or auto focus lens camera.
  • a camera can be a movie or video camera that captures dynamic image data (e.g., video).
  • a camera can be a still camera that captures static images (e.g., photographs).
  • a camera may capture both dynamic image data and static images.
  • a camera may switch between capturing dynamic image data and static images.
  • the camera may comprise optical elements (e.g., lens, mirrors, filters, etc).
  • the camera may capture color images (RGB images), greyscale images, and the like.
  • the imaging device 107-1, 107-2 may be a camera used to capture visual images of at least part of the human body. Any other type of sensor may be used, such as an infra-red sensor that may be used to capture thermal images of the human body.
  • the imaging sensor may collect information anywhere along the electromagnetic spectrum, and may generate corresponding images accordingly.
  • the imaging device may be capable of operation at a fairly high resolution.
  • the imaging sensor may have a resolution of greater than or equal to about 100 mm, 50 mm, 10 mm, 5 mm, 2 mm, 1 mm, 0.5 mm, 0.1 mm, 0.05 mm, 0.01 mm, 0.005 mm, 0.001 mm, 0.0005 mm, or 0.0001 mm.
  • the image sensor may be capable of collecting 4K or higher images.
  • the imaging device 107-1, 107-2 may capture an image frame or a sequence of image frames at a specific image resolution.
  • the image frame resolution may be defined by the number of pixels in a frame.
  • the image resolution may be greater than or equal to about 352x420 pixels, 480x320 pixels, 720x480 pixels, 1280x720 pixels, 1440x1080 pixels, 1920x1080 pixels, 2048x1080 pixels, 3840x2160 pixels, 4096x2160 pixels, 7680x4320 pixels, or 15360x8640 pixels.
  • the imaging device 107-1, 107-2 may capture a sequence of image frames at a specific capture rate.
  • the sequence of images may be captured at a rate less than or equal to about one image every 0.0001 seconds, 0.0002 seconds, 0.0005 seconds, 0.001 seconds, 0.002 seconds, 0.005 seconds, 0.01 seconds, 0.02 seconds, 0.05 seconds, 0.1 seconds, 0.2 seconds, 0.5 seconds, 1 second, 2 seconds, 5 seconds, or 10 seconds.
  • the capture rate may change depending on user input and/or external conditions (e.g. illumination brightness).
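A capture rate that varies with conditions implies some camera frames are skipped rather than processed. A minimal sketch of such frame throttling, with an assumed 100 ms processing interval, might look like:

```python
class FrameThrottle:
    """Decide which frames to process given a target capture interval.
    Useful when the camera delivers frames faster than the pose model
    can consume them. Timestamps are integer milliseconds."""
    def __init__(self, interval_ms: int):
        self.interval_ms = interval_ms
        self._last = None

    def accept(self, timestamp_ms: int) -> bool:
        if self._last is None or timestamp_ms - self._last >= self.interval_ms:
            self._last = timestamp_ms
            return True
        return False

# Camera at ~30 fps (a frame every 33 ms), model sampled every 100 ms:
throttle = FrameThrottle(100)
times_ms = list(range(0, 1000, 33))
kept = [t for t in times_ms if throttle.accept(t)]
print(f"{len(kept)} of {len(times_ms)} frames processed")  # -> 8 of 31 frames processed
```

The interval could be adjusted at runtime (e.g., lengthened under poor illumination, as the bullet above suggests) simply by changing `interval_ms`.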
  • the imaging device 107-1, 107-2 may be configured to obtain image data to track motion or posture of a user.
  • the imaging device may or may not be a 3D camera, stereo camera or depth camera.
  • computer vision techniques and deep learning techniques may be used to reconstruct 3D pose using 2D imaging data.
  • the imaging device may be a monocular camera, and images of the user may be taken from a single view/angle.
  • User device 101-1, 101-2 may comprise one or more imaging devices for capturing image data of one or more users 103-1, 103-2 co-located with the user device.
  • the captured image data may then be analyzed by the physio rehabilitation system to measure a range of motion of a body part or limbs.
  • the image data may be processed to identify a predicted three-dimensional (3D) pose of the user and a range of motion/movement of the user may be measured with high precision and accuracy.
  • the image data may be 2D image data or video data.
  • the image data may be color (e.g., RGB) images or 2D keypoints.
  • the image data may be raw data captured by a user device camera without extra setup or cost. Details about using computer vision and machine learning techniques for pose estimation are described later herein.
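Before 2D keypoints are lifted to a 3D pose, they are commonly normalized by centering on a root joint and scaling by a reference segment. This is standard practice in pose-estimation pipelines rather than a step specified by the disclosure, and the joint indices below are hypothetical:

```python
def normalize_keypoints(kps_2d, root=0, ref_pair=(0, 1)):
    """Common preprocessing before lifting 2D keypoints to 3D: translate
    so the root joint sits at the origin, and scale so a reference segment
    (e.g. hip -> neck) has unit length. Joint indices are hypothetical."""
    rx, ry = kps_2d[root]
    a, b = kps_2d[ref_pair[0]], kps_2d[ref_pair[1]]
    scale = ((a[0] - b[0]) ** 2 + (a[1] - b[1]) ** 2) ** 0.5
    return [((x - rx) / scale, (y - ry) / scale) for x, y in kps_2d]

# Pixel-space keypoints: root (hip) at (320, 400), neck at (320, 200).
kps = [(320.0, 400.0), (320.0, 200.0), (420.0, 400.0)]
print(normalize_keypoints(kps))
# -> [(0.0, 0.0), (0.0, -1.0), (0.5, 0.0)]
```

Normalization like this makes the lifting model invariant to where the user stands in the frame and how far they are from the camera, which matters when the only input is an uncalibrated phone camera or webcam.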
  • User device 101-1, 101-2 may be a computing device configured to perform one or more operations consistent with the disclosed embodiments.
  • Examples of user devices may include, but are not limited to, mobile devices, smartphones/cellphones, tablets, personal digital assistants (PDAs), laptop or notebook computers, desktop computers, media content players, television sets, video gaming station/system, virtual reality systems, augmented reality systems, microphones, or any electronic device capable of analyzing, receiving, providing or displaying certain types of feedback data (e.g., rehabilitation progress, motion quantification analysis) to a user.
  • the user device may be a handheld object.
  • the user device may be portable.
  • the user device may be carried by a human user. In some cases, the user device may be located remotely from a human user, and the user can control the user device using wireless and/or wired communications.
  • User device 101-1, 101-2 may include one or more processors that are capable of executing non-transitory computer readable media that may provide instructions for one or more operations consistent with the disclosed embodiments.
  • the user device may include one or more memory storage devices comprising non-transitory computer readable media including code, logic, or instructions for performing the one or more operations.
  • the user device may include software applications that allow the user device to communicate with and transfer data between AR/VR system 105-1, 105-2, server 120, physio rehabilitation system 121, and/or database 109.
  • the user device may include a communication unit, which may permit communications with one or more other components in the physio rehabilitation platform 100.
  • the communication unit may include a single communication module, or multiple communication modules.
  • the user device may be capable of interacting with one or more components in the physio rehabilitation platform 100 using a single communication link or multiple different types of communication links.
  • User device 101-1, 101-2 may include a display.
  • the display may be a screen.
  • the display may or may not be a touchscreen.
  • the display may be a light-emitting diode (LED) screen, OLED screen, liquid crystal display (LCD) screen, plasma screen, or any other type of screen.
  • the display may be configured to show a user interface (UI) or a graphical user interface (GUI) rendered through an application (e.g., via an application programming interface (API) executed on the user device).
  • the GUI may show images, charts, analytics relating to the rehabilitation progress and real-time quantification result, and the GUI may permit a user to view and receive feedback (e.g., guidance, recommendation) generated by the physio rehabilitation system.
  • the user device may also be configured to display webpages and/or websites on the Internet. One or more of the webpages/websites may be hosted by server 120 and/or rendered by the physio rehabilitation system 121.
  • a user may navigate within the GUI through the application. For example, the user may select a link by directly touching the screen (e.g., touchscreen). The user may touch any portion of the screen by touching a point on the screen. Alternatively, the user may select a portion of an image with the aid of a user interactive device (e.g., mouse, joystick, keyboard, trackball, touchpad, button, verbal commands, gesture-recognition, attitude sensor, thermal sensor, touch-capacitive sensors, or any other device).
  • a touchscreen may be configured to detect location of the user’s touch, length of touch, pressure of touch, and/or touch motion, whereby each of the aforementioned manners of touch may be indicative of a specific input command from the user.
  • users may utilize the user devices to interact with the physio rehabilitation system 121 by way of one or more software applications (i.e., client software) running on and/or accessed by the user devices, wherein the user devices and the physio rehabilitation system 121 may form a client-server relationship.
  • the client software may be available either as downloadable software or mobile applications for various types of computer devices.
  • the client software can be implemented in a combination of one or more programming languages and markup languages for execution by various web browsers.
  • the client software can be executed in web browsers that support JavaScript and HTML rendering, such as Chrome, Mozilla Firefox, Internet Explorer, Safari, and any other compatible web browsers.
  • the various embodiments of client software applications may be compiled for various devices, across multiple platforms, and may be optimized for their respective native platforms.
  • the physio rehabilitation platform 100 may comprise using a virtual or augmented reality system to place the subject in a virtual world, where the subject can be presented with visual, auditory, and/or haptic stimulation of feedback in response to measured range of motion, monitoring the subject’s interaction with the virtual and/or real world, and measuring the subject’s progress toward one or more therapeutic/rehabilitation goals.
  • the virtual reality (VR) or augmented reality (AR) system may comprise any device comprising one or more displays, such as a monitor, screen, mobile phone, computer, smartphone, laptop, tablet, television, smart television, or other device.
  • the user may access the experience or rehabilitation training through the use of supplemental headsets (e.g., Google®
  • the virtual reality (VR) or augmented reality (AR) system 105-1, 105-2 and/or the physio rehabilitation system 121 may comprise or be coupled to any other suitable devices or equipment such as smartwatches, wristbands, glasses, gloves, headgear (such as hats, helmets, virtual reality headsets, augmented reality headsets, head-mounted devices (HMD), headbands), pendants, armbands, leg bands, shoes, vests, motion sensing devices, etc.
  • the virtual reality (VR) or augmented reality (AR) system may be communicatively coupled to one or more sensors.
  • the one or more sensors may be integrated in the VR/AR device or external to, and operatively coupled to, the VR/AR device, such as via wired or wireless (e.g., Bluetooth, Wi-Fi, Near Field Communication (NFC), etc.) connections.
  • the one or more sensors may be capable of collecting data on the user, such as the user’s interactions, reactions, and/or responses to one or more components and/or stimulations in the VR or AR experience.
  • sensors may include inertial sensors (e.g., accelerometers, gyroscopes, and/or gravity detection sensors, which may form inertial measurement units (IMUs)), location sensors (e.g., global positioning system (GPS) sensors, mobile device transmitters enabling location triangulation), heart rate monitors, external temperature sensors, skin temperature sensors, capacitive touch sensors, sensors configured to detect a galvanic skin response (GSR), vision sensors (e.g., imaging devices capable of detecting visible, infrared, or ultraviolet light, such as cameras), proximity or range sensors (e.g., ultrasonic sensors, lidar, time-of-flight or depth cameras), altitude sensors, attitude sensors (e.g., compasses), pressure sensors (e.g., barometers), humidity sensors, vibration sensors, audio sensors (e.g., microphones), and/or field sensors (e.g., magnetometers, electromagnetic sensors, radio sensors).
  • At least part of the subject’s interactions, reactions, and/or responses to the stimulations presented in the virtual world can be quantified based at least in part on sensory data measured for the subject, such as a range of motion, posture, reaction time, response volume, and/or other forms or units of outputs by the subject.
  • the subject’s motion is quantified in real-time based on image data captured by the user device camera alone. Details about the motion quantification are described later herein.
  • any examples herein of sensors that may be present in AR/VR system may also apply to the user device.
  • one or more different sensors may be incorporated into the user device.
  • feedback to the user motion may be delivered in real-time in the form of the VR or AR experience.
  • a user may receive visual, audible, haptic or other forms of feedback in the VR or AR experience.
  • the feedback may be delivered without the VR or AR system.
  • charts in a GUI, audible command and the like may be delivered to the user via the user device.
  • the VR or AR experience may comprise one or more VR or AR scenes.
  • the VR or AR experience may comprise a time-dependent progression of one or more VR or AR scenes.
  • the VR or AR scenes may be dynamic, such as comprising one or more dynamic components (e.g., animation, audio, etc.) and/or components that can be triggered to change.
  • the user may be capable of interacting with, or reacting to or responding to, one or more components of the VR or AR scenes.
  • the user may have a stereoscopic view of the one or more VR or AR scenes in the VR or AR experience.
  • the VR or AR experience can be a 360° experience.
  • the VR or AR experience may be capable of presenting one or more stimulations, such as visual, audio, and/or haptic stimulations.
  • the one or more stimulations may be provided via one or more external devices operatively coupled to the AR/VR system, such as via wired or wireless connections.
  • external devices can include, for example, other displays, screens, speakers, headphones, earphones, controllers, actuators, lamps, or other devices capable of providing visual, audio, and/or haptic output to the user.
  • User device 101-1, 101-2 and AR/VR system 105-1, 105-2 may be operated by one or more users consistent with the disclosed embodiments.
  • a user may be associated with a unique user device and an AR/VR system.
  • a user may be associated with a plurality of user devices and AR/VR systems.
  • a user as described herein may refer to an individual or a group of individuals who are seeking rehabilitation, exercise training, or to improve their well-being through the physio rehabilitation system.
  • a user may be an individual suffering from musculoskeletal disease (MSK).
  • a user as described herein may also include an individual or a group of individuals who are seeking education about rehabilitation training, such as therapists, service providers or trainers.
  • a user as described herein may also include a real therapist, physician, supervisor or operator who is monitoring or coaching a patient/user through the physio rehabilitation system.
  • User device 101-1, 101-2 and/or the AR/VR system 105-1, 105-2 may be configured to receive input from one or more users.
  • a user may provide an input to the user device using an input device, for example, a keyboard, a mouse, a touch-screen panel, voice recognition and/or dictation software, built-in input device of the AR/VR system or any combination of the above.
  • the user input may include statements, comments, questions, or answers relating to rehabilitation or training, such as interaction with a therapist or supervisor in real life through the platform, or alternatively as an avatar in the virtual environment.
  • the user input may be Patient-Reported Outcome Measures (PROMs) data which provide precise and accurate information about the rehabilitation outcome.
  • the PROMs data may be collected in a multimodal fashion.
  • Proprietary portal or personalized questions may be designed to collect PROMs data.
  • standard questions, such as the full Knee Injury and Osteoarthritis Outcome Score (KOOS) questionnaire plus the PROMIS-10 Global (10 questions) for knee replacement, may be used. These user inputs can be important for training a personalized rehabilitation/coaching model.
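For the KOOS example above, individual items are conventionally answered on a 0-4 scale and transformed to a 0-100 subscale score, where 100 indicates no knee problems. A sketch of that standard transformation (its use as the PROMs aggregation step here is illustrative, not taken from the disclosure):

```python
def koos_subscale(item_scores):
    """Transform KOOS questionnaire items (each answered 0-4, where 0 means
    no problems) to the conventional 0-100 scale, on which 100 indicates no
    knee problems. This is the standard KOOS transformation; using it as the
    PROMs aggregation step here is an illustration."""
    mean = sum(item_scores) / len(item_scores)
    return 100.0 - mean * 100.0 / 4.0

# A patient reporting mild difficulty (score 1) on every item of a subscale:
print(koos_subscale([1, 1, 1, 1, 1]))  # -> 75.0
```

Scores like this, collected over time, give the platform a longitudinal outcome signal to pair with the camera-derived motion metrics.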
  • Server 120 may be one or more server computers configured to perform one or more operations consistent with the disclosed embodiments.
  • the server may be implemented as a single computer, through which user device and AR/VR systems are able to communicate with the physio rehabilitation system and database.
  • the user device and/or the AR/VR system communicate with the physio rehabilitation system directly through the network.
  • the server may communicate on behalf of the user device and/or the AR/VR systems with the physio rehabilitation system or database through the network.
  • the server may embody the functionality of one or more of physio rehabilitation systems.
  • one or more physio rehabilitation systems may be implemented inside and/or outside of the server.
  • the physio rehabilitation systems may be software and/or hardware components included with the server or remote from the server.
  • the user device and/or the AR/VR system may be directly connected to the server through a separate link (not shown in FIG. 1).
  • the server may be configured to operate as a front-end device configured to provide access to physio rehabilitation system consistent with certain disclosed embodiments.
  • the server may, in some embodiments, utilize one or more physio rehabilitation systems to analyze data streams from the user device and/or AR/VR system in order to quantify a range of motion, contact force, gait or other rehabilitation metrics, and to provide feedback information (e.g., coaching instruction, command, recommendation) to assist the user in correcting a posture.
  • the server may also be configured to store, search, retrieve, and/or analyze data and information stored in one or more of the databases.
  • the data and information may include raw data collected from the imaging device on the user device, as well as each user's historical data pattern, rehabilitation metrics (e.g., pain level, gait, range of motion, contact force, etc.), medical record and user-provided information. While FIG. 1 illustrates the server as a single server, in some embodiments, multiple devices may implement the functionality associated with the server.
  • a server may include a web server, an enterprise server, or any other type of computer server, and can be computer programmed to accept requests (e.g., HTTP, or other protocols that can initiate data transmission) from a computing device (e.g., user device and/or wearable device) and to serve the computing device with requested data.
  • a server can be a broadcasting facility, such as free-to-air, cable, satellite, and other broadcasting facility, for distributing data.
  • a server may also be a server in a data network (e.g., a cloud computing network).
  • a server may include known computing components, such as one or more processors, one or more memory devices storing software instructions executed by the processor(s), and data.
  • a server can have one or more processors and at least one memory for storing program instructions.
  • the processors can be a single or multiple microprocessors, field programmable gate arrays (FPGAs), or digital signal processors (DSPs) capable of executing particular sets of instructions.
  • Computer-readable instructions can be stored on a tangible non-transitory computer-readable medium, such as a hard disk, a CD-ROM (compact disk-read only memory), an MO (magneto-optical) disk, a DVD-ROM (digital versatile disk-read only memory), a DVD-RAM (digital versatile disk-random access memory), or a semiconductor memory.
  • the methods can be implemented in hardware components or combinations of hardware and software such as, for example, ASICs, special purpose computers, or general purpose computers.
  • Network 110 may be a network that is configured to provide communication between the various components illustrated in FIG. 1.
  • the network may be implemented, in some embodiments, as one or more networks that connect devices and/or components in the network layout for allowing communication between them.
  • user device 101-1, 101-2, AR/VR system 105-1, 105-2, and physio rehabilitation system 121 may be in operable communication with one another over network 110.
  • Direct communications may be provided between two or more of the above components. The direct communications may occur without requiring any intermediary device or network. Indirect communications may be provided between two or more of the above components.
  • the indirect communications may occur with aid of one or more intermediary device or network.
  • indirect communications may utilize a telecommunications network.
  • Indirect communications may be performed with aid of one or more router, communication tower, satellite, or any other intermediary device or network.
  • types of communications may include, but are not limited to: communications via the Internet, Local Area Networks (LANs), Wide Area Networks (WANs), Bluetooth, Near Field Communication (NFC) technologies, networks based on mobile data protocols such as General Packet Radio Services (GPRS), GSM, Enhanced Data GSM Environment (EDGE), 3G, 4G, 5G or Long Term Evolution (LTE) protocols, Infra-Red (IR) communication technologies, and/or Wi-Fi, and may be wireless, wired, or a combination thereof.
  • the network may be implemented using cell and/or pager networks, satellite, licensed radio, or a combination of licensed and unlicensed radio.
  • the network may be wireless, wired, or a combination thereof.
  • User device 101-1, 101-2, AR/VR system 105-1, 105-2, server 120, and/or physio rehabilitation system 121 may be connected or interconnected to one or more databases 109, 123.
  • the databases may be one or more memory devices configured to store data. Additionally, the databases may also, in some embodiments, be implemented as a computer system with a storage device. In one aspect, the databases may be used by components of the network layout to perform one or more operations consistent with the disclosed embodiments.
  • One or more local databases and cloud databases of the platform may utilize any suitable database techniques. For instance, a structured query language (SQL) or "NoSQL" database may be utilized for storing the image data, user data, historical data, predictive models or algorithms.
  • databases may be implemented using various standard data-structures, such as an array, hash, (linked) list, struct, structured text file (e.g., XML), table, JavaScript Object Notation (JSON), NOSQL and/or the like. Such data-structures may be stored in memory and/or in (structured) files.
  • an object-oriented database may be used.
  • Object databases can include a number of object collections that are grouped and/or linked together by common attributes; they may be related to other object collections by some common attributes. Object-oriented databases perform similarly to relational databases with the exception that objects are not just pieces of data but may have other types of functionality encapsulated within a given object.
  • the database may include a graph database that uses graph structures for semantic queries, with nodes, edges and properties to represent and store data.
  • databases may be implemented as a data-structure, the use of the database of the present invention may be integrated into another component such as the component of the present invention. Also, the database may be implemented as a mix of data structures, objects, and relational structures. Databases may be consolidated and/or distributed in variations through standard data processing techniques. Portions of databases, e.g., tables, may be exported and/or imported and thus decentralized and/or integrated.
  • the data management system may construct the database for fast and efficient data retrieval, query and delivery.
  • the physio rehabilitation system may provide customized algorithms to extract, transform, and load (ETL) the data.
  • the physio rehabilitation system may construct the databases using proprietary database architecture or data structures to provide an efficient database model that is adapted to large scale databases, is easily scalable, is efficient in query and data retrieval, or has reduced memory requirements in comparison to using other data structures.
  • the databases may comprise storage containing a variety of data consistent with disclosed embodiments.
  • the databases may store, for example, raw data collected by the imaging device located on user device.
  • the databases may also store user information, historical data patterns, data relating to a rehabilitation/training progress, medical records, analytics, user input (e.g., statements or comments indicative of how the user is feeling at different points in time, etc.), predictive models, algorithms, training datasets (e.g., video clips), and the like.
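A minimal sketch of how such a metrics store might be organized, here using SQLite; the table and column names are illustrative and not specified by the patent:

```python
import sqlite3

# Illustrative relational store for per-session rehabilitation metrics.
conn = sqlite3.connect(":memory:")
conn.execute("""
    CREATE TABLE rehab_metrics (
        user_id     TEXT NOT NULL,
        recorded_at TEXT NOT NULL,   -- ISO-8601 timestamp
        metric      TEXT NOT NULL,   -- e.g. 'rom_deg', 'pain_level'
        value       REAL NOT NULL
    )
""")
# Index chosen to make per-user time-range queries fast, matching the
# dashboard's historical charts.
conn.execute("CREATE INDEX idx_user_time ON rehab_metrics (user_id, recorded_at)")

rows = [
    ("patient-1", "2020-07-01T10:00:00", "rom_deg", 95.0),
    ("patient-1", "2020-07-08T10:00:00", "rom_deg", 110.0),
    ("patient-1", "2020-07-08T10:05:00", "pain_level", 3.0),
]
conn.executemany("INSERT INTO rehab_metrics VALUES (?, ?, ?, ?)", rows)

# Query a user's range-of-motion history for the therapist dashboard.
rom_history = conn.execute(
    "SELECT recorded_at, value FROM rehab_metrics "
    "WHERE user_id = ? AND metric = 'rom_deg' ORDER BY recorded_at",
    ("patient-1",),
).fetchall()
```

A production deployment would more likely use one of the distributed or NoSQL stores mentioned above, but the narrow "one row per metric sample" layout generalizes to those as well.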
  • one or more of the databases may be co-located with the server, may be co-located with one another on the network, or may be located separately from other devices.
  • the disclosed embodiments are not limited to the configuration and/or arrangement of the database(s).
  • a server may access and execute physio rehabilitation system(s) to perform one or more processes consistent with the disclosed embodiments.
  • the physio rehabilitation system(s) may be software stored in memory accessible by a server (e.g., in memory local to the server or remote memory accessible over a communication link, such as the network).
  • the physio rehabilitation system(s) may be implemented as one or more computers, as software stored on a memory device accessible by the server, or a combination thereof.
  • one physio rehabilitation system may be a computer executing one or more motion quantification algorithms and coaching algorithms.
  • another physio rehabilitation system may be software that, when executed by a server, performs one or more motion quantification algorithms and coaching algorithms.
  • though the physio rehabilitation system 121 is shown hosted on the server 120, the physio rehabilitation system may alternatively be implemented as a hardware accelerator, software executable by a processor, or various others.
  • the physio rehabilitation system may employ an edge intelligence paradigm in which data processing and prediction are performed at the edge or an edge gateway.
  • machine learning model may be built, developed and trained on the cloud/data center 120 and run on the user device and/or AR/VR systems (e.g., hardware accelerator). Details about the physio rehabilitation system and its components are described later herein with respect to FIG. 4.
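The train-on-cloud, run-on-edge split described above can be sketched as follows. The serialization format, the toy linear "model", and all function names are illustrative stand-ins for the real pose model and deployment pipeline:

```python
import json

def export_model(weights, version):
    """Cloud side: package trained parameters as a versioned artifact."""
    return json.dumps({"version": version, "weights": weights})

def load_model(artifact):
    """Edge side: restore parameters from the downloaded artifact."""
    payload = json.loads(artifact)
    return payload["version"], payload["weights"]

def predict(weights, features):
    """Toy linear inference standing in for the real pose model."""
    return sum(w * x for w, x in zip(weights, features))

artifact = export_model([0.5, -0.25], version=3)  # happens on the cloud
version, weights = load_model(artifact)           # happens on the device
score = predict(weights, [2.0, 4.0])              # local, low-latency inference
```

Versioning the artifact is what lets the cloud keep retraining while edge devices download only the updated parameters, as described later for the continual-training flow.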
  • the subject’s progress toward one or more therapeutic or rehabilitation goals can be quantified based at least in part on the motion quantification, contact force calculation or other metrics.
  • the subject’s progress toward one or more therapeutic goals can also be measured based on qualitative observations made by another user monitoring the subject’s interactions, reactions, and responses in the virtual world, such as a therapist, operator, or service provider.
  • the subject’s progress toward one or more therapeutic goals can be measured based on objective feedback from the virtual coach or a coaching module, or subjective feedback from a remote real therapist or service provider.
  • FIG. 2 illustrates an example of implementing the physio rehabilitation platform 200.
  • a physio rehabilitation platform 200 may comprise a user device 201 deployed in a location 203 remote from a supervising location 213, where a therapist or physician is permitted to access real-time analysis of the user's motion through a therapist portal 211.
  • the user device 201 can be the same as the user device as described in FIG. 1.
  • the therapist portal 211 may be provided within an application running on a computing device located with the therapist.
  • the therapist portal 211 may provide real-time motion analysis (e.g., quantification of motion) and historical statistics (e.g., charts, records, patient information) about the user. Both the therapist portal 211 and the user application are provided by the physio rehabilitation system 121.
  • the computing device for the therapist to access the portal 211 may or may not comprise a camera.
  • the physio rehabilitation system may be implemented inside and/or outside of a server.
  • the physio rehabilitation system may be software and/or hardware components included with a server, or remote from the server.
  • the physio rehabilitation system (or one or more functions of the physio rehabilitation system) may be implemented on the user device and/or the therapist device.
  • the user device, therapist device, and/or server may be configured to perform different functions of the physio rehabilitation system.
  • one or more functions of the physio rehabilitation system may be duplicated across the user device, therapist device, and/or server.
  • user device 201 may comprise at least one imaging device.
  • the user device 201 may comprise a camera for capturing video of the user.
  • the captured data stream may be processed locally at the user device and/or remotely at the physio rehabilitation system 121 to quantify a range of motion of a body part of the user.
  • the user device 201 may be configured to provide input data to the physio rehabilitation system.
  • the input data may comprise sensor data and user input as described above.
  • the sensor data may comprise raw data collected by the imaging device on the user device.
  • the sensor data may be stored in memory located on the user device, the therapist device, and/or server.
  • the sensor data may be stored in one or more databases.
  • the databases may be located on the server, coupled to user device, and/or therapist device. Alternatively, the databases may be located remotely from the server, user device, and/or the therapist device.
  • the user input may be provided by a user via the user device and/or the user application running on the user device.
  • the user input may be in response to questions provided by the physio rehabilitation system. Examples of questions may relate to pain level and mobility of a body part (e.g., ankle, knees, etc.).
  • the user’s responses to those questions may be used to supplement the image data to determine the personalized coaching or feedback.
  • This information obtained from the user input can be analyzed using machine learning techniques (e.g., natural language processing) and computer vision methods.
  • an NLP engine may be utilized to process the input data (e.g., input text captured from the survey, voice input, etc.) and produce a structured output including the linguistic information.
  • the NLP engine may employ any suitable NLP techniques such as a parser to perform parsing on the input text.
  • a parser may include instructions for syntactically, semantically, and lexically analyzing the text content of the user input and identifying relationships between text fragments in the user input.
  • the parser makes use of syntactic and morphological information about individual words found in the dictionary or "lexicon" or derived through morphological processing (organized in the lexical analysis stage).
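As a deliberately simplified stand-in for such an NLP engine, the sketch below extracts a numeric pain rating and a coarse sentiment from a free-text answer using regular expressions and keyword lists; a production system would apply the full syntactic, semantic and lexical analysis described above. All word lists and patterns are invented for illustration:

```python
import re

NEGATIVE_WORDS = {"pain", "sore", "stiff", "swollen", "worse"}
POSITIVE_WORDS = {"better", "improving", "fine", "stronger"}

def parse_user_input(text):
    """Extract a pain rating like '6/10' or '6 out of 10' plus a coarse
    keyword-based sentiment from a free-text survey response."""
    lowered = text.lower()
    tokens = re.findall(r"[a-z]+", lowered)
    match = re.search(r"\b(10|[0-9])\s*(?:/|out of)\s*10\b", lowered)
    pain = int(match.group(1)) if match else None
    score = sum(t in POSITIVE_WORDS for t in tokens) - sum(
        t in NEGATIVE_WORDS for t in tokens)
    sentiment = ("positive" if score > 0
                 else "negative" if score < 0 else "neutral")
    return {"pain_level": pain, "sentiment": sentiment}

result = parse_user_input("My knee feels sore, maybe 6/10, but it is improving.")
# result: {'pain_level': 6, 'sentiment': 'neutral'}
```

The structured output can then supplement the image data when the coaching model personalizes its feedback.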
  • the physio rehabilitation system 121 may be configured to obtain and analyze data from at least one imaging device located on the user device.
  • the physio rehabilitation system may be configured to analyze the image data stream to quantify a range of motion of a body part of the user.
  • data related to the user and methods such as the pose prediction algorithm, quantification algorithm, coaching algorithm and user interaction algorithm may be stored in a database 211 accessible by the physio rehabilitation system 121.
  • the physio rehabilitation system 121 may employ machine learning and computer vision-based pose estimation algorithm to provide accurate quantification of range of motion based on image data.
  • the pose estimation algorithm may be capable of quantifying range of motion using 2D image data with improved precision and accuracy.
  • the pose estimation algorithm may also be capable of calculating contact force or other derivative metrics based on 2D image data.
  • the pose estimation algorithm can be any type of machine learning network such as a neural network.
  • neural networks include a deep neural network, convolutional neural network (CNN), and recurrent neural network (RNN).
  • the machine learning algorithm may comprise one or more of the following: a support vector machine (SVM), a naive Bayes classification, a linear regression, a quantile regression, a logistic regression, a random forest, a neural network, CNN, RNN, a gradient-boosted classifier or regressor, or another supervised or unsupervised machine learning algorithm (e.g., generative adversarial network (GAN), CycleGAN, etc.).
  • FIG. 3 shows an exemplary pose estimation algorithm 300.
  • the pose estimation algorithm 300 may be an unsupervised learning approach to recover 3D human pose from 2D skeletal joints extracted from a single image.
  • the input 2D pose may be 2D image data captured by the user device camera as described above.
  • the pose estimation algorithm may not require any multi-view image data, 3D skeletons, correspondences between 2D-3D points, or use previously learned 3D priors during training.
  • the pose estimation algorithm 300 may comprise a lifting network 303 that is trained to estimate a 3D skeleton from 2D poses.
  • the lifting network may accept 2D landmarks 301 as inputs and generate a corresponding 3D skeleton estimate 305.
  • the recovered 3D skeleton is re-projected onto random camera viewpoints to generate new 'synthetic' 2D poses 305.
  • self-consistency loss both in 3D and in 2D may be defined.
  • the training can be self-supervised by exploiting the geometric self-consistency of the lift-reproject-lift process.
  • the pose estimation algorithm 300 may also comprise a 2D pose discriminator 307 to enable the lifter to output valid 3D poses.
  • an unsupervised 2D domain adapter network is trained to allow for an expansion of 2D data. This improves results and demonstrates the usefulness of 2D pose data for unsupervised 3D lifting.
  • the output of the machine learning model is a 3D skeleton.
  • the output of the machine learning model (i.e., lifting network) may be further analyzed for calculating a range of motion and/or contact force.
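The lift-reproject-lift loop above can be illustrated with a toy stand-in lifter that assigns every 2D joint a fixed depth (the real lifter is a learned network predicting per-joint depths). An imperfect lifter like this one incurs a nonzero 2D self-consistency loss under a synthetic camera rotation, which is exactly what the self-supervised training penalizes:

```python
import math

def lift(pose_2d, depth=1.0):
    """Stand-in lifter: 2D joints (x, y) -> 3D joints (x, y, z)."""
    return [(x, y, depth) for x, y in pose_2d]

def rotate_y(pose_3d, angle):
    """View the 3D skeleton from a new synthetic camera (rotation about y)."""
    c, s = math.cos(angle), math.sin(angle)
    return [(c * x + s * z, y, -s * x + c * z) for x, y, z in pose_3d]

def project(pose_3d):
    """Orthographic projection back to 2D (drop the depth coordinate)."""
    return [(x, y) for x, y, _ in pose_3d]

def consistency_loss(pose_2d, angle):
    """Lift, rotate to a synthetic view, project, lift again, rotate back,
    project, and compare with the original 2D pose (squared error)."""
    synthetic_2d = project(rotate_y(lift(pose_2d), angle))
    recovered_2d = project(rotate_y(lift(synthetic_2d), -angle))
    return sum((xa - xb) ** 2 + (ya - yb) ** 2
               for (xa, ya), (xb, yb) in zip(pose_2d, recovered_2d))

pose = [(0.5, 0.2), (-0.3, 0.8)]
consistency_loss(pose, 0.0)  # identity view: loss is 0.0
consistency_loss(pose, 0.7)  # fixed-depth lifter is inconsistent: loss > 0
```

Minimizing this loss over many poses and random viewpoints is what drives the lifter toward geometrically valid 3D skeletons, with the 2D pose discriminator providing an additional realism constraint.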
  • the contact force may be predicted based on the estimated pose using a trained model or any other suitable models.
  • the model may be trained using labeled data (e.g., data pairs of force and a range of motion) that maps a pose to a predicted contact force.
  • the training dataset may include single-frame 2D images that need not come from a video.
  • the training dataset may include video data or motion captured by different types of imaging devices from diverse viewpoints.
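A hedged sketch of fitting such a pose-to-force mapping from labeled pairs, reduced to a single scalar pose feature (a joint angle) and ordinary least squares; the patent leaves the model family open, and the training pairs below are invented for illustration:

```python
def fit_linear(angles, forces):
    """Least-squares fit force ~ a * angle + b from labeled (pose, force)
    training pairs, with the pose summarized as one joint angle."""
    n = len(angles)
    mean_x = sum(angles) / n
    mean_y = sum(forces) / n
    cov = sum((x - mean_x) * (y - mean_y) for x, y in zip(angles, forces))
    var = sum((x - mean_x) ** 2 for x in angles)
    a = cov / var
    return a, mean_y - a * mean_x

def predict_force(model, angle):
    a, b = model
    return a * angle + b

# Hypothetical labeled pairs: knee angle (degrees) vs. contact force (N).
model = fit_linear([10.0, 20.0, 30.0], [100.0, 200.0, 300.0])
predict_force(model, 25.0)  # -> 250.0 for this perfectly linear toy data
```

Any of the regressors listed earlier (random forest, gradient boosting, neural network) could replace the linear fit without changing the surrounding pipeline.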
  • a video may contain one or more subjects in one frame performing an array of actions.
  • temporal 2D pose sequences (e.g., video sequences of actions) may also be used.
  • while the pose estimation algorithm described herein uses unsupervised machine learning as an example, it should be noted that the disclosure is not limited thereto and can use supervised learning and/or other approaches.
  • the quantitative analysis of the range of motion, contact forces and/or other metrics may then be used as input to a coaching algorithm for generating real time feedback.
  • the coaching algorithm may utilize machine learning techniques (e.g., natural language processing) to generate interactive information delivered to the user.
  • the coaching algorithm may comprise voice recognition technology to enable rehabilitation training for the user in the VR or AR space.
  • the feedback may be delivered through the AR/VR system.
  • user input such as content of response to avatar-based questions and emotion, tone and other user response captured through the AR/VR system or user device may also be analyzed by the coaching algorithm so as to generate a real-time personalized feedback.
  • the feedback may comprise, for example, count of repetitions being performed, warning of an incorrect or incomplete pose, suggestion or recommendation of correcting a pose, and various others.
  • the feedback may be delivered to the user through the AR or VR space and/or through the user device.
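One piece of such feedback, the repetition count, can be sketched as hysteresis-style threshold crossing on the measured joint-angle series; the thresholds below are illustrative, not values from the patent:

```python
def count_reps(angle_series, up_threshold=90.0, down_threshold=30.0):
    """Count one repetition each time the joint angle rises above the
    'flexed' threshold and then returns below the 'extended' threshold.
    The two-threshold hysteresis avoids double-counting noisy samples."""
    reps, flexed = 0, False
    for angle in angle_series:
        if not flexed and angle >= up_threshold:
            flexed = True                    # top of the movement reached
        elif flexed and angle <= down_threshold:
            flexed, reps = False, reps + 1   # full repetition completed
    return reps

angles = [10, 45, 95, 100, 60, 20, 15, 50, 92, 40, 25, 10]
count_reps(angles)  # -> 2
```

An incomplete pose (one that never reaches the upper threshold) simply fails to increment the counter, which is one way the system could trigger the "incomplete pose" warning mentioned above.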
  • FIG. 4 shows exemplary components of a physio rehabilitation system 420, in accordance with embodiments of the invention.
  • the physio rehabilitation system 420 may comprise a pose estimation module 421, coaching module 423 and a user interface (UI) module 425.
  • the physio rehabilitation system 420 may be in communication with one or more client terminals 410, 410-N.
  • the client terminal 410 may include client software, which includes a data processing module 411 and a data communication module 415.
  • the pose estimation module 421 may be configured to analyze data stream from the client terminal 410 to quantify a range of motion in real-time using algorithms described above.
  • the coaching module 423 may be configured to generate real-time feedback for coaching a user using algorithms described elsewhere herein.
  • the physio rehabilitation system 420 or a portion of the physio rehabilitation system 420 may be implemented on an edge intelligence platform.
  • a predictive model such as the pose estimation model or a coaching model may be a software-based solution based on fog computing concepts, which extend data processing and prediction closer to the edge (e.g., client terminal). Maintaining close proximity to the edge devices (e.g., camera, user device, AR/VR system), rather than sending all data to a distant centralized cloud, minimizes latency, allowing for maximum performance, faster response times, and more effective maintenance and operational strategies. It also significantly reduces overall bandwidth requirements and the cost of managing widely distributed networks.
  • the provided physio rehabilitation system 420 may employ an edge intelligence paradigm that at least a portion of data processing can be performed at the edge.
  • the predictive model for pose estimation may be built, developed, trained, maintained by the pose estimation module 421 (on the cloud), and run on the edge device or client terminal (e.g., hardware accelerator).
  • image data may be transmitted from the client terminal to the pose estimation module for real-time motion quantification.
  • one or more trained models may be capable of accounting for the variability among users (e.g., patients) and continually improve without relying on supervised features (e.g., labeled data).
  • the one or more predictive models may be adapted to individuals.
  • the platform may provide adaptive models with continual training or improvement after deployment.
  • the predictive model provided by the platform may be dynamically adjusted and tuned to adapt to different users over time.
  • the predictive model provided by the platform may be improved continuously over time (e.g., during implementation, after deployment). Such continual training and improvement may be performed automatically with little user input or user intervention.
  • the platform may also allow a physician or a patient's caretakers to monitor the movement or exercises.
  • the provided methods and systems can be applied in various scenarios such as in cloud or an on-premises environment.
  • the physio rehabilitation system 420 may employ an edge intelligence paradigm in which data processing and prediction are performed at the edge or edge gateway 410.
  • an adaptive predictive model may be built, developed and trained on the cloud/data center 420 and run on the user device (e.g., hardware accelerator) for inference.
  • the lifting network may be pre-trained on the cloud and transmitted to the user device for implementation, then the continual training of the lifting network may be performed on the cloud as new image data are collected.
  • a fixed model may be implemented in the physical system with the training and further tuning of the model performed on the cloud.
  • image data and/or user feedback may be transmitted to the remote server 420, where they are used to update the model, and the updated model (e.g., parameters of the model that are updated) may be downloaded to the physical system (e.g., user device, VR/AR device) for implementation.
  • the data processing module 411 may comprise a trained pose estimation model to quantify a range of motion based on the raw image data local at the client terminal.
  • the output may be a 3D pose or a range of motion, and the output may be transmitted by the data communication module 415 to the physio rehabilitation system 420.
  • the data processing module 411 may not perform motion quantification of the image data.
  • the data processing module 411 may pre-process the raw image data including but not limited to, ingesting of sensor data into a local storage repository (e.g., local time-series database), data cleansing, data enrichment (e.g., decorating data with metadata), data alignment, data annotation, data tagging, or data aggregation.
  • the pre-processed image data may then be transmitted to the physio rehabilitation system 420 for quantification analysis.
  • the data communication module 415 may be configured to transmit the stream image data or data processed by the data processing module 411 to the physio rehabilitation system 420.
  • the data communication module may be configured to aggregate the raw data across a time duration (e.g., about 1, 2, 3, 4, 5, 6, 7, 8, 9, 10 seconds, about 1, 2, 3, 4, 5, 6, 7, 8, 9, 10 minutes, about 1, 2, 3, 4, 5, 6, 7, 8, 9, 10 hours, etc.).
  • raw data may be aggregated across data types (e.g., voice data, user input, image data, etc.) or sources and sent to a remote entity (e.g., third-party application server, hospital, etc.) as a package.
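The aggregation-and-package step might look like the following sketch, where samples are bucketed by data type and serialized as one JSON package; the field names and payloads are illustrative:

```python
import json
from collections import defaultdict

def aggregate(samples):
    """Bucket raw (data_type, payload) samples by type and serialize the
    buckets as a single package for a remote endpoint."""
    buckets = defaultdict(list)
    for data_type, payload in samples:
        buckets[data_type].append(payload)
    return json.dumps({"package": dict(buckets)})

package = aggregate([
    ("image", "frame-001"),
    ("voice", "clip-9"),
    ("image", "frame-002"),
])
# json.loads(package)["package"]["image"] -> ["frame-001", "frame-002"]
```

Batching samples this way is what lets the module send one request per time window instead of one per sample, trading a little latency for much lower bandwidth.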
  • the pose estimation module or the coaching module may be implemented in software, hardware, firmware, embedded hardware, standalone hardware, application specific-hardware, or any combination of these.
  • the pose estimation module or the coaching module, edge computing platform, and techniques described herein may be realized in digital electronic circuitry, integrated circuitry, specially designed ASICs (application specific integrated circuits), computer hardware, firmware, software, and/or combinations thereof.
  • These systems, devices, and techniques may include implementation in one or more computer programs that are executable and/or interpretable on a programmable system including at least one programmable processor, which may be special or general purpose, coupled to receive data and instructions from, and to transmit data and instructions to, a storage system, at least one input device, and at least one output device.
  • the user interface (UI) module 425 may be configured for representing and delivering analytics, sensor data (e.g., video), processed data, and feedback to a user (e.g., patient, therapist, etc.).
  • the UI may include a patient UI for representing real-time feedback generated by the physio rehabilitation system to the user and receiving user input from a user (e.g., through the AR/VR space, user device, etc.).
  • the UI may also include a therapist UI for a therapist to view the analytics (e.g., charts, statistics, quantification on a display) and to provide coaching and recommendations.
  • the user interface may comprise one or more user interactive devices (e.g., mouse, joystick, keyboard, trackball, touchpad, button, verbal commands, gesture-recognition, attitude sensor, thermal sensor, touch-capacitive sensors, or any other AR or VR devices).
  • the physio rehabilitation system can generate one or more graphical user interfaces (GUIs) comprising statistics of the user’s rehabilitation progress, range of motion, feedback for coaching an exercise, and the like.
  • the GUIs may be rendered on a display screen on a user device (e.g., a patient user device, a therapist device).
  • a GUI is a type of interface that allows users to interact with electronic devices through graphical icons and visual indicators such as secondary notation, as opposed to text-based interfaces, typed command labels or text navigation.
  • the actions in a GUI are usually performed through direct manipulation of the graphical elements.
  • GUIs can be found in hand-held devices such as MP3 players, portable media players, gaming devices and smaller household, office and industry equipment.
  • the GUIs may be provided in a software, a software application, a web browser, etc.
  • the GUIs may be displayed on a user device (e.g., user device 101-1, 101-2 of FIG. 1).
  • the GUIs may be provided through a mobile application. Examples of such GUIs are illustrated in FIG. 5 and FIG. 6 and described as follows.
  • FIG. 5 and FIG. 6 show examples of user interfaces provided by the physio rehabilitation system.
  • Window of FIG. 5 may be generated after a therapist device is connected to the physio rehabilitation system and data has been obtained from the physio rehabilitation system.
  • the example may display various rehabilitation or training metrics.
  • the window may display the range of motion (ROM) of the user for a month, a week, or a day, compared to a normal value and an initial value.
  • the ROM, when it exceeds a pre-determined threshold, may be flagged (color-coded), indicating that correction may be needed.
  • a pain level across a selectable period of time may be displayed in a diagram. As shown in FIG. 5, charts and diagrams showing an analysis of historical data across a selectable period of time are presented in the therapist dashboard or user portal.
  • information about the patient (e.g., name, age, gender, rehabilitation diagnosis) may be displayed.
  • rehabilitation progress may be displayed.
  • the charts and diagrams may be updated automatically upon receiving real-time data.
  • the charts and diagrams may be updated upon receiving a user command.
  • FIG. 6 shows exemplary quantification result of range of motion.
  • information about the range of motion may be displayed in real-time.
  • information about the range of motion may be further analyzed to identify an incorrectly performed pose and an indication of such incorrectly performed pose may be presented within the user interface.
  • Window of FIG. 6 may be generated after the user device is connected to the physio rehabilitation system.
  • the window may display the range of motion (ROM) of the user for a day, compared to a normal value and an initial value.
  • the ROM, when it exceeds a pre-determined threshold, may be flagged (color-coded), indicating that correction may be needed.
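Putting quantification and flagging together, here is a sketch of computing a joint angle from three estimated 3D joints (e.g., hip, knee, ankle from the lifted skeleton) and checking it against a pre-determined threshold for the color-coded dashboard flag; the coordinates and threshold are invented for illustration:

```python
import math

def joint_angle(a, b, c):
    """Angle at joint b (degrees) formed by segments b->a and b->c."""
    ba = [a[i] - b[i] for i in range(3)]
    bc = [c[i] - b[i] for i in range(3)]
    dot = sum(x * y for x, y in zip(ba, bc))
    norm = math.dist(a, b) * math.dist(c, b)
    return math.degrees(math.acos(dot / norm))

def flag_rom(angle_deg, threshold_deg=120.0):
    """Return a dashboard flag when the measured ROM exceeds the threshold."""
    return "flagged" if angle_deg > threshold_deg else "ok"

# Hypothetical lifted 3D joints (x, y, z) for hip, knee and ankle.
hip, knee, ankle = (0.0, 1.0, 0.0), (0.0, 0.5, 0.1), (0.0, 0.0, 0.0)
angle = joint_angle(hip, knee, ankle)
flag_rom(angle)  # exceeds the illustrative 120-degree threshold
```

In the real dashboard the flag would drive the color coding described above rather than return a string.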
  • the physio rehabilitation system may be configured to deliver personalized coaching and exercising guidance (e.g., recommendations) to a user, by transmitting the information to the user device and/or AR/VR system.
  • the physio rehabilitation system may be configured to proactively provide guidance to assist a user in managing wellness and following a rehabilitation program over the long term, based on the input data provided to the physio rehabilitation system.
  • the physio rehabilitation system can dynamically provide personalized recommendations to the user in real-time.
  • the personalized recommendations may also be provided at a predetermined frequency, e.g., every hour, 12 hours, 24 hours, 2 days, 4 days, etc.
  • the physio rehabilitation system can provide a personalized recommendation in real-time based on video stream data or generate a reminder (daily, weekly, monthly) when exercise lapses for a period of time.
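The lapse-reminder logic can be sketched as a simple comparison between the last recorded exercise session and a configured lapse period; the dates and default period below are illustrative:

```python
from datetime import datetime, timedelta

def needs_reminder(last_session, now, lapse=timedelta(days=1)):
    """Generate a reminder when the last recorded session is at least
    one lapse period old (daily by default; weekly/monthly via `lapse`)."""
    return (now - last_session) >= lapse

now = datetime(2020, 7, 20, 9, 0)
needs_reminder(datetime(2020, 7, 18, 9, 0), now)   # -> True (2-day lapse)
needs_reminder(datetime(2020, 7, 19, 18, 0), now)  # -> False (15 hours ago)
```

A scheduler on the server (or the mobile application itself) would evaluate this check at the chosen cadence and push the reminder to the user device.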
  • FIG. 7 shows a flow chart of a method 700 implemented by the physio rehabilitation platform.
  • the method 700 may improve the compliance to a rehabilitation or training program without additional cost.
  • a patient may receive a reminder through the application running on the user device.
  • the patient may be prompted to capture a video using the user device.
  • the captured video may be processed by the mobile application to quantify a range of motion.
  • the range of motion may be represented to the patient via the user device (e.g., GUI on the display of the user device) or via an AR/VR system.
  • the range of motion may be calculated on the cloud.
  • the quantification result may be transmitted to the backend of the physio rehabilitation platform (e.g., coaching module residing on the cloud, cloud database).
  • statistics (e.g., statistical charts of rehabilitation metrics) may be generated.
  • feedback may be generated. If the rehabilitation progress is determined to be on-track, the patient data may be updated and the program may proceed according to the coaching feedback. If the rehabilitation progress is determined to be not on-track, a real therapist may intervene and contact the patient.
  • A and/or B encompasses one or more of A or B, and combinations thereof such as A and B. It will be understood that although the terms "first," "second," "third," etc. may be used herein to describe various elements, components, regions and/or sections, these elements, components, regions and/or sections should not be limited by these terms. These terms are merely used to distinguish one element, component, region or section from another element, component, region or section. Thus, a first element, component, region or section discussed below could be termed a second element, component, region or section without departing from the teachings of the present disclosure. [0097] The terminology used herein is for the purpose of describing particular embodiments only and is not intended to be limiting of the disclosure.
  • Methods and systems are provided herein for providing rehabilitation training to a user in real-time.
  • the method may comprise: obtaining image data of the user performing a movement of a body part, wherein the image data is collected using an imaging device located on a user device; measuring a range of the movement of the body part based on the image data using a first trained model; and generating a feedback in real-time to correct the movement using a second trained model.
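The two-model pipeline summarized above (a first trained model measuring the range of movement from image data, a second trained model generating corrective feedback) could be sketched as below. The class and method names are placeholders for illustration; real systems would wrap trained pose-estimation and coaching networks.

```python
# Hypothetical sketch of the two-model pipeline: image frames -> range of
# movement -> real-time corrective feedback. All names and values are
# illustrative assumptions.

class RangeOfMotionModel:
    """Stands in for the 'first trained model': image data -> movement range."""
    def measure(self, frames):
        # A real model would estimate a pose per frame and derive joint
        # angles; here we return a fixed illustrative measurement.
        return {"joint_angle_deg": 87.5, "displacement_cm": 12.0}

class FeedbackModel:
    """Stands in for the 'second trained model': movement range -> feedback."""
    def generate(self, rom):
        if rom["joint_angle_deg"] < 90.0:
            return "Extend the joint a little further on the next repetition."
        return "Good form - keep going."

def training_step(frames):
    """One real-time iteration: measure, then coach."""
    rom = RangeOfMotionModel().measure(frames)
    return FeedbackModel().generate(rom)

feedback = training_step(frames=[])  # a corrective feedback string
```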

Landscapes

  • Health & Medical Sciences (AREA)
  • Engineering & Computer Science (AREA)
  • Medical Informatics (AREA)
  • Biomedical Technology (AREA)
  • Public Health (AREA)
  • Primary Health Care (AREA)
  • General Health & Medical Sciences (AREA)
  • Epidemiology (AREA)
  • Physical Education & Sports Medicine (AREA)
  • Biophysics (AREA)
  • Life Sciences & Earth Sciences (AREA)
  • Data Mining & Analysis (AREA)
  • Databases & Information Systems (AREA)
  • Pathology (AREA)
  • Business, Economics & Management (AREA)
  • General Business, Economics & Management (AREA)
  • Rehabilitation Tools (AREA)

Abstract

Methods and systems are provided herein for providing rehabilitation training to a user in real-time. The method may comprise: obtaining image data of the user performing a movement of a body part, wherein the image data is collected using an imaging device located on a user device; measuring a range of the movement of the body part based on the image data using a first trained model; and generating a feedback in real-time to correct the movement using a second trained model.

Description

METHODS AND SYSTEMS FOR MUSCULOSKELETAL REHABILITATION
CROSS-REFERENCE TO RELATED APPLICATION
[0001] This application claims priority to U.S. Provisional Application No. 62/876,978 filed on July 22, 2019, the content of which is incorporated herein in its entirety.
BACKGROUND
[0002] Musculoskeletal rehabilitation has been a challenge due to the high cost of personal training and hardware requirements. Studies and expert opinion suggest that, for most patients, intensive post-operative rehabilitation can be followed up with education and a well-structured home exercise program. Unfortunately, the number of physiotherapists is insufficient to meet current demand.
[0003] Currently, physical therapy or rehabilitation is provided mainly by personal attention of a physical therapist who monitors and instructs a patient in the performance of certain exercises. Thus, costs for rehabilitation are high and compliance after a patient leaves a treatment center is relatively low. For example, existing App-based protocols are passive and the rehabilitation progress may not be objectively tracked. In another example, a remote therapist may use a camera or video recorder to record the movement of the patients for subsequent review. However, the setup process can be tedious and the camera equipment can be very expensive, and hiring a trainer or coach can be very costly. Also, analysis of a video replay of a movement does not allow for real-time feedback.
[0004] Thus, there is a need for methods and systems that can accurately measure user movement or mobility in real-time, deliver real-time feedback to users to coach a training process, and can be used in a variety of places without time consuming or expensive setup processes.
SUMMARY
[0005] The present disclosure provides methods and systems for providing physiotherapy, rehabilitation and/or training to a subject for treating disease, such as musculoskeletal disease, or providing exercise and education for trauma and orthopedic patients. The provided systems and methods may also be used for other preventive training purposes, for example, ensuring that a patient with developing arthritis does not start favoring a diseased joint. Systems and methods of the present disclosure may be used to achieve a specific rehabilitation goal, such as rehabilitation of a particular limb, or used for non-medical training, such as wellness exercise. [0006] Systems and methods of the present disclosure may utilize immersive technologies such as virtual reality (VR) and augmented reality (AR) enabled systems, coupled with artificial intelligence techniques (e.g., natural language processing, computer vision). In some embodiments, systems and methods of the present disclosure may provide real-time motion quantification using a user device camera alone without additional hardware requirements. In some cases, the motion quantification can be provided based on camera image data alone without requiring additional sensors. This may beneficially allow patients/users to be engaged in a rehabilitation process in a variety of places with easy setup and reduced cost.
[0007] During a training process, remote quantification of movement (e.g., joint movement) of a user may be provided by capturing data using a camera (e.g., camera on a user device, webcam or phone camera) and measuring a range of motion using a trained algorithm and computer vision techniques. Real-time feedback may be provided to the user through the user device and/or AR/VR system and a remote therapist in a range of communication modalities such as audio, haptic, or visual. In some cases, quantitative and qualitative analysis of the rehabilitation progress and real-time motion may be visually presented to users in user applications and to
surgeons/operators via a web-based dashboard.
[0008] In an aspect of the present disclosure, a method is provided for providing rehabilitation training to a user. The method comprises: obtaining image data of the user performing a movement of a body part, wherein the image data is collected using an imaging device located on a user device; measuring a range of the movement of the body part based on the image data using a first trained model; and generating a feedback in real-time to correct the movement using a second trained model. In some embodiments, the image data is two-dimensional image data. In some embodiments, the range of the movement comprises a joint angle and displacement of the body part in a three-dimensional space. In some embodiments, the first trained model is obtained using an unsupervised learning algorithm.
[0009] In some embodiments, the method further comprises providing the feedback using a virtual reality or augmented reality system. In some embodiments, the method further comprises generating one or more metrics including a contact force or counts of repetitions based on the image data. In some cases, the one or more metrics are derived from an estimated pose generated by the first trained model. In some embodiments, the second trained model processes the one or more metrics and the range of motion and outputs the feedback. In some embodiments, the first trained model or the second trained model is further improved using the image data.
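One metric named above, counts of repetitions, can be derived from a joint-angle time series produced by per-frame pose estimates. The sketch below is a hypothetical illustration; the angle thresholds and the hysteresis scheme are assumptions, not the disclosed models.

```python
# Hypothetical sketch: counting repetitions from a per-frame joint-angle
# series derived from an estimated pose. Thresholds are illustrative.

def count_repetitions(angles_deg, low=60.0, high=120.0):
    """Count reps as full low -> high -> low excursions of a joint angle.

    Two thresholds (hysteresis) avoid double-counting measurement jitter
    around a single threshold.
    """
    reps, in_extension = 0, False
    for a in angles_deg:
        if not in_extension and a >= high:
            in_extension = True            # reached the top of the movement
        elif in_extension and a <= low:
            in_extension = False           # returned to start: one rep done
            reps += 1
    return reps

# Three simulated knee-extension cycles
trace = [50, 80, 125, 90, 55, 70, 130, 58, 85, 122, 95, 52]
assert count_repetitions(trace) == 3
```

A model of this kind could feed its output, alongside the measured range of motion, into a second model that produces coaching feedback.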
[0010] In a related yet separate aspect, a system is provided for providing rehabilitation training to a user. The system comprises a server in communication with a computing device associated with a user, wherein the server comprises a memory for storing interactive media and a set of software instructions, and one or more processors configured to execute the set of software instructions to: obtain image data of the user performing a movement of a body part, wherein the image data is collected using an imaging device located on the user device; measure a range of the movement of the body part based on the image data using a first trained model; and generate a feedback in real-time to correct the movement using a second trained model.
[0011] In some embodiments, the image data is two-dimensional image data. In some embodiments, the range of the movement comprises a joint angle and displacement of the body part in a three-dimensional space. In some embodiments, the first trained model is developed using unsupervised learning algorithm. In some embodiments, the system further comprises a virtual reality or augmented reality system for providing the feedback. In some embodiments, the one or more processors are configured to further generate one or more metrics including a contact force or counts of repetitions based on the image data. In some cases, the one or more metrics are derived from an estimated pose generated by the first trained model. In some embodiments, the second trained model processes the one or more metrics and the range of motion and outputs the feedback. In some embodiments, the first trained model or the second trained model is further improved using the image data.
[0012] In yet another separate aspect, a tangible computer readable medium is provided for storing instructions that, when executed by a server, cause the server to perform a computer-implemented method for providing rehabilitation training to a user. The method comprises: obtaining image data of the user performing a movement of a body part, wherein the image data is collected using an imaging device located on a user device; measuring a range of the movement of the body part based on the image data using a first trained model; and generating a feedback in real-time to correct the movement using a second trained model. In some embodiments, the tangible computer readable medium further comprises providing the feedback using a virtual reality or augmented reality system.
[0013] Additional aspects and advantages of the present disclosure will become readily apparent to those skilled in this art from the following detailed description, wherein only illustrative embodiments of the present disclosure are shown and described. As will be realized, the present disclosure is capable of other and different embodiments, and its several details are capable of modifications in various obvious respects, all without departing from the disclosure. Accordingly, the drawings and description are to be regarded as illustrative in nature, and not as restrictive.
INCORPORATION BY REFERENCE
[0014] All publications, patents, and patent applications mentioned in this specification are herein incorporated by reference to the same extent as if each individual publication, patent, or patent application was specifically and individually indicated to be incorporated by reference. To the extent publications and patents or patent applications incorporated by reference contradict the disclosure contained in the specification, the specification is intended to supersede and/or take precedence over any such contradictory material.
BRIEF DESCRIPTION OF THE DRAWINGS
[0015] The novel features of the invention are set forth with particularity in the appended claims. A better understanding of the features and advantages of the present invention will be obtained by reference to the following detailed description that sets forth illustrative embodiments, in which the principles of the invention are utilized, and the accompanying drawings (also "Figure" and "FIG." herein), of which:
[0016] FIG. 1 illustrates an exemplary environment in which the physio rehabilitation platform described herein may be implemented;
[0017] FIG. 2 illustrates an example of implementing the physio rehabilitation platform;
[0018] FIG. 3 shows an exemplary pose estimation algorithm;
[0019] FIG. 4 shows exemplary components of a physio rehabilitation system, in accordance with embodiments of the invention;
[0020] FIG. 5 shows an exemplary user interface provided by the physio rehabilitation system;
[0021] FIG. 6 shows an exemplary user interface provided by the physio rehabilitation system;
[0022] FIG. 7 shows a flow chart of a method implemented by the physio rehabilitation platform.
DETAILED DESCRIPTION
[0023] While various embodiments of the invention have been shown and described herein, it will be obvious to those skilled in the art that such embodiments are provided by way of example only. Numerous variations, changes, and substitutions may occur to those skilled in the art without departing from the invention. It should be understood that various alternatives to the embodiments of the invention described herein may be employed.
[0024] The physio rehabilitation platform may utilize computer vision and artificial intelligence techniques to enable accurate quantification of a range of motion. The physio rehabilitation platform can enable real-time motion measurement and feedback generation applicable to certain healthcare areas (e.g., musculoskeletal disease, non-medical training, etc.). The physio rehabilitation platform can be used to help users effectively engage in a rehabilitation program remotely and with reduced hardware requirements. In some cases, only image data captured by a user device camera (e.g., phone camera, webcam, etc.) may be required for the pose estimation and motion quantification. Alternatively or in addition, VR and/or AR devices may be used in the physio rehabilitation platform for providing real-time feedback and training guidance.
[0025] A range of motion of a body part may comprise a movement of the body part in 3-dimensional (3D) space. The range of motion may be generated in terms of anthropomorphic constraints and joint angle limits of a human body articulation. The range of motion may include displacement along an X-axis, a Y-axis, and a Z-axis and joint angles (e.g., elevation angle, azimuth angle).
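The quantities named above (displacement along the X, Y, and Z axes, plus elevation and azimuth angles) can be derived from two 3D joint positions of a limb segment. The sketch below assumes a Z-up coordinate frame and hypothetical joint positions; it is illustrative only.

```python
# Illustrative derivation of displacement and elevation/azimuth angles for
# a limb segment (e.g., shoulder -> wrist). Coordinate frame is an assumption.
import math

def limb_orientation(proximal, distal):
    """Return (dx, dy, dz, elevation_deg, azimuth_deg) for a segment from
    the proximal joint to the distal joint in a Z-up frame."""
    dx, dy, dz = (d - p for p, d in zip(proximal, distal))
    horizontal = math.hypot(dx, dy)
    elevation = math.degrees(math.atan2(dz, horizontal))  # angle above XY plane
    azimuth = math.degrees(math.atan2(dy, dx))            # heading in XY plane
    return dx, dy, dz, elevation, azimuth

# Arm raised 45 degrees forward: shoulder at origin, wrist ahead and up
dx, dy, dz, elev, azim = limb_orientation((0, 0, 0), (0.5, 0.0, 0.5))
assert abs(elev - 45.0) < 1e-6 and abs(azim - 0.0) < 1e-6
```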
[0026] FIG. 1 illustrates an exemplary environment in which the physio rehabilitation platform described herein may be implemented. A physio rehabilitation platform 100 may include one or more user devices 101-1, 101-2, a server 120, a physio rehabilitation system 121, and a database 109, 123. The physio rehabilitation platform 100 may optionally comprise one or more VR/AR systems 105-1, 105-2. Each of the components 101-1, 101-2, 109, 123, 120, 105-1 and 105-2 may be operatively connected to one another via network 110 or any type of communication links that allows transmission of data from one component to another.
[0027] The physio rehabilitation system 121 may be configured to analyze input data (e.g., image data) from the user device in order to identify and quantify a range of motion of a body part or limb and to provide feedback information (e.g., guidance, quantification result, recommendation) to assist a user in correcting a pose or exercise. In some cases, the physio rehabilitation system 121 may also receive input data from the AR/VR system to supplement the data collected by the user device.
[0028] The physio rehabilitation system 121 may be implemented anywhere within the physio rehabilitation platform, and/or outside of the physio rehabilitation platform. In some embodiments, the physio rehabilitation system may be implemented on the server. In other embodiments, a portion of the physio rehabilitation system may be implemented on the user device. Additionally, a portion of the physio rehabilitation system may be implemented on the AR/VR system. Alternatively, the physio rehabilitation system may be implemented in one or more databases. The physio rehabilitation system may be implemented using software, hardware, or a combination of software and hardware in one or more of the above-mentioned components within the physio rehabilitation platform.
[0029] The user device 101-1, 101-2 may comprise an imaging sensor 107-1, 107-2 that serves as an imaging device. The imaging device 107-1, 107-2 may be on-board the user device. The imaging device can include hardware and/or software elements. In some embodiments, the imaging device may be a camera or imaging sensor operably coupled to the user device. In some alternative embodiments, the imaging device may be located external to the user device, and image data of a body part or limbs of the user may be transmitted to the user device via communication means as described elsewhere herein. The imaging device can be controlled by an application/software configured to take images or video of the user. In some embodiments, the camera may be configured to take a 2D image of at least a body part of the user. In some embodiments, the software and/or applications may be configured to control the camera on the user device to take images or video.
[0030] The imaging device 107-1, 107-2 may be a fixed lens or auto focus lens camera. A camera can be a movie or video camera that captures dynamic image data (e.g., video). A camera can be a still camera that captures static images (e.g., photographs). A camera may capture both dynamic image data and static images. A camera may switch between capturing dynamic image data and static images. Although certain embodiments provided herein are described in the context of cameras, it shall be understood that the present disclosure can be applied to any suitable imaging device, and any description herein relating to cameras can also be applied to other types of imaging devices. The camera may comprise optical elements (e.g., lens, mirrors, filters, etc.). The camera may capture color images (RGB images), greyscale images, and the like.
[0031] The imaging device 107-1, 107-2 may be a camera used to capture visual images of at least part of the human body. Any other type of sensor may be used, such as an infra-red sensor that may be used to capture thermal images of the human body. The imaging sensor may collect information anywhere along the electromagnetic spectrum, and may generate corresponding images accordingly.
[0032] In some embodiments, the imaging device may be capable of operation at a fairly high resolution. The imaging sensor may have a resolution of greater than or equal to about 100 mm, 50 mm, 10 mm, 5 mm, 2 mm, 1 mm, 0.5 mm, 0.1 mm, 0.05 mm, 0.01 mm, 0.005 mm, 0.001 mm, 0.0005 mm, or 0.0001 mm. The image sensor may be capable of collecting 4K or higher images.
[0033] The imaging device 107-1, 107-2 may capture an image frame or a sequence of image frames at a specific image resolution. In some embodiments, the image frame resolution may be defined by the number of pixels in a frame. In some embodiments, the image resolution may be greater than or equal to about 352x420 pixels, 480x320 pixels, 720x480 pixels, 1280x720 pixels, 1440x1080 pixels, 1920x1080 pixels, 2048x1080 pixels, 3840x2160 pixels, 4096x2160 pixels, 7680x4320 pixels, or 15360x8640 pixels.
[0034] The imaging device 107-1, 107-2 may capture a sequence of image frames at a specific capture rate. In some embodiments, the sequence of images may be captured at a rate less than or equal to about one image every 0.0001 seconds, 0.0002 seconds, 0.0005 seconds, 0.001 seconds, 0.002 seconds, 0.005 seconds, 0.01 seconds, 0.02 seconds, 0.05 seconds, 0.1 seconds, 0.2 seconds, 0.5 seconds, 1 second, 2 seconds, 5 seconds, or 10 seconds. In some embodiments, the capture rate may change depending on user input and/or external conditions (e.g. illumination brightness).
[0035] The imaging device 107-1, 107-2 may be configured to obtain image data to track motion or posture of a user. The imaging device may or may not be a 3D camera, stereo camera or depth camera. As described later herein, computer vision techniques and deep learning techniques may be used to reconstruct 3D pose using 2D imaging data. In some cases, the imaging device may be a monocular camera and images of the user may be taken from a single view/angle.
[0036] User device 101-1, 101-2 may comprise one or more imaging devices for capturing image data of one or more users 103-1, 103-2 co-located with the user device. The captured image data may then be analyzed by the physio rehabilitation system to measure a range of motion of a body part or limbs. For example, the image data may be processed to identify a predicted three-dimensional (3D) pose of the user and a range of motion/movement of the user may be measured with high precision and accuracy. In some cases, the image data may be 2D image data or video data. The image data may be color (e.g., RGB) images or 2D keypoints. In some cases, the image data may be raw data captured by a user device camera without extra setup or cost. Details about using computer vision and machine learning techniques for pose estimation are described later herein.
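As a simple illustration of working with 2D keypoints, the interior angle at a joint (e.g., the knee) can be computed directly from three keypoints in image coordinates. The keypoint source (e.g., an off-the-shelf 2D pose estimator) and the joint choice are assumptions for this sketch.

```python
# Illustrative 2D step: interior joint angle from three keypoints
# (hip, knee, ankle) in image coordinates. Keypoint source is assumed.
import math

def joint_angle_2d(a, b, c):
    """Interior angle at point b formed by segments b->a and b->c, in degrees."""
    v1 = (a[0] - b[0], a[1] - b[1])
    v2 = (c[0] - b[0], c[1] - b[1])
    dot = v1[0] * v2[0] + v1[1] * v2[1]
    n1 = math.hypot(*v1)
    n2 = math.hypot(*v2)
    return math.degrees(math.acos(dot / (n1 * n2)))

# Straight leg: hip, knee, ankle collinear -> ~180 degrees
assert abs(joint_angle_2d((0, 0), (0, 1), (0, 2)) - 180.0) < 1e-6
# Right-angle flexion -> ~90 degrees
assert abs(joint_angle_2d((0, 0), (0, 1), (1, 1)) - 90.0) < 1e-6
```

Angles measured this way are view-dependent, which is one motivation for the 3D pose reconstruction described in the surrounding text.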
[0037] User device 101-1, 101-2 may be a computing device configured to perform one or more operations consistent with the disclosed embodiments. Examples of user devices may include, but are not limited to, mobile devices, smartphones/cellphones, tablets, personal digital assistants (PDAs), laptop or notebook computers, desktop computers, media content players, television sets, video gaming station/system, virtual reality systems, augmented reality systems, microphones, or any electronic device capable of analyzing, receiving, providing or displaying certain types of feedback data (e.g., rehabilitation progress, motion quantification analysis) to a user. The user device may be a handheld object. The user device may be portable. The user device may be carried by a human user. In some cases, the user device may be located remotely from a human user, and the user can control the user device using wireless and/or wired communications.
[0038] User device 101-1, 101-2 may include one or more processors that are capable of executing non-transitory computer readable media that may provide instructions for one or more operations consistent with the disclosed embodiments. The user device may include one or more memory storage devices comprising non-transitory computer readable media including code, logic, or instructions for performing the one or more operations. The user device may include software applications that allow the user device to communicate with and transfer data between AR/VR system 105-1, 105-2, server 120, physio rehabilitation system 121, and/or database 109. The user device may include a communication unit, which may permit the communications with one or more other components in physio rehabilitation platform 121. In some instances, the communication unit may include a single communication module, or multiple communication modules. In some instances, the user device may be capable of interacting with one or more components in the physio rehabilitation platform 121 using a single communication link or multiple different types of communication links.
[0039] User device 101-1, 101-2 may include a display. The display may be a screen. The display may or may not be a touchscreen. The display may be a light-emitting diode (LED) screen, OLED screen, liquid crystal display (LCD) screen, plasma screen, or any other type of screen. The display may be configured to show a user interface (UI) or a graphical user interface (GUI) rendered through an application (e.g., via an application programming interface (API) executed on the user device). The GUI may show images, charts, and analytics relating to the rehabilitation progress and real-time quantification results, and the GUI may permit a user to view and receive feedback (e.g., guidance, recommendations) generated by the physio rehabilitation system. The user device may also be configured to display webpages and/or websites on the Internet. One or more of the webpages/websites may be hosted by server 120 and/or rendered by the physio rehabilitation system 121.
[0040] A user may navigate within the GUI through the application. For example, the user may select a link by directly touching the screen (e.g., touchscreen). The user may touch any portion of the screen by touching a point on the screen. Alternatively, the user may select a portion of an image with the aid of a user interactive device (e.g., mouse, joystick, keyboard, trackball, touchpad, button, verbal commands, gesture-recognition, attitude sensor, thermal sensor, touch-capacitive sensors, or any other device). A touchscreen may be configured to detect location of the user's touch, length of touch, pressure of touch, and/or touch motion, whereby each of the aforementioned manners of touch may be indicative of a specific input command from the user.
[0041] In some cases, users may utilize the user devices to interact with the physio rehabilitation system 121 by way of one or more software applications (i.e., client software) running on and/or accessed by the user devices, wherein the user devices and the physio rehabilitation system 121 may form a client-server relationship. For example, the user devices may run dedicated mobile applications or software applications for viewing real-time quantification results, interacting with a remote trainer or providing user input provided by the physio rehabilitation system 121. [0042] In some cases, the client software (i.e., software applications installed on the user devices 101-1, 101-2) may be available either as downloadable software or mobile applications for various types of computer devices. Alternatively, the client software can be implemented in a combination of one or more programming languages and markup languages for execution by various web browsers. For example, the client software can be executed in web browsers that support JavaScript and HTML rendering, such as Chrome, Mozilla Firefox, Internet Explorer, Safari, and any other compatible web browsers. The various embodiments of client software applications may be compiled for various devices, across multiple platforms, and may be optimized for their respective native platforms.
[0043] In some embodiments, the physio rehabilitation platform 100 may comprise using a virtual or augmented reality system to place the subject in a virtual world, where the subject can be presented with visual, auditory, and/or haptic stimulation of feedback in response to measured range of motion, monitoring the subject’s interaction with the virtual and/or real world, and measuring the subject’s progress toward one or more therapeutic/rehabilitation goals. The virtual reality (VR) or augmented reality (AR) system may comprise any device comprising one or more displays, such as a monitor, screen, mobile phone, computer, smartphone, laptop, tablet, television, smart television, or other device. For example, the user may access the experience or rehabilitation training through the use of supplemental headsets (e.g., Google®
Daydream/Cardboard, Oculus® Gear/Rift, HTC® Vive, etc.).
[0044] The virtual reality (VR) or augmented reality (AR) system 105-1, 105-2 and/or the physio rehabilitation system 121 may comprise or be coupled to any other suitable devices or equipment such as smartwatches, wristbands, glasses, gloves, headgear (such as hats, helmets, virtual reality headsets, augmented reality headsets, head-mounted devices (HMD), headbands), pendants, armbands, leg bands, shoes, vests, motion sensing devices, etc.
[0045] The virtual reality (VR) or augmented reality (AR) system may be communicatively coupled to one or more sensors. The one or more sensors may be integrated in the VR/AR device or external to, and operatively coupled to, the VR/AR device, such as via wired or wireless (e.g., Bluetooth, Wi-Fi, Near Field Communication (NFC), etc.) connections. The one or more sensors may be capable of collecting data on the user, such as the user’s interactions, reactions, and/or responses to one or more components and/or stimulations in the VR or AR experience. Examples of types of sensors may include inertial sensors (e.g., accelerometers, gyroscopes, and/or gravity detection sensors, which may form inertial measurement units (IMUs)), location sensors (e.g., global positioning system (GPS) sensors, mobile device transmitters enabling location triangulation), heart rate monitors, external temperature sensors, skin temperature sensors, capacitive touch sensors, sensors configured to detect a galvanic skin response (GSR), vision sensors (e.g., imaging devices capable of detecting visible, infrared, or ultraviolet light, such as cameras), proximity or range sensors (e.g., ultrasonic sensors, lidar, time-of-flight or depth cameras), altitude sensors, attitude sensors (e.g., compasses), pressure sensors (e.g., barometers), humidity sensors, vibration sensors, audio sensors (e.g., microphones), and/or field sensors (e.g., magnetometers, electromagnetic sensors, radio sensors).
[0046] In some cases, at least part of the subject's interactions, reactions, and/or responses to the stimulations presented in the virtual world can be quantified based at least in part on sensory data measured for the subject, such as a range of motion, posture, reaction time, response volume, and/or other forms or units of outputs by the subject. Alternatively, the subject's motion is quantified in real-time based on image data captured by the user device camera alone. Details about the motion quantification are described later herein.
[0047] Any examples herein of sensors that may be present in AR/VR system may also apply to the user device. For instance, one or more different sensors may be incorporated into the user device.
[0048] In some embodiments, feedback on the user's motion may be delivered in real-time in the form of the VR or AR experience. A user may receive visual, audible, haptic or other forms of feedback in the VR or AR experience. Alternatively or in addition, the feedback may be delivered without the VR or AR system. For instance, charts in a GUI, audible commands and the like may be delivered to the user via the user device. In some cases, the VR or AR experience may comprise one or more VR or AR scenes. For example, the VR or AR experience may comprise a time-dependent progression of one or more VR or AR scenes. The VR or AR scenes may be dynamic, such as comprising one or more dynamic components (e.g., animation, audio, etc.) and/or components that can be triggered to change. The user may be capable of interacting with, or reacting to or responding to, one or more components of the VR or AR scenes. The user may have a stereoscopic view of the one or more VR or AR scenes in the VR or AR
experience. The VR or AR experience can be a 360° experience. The VR or AR experience may be capable of presenting one or more stimulations, such as visual stimulations, audio
stimulations, and/or haptic stimulations. Alternatively or in addition, the one or more stimulations may be provided via one or more external devices operatively coupled to the AR/VR system, such as via wired or wireless connections. Such other devices can include, for example, other displays, screens, speakers, headphones, earphones, controllers, actuators, lamps, or other devices capable of providing visual, audio, and/or haptic output to the user.
[0049] User device 101-1, 101-2 and AR/VR system 105-1, 105-2 may be operated by one or more users consistent with the disclosed embodiments. In some embodiments, a user may be associated with a unique user device and an AR/VR system. Alternatively, a user may be associated with a plurality of user devices and AR/VR systems. A user as described herein may refer to an individual or a group of individuals who are seeking rehabilitation, exercise training, or to improve their well-being through the physio rehabilitation system. In some cases, a user may be an individual suffering from musculoskeletal disease (MSK). A user as described herein may also include an individual or a group of individuals who are seeking education about rehabilitation training, such as therapists, service providers or trainers. A user as described herein may also include a real therapist, physician, supervisor or operator who is monitoring or coaching a patient/user through the physio rehabilitation system.
[0050] User device 101-1, 101-2 and/or the AR/VR system 105-1, 105-2 may be configured to receive input from one or more users. A user may provide an input to the user device using an input device, for example, a keyboard, a mouse, a touch-screen panel, voice recognition and/or dictation software, a built-in input device of the AR/VR system, or any combination of the above. The user input may include statements, comments, questions, or answers relating to rehabilitation or training, such as interaction with a therapist or supervisor in real life through the platform, or alternatively as an avatar in the virtual environment. The user input may be Patient-Reported Outcome Measures (PROMs) data, which provide precise and accurate information about the rehabilitation outcome. The PROMs data may be collected in a multimodal fashion. A proprietary portal or personalized questions may be designed to collect PROMs data. Alternatively or in addition, standard questions, such as the full Knee Injury and Osteoarthritis Outcome Score (KOOS) questionnaire plus the PROMIS-10 Global (10 questions) for knee replacement, may be used. These user inputs can be important for training a personalized rehabilitation/coaching model.
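As one illustration of how PROMs responses might be turned into a quantitative outcome measure, the following sketch applies the standard KOOS-style normalization (items scored 0-4, converted to a 0-100 scale where 100 means no symptoms). The item values shown are illustrative only, not actual patient data:

```python
def koos_subscale_score(item_responses):
    """Convert KOOS-style item responses (each 0-4, where 0 = no
    problems and 4 = extreme problems) to a 0-100 scale, where
    100 indicates no symptoms. Standard KOOS normalization:
    100 - (mean item score * 100 / 4)."""
    if not item_responses:
        raise ValueError("at least one item response is required")
    mean_raw = sum(item_responses) / len(item_responses)
    return 100.0 - (mean_raw * 100.0 / 4.0)

# Illustrative responses for a hypothetical pain subscale
pain_items = [1, 2, 1, 0, 2, 1, 1, 0, 1]
score = koos_subscale_score(pain_items)  # mean 1.0 -> score 75.0
```

A score computed this way can be tracked over time alongside the motion-derived metrics described elsewhere herein.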
[0051] Server 120 may be one or more server computers configured to perform one or more operations consistent with the disclosed embodiments. In one aspect, the server may be implemented as a single computer, through which the user device and AR/VR systems are able to communicate with the physio rehabilitation system and database. In some embodiments, the user device and/or the AR/VR system communicate with the physio rehabilitation system directly through the network. In some embodiments, the server may communicate on behalf of the user device and/or the AR/VR systems with the physio rehabilitation system or database through the network. In some embodiments, the server may embody the functionality of one or more physio rehabilitation systems. In some embodiments, one or more physio rehabilitation systems may be implemented inside and/or outside of the server. For example, the physio rehabilitation systems may be software and/or hardware components included with the server or remote from the server. [0052] In some embodiments, the user device and/or the AR/VR system may be directly connected to the server through a separate link (not shown in FIG. 1). In certain embodiments, the server may be configured to operate as a front-end device configured to provide access to the physio rehabilitation system consistent with certain disclosed embodiments. The server may, in some embodiments, utilize one or more physio rehabilitation systems to analyze data streams from the user device and/or AR/VR system in order to quantify a range of motion, contact force, gait or other rehabilitation metrics, and to provide feedback information (e.g., coaching instructions, commands, recommendations) to assist the user in correcting a posture. The server may also be configured to store, search, retrieve, and/or analyze data and information stored in one or more of the databases.
The data and information may include raw data collected from the imaging device on the user device, as well as each user's historical data patterns, rehabilitation metrics (e.g., pain level, gait, range of motion, contact force, etc.), medical records and user-provided information.
[0053] A server may include a web server, an enterprise server, or any other type of computer server, and can be computer programmed to accept requests (e.g., HTTP, or other protocols that can initiate data transmission) from a computing device (e.g., user device and/or wearable device) and to serve the computing device with requested data. In addition, a server can be a broadcasting facility, such as a free-to-air, cable, satellite, or other broadcasting facility, for distributing data. A server may also be a server in a data network (e.g., a cloud computing network).
[0054] A server may include known computing components, such as one or more processors, one or more memory devices storing software instructions executed by the processor(s), and data. A server can have one or more processors and at least one memory for storing program instructions. The processor(s) can be a single or multiple microprocessors, field programmable gate arrays (FPGAs), or digital signal processors (DSPs) capable of executing particular sets of instructions. Computer-readable instructions can be stored on a tangible non-transitory computer-readable medium, such as a hard disk, a CD-ROM (compact disk-read only memory), an MO (magneto-optical) disk, a DVD-ROM (digital versatile disk-read only memory), a DVD-RAM (digital versatile disk-random access memory), or a semiconductor memory. Alternatively, the methods can be implemented in hardware components or combinations of hardware and software such as, for example, ASICs, special purpose computers, or general purpose computers.
[0055] While FIG. 1 illustrates the server as a single server, in some embodiments, multiple devices may implement the functionality associated with the server. [0056] Network 110 may be a network that is configured to provide communication between the various components illustrated in FIG. 1. The network may be implemented, in some embodiments, as one or more networks that connect devices and/or components in the network layout for allowing communication between them. For example, user device 101-1, 101-2, AR/VR system 105-1, 105-2, and physio rehabilitation system 121 may be in operable communication with one another over network 110. Direct communications may be provided between two or more of the above components. The direct communications may occur without requiring any intermediary device or network. Indirect communications may be provided between two or more of the above components. The indirect communications may occur with the aid of one or more intermediary devices or networks. For instance, indirect communications may utilize a telecommunications network. Indirect communications may be performed with the aid of one or more routers, communication towers, satellites, or any other intermediary devices or networks. Examples of types of communications may include, but are not limited to: communications via the Internet, Local Area Networks (LANs), Wide Area Networks (WANs), Bluetooth, Near Field Communication (NFC) technologies, networks based on mobile data protocols such as General Packet Radio Services (GPRS), GSM, Enhanced Data GSM Environment (EDGE), 3G, 4G, 5G or Long Term Evolution (LTE) protocols, Infra-Red (IR) communication technologies, and/or Wi-Fi, and may be wireless, wired, or a combination thereof. In some embodiments, the network may be implemented using cell and/or pager networks, satellite, licensed radio, or a combination of licensed and unlicensed radio.
[0057] User device 101-1, 101-2, AR/VR system 105-1, 105-2, server 120, and/or physio rehabilitation system 121 may be connected or interconnected to one or more databases 109, 123. The databases may be one or more memory devices configured to store data. Additionally, the databases may also, in some embodiments, be implemented as a computer system with a storage device. In one aspect, the databases may be used by components of the network layout to perform one or more operations consistent with the disclosed embodiments. One or more local databases and cloud databases of the platform may utilize any suitable database techniques. For instance, structured query language (SQL) or “NoSQL” databases may be utilized for storing the image data, user data, historical data, and predictive models or algorithms. Some of the databases may be implemented using various standard data structures, such as an array, hash, (linked) list, struct, structured text file (e.g., XML), table, JavaScript Object Notation (JSON), NoSQL and/or the like. Such data structures may be stored in memory and/or in (structured) files. In another alternative, an object-oriented database may be used. Object databases can include a number of object collections that are grouped and/or linked together by common attributes; they may be related to other object collections by some common attributes. Object-oriented databases perform similarly to relational databases with the exception that objects are not just pieces of data but may have other types of functionality encapsulated within a given object. In some embodiments, the database may include a graph database that uses graph structures for semantic queries, with nodes, edges and properties to represent and store data. If the database is implemented as a data structure, it may be integrated into another component of the systems described herein. Also, the database may be implemented as a mix of data structures, objects, and relational structures. Databases may be consolidated and/or distributed in variations through standard data processing techniques. Portions of databases, e.g., tables, may be exported and/or imported and thus decentralized and/or integrated.
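For instance, a relational store for per-user rehabilitation metrics could be sketched with an in-memory SQLite database as follows; the table name, columns, and values here are illustrative assumptions rather than the platform's actual schema:

```python
import sqlite3

# In-memory database standing in for the platform's session store;
# the schema (table/column names) is illustrative only.
conn = sqlite3.connect(":memory:")
conn.execute(
    """CREATE TABLE rehab_sessions (
           user_id TEXT,
           recorded_at TEXT,
           metric TEXT,      -- e.g. 'rom_knee_deg', 'pain_level'
           value REAL
       )"""
)
rows = [
    ("patient-1", "2020-07-01T10:00:00", "rom_knee_deg", 95.0),
    ("patient-1", "2020-07-08T10:00:00", "rom_knee_deg", 104.5),
    ("patient-1", "2020-07-08T10:00:00", "pain_level", 3.0),
]
conn.executemany("INSERT INTO rehab_sessions VALUES (?, ?, ?, ?)", rows)

# Retrieve a user's historical range-of-motion pattern, newest first
cur = conn.execute(
    "SELECT recorded_at, value FROM rehab_sessions "
    "WHERE user_id = ? AND metric = ? ORDER BY recorded_at DESC",
    ("patient-1", "rom_knee_deg"),
)
history = cur.fetchall()
```

The same records could equally be held as JSON documents in a NoSQL store, as the paragraph above notes; the relational form is shown only because it is compact to demonstrate.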
[0058] In some embodiments, the data management system may construct the database for fast and efficient data retrieval, query and delivery. For example, the physio rehabilitation system may provide customized algorithms to extract, transform, and load (ETL) the data. In some embodiments, the physio rehabilitation system may construct the databases using proprietary database architecture or data structures to provide an efficient database model that is adapted to large scale databases, is easily scalable, is efficient in query and data retrieval, or has reduced memory requirements in comparison to using other data structures.
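A minimal sketch of such an extract-transform-load step might look like the following; the field names and the radians-to-degrees transform are illustrative assumptions, not the platform's actual pipeline:

```python
import math

def extract(raw_records):
    # Keep only records that carry a motion measurement
    return [r for r in raw_records if "rom_rad" in r]

def transform(records):
    # Convert radians to degrees and attach a metric tag
    return [
        {"user_id": r["user_id"],
         "metric": "rom_deg",
         "value": round(math.degrees(r["rom_rad"]), 1)}
        for r in records
    ]

def load(records, store):
    # 'store' stands in for the platform database
    store.extend(records)
    return len(records)

store = []
raw = [{"user_id": "p1", "rom_rad": 1.9}, {"user_id": "p1", "note": "skip"}]
loaded = load(transform(extract(raw)), store)
```

In a deployed system each stage would be a customized algorithm as described above; the point of the sketch is only the extract/transform/load separation.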
[0059] In one embodiment, the databases may comprise storage containing a variety of data consistent with disclosed embodiments. For example, the databases may store, for example, raw data collected by the imaging device located on user device. The databases may also store user information, historical data patterns, data relating to a rehabilitation/training progress, medical records, analytics, user input (e.g., statements or comments indicative of how the user is feeling at different points in time, etc.), predictive models, algorithms, training datasets (e.g., video clips), and the like.
[0060] In certain embodiments, one or more of the databases may be co-located with the server, may be co-located with one another on the network, or may be located separately from other devices. One of ordinary skill will recognize that the disclosed embodiments are not limited to the configuration and/or arrangement of the database(s).
[0061] Although particular computing devices are illustrated and networks described, it is to be appreciated and understood that other computing devices and networks can be utilized without departing from the spirit and scope of the embodiments described herein. In addition, one or more components of the network layout may be interconnected in a variety of ways, and may in some embodiments be directly connected to, co-located with, or remote from one another, as one of ordinary skill will appreciate. [0062] A server may access and execute physio rehabilitation system(s) to perform one or more processes consistent with the disclosed embodiments. In certain configurations, the physio rehabilitation system(s) may be software stored in memory accessible by a server (e.g., in memory local to the server or remote memory accessible over a communication link, such as the network). Thus, in certain aspects, the physio rehabilitation system(s) may be implemented as one or more computers, as software stored on a memory device accessible by the server, or a combination thereof. For example, one physio rehabilitation system may be a computer executing one or more motion quantification algorithms and coaching algorithms, and another physio rehabilitation system may be software that, when executed by a server, performs one or more motion quantification algorithms and coaching algorithms.
[0063] Although the physio rehabilitation system 121 is shown hosted on the server 120, the physio rehabilitation system may be implemented as a hardware accelerator, software executable by a processor, and various others. In some embodiments, the physio rehabilitation system may employ an edge intelligence paradigm in which data processing and prediction are performed at the edge or an edge gateway. In some instances, a machine learning model may be built, developed and trained on the cloud/data center 120 and run on the user device and/or AR/VR systems (e.g., hardware accelerator). Details about the physio rehabilitation system and its components are described later herein with respect to FIG. 4.
[0064] In some cases, the subject’s progress toward one or more therapeutic or rehabilitation goals can be quantified based at least in part on the motion quantification, contact force calculation or other metrics. In some cases, the subject’s progress toward one or more therapeutic goals can also be measured based on qualitative observations made by another user monitoring the subject’s interactions, reactions, and responses in the virtual world, such as a therapist, operator, or service provider. For example, the subject’s progress toward one or more therapeutic goals can be measured based on objective feedback from the virtual coach or a coaching module, or subjective feedback from a remote real therapist or service provider.
[0065] The functions of the physio rehabilitation system and its communication with the user device, user and AR/VR system will be described in detail below with reference to FIG. 2.
Although various embodiments are described herein using physio rehabilitation as an example, it should be noted that the disclosure is not limited thereto, and can be applied to other types of wellness exercise, training and activities beyond medical purposes.
[0066] FIG. 2 illustrates an example of implementing the physio rehabilitation platform 200. Referring to FIG. 2, a physio rehabilitation platform 200 may comprise a user device 201 deployed in a location 203 remote from a supervising location 213, where a therapist or physician is permitted to access real-time analysis of the user motion through a therapist portal 211. The user device 201 can be the same as the user device described in FIG. 1. The therapist portal 211 may be provided within an application running on a computing device located with the therapist. The therapist portal 211 may provide real-time motion analysis (e.g., quantification of motion) and historical statistics (e.g., charts, records, patient information) about the user. Both the therapist portal 211 and the user application are provided by the physio rehabilitation system 121. The computing device for the therapist to access the portal 211 may or may not comprise a camera.
[0067] As previously described, the physio rehabilitation system may be implemented both inside and/or outside of a server. For example, the physio rehabilitation system may be software and/or hardware components included with a server, or remote from the server. In some embodiments, the physio rehabilitation system (or one or more functions of the physio rehabilitation system) may be implemented on the user device and/or the therapist device.
Alternatively, the user device, therapist device, and/or server may be configured to perform different functions of the physio rehabilitation system. Optionally, one or more functions of the physio rehabilitation system may be duplicated across the user device, therapist device, and/or server.
[0068] In the example of FIG. 2, user device 201 may comprise at least one imaging device.
For example, the user device 201 may comprise a camera for capturing video of the user. The captured data stream may be processed locally at the user device and/or remotely at the physio rehabilitation system 121 to quantify a range of motion of a body part of the user.
[0069] The user device 201 may be configured to provide input data to the physio rehabilitation system. The input data may comprise sensor data and user input as described above.
[0070] The sensor data may comprise raw data collected by the imaging device on the user device. The sensor data may be stored in memory located on the user device, the therapist device, and/or server. In some embodiments, the sensor data may be stored in one or more databases. The databases may be located on the server, coupled to user device, and/or therapist device. Alternatively, the databases may be located remotely from the server, user device, and/or the therapist device.
[0071] The user input may be provided by a user via the user device and/or the user application running on the user device. The user input may be in response to questions provided by the physio rehabilitation system. Examples of questions may relate to pain level and mobility of a body part (e.g., ankle, knee, etc.). The user's responses to those questions may be used to supplement the image data to determine the personalized coaching or feedback. This information obtained from the user input can be analyzed using machine learning techniques (e.g., natural language processing) and computer vision methods. For example, an NLP engine may be utilized to process the input data (e.g., input text captured from the survey, voice input, etc.) and produce a structured output including the linguistic information. The NLP engine may employ any suitable NLP techniques, such as a parser to perform parsing on the input text. A parser may include instructions for syntactically, semantically, and lexically analyzing the text content of the user input and identifying relationships between text fragments in the user input. The parser makes use of syntactic and morphological information about individual words found in the dictionary or “lexicon” or derived through morphological processing (organized in the lexical analysis stage).
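As a simplified stand-in for such an NLP engine, the sketch below lexically scans free-text user input for a self-reported pain level and a mentioned body part. A production system would use a full syntactic/semantic parser as described above; the vocabulary and patterns here are purely illustrative:

```python
import re

# Illustrative lexicon of body parts the parser recognizes
BODY_PARTS = {"knee", "ankle", "hip", "shoulder", "elbow", "wrist"}

def parse_user_input(text):
    """Produce a small structured record from free-text user input:
    a pain level on a 0-10 scale (if stated) and any mentioned
    body parts."""
    lowered = text.lower()
    tokens = re.findall(r"[a-z0-9]+", lowered)
    pain = None
    m = re.search(r"pain.*?(\d+)", lowered)
    if m:
        pain = min(10, int(m.group(1)))  # clamp to the 0-10 scale
    parts = sorted(BODY_PARTS.intersection(tokens))
    return {"pain_level": pain, "body_parts": parts}

structured = parse_user_input("My knee pain is about a 6 today after the squats.")
```

The structured output could then supplement the image data when determining personalized coaching, as the paragraph above describes.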
[0072] The physio rehabilitation system 121 may be configured to obtain and analyze data from at least one imaging device located on the user device and/or from the user device itself. For example, the physio rehabilitation system may be configured to analyze the image data stream to quantitatively and qualitatively identify a range of motion, gait, contact forces or other metrics related to a rehabilitation progress, and generate real-time feedback and analytics for coaching the patient. As previously described, data related to the user and methods such as the pose prediction algorithm, quantification algorithm, coaching algorithm and user interaction algorithm may be stored in a database 211 accessible by the physio rehabilitation system 121.
[0073] The physio rehabilitation system 121 may employ a machine learning and computer vision-based pose estimation algorithm to provide accurate quantification of range of motion based on image data. The pose estimation algorithm may be capable of quantifying range of motion using 2D image data with improved precision and accuracy. In some cases, the pose estimation algorithm may also be capable of calculating contact force or other derivative metrics based on 2D image data.
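By way of illustration, once a pose estimator has produced 2D keypoints, a joint's range of motion can be quantified from the angle formed at the joint. The sketch below computes the angle at a middle keypoint (e.g., hip-knee-ankle for knee flexion); the keypoint coordinates are illustrative, not real estimator output:

```python
import math

def joint_angle_deg(a, b, c):
    """Angle at joint b (degrees) formed by 2D keypoints a-b-c,
    e.g. hip-knee-ankle for knee flexion. Keypoints are (x, y)
    pixel coordinates such as those produced by a pose estimator."""
    v1 = (a[0] - b[0], a[1] - b[1])
    v2 = (c[0] - b[0], c[1] - b[1])
    dot = v1[0] * v2[0] + v1[1] * v2[1]
    n1 = math.hypot(*v1)
    n2 = math.hypot(*v2)
    # Clamp to guard against floating-point drift outside [-1, 1]
    cos_theta = max(-1.0, min(1.0, dot / (n1 * n2)))
    return math.degrees(math.acos(cos_theta))

# Illustrative keypoints: hip, knee, ankle in image coordinates
hip, knee, ankle = (100, 100), (100, 200), (200, 200)
angle = joint_angle_deg(hip, knee, ankle)  # 90.0 for this right angle
```

Tracking this angle over a session yields the range-of-motion metric discussed throughout this disclosure.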
[0074] The pose estimation algorithm can be any type of machine learning network, such as a neural network. Examples of neural networks include a deep neural network, convolutional neural network (CNN), and recurrent neural network (RNN). The machine learning algorithm may comprise one or more of the following: a support vector machine (SVM), a naive Bayes classification, a linear regression, a quantile regression, a logistic regression, a random forest, a neural network, CNN, RNN, a gradient-boosted classifier or regressor, or another supervised or unsupervised machine learning algorithm (e.g., generative adversarial network (GAN), Cycle-GAN, etc.).
[0075] FIG. 3 shows an exemplary pose estimation algorithm 300. The pose estimation algorithm 300 may be an unsupervised learning approach to recover 3D human pose from 2D skeletal joints extracted from a single image. The input 2D pose may be extracted from 2D image data captured by the user device camera as described above. As shown in the figure, the pose estimation algorithm may not require any multi-view image data, 3D skeletons, correspondences between 2D-3D points, or use of previously learned 3D priors during training.
[0076] The pose estimation algorithm 300 may comprise a lifting network 303 that is trained to estimate a 3D skeleton from 2D poses. The lifting network may accept 2D landmarks 301 as inputs and generate a corresponding 3D skeleton estimate 305. During training, the recovered 3D skeleton is re-projected on random camera view-points to generate new “synthetic” 2D poses 305. By lifting the synthetic 2D poses back to 3D and re-projecting them in the original camera view, a self-consistency loss both in 3D and in 2D may be defined.
[0077] The training can be self-supervised by exploiting the geometric self-consistency of the lift-reproject-lift process. The pose estimation algorithm 300 may also comprise a 2D pose discriminator 307 to enable the lifter to output valid 3D poses. In some cases, an unsupervised 2D domain adapter network is trained to allow for an expansion of 2D data. This improves results and demonstrates the usefulness of 2D pose data for unsupervised 3D lifting. The output of the machine learning model (i.e., the lifting network) is a 3D skeleton, which may be further analyzed for calculating a range of motion and/or contact force. For example, the contact force may be predicted based on the estimated pose using a trained model or any other suitable models. For instance, the model may be trained using labeled data (e.g., data pairs of force and a range of motion) that maps a pose to a predicted contact force.
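The lift-reproject-lift cycle and its 2D self-consistency loss can be sketched as follows. The lifting function below is only a placeholder that appends zero depth (a trained lifting network would predict real depths, driving this loss toward zero during training); the pose, orthographic camera model, and single-axis rotation are simplifying assumptions:

```python
import math

def rotate_y(p, theta):
    # Rotate a 3D point about the vertical (y) axis, standing in
    # for a synthetic camera view-point
    x, y, z = p
    c, s = math.cos(theta), math.sin(theta)
    return (c * x + s * z, y, -s * x + c * z)

def project(p):
    # Orthographic projection onto the camera's image plane
    return (p[0], p[1])

def naive_lift(pose_2d):
    # Placeholder for the trained lifting network: append zero depth.
    # A trained lifter would predict plausible depths instead.
    return [(x, y, 0.0) for x, y in pose_2d]

def self_consistency_loss(lift, pose_2d, theta):
    """One lift-reproject-lift cycle: lift to 3D, re-project on a
    synthetic camera rotated by 'theta', lift again, rotate back to
    the original view, and compare with the input 2D pose."""
    skel_3d = lift(pose_2d)
    synthetic_2d = [project(rotate_y(p, theta)) for p in skel_3d]
    skel_3d_again = lift(synthetic_2d)
    reproj_2d = [project(rotate_y(p, -theta)) for p in skel_3d_again]
    sq_errs = [
        (u - x) ** 2 + (v - y) ** 2
        for (u, v), (x, y) in zip(reproj_2d, pose_2d)
    ]
    return sum(sq_errs) / len(sq_errs)

pose_2d = [(0.0, 1.0), (0.5, 0.5), (1.0, 0.0)]
loss = self_consistency_loss(naive_lift, pose_2d, theta=math.pi / 3)
```

Because the placeholder lifter ignores depth, the loss is nonzero for a rotated view; minimizing this quantity over many poses and random view-points is what trains the lifting network to produce geometrically consistent 3D skeletons.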
[0078] The training dataset may include single-frame 2D images that need not be drawn from a video. Alternatively, the training dataset may include video data or motion captured by different types of imaging devices from diverse viewpoints. A video may contain one or more subjects in one frame performing an array of actions. When video data is available, temporal 2D pose sequences (e.g., video sequences of actions) can improve the accuracy of the single-frame lifting network. Although the pose estimation algorithm is described herein using unsupervised machine learning as an example, it should be noted that the disclosure is not limited thereto, and supervised learning and/or other approaches can be used.
[0079] The quantitative analysis of the range of motion, contact forces and/or other metrics (e.g., counts of repetitions) may then be used as input to a coaching algorithm for generating real-time feedback. The coaching algorithm may utilize machine learning techniques (e.g., natural language processing) to generate interactive information delivered to the user. The coaching algorithm may comprise voice recognition technology to enable rehabilitation training for the user in the VR or AR space. In some cases, the feedback may be delivered through the AR/VR system. In some cases, in addition to the input described above, user input such as the content of responses to avatar-based questions and the emotion, tone and other user responses captured through the AR/VR system or user device may also be analyzed by the coaching algorithm so as to generate real-time personalized feedback. The feedback may comprise, for example, a count of repetitions being performed, a warning of an incorrect or incomplete pose, a suggestion or recommendation for correcting a pose, and various others. The feedback may be delivered to the user through the AR or VR space and/or through the user device.
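For example, one simple coaching input, a repetition count, might be derived from a joint-angle time series by detecting threshold crossings, as in the sketch below; the thresholds and feedback strings are illustrative assumptions, not clinical values or the platform's actual coaching output:

```python
def count_repetitions(angle_series, down_thresh=100.0, up_thresh=160.0):
    """Count exercise repetitions from a time series of joint angles
    (degrees), e.g. knee flexion during squats: one rep = dropping
    below 'down_thresh' and then returning above 'up_thresh'."""
    reps = 0
    in_bottom = False
    for angle in angle_series:
        if not in_bottom and angle < down_thresh:
            in_bottom = True      # reached the bottom of a rep
        elif in_bottom and angle > up_thresh:
            in_bottom = False     # completed the rep
            reps += 1
    return reps

def coaching_feedback(reps, target_reps):
    # Toy feedback generator standing in for the coaching algorithm
    if reps >= target_reps:
        return f"Great work: {reps}/{target_reps} repetitions complete."
    return f"{reps}/{target_reps} done - keep going."

# Illustrative knee-flexion angles across two squats
angles = [170, 150, 95, 90, 120, 165, 170, 98, 130, 168]
reps = count_repetitions(angles)  # 2 complete repetitions
```

A real coaching module would combine such metrics with NLP-derived user state, as described above, before generating feedback.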
[0080] FIG. 4 shows exemplary components of a physio rehabilitation system 420, in accordance with embodiments of the invention. In some embodiments, the physio rehabilitation system 420 may comprise a pose estimation module 421, a coaching module 423 and a user interface (UI) module 425. The physio rehabilitation system 420 may be in communication with one or more client terminals 410, 410-N. The client terminal 410 may include client software, which includes a data processing module 411 and a data communication module 415. The pose estimation module 421 may be configured to analyze the data stream from the client terminal 410 to quantify a range of motion in real-time using algorithms described above. The coaching module 423 may be configured to generate real-time feedback for coaching a user using algorithms described elsewhere herein.
[0081] The physio rehabilitation system 420 or a portion of the physio rehabilitation system 420 may be implemented on an edge intelligence platform. For example, a predictive model such as the pose estimation model or a coaching model may be a software-based solution based on fog computing concepts, which extend data processing and prediction closer to the edge (e.g., the client terminal). Maintaining close proximity to the edge devices (e.g., camera, user device, AR/VR system), rather than sending all data to a distant centralized cloud, minimizes latency, allowing for maximum performance, faster response times, and more effective maintenance and operational strategies. It also significantly reduces overall bandwidth requirements and the cost of managing widely distributed networks.
[0082] The provided physio rehabilitation system 420 may employ an edge intelligence paradigm in which at least a portion of data processing can be performed at the edge. In some instances, the predictive model for pose estimation may be built, developed, trained, and maintained by the pose estimation module 421 (on the cloud), and run on the edge device or client terminal (e.g., hardware accelerator). Alternatively or in addition, image data may be transmitted from the client terminal to the pose estimation module for real-time motion quantification.
[0083] In some embodiments, one or more trained models may be capable of accounting for the variability among users (e.g., patients) and continually improving without relying on supervised features (e.g., labeled data). The one or more predictive models may be adapted to individuals. The platform may provide adaptive models with continual training or improvement after deployment. The predictive model provided by the platform may be dynamically adjusted and tuned to adapt to different users over time. The predictive model provided by the platform may be improved continuously over time (e.g., during implementation, after deployment). Such continual training and improvement may be performed automatically with little user input or user intervention. The platform may also allow a physician or a patient's caretakers to monitor the movement or exercises. The provided methods and systems can be applied in various scenarios, such as in the cloud or an on-premises environment.
[0084] In some embodiments, the physio rehabilitation system 420 may employ an edge intelligence paradigm in which data processing and prediction are performed at the edge or edge gateway 410. In some cases, an adaptive predictive model may be built, developed and trained on the cloud/data center 420 and run on the user device (e.g., hardware accelerator) for inference. For example, the lifting network may be pre-trained on the cloud and transmitted to the user device for implementation, and the continual training of the lifting network may then be performed on the cloud as new image data are collected. In such cases, a fixed model may be implemented in the physical system, with the training and further tuning of the model performed on the cloud. Image data and/or user feedback (if available) may be transmitted to the remote server 420 and used to update the model, and the updated model (e.g., the parameters of the model that are updated) may be downloaded to the physical system (e.g., user device, VR/AR device) for implementation.
[0085] The data processing module 411 may comprise a trained pose estimation model to quantify a range of motion based on the raw image data locally at the client terminal. The output may be a 3D pose or a range of motion, and the output may be transmitted by the data communication module 415 to the physio rehabilitation system 420 for further analysis. In some cases, the data processing module 411 may not perform motion quantification of the image data. The data processing module 411 may pre-process the raw image data, including but not limited to, ingestion of sensor data into a local storage repository (e.g., a local time-series database), data cleansing, data enrichment (e.g., decorating data with metadata), data alignment, data annotation, data tagging, or data aggregation. The pre-processed image data may then be transmitted to the physio rehabilitation system 420 for quantification analysis.
[0086] The data communication module 415 may be configured to transmit the streamed image data or data processed by the data processing module 411 to the physio rehabilitation system 420. In some cases, the data communication module may be configured to aggregate the raw data across a time duration (e.g., about 1, 2, 3, 4, 5, 6, 7, 8, 9, 10 seconds, about 1, 2, 3, 4, 5, 6, 7, 8, 9, 10 minutes, about 1, 2, 3, 4, 5, 6, 7, 8, 9, 10 hours, etc.). Alternatively or in addition, raw data may be aggregated across data types (e.g., voice data, user input, image data, etc.) or sources and sent to a remote entity (e.g., third-party application server, hospital, etc.) as a package. [0087] The pose estimation module or the coaching module may be implemented in software, hardware, firmware, embedded hardware, standalone hardware, application-specific hardware, or any combination of these. The pose estimation module or the coaching module, edge computing platform, and techniques described herein may be realized in digital electronic circuitry, integrated circuitry, specially designed ASICs (application specific integrated circuits), computer hardware, firmware, software, and/or combinations thereof. These systems, devices, and techniques may include implementation in one or more computer programs that are executable and/or interpretable on a programmable system including at least one programmable processor, which may be special or general purpose, coupled to receive data and instructions from, and to transmit data and instructions to, a storage system, at least one input device, and at least one output device. These computer programs (also known as programs, software, software applications, or code) may include machine instructions for a programmable processor, and may be implemented in a high-level procedural and/or object-oriented programming language, and/or in assembly/machine language.
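The time-window aggregation described above might be sketched as follows, grouping timestamped sensor samples into fixed windows and summarizing each window before transmission; the 5-second window and mean summary are illustrative choices, not the module's actual parameters:

```python
from collections import defaultdict

def aggregate_by_window(samples, window_s=5):
    """Group (timestamp_seconds, value) samples into fixed-width
    time windows, standing in for the data communication module's
    aggregation step before sending a package to a remote entity."""
    windows = defaultdict(list)
    for t, value in samples:
        windows[int(t // window_s)].append(value)
    # Summarize each window (mean) to cut transmitted bandwidth
    return {w: sum(vals) / len(vals) for w, vals in sorted(windows.items())}

# Illustrative sensor readings: (seconds since session start, value)
samples = [(0.5, 10.0), (2.0, 14.0), (6.1, 20.0), (7.9, 22.0)]
packages = aggregate_by_window(samples)  # one summary value per window
```

A deployed module might aggregate across data types and sources as well, but the windowing idea is the same.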
As used herein, the terms “machine-readable medium” and “computer-readable medium” refer to any computer program product, apparatus, and/or device (such as magnetic discs, optical disks, memory, or Programmable Logic Devices (PLDs)) used to provide machine instructions and/or data to a programmable processor.
[0088] The user interface (UI) module 425 may be configured for representing and delivering analytics, sensor data (e.g., video), processed data, and feedback to a user (e.g., patient, therapist, etc.). The UI may include a patient UI for presenting real-time feedback generated by the physio rehabilitation system to the user and receiving user input from the user (e.g., through the AR/VR space, user device, etc.). The UI may also include a therapist UI for a therapist to view the analytics (e.g., charts, statistics, quantification on a display) and to provide coaching and recommendations to the user. The user interface may comprise one or more user-interactive devices (e.g., mouse, joystick, keyboard, trackball, touchpad, button, verbal commands, gesture-recognition, attitude sensor, thermal sensor, touch-capacitive sensors, or any other AR or VR devices).
[0089] In some embodiments, the physio rehabilitation system can generate one or more graphical user interfaces (GUIs) comprising statistics of the user’s rehabilitation progress, range of motion, feedback for coaching an exercise, and the like. The GUIs may be rendered on a display screen on a user device (e.g., a patient user device, a therapist device). A GUI is a type of interface that allows users to interact with electronic devices through graphical icons and visual indicators such as secondary notation, as opposed to text-based interfaces, typed command labels or text navigation. The actions in a GUI are usually performed through direct
manipulation of the graphical elements. In addition to computers, GUIs can be found in handheld devices such as MP3 players, portable media players, gaming devices, and smaller household, office, and industrial equipment. The GUIs may be provided in software, a software application, a web browser, etc. The GUIs may be displayed on a user device (e.g., user device 101-1, 101-2 of FIG. 1). The GUIs may be provided through a mobile application. Examples of such GUIs are illustrated in FIG. 5 and FIG. 6 and described as follows.
[0090] FIG. 5 and FIG. 6 show examples of user interfaces provided by the physio
rehabilitation system. The window of FIG. 5 may be generated after a therapist device is connected to the physio rehabilitation system and data has been obtained from the physio rehabilitation system. The example may display various rehabilitation or training metrics. In the example of FIG. 5, the window may display the range of motion (ROM) of the user for a month, a week, or a day, compared to a normal value and an initial value. The ROM, when it exceeds a pre-determined threshold, may be flagged (e.g., color-coded), indicating that correction may be needed. A pain level across a selectable period of time may be displayed in a diagram. As shown in FIG. 5, charts and diagrams showing an analysis of historical data across a selectable period of time are presented in the therapist dashboard or user portal. In the example, information about the patient (e.g., name, age, gender, rehabilitation diagnosis) and rehabilitation progress may be displayed. In some cases, the charts and diagrams may be updated automatically upon receiving real-time data. Alternatively or in addition, the charts and diagrams may be updated upon receiving a user command.
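One possible way to realize the threshold-based flagging of a ROM reading described above is a simple color-code rule comparing the reading against a normal value. The function name and the specific tolerance values below are assumptions for illustration, not taken from the disclosure:

```python
def flag_rom(rom_deg, normal_deg, tolerance_deg=10.0):
    """Color-code a range-of-motion reading against a normal value.

    Returns "green" when the reading is within tolerance of the normal
    value, "amber" when moderately below it, and "red" when correction
    is likely needed. Thresholds are illustrative only.
    """
    deficit = normal_deg - rom_deg
    if deficit <= tolerance_deg:
        return "green"
    if deficit <= 2 * tolerance_deg:
        return "amber"
    return "red"
```

A dashboard could then render each ROM metric in the returned color, drawing the therapist's attention to readings that may require intervention.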
[0091] FIG. 6 shows an exemplary quantification result of range of motion. In some cases, information about the range of motion may be displayed in real-time. In some cases, information about the range of motion may be further analyzed to identify an incorrectly performed pose, and an indication of such an incorrectly performed pose may be presented within the user interface. The window of FIG. 6 may be generated after the user device is connected to the physio
rehabilitation system and data has been obtained from the physio rehabilitation system. The example may display various rehabilitation or training metrics. In the example of FIG. 6, the window may display the range of motion (ROM) of the user for a day, compared to a normal value and an initial value. The ROM, when it exceeds a pre-determined threshold, may be flagged (e.g., color-coded), indicating that correction may be needed.
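A common basis for quantifying range of motion from an estimated pose is the angle at a joint formed by three keypoints. The sketch below assumes 2D pixel keypoints from a generic pose estimator; it is an illustrative assumption, not the method of the disclosure itself:

```python
import math

def joint_angle(a, b, c):
    """Angle at joint b (in degrees) formed by keypoints a-b-c.

    Keypoints are (x, y) pixel coordinates from a pose estimator; the
    angle between vectors b->a and b->c is a common basis for
    range-of-motion quantification.
    """
    v1 = (a[0] - b[0], a[1] - b[1])
    v2 = (c[0] - b[0], c[1] - b[1])
    dot = v1[0] * v2[0] + v1[1] * v2[1]
    n1 = math.hypot(*v1)
    n2 = math.hypot(*v2)
    # Clamp to guard against floating-point drift outside [-1, 1]
    cos = max(-1.0, min(1.0, dot / (n1 * n2)))
    return math.degrees(math.acos(cos))
```

For instance, a shoulder at (0, 1), an elbow at (0, 0), and a wrist at (1, 0) yield a 90-degree elbow angle; tracking this angle over a repetition gives the joint's range of motion.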
[0092] In the GUIs of FIG. 5 and FIG. 6, different colors and shading may be used to differentiate the segments from each other. The numbers and words for various metrics may be provided in different colors and shades to improve readability and to distinguish the metrics from one another. Any color scheme or any other visual differentiation scheme may be contemplated.

[0093] The physio rehabilitation system may be configured to deliver personalized coaching and exercising guidance (e.g., recommendations) to a user by transmitting the information to the user device and/or AR/VR system. In some embodiments, the physio rehabilitation system may be configured to proactively provide guidance to assist a user in managing wellness and following a rehabilitation program over the long term based on the input data provided to the physio rehabilitation system.
[0094] In some embodiments, the physio rehabilitation system can dynamically provide personalized recommendations to the user in real-time. The personalized recommendations may also be provided at a predetermined frequency, e.g., every hour, 12 hours, 24 hours, 2 days, 4 days, etc. In some instances, the physio rehabilitation system can provide a personalized recommendation in real-time based on video stream data or generate a reminder (daily, weekly, monthly) when exercise lapses for a period of time.
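The lapse-triggered reminder mentioned above could be reduced to a simple timestamp comparison. The function name and the default one-day lapse are illustrative assumptions rather than elements of the disclosed system:

```python
from datetime import datetime, timedelta

def needs_reminder(last_exercise, now, lapse=timedelta(days=1)):
    """Return True when the last recorded exercise is older than the
    allowed lapse, so a daily/weekly/monthly reminder can be generated.
    """
    return (now - last_exercise) > lapse
```

Passing `timedelta(weeks=1)` or `timedelta(days=30)` as `lapse` would give the weekly or monthly variants described in paragraph [0094].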
[0095] FIG. 7 shows a flow chart of a method 700 implemented by the physio rehabilitation platform. The method 700 may improve compliance with a rehabilitation or training program without additional cost. A patient may receive a reminder through the application running on the user device. The patient may be prompted to capture a video using the user device. In some cases, the captured video may be processed by the mobile application to quantify a range of motion. In such cases, the range of motion may be presented to the patient via the user device (e.g., a GUI on the display of the user device) or via an AR/VR system. In alternative cases, the range of motion may be calculated on the cloud. The quantification result may be transmitted to the backend of the physio rehabilitation platform (e.g., a coaching module residing on the cloud, a cloud database). Statistics (e.g., a statistical chart of rehabilitation metrics) and feedback may be generated. If the rehabilitation progress is determined to be on-track, the patient data may be updated and the program may proceed according to the coaching feedback. If the rehabilitation progress is determined to be not on-track, a real therapist may intervene and may contact the patient.
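The capture-quantify-coach-or-escalate flow of method 700 can be sketched as a single function over injected callables. Every name below is a hypothetical stand-in rather than an element of the disclosed platform: `capture_video` returns image data, `quantify_rom` maps it to a range of motion, `coach` returns feedback, `on_track` checks progress, and `notify_therapist` escalates to a real therapist:

```python
def rehabilitation_session(capture_video, quantify_rom, coach,
                           on_track, notify_therapist):
    """Sketch of the method-700 loop: capture, quantify, then either
    coach the patient or escalate to a therapist.
    """
    video = capture_video()          # prompt the patient to record
    rom = quantify_rom(video)        # on-device or cloud quantification
    if on_track(rom):
        # Progress is on-track: update records and keep coaching
        return {"rom": rom, "feedback": coach(rom), "escalated": False}
    # Progress is not on-track: a real therapist may intervene
    notify_therapist(rom)
    return {"rom": rom, "feedback": None, "escalated": True}
```

In practice the callables would wrap the mobile application's camera, the pose-estimation and coaching models, and the platform's messaging backend.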
[0096] As used herein, A and/or B encompasses one or more of A or B, and combinations thereof such as A and B. It will be understood that although the terms "first," "second," "third," etc. may be used herein to describe various elements, components, regions and/or sections, these elements, components, regions and/or sections should not be limited by these terms. These terms are merely used to distinguish one element, component, region or section from another element, component, region or section. Thus, a first element, component, region or section discussed below could be termed a second element, component, region or section without departing from the teachings of the present disclosure.

[0097] The terminology used herein is for the purpose of describing particular embodiments only and is not intended to be limiting of the disclosure. As used herein, the singular forms "a", "an" and "the" are intended to include the plural forms as well, unless the context clearly indicates otherwise. It will be further understood that the terms "comprises" and/or "comprising," or "includes" and/or "including," when used in this specification, specify the presence of stated features, regions, integers, steps, operations, elements and/or components, but do not preclude the presence or addition of one or more other features, regions, integers, steps, operations, elements, components and/or groups thereof.
[0098] While preferred embodiments of the present disclosure have been shown and described herein, it will be obvious to those skilled in the art that such embodiments are provided by way of example only. Numerous variations, changes, and substitutions will now occur to those skilled in the art without departing from the disclosure. It should be understood that various alternatives to the embodiments of the disclosure described herein may be employed in practicing the disclosure. It is intended that the following claims define the scope of the disclosure and that methods and structures within the scope of these claims and their equivalents be covered thereby.
[0099] Methods and systems are provided herein for providing rehabilitation training to a user in real-time. The method may comprise: obtaining image data of the user performing a movement of a body part, wherein the image data is collected using an imaging device located on a user device; measuring a range of the movement of the body part based on the image data using a first trained model; and generating a feedback in real-time to correct the movement using a second trained model.

Claims

WHAT IS CLAIMED IS:
1. A method for providing rehabilitation training to a user comprising:
obtaining image data of the user performing a movement of a body part, wherein the image data is collected using an imaging device located on a user device;
measuring a range of the movement of the body part based on the image data using a first trained model; and
generating a feedback in real-time to correct the movement using a second trained model.
2. The method of claim 1, wherein the image data is two-dimensional image data.
3. The method of claim 1 or 2, wherein the range of the movement comprises a joint angle and displacement of the body part in a three-dimensional space.
4. The method of claim 1, 2, or 3, wherein the first trained model is obtained using an unsupervised learning algorithm.
5. The method of any preceding claim, further comprising providing the feedback using a virtual reality or augmented reality system.
6. The method of any preceding claim, further comprising generating one or more metrics including a contact force or counts of repetitions based on the image data.
7. The method of claim 6, wherein the one or more metrics are derived from an estimated pose generated by the first trained model.
8. The method of any preceding claim, wherein the second trained model processes the one or more metrics and the range of motion and outputs the feedback.
9. The method of any preceding claim, wherein the first trained model or the second trained model is further improved using the image data.
10. A system for providing rehabilitation training to a user comprising:
a server in communication with a computing device associated with a user, wherein the server comprises a memory for storing interactive media and a set of software instructions, and one or more processors configured to execute the set of software instructions to:
obtain image data of the user performing a movement of a body part, wherein the image data is collected using an imaging device located on the user device;
measure a range of the movement of the body part based on the image data using a first trained model; and
generate a feedback in real-time to correct the movement using a second trained model.
11. The system of claim 10, wherein the image data is two-dimensional image data.
12. The system of claim 10 or 11, wherein the range of the movement comprises a joint angle and displacement of the body part in a three-dimensional space.
13. The system of claim 10, 11, or 12, wherein the first trained model is developed using an unsupervised learning algorithm.
14. The system of any of claims 10 to 13, wherein the system further comprises a virtual reality or augmented reality system for providing the feedback.
15. The system of any of claims 10 to 14, wherein the one or more processors are configured to further generate one or more metrics including a contact force or counts of repetitions based on the image data.
16. The system of claim 15, wherein the one or more metrics are derived from an estimated pose generated by the first trained model.
17. The system of any of claims 10 to 16, wherein the second trained model processes the one or more metrics and the range of motion and outputs the feedback.
18. The system of any of claims 10 to 17, wherein the first trained model or the second trained model is further improved using the image data.
19. A tangible computer readable medium storing instructions that, when executed by a server, cause the server to perform a computer-implemented method for providing rehabilitation training to a user, the method comprising:
obtaining image data of the user performing a movement of a body part, wherein the image data is collected using an imaging device located on a user device;
measuring a range of the movement of the body part based on the image data using a first trained model; and
generating a feedback in real-time to correct the movement using a second trained model.
20. The tangible computer readable medium of claim 19, further comprising providing the feedback using a virtual reality or augmented reality system.
21. A computer program product storing instructions that, when executed by a server, cause the server to carry out the method of any of claims 1 to 9.
PCT/GB2020/051746 2019-07-22 2020-07-22 Methods and systems for musculoskeletal rehabilitation Ceased WO2021014149A1 (en)

Applications Claiming Priority (2)

Application Number Priority Date Filing Date Title
US201962876978P 2019-07-22 2019-07-22
US62/876,978 2019-07-22

Publications (1)

Publication Number Publication Date
WO2021014149A1 true WO2021014149A1 (en) 2021-01-28

Family

ID=71842700

Family Applications (1)

Application Number Title Priority Date Filing Date
PCT/GB2020/051746 Ceased WO2021014149A1 (en) 2019-07-22 2020-07-22 Methods and systems for musculoskeletal rehabilitation

Country Status (1)

Country Link
WO (1) WO2021014149A1 (en)

Cited By (1)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN118053544A (en) * 2024-02-01 2024-05-17 江苏医药职业学院 Exercise injury rehabilitation training method and system

Citations (2)

Publication number Priority date Publication date Assignee Title
US20170368413A1 (en) * 2016-03-12 2017-12-28 Arie Shavit Training system and methods for designing, monitoring and providing feedback of training
US20180315247A1 (en) * 2017-05-01 2018-11-01 Dave Van Andel Virtual or augmented reality rehabilitation


Non-Patent Citations (2)

Title
A VAKANSKI ET AL: "Mathematical Modeling and Evaluation of Human Motions in Physical Therapy Using Mixture Density Neural Networks", JOURNAL OF PHYSIOTHERAPY & PHYSICAL REHABILITATION, 1 December 2016 (2016-12-01), United States, XP055740061, Retrieved from the Internet <URL:https://www.ncbi.nlm.nih.gov/pmc/articles/PMC5242735/pdf/nihms827516.pdf> [retrieved on 20201014] *
HUANG MING-CHUN ET AL: "Using Pressure Map Sequences for Recognition of On Bed Rehabilitation Exercises", IEEE JOURNAL OF BIOMEDICAL AND HEALTH INFORMATICS, IEEE, PISCATAWAY, NJ, USA, vol. 18, no. 2, 1 March 2014 (2014-03-01), pages 411 - 418, XP011542039, ISSN: 2168-2194, [retrieved on 20140303], DOI: 10.1109/JBHI.2013.2296891 *


Similar Documents

Publication Publication Date Title
CN113140279B (en) Method and system for describing and recommending optimal rehabilitation plan in adaptive telemedicine
US20220270738A1 (en) Computerized systems and methods for military operations where sensitive information is securely transmitted to assigned users based on ai/ml determinations of user capabilities
US20230058605A1 (en) Method and system for using sensor data to detect joint misalignment of a user using a treatment device to perform a treatment plan
US20220415471A1 (en) Method and system for using sensor data to identify secondary conditions of a user based on a detected joint misalignment of the user who is using a treatment device to perform a treatment plan
US20230072368A1 (en) System and method for using an artificial intelligence engine to optimize a treatment plan
Yang et al. Behavioral and physiological signals-based deep multimodal approach for mobile emotion recognition
US20230060039A1 (en) Method and system for using sensors to optimize a user treatment plan in a telemedicine environment
US20220230729A1 (en) Method and system for telemedicine resource deployment to optimize cohort-based patient health outcomes in resource-constrained environments
US20230405403A1 (en) Wearable device systems and methods for guiding physical movements
US20240203580A1 (en) Method and system for using artificial intelligence to triage treatment plans for patients and electronically initiate the treament plans based on the triaging
Maskeliūnas et al. BiomacVR: A virtual reality-based system for precise human posture and motion analysis in rehabilitation exercises using depth sensors
US12009083B2 (en) Remote physical therapy and assessment of patients
Rasa Artificial intelligence and its revolutionary role in physical and mental rehabilitation: a review of recent advancements
US20230092766A1 (en) Systems and methods for visualization of a treatment progress
Madhusanka et al. Implicit intention communication for activities of daily living of elder/disabled people to improve well-being
Rahman Multimedia environment toward analyzing and visualizing live kinematic data for children with hemiplegia
Yi et al. [Retracted] Home Interactive Elderly Care Two‐Way Video Healthcare System Design
Fiorini et al. User profiling to enhance clinical assessment and human–robot interaction: A feasibility study
Ekambaram et al. AI-assisted physical therapy for post-injury rehabilitation: Current state of the art
Wang et al. PhysiQ: Off-site Quality Assessment of Exercise in Physical Therapy
Ettefagh et al. Enhancing automated lower limb rehabilitation exercise task recognition through multi-sensor data fusion in tele-rehabilitation
Boudreault-Morales et al. The effect of depth data and upper limb impairment on lightweight monocular RGB human pose estimation models
WO2021014149A1 (en) Methods and systems for musculoskeletal rehabilitation
Zaher et al. Artificial intelligence techniques in enhancing home-based rehabilitation: A survey
Bini et al. Current Challenges and Future Outlook for Extended Reality as Cutting-Edge Assistive Technology Shaping Caring Personnel

Legal Events

Date Code Title Description
121 Ep: the epo has been informed by wipo that ep was designated in this application

Ref document number: 20747089

Country of ref document: EP

Kind code of ref document: A1

NENP Non-entry into the national phase

Ref country code: DE

122 Ep: pct application non-entry in european phase

Ref document number: 20747089

Country of ref document: EP

Kind code of ref document: A1

32PN Ep: public notification in the ep bulletin as address of the adressee cannot be established

Free format text: NOTING OF LOSS OF RIGHTS PURSUANT TO RULE 112(1) EPC (EPO FORM 1205A DATED 02/06/2022)