US20190038187A1 - Systems and methods for evaluating body motion
Systems and methods for evaluating body motion
- Publication number
- US20190038187A1 (application Ser. No. 16/054,074)
- Authority
- US
- United States
- Prior art keywords
- motion
- user
- data
- joint
- server system
- Prior art date
- Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
- Granted
Classifications
-
- A—HUMAN NECESSITIES
- A61—MEDICAL OR VETERINARY SCIENCE; HYGIENE
- A61B—DIAGNOSIS; SURGERY; IDENTIFICATION
- A61B5/00—Measuring for diagnostic purposes; Identification of persons
- A61B5/103—Detecting, measuring or recording devices for testing the shape, pattern, colour, size or movement of the body or parts thereof, for diagnostic purposes
- A61B5/11—Measuring movement of the entire body or parts thereof, e.g. head or hand tremor, mobility of a limb
- A61B5/1126—Measuring movement of the entire body or parts thereof, e.g. head or hand tremor, mobility of a limb using a particular sensing technique
- A61B5/1128—Measuring movement of the entire body or parts thereof, e.g. head or hand tremor, mobility of a limb using a particular sensing technique using image analysis
-
- A—HUMAN NECESSITIES
- A61—MEDICAL OR VETERINARY SCIENCE; HYGIENE
- A61B—DIAGNOSIS; SURGERY; IDENTIFICATION
- A61B5/00—Measuring for diagnostic purposes; Identification of persons
- A61B5/103—Detecting, measuring or recording devices for testing the shape, pattern, colour, size or movement of the body or parts thereof, for diagnostic purposes
- A61B5/11—Measuring movement of the entire body or parts thereof, e.g. head or hand tremor, mobility of a limb
- A61B5/1121—Determining geometric values, e.g. centre of rotation or angular range of movement
-
- A—HUMAN NECESSITIES
- A61—MEDICAL OR VETERINARY SCIENCE; HYGIENE
- A61B—DIAGNOSIS; SURGERY; IDENTIFICATION
- A61B5/00—Measuring for diagnostic purposes; Identification of persons
- A61B5/74—Details of notification to user or communication with user or patient ; user input means
- A61B5/742—Details of notification to user or communication with user or patient ; user input means using visual displays
- A61B5/7425—Displaying combinations of multiple images regardless of image source, e.g. displaying a reference anatomical image with a live image
-
- A—HUMAN NECESSITIES
- A63—SPORTS; GAMES; AMUSEMENTS
- A63B—APPARATUS FOR PHYSICAL TRAINING, GYMNASTICS, SWIMMING, CLIMBING, OR FENCING; BALL GAMES; TRAINING EQUIPMENT
- A63B24/00—Electric or electronic controls for exercising apparatus of preceding groups; Controlling or monitoring of exercises, sportive games, training or athletic performances
- A63B24/0062—Monitoring athletic performances, e.g. for determining the work of a user on an exercise apparatus, the completed jogging or cycling distance
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06F—ELECTRIC DIGITAL DATA PROCESSING
- G06F3/00—Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
- G06F3/01—Input arrangements or combined input and output arrangements for interaction between user and computer
- G06F3/011—Arrangements for interaction with the human body, e.g. for user immersion in virtual reality
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06F—ELECTRIC DIGITAL DATA PROCESSING
- G06F3/00—Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
- G06F3/01—Input arrangements or combined input and output arrangements for interaction between user and computer
- G06F3/03—Arrangements for converting the position or the displacement of a member into a coded form
- G06F3/033—Pointing devices displaced or positioned by the user, e.g. mice, trackballs, pens or joysticks; Accessories therefor
- G06F3/0346—Pointing devices displaced or positioned by the user, e.g. mice, trackballs, pens or joysticks; Accessories therefor with detection of the device orientation or free movement in a 3D space, e.g. 3D mice, 6-DOF [six degrees of freedom] pointers using gyroscopes, accelerometers or tilt-sensors
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06T—IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
- G06T7/00—Image analysis
- G06T7/20—Analysis of motion
- G06T7/246—Analysis of motion using feature-based methods, e.g. the tracking of corners or segments
- G06T7/248—Analysis of motion using feature-based methods, e.g. the tracking of corners or segments involving reference images or patches
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06T—IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
- G06T7/00—Image analysis
- G06T7/70—Determining position or orientation of objects or cameras
- G06T7/73—Determining position or orientation of objects or cameras using feature-based methods
- G06T7/74—Determining position or orientation of objects or cameras using feature-based methods involving reference images or patches
-
- A—HUMAN NECESSITIES
- A61—MEDICAL OR VETERINARY SCIENCE; HYGIENE
- A61B—DIAGNOSIS; SURGERY; IDENTIFICATION
- A61B2505/00—Evaluating, monitoring or diagnosing in the context of a particular type of medical care
- A61B2505/09—Rehabilitation or training
-
- A—HUMAN NECESSITIES
- A61—MEDICAL OR VETERINARY SCIENCE; HYGIENE
- A61B—DIAGNOSIS; SURGERY; IDENTIFICATION
- A61B2576/00—Medical imaging apparatus involving image processing or analysis
-
- A—HUMAN NECESSITIES
- A61—MEDICAL OR VETERINARY SCIENCE; HYGIENE
- A61B—DIAGNOSIS; SURGERY; IDENTIFICATION
- A61B5/00—Measuring for diagnostic purposes; Identification of persons
- A61B5/0059—Measuring for diagnostic purposes; Identification of persons using light, e.g. diagnosis by transillumination, diascopy, fluorescence
- A61B5/0077—Devices for viewing the surface of the body, e.g. camera, magnifying lens
-
- A—HUMAN NECESSITIES
- A61—MEDICAL OR VETERINARY SCIENCE; HYGIENE
- A61B—DIAGNOSIS; SURGERY; IDENTIFICATION
- A61B5/00—Measuring for diagnostic purposes; Identification of persons
- A61B5/103—Detecting, measuring or recording devices for testing the shape, pattern, colour, size or movement of the body or parts thereof, for diagnostic purposes
- A61B5/11—Measuring movement of the entire body or parts thereof, e.g. head or hand tremor, mobility of a limb
- A61B5/1113—Local tracking of patients, e.g. in a hospital or private home
- A61B5/1114—Tracking parts of the body
-
- A—HUMAN NECESSITIES
- A63—SPORTS; GAMES; AMUSEMENTS
- A63B—APPARATUS FOR PHYSICAL TRAINING, GYMNASTICS, SWIMMING, CLIMBING, OR FENCING; BALL GAMES; TRAINING EQUIPMENT
- A63B24/00—Electric or electronic controls for exercising apparatus of preceding groups; Controlling or monitoring of exercises, sportive games, training or athletic performances
- A63B24/0062—Monitoring athletic performances, e.g. for determining the work of a user on an exercise apparatus, the completed jogging or cycling distance
- A63B2024/0068—Comparison to target or threshold, previous performance or not real time comparison to other individuals
-
- A—HUMAN NECESSITIES
- A63—SPORTS; GAMES; AMUSEMENTS
- A63B—APPARATUS FOR PHYSICAL TRAINING, GYMNASTICS, SWIMMING, CLIMBING, OR FENCING; BALL GAMES; TRAINING EQUIPMENT
- A63B71/00—Games or sports accessories not covered in groups A63B1/00 - A63B69/00
- A63B71/06—Indicating or scoring devices for games or players, or for other sports activities
- A63B71/0619—Displays, user interfaces and indicating devices, specially adapted for sport equipment, e.g. display mounted on treadmills
- A63B2071/065—Visualisation of specific exercise parameters
-
- A—HUMAN NECESSITIES
- A63—SPORTS; GAMES; AMUSEMENTS
- A63B—APPARATUS FOR PHYSICAL TRAINING, GYMNASTICS, SWIMMING, CLIMBING, OR FENCING; BALL GAMES; TRAINING EQUIPMENT
- A63B2220/00—Measuring of physical parameters relating to sporting activity
- A63B2220/80—Special sensors, transducers or devices therefor
- A63B2220/806—Video cameras
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06T—IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
- G06T2207/00—Indexing scheme for image analysis or image enhancement
- G06T2207/20—Special algorithmic details
- G06T2207/20072—Graph-based image processing
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06T—IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
- G06T2207/00—Indexing scheme for image analysis or image enhancement
- G06T2207/30—Subject of image; Context of image processing
- G06T2207/30196—Human being; Person
-
- G—PHYSICS
- G16—INFORMATION AND COMMUNICATION TECHNOLOGY [ICT] SPECIALLY ADAPTED FOR SPECIFIC APPLICATION FIELDS
- G16H—HEALTHCARE INFORMATICS, i.e. INFORMATION AND COMMUNICATION TECHNOLOGY [ICT] SPECIALLY ADAPTED FOR THE HANDLING OR PROCESSING OF MEDICAL OR HEALTHCARE DATA
- G16H30/00—ICT specially adapted for the handling or processing of medical images
- G16H30/40—ICT specially adapted for the handling or processing of medical images for processing medical images, e.g. editing
Definitions
- the present disclosure is directed towards computer-based systems and methods for evaluating a user's body motion based on motion data obtained via a series of wireless sensors attached to a user's body.
- the motion data may be objectively evaluated by the computer-based system.
- a physical therapist or trainer may measure the range of motion and other motion data for a particular joint before, during, and/or after an activity. For example, range of motion may be measured manually using a goniometer. However, goniometer measurements may be inaccurate due to poor placement of the goniometer, interference from uncontrolled movements by the user, and the inability to properly obtain measurements using the goniometer. Moreover, goniometer measurements may be subject to both inter-user and intra-user variability. Put simply, a physical therapist or trainer may not measure the range of motion using the same method every time they see the patient, and one physical therapist or trainer may have a different measurement technique than a second physical therapist or trainer. As range of motion and other movement features are important factors for benchmarking progress in patients and athletes, it is critical to have the most accurate and objective measurement of range of motion and other movement features.
- a server system receives motion data from one or more input devices.
- the motion data corresponds to movement of a user.
- the server system generates a motion profile based on at least the motion data received from the one or more input devices.
- the server system retrieves a pre-defined target motion profile from a database structure.
- the server system objectively evaluates the generated motion profile by comparing one or more parameters of the generated motion profile with one or more parameters of the retrieved pre-defined target motion profile.
- a server system receives motion data from a plurality of input devices positioned about a user while performing a movement.
- the server system generates a motion profile for the user based at least on the motion data, by calculating a position of a joint of the user's body based on the motion data, calculating positions of at least two reference points of the user's body based on the motion data, and calculating an actual range of motion for the joint based on the position of the joint and the positions of the at least two reference points.
- the server system retrieves a reference range of motion from a database storing one or more reference ranges of motion for one or more movements.
- the system compares the reference range of motion to the actual range of motion to objectively evaluate the user's movement.
- in another embodiment, a system includes one or more input devices, a processor, and a memory.
- the one or more input devices are configured to capture a movement of a user.
- the processor is in communication with the one or more input devices.
- the memory has programming instructions stored thereon, which, when executed by the processor, perform an operation.
- the operation includes receiving motion data from the one or more input devices, the motion data corresponding to movement of a user.
- the operation further includes generating a motion profile based on at least the motion data received from the one or more input devices.
- the operation further includes retrieving a pre-defined target motion profile from a database structure.
- the operation further includes objectively evaluating the generated motion profile by comparing one or more parameters of the generated motion profile with one or more parameters of the retrieved pre-defined target motion profile.
- a computer-implemented method for automated movement evaluations may extract a motion profile from motion data received from one or more wireless sensors configured to obtain motion information of a user, retrieve a pre-defined target motion profile from a database structure, and objectively evaluate the extracted motion profile by comparing one or more parameters of the extracted motion profile with one or more corresponding parameters of the retrieved pre-defined target motion profile.
- the computer-implemented method may be configured to take one or more actions as a result of the objective evaluation.
- the systems and methods described herein may be integrated with the administrative computer systems used by physical therapists, medical professionals, insurance companies and the like. Accordingly, the systems and methods described herein may provide automated coding and billing services to third party providers, and monitor and convey information related to patient compliance and progression.
- the systems and methods described herein may provide means for improving a patient or athlete's performance of a particular movement.
- the computer based system for automated movement evaluations may be integrated with administrative medical computer systems and provide automated billing, insurance, and prescription assistance.
- FIG. 1 is a system diagram showing a body motion evaluation system, according to an embodiment of the present disclosure.
- FIG. 2 is a system diagram of an illustrative computer system, according to another embodiment of the present disclosure.
- FIG. 3 is a flow diagram showing a process for generating a graphical representation of a user's body based on motion data obtained from a plurality of sensors attached to the user, according to an embodiment of the present disclosure.
- FIG. 4 is a flow chart showing a process for evaluating a user's body motion based on motion data obtained from a plurality of sensors attached to the user, according to an embodiment of the present disclosure.
- FIG. 5 is a system diagram showing a body motion evaluation system, according to another embodiment of the present disclosure.
- FIG. 6 is a system diagram showing a body motion evaluation system, according to another embodiment of the present disclosure.
- FIGS. 7-21 show illustrative user interfaces (UIs) that may be used within a body motion evaluation system, according to embodiments of the present disclosure.
- a computer-implemented method for automated movement evaluations may extract a motion profile from motion data received from one or more wireless sensors configured to obtain motion information of a user, retrieve a pre-defined target motion profile from a database structure, and objectively evaluate the extracted motion profile by comparing one or more parameters of the extracted motion profile with one or more corresponding parameters of the retrieved pre-defined target motion profile.
- the computer-implemented method may be configured to take one or more actions as a result of the objective evaluation.
- FIG. 1 is a system diagram illustrating a body motion evaluation system 100 , according to an exemplary embodiment.
- Body motion evaluation system 100 may include a processing system (or “server”) 102 , a database 104 , a plurality of sensors 106 a - 106 n ( 106 generally), and a plurality of output devices 108 a - 108 n ( 108 generally).
- the system 100 may also include a plurality of cameras 107 a - 107 n ( 107 generally).
- the sensors, cameras, and output devices may be in communication with the server 102 via a network 110 .
- Network 110 may be of any suitable type, including individual connections via the Internet, such as cellular or Wi-Fi networks.
- network 110 may connect terminals, services, and mobile devices using direct connections, such as radio frequency identification (RFID), near-field communication (NFC), Bluetooth™, Bluetooth™ Low Energy (BLE), Wi-Fi™, ZigBee™, ambient backscatter communication (ABC) protocols, USB, WAN, or LAN.
- Network 110 may include any type of computer networking arrangement used to exchange data.
- network 110 may be the Internet, a private data network, a virtual private network using a public network, and/or other suitable connection(s) that enables components in body motion evaluation system 100 to send and receive information between the components of body motion evaluation system 100.
- the sensors 106 may be attached to the body of a user 112 .
- the sensors may be coupled to one or more articles of clothing that can be worn by the user 112 .
- the sensors 106 may be attached to a tight-fitting body suit that can be worn by the user 112 .
- the sensors 106 may be attached to a sleeve (e.g., arm compression sleeve, calf compression sleeve, quadriceps/hamstring compression sleeve, etc.).
- sensors 106 may be located proximate a user's wrist, shoulder and/or back.
- the illustrative server 102 includes a sensor module 102 a, camera module 102 b , visualization module 102 c, output generation module 102 d, database management module 102 e , and an evaluation module 102 f.
- the operations described herein may be achieved with any suitable number of modules.
- Each of the sensors 106 is configured to generate motion data responsive to the user's body motion.
- each of the one or more sensors 106 may include an accelerometer, magnetometer, and/or gyroscope.
- An accelerometer may be configured to measure acceleration forces at the location of each sensor 106 .
- the acceleration forces may be measured in units of meters per second squared (m/s²).
- the magnetometer may be configured to measure magnetism and be normalized for Earth's magnetic field.
- a gyroscope may be configured to measure angular velocity in units of radians per second (rad/s).
- a sensor's accelerometer, magnetometer, and/or gyroscope may be provided as MEMS (microelectromechanical systems) inertial sensors.
- each sensor 106 may generate roll, pitch and yaw data responsive to the user's body motion at the point on the user's body where the sensor is located.
- Each sensor 106 may generate motion data responsive to the user on any suitable time-scale.
- the sensors 106 may generate motion data on a time-scale of 1,000 data points per second.
- Sensor motion data may be processed by one or more filtering and/or down-sampling techniques.
- Kalman filters may be used.
- motion data may be down-sampled to a time-scale of 125 data points per second.
- the filtering and/or down sampling techniques may be performed at the site of the sensor 106 , prior to each sensor 106 wirelessly transmitting motion data to the server system 102 .
- the entirety of the obtained sensor motion data may be transmitted to the server system 102 and then filtered and/or down sampled at the server system 102 by a sensor module 102 a.
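- As a concrete illustration, the following sketch shows one way such block-averaging down-sampling from 1,000 to 125 data points per second might be implemented; the function name and the use of NumPy are illustrative assumptions, not details from the disclosure.

```python
import numpy as np

def downsample(samples: np.ndarray, factor: int = 8) -> np.ndarray:
    """Down-sample per-axis sensor readings by block averaging.

    A factor of 8 reduces a 1,000-samples-per-second stream to the
    125-samples-per-second rate described above; averaging each block
    also acts as a crude low-pass filter before transmission.
    """
    n = (len(samples) // factor) * factor        # drop any ragged tail
    return samples[:n].reshape(-1, factor, samples.shape[-1]).mean(axis=1)

# One second of (roll, pitch, yaw) readings at 1,000 Hz -> shape (125, 3).
raw = np.random.randn(1000, 3)
smoothed = downsample(raw)
```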
- the sensors 106 may transmit motion data to the server system 102 wirelessly via network 110 .
- sensors 106 may transmit motion data to the server system 102 using Bluetooth®, Wi-Fi, near-field communication (NFC), ZigBee®, or any other suitable wireless communication means.
- although FIG. 1 includes a single server system 102 , those skilled in the art may readily understand that multiple server systems may be used.
- the multiple server systems may cooperate to evaluate a user's body motion.
- a sensor module within a first server system may perform an initial processing of sensor motion data
- a second server system may receive and use the initially processed motion data to perform an evaluation of the user's body motion.
- motion data obtained from the sensors 106 may be converted to “positional data” describing the position of one or more joints, limbs, and/or other reference points on the user's body. Converting sensor motion data to positional data can be based on knowledge of where the sensors 106 are attached to the user's body, dimensions of the user's body, and/or dimensions of clothing worn by the user onto which the sensors are attached.
- the position data may, in turn, be used to calculate information (or “joint data”) about one or more joints of interest, such as range of motion, velocity, acceleration, or other aspects of angular movements about a joint.
- Example joints of interest may include the elbow, wrist, shoulder, hip, knee, and ankle.
- the conversion from sensor motion data to positional data for the joints may involve one or more coordinate transformations.
- sensor motion data represented as (roll, pitch, yaw) values may be converted to 3D Cartesian positional data, e.g., (x, y, z) values, using forward kinematics.
- motion data received from the sensors 106 may be converted from (roll, pitch, yaw) values into a Quaternion representation.
- motion data represented as Quaternions may then be converted to positional data and/or joint data.
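- A minimal sketch of the (roll, pitch, yaw)-to-quaternion step is shown below; the intrinsic Z-Y-X (aerospace) rotation convention is an assumption, as the disclosure does not specify which convention the sensors use.

```python
import math

def euler_to_quaternion(roll: float, pitch: float, yaw: float) -> tuple:
    """Convert (roll, pitch, yaw) angles in radians to a unit quaternion
    (w, x, y, z), using the common intrinsic Z-Y-X convention."""
    cr, sr = math.cos(roll / 2), math.sin(roll / 2)
    cp, sp = math.cos(pitch / 2), math.sin(pitch / 2)
    cy, sy = math.cos(yaw / 2), math.sin(yaw / 2)
    return (cr * cp * cy + sr * sp * sy,   # w
            sr * cp * cy - cr * sp * sy,   # x
            cr * sp * cy + sr * cp * sy,   # y
            cr * cp * sy - sr * sp * cy)   # z
```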
- the joint data may include a measure of the angle formed at a joint by two limbs that connect at the joint.
- the angle formed at the joint may be calculated based on a triangulation of the angle formed by the position of at least two sensors 106 with respect to the floor or another reference point.
- the angle formed at the joint may be calculated based on a triangulation of the angle formed by the position of the at least two limbs with respect to the floor or another reference point.
- Joint data may be obtained from positional data by analyzing the orientation of one or more sensors with respect to the user's limbs and joints. In one embodiment this may require knowledge of a user's limb length. This may include limb calculations based on a user's height and weight.
- joint data may be calculated based on statistical population information and anatomy research.
- the described system may calculate an angle for the joint of interest based on the angle formed by two 3D Cartesian vectors, e.g., (x, y, z) vectors, at the joint of interest.
- the system may display a triangle in a user interface illustrative of the two 3D Cartesian vectors and the angle for the joint.
- the angle for the joint may be calculated using mathematical techniques applying dot products, inverse trigonometric functions, and radians-to-degrees conversion.
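- For illustration, the following sketch implements the dot-product approach named above; the helper name and the example joint positions are assumptions.

```python
import math
import numpy as np

def joint_angle_deg(v1: np.ndarray, v2: np.ndarray) -> float:
    """Angle in degrees between two 3D Cartesian limb vectors meeting at a
    joint: cos(theta) = (v1 . v2) / (|v1||v2|), then radians to degrees."""
    cos_theta = np.dot(v1, v2) / (np.linalg.norm(v1) * np.linalg.norm(v2))
    return math.degrees(math.acos(np.clip(cos_theta, -1.0, 1.0)))

# Example: elbow angle from the 3D positions of shoulder, elbow, and wrist.
shoulder = np.array([0.0, 0.0, 0.0])
elbow = np.array([0.0, -0.3, 0.0])
wrist = np.array([0.25, -0.3, 0.0])
angle = joint_angle_deg(shoulder - elbow, wrist - elbow)   # 90.0 degrees
```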
- the described system may calculate a “between” value representative of the angle between a target joint and the next adjacent joint to the target joint.
- the between value may be calculated based on the angle between two directional vectors.
- the first directional vector may be calculated by subtracting the position of the previous adjacent joint from the position of the target joint.
- the second directional vector may be calculated by subtracting the position of the target joint from the position of the next adjacent joint.
- the described system may calculate an “extend” value representative of the angle between the direction from the target joint to the previous adjacent joint and the direction from the target joint to the next adjacent joint.
- the extend value may also be calculated as an angle between two directional vectors.
- the first directional vector may be calculated by subtracting the position of the target joint from the position of the previous adjacent joint.
- the second directional vector may be calculated by subtracting the position of the target joint from the position of the next adjacent joint.
- the described system may calculate an “axis” value representative of the measure of the angle formed by the target joint when compared to a given axis.
- the “axis” value may include a feature that allows a three-dimensional vector to offset the calculation and provide a customized measure of the angle.
- the axis value may be calculated using a 2D plane (i.e., the XY, YZ, or XZ plane) to create a directional vector to measure against the angle in 3D space.
- the axis value may allow a user to ignore unwanted values in the angle measurement. For example, the axis value may allow a user to ignore any horizontal movements when raising their arm vertically.
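- Building on the joint_angle_deg helper sketched above, the “between”, “extend”, and “axis” calculations might look as follows; the component-zeroing scheme for the axis value is an illustrative assumption.

```python
def between_value(prev_joint, target, next_joint):
    """Angle between the incoming limb direction (previous -> target)
    and the outgoing limb direction (target -> next)."""
    return joint_angle_deg(target - prev_joint, next_joint - target)

def extend_value(prev_joint, target, next_joint):
    """Angle at the target joint between the directions toward the
    previous and next adjacent joints."""
    return joint_angle_deg(prev_joint - target, next_joint - target)

def axis_value(target, next_joint, axis, ignore: str = "x"):
    """Angle of the target-to-next direction measured against a given axis,
    with one Cartesian component zeroed so that, e.g., horizontal movement
    is ignored when a user raises their arm vertically."""
    v = (next_joint - target).astype(float)
    v["xyz".index(ignore)] = 0.0      # project onto the remaining 2D plane
    return joint_angle_deg(v, axis)
```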
- the visualization module 102 c can generate visualizations of a user's body movement based on, for example, calculated joint data. Such visualizations may include 3D avatars, wireframes, and animated avatars/wireframes. In some embodiments, visualization module 102 c can generate an animated wireframe of the user's movement having a plurality of nodes determined using positional and/or joint data. The positional and/or joint data may be used to determine the movement of nodes within the animation so that the visualization may accurately illustrate real-time user movement (i.e., movement sensed in real-time) or pre-recorded user movement. In one embodiment, the visualization module 102 c may generate a visualization based on input from one or more cameras 107 in addition to positional and/or joint data.
- Data recorded by the cameras 107 may be transmitted via the network 110 to the server system 102 and processed by camera module 102 b before being integrated with positional and/or joint data by the visualization module 102 c .
- a visualization generated by the visualization module 102 c may be generated by an output generation module 102 d and displayed on one or more output devices 108 .
- the visualization module 102 c may be configured to generate and overlay multiple different visualizations onto each other.
- reference motion profiles may be stored in a database 104 .
- the visualization module 102 c can overlay (i.e., superimpose) an animation (or other visualization) of stored reference movements with an animation of real-time user movements to facilitate evaluation of the user's range of motion.
- an evaluation module 102 f of the server system 102 may be configured to provide an evaluation of a movement recorded by the sensors 106 as will be discussed in relation to FIG. 4 .
- the database management module 102 e of the server system 102 may be configured to interface with a database 104 coupled to the server system 102 .
- the database management module 102 e may be used to access, retrieve, and modify data stored in the database 104 .
- the database 104 may include one or more repositories that hold data and information.
- An exercise repository 104 a of the database 104 may store information about “ideal” (or “perfect” or “target”) movements associated with certain exercises that can be performed by a user. Such information is referred to herein as a “motion profile.”
- a motion profile may be embodied as positional data, joint data, visualizations, video, and/or any combination thereof. Motion profiles can be retrieved from the exercise repository 104 a and used to generate a visualization of the corresponding “ideal” movements that can be compared in real-time against a user's sensed motion.
- the exercise repository 104 a of the database 104 may store animation files along with metadata for an exercise.
- the stored “motion profile” may include information related to the motion including a title, file path, target joint, measurement type, benchmark angles, duration and billing code.
- the stored animation may include animation for some or all character profiles including a 3D avatar, wireframe, and overlay.
- the stored animation is in an Enflux recording animation file format (.enfl).
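- One plausible shape for such a repository record is sketched below; the field names mirror the metadata listed above but are otherwise assumptions.

```python
from dataclasses import dataclass
from typing import List

@dataclass
class StoredMotionProfile:
    """Hypothetical exercise-repository record for an "ideal" movement."""
    title: str                     # e.g., "Shoulder flexion"
    file_path: str                 # e.g., "exercises/shoulder_flexion.enfl"
    target_joint: str              # e.g., "shoulder"
    measurement_type: str          # e.g., "between", "extend", or "axis"
    benchmark_angles: List[float]  # target angles, in degrees
    duration_s: float              # expected duration of the movement
    billing_code: str              # CPT code used for automated billing
```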
- the database 104 may also include a billing information repository 104 b that is configured to include billing and insurance information related to particular exercises or groups of exercises.
- each exercise in the exercise repository 104 a may be associated with a Current Procedural Terminology (CPT) code that is compatible with standardized billing practices in the medical fields.
- the database 104 may also include a user history repository 104 c that is configured to store various information associated with particular users. For example, a user's body motion can be captured and stored as a motion profile within the repository 104 c . Similar to the “ideal” motion profiles discussed above, the user's historical motion profiles can be compared against subsequent real-time motion for evaluation purposes. Other user information that could be stored includes the user's account registration information, prescribed exercises and training/physical therapy protocols, and billing and insurance information.
- an output generation module 102 d may process the information obtained by the server 102 and create the appropriate output for one or more output computing devices 108 .
- an output computing device 108 may include a patient/athlete computer device, an insurance company computing device, and the like.
- the output generation module 102 d may produce an application, website, or the like for a user, medical professional, or insurance agency personnel to view data and information related to movement.
- the output generation module 102 d may generate prescriptions, directly bill insurance companies, and display exercises on an output computing device 108 . Example outputs generated by the output generation module 102 d are illustrated in FIGS. 7-21 .
- the server system 102 may include one or more processors or microprocessors coupled to one or more non-transitory memory devices, and may be adapted to perform the functions described herein.
- a server system 102 may be any special-purpose machine capable of storing and executing a set of computer-readable instructions (e.g., software) that specify actions to be taken to perform the functions described herein.
- the server system 102 may be a specialized component specifically designed to optimize the relationships set forth herein.
- the term “server” shall also be taken to include any collection of machines that individually or jointly execute a set (or multiple sets) of instructions to perform any one or more of the methodologies discussed herein. In some examples, at least two of the multiple server systems may be in different physical locations.
- a server system 102 is one example of a computer-readable storage medium.
- the term “computer-readable storage medium” should be taken to include a single medium or multiple media that store one or more sets of instructions.
- the term “computer-readable storage medium” shall also be taken to include any medium that is capable of storing or encoding a set of instructions for execution by the machine and that causes the machine to perform any one or more of the methodologies of the present disclosure.
- FIG. 2 shows an example of a computer system 200 , according to an aspect of the present disclosure.
- computer system 200 may be the same as or similar to a computer system utilized by a sensor device 106 , camera device 107 , output device 108 , and/or server system 102 of FIG. 1 .
- the computer system 200 is an example of a machine within which a set of instructions for causing the machine to perform any one or more of the methodologies, processes or functions discussed herein may be executed.
- the machine may be connected (e.g., networked) to other machines as described above.
- the machine may operate in the capacity of a server or a client machine in a client-server network environment, or as a peer machine in a peer-to-peer (or distributed) network environment.
- the machine may be any special-purpose machine capable of executing a set of instructions (sequential or otherwise) that specify actions to be taken by that machine for performing the functions described herein.
- each of the output computing devices 108 may be implemented by the example machine shown in FIG. 2 (or a combination of two or more of such machines).
- Example computer system 200 may include processing device 201 , memory 205 , data storage device 209 and communication interface 213 , which may communicate with each other via data and control bus 211 .
- computer system 200 may also include display device 215 and/or user interface 217 .
- Processing device 201 may include, without being limited to, a microprocessor, a central processing unit, an application specific integrated circuit (ASIC), a field programmable gate array (FPGA), a digital signal processor (DSP) and/or a network processor. Processing device 201 may be configured to execute processing logic 203 for performing the operations described herein. In general, processing device 201 may include any suitable special-purpose processing device specially programmed with processing logic 203 to perform the operations described herein.
- Computer system 200 may include communication interface device 213 , for direct communication with other computers (including wired and/or wireless communication), and/or for communication with network.
- computer system 200 may include display device 215 (e.g., a liquid crystal display (LCD), a touch sensitive display, etc.).
- computer system 200 may include user interface 217 (e.g., an alphanumeric input device, a cursor control device, etc.).
- computer system 200 may include data storage device 209 storing instructions (e.g., software) for performing any one or more of the functions described herein.
- Data storage device 209 may include any suitable non-transitory computer-readable storage medium, including, without being limited to, solid-state memories, optical media and magnetic media.
- FIG. 3 is a flow diagram illustrating a method 300 for generating a visualization for a character profile (3D avatar, wireframe or overlay) that may be representative of a user's body motion, according to one exemplary embodiment.
- Method 300 may begin at step 302 .
- server system 102 may receive motion data from one or more input devices.
- server system 102 may receive motion data from one or more sensors 106 positioned on or proximate to a body of user 112 .
- Server system 102 may further receive motion data from one or more camera devices 107 capturing movement of user 112 .
- Server system 102 may receive the motion data via network 110 .
- visualization module 102 c may receive the motion data captured by one or more sensors 106 via sensor module 102 a.
- visualization module 102 c may receive motion captured by one or more camera devices 107 via camera module 102 b.
- sensor module 102 a and camera module 102 b may transmit motion data to visualization module 102 c in real-time or near real-time. In some embodiments, sensor module 102 a and camera module 102 b may transmit motion data to visualization module 102 c periodically (i.e., at pre-defined points during the day). In some embodiments, sensor module 102 a and camera module 102 b may transmit motion data only when prompted by visualization module 102 c.
- server system 102 may convert the received motion data to positional data.
- visualization module 102 c may convert the received motion data to positional data that describes the position of one or more joints, limbs, and/or other reference points on the user's body.
- converting sensor motion data to positional data may be based on sensor 106 location, dimensions of the user's body, and/or dimensions of clothing worn by the user.
- Positional data may be used to locate one or more joints of interest such as elbow, wrist, shoulder, hip, knee, and ankle.
- visualization module 102 c may convert motion data to positional data via one or more coordinate transformations.
- for example, sensor motion data represented as (roll, pitch, yaw) values may be converted to 3D Cartesian positional data, e.g., (x, y, z) values.
- motion data received from the input devices may be converted from (roll, pitch, yaw) values into a Quaternion representation.
- motion data represented as Quaternions may then be converted to positional data and/or joint data.
- server system 102 may calculate joint data based on at least one of the motion data and the positional data.
- Visualization module 102 c may obtain joint data from positional data by analyzing the orientation of one or more sensors with respect to the user's limbs and joints. In some embodiments, this may require knowledge of a user's limb length. This may include limb calculations based on a user's height and weight.
- visualization module 102 c may calculate joint data based, in part, on statistical population information and anatomy research.
- Joint data may include a measure of the angle formed at a joint by two limbs that connect at the joint.
- calculating the angle formed at the joint may be based on a triangulation of the angle formed by the position of at least two sensors 106 with respect to the floor or another reference point.
- calculating the angle formed at the joint may be based on a triangulation of the angle formed by the position of the at least two limbs with respect to the floor or another reference point.
- visualization module 102 c may calculate an angle for the joint of interest based on the angle formed by two three-dimensional vectors at the joint of interest.
- the angle for the joint may be calculated using mathematical techniques such as dot product operations, inverse trigonometric functions, radians-to-degrees conversion, and the like.
- joint data may further include a “between” value representative of the angle between a target joint and the next adjacent joint to the target joint.
- Visualization module 102 c may calculate the between value based on the angle between two directional vectors.
- the first directional vector may be calculated by subtracting the position of the previous adjacent joint from the position of the target joint.
- the second directional vector may be calculated by subtracting the position of the target joint from the position of the next adjacent joint.
- joint data may further include an “extend” value representative of the angle between the direction from the target joint to the previous adjacent joint and the direction from the target joint to the next adjacent joint.
- Visualization module 102 c may calculate the extend value based on an angle between two directional vectors. The first directional vector may be calculated by subtracting the position of the target joint from the position of the previous adjacent joint. The second directional vector may be calculated by subtracting the position of the target joint from the position of the next adjacent joint.
- joint data may further include an “axis” value representative of the measure of the angle formed by the target joint when compared to a given axis.
- the axis value may include a feature that allows a three-dimensional vector to offset the calculation and provide a customized measure of the angle.
- Visualization module 102 c may calculate the axis value using two dimensional planes to create a directional vector to measure against the angle in three-dimensional space.
- server system 102 may generate a visualization by applying one or more of the motion data, positional data, and joint data to a character profile (e.g., a 3D avatar, wireframe, or overlay).
- the character profile may be stored in the database 104 .
- FIG. 4 is a flow diagram illustrating a method 400 for evaluating a user's body motion based on motion data obtained from a plurality of sensors attached to the user, according to an exemplary embodiment.
- the one or more steps discussed in conjunction with method 400 may provide an objective evaluation of a user's movement by comparing a new recorded user movement with an “ideal” movement.
- the “ideal” movement may be a recording of a movement performed by a professional athlete or physical therapist stored in the exercise repository 104 a.
- the new recorded user movement may be compared with a user's past recorded movements stored in the user history repository 104 c.
- the output generation module 102 d of the server system 102 may retrieve a stored motion profile from the database 104 via the database management module 102 e.
- the stored motion profile may be a part of the exercise repository 104 a and/or the user history repository 104 c.
- the stored motion profile that is used for the evaluation may be specified by a user (patient/athlete), physical therapist, medical professional or the like.
- the stored motion profile may be specified through an interactive graphical user interface, application, and the like.
- the user, physical therapist, or medical professional who specifies the stored motion profile to use may use a search feature that identifies the stored movements in the database 104 that may be used as a part of the evaluation.
- output generation module 102 d may retrieve a character profile generated for the user.
- output generation module 102 d may retrieve the character profile generated above, in conjunction with method 300 .
- Output generation module 102 d may extract a motion profile from sensor motion data received from one or more wireless sensors configured to obtain motion information for a user by way of the visualization module 102 c in accordance with the methods discussed above.
- at another step 405 , an evaluation module 102 f of server system 102 may then objectively evaluate the extracted motion profile by comparing it with the stored motion profile.
- evaluation module 102 f may compare parameters of the extracted motion profile, such as position data, angle range of the joint, velocity, acceleration, jerk, and peak spread (the furthest angle of separation between the two limbs at the joint), to the corresponding parameters of the stored motion profile.
- evaluation module 102 f may then generate an evaluation of the newly recorded user movement.
- the objective evaluation of the extracted motion profile may involve a comparison of one or more parameters of the extracted motion profile with that of the retrieved pre-defined target motion profile. Possible parameters may include position data, angle range of the joint, velocity, acceleration, jerk, and peak spread or the furthest angle of separation between the two limbs at the joint.
- the evaluation module 102 f may provide an objective means for identifying points of pain or difficulty in a patient or athlete's movement by locating points within the movement that are not smooth in a trace of the movement over time.
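- The following sketch illustrates such an objective comparison for a joint-angle trace, flagging frames with high jerk as candidate points of pain or difficulty; the sampling rate and the jerk threshold are illustrative assumptions, not values from the disclosure.

```python
import numpy as np

def evaluate_motion(user_angles: np.ndarray, target_angles: np.ndarray,
                    fps: float = 125.0, jerk_threshold: float = 5_000.0) -> dict:
    """Compare a recorded joint-angle trace (degrees per frame) against a
    stored target trace sampled at the same rate."""
    dt = 1.0 / fps
    velocity = np.gradient(user_angles, dt)       # deg/s
    acceleration = np.gradient(velocity, dt)      # deg/s^2
    jerk = np.gradient(acceleration, dt)          # deg/s^3
    return {
        "range_user": float(user_angles.max() - user_angles.min()),
        "range_target": float(target_angles.max() - target_angles.min()),
        "peak_spread_user": float(user_angles.max()),
        # Frames where the trace is not smooth may indicate pain/difficulty.
        "rough_frames": np.flatnonzero(np.abs(jerk) > jerk_threshold).tolist(),
    }
```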
- FIG. 5 is a system diagram showing a body motion evaluation system 500 , according to an exemplary embodiment.
- Body motion evaluation system 500 includes a database 502 , application programming interfaces (API) endpoints 504 , a user account module 506 , an orders module 508 , a medical code module 510 , a reporting module 512 , and one or more graphical user interfaces (GUIs) 520 .
- a user such as an athlete or patient 522 may access the body motion evaluation system 500 by way of a graphical user interface 520 .
- a different graphical user interface may be provided to a clinician 524 having administrative privileges.
- the athlete or patient 522 and clinician 524 may be privy to different portions of the database 502 , also known as a “web locker.”
- the body motion evaluation system 500 may include the ability for users to place or fulfill orders 508 such as providing a user with a prescription for a physical therapy regimen.
- the system 500 may also report user compliance (participation in a prescribed physical therapy regimen) to an insurance company by way of the reporting module 512 .
- the system 500 may also be configured to bill an insurance company for the movements included in the prescribed physical therapy regimen by way of a medical code module 510 .
- the system may also include one or more application programming interfaces 504 that allow the system 500 to communicate with administrative computers 514 , patient sensors 516 and insurance companies 518 .
- FIG. 6 is a system diagram showing a body motion evaluation system 600 , according to another embodiment of the present disclosure.
- the body motion evaluation system 600 includes a database 602 , a performance module 604 , and a therapy module 606 .
- the performance module 604 may be configured to improve the performance of an athlete.
- the therapy module 606 may be configured to provide physical therapy to an athlete or patient.
- the performance module 604 may be configured to interface with users by way of a desktop 608 a, web browser 608 b, Android® application 608 c, or Apple® based operating systems 608 d.
- the therapy module 606 may be configured to interface with users by way of a desktop 610 a, web browser 610 b, Android® application 610 c, or Apple® based operating systems 610 d.
- FIGS. 7-21 show illustrative user interfaces (UIs) that may be used within a body motion evaluation system, according to embodiments of the present disclosure.
- a user may be asked to create a profile that includes information such as their insurance carrier, height, and weight.
- the insurance carrier information may be used in order to automatically bill insurance companies for physical therapy treatments.
- the height and weight information may be used to generate the joint data from the positional data and/or sensor motion data.
- the user information provided when creating a profile may be stored within a user history repository 104 c and configure settings in the billing information repository 104 b.
- a first panel 801 may provide a representation of an “ideal” motion
- a second panel 803 may provide a live tracking representation of a user's current body position
- a third panel 805 may provide a representation of the sensors located on the user's body
- a fourth panel 807 may provide information related to the cameras.
- the user interface may be configured with a menu bar 901 that displays information related to the state of the sensors.
- in the depicted example, the sensors on the shirt are streaming information while the sensors in the pants are disconnected.
- the menu bar 901 may include options to calibrate the wireless sensors, reset the orientation, align the sensors, and manage Bluetooth® data transfer settings.
- the first panel 801 may provide a representation of an “ideal” motion.
- a representation may include a video recording, a wireframe animation, a 3D avatar animation, an overlay animation, and the like.
- the “ideal” motion may be selected from one or more stored motions.
- the user interface may provide a drop down menu 809 that allows a user to select stored motions corresponding to a particular joint such as the shoulder, knee, or hip or belonging to a particular set of motions such as a functional movement library.
- the user may be able to search 811 through a collection of stored motions to locate an “ideal” motion.
- the stored motions may be located in an exercise repository 104 a of the database 104 .
- the “ideal” motion may correspond to a previously captured user motion that is stored in a user history repository 104 c of the database 104 .
- the user interface may be configured to display a recording of the representation of the selected “ideal” motion in a first panel 801 .
- the user may also select a joint or sensor of interest 805 a in the second panel 805 .
- the angle formed by the selected joint or sensor of interest may be calculated and displayed 801 a for the “ideal” motion in a first panel 801
- the angle for the selected joint or sensor of interest may also be calculated and displayed 803 a for the user's motion in real-time 803 .
- a user has opted to view the angle information for the shoulder.
- the user may also adjust the orientation of the representation (i.e., the 3D avatar, wireframe, or overlay) so that the joint or sensor of interest is more clearly visible.
- the user interface may be configured so that a user's live motion is recorded and available for playback in the second panel 803 . Accordingly, a user may be able to play a recording of the “ideal” motion synchronized with a recording of the user's motion.
- the user interface may include a motion representation settings panel 813 that allows a user to select the character representation they would like to see for the “ideal” motion and/or the user motion.
- the character representation may be a wireframe, overlay, or 3D avatar.
- the motion representation settings panel 813 may also include the option of displaying camera data from a webcam that populates panel 807 and angle data that may be superimposed on the motion representations of the “ideal” motion and the user motion.
- “ideal” motion and/or user motion may be represented as a wireframe and/or overlay.
- the shoulder flexion angle may be calculated based on a triangle drawn between the arm, a position on the user's body, and the joint of interest, the shoulder.
- the user interface may include an evaluation panel 815 that provides a graphical representation comparing a user's recorded movement with an “ideal” movement. As illustrated, the comparison may display the angle of interest over time in degrees per frame. Suitable comparison parameters may also include position, velocity, and acceleration over time.
- the third panel 805 providing a representation of the sensors located on the user's body may also include one or more menus that allow a user to select various configurations for measuring the angles of interest.
- the user interface in connection with the output generation module 102 d may produce an evaluation or report.
- the evaluation may provide a comparison of the user's recorded movement with an “ideal” movement.
- the evaluation may display the user's personal information, the time and date the report was generated, the time and date of the movement being evaluated, the movement of interest, the timing for the movement of interest, where the measurement was taken from, what the maximum angle of measurement was, and a graph comparing the user's recorded movement with an “ideal” movement.
- the graph may display the angle of interest over time in degrees per frame, position, velocity, and/or acceleration over time. From the graph it may be possible to identify points of pain based on the shape of the graph.
- the evaluation or report may also include time-stamping so the user may see the duration and frequency of their exercise routine. This may allow a user's progress to be tracked per session and over selected extended periods of time (e.g., day, week, month, year).
- a clinician such as a physical therapist or trainer may use the user interface to prescribe a series of exercises for a patient or athlete to complete.
- the clinician may input information regarding the user's condition, and select one or more exercises that are stored in the exercise repository 104 a in the depicted prescription portal 817 .
- the data input by the clinician may then be exported into a prescription email or prescription document that is provided to the user.
- the prescription document 1900 may be automatically generated based on the selections made by the clinician.
- the prescription document 1900 may automatically include a description and images that correspond to the selected exercises.
- the user interface may also include a billing portal 2000 .
- a clinician or user may select one or more exercises that were performed with the user.
- the billing portal 2000 may then retrieve billing information such as CPT codes from the billing information repository 104 b in order to automatically generate an invoice compatible with insurance company computing systems.
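- A minimal sketch of that billing flow is shown below, assuming the billing repository maps exercise titles to (CPT code, fee) pairs; the names, codes, and fees are illustrative.

```python
def build_invoice(selected_exercises, billing_repository):
    """Assemble invoice line items for the exercises performed with the
    user, looking up each exercise's CPT code and fee."""
    lines = [{"exercise": title,
              "cpt_code": billing_repository[title][0],
              "fee": billing_repository[title][1]}
             for title in selected_exercises]
    return {"line_items": lines, "total": sum(l["fee"] for l in lines)}

# Example with made-up repository contents:
repo = {"Shoulder flexion": ("97110", 45.00)}
invoice = build_invoice(["Shoulder flexion"], repo)
```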
- the user interface may also be used for sports-specific performance training applications corresponding to the performance module 604 .
- the user interface may display a target sport-specific ideal motion and a live tracking of the athlete's motion.
- the user interface may be configured differently for a sports-specific performance training application corresponding to the performance module 604 than for a therapy based application.
- the output generation module may generate applications specific to a user.
- User specific applications may be tailored to the needs of the veterans rehabilitation market, the fitness and athletic improvement market, the mobile and personal device digital products market and the medical market.
- the systems and methods described herein may be used as a baseline diagnostic and informational tool within a hospital or other medical care environment.
- the systems and methods described herein may be used in connection with regenerative medicine groups as a means to track the effect of and outcomes from stem cell injections or other physical therapy treatments.
- the user compliance and billing aspects of the systems and methods described herein may be useful for medical care provided in connection with worker's compensation.
- the systems and methods described herein may be used as a supportive pre- or post-op information resource for orthopedics or other medical professionals.
- the systems and methods described herein may be used in an in-home setting and allow for in-home rehabilitation after orthopedic surgery.
- in conventional in-home rehabilitation or center-based rehabilitation, physical therapy is often unsuccessful in aiding a patient or athlete in obtaining their full range of motion due to non-compliance and lack of oversight from medical professionals.
- as a result, many patients and athletes may undergo second corrective surgeries.
- the systems and methods described herein may provide remote tracking, and guided physical therapy routines with oversight from medical professionals, thus assisting with patient and athlete compliance and providing an improved quality of physical therapy.
- the systems and methods described herein may be used by wellness providers such as physical therapists, massage therapists, or chiropractors.
- wellness providers may track patient/client motion prior to, during, and after treatment and provide the information to patients and clients using the systems and methods described herein.
- systems and methods described herein may be provided to primary care physicians as a part of routine patient monitoring during yearly examinations.
- the systems and methods described herein may involve one or more of the following steps in any suitable order.
- a user may put on a body suit containing wireless sensors.
- the sensors on the body suit may be paired to a computing device.
- a user may select a target or ideal motion profile for comparison.
- the body suit may start recording a user's movement while the user is wearing the suit.
- the suit transmits data to the computing device while recording the user's movement.
- angle data is recorded based on the target joint and measurement type.
- a suit may stream data to the computing device.
- rotational data corresponding to the shirt or pants of the body suit may be transmitted over Bluetooth® or other suitable communication protocols to an application.
- the rotational data may be represented as quaternions.
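- The recording loop implied by these steps might look like the following sketch; sensor_stream and angle_fn stand in for the Bluetooth transport and the kinematics layers, which the steps above do not detail.

```python
def record_session(sensor_stream, angle_fn):
    """Consume streamed samples (one quaternion per sensor per frame) and
    reduce each to the angle of the target joint for the chosen
    measurement type, keeping the trace for later comparison."""
    return [angle_fn(sample) for sample in sensor_stream]

# Example: a two-frame stream where angle_fn just extracts a stored angle.
frames = [{"shoulder_angle": 10.0}, {"shoulder_angle": 42.5}]
trace = record_session(frames, lambda s: s["shoulder_angle"])
```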
Landscapes
- Health & Medical Sciences (AREA)
- Engineering & Computer Science (AREA)
- Life Sciences & Earth Sciences (AREA)
- Physics & Mathematics (AREA)
- Theoretical Computer Science (AREA)
- General Engineering & Computer Science (AREA)
- General Health & Medical Sciences (AREA)
- General Physics & Mathematics (AREA)
- Surgery (AREA)
- Veterinary Medicine (AREA)
- Biomedical Technology (AREA)
- Heart & Thoracic Surgery (AREA)
- Medical Informatics (AREA)
- Molecular Biology (AREA)
- Biophysics (AREA)
- Animal Behavior & Ethology (AREA)
- Pathology (AREA)
- Public Health (AREA)
- Computer Vision & Pattern Recognition (AREA)
- Oral & Maxillofacial Surgery (AREA)
- Dentistry (AREA)
- Physiology (AREA)
- Nuclear Medicine, Radiotherapy & Molecular Imaging (AREA)
- Radiology & Medical Imaging (AREA)
- Human Computer Interaction (AREA)
- Geometry (AREA)
- Multimedia (AREA)
- Physical Education & Sports Medicine (AREA)
- Measurement Of The Respiration, Hearing Ability, Form, And Blood Characteristics Of Living Organisms (AREA)
- User Interface Of Digital Computer (AREA)
Abstract
Description
- This application claims priority to U.S. Provisional Application Ser. No. 62/540,689, filed Aug. 3, 2017, which is hereby incorporated by reference in its entirety.
- The present disclosure is directed towards computer-based systems and methods for evaluating a user's body motion based on motion data obtained via a series of wireless sensors attached to a user's body. The motion data may be objectively evaluated by the computer-based system.
- Conventionally, a physical therapist or trainer may measure the range of motion and other motion data for a particular joint before, during, and/or after an activity. For example, range of motion may be measured manually using a goniometer. However, goniometer measurements may be inaccurate due to poor placement of the goniometer, interference due to uncontrolled movements by the user, and inability to properly obtain measurements using the goniometer. Moreover, goniometer measurements may be subject to both inter-user and intra-user variability. Put simply, a physical therapist or trainer may not measure the range of motion using the same method every time they see the patient, and one physical therapist or trainer may have a different measurement technique than a second physical therapist or trainer. As range of motion and other movement features are important factors for benchmarking progress in patients and athletes, it is critical to have the most accurate and objective measurement of range of motion and other movement features.
- Indeed, as discussed by Nussbaumer et al. (Nussbaumer, Silvio, et al. “Validity and test-retest reliability of manual goniometers for measuring passive hip range of motion in femoroacetabular impingement patients.” BMC musculoskeletal disorders 11.1 (2010): 194.), major drawbacks of goniometry for hip measurements are that the starting position, the center of rotation, the long axis of the limb and the true vertical and horizontal positions can only be visually estimated and that conventional goniometers must be held with two hands, leaving neither hand free for stabilization of the body or the proximal part of the joint. There are also difficulties in monitoring joints that are surrounded by large amounts of soft tissue, such as the hip. The validity (i.e., the degree to which a measurement actually measures what it claims to measure) and reliability (i.e., the degree to which a measurement is consistent and stable) of manual goniometers have therefore been questioned, especially for measuring hip flexion.
- Accordingly, there remains a need for systems and methods that evaluate a user's body motion in an automated and objective manner, so that the measurements are not prone to human error or to the variation introduced by manual recording.
- In one embodiment, a computer-implemented method is disclosed herein. A server system receives motion data from one or more input devices. The motion data corresponds to movement of a user. The server system generates a motion profile based on at least the motion data received from the one or more input devices. The server system retrieves a pre-defined target motion profile from a database structure. The server system objectively evaluates the generated motion profile by comparing one or more parameters of the generated motion profile with one or more parameters of the retrieved pre-defined target motion profile.
- In another embodiment, a computer-implemented method is disclosed herein. A server system receives motion data from a plurality of input devices positioned about a user while performing a movement. The server system generates a motion profile for the user based at least on the motion data, by calculating a position of a joint of the user's body based on the motion data, calculating positions of at least two reference points of the user's body based on the motion data, and calculating an actual range of motion for the joint based on the position of the joint and the positions of the at least two reference points. The server system retrieves a reference range of motion from a database storing one or more reference ranges of motion for one or more movements. The server system compares the reference range of motion to the actual range of motion to objectively evaluate the user's movement.
- In another embodiment, a system is disclosed herein. The system includes one or more input devices, a processor, and a memory. The one or more input devices are configured to capture a movement of a user. The processor is in communication with the one or more input devices. The memory has programming instructions stored thereon, which, when executed by the processor, cause the processor to perform an operation. The operation includes receiving motion data from the one or more input devices, the motion data corresponding to movement of a user. The operation further includes generating a motion profile based on at least the motion data received from the one or more input devices. The operation further includes retrieving a pre-defined target motion profile from a database structure. The operation further includes objectively evaluating the generated motion profile by comparing one or more parameters of the generated motion profile with one or more parameters of the retrieved pre-defined target motion profile.
- The present disclosure relates to computer based systems and methods for evaluating a user's body motion in an automated and objective fashion. To that end, a computer-implemented method for automated movement evaluations may extract a motion profile from motion data received from one or more wireless sensors configured to obtain motion information of a user, retrieve a pre-defined target motion profile from a database structure, and objectively evaluate the extracted motion profile by comparing one or more parameters of the extracted motion profile with one or more corresponding parameters of the retrieved pre-defined target motion profile. In one embodiment, the computer-implemented method may be configured to take one or more actions as a result of the objective evaluation.
- In one embodiment, the systems and methods described herein may be integrated with the administrative computer systems used by physical therapists, medical professionals, insurance companies and the like. Accordingly, the systems and methods described herein may provide automated coding and billing services to third party providers, and monitor and convey information related to patient compliance and progression.
- In one embodiment, the systems and methods described herein may provide means for improving a patient or athlete's performance of a particular movement.
- In one embodiment, the computer based system for automated movement evaluations may be integrated with administrative medical computer systems and provide automated billing, insurance, and prescription assistance.
- Many aspects of the present embodiments may be better understood with reference to the following drawings. The components in the drawings are not necessarily to scale, emphasis instead being placed upon illustrating example embodiments:
-
FIG. 1 is a system diagram showing a body motion evaluation system, according to an embodiment of the present disclosure. -
FIG. 2 is a system diagram of an illustrative computer system, according to another embodiment of the present disclosure. -
FIG. 3 is a flow diagram showing a process for generating a graphical representation of a user's body based on motion data obtained from a plurality of sensors attached to the user, according to an embodiment of the present disclosure. -
FIG. 4 is a flow chart showing a process for evaluating a user's body motion based on motion data obtained from a plurality of sensors attached to the user, according to an embodiment of the present disclosure. -
FIG. 5 is system diagram showing a body motion evaluation system, according to another embodiment of the present disclosure. -
FIG. 6 is system diagram showing a body motion evaluation system, according to another embodiment of the present disclosure. -
FIGS. 7-21 show illustrative user interfaces (UIs) that may be used within a body motion evaluation system, according to embodiments of the present disclosure. - The present disclosure relates to computer-based systems and methods for automated movement evaluations. To that end, a computer-implemented method for automated movement evaluations may extract a motion profile from motion data received from one or more wireless sensors configured to obtain motion information of a user, retrieve a pre-defined target motion profile from a database structure, and objectively evaluate the extracted motion profile by comparing one or more parameters of the extracted motion profile with one or more corresponding parameters of the retrieved pre-defined target motion profile. In one embodiment, the computer-implemented method may be configured to take one or more actions as a result of the objective evaluation.
-
FIG. 1 is a system diagram illustrating a body motion evaluation system 100, according to an exemplary embodiment. Body motion evaluation system 100 may include a processing system (or “server”) 102, a database 104, a plurality of sensors 106a-106n (106 generally), and a plurality of output devices 108a-108n (108 generally). In some embodiments, the system 100 may also include a plurality of cameras 107a-107n (107 generally). The sensors, cameras, and output devices may be in communication with the server 102 via a network 110. - Network 110 may be of any suitable type, including individual connections via the Internet, such as cellular or Wi-Fi networks. In some embodiments, network 110 may connect terminals, services, and mobile devices using direct connections, such as radio frequency identification (RFID), near-field communication (NFC), Bluetooth™, low-energy Bluetooth™ (BLE), Wi-Fi™, ZigBee™, ambient backscatter communication (ABC) protocols, USB, WAN, or LAN. Because the information transmitted may be personal or confidential, security concerns may dictate that one or more of these types of connection be encrypted or otherwise secured. In some embodiments, however, the information being transmitted may be less personal, and therefore, the network connections may be selected for convenience over security.
-
Network 110 may include any type of computer networking arrangement used to exchange data. For example, network 110 may be the Internet, a private data network, a virtual private network using a public network, and/or other suitable connection(s) that enable components in body motion evaluation system 100 to send and receive information between the components of body motion evaluation system 100.
- The sensors 106 may be attached to the body of a user 112. In some embodiments, the sensors may be coupled to one or more articles of clothing that can be worn by the user 112. For example, the sensors 106 may be attached to a tight-fitting body suit that can be worn by the user 112. In another example, the sensors 106 may be attached to a sleeve (e.g., arm compression sleeve, calf compression sleeve, quadriceps/hamstring compression sleeve, etc.). In one embodiment, sensors 106 may be located proximate a user's wrist, shoulder and/or back.
- Data generated by the sensors 106 and cameras 107 may be provided to the server 102 via the network 110. The illustrative server 102 includes a sensor module 102a, camera module 102b, visualization module 102c, output generation module 102d, database management module 102e, and an evaluation module 102f. The operations described herein may be achieved with any suitable number of modules.
- Each of the sensors 106 is configured to generate motion data responsive to the user's body motion. In one embodiment, each of the one or more sensors 106 may include an accelerometer, magnetometer, and/or gyroscope. An accelerometer may be configured to measure acceleration forces at the location of each sensor 106. In some embodiments, the acceleration forces may be measured in units of meters per second squared (m/s²). In some embodiments, the magnetometer may be configured to measure magnetism and be normalized for Earth's magnetic field. In some embodiments, a gyroscope may be configured to measure angular velocity in units of radians per second (rad/s). In some embodiments, a sensor's accelerometer, magnetometer, and/or gyroscope may be provided as MEMS (microelectromechanical systems) inertial sensors. In some embodiments, each sensor 106 may generate roll, pitch and yaw data responsive to the user's body motion at the point on the user's body where the sensor is located.
- Each sensor 106 may generate motion data responsive to the user on any suitable time-scale. For example, in one embodiment the sensors 106 may generate motion data on a time-scale of 1,000 data points per second. Sensor motion data may be processed by one or more filtering and/or down-sampling techniques. In some embodiments, Kalman filters may be used. In some embodiments, motion data may be down-sampled to a time-scale of 125 data points per second. In some embodiments, the filtering and/or down-sampling techniques may be performed at the site of the sensor 106, prior to each sensor 106 wirelessly transmitting motion data to the server system 102. In some embodiments, the entirety of the obtained sensor motion data may be transmitted to the server system 102 and then filtered and/or down-sampled at the server system 102 by a sensor module 102a.
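- By way of illustration only, the following minimal Python sketch shows one way the down-sampling step could be performed, assuming simple block averaging from 1,000 to 125 samples per second; the disclosure leaves the actual filter choice open (e.g., Kalman filters), so the function and its name are hypothetical:

```python
import numpy as np

def downsample_motion(samples: np.ndarray, in_rate: int = 1000, out_rate: int = 125) -> np.ndarray:
    """Reduce an (N, 3) array of (roll, pitch, yaw) samples from in_rate Hz
    to out_rate Hz by averaging non-overlapping blocks of samples."""
    factor = in_rate // out_rate              # 8 input samples per output point
    n_blocks = len(samples) // factor
    trimmed = samples[: n_blocks * factor]
    return trimmed.reshape(n_blocks, factor, samples.shape[1]).mean(axis=1)

# One second of simulated 1,000 Hz sensor data becomes 125 data points.
raw = np.random.default_rng(0).normal(size=(1000, 3))
print(downsample_motion(raw).shape)           # (125, 3)
```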
- In some embodiments, the sensors 106 may transmit motion data to the server system 102 wirelessly via network 110. For example, sensors 106 may transmit motion data to the server system 102 using Bluetooth®, Wi-Fi, near-field communication (NFC), ZigBee®, or any other suitable wireless communication means.
- Although the embodiment illustrated in FIG. 1 includes a single server system 102, those skilled in the art may readily understand that multiple server systems may be used. In some embodiments, the multiple server systems may cooperate to evaluate a user's body motion. For example, a sensor module within a first server system may perform an initial processing of sensor motion data, and a second server system may receive and use the initially processed motion data to perform an evaluation of the user's body motion.
- In one embodiment, motion data obtained from the sensors 106 may be converted to “positional data” describing the position of one or more joints, limbs, and/or other reference points on the user's body. Converting sensor motion data to positional data can be based on knowledge of where the sensors 106 are attached to the user's body, dimensions of the user's body, and/or dimensions of clothing worn by the user onto which the sensors are attached. The position data may, in turn, be used to calculate information (or “joint data”) about one or more joints of interest, such as range of motion, velocity, acceleration, or other aspects of angular movements about a joint. Example joints of interest may include the elbow, wrist, shoulder, hip, knee, and ankle.
- In one embodiment, the conversion from sensor motion data to positional data for the joints may involve one or more coordinate transformations. For example, in one embodiment, sensor motion data represented as (roll, pitch, yaw) values may be converted to 3D Cartesian positional data, e.g., (x, y, z) values, using forward kinematics. In one embodiment, motion data received from the sensors 106 may be converted from (roll, pitch, yaw) values into a Quaternion representation. In one embodiment, motion data represented as Quaternions may then be converted to positional data and/or joint data.
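- As a hedged illustration of the (roll, pitch, yaw)-to-Quaternion step, the sketch below uses the standard yaw-pitch-roll (ZYX) conversion formula; the rotation convention actually used by the sensors is not specified in the disclosure, so it is an assumption here:

```python
import math

def euler_to_quaternion(roll: float, pitch: float, yaw: float):
    """Convert (roll, pitch, yaw) in radians to a unit quaternion (w, x, y, z),
    assuming the common ZYX (yaw-pitch-roll) rotation sequence."""
    cr, sr = math.cos(roll / 2), math.sin(roll / 2)
    cp, sp = math.cos(pitch / 2), math.sin(pitch / 2)
    cy, sy = math.cos(yaw / 2), math.sin(yaw / 2)
    w = cr * cp * cy + sr * sp * sy
    x = sr * cp * cy - cr * sp * sy
    y = cr * sp * cy + sr * cp * sy
    z = cr * cp * sy - sr * sp * cy
    return (w, x, y, z)

print(euler_to_quaternion(0.0, 0.0, math.pi / 2))  # 90-degree yaw about the vertical axis
```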
- In one embodiment, the joint data may include a measure of the angle formed at a joint by two limbs that connect at the joint. In one embodiment, the angle formed at the joint may be calculated based on a triangulation of the angle formed by the position of at least two sensors 106 with respect to the floor or another reference point. In one embodiment, the angle formed at the joint may be calculated based on a triangulation of the angle formed by the position of the at least two limbs with respect to the floor or another reference point. Joint data may be obtained from positional data by analyzing the orientation of one or more sensors with respect to the user's limbs and joints. In one embodiment, this may require knowledge of a user's limb length. This may include limb calculations based on a user's height and weight. In one embodiment, joint data may be calculated based on statistical population information and anatomy research.
- In one embodiment, the described system may calculate a “between” value representative of the angle between a target joint and the next adjacent joint to the target joint. The between value may be calculated based on the angle between two directional vectors. The first directional vector may be calculated by subtracting the position of the previous adjacent joint from the position of the target joint. The second directional vector may be calculated from subtracting the position of the target joint from the position of the target joint.
- In one embodiment, the described system may calculate a “between” value representative of the angle between a target joint and the next adjacent joint to the target joint. The between value may be calculated based on the angle between two directional vectors. The first directional vector may be calculated by subtracting the position of the previous adjacent joint from the position of the target joint. The second directional vector may be calculated by subtracting the position of the target joint from the position of the next adjacent joint.
- In one embodiment, the described system may calculate an “extend” value representative of the angle between the direction of the target joint to the next adjacent joint to the target joint. The extend value may also be calculated as an angle between two directional vectors. The first directional vector may be calculated by subtracting the position of the target joint from the position of the previous adjacent joint. The second directional vector may be calculated by subtracting the position of the target joint from the position of the next adjacent joint.
- The
- The visualization module 102c can generate visualizations of a user body movement based on, for example, calculated joint data. Such visualizations may include 3D avatars, wireframes, and animated avatars/wireframes. In some embodiments, visualization module 102c can generate an animated wireframe of the user's movement having a plurality of nodes determined using positional and/or joint data. The positional and/or joint data may be used to determine the movement of nodes within the animation so that the visualization may accurately illustrate real-time user movement (i.e., movement sensed in real-time) or pre-recorded user movement. In one embodiment, the visualization module 102c may generate a visualization based on input from one or more cameras 107 in addition to positional and/or joint data. Data recorded by the cameras 107 may be transmitted via the network 110 to the server system 102 and processed by camera module 102b before being integrated with positional and/or joint data by the visualization module 102c. A visualization generated by the visualization module 102c may be output by an output generation module 102d and displayed on one or more output devices 108.
- In some embodiments, the visualization module 102c may be configured to generate and overlay multiple different visualizations onto each other. For example, as discussed below, reference motion profiles may be stored in a database 104. The visualization module 102c can overlay (i.e., superimpose) an animation (or other visualization) of stored reference movements with an animation of real-time user movements to facilitate evaluation of the user's range of motion.
- In one embodiment, an evaluation module 102f of the server system 102 may be configured to provide an evaluation of a movement recorded by the sensors 106, as will be discussed in relation to FIG. 4.
- The database management module 102e of the server system 102 may be configured to interface with a database 104 coupled to the server system 102. The database management module 102e may be used to access, retrieve, and modify data stored in the database 104. The database 104 may include one or more repositories that hold data and information.
- An exercise repository 104a of the database 104 may store information about “ideal” (or “perfect” or “target”) movements associated with certain exercises that can be performed by a user. Such information is referred to herein as a “motion profile.” A motion profile may be embodied as positional data, joint data, visualizations, video, and/or any combination thereof. Motion profiles can be retrieved from the exercise repository 104a and used to generate a visualization of the corresponding “ideal” movements that can be compared in real-time against a user's sensed motion.
- In one embodiment, the exercise repository 104a of the database 104 may store animation files along with metadata for an exercise. The stored “motion profile” may include information related to the motion, including a title, file path, target joint, measurement type, benchmark angles, duration, and billing code. In one embodiment, the stored animation may include animation for some or all character profiles, including a 3D avatar, wireframe, and overlay. In one embodiment, the stored animation is in an Enflux recording animation file format (.enfl).
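- A purely illustrative sketch of a container for the stored motion-profile metadata follows; the field names mirror the list above, while the class itself, the file path, and the CPT code in the example are hypothetical and do not reflect the actual .enfl schema:

```python
from dataclasses import dataclass, field
from typing import List

@dataclass
class MotionProfile:
    """Illustrative container for stored exercise metadata."""
    title: str
    file_path: str            # e.g., path to an .enfl animation recording
    target_joint: str         # e.g., "shoulder"
    measurement_type: str     # e.g., "between", "extend", or "axis"
    benchmark_angles: List[float] = field(default_factory=list)
    duration_seconds: float = 0.0
    billing_code: str = ""    # CPT code used for automated billing

profile = MotionProfile(
    title="Shoulder flexion",
    file_path="exercises/shoulder_flexion.enfl",   # hypothetical path
    target_joint="shoulder",
    measurement_type="extend",
    benchmark_angles=[0.0, 180.0],
    duration_seconds=30.0,
    billing_code="97110",                          # hypothetical CPT code
)
```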
- The database 104 may also include a billing information repository 104b that is configured to include billing and insurance information related to particular exercises or groups of exercises. For example, each exercise in the exercise repository 104a may be associated with a Current Procedural Terminology (CPT) code that is compatible with standardized billing practices in the medical fields.
- The database 104 may also include a user history repository 104c that is configured to store various information associated with particular users. For example, a user's body motion can be captured and stored as a motion profile within the repository 104c. Similar to the “ideal” motion profiles discussed above, the user's historical motion profiles can be compared against subsequent real-time motion for evaluation purposes. Other user information that could be stored includes the user's account registration information, prescribed exercises and training/physical therapy protocols, and billing and insurance information.
- In some embodiments, an output generation module 102d may process the information obtained by the server 102 and create the appropriate output for one or more output computing devices 108. For example, an output computing device 108 may include a patient/athlete computer device, an insurance company computing device, and the like. In one embodiment, the output generation module 102d may produce an application, website, or the like for a user, medical professional, or insurance agency personnel to view data and information related to movement. The output generation module 102d may generate prescriptions, directly bill insurance companies, and display exercises on an output computing device 108. Example outputs generated by the output generation module 102d are illustrated in FIGS. 7-21.
- In some embodiments, the server system 102 may include one or more processors or microprocessors coupled to one or more non-transitory memory devices, and may be adapted to perform the functions described herein. A server system 102 may be any special-purpose machine capable of storing and executing a set of computer-readable instructions (e.g., software) that specify actions to be taken to perform the functions described herein. Alternatively, the server system 102 may be a specialized component specifically designed to optimize the relationships set forth herein. The term “server” shall also be taken to include any collection of machines that individually or jointly execute a set (or multiple sets) of instructions to perform any one or more of the methodologies discussed herein. In some examples, at least two of the multiple server systems may be in different physical locations.
- A server system 102 is one example of a computer-readable storage medium. The term “computer-readable storage medium” should be taken to include a single medium or multiple media that store one or more sets of instructions. The term “computer-readable storage medium” shall also be taken to include any medium that is capable of storing or encoding a set of instructions for execution by the machine and that causes the machine to perform any one or more of the methodologies of the present disclosure.
- FIG. 2 shows an example of a computer system 200, according to an aspect of the present disclosure. In some embodiments, computer system 200 may be the same as or similar to a computer system utilized by a sensor device 106, camera device 107, output device 108, and/or server system 102 of FIG. 1.
- The computer system 200 is an example of a machine within which a set of instructions for causing the machine to perform any one or more of the methodologies, processes or functions discussed herein may be executed. In some examples, the machine may be connected (e.g., networked) to other machines as described above. The machine may operate in the capacity of a server or a client machine in a client-server network environment, or as a peer machine in a peer-to-peer (or distributed) network environment. The machine may be any special-purpose machine capable of executing a set of instructions (sequential or otherwise) that specify actions to be taken by that machine for performing the functions described herein. Further, while only a single machine is illustrated, the term “machine” shall also be taken to include any collection of machines that individually or jointly execute a set (or multiple sets) of instructions to perform any one or more of the methodologies discussed herein. In some examples, each of the output computing devices 108 may be implemented by the example machine shown in FIG. 2 (or a combination of two or more of such machines).
- Example computer system 200 may include processing device 201, memory 205, data storage device 209 and communication interface 213, which may communicate with each other via data and control bus 211. In some examples, computer system 200 may also include display device 215 and/or user interface 217.
- Processing device 201 may include, without being limited to, a microprocessor, a central processing unit, an application specific integrated circuit (ASIC), a field programmable gate array (FPGA), a digital signal processor (DSP) and/or a network processor. Processing device 201 may be configured to execute processing logic 203 for performing the operations described herein. In general, processing device 201 may include any suitable special-purpose processing device specially programmed with processing logic 203 to perform the operations described herein.
- Memory 205 may include, for example, without being limited to, at least one of a read-only memory (ROM), a random access memory (RAM), a flash memory, a dynamic RAM (DRAM) and a static RAM (SRAM), storing computer-readable instructions 207 executable by processing device 201. In general, memory 205 may include any suitable non-transitory computer readable storage medium storing computer-readable instructions 207 executable by processing device 201 for performing the operations described herein. Although one memory device 205 is illustrated in FIG. 2, in some examples, computer system 200 may include two or more memory devices (e.g., dynamic memory and static memory).
- Computer system 200 may include communication interface device 213, for direct communication with other computers (including wired and/or wireless communication), and/or for communication with a network. In some examples, computer system 200 may include display device 215 (e.g., a liquid crystal display (LCD), a touch sensitive display, etc.). In some examples, computer system 200 may include user interface 217 (e.g., an alphanumeric input device, a cursor control device, etc.).
- In some examples, computer system 200 may include data storage device 209 storing instructions (e.g., software) for performing any one or more of the functions described herein. Data storage device 209 may include any suitable non-transitory computer-readable storage medium, including, without being limited to, solid-state memories, optical media and magnetic media.
- FIG. 3 is a flow diagram illustrating a method 300 for generating a visualization for a character profile (3D avatar, wireframe or overlay) that may be representative of a user's body motion, according to one exemplary embodiment. Method 300 may begin at step 302.
- At step 302, server system 102 may receive motion data from one or more input devices. For example, server system 102 may receive motion data from one or more sensors 106 positioned on or proximate to a body of user 112. Server system 102 may further receive motion data from one or more camera devices 107 capturing movement of user 112. Server system 102 may receive the motion data via network 110. In some embodiments, visualization module 102c may receive the motion data captured by one or more sensors 106 via sensor module 102a. In some embodiments, visualization module 102c may receive motion captured by one or more camera devices 107 via camera module 102b.
- In some embodiments, sensor module 102a and camera module 102b may transmit motion data to visualization module 102c in real-time or near real-time. In some embodiments, sensor module 102a and camera module 102b may transmit motion data to visualization module 102c periodically (e.g., at pre-defined points during the day). In some embodiments, sensor module 102a and camera module 102b may transmit motion data only when prompted by visualization module 102c.
- At step 304, server system 102 may convert the received motion data to positional data. For example, visualization module 102c may convert the received motion data to positional data that describes the position of one or more joints, limbs, and/or other reference points on the user's body.
- In some embodiments, converting sensor motion data to positional data may be based on sensor 106 location, dimensions of the user's body, and/or dimensions of clothing worn by the user. Positional data may be used to locate one or more joints of interest such as elbow, wrist, shoulder, hip, knee, and ankle.
- In some embodiments, visualization module 102c may convert motion data to positional data via one or more coordinate transformations. For example, sensor motion data, represented as (roll, pitch, yaw) values, may be converted to 3D Cartesian positional data (e.g., (x, y, z) values) using forward kinematics. In some embodiments, motion data received from the input devices may be converted from (roll, pitch, yaw) values into a Quaternion representation. In some embodiments, motion data represented as Quaternions may then be converted to positional data and/or joint data.
- At step 306, server system 102 may calculate joint data based on at least one of the motion data and positional data. Visualization module 102c may obtain joint data from positional data by analyzing the orientation of one or more sensors with respect to the user's limbs and joints. In some embodiments, this may require knowledge of a user's limb length. This may include limb calculations based on a user's height and weight. In some embodiments, visualization module 102c may calculate joint data based, in part, on statistical population information and anatomy research.
- Joint data may include a measure of the angle formed at a joint by two limbs that connect at the joint. In some embodiments, calculating the angle formed at the joint may be based on a triangulation of the angle formed by the position of at least two sensors 106 with respect to the floor or another reference point.
- In some embodiments,
visualization module 102 c may calculate an angle for the joint of interest based on the angle formed by two three-dimensional vectors at the joint of interest. The angle for the joint may be calculated using mathematical techniques, such as, dot products operations, inverse trigonometric functions, radians-to-degrees conversion, and the like. - In some embodiments, joint data may further include a “between” value representative of the angle between a target joint and the next adjacent joint to the target joint.
Visualization module 102 c may calculated the between value based on the angle between two directional vectors. The first directional vector may be calculated by subtracting the position of the previous adjacent joint from the position of the target joint. The second directional vector may be calculated from subtracting the position of the target joint from the position of the target joint. - In some embodiments, joint data may further include an “extend” value representative of the angle between the direction of the target joint to the next adjacent joint to the target joint.
Visualization module 102 c may calculate the extend value based on an angle between two directional vectors. The first directional vector may be calculated by subtracting the position of the target joint from the position of the previous adjacent joint. The second directional vector may be calculated from subtracting the position of the target joint from the position of the target joint. - In some embodiments, joint data may further include an “axis” value representative of the measure of the angle formed by the target joint when compared to a given axis. In some embodiments the axis value may include a feature that allows a three-dimensional vector to offset the calculation and provide a customized measure of the angle.
Visualization module 102 c may calculate the axis value using two dimensional planes to create a directional vector to measure against the angle in three-dimensional space. - At step 408,
- At step 308, server system 102 may generate a character profile by applying one or more of the motion data, positional data, and joint data to a character profile. In one embodiment, the character profile may be stored in the database 104.
- FIG. 4 is a flow diagram illustrating a method 400 for evaluating a user's body motion based on motion data obtained from a plurality of sensors attached to the user, according to an exemplary embodiment.
- The one or more steps discussed in conjunction with method 400 may provide an objective evaluation of a user's movement by comparing a newly recorded user movement with an “ideal” movement. In some embodiments, the “ideal” movement may be a recording of a movement performed by a professional athlete or physical therapist stored in the exercise repository 104a. In some embodiments, the newly recorded user movement may be compared with a user's past recorded movements stored in the user history repository 104c.
- At step 402, the output generation module 102d of the server system 102 may retrieve a stored motion profile from the database 104 via the database management module 102e. The stored motion profile may be a part of the exercise repository 104a and/or the user history repository 104c. In one embodiment, the stored motion profile that is used for the evaluation may be specified by a user (patient/athlete), physical therapist, medical professional or the like. The stored motion profile may be specified through an interactive graphical user interface, application and the like. In one embodiment, the user, physical therapist, or medical professional who specifies the stored motion profile to use may use a search feature that identifies the stored movements in the database 104 that may be used as a part of the evaluation.
- At step 404, output generation module 102d may retrieve a character profile generated for the user. For example, output generation module 102d may retrieve the character profile generated above, in conjunction with method 300. Output generation module 102d may extract a motion profile from sensor motion data received from one or more wireless sensors configured to obtain motion information for a user by way of the visualization module 102c in accordance with the methods discussed above.
- At step 406, an evaluation module 102f of server system 102 may then objectively evaluate the extracted motion profile by comparing the stored motion profile with the extracted motion profile. For example, evaluation module 102f may compare position data, angle range of the joint, velocity, acceleration, jerk, and peak spread (the furthest angle of separation between the two limbs at the joint) of the extracted motion profile to the same parameters of the stored motion profile.
- At step 408, evaluation module 102f may then generate an evaluation of the newly recorded user movement. The objective evaluation of the extracted motion profile may involve a comparison of one or more parameters of the extracted motion profile with those of the retrieved pre-defined target motion profile. Possible parameters may include position data, angle range of the joint, velocity, acceleration, jerk, and peak spread, or the furthest angle of separation between the two limbs at the joint.
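- As an illustration of such a parameter-by-parameter comparison, the sketch below derives a few of the named parameters (angle range, peak angle, velocity, jerk) from per-frame joint-angle traces and reports user-versus-target deficits; the function and its output format are assumptions, not the disclosure's implementation. The 125 frames-per-second default follows the down-sampled rate discussed earlier.

```python
import numpy as np

def evaluate_motion(user_angles: np.ndarray, target_angles: np.ndarray,
                    fps: float = 125.0) -> dict:
    """Compare per-frame joint-angle traces (degrees) of a recorded movement
    against a stored target movement on several evaluation parameters."""
    dt = 1.0 / fps

    def params(trace: np.ndarray) -> dict:
        velocity = np.gradient(trace, dt)
        acceleration = np.gradient(velocity, dt)
        jerk = np.gradient(acceleration, dt)
        return {
            "angle_range": float(trace.max() - trace.min()),
            "peak_angle": float(trace.max()),
            "peak_velocity": float(np.abs(velocity).max()),
            "peak_jerk": float(np.abs(jerk).max()),
        }

    user, target = params(user_angles), params(target_angles)
    return {k: {"user": user[k], "target": target[k], "deficit": target[k] - user[k]}
            for k in user}
```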
- In addition to providing an objective measure of a person's movement, the evaluation module 102f may provide an objective means for identifying points of pain or difficulty in a patient or athlete's movement by locating points within the movement that are not smooth in a trace of the movement over time.
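- One hedged way to flag such non-smooth points is to look for outliers in the jerk (third derivative) of the joint-angle trace, as sketched below; the threshold-based approach is an assumption, since the disclosure does not specify how smoothness is assessed:

```python
import numpy as np

def flag_non_smooth_frames(angles: np.ndarray, fps: float = 125.0,
                           z_threshold: float = 3.0) -> np.ndarray:
    """Return frame indices where the jerk of a joint-angle trace is a
    statistical outlier, a simple proxy for hesitation or pain points."""
    dt = 1.0 / fps
    jerk = np.gradient(np.gradient(np.gradient(angles, dt), dt), dt)
    z = (jerk - jerk.mean()) / (jerk.std() + 1e-9)  # guard against zero variance
    return np.flatnonzero(np.abs(z) > z_threshold)
```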
- FIG. 5 is a system diagram showing a body motion evaluation system 500, according to an exemplary embodiment. Body motion evaluation system 500 includes a database 502, application programming interface (API) endpoints 504, a user account module 506, an orders module 508, a medical code module 510, a reporting module 512, and one or more graphical user interfaces (GUIs) 520.
- As illustrated, a user such as an athlete or patient 522 may access the body motion evaluation system 500 by way of a graphical user interface 520. A different graphical user interface may be provided to a clinician 524 having administrative privileges. Based on their login information 506, the athlete or patient 522 and clinician 524 may be privy to different portions of the database 502, also known as a “web locker.” The body motion evaluation system 500 may include the ability for users to place or fulfill orders 508, such as providing a user with a prescription for a physical therapy regimen. The system 500 may also report user compliance (participation in a prescribed physical therapy regimen) to an insurance company by way of the reporting module 512. The system 500 may also be configured to bill an insurance company for the movements included in the prescribed physical therapy regimen by way of a medical code module 510. The system may also include one or more application programming interfaces 504 that allow the system 500 to communicate with administrative computers 514, patient sensors 516 and insurance companies 518.
- FIG. 6 is a system diagram showing a body motion evaluation system 600, according to another embodiment of the present disclosure. The body motion evaluation system 600 includes a database 602, a performance module 604, and a therapy module 606. In one embodiment, the performance module 604 may be configured to improve the performance of an athlete. In one embodiment, the therapy module 606 may be configured to provide physical therapy to an athlete or patient. The performance module 604 may be configured to interface with users by way of a desktop 608a, web browser 608b, Android® application 608c, or Apple® based operating systems 608d. The therapy module 606 may be configured to interface with users by way of a desktop 610a, web browser 610b, Android® application 610c, or Apple® based operating systems 610d.
- FIGS. 7-21 show illustrative user interfaces (UIs) that may be used within a body motion evaluation system, according to embodiments of the present disclosure.
- As depicted in FIG. 7, in one embodiment, a user may be asked to create a profile that includes information including their insurance carrier, height, and weight. The insurance carrier information may be used in order to automatically bill insurance companies for physical therapy treatments. The height and weight information may be used to generate the joint data from the positional data and/or sensor motion data. In one embodiment, the user information provided when creating a profile may be stored within a user history repository 104c and configure settings in the billing information repository 104b.
- As depicted in FIG. 8, the user may be routed to a user interface that contains one or more panels providing information related to the system and methods for evaluating body motion. In one embodiment, a first panel 801 may provide a representation of an “ideal” motion, a second panel 803 may provide a live tracking representation of a user's current body position, a third panel 805 may provide a representation of the sensors located on the user's body, and a fourth panel 807 may provide information related to the cameras.
- As illustrated in FIG. 9, in one embodiment the user interface may be configured with a menu bar 901 that displays information related to the state of the sensors. For example, as depicted, the sensors on the shirt are streaming information while the sensors in the pants are disconnected. The menu bar 901 may include options to calibrate the wireless sensors, reset the orientation, align the sensors, and manage Bluetooth® data transfer settings.
- As discussed above, in one embodiment the first panel 801 may provide a representation of an “ideal” motion. A representation may include a video recording, a wireframe animation, a 3D avatar animation, an overlay animation, and the like. As illustrated in FIG. 10, the “ideal” motion may be selected from one or more stored motions. For example, the user interface may provide a drop-down menu 809 that allows a user to select stored motions corresponding to a particular joint such as the shoulder, knee, or hip or belonging to a particular set of motions such as a functional movement library. In one embodiment, the user may be able to search 811 through a collection of stored motions to locate an “ideal” motion. In one embodiment, the stored motions may be located in an exercise repository 104a of the database 104. In one embodiment, the “ideal” motion may correspond to a previously captured user motion that is stored in a user history repository 104c of the database 104.
- As illustrated in FIG. 11, the user interface may be configured to display a recording of the representation of the selected “ideal” motion in a first panel 801. The user may also select a joint or sensor of interest 805a in the third panel 805. In response to the user's selection, the angle formed by the selected joint or sensor of interest may be calculated and displayed 801a for the “ideal” motion in the first panel 801, and the angle for the selected joint or sensor of interest may also be calculated and displayed 803a for the user's motion in real-time 803. In the depicted example, a user has opted to view the angle information for the shoulder. In one embodiment, the user may also adjust the orientation of the representation (i.e., the 3D avatar, wireframe, or overlay) so that the joint or sensor of interest is more clearly visible.
- As illustrated in FIG. 12, the user interface may be configured so that a user's live motion is recorded and available for playback in the second panel 803. Accordingly, a user may be able to play a recording of the “ideal” motion synchronized with a recording of the user's motion.
- As illustrated in FIG. 13, the user interface may include a motion representation settings panel 813 that allows a user to select the character representation they would like to see for the “ideal” motion and/or the user motion. As depicted, the character representation may be a wireframe, overlay, or 3D avatar. The motion representation settings panel 813 may also include the option of displaying camera data from a webcam that populates panel 807 and angle data that may be superimposed on the motion representations of the “ideal” motion and the user motion.
- As illustrated in FIG. 14, “ideal” motion and/or user motion may be represented as a wireframe and/or overlay. As illustrated with regard to the “ideal” motion, the shoulder flexion angle may be calculated based on a triangle drawn between the arm, a position on the user's body, and the joint of interest, the shoulder.
- As illustrated in FIG. 15, the user interface may include an evaluation panel 815 that provides a graphical representation comparing a user's recorded movement with an “ideal” movement. As illustrated, the comparison may display the angle of interest over time in degrees per frame. Suitable comparison parameters may also include position, velocity, and acceleration over time.
- As illustrated in FIG. 16, the third panel 805 providing a representation of the sensors located on the user's body may also include one or more menus that allow a user to select various configurations for measuring the angles of interest.
- As illustrated in FIG. 17, in one embodiment the user interface in connection with the output generation module 102d may produce an evaluation or report. The evaluation may provide a comparison of the user's recorded movement with an “ideal” movement. In one embodiment, the evaluation may display the user's personal information, the time and date the report was generated, the time and date of the movement being evaluated, the movement of interest, the timing for the movement of interest, where the measurement was taken from, what the maximum angle of measurement was, and a graph comparing the user's recorded movement with an “ideal” movement. The graph may display the angle of interest over time in degrees per frame, position, velocity, and/or acceleration over time. From the graph it may be possible to identify points of pain based on the shape of the graph. The evaluation or report may also include time-stamping so the user may see the duration and frequency of their exercise routine. This may allow a user's progress to be tracked per session and over selected extended periods of time (e.g., day, week, month, year).
- As illustrated in FIG. 18, in one embodiment a clinician such as a physical therapist or trainer may use the user interface to prescribe a series of exercises for a patient or athlete to complete. The clinician may input information regarding the user's condition, and select one or more exercises that are stored in the exercise repository 104a in the depicted prescription portal 817. The data input by the clinician may then be exported into a prescription email or prescription document that is provided to the user.
- For example, as illustrated in FIG. 19, the prescription document 1900 may be automatically generated based on the selections made by the clinician. The prescription document 1900 may automatically include a description and images that correspond to the selected exercises.
- As illustrated in FIG. 20, the user interface may also include a billing portal 2000. Using the billing portal, a clinician or user may select one or more exercises that were performed with the user. The billing portal 2000 may then retrieve billing information such as CPT codes from the billing information repository 104b in order to automatically generate an invoice compatible with insurance company computing systems.
- As illustrated in FIG. 21, the user interface may also be used for sports-specific performance training applications corresponding to the performance module 604. The user interface displays a target sports-specific ideal motion and a live tracking of athlete motion. In one embodiment, the user interface may be configured differently for a sports-specific performance training application corresponding to the performance module 604 than for a therapy based application. - In one embodiment, the output generation module may generate applications specific to a user. User-specific applications may be tailored to the needs of the veterans rehabilitation market, the fitness and athletic improvement market, the mobile and personal device digital products market and the medical market.
- In one embodiment, the systems and methods described herein may be used as a baseline diagnostic and informational tool within a hospital or other medical care environment. For example, the systems and methods described herein may be used in connection with regenerative medicine groups as a means to track the effect of and outcomes from stem cell injections or other physical therapy treatments. Additionally, the user compliance and billing aspects of the systems and methods described herein may be useful for medical care provided in connection with workers' compensation.
- In one embodiment, the systems and methods described herein may be used as a supportive pre- or post-op information resource for orthopedics or other medical professionals. For example, the systems and methods described herein may be used in an in-home setting and allow for in-home rehabilitation after orthopedic surgery. With conventional in-home or center-based rehabilitation, physical therapy is often unsuccessful in aiding a patient or athlete in obtaining their full range of motion due to non-compliance and lack of oversight from medical professionals. As a result, many patients and athletes may undergo second corrective surgeries. The systems and methods described herein may provide remote tracking and guided physical therapy routines with oversight from medical professionals, thus assisting with patient and athlete compliance and providing an improved quality of physical therapy.
- In one embodiment the systems and methods described herein may be used by wellness providers such as physical therapists, masseuses or chiropractors. In one embodiment the wellness providers may track patient/client motion prior to, during, and after treatment and provide the information to patients and clients using the systems and methods described herein.
- In one embodiment the systems and methods described herein may be provided to primary care physicians as a part of routine patient monitoring during yearly examinations.
- In one embodiment, the systems and methods described herein may involve one or more of the following steps in any suitable order. In one step, a user may put on a body suit containing wireless sensors. In one step, the sensors on the body suit may be paired to a computing device. In one step, a user may select a target or ideal motion profile for comparison. In one step, the body suit may start recording a user's movement while the user is wearing the suit. In one step, the suit transmits data to the computing device while recording the user's movement. In one step, angle data is recorded based on the target joint and measurement type. In one step, a suit may stream data to the computing device. In one step, rotational data corresponding to the shirt or pants of the body suit may be transmitted over Bluetooth® or other suitable communication protocols to an application. In one step, rotational data (in Quaternions) may be applied to corresponding joints of a character profile (e.g., 3D avatar), as sketched below.
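- The final step, applying Quaternion rotational data to the joints of a character profile, can be illustrated with the standard quaternion rotation q·v·q*; the sketch below is illustrative only and assumes unit quaternions in (w, x, y, z) order:

```python
import math

def quat_multiply(a, b):
    # Hamilton product of two quaternions in (w, x, y, z) order.
    aw, ax, ay, az = a
    bw, bx, by, bz = b
    return (aw*bw - ax*bx - ay*by - az*bz,
            aw*bx + ax*bw + ay*bz - az*by,
            aw*by - ax*bz + ay*bw + az*bx,
            aw*bz + ax*by - ay*bx + az*bw)

def rotate_bone(q, bone):
    """Rotate a bone direction vector by unit quaternion q via q * v * q*."""
    qc = (q[0], -q[1], -q[2], -q[3])      # conjugate of a unit quaternion
    v = (0.0, *bone)                      # embed the vector as a pure quaternion
    _, x, y, z = quat_multiply(quat_multiply(q, v), qc)
    return (x, y, z)

# A 90-degree rotation about the vertical (z) axis applied to a forearm bone.
q = (math.cos(math.pi / 4), 0.0, 0.0, math.sin(math.pi / 4))
print(rotate_bone(q, (1.0, 0.0, 0.0)))    # approximately (0.0, 1.0, 0.0)
```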
- While the present disclosure has been discussed in terms of certain embodiments, it should be appreciated that the present disclosure is not so limited. The embodiments are explained herein by way of example, and there are numerous modifications, variations and other embodiments that may be employed that would still be within the scope of the present disclosure.
Claims (20)
Priority Applications (2)
Application Number | Priority Date | Filing Date | Title |
---|---|---|---|
US16/054,074 US11266328B2 (en) | 2017-08-03 | 2018-08-03 | Systems and methods for evaluating body motion |
US17/587,750 US11717190B2 (en) | 2017-08-03 | 2022-01-28 | Systems and methods for evaluating body motion |
Applications Claiming Priority (2)
Application Number | Priority Date | Filing Date | Title |
---|---|---|---|
US201762540689P | 2017-08-03 | 2017-08-03 | |
US16/054,074 US11266328B2 (en) | 2017-08-03 | 2018-08-03 | Systems and methods for evaluating body motion |
Related Child Applications (1)
Application Number | Title | Priority Date | Filing Date |
---|---|---|---|
US17/587,750 Continuation US11717190B2 (en) | 2017-08-03 | 2022-01-28 | Systems and methods for evaluating body motion |
Publications (2)
Publication Number | Publication Date |
---|---|
US20190038187A1 true US20190038187A1 (en) | 2019-02-07 |
US11266328B2 US11266328B2 (en) | 2022-03-08 |
Family
ID=65231876
Family Applications (2)
Application Number | Title | Priority Date | Filing Date |
---|---|---|---|
US16/054,074 Active 2038-12-07 US11266328B2 (en) | 2017-08-03 | 2018-08-03 | Systems and methods for evaluating body motion |
US17/587,750 Active US11717190B2 (en) | 2017-08-03 | 2022-01-28 | Systems and methods for evaluating body motion |
Family Applications After (1)
Application Number | Title | Priority Date | Filing Date |
---|---|---|---|
US17/587,750 Active US11717190B2 (en) | 2017-08-03 | 2022-01-28 | Systems and methods for evaluating body motion |
Country Status (1)
Country | Link |
---|---|
US (2) | US11266328B2 (en) |
Families Citing this family (3)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
US11266328B2 (en) * | 2017-08-03 | 2022-03-08 | Latella Sports Technologies, LLC | Systems and methods for evaluating body motion |
WO2020200082A1 (en) * | 2019-03-29 | 2020-10-08 | 广州虎牙信息科技有限公司 | Live broadcast interaction method and apparatus, live broadcast system and electronic device |
US20210407051A1 (en) * | 2020-06-26 | 2021-12-30 | Nvidia Corporation | Image generation using one or more neural networks |
Family Cites Families (6)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
JP2004264060A (en) * | 2003-02-14 | 2004-09-24 | Akebono Brake Ind Co Ltd | Error correction method in attitude detector, and action measuring instrument using the same |
US8942662B2 (en) * | 2012-02-16 | 2015-01-27 | The United States of America, as represented by the Secretary, Department of Health and Human Services, Center for Disease Control and Prevention | System and method to predict and avoid musculoskeletal injuries |
KR102535617B1 (en) * | 2016-01-04 | 2023-05-24 | 한국전자통신연구원 | System and method for detecting object in depth image |
US9773330B1 (en) * | 2016-12-29 | 2017-09-26 | BioMech Sensor LLC | Systems and methods for real-time data quantification, acquisition, analysis, and feedback |
US11266328B2 (en) * | 2017-08-03 | 2022-03-08 | Latella Sports Technologies, LLC | Systems and methods for evaluating body motion |
JP7003628B2 (en) * | 2017-12-19 | 2022-01-20 | 富士通株式会社 | Object tracking program, object tracking device, and object tracking method |
- 2018-08-03: US application US16/054,074 filed; granted as US11266328B2 (Active)
- 2022-01-28: US application US17/587,750 filed; granted as US11717190B2 (Active)
Patent Citations (3)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
US20100277411A1 (en) * | 2009-05-01 | 2010-11-04 | Microsoft Corporation | User tracking feedback |
US20140228985A1 (en) * | 2013-02-14 | 2014-08-14 | P3 Analytics, Inc. | Generation of personalized training regimens from motion capture data |
US20140228712A1 (en) * | 2013-02-14 | 2014-08-14 | Marcus Elliott | Generation of personalized training regimens from motion capture data |
Cited By (29)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
US11337649B2 (en) | 2016-10-31 | 2022-05-24 | Zipline Medical, Inc. | Systems and methods for monitoring physical therapy of the knee and other joints |
US11992334B2 (en) | 2016-10-31 | 2024-05-28 | Zipline Medical, Inc. | Systems and methods for monitoring physical therapy of the knee and other joints |
US20210290107A1 (en) * | 2017-05-13 | 2021-09-23 | William Thomas Murray | Joint mobility measurement device |
US10545578B2 (en) * | 2017-12-22 | 2020-01-28 | International Business Machines Corporation | Recommending activity sensor usage by image processing |
US20210085220A1 (en) * | 2018-06-19 | 2021-03-25 | Tornier, Inc. | Extended reality visualization of range of motion |
US11849415B2 (en) | 2018-07-27 | 2023-12-19 | Mclaren Applied Technologies Limited | Time synchronisation |
US11557215B2 (en) * | 2018-08-07 | 2023-01-17 | Physera, Inc. | Classification of musculoskeletal form using machine learning model |
US20230119404A1 (en) * | 2018-12-28 | 2023-04-20 | Gree, Inc. | Video distribution system for live distributing video containing animation of character object generated based on motion of distributor user, video distribution method, and storage medium storing thereon video distribution program |
US12015818B2 (en) * | 2018-12-28 | 2024-06-18 | Gree, Inc. | Video distribution system for live distributing video containing animation of character object generated based on motion of distributor user, video distribution method, and storage medium storing thereon video distribution program |
US11557075B2 (en) | 2019-02-06 | 2023-01-17 | Snap Inc. | Body pose estimation |
JP2020174910A (en) * | 2019-04-18 | 2020-10-29 | 株式会社Sportip | Exercise support system |
SE1950879A1 (en) * | 2019-07-10 | 2021-01-11 | Wememove Ab | Torso-mounted accelerometer signal reconstruction |
US20220293257A1 (en) * | 2019-10-03 | 2022-09-15 | Rom Technologies, Inc. | System and method for processing medical claims |
US20210142893A1 (en) * | 2019-10-03 | 2021-05-13 | Rom Technologies, Inc. | System and method for processing medical claims |
US11898874B2 (en) | 2019-10-18 | 2024-02-13 | Mclaren Applied Technologies Limited | Gyroscope bias estimation |
US12004967B2 (en) | 2020-06-02 | 2024-06-11 | Howmedica Osteonics Corp. | Systems and methods for planning placement of an acetabular implant for a patient based on pelvic tilt |
EP4164471A4 (en) * | 2020-06-10 | 2024-03-06 | PMotion, Inc. | Enhanced goniometer |
US11957452B2 (en) * | 2020-06-10 | 2024-04-16 | Pmotion, Inc. | Enhanced goniometer |
US20210386323A1 (en) * | 2020-06-10 | 2021-12-16 | Pmotion, Inc. | Enhanced goniometer |
US11660022B2 (en) | 2020-10-27 | 2023-05-30 | Snap Inc. | Adaptive skeletal joint smoothing |
US11615592B2 (en) | 2020-10-27 | 2023-03-28 | Snap Inc. | Side-by-side character animation from realtime 3D body motion capture |
US11748931B2 (en) | 2020-11-18 | 2023-09-05 | Snap Inc. | Body animation sharing and remixing |
US11734894B2 (en) | 2020-11-18 | 2023-08-22 | Snap Inc. | Real-time motion transfer for prosthetic limbs |
US12002175B2 (en) | 2020-11-18 | 2024-06-04 | Snap Inc. | Real-time motion transfer for prosthetic limbs |
US11450051B2 (en) * | 2020-11-18 | 2022-09-20 | Snap Inc. | Personalized avatar real-time motion capture |
US11886245B2 (en) * | 2021-05-06 | 2024-01-30 | Microsoft Technology Licensing, Llc | Estimating runtime-frame velocity of wearable device |
US20220365555A1 (en) * | 2021-05-06 | 2022-11-17 | Microsoft Technology Licensing, Llc | Estimating runtime-frame velocity of wearable device |
US11880947B2 (en) | 2021-12-21 | 2024-01-23 | Snap Inc. | Real-time upper-body garment exchange |
US20240057893A1 (en) * | 2022-08-17 | 2024-02-22 | August River, Ltd Co | Remotely tracking range of motion measurement |
Also Published As
Publication number | Publication date |
---|---|
US11266328B2 (en) | 2022-03-08 |
US11717190B2 (en) | 2023-08-08 |
US20220151513A1 (en) | 2022-05-19 |
Similar Documents
Publication | Title |
---|---|
US11717190B2 (en) | Systems and methods for evaluating body motion |
US11803241B2 (en) | Wearable joint tracking device with muscle activity and methods thereof |
US10314536B2 (en) | Method and system for delivering biomechanical feedback to human and object motion |
US20220005577A1 (en) | Systems, apparatus and methods for non-invasive motion tracking to augment patient administered physical rehabilitation |
US10646157B2 (en) | System and method for measuring body joint range of motion |
EP2726164B1 (en) | Augmented-reality range-of-motion therapy system and method of operation thereof |
US20120259652A1 (en) | Systems and methods for remote monitoring, management and optimization of physical therapy treatment |
US20120259650A1 (en) | Systems and methods for remote monitoring, management and optimization of physical therapy treatment |
US20120259651A1 (en) | Systems and methods for remote monitoring, management and optimization of physical therapy treatment |
US20120259648A1 (en) | Systems and methods for remote monitoring, management and optimization of physical therapy treatment |
US20120259649A1 (en) | Systems and methods for remote monitoring, management and optimization of physical therapy treatment |
Ianculescu et al. | A smart assistance solution for remotely monitoring the orthopaedic rehabilitation process using wearable technology: Re.flex system |
WO2022133063A1 (en) | Wearable intertial sensor system and methods |
US11049321B2 (en) | Sensor-based object tracking and monitoring |
US20220157427A1 (en) | Remote physical therapy and assessment of patients |
Bleser et al. | Development of an inertial motion capture system for clinical application: Potentials and challenges from the technology and application perspectives |
García-de-Villa et al. | A database of physical therapy exercises with variability of execution collected by wearable sensors |
US20210134011A1 (en) | Calibrating 3d motion capture system for skeletal alignment using x-ray data |
Narváez et al. | Kushkalla: a web-based platform to improve functional movement rehabilitation |
Wai et al. | Development of holistic physical therapy management system using multimodal sensor network |
Chiensriwimol et al. | Frozen shoulder rehabilitation: exercise simulation and usability study |
US20240260892A1 (en) | Systems and methods for sensor-based, digital patient assessments |
Hahn et al. | Accurate human motion estimation using inertia measurement units for use in biomechanical analysis |
Bucciero et al. | A biomechanical analysis system of posture |
Köse | Investigation of wearable motion capture system towards biomechanical modelling |
Legal Events
Code | Title | Description |
---|---|---|
AS | Assignment | Owner name: LATELLA SPORTS TECHNOLOGIES, LLC, CONNECTICUT. Free format text: ASSIGNMENT OF ASSIGNORS INTEREST;ASSIGNOR:LATELLA, FRANK A., JR.;REEL/FRAME:046548/0995. Effective date: 20170922 |
FEPP | Fee payment procedure | ENTITY STATUS SET TO UNDISCOUNTED (ORIGINAL EVENT CODE: BIG.); ENTITY STATUS OF PATENT OWNER: SMALL ENTITY |
FEPP | Fee payment procedure | ENTITY STATUS SET TO SMALL (ORIGINAL EVENT CODE: SMAL); ENTITY STATUS OF PATENT OWNER: SMALL ENTITY |
STPP | Information on status: patent application and granting procedure in general | DOCKETED NEW CASE - READY FOR EXAMINATION |
STPP | Information on status: patent application and granting procedure in general | NON FINAL ACTION MAILED |
STPP | Information on status: patent application and granting procedure in general | RESPONSE TO NON-FINAL OFFICE ACTION ENTERED AND FORWARDED TO EXAMINER |
STPP | Information on status: patent application and granting procedure in general | FINAL REJECTION MAILED |
STPP | Information on status: patent application and granting procedure in general | ADVISORY ACTION MAILED |
STPP | Information on status: patent application and granting procedure in general | NON FINAL ACTION MAILED |
STPP | Information on status: patent application and granting procedure in general | RESPONSE TO NON-FINAL OFFICE ACTION ENTERED AND FORWARDED TO EXAMINER |
STPP | Information on status: patent application and granting procedure in general | FINAL REJECTION MAILED |
STCV | Information on status: appeal procedure | NOTICE OF APPEAL FILED |
STCV | Information on status: appeal procedure | APPEAL BRIEF (OR SUPPLEMENTAL BRIEF) ENTERED AND FORWARDED TO EXAMINER |
STPP | Information on status: patent application and granting procedure in general | NOTICE OF ALLOWANCE MAILED -- APPLICATION RECEIVED IN OFFICE OF PUBLICATIONS |
STCF | Information on status: patent grant | PATENTED CASE |