WO2016014163A1 - Three dimensional sensor-based interactive pain maps for localizing pain - Google Patents

Three dimensional sensor-based interactive pain maps for localizing pain

Info

Publication number: WO2016014163A1
Application number: PCT/US2015/035185
Authority: WIPO (PCT)
Prior art keywords: user, pain, recited, motion, representation
Other languages: French (fr)
Inventors: Jay Han, Marcelo Kallmann
Original assignee: The Regents of the University of California
Application filed by: The Regents of the University of California

Classifications

    • A61B 5/00 Measuring for diagnostic purposes; Identification of persons (A: Human Necessities; A61: Medical or Veterinary Science, Hygiene; A61B: Diagnosis, Surgery, Identification)
        • A61B 5/743 Displaying an image simultaneously with additional graphical information, e.g. symbols, charts, function plots
        • A61B 5/1122 Determining geometric values of movement trajectories, e.g. centre of rotation or angular range of movement
        • A61B 5/4528 Evaluating or diagnosing the musculoskeletal system: joints
        • A61B 5/4566 Evaluating the spine
        • A61B 5/4824 Touch or pain perception evaluation
        • A61B 5/7475 User input or interface means, e.g. keyboard, pointing device, joystick
    • G16H Healthcare informatics (G: Physics; G16: ICT specially adapted for specific application fields)
        • G16H 20/30 ICT specially adapted for therapies or health-improving plans relating to physical therapies or activities, e.g. physiotherapy, acupressure or exercising
        • G16H 30/40 ICT specially adapted for the handling or processing of medical images, e.g. editing
        • G16H 40/63 ICT specially adapted for the operation of medical equipment or devices for local operation

Definitions

  • a color-coded ring 118 disposed around the user's arm may also be employed, showing the color currently selected by the user 112.
  • Window 116 may be used to show additional information to the user.
  • other colors represent varying levels of pain as selected by the user, e.g. light blue, yellow, and red. These colors can represent increasing levels of pain (light (light blue), moderate or intense (yellow), and severe (red)) or different types of pain (e.g. dull, sharp, burning).
  • the meaning of the colors is given by the therapist/doctor according to the application, and additional colors can be easily added or modified to give increased flexibility as needed.
  • the currently selected color is preferably shown as a colored ring 118 around the user's arm, so that the user 112 maintains focus on his or her arm movement and does not need to look at other locations on the computer screen.
  • the current color cycles through the available colors, including the dark blue color indicating no pain. This allows the user to paint a dark blue color on top of a pain color if he/she desires to change a previously painted pain color. This overall procedure allows the user to intuitively paint his/her pain map until the obtained results look accurate.
  • FIG. 9 illustrates one 2D map diagram produced by the prototype application.
  • each point 132 in the pain diagram has precise coordinates indicating the corresponding arm orientation for the marked pain level or type.
  • the marked ellipse 134 (of variable dimensions) may be used to indicate the expected range of motion to be explored by the patient.
  • the colors on the pain maps shown in FIG. 7 through FIG. 9 may be stored in a texture image that is initially fully transparent. For every painted point p_i, its position in the texture is determined and the corresponding texture pixel has its color changed to reflect the selected color.
  • a radial basis function is preferably used to distribute the color to a region around the marked point (a brief sketch of this painting step follows this list).
  • the radius of the region is a parameter of the system, and it allows definition of how coarse or fine each marked location should be painted in the current color. Normally this radius is fixed by the therapist; however, the system may allow the patient to paint a more precise variation of colors, after which the therapist may fine-tune the painting radius.
  • an export module 40 may be implemented in executable application software for display or transfer of acquired data.
  • the module 40 may include an option 42 for the full 3D collected map (e.g. similar to map 114 shown in FIG. 7 and FIG. 8) to be reloaded and/or visualized by the physician and/or patient as a .jpg or like image.
  • Temporal data may also be included so that the 3D map is presented as a video showing pain regions through the patient's articulation of the joint.
  • a 3D printing module 44 may be included such that the acquired digital data of the pain map 114 is reproduced as a 3D model (not shown) using 3D printing.
  • a 2D module 46 may also be included for generating a 2D map (similar to map 130 shown in FIG. 9) that is suitable for paper documents.
  • the above-described pain mapping system 10/50 can also be used in combination with clinically useful predefined movement protocols to elicit pain (provocative maneuvers). For example, with selection of these provocative maneuvers under a pain mapping menu, specific protocols and characterization of pain will provide easily recognized patterns of pain that can be correlated to known biomechanics and pathologic processes.
  • FIG. 10A and FIG. 10B show images 140 and 150 respectively configured to represent pain with respect to different spine movements.
  • a pain map 146 can be colored and displayed together with a skeleton representation 142 in order to reduce occlusion of the pain map being painted.
  • although a simple cylinder-based skeleton representation is shown in FIG. 10A and FIG. 10B, the system can also display a realistic human skeleton including all cervical, thoracic, and lumbar spine joints in order to better assist the therapist's analysis.
  • a generic way to extend the solution to other joints is to display the user's character representation as a simple skeleton, as shown in FIG. 10A and FIG. 10B (or to use respective spinal vertebral bone models), to minimize occlusion, and then to display the spherical pain map being painted centered at the joint of interest. It is envisioned that specifying the pain at different levels of the vertebrae can be controlled by the patient, again using either buttons (remote control) or verbalization. The pain map representation can therefore be applied to a generic 3 degrees of freedom (DOFs) motion centered around any selected joint.
  • lumbar region back pain can be mapped in regard to cardinal motions (flexion, extension, side bending, and rotation), as well as certain combinations of movements that are clinically useful provocative maneuvers.
  • mapping is constructed by considering the vector from the base/lower cervical vertebrae to the rostral cranio-cervical joint.
  • the system 10, 50 may allow for painting three different pain maps at the same time: one to represent the pain in the neutral arm, another for the pronated arm, and another for the supinated arm.
  • Options may be provided allowing for the therapist at any point to change the current pain map being painted and instruct the patient to explore sensations in the corresponding neutral, supinated or pronated arm; or the twist orientation of the user arm can be tracked and used to automatically switch between the current pain maps being painted.
  • the second option, although automatic, may be confusing for some patients, and it also depends on how well the sensor can detect arm twist rotations.
  • the same pain map representation and painting procedure can be extended to represent pain for any 3 degrees of freedom joint articulation. It can therefore be extended to represent lower-body, spine and neck motion pain.
  • the shoulder joint is in a location that is suitable for visualization of the spherical pain map interactively while painting it; but for other joints, the representation can easily occlude the pain map being painted.
  • each block or step of a flowchart, and combinations of blocks (and/or steps) in a flowchart, algorithm, formula, or computational depiction can be implemented by various means, such as hardware, firmware, and/or software including one or more computer program instructions embodied in computer-readable program code logic.
  • any such computer program instructions may be loaded onto a computer, including without limitation a general purpose computer or special purpose computer, or other programmable processing apparatus to produce a machine, such that the computer program instructions which execute on the computer or other programmable processing apparatus create means for implementing the functions specified in the block(s) of the flowchart(s).
  • computational depictions support combinations of means for performing the specified functions, combinations of steps for performing the specified functions, and computer program instructions, such as embodied in computer-readable program code logic means, for performing the specified functions. It will also be understood that each block of the flowchart illustrations, algorithms, formulae, or computational depictions and combinations thereof described herein, can be implemented by special purpose hardware-based computer systems which perform the specified functions or steps, or combinations of special purpose hardware and computer-readable program code logic means.
  • computer program instructions, such as those embodied in computer-readable program code logic, may also be stored in a computer-readable memory that can direct a computer or other programmable processing apparatus to function in a particular manner, such that the instructions stored in the computer-readable memory produce an article of manufacture including instruction means which implement the function specified in the block(s) of the flowchart(s).
  • the computer program instructions may also be loaded onto a computer or other programmable processing apparatus to cause a series of operational steps to be performed on the computer or other programmable processing apparatus to produce a computer-implemented process such that the instructions which execute on the computer or other programmable processing apparatus provide steps for implementing the functions specified in the block(s) of the flowchart(s), algorithm(s), formula(e), or computational depiction(s).
  • the terms "programming" or "program executable" as used herein refer to one or more instructions that can be executed by a processor to perform a function as described herein.
  • the instructions can be embodied in software, in firmware, or in a combination of software and firmware.
  • the instructions can be stored local to the device in non-transitory media, or can be stored remotely such as on a server, or all or a portion of the instructions can be stored locally and remotely. Instructions stored remotely can be downloaded (pushed) to the device by user initiation, or automatically based on one or more factors. It will further be appreciated that, as used herein, the terms processor, computer processor, central processing unit (CPU), and computer are used synonymously to denote a device capable of executing the instructions and communicating with input/output interfaces and/or peripheral devices.
  • present disclosure encompasses multiple embodiments which include, but are not limited to, the following:
  • a system for graphically representing pain felt by a user during articulation movement comprising: (a) a motion capture sensor; (b) a computer processor; and (c) a non-transitory memory storing instructions executable on the computer processor; (d) said instructions when executed performing steps comprising: (i) acquiring input from a user, the user input comprising data indicative of pain with respect to one or more locations of the user as the user moves in a capture volume of the motion capture sensor; (ii) acquiring data relating to articulation movement of the user from the motion capture sensor; and (iii) generating a three-dimensional graphic representation of the user's articulation movement and indicated pain levels at locations within said movement.
  • a user input device configured for allowing the user to interactively mark locations indicative of pain within a range of motion of the user's articulation movement.
  • the user input device comprises a wireless hand-held device that allows the user to manually trigger an assigned pain designation according to a location within the user's range of motion.
  • the user input device comprises a voice activated device that allows the user to verbally trigger an assigned pain designation according to a location within the user's range of motion.
  • visualization comprises a temporal recording in the form of a video of the user's articulation movement.
  • visualization comprises color-coded regions to show variable levels of pain as an interactive pain map; wherein each point in the pain map comprises coordinates indicating the corresponding patient orientation for a marked pain level.
  • mapping the color-coded regions is a function of a swing-twist rotation decomposition for a low-deformation and singularity-free 2D diagram representation.
  • a method for graphically representing pain felt by a user during articulation movement comprising: acquiring input from a user, the user input comprising data indicative of pain with respect to one or more locations of the user as the user moves in a capture volume of a motion capture sensor; acquiring data relating to articulation movement of the user from the motion capture sensor; and generating a three-dimensional graphic representation of the user's articulation movement and indicated pain locations within said movement.
  • acquiring input comprises acquiring a signal from a user input device allowing the user to interactively mark locations indicative of pain within a range of motion of the user's articulation movement.
  • the user input device comprises a wireless hand-held device that allows the user to manually trigger an assigned pain designation according to a location within the user's range of motion.
  • the user input device comprises a voice activated device that allows the user to verbally trigger an assigned pain designation according to a location within the user's range of motion.
  • the motion capture sensor comprises a scalable three-dimensional depth-ranging camera sensor system.
  • representation comprises a temporal recording in the form of a video of the user's articulation movement.
  • each point in the pain map comprises coordinates indicating the corresponding patient orientation for a marked pain level.
  • mapping the color- coded regions is a function of a swing-twist rotation decomposition for a low-deformation and singularity-free 2D diagram representation.
  • a system for graphically representing pain felt by a user during articulation movement comprising: (a) a user input device configured for allowing a user to interactively mark locations indicative of pain within a range of motion of the user's articulation movement; (b) a motion capture sensor having a capture volume configured to acquire data relating to the user's articulation movement through at least a portion of the range of motion; (c) a computer processor; and (d) a non-transitory memory storing instructions executable on the computer processor; (e) said instructions comprising: (i) a conversion module for converting articulation sensed by the motion capture sensor to a used representation in a local frame system; and (ii) an interactive module configured for painting a specified indicator into a three-dimensional graphical representation corresponding to the sensed articulation from the motion sensor, said indicator being indicative of pain sensed by the user at a location within the user's articulation movement.
  • conversion module and interactive module generate a three-dimensional spherical representation of the user's articulation movement and indicated pain locations within said movement.
  • the user input device comprises a wireless hand-held device that allows the user to manually trigger an assigned pain designation according to a location within the user's range of motion.
  • the user input device comprises a voice activated device that allows the user to verbally trigger an assigned pain designation according to a location within the user's range of motion.
  • representation comprises a temporal recording in the form of a video of the user's articulation movement.
  • mapping the color-coded regions is a function of a swing-twist rotation decomposition for a low-deformation and singularity-free 2D diagram representation.
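
The texture painting described in the bullets above (storing pain colors in an initially transparent texture and spreading each marked point with a radial basis function) can be sketched as follows. This is a minimal illustration in Python, not the patent's implementation: the Gaussian falloff, the RGBA float texture, and the integer texel coordinates (u, v) are all assumptions, since the document does not specify which radial basis function or texture format is used.

```python
import numpy as np

def paint_point(texture, u, v, color, radius):
    """Blend an RGBA `color` into `texture` around texel (u, v) using a
    Gaussian radial falloff; `radius` (in texels) controls how coarse or
    fine the painted region is, as set by the therapist."""
    h, w, _ = texture.shape
    r = int(np.ceil(3 * radius))            # support of the Gaussian
    for y in range(max(0, v - r), min(h, v + r + 1)):
        for x in range(max(0, u - r), min(w, u + r + 1)):
            wgt = np.exp(-((x - u) ** 2 + (y - v) ** 2) / (2 * radius ** 2))
            # Move the texel color toward the selected pain color and
            # raise its opacity so painted regions show over the model.
            texture[y, x, :3] = (1 - wgt) * texture[y, x, :3] + wgt * np.asarray(color[:3])
            texture[y, x, 3] = max(texture[y, x, 3], wgt * color[3])

# Example: start fully transparent, then paint one point in "high pain" red.
tex = np.zeros((256, 256, 4))               # initially fully transparent
paint_point(tex, u=128, v=96, color=(1.0, 0.0, 0.0, 1.0), radius=6.0)
```

Re-painting a texel with the "no pain" color simply blends it back, which matches the described ability to overwrite a previously painted pain color.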

Abstract

A system and method for spatial and temporal mapping of musculoskeletal pain, unobtrusively and efficiently, using a scalable 3D depth-ranging camera sensor system. The system and methodology are independent of the sensor used and can be directly integrated with any other sensor able to track body motions.

Description

THREE DIMENSIONAL SENSOR-BASED INTERACTIVE
PAIN MAPS FOR LOCALIZING PAIN
CROSS-REFERENCE TO RELATED APPLICATIONS
[0001] This application claims priority to, and the benefit of, U.S. provisional patent application serial number 62/028,804 filed on July 25, 2014, incorporated herein by reference in its entirety.
STATEMENT REGARDING FEDERALLY SPONSORED RESEARCH OR DEVELOPMENT
Not Applicable
INCORPORATION-BY-REFERENCE OF
COMPUTER PROGRAM APPENDIX
[0003] Not Applicable
NOTICE OF MATERIAL SUBJECT TO COPYRIGHT PROTECTION
[0004] A portion of the material in this patent document is subject to
copyright protection under the copyright laws of the United States and of other countries. The owner of the copyright rights has no objection to the facsimile reproduction by anyone of the patent document or the patent disclosure, as it appears in the United States Patent and Trademark Office publicly available file or records, but otherwise reserves all copyright rights whatsoever. The copyright owner does not hereby waive any of its rights to have this patent document maintained in secrecy, including without limitation its rights pursuant to 37 C.F.R. § 1.14.
BACKGROUND
[0005] 1. Technical Field
[0006] This description pertains generally to pain management, and
particularly to sensor-based interactive pain mapping.
[0007] 2. Background Discussion
[0008] Musculoskeletal pain complaints (including neck, back and shoulder joints) represent three of the most common reasons for doctor visits. More U.S. health care dollars are spent treating back and neck pain than almost any other medical condition, but a recent study suggests much of that money may be wasted with little improvement in overall health outcomes and function.
[0009] Shoulder pain complaints are also very prevalent, with estimates that half of the population will experience shoulder pain annually. Moreover, the prevalence of severe shoulder pain increases with age, and one study shows a prevalence of 21% in a population survey of adults over 70 years old. Shoulder disorders are a significant source of morbidity and directly impact a person's ability to perform basic activities of daily living (ADLs, self-care such as feeding, grooming, dressing, and bowel/bladder care). Together, the combined economic impact and lost worker productivity of spinal and upper extremity musculoskeletal complaints are significant at the societal level. Better diagnosis and appropriate therapeutic intervention based on accurate assessment may hold the key to reducing cost and improving health outcomes.
[0010] Currently, and as it has been for decades, the main method of documenting and tracking musculoskeletal pain is the "paper-and-pen" pain diagram or similar methods. This two-dimensional static picture can offer basic information on general location, extent, and type of pain. However, it fails to provide accurate annotation and a detailed spatial and temporal mapping/characterization of pain. This situation limits the quality of the data collected and the perception of the intrinsic 3D information that is associated with joint pain location, such as range of motion or insight into the limitation in function that is associated with pain.
[0011] Several variations of such diagram-based pain representations can be found in the literature, each developed for a specific type of study or purpose. Currently most of the methods are paper-based, although computers and tablet devices have already been used to allow the user to digitally paint the 2D drawings. However, all of these approaches remain based on 2D diagrams and on asking the user to mark the pain locations on the diagram while trying to remember/recall where the pain is felt, which renders this method highly imprecise. Furthermore, pain at joints with multiple degrees of freedom and joint motion is especially difficult to characterize via a 2D pain diagram.
[0012] Accordingly, a system and method to characterize pain at these complex joints is needed; providing such a system is an objective of the present description and represents the motivation for the inventors.
BRIEF SUMMARY
[0013] An aspect of the present disclosure is a system for graphical and quantitative representation that allows users to localize and visualize musculoskeletal pain felt during articulation movement. The system of the present disclosure produces a 3D pain map representation with the use of an interactive application where the patient selects the postures that produce pain as he or she moves in front of a depth-sensing camera such as the Microsoft Kinect. The representation provides a way to describe, quantify, and track pain reduction during treatment. In addition, developing digital data for 3D pain mapping allows reconstruction via 3D printing for medical and health applications.
[0014] In one embodiment, the technology described herein allows users to locate and "paint" pain interactively in a graphical representation as they hold a posture or move in front of a sensor. The representation is also unique in that it is created automatically from a 3D sensor, which minimizes imprecision and provides standardization. The methodology allows a better way to diagnose, track therapy progress and medication effectiveness, visualize for patient education, and plan for interventions such as surgery. Such a solution has the potential to be widely adopted for various healthcare, telemedicine, and fitness applications.
[0015] Further aspects of the technology will be brought out in the following portions of the specification, wherein the detailed description is for the purpose of fully disclosing preferred embodiments of the technology without placing limitations thereon.
BRIEF DESCRIPTION OF THE SEVERAL VIEWS OF THE DRAWING(S)
[0016] The technology described herein will be more fully understood by reference to the following drawings which are for illustrative purposes only:
[0017] FIG. 1 shows a schematic diagram of an exemplary real-time
interactive pain mapping system in accordance with the present description.
[0018] FIG. 2 illustrates a schematic diagram of an exemplary interactive 3D articulation pain mapping system that may be incorporated with the mapping system of FIG. 1, or similar system.
[0019] FIG. 3 shows a digital representation of a patient and representative coordinate system.
[0020] FIG. 4A through FIG. 4C illustrate how different s vectors correspond to the shoulder orientations for patient 3D model.
[0021] FIG. 5A defines a 2D space where each axis represents Euler
angles.
[0022] FIG. 5B through FIG. 5D show the obtained 3D curves
corresponding to circles in the 2D diagram of FIG. 5A observed from three different points of view (front view in FIG. 5B, perspective view in FIG. 5C, top view in FIG. 5D).
[0023] FIG. 6A, FIG. 6B, and FIG. 6C show perspective, front, and top
views of 3D trajectory model produced by axis-angle parameterization in accordance with the present description.
[0024] FIG. 7 and FIG. 8 show screen shots of front and perspective views of a patient through various shoulder orientations and corresponding 3D pain map.
[0025] FIG. 9 shows a corresponding 2D map wherein each point in the pain diagram has precise coordinates indicating the corresponding arm orientation for the marked pain level or type.
[0026] FIG. 10A and FIG. 10B show images configured to represent pain with respect to different spine movements.
DETAILED DESCRIPTION
[0027] The present description includes a method for spatial and temporal mapping of musculoskeletal pain, unobtrusively and efficiently, using a scalable 3D depth-ranging camera sensor system. The system and methodology are independent of the sensor used and can be directly integrated with any other sensor able to track body motions.
[0028] The system of the present description allows the user to interactively "paint" pain locations as he or she moves while being tracked by a motion sensor. Several pain marking methods are possible based on customizable color schemes and interactive interfaces (e.g. by pressing a simple button every time the user wants to notify a pain location, or by verbalization while undergoing painful motion).
[0029] The systems and methods disclosed herein not only provide 3D
visualization and a 2D diagram visualization suitable for forms and documents, but also provide the unique ability of temporally recording, as in a video, the entire movement performed by the user while the user is building a pain map. This provides a unique capability for helping to understand the motions causing pain, any regions being avoided, velocity vectors etc.; and also provides the capability to extract additional data analysis from the full original data that was collected and stored.
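
Since the full movement is recorded and stored, the data behind a pain map can be kept as a simple time-stamped stream. The following is a minimal sketch of such a per-frame record; the field names and types are illustrative assumptions, as the document does not define a storage format.

```python
from dataclasses import dataclass

@dataclass
class PainFrame:
    t: float          # timestamp in seconds
    sx: float         # swing orientation component (degrees)
    sy: float         # swing orientation component (degrees)
    twist: float      # twist angle about the arm axis (degrees), if tracked
    pain_level: int   # 0 = no pain; higher values = selected pain colors
```

Replaying the stream in order gives the video-like review described above, and finite differences between successive swing samples yield the velocity vectors mentioned for later analysis.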
[0030] The systems and methods provide rich information that assists in better characterization and tracking of musculoskeletal pain, providing unprecedented insight into kinematics of pain-causing joint motion disorders.
[0031] The capability to digitally record, analyze, and 3D-graphically
visualize joint pain while dynamically moving the body region of interest provides previously unavailable clinical information to the practitioners.
[0032] The systems and methods improve the diagnostic capability of
practitioners and also improve the efficiency of workflow by recording, tracking, and providing comparisons of digitalized pain maps that can readily be incorporated into electronic medical records (EMR). Furthermore, the systems and methods can be coordinated with affordable 3D printing for visualization, patient education, therapy/intervention tracking, or for surgical planning purposes. Prior 2D-based pain mapping methods do not allow this capability, whereas the systems and methods described herein allow for interaction with 3D printing methods to reconstruct a 3D model of pain for medical/health applications. For example, the 3D pain reconstruction can be overlaid onto an individual's 3D model of shoulder anatomical structures reconstructed from imaging data (e.g. MRI) to provide additional information about the potential pain generator or etiology.
[0033] FIG. 1 shows an exemplary real-time interactive pain mapping
system 10 using a user input device 30 (e.g. an off-the-shelf wireless device/controller) that comprises one or more buttons to be used during the interactive painting procedure simultaneously with dynamic motion of the limb. FIG. 2 illustrates an exemplary interactive 3D articulation pain mapping system 50 that may be incorporated with mapping system 10, or similar system. As shown in FIG. 2, motion capture device 20 preferably comprises an optical sensor providing 3 degree-of-freedom (DOF) rotation tracking of target articulation. For the patient digital representation 52 shown in FIG. 3, the pain map representation is designed for a generic 3 degrees of freedom (DOFs) joint 54. FIG. 4A through FIG. 4C illustrate 3D views 90 of how different s vectors correspond to the shoulder orientations for patient 3D model 60. FIG. 5A defines a 2D space where each axis represents Euler angles. The obtained 3D curves 92, 94, and 96, corresponding to circles 82, 84, and 86 in the 2D diagram 80, are observed from three different points of view (front view in FIG. 5B, perspective view in FIG. 5C, top view in FIG. 5D). FIG. 6A, FIG. 6B, and FIG. 6C show perspective, front, and top views of 3D trajectory model 100 produced by axis-angle parameterization in accordance with the present description.
[0034] In system 10, the patient stands in front of a computing device 12 equipped with a motion tracking sensor 20 while holding a remote controller 32 with button 36, which the user may variably press to mark pain locations interactively.
[0035] Computing device 12 comprises a processor 14 and non-transitory memory 16 for storing application programming (instructions) 18 executable on the processor 14 for interpreting data acquired from a capture volume of motion sensor 20 and user input device 30. In one embodiment shown in FIG. 1 , the pain mapping system 10 comprises a computer 12 having a USB input (not shown) allowing input of a USB receiver 34 for receiving user input data from controller 32.
[0036] In another embodiment (see FIG. 2), the user input device 30 may input the pain location data via verbalization (e.g. via a controller
comprising a microphone) in combination with voice recognition software (as part of application software 18) for automated pain mapping (e.g. via a 3D pain map 36) during dynamic movement of the body part. The painted colored regions may be directly mapped to the 2D swing plane
representation 38, and the result constitutes the proposed pain map.
[0037] As shown in FIG. 2, motion capture device 20 preferably comprises an optical sensor providing 3 degree-of-freedom (DOF) rotation tracking of target articulation. In one embodiment, motion capture device 20 may comprise a Microsoft Kinect sensor, or a similar motion capture device available in the art. In such embodiment, the motion capture device 20 is coupled to computer 12 via USB or like input. While such a 3D depth-ranging camera sensor is suitable as a fixed installation in many settings, an implementation suitable for home use may incorporate data from a user holding a smart phone equipped with motion tracking capabilities. In such embodiments, the smart phone would act to include the functionality of computing device 12 (including an installed app as application software 18), motion tracking/capture device 20, and/or user input device 30 (e.g. via a microphone on the phone). New motion tracking technologies based on feature tracking from the phone's video input may be used to provide translation and orientation information to reconstruct the desired articulation movement.
[0038] While motion capture device 20 preferably comprises an optical sensor, it is appreciated that other embodiments may be employed using other technologies, e.g. magnetic, mechanical, or inertial sensors.
[0039] In another embodiment, a smart phone may also be used to
augment the tracking ability of a depth-ranging camera 20, by providing tracking information when the depth-range sensor 20 cannot distinguish motion. For example, one typical difficult situation for depth-range sensors is when the axis of the user's arm is orthogonal to the sensor's plane and the user twists his or her arm.
[0040] In addition, a smart phone may be used to notify the pain locations collected by the system 10, 50, instead of relying on a dedicated controller held by the user. In other words, the pain mapping system 10, 50 has the flexibility to use other scalable and wireless devices, such as hand-held smart phone or wearable wireless wrist device in combination with the presently illustrated components shown in FIG. 1 and FIG. 2 (e.g. beyond just a hand-held wireless remote controller). Also, the system 10, 50 may utilize sensor capabilities that come with a smart phone, not only by using the typical built-in accelerometer and gyroscope functions, but also by using the camera/video function of the smart phone in an innovative way (for example using panoramic picture/video function) to provide information about arm motion and/or augment reconstruction of the arm motion.
[0041] Referring to FIG. 2, application software 18 may comprise two
primary components. A conversion module 22 may be employed for conversion of sensed articulation/rotation to the used representation in the adopted local frame system (described in further detail below and illustrated further in FIG. 4A through 6C). Further, application software 18 may include an interactive component 24 for painting a current selected color into a 3D spherical representation whenever the corresponding controller button 36 or voice command is detected (illustrated further in FIG. 7 through FIG. 9).
[0042] Referring to the patient digital representation 52 shown in FIG. 3, the pain map representation is designed for a generic 3 degrees of freedom (DOFs) joint 54. Although the majority of embodiments disclosed herein are directed to addressing pain related to the shoulder joint, it is also appreciated that the methodology of the present technology may be applied to back and neck (spinal) joints, or any joint articulation.
[0043] The shoulder orientation is decomposed into the twist and swing rotations of the upper-arm according to axis 56. The swing orientation is defined as a vector s=(sx,sy) defined in a local 2D frame on the sagittal frame centered at the shoulder, with the X axis parallel to the ground and pointing forward, and the Y axis pointing up. The neutral swing orientation is defined as the arm pointing outwards on the coronal plane, along the local shoulder Z axis. Vector s is explained in further detail below.
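The twist-swing decomposition used here is a standard operation on rotations. The following is a minimal sketch of one common formulation, assuming unit quaternions in (w, x, y, z) order and taking the local arm axis as the twist axis; the function names and quaternion convention are assumptions for illustration, not details fixed by the patent.

```python
import numpy as np

def quat_mul(a, b):
    """Hamilton product of quaternions in (w, x, y, z) order."""
    aw, ax, ay, az = a
    bw, bx, by, bz = b
    return np.array([
        aw * bw - ax * bx - ay * by - az * bz,
        aw * bx + ax * bw + ay * bz - az * by,
        aw * by - ax * bz + ay * bw + az * bx,
        aw * bz + ax * by - ay * bx + az * bw,
    ])

def swing_twist(q, twist_axis):
    """Split rotation q into q = swing * twist, where twist rotates about
    twist_axis and swing rotates about an axis perpendicular to it."""
    axis = np.asarray(twist_axis, dtype=float)
    axis /= np.linalg.norm(axis)
    proj = np.dot(q[1:], axis) * axis       # vector part projected on axis
    twist = np.array([q[0], *proj])
    n = np.linalg.norm(twist)
    if n < 1e-9:                            # 180-degree swing: twist undefined
        twist = np.array([1.0, 0.0, 0.0, 0.0])
    else:
        twist /= n
    swing = quat_mul(q, twist * np.array([1.0, -1.0, -1.0, -1.0]))  # q * twist^-1
    return swing, twist
```

Discarding the twist component and expressing the swing's rotation axis in the local XY shoulder frame yields the vector s described below.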
[0044] Detailed shoulder kinematic studies demonstrate antero-posterior, medio-lateral as well as elevation/depression translational motion of the shoulder joint as part of a stereotypical coordinated motion of the shoulder joint contributing to the cardinal motions of the shoulder (shoulder flexion, extension, adduction, abduction, rotation, humeral internal and external rotation, and scaption); however, the simplification of only considering the shoulder joint motion under the described single-joint framework can still offer a reasonable account of the component underlying motions.
[0045] An initial calibration motion protocol can also be implemented in application software 18 to identify a relative correction vector to be applied to the considered origin of the shoulder joint (0,0,0), in case improved measurement precision is needed for special applications. The procedure works as follows: the user is asked to perform a motion protocol that allows the system to compute an estimated joint center of rotation, and the difference between that location and the location of the joint center given by the sensor 20 results in correction vectors to be applied during the pain map construction. The protocol motion starts at the resting position with the arm at the side of the body with palm facing forward, then abduction of the humerus to 90 degrees at horizontal level with palm facing forward, and forward flexion of the shoulder to 90 degrees at horizontal level with fingers pointing forward. The estimated center of rotation for each protocol motion is the average shoulder joint center measured during the performance of each motion.
[0046] Considering that the inherent biological variability is relatively large, and since the spatial resolution of pain location will not be clinically significant for small changes (for example, less than the centimeter range), it is envisioned that the described framework strikes a balanced approach to providing a practical, adequate, and efficient methodology for "mapping" pain.
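A minimal sketch of the calibration arithmetic in paragraph [0045], assuming joint_positions is a sequence of sensor-reported shoulder-joint positions recorded while one protocol motion is performed; the function names are hypothetical.

```python
import numpy as np

def estimated_center(joint_positions):
    """Estimated center of rotation for one protocol motion: the average
    shoulder joint center measured during that motion."""
    return np.mean(np.asarray(joint_positions, dtype=float), axis=0)

def corrected_center(sensed_center, correction):
    """Apply the calibration correction during pain map construction."""
    return np.asarray(sensed_center) + np.asarray(correction)

# The correction vector is the difference between the estimated center
# and the joint center reported by the sensor at calibration time:
#   correction = estimated_center(frames) - sensor_reported_center
```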
[0047] For a given swing rotation s, if s is the null vector, then it represents the neutral arm posture (along the local shoulder Z axis). If s is not null, then it defines the 2D axis of rotation on the local XY shoulder frame to rotate the upper arm from its neutral position by the angle of ||s|| degrees. This representation is known as the axis-angle representation. FIG. 4A through FIG. 4C illustrate how different s vectors correspond to the shoulder orientations for patient 3D model 60. For each posture shown in FIG. 4A through FIG. 4C, the top-left diagram 70 locates the respective orientation s in a 2D axis displayed after a clockwise 90 degrees rotation (e.g. in the X axis and Y axis, which may be color coded). Additional color- coded axes 62 may be depicted for each joint. The diagrams are rotated because in this way the correspondence between the s location and the shoulder orientation becomes more intuitive.
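The axis-angle swing representation above maps directly to a rotation of the neutral arm direction (the local shoulder +Z axis) about the in-plane axis (sx, sy, 0) by ||s|| degrees. Below is a minimal sketch using Rodrigues' rotation formula; it is an illustration consistent with the described representation, not code from the patent.

```python
import numpy as np

def arm_direction(s):
    """Arm direction for swing orientation s = (sx, sy), given in degrees:
    rotate the neutral direction (local +Z) about axis (sx, sy, 0)/||s||
    by the angle ||s||."""
    s = np.asarray(s, dtype=float)
    angle = np.linalg.norm(s)                  # rotation angle in degrees
    if angle < 1e-9:
        return np.array([0.0, 0.0, 1.0])       # neutral arm posture
    axis = np.array([s[0], s[1], 0.0]) / angle
    theta = np.radians(angle)
    v = np.array([0.0, 0.0, 1.0])
    # Rodrigues' formula: v cos(theta) + (axis x v) sin(theta)
    # (the axis.v term vanishes since the axis lies in the XY plane)
    return v * np.cos(theta) + np.cross(axis, v) * np.sin(theta)
```

As a check, s=(-29.8, 41.0) gives ||s|| = sqrt(29.8^2 + 41.0^2) ≈ 50.7 degrees, matching the rotation reported for FIG. 4B below.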
[0048] In FIG. 4A, the shoulder is shown in its neutral orientation s=(0,0).
In FIG. 4B, the shoulder is shown in orientation s=(-29.8, 41.0), which represents a rotation around the s axis of ||s||=50.7 degrees. In FIG. 4C, the shoulder is shown in orientation s=(-65.9, 94.5), which represents a rotation around the s axis of ||s||=115.3 degrees. The ellipse 72 shown in the diagram 70 is used to provide a reference of the expected swing range of motion limit for a healthy adult.
[0049] The used axis-angle representation is important for achieving a joint orientation representation that is intuitive and efficient for the pain map to be represented both in 3D and as a 2D diagram. For example, a popular approach for orientation representation is to rely on independent Euler angles. Referring now to FIG. 5A through FIG. 5D, the 2D diagram 80 shown in FIG. 5A defines a 2D space where each axis represents Euler angles, and the corresponding swing rotation of each point in the diagram is achieved by one rotation around Y and then another rotation around X. The red axis is X and the green axis is Y. The points in the trajectories of the three concentric circles 82, 84, and 86 in the 2D diagram 80 are converted to their respective swing orientations, each orientation is then applied to the shoulder joint, and then the arm intersection with a 3D sphere centered at the joint is computed in order to achieve the 3D visualization of the trajectories. The obtained 3D curves 92, 94, and 96, corresponding to circles 82, 84, and 86 in the 2D diagram 80 are observed from three different points of view (front view in FIG. 5B, perspective view in FIG. 5C, top view in FIG. 5D).
[0050] Deformation effects of the singularities in the Euler angle parameterization are noticeable, making this method unsuitable for a pain map representation. The circles 82, 84, and 86 have radii of 45, 75, and 105 degrees, respectively, in the 2D diagram 80 of FIG. 5A. Thus, FIG. 5A through FIG. 5D illustrate that Euler angles do not provide an acceptable mapping, due to their high mapping deformation and the two singularities that exist inside the reachable range of motion of the joint being measured.
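For comparison, the Euler-angle construction of FIG. 5A may be sketched as follows (an illustrative Python sketch under the stated axis conventions; names are assumptions): each 2D diagram point is converted to an arm direction by one rotation around Y followed by another rotation around X, which is the mapping whose distortions and singularities are visible in FIG. 5B through FIG. 5D.

```python
import numpy as np

def euler_to_direction(ey_deg, ex_deg):
    """Apply a rotation of ey_deg degrees about Y, then ex_deg degrees
    about X, to the neutral arm direction (local +Z), as in FIG. 5A."""
    ey, ex = np.radians(ey_deg), np.radians(ex_deg)
    ry = np.array([[ np.cos(ey), 0.0, np.sin(ey)],
                   [ 0.0,        1.0, 0.0       ],
                   [-np.sin(ey), 0.0, np.cos(ey)]])
    rx = np.array([[1.0, 0.0,         0.0        ],
                   [0.0, np.cos(ex), -np.sin(ex)],
                   [0.0, np.sin(ex),  np.cos(ex)]])
    return rx @ ry @ np.array([0.0, 0.0, 1.0])
```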
[0051] FIG. 6A, FIG. 6B, and FIG. 6C show perspective, front, and top
views of 3D trajectory model 100 produced by axis-angle parameterization in accordance with the present description. Curves 102, 104, and 106 correspond to circles 82, 84, and 86 in the 2D diagram 80 of FIG. 5A. As seen in FIG. 6A, FIG. 6B, and FIG. 6C, the adopted swing-twist parameterization produces a high-quality 3D mapping without distortions, achieving a suitable mapping between the 3D and 2D pain map representations. The above-described axis-angle representation shown in FIG. 6A, FIG. 6B, and FIG. 6C has only one singularity, which is carefully placed in an unreachable location along the -Z axis of the chosen local axis frame.
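By way of example and not limitation, a swing-twist decomposition of the kind relied upon here may be sketched as follows; this illustrative Python sketch assumes the quaternion convention (w, x, y, z) with the twist taken about the local Z axis, and all names are assumptions rather than the disclosed implementation:

```python
import numpy as np

def swing_twist(q):
    """Decompose a unit quaternion q = (w, x, y, z) into swing * twist,
    where twist is the rotation component about the local Z axis."""
    qw, qx, qy, qz = (float(c) for c in q)
    twist = np.array([qw, 0.0, 0.0, qz])  # vector part projected onto Z
    n = np.linalg.norm(twist)
    if n < 1e-9:  # pure 180-degree swing: twist is undefined, use identity
        return np.array([qw, qx, qy, qz]), np.array([1.0, 0.0, 0.0, 0.0])
    twist /= n
    tw, tz = twist[0], twist[3]
    # swing = q * conjugate(twist)  (Hamilton product, twist axis = Z)
    swing = np.array([qw * tw + qz * tz,
                      qx * tw - qy * tz,
                      qy * tw + qx * tz,
                      0.0])  # the swing axis lies in the local XY plane
    return swing, twist

def swing_to_s(swing):
    """Map a swing quaternion to the 2D diagram point s (axis times angle
    in degrees), matching the representation of FIG. 4A through FIG. 4C."""
    w = float(np.clip(swing[0], -1.0, 1.0))
    sin_half = np.sqrt(max(1.0 - w * w, 0.0))
    if sin_half < 1e-9:
        return np.zeros(2)  # neutral posture
    return np.degrees(2.0 * np.arccos(w)) * (swing[1:3] / sin_half)
```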
[0052] Given a system 10, 50 in which a patient or user stands in front of the motion tracking system coupled to a computer 12, the user's shoulder 54 orientation can then be continuously tracked in order to automatically create a 3D pain map 36 on the described swing 2D plane representation illustrated in FIG. 4A through FIG. 4C. In this way, the patient, by operating controller 30, will be painting and 'mapping' where pain is felt across the swing rotation space of the shoulder joint 54. The history of all painted swing orientations constitutes the pain map representation.
[0053] At every tracked frame i, the respective swing orientation s_i is measured, and the intersection point p_i of the upper-arm skeleton segment at orientation s_i with a sphere centered at the shoulder joint is computed. For any given point at which the patient feels pain while moving the shoulder, the patient presses a button 36 on the controller 30 (which is held by the patient), or verbalizes a command, in order to indicate pain at the current swing orientation. It is appreciated that the controller may also comprise a multiple-button configuration for indicating varying pain intensity. For example, a two-button combination may be incorporated wherein a first button 38 selects the current color to be painted, which corresponds to a given intensity or type of pain, and a second button 36 serves to actually paint the active color at location p_i on the sphere. The user may hold the button 36 pressed while moving the arm in order to continuously paint regions of pain. In an alternative embodiment, button 36 may also incorporate pressure sensing such that varying levels of applied pressure represent varying levels of pain.
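By way of example and not limitation, the per-frame painting step of paragraph [0053] may be sketched as follows (an illustrative Python sketch; the pain-history store and all names are assumptions):

```python
import numpy as np

def paint_frame(shoulder_center, arm_direction, radius, button_pressed,
                active_color, pain_history):
    """Compute the intersection p_i of the upper-arm segment with a sphere
    of the given radius centered at the shoulder joint, and record
    (p_i, active_color) while the pain button is held."""
    d = np.asarray(arm_direction, dtype=float)
    p_i = np.asarray(shoulder_center, dtype=float) + radius * d / np.linalg.norm(d)
    if button_pressed:
        pain_history.append((p_i, active_color))
    return p_i
```

Holding the button while moving the arm then produces a continuous trail of painted points, as described above.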
[0054] The screen shots 110 and 120 of FIG. 7 and FIG. 8 show front and perspective views of a patient moving through various shoulder orientations, together with the corresponding 3D pain map 114. For each posture in which the user feels pain, a button is pressed in order to paint the current swing location with the current color. For clarity purposes, the pain map 114 in FIG. 7 and FIG. 8 is shown with a first hatch pattern corresponding to high pain and a second hatch pattern corresponding to low pain, with shades of grey representing intermediate pain. It is appreciated, however, that the actual visual output is color coded, with e.g. a dark blue color reserved to indicate no pain, red reserved to show high pain, and other colors (e.g. light blue, yellow, etc.) specified to show intermediate levels of pain. For purposes of this disclosure, pain mapping will be described with reference to the different color-coding, even though the drawings show various patterns to represent differing levels of pain. In such a representation, the dark blue regions in the 3D map 114 are always painted while the user moves his/her arm. In this way the dark blue color is designated to represent the shoulder range of motion explored during the pain map construction.
[0055] A color-coded ring 118 disposed around the user's arm may also be employed, showing a varying color that represents the color currently selected by the user 112. Window 116 may be used to show additional information to the user.
[0056] As mentioned above, the dark blue color is ideally reserved to
represent no pain within a region, and is always painted while the user moves his/her arm to represent the shoulder range of motion explored during the pain map construction.
[0057] The three other colors are then used in the examples shown to represent varying levels of pain as selected by the user: e.g. light blue, yellow, and red. These colors can represent increasing levels of pain (e.g. light (light blue), moderate or intense (yellow), and severe (red)), or different types of pain (e.g. dull, sharp, burning). The meaning of the colors is assigned by the therapist/doctor according to the application, and additional colors can easily be added or modified for increased flexibility as needed.
[0058] The currently selected color is preferably shown as a colored ring 118 around the user's arm, so that the user 112 maintains focus on his or her arm movement and does not need to look at other locations on the computer screen. Each time one button in the controller 30 is pressed, the current color cycles through the available colors, including the dark blue color indicating no pain. This allows the user to paint dark blue on top of a pain color if he/she desires to change a previously painted pain color. This overall procedure allows the user to intuitively paint his/her pain map until the obtained results look accurate. FIG. 9 illustrates one map diagram produced by our prototype application.
[0059] FIG. 9 shows a corresponding 2D map 130, wherein each point 132 in the pain diagram has precise coordinates indicating the corresponding arm orientation for the marked pain level or type. The marked ellipse 134 (of variable dimensions) may be used to indicate the expected range of motion to be explored by the patient.
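By way of example and not limitation, the button-driven color cycling described in paragraph [0058] may be sketched as follows (an illustrative Python sketch; the palette shown is the example palette discussed above, and the names are assumptions):

```python
# Example palette: no pain, then increasing levels of pain.
PAIN_COLORS = ["dark_blue", "light_blue", "yellow", "red"]

def next_color(current_color):
    """Advance to the next available color on each button press, wrapping
    back around to dark blue (no pain)."""
    i = PAIN_COLORS.index(current_color)
    return PAIN_COLORS[(i + 1) % len(PAIN_COLORS)]
```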
[0060] The colors on the pain maps shown in FIG. 7 through FIG. 9 may be stored in a texture image that is initially fully transparent. For every painted point p_i, its position in the texture is determined and the corresponding texture pixel has its color changed to reflect the selected color. Each time a position is painted, a radial basis function is preferably used to distribute the color to a region around the marked point. The radius of the region is a parameter of the system, and it defines how coarsely or finely each marked location is painted with the current color. Normally this radius is fixed by the therapist. However, the system may allow the patient to paint a more precise variation of colors, with the therapist then fine-tuning the painting radius.
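By way of example and not limitation, the radial-basis painting of paragraph [0060] may be sketched as follows; this illustrative Python sketch assumes a Gaussian falloff as the radial basis function, blends RGB values directly (alpha handling is omitted for brevity), and uses names that are assumptions only:

```python
import numpy as np

def paint_texture(texture, u, v, color, radius):
    """Blend `color` into `texture` around texel (u, v) with a Gaussian
    radial falloff whose extent is set by `radius` (in texels)."""
    h, w = texture.shape[:2]
    color = np.asarray(color, dtype=float)
    sigma = radius / 2.0
    for j in range(max(0, v - radius), min(h, v + radius + 1)):
        for i in range(max(0, u - radius), min(w, u + radius + 1)):
            d2 = (i - u) ** 2 + (j - v) ** 2
            weight = np.exp(-d2 / (2.0 * sigma * sigma))
            texture[j, i] = (1.0 - weight) * texture[j, i] + weight * color
    return texture
```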
[0061] Referring back to FIG. 2, an export module 40 may be implemented in executable application software for display or transfer of acquired data. The module 40 may include an option 42 for the full 3D collected map (e.g. similar to map 114 shown in FIG. 7 and FIG. 8) to be reloaded and/or visualized by the physician and/or patient as a .jpg or like image. Temporal data may also be included so that the 3D map is presented as a video showing pain regions through the patient's articulation of the joint. In addition, for improved visualization, a 3D printing module 44 may be included such that the acquired digital data of the pain map 114 is reproduced as a physical 3D model (not shown) using 3D printing. A 2D module 46 may also be included for generating a 2D map (similar to map 130 shown in FIG. 9) that is suitable for paper documents.
[0062] The above-described pain mapping system 10/50 can also be used in combination with clinically useful predefined movement protocols designed to elicit pain (provocative maneuvers). For example, when these provocative maneuvers are selected under a pain mapping menu, the specific protocols and characterization of pain will provide easily recognized patterns of pain that can be correlated to known biomechanics and pathologic processes.
[0063] It is appreciated that the systems and methods detailed above for shoulder pain may also be extended to other regions of the body. FIG. 10A and FIG. 10B show images 140 and 150, respectively, configured to represent pain with respect to different spine movements. In such a configuration, while the user moves his or her spine, a pain map 146 can be colored and displayed together with a skeleton representation 142 in order to reduce occlusion of the pain map being painted. While a simple cylinder-based skeleton representation is shown in FIG. 10A and FIG. 10B, the system can also display a realistic human skeleton including all cervical, thoracic, and lumbar spine joints in order to better assist the therapist's analysis.
[0064] A generic way to extend the solution to other joints is to display the user's character representation as a simple skeleton as shown in FIG. 10A and FIG. 10B (or to use respective spinal vertebral bone models) to minimize occlusion, and then to display the spherical pain map being painted centered at the joint of interest. It is envisioned that the patient can again control the specification of pain at different levels of the vertebrae using either buttons (remote control) or verbalization. The pain map representation can therefore be applied to a generic 3 degrees of freedom (DOFs) motion centered around any selected joint.
[0065] For representing back pain, we consider the vector from the base lumbar joint of the spine to the top thoracic vertebra. Twist rotations that are sensed on the spine joints are also applied to this representative vector. Such a vector augmented with a twist rotation simplifies the spine configuration to a single 3-DOF rotation that can then be directly mapped to our pain map representation, as sketched below. For example, lumbar region back pain can be mapped with regard to the cardinal motions (flexion, extension, side bending, and rotation), as well as certain combinations of movements that are clinically useful provocative maneuvers.
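By way of example and not limitation, this reduction of the spine to a single representative vector may be sketched as follows (an illustrative Python sketch; the upright neutral direction and all names are assumptions, and the sensed twist is carried alongside as described above):

```python
import numpy as np

def spine_to_swing(lumbar_base, top_thoracic, neutral=(0.0, 0.0, 1.0)):
    """Express the vector from the base lumbar joint to the top thoracic
    vertebra as the 2D swing point s of the pain map representation."""
    v = np.asarray(top_thoracic, dtype=float) - np.asarray(lumbar_base, dtype=float)
    v /= np.linalg.norm(v)
    n = np.asarray(neutral, dtype=float)
    axis = np.cross(n, v)  # lies in the local XY plane for an upright neutral
    sin_angle = np.linalg.norm(axis)
    if sin_angle < 1e-9:
        return np.zeros(2)  # upright spine: neutral posture
    angle = np.degrees(np.arccos(np.clip(np.dot(n, v), -1.0, 1.0)))
    return angle * (axis[:2] / sin_angle)
```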
[0066] The same approach can be extended to the neck joints, such that cervical region neck pain can similarly be mapped with regard to the cardinal motions and available provocative maneuvers. In such configurations, the mapping is constructed by considering the vector from the base/lower cervical vertebrae to the rostral cranio-cervical joint.
[0067] In another embodiment (not shown), the system 10, 50 may be configured to represent pain for different arm twist rotations (pronation or supination). For example, the system 10, 50 may allow for painting three different pain maps at the same time: one to represent pain in the neutral arm, another for the pronated arm, and another for the supinated arm. Options may be provided allowing the therapist at any point to change the current pain map being painted and instruct the patient to explore sensations in the corresponding neutral, supinated, or pronated arm; alternatively, the twist orientation of the user's arm can be tracked and used to automatically switch between the pain maps being painted. The second option, although automatic, may be confusing for some patients, and it also depends on how well the sensor can detect arm twist rotations. Therefore, both options are provided by the system 10, 50 so that the therapist can decide on the most suitable solution.
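By way of example and not limitation, the automatic map-switching option may be sketched as follows (an illustrative Python sketch; the +/-45 degree twist thresholds and all names are assumptions, not values from this disclosure):

```python
def select_pain_map(twist_degrees, pain_maps):
    """Choose which of the three simultaneous pain maps to paint based on
    the tracked forearm twist angle (positive taken here as pronation)."""
    if twist_degrees >= 45.0:
        return pain_maps["pronated"]
    if twist_degrees <= -45.0:
        return pain_maps["supinated"]
    return pain_maps["neutral"]
```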
[0068] The same pain map representation and painting procedure can be extended to represent pain for any 3-DOF joint articulation. It can therefore be extended to represent lower-body, spine, and neck motion pain. The shoulder joint is in a location that is suitable for interactive visualization of the spherical pain map while painting it; for other joints, however, the body representation can easily occlude the pain map being painted.
[0069] Embodiments of the present technology may be described with
reference to flowchart illustrations of methods and systems according to embodiments of the technology, and/or algorithms, formulae, or other computational depictions, which may also be implemented as computer program products. In this regard, each block or step of a flowchart, and combinations of blocks (and/or steps) in a flowchart, algorithm, formula, or computational depiction can be implemented by various means, such as hardware, firmware, and/or software including one or more computer program instructions embodied in computer-readable program code logic. As will be appreciated, any such computer program instructions may be loaded onto a computer, including without limitation a general purpose computer or special purpose computer, or other programmable processing apparatus to produce a machine, such that the computer program instructions which execute on the computer or other programmable processing apparatus create means for implementing the functions specified in the block(s) of the flowchart(s).
[0070] Accordingly, blocks of the flowcharts, algorithms, formulae, or
computational depictions support combinations of means for performing the specified functions, combinations of steps for performing the specified functions, and computer program instructions, such as embodied in computer-readable program code logic means, for performing the specified functions. It will also be understood that each block of the flowchart illustrations, algorithms, formulae, or computational depictions and combinations thereof described herein, can be implemented by special purpose hardware-based computer systems which perform the specified functions or steps, or combinations of special purpose hardware and computer-readable program code logic means.
[0071] Furthermore, these computer program instructions, such as
embodied in computer-readable program code logic, may also be stored in a computer-readable memory that can direct a computer or other programmable processing apparatus to function in a particular manner, such that the instructions stored in the computer-readable memory produce an article of manufacture including instruction means which implement the function specified in the block(s) of the flowchart(s). The computer program instructions may also be loaded onto a computer or other programmable processing apparatus to cause a series of operational steps to be performed on the computer or other programmable processing apparatus to produce a computer-implemented process such that the instructions which execute on the computer or other programmable processing apparatus provide steps for implementing the functions specified in the block(s) of the flowchart(s), algorithm(s), formula(e), or computational depiction(s).
[0072] It will further be appreciated that the terms "programming" or
"program executable" as used herein refer to one or more instructions that can be executed by a processor to perform a function as described herein. The instructions can be embodied in software, in firmware, or in a
combination of software and firmware. The instructions can be stored local to the device in non-transitory media, or can be stored remotely such as on a server, or all or a portion of the instructions can be stored locally and remotely. Instructions stored remotely can be downloaded (pushed) to the device by user initiation, or automatically based on one or more factors. It will further be appreciated that, as used herein, the terms processor, computer processor, central processing unit (CPU), and computer are used synonymously to denote a device capable of executing the instructions and communicating with input/output interfaces and/or peripheral devices.
[0073] From the description herein, it will be appreciated that the
present disclosure encompasses multiple embodiments which include, but are not limited to, the following:
[0074] 1. A system for graphically representing pain felt by a user during articulation movement, the system comprising: (a) a motion capture sensor; (b) a computer processor; and (c) a non-transitory memory storing instructions executable on the computer processor; (d) said instructions when executed performing steps comprising: (i) acquiring input from a user, the user input comprising data indicative of pain with respect to one or more locations of the user as the user moves in a capture volume of the motion capture sensor; (ii) acquiring data relating to articulation movement of the user from the motion capture sensor; and (iii) generating a three-dimensional graphic representation of the user's articulation movement and indicated pain levels at locations within said movement.
[0075] 2. The system of any preceding embodiment, further comprising: a user input device configured for allowing the user to interactively mark locations indicative of pain within a range of motion of the user's articulation movement. [0076] 3. The system of any preceding embodiment, wherein the user input device comprises a wireless hand-held device that allows the user to manually trigger an assigned pain designation according to a location within the user's range of motion.
[0077] 4. The system of any preceding embodiment, wherein the user input device comprises a voice activated device that allows the user to verbally trigger an assigned pain designation according to a location within the user's range of motion.
[0078] 5. The system of any preceding embodiment, wherein the motion capture sensor comprises a scalable three-dimensional depth-ranging camera sensor system.
[0079] 6. The system of any preceding embodiment, wherein the 3D
visualization comprises a temporal recording in the form of a video of the user's articulation movement.
[0080] 7. The system of any preceding embodiment, wherein the 3D
visualization comprises color-coded regions to show variable levels of pain as an interactive pain map; wherein each point in the pain map comprises coordinates indicating the corresponding patient orientation for a marked pain level.
[0081] 8. A system as recited in claim 7, wherein said instructions when executed by the computer processor further perform steps comprising mapping the color-coded regions to a 2D swing plane representation.
[0082] 9. The system of any preceding embodiment, wherein mapping the color-coded regions is a function of a swing-twist rotation decomposition for a low-deformation and singularity-free 2D diagram representation.
[0083] 10. A system as recited in claim 2, wherein said instructions when executed by the computer processor further perform steps comprising reproducing the 3D visualization into a physical 3D model via a 3D printer.
[0084] 11. A method for graphically representing pain felt by a user during articulation movement, comprising: acquiring input from a user, the user input comprising data indicative of pain with respect to one or more locations of the user as the user moves in a capture volume of a motion capture sensor; acquiring data relating to articulation movement of the user from the motion capture sensor; and generating a three-dimensional graphic representation of the user's articulation movement and indicated pain locations within said movement.
[0085] 12. The method of any preceding embodiment, wherein acquiring input comprises acquiring a signal from a user input device allowing the user to interactively mark locations indicative of pain within a range of motion of the user's articulation movement.
[0086] 13. The method of any preceding embodiment, wherein the user input device comprises a wireless hand-held device that allows the user to manually trigger an assigned pain designation according to a location within the user's range of motion.
[0087] 14. The method of any preceding embodiment, wherein the user input device comprises a voice activated device that allows the user to verbally trigger an assigned pain designation according to a location within the user's range of motion.
[0088] 15. The method of any preceding embodiment, wherein the motion capture sensor comprises a scalable three-dimensional depth-ranging camera sensor system.
[0089] 16. The method of any preceding embodiment, wherein the 3D
representation comprises a temporal recording in the form of a video of the user's articulation movement.
[0090] 17. The method of any preceding embodiment: wherein the 3D
representation comprises color-coded regions to show variable levels of pain as an interactive pain map; and wherein each point in the pain map comprises coordinates indicating the corresponding patient orientation for a marked pain level.
[0091] 18. The method of any preceding embodiment, the method further comprising: mapping the color-coded regions to a 2D swing plane representation.
[0092] 19. A method as recited in claim 18, wherein mapping the color-coded regions is a function of a swing-twist rotation decomposition for a low-deformation and singularity-free 2D diagram representation.
[0093] 20. The method of any preceding embodiment, the method further comprising: reproducing the 3D visualization into a physical 3D model via a 3D printer.
[0094] 21. A system for graphically representing pain felt by a user during articulation movement, the system comprising: (a) a user input device configured for allowing a user to interactively mark locations indicative of pain within a range of motion of the user's articulation movement; (b) a motion capture sensor having a capture volume configured to acquire data relating to the user's articulation movement through at least a portion of the range of motion; (c) a computer processor; and (d) a non-transitory memory storing instructions executable on the computer processor; (e) said instructions comprising: (i) a conversion module for converting articulation sensed by the motion capture sensor to a used representation in a local frame system; and (ii) an interactive module configured for painting a specified indicator into a three-dimensional graphical representation corresponding to the sensed articulation from the motion sensor, said indicator being indicative of pain sensed by the user at a location within the user's articulation movement.
[0095] 22. The system of any preceding embodiment, wherein the
conversion module and interactive module generate a three-dimensional spherical representation of the user's articulation movement and indicated pain locations within said movement.
[0096] 23. The system of any preceding embodiment, wherein the user input device comprises a wireless hand-held device that allows the user to manually trigger an assigned pain designation according to a location within the user's range of motion.
[0097] 24. The system of any preceding embodiment, wherein the user input device comprises a voice activated device that allows the user to verbally trigger an assigned pain designation according to a location within the user's range of motion.
[0098] 25. The system of any preceding embodiment, wherein the motion capture sensor comprises a scalable three-dimensional depth-ranging camera sensor system.
[0099] 26. The system of any preceding embodiment, wherein the 3D
representation comprises a temporal recording in the form of a video of the user's articulation movement.
[00100] 27. The system of any preceding embodiment: wherein the specified indicator of the 3D representation comprises color-coded regions to show variable levels of pain as an interactive pain map; and wherein each point in the pain map comprises coordinates indicating the corresponding patient orientation for a marked pain level.
[00101] 28. A system as recited in claim 27, wherein said instructions when executed by the computer processor perform steps comprising mapping the color-coded regions to a 2D swing plane representation.
[00102] 29. The system of any preceding embodiment, wherein mapping the color-coded regions is a function of a swing-twist rotation decomposition for a low-deformation and singularity-free 2D diagram representation.
[00103] 30. The system of any preceding embodiment, wherein the 3D
representation is generated via a swing-twist parameterization.
[00104] Although the description herein contains many details, these should not be construed as limiting the scope of the disclosure but as merely providing illustrations of some of the presently preferred embodiments. Therefore, it will be appreciated that the scope of the disclosure fully encompasses other embodiments which may become obvious to those skilled in the art.
[00105] In the claims, reference to an element in the singular is not intended to mean "one and only one" unless explicitly so stated, but rather "one or more." All structural, chemical, and functional equivalents to the elements of the disclosed embodiments that are known to those of ordinary skill in the art are expressly incorporated herein by reference and are intended to be encompassed by the present claims. Furthermore, no element, component, or method step in the present disclosure is intended to be dedicated to the public regardless of whether the element, component, or method step is explicitly recited in the claims. No claim element herein is to be construed as a "means plus function" element unless the element is expressly recited using the phrase "means for". No claim element herein is to be construed as a "step plus function" element unless the element is expressly recited using the phrase "step for".

Claims

What is claimed is:
1. A system for graphically representing pain felt by a user during articulation movement, the system comprising:
(a) a motion capture sensor;
(b) a computer processor; and
(c) a non-transitory memory storing instructions executable on the computer processor;
(d) said instructions when executed performing steps comprising:
(i) acquiring input from a user, the user input comprising data indicative of pain with respect to one or more locations of the user as the user moves in a capture volume of the motion capture sensor;
(ii) acquiring data relating to articulation movement of the user from the motion capture sensor; and
(iii) generating a three-dimensional graphic representation of the user's articulation movement and indicated pain levels at locations within said movement.
2. A system as recited in claim 1, further comprising:
a user input device configured for allowing the user to interactively mark locations indicative of pain within a range of motion of the user's articulation movement.
3. A system as recited in claim 2, wherein the user input device comprises a wireless hand-held device that allows the user to manually trigger an assigned pain designation according to a location within the user's range of motion.
4. A system as recited in claim 2, wherein the user input device comprises a voice activated device that allows the user to verbally trigger an assigned pain designation according to a location within the user's range of motion.
5. A system as recited in claim 2, wherein the motion capture sensor comprises a scalable three-dimensional depth-ranging camera sensor system.
6. A system as recited in claim 2, wherein the 3D visualization comprises a temporal recording in the form of a video of the user's articulation movement.
7. A system as recited in claim 2, wherein the 3D visualization comprises color-coded regions to show variable levels of pain as an interactive pain map; wherein each point in the pain map comprises coordinates indicating the corresponding patient orientation for a marked pain level.
8. A system as recited in claim 7, wherein said instructions when executed by the computer processor further perform steps comprising mapping the color-coded regions to a 2D swing plane representation.
9. A system as recited in claim 8, wherein mapping the color-coded regions is a function of a swing-twist rotation decomposition for a low-deformation and singularity-free 2D diagram representation.
10. A system as recited in claim 2, wherein said instructions when executed by the computer processor further perform steps comprising reproducing the 3D visualization into a physical 3D model via a 3D printer.
11. A method for graphically representing pain felt by a user during articulation movement, comprising:
acquiring input from a user, the user input comprising data indicative of pain with respect to one or more locations of the user as the user moves in a capture volume of a motion capture sensor;
acquiring data relating to articulation movement of the user from the motion capture sensor; and
generating a three-dimensional graphic representation of the user's articulation movement and indicated pain locations within said movement.
12. A method as recited in claim 11, wherein acquiring input comprises acquiring a signal from a user input device allowing the user to interactively mark locations indicative of pain within a range of motion of the user's articulation movement.
13. A method as recited in claim 12, wherein the user input device comprises a wireless hand-held device that allows the user to manually trigger an assigned pain designation according to a location within the user's range of motion.
14. A method as recited in claim 12, wherein the user input device comprises a voice activated device that allows the user to verbally trigger an assigned pain designation according to a location within the user's range of motion.
15. A method as recited in claim 12, wherein the motion capture sensor comprises a scalable three-dimensional depth-ranging camera sensor system.
16. A method as recited in claim 12, wherein the 3D representation comprises a temporal recording in the form of a video of the user's articulation movement.
17. A method as recited in claim 12:
wherein the 3D representation comprises color-coded regions to show variable levels of pain as an interactive pain map; and
wherein each point in the pain map comprises coordinates indicating the corresponding patient orientation for a marked pain level.
18. A method as recited in claim 17, the method further comprising: mapping the color-coded regions to a 2D swing plane representation.
19. A method as recited in claim 18, wherein mapping the color-coded regions is a function of a swing-twist rotation decomposition for a low-deformation and singularity-free 2D diagram representation.
20. A method as recited in claim 12, the method further comprising: reproducing the 3D visualization into a physical 3D model via a 3D printer.
21. A system for graphically representing pain felt by a user during articulation movement, the system comprising:
(a) a user input device configured for allowing a user to interactively mark locations indicative of pain within a range of motion of the user's articulation movement;
(b) a motion capture sensor having a capture volume configured to acquire data relating to the user's articulation movement through at least a portion of the range of motion;
(c) a computer processor; and
(d) a non-transitory memory storing instructions executable on the computer processor;
(e) said instructions comprising:
(i) a conversion module for converting articulation sensed by the motion capture sensor to a used representation in a local frame system; and
(ii) an interactive module configured for painting a specified indicator into a three-dimensional graphical representation corresponding to the sensed articulation from the motion sensor, said indicator being indicative of pain sensed by the user at a location within the user's articulation movement.
22. A system as recited in claim 21, wherein the conversion module and interactive module generate a three-dimensional spherical representation of the user's articulation movement and indicated pain locations within said movement.
23. A system as recited in claim 22, wherein the user input device comprises a wireless hand-held device that allows the user to manually trigger an assigned pain designation according to a location within the user's range of motion.
24. A system as recited in claim 22, wherein the user input device comprises a voice activated device that allows the user to verbally trigger an assigned pain designation according to a location within the user's range of motion.
25. A system as recited in claim 22, wherein the motion capture sensor comprises a scalable three-dimensional depth-ranging camera sensor system.
26. A system as recited in claim 22, wherein the 3D representation comprises a temporal recording in the form of a video of the user's articulation movement.
27. A system as recited in claim 22:
wherein the specified indicator of the 3D representation comprises color-coded regions to show variable levels of pain as an interactive pain map; and wherein each point in the pain map comprises coordinates indicating the corresponding patient orientation for a marked pain level.
28. A system as recited in claim 27, wherein said instructions when executed by the computer processor perform steps comprising mapping the color-coded regions to a 2D swing plane representation.
29. A system as recited in claim 28, wherein mapping the color-coded regions is a function of a swing-twist rotation decomposition for a low-deformation and singularity-free 2D diagram representation.
30. A system as recited in claim 22, wherein the 3D representation is generated via a swing-twist parameterization.
PCT/US2015/035185 2014-07-25 2015-06-10 Three dimensional sensor-based interactive pain maps for localizing pain WO2016014163A1 (en)

Applications Claiming Priority (2)

Application Number Priority Date Filing Date Title
US201462028804P 2014-07-25 2014-07-25
US62/028,804 2014-07-25

Publications (1)

Publication Number Publication Date
WO2016014163A1 true WO2016014163A1 (en) 2016-01-28

Family

ID=55163499

Family Applications (1)

Application Number Title Priority Date Filing Date
PCT/US2015/035185 WO2016014163A1 (en) 2014-07-25 2015-06-10 Three dimensional sensor-based interactive pain maps for localizing pain

Country Status (1)

Country Link
WO (1) WO2016014163A1 (en)

Patent Citations (8)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US20070270214A1 (en) * 2005-01-26 2007-11-22 Bentley Kinetics, Inc. Method and system for athletic motion analysis and instruction
US20060241720A1 (en) * 2005-04-26 2006-10-26 Woods Carla M Graphical representation of pain therapy
US20090051647A1 (en) * 2007-08-24 2009-02-26 Hon Hai Precision Industry Co., Ltd. Portable electronic device with motion sensing module
US20100324457A1 (en) * 2008-12-10 2010-12-23 Jacob Bean Skeletal-muscular position monitoring device
US20110222081A1 (en) * 2010-03-15 2011-09-15 Chen Yi Printing Three-Dimensional Objects Using Hybrid Format Data
US20120127157A1 (en) * 2010-11-24 2012-05-24 Fujitsu Limited Recording and Analyzing Data on a 3D Avatar
US20130324857A1 (en) * 2012-05-31 2013-12-05 The Regents Of The University Of California Automated system for workspace, range of motion and functional analysis
US20140063003A1 (en) * 2012-08-31 2014-03-06 Greatbatch Ltd. Method and System of Producing 2D Representations of 3D Pain and Stimulation Maps and Implant Models on a Clinician Programmer

Cited By (3)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
WO2017088047A1 (en) * 2015-11-24 2017-06-01 Bakker Ryan M System, device and method for monitoring physical recovery
WO2017132563A1 (en) * 2016-01-29 2017-08-03 Baylor Research Institute Joint disorder diagnosis with 3d motion capture
US20210085220A1 (en) * 2018-06-19 2021-03-25 Tornier, Inc. Extended reality visualization of range of motion

Similar Documents

Publication Publication Date Title
JP7091531B2 (en) Methods for physical gesture interface and projection display
JP6675462B2 (en) Motion information processing device
US20220336078A1 (en) System and method for tracking a portion of the user as a proxy for non-monitored instrument
Zhou et al. Human motion tracking for rehabilitation—A survey
US10635782B2 (en) Physical examination method and apparatus
US20130324857A1 (en) Automated system for workspace, range of motion and functional analysis
JP6381918B2 (en) Motion information processing device
JP6181373B2 (en) Medical information processing apparatus and program
CN104274183A (en) Motion information processing apparatus
Kurillo et al. Upper extremity reachable workspace evaluation with Kinect
KR20140132649A (en) Haptic glove and Surgical robot system
CN111091732A (en) Cardio-pulmonary resuscitation (CPR) guiding device and method based on AR technology
JP2021128794A (en) Stereoscopic image generation system, stereoscopic image generation method, and stereoscopic image generation program
Chèze Kinematic analysis of human movement
JP2015503393A (en) Method and apparatus for tracking hand and / or wrist rotation of a user performing exercise
WO2016014163A1 (en) Three dimensional sensor-based interactive pain maps for localizing pain
JP6162517B2 (en) Position determination support apparatus and medical image diagnostic apparatus
US11179065B2 (en) Systems, devices, and methods for determining an overall motion and flexibility envelope
JP2020054433A (en) Body posture detection system
CN106164821B (en) The method and system that movement of the limbs reference point in scheduled 3d space is assessed
EP4181789B1 (en) One-dimensional position indicator
JP2018047035A (en) Medical support method and medical support device
WO2022219491A1 (en) System and method for tracking a portion of the user as a proxy for non-monitored instrument
CN116453715A (en) Remote palpation method and system
CN117480569A (en) System and method for tracking a portion of a user as a proxy for non-monitoring instrumentation

Legal Events

Date Code Title Description
121 Ep: the epo has been informed by wipo that ep was designated in this application

Ref document number: 15824238

Country of ref document: EP

Kind code of ref document: A1

NENP Non-entry into the national phase

Ref country code: DE

122 Ep: pct application non-entry in european phase

Ref document number: 15824238

Country of ref document: EP

Kind code of ref document: A1