WO2019152566A1 - Systems and methods for subject specific kinematic mapping - Google Patents

Systems and methods for subject specific kinematic mapping

Info

Publication number
WO2019152566A1
Authority
WO
WIPO (PCT)
Prior art keywords
kinematic
imu
array
mapping
subject specific
Prior art date
Application number
PCT/US2019/015923
Other languages
English (en)
Inventor
Veronica Jade SANTOS
Eric Richard PELTOLA
Eunsuk CHONG
Original Assignee
The Regents Of The University Of California
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by The Regents Of The University Of California
Publication of WO2019152566A1

Classifications

    • A HUMAN NECESSITIES
    • A61 MEDICAL OR VETERINARY SCIENCE; HYGIENE
    • A61F FILTERS IMPLANTABLE INTO BLOOD VESSELS; PROSTHESES; DEVICES PROVIDING PATENCY TO, OR PREVENTING COLLAPSING OF, TUBULAR STRUCTURES OF THE BODY, e.g. STENTS; ORTHOPAEDIC, NURSING OR CONTRACEPTIVE DEVICES; FOMENTATION; TREATMENT OR PROTECTION OF EYES OR EARS; BANDAGES, DRESSINGS OR ABSORBENT PADS; FIRST-AID KITS
    • A61F 2/00 Filters implantable into blood vessels; Prostheses, i.e. artificial substitutes or replacements for parts of the body; Appliances for connecting them with the body; Devices providing patency to, or preventing collapsing of, tubular structures of the body, e.g. stents
    • A61F 2/50 Prostheses not implantable in the body
    • A HUMAN NECESSITIES
    • A61 MEDICAL OR VETERINARY SCIENCE; HYGIENE
    • A61B DIAGNOSIS; SURGERY; IDENTIFICATION
    • A61B 5/00 Measuring for diagnostic purposes; Identification of persons
    • A61B 5/103 Detecting, measuring or recording devices for testing the shape, pattern, colour, size or movement of the body or parts thereof, for diagnostic purposes
    • A61B 5/11 Measuring movement of the entire body or parts thereof, e.g. head or hand tremor, mobility of a limb
    • A61B 5/1113 Local tracking of patients, e.g. in a hospital or private home
    • A61B 5/1114 Tracking parts of the body
    • A HUMAN NECESSITIES
    • A61 MEDICAL OR VETERINARY SCIENCE; HYGIENE
    • A61B DIAGNOSIS; SURGERY; IDENTIFICATION
    • A61B 5/00 Measuring for diagnostic purposes; Identification of persons
    • A61B 5/103 Detecting, measuring or recording devices for testing the shape, pattern, colour, size or movement of the body or parts thereof, for diagnostic purposes
    • A61B 5/11 Measuring movement of the entire body or parts thereof, e.g. head or hand tremor, mobility of a limb
    • A61B 5/1121 Determining geometric values, e.g. centre of rotation or angular range of movement
    • A HUMAN NECESSITIES
    • A61 MEDICAL OR VETERINARY SCIENCE; HYGIENE
    • A61B DIAGNOSIS; SURGERY; IDENTIFICATION
    • A61B 5/00 Measuring for diagnostic purposes; Identification of persons
    • A61B 5/103 Detecting, measuring or recording devices for testing the shape, pattern, colour, size or movement of the body or parts thereof, for diagnostic purposes
    • A61B 5/11 Measuring movement of the entire body or parts thereof, e.g. head or hand tremor, mobility of a limb
    • A61B 5/1126 Measuring movement of the entire body or parts thereof, e.g. head or hand tremor, mobility of a limb using a particular sensing technique
    • A HUMAN NECESSITIES
    • A61 MEDICAL OR VETERINARY SCIENCE; HYGIENE
    • A61B DIAGNOSIS; SURGERY; IDENTIFICATION
    • A61B 5/00 Measuring for diagnostic purposes; Identification of persons
    • A61B 5/72 Signal processing specially adapted for physiological signals or for diagnostic purposes
    • A61B 5/7271 Specific aspects of physiological measurement analysis
    • A61B 5/7278 Artificial waveform generation or derivation, e.g. synthesising signals from measured signals
    • A HUMAN NECESSITIES
    • A61 MEDICAL OR VETERINARY SCIENCE; HYGIENE
    • A61F FILTERS IMPLANTABLE INTO BLOOD VESSELS; PROSTHESES; DEVICES PROVIDING PATENCY TO, OR PREVENTING COLLAPSING OF, TUBULAR STRUCTURES OF THE BODY, e.g. STENTS; ORTHOPAEDIC, NURSING OR CONTRACEPTIVE DEVICES; FOMENTATION; TREATMENT OR PROTECTION OF EYES OR EARS; BANDAGES, DRESSINGS OR ABSORBENT PADS; FIRST-AID KITS
    • A61F 2/00 Filters implantable into blood vessels; Prostheses, i.e. artificial substitutes or replacements for parts of the body; Appliances for connecting them with the body; Devices providing patency to, or preventing collapsing of, tubular structures of the body, e.g. stents
    • A61F 2/50 Prostheses not implantable in the body
    • A61F 2/76 Means for assembling, fitting or testing prostheses, e.g. for measuring or balancing, e.g. alignment means
    • A HUMAN NECESSITIES
    • A61 MEDICAL OR VETERINARY SCIENCE; HYGIENE
    • A61F FILTERS IMPLANTABLE INTO BLOOD VESSELS; PROSTHESES; DEVICES PROVIDING PATENCY TO, OR PREVENTING COLLAPSING OF, TUBULAR STRUCTURES OF THE BODY, e.g. STENTS; ORTHOPAEDIC, NURSING OR CONTRACEPTIVE DEVICES; FOMENTATION; TREATMENT OR PROTECTION OF EYES OR EARS; BANDAGES, DRESSINGS OR ABSORBENT PADS; FIRST-AID KITS
    • A61F 2/00 Filters implantable into blood vessels; Prostheses, i.e. artificial substitutes or replacements for parts of the body; Appliances for connecting them with the body; Devices providing patency to, or preventing collapsing of, tubular structures of the body, e.g. stents
    • A61F 2/50 Prostheses not implantable in the body
    • A61F 2/76 Means for assembling, fitting or testing prostheses, e.g. for measuring or balancing, e.g. alignment means
    • A61F 2002/7615 Measuring means

Definitions

  • the present invention generally relates to distinguishing rotation axes in a joint with an arbitrary number of degrees-of-freedom.
  • Kinematics is a branch of classical mechanics that describes the motion of points, bodies (objects), and systems of bodies (groups of objects) without considering the forces that caused the motion. Kinematics is often used to describe the motion of systems that are composed of jointed parts (multi-link systems), such as engines, robotic arms, or the human skeleton.
  • a link is a nominally rigid body that possesses at least two nodes, and a node is an attachment point to other links via joints.
  • the three-dimensional (3D) rotation group, SO(3), is the group of all rotations about the origin of 3D Euclidean space R^3; its associated Lie algebra is denoted so(3).
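  • For reference (standard background, not additional claim language): an element of so(3) can be written as a skew-symmetric matrix $[w]$ built from an axis vector $w$, and the matrix exponential maps it to a rotation in SO(3) (Rodrigues' formula, for a unit axis $w$ and angle $q$):

    $[w] = \begin{pmatrix} 0 & -w_3 & w_2 \\ w_3 & 0 & -w_1 \\ -w_2 & w_1 & 0 \end{pmatrix}, \qquad e^{[w]q} = I + \sin(q)\,[w] + (1 - \cos(q))\,[w]^2 \in SO(3)$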
  • One embodiment includes a subject specific kinematic measurement system including an array of inertial measurement units (IMUs), where each IMU in the array of IMUs is attached to a body in a kinematic linkage, a prosthetic, a processor, and a memory including a kinematic mapping application, where the kinematic mapping application directs the processor to acquire sensor data from the array of IMUs, pre-process the sensor data, define a numerical model representing the kinematic linkage, estimate optimal nonlinear mapping parameters by performing a nonlinear dimensionality reduction on the pre-processed sensor data, using the numerical model, extract joint axis orientation for each axis from the mapping parameters, estimate the joint axis position of each body, and generate a control data based on the joint axis orientations and the joint axis positions, where the control data can be used to manipulate the prosthetic.
  • the prosthetic is a virtual representation of the kinematic linkage.
  • the prosthetic is a robotic arm.
  • the prosthetic is a virtual avatar.
  • each IMU in the array of IMUs is attached to a unique body in the kinematic linkage.
  • the IMU array is arranged in a wearable configuration.
  • the IMU array includes a glove fitted to a human hand, where each phalange and metacarpal in the human hand is associated with one IMU in the IMU array.
  • the kinematic mapping application further directs the processor to provide kinematic diagnostic information regarding a human wearing the IMU array.
  • the nonlinear dimensionality reduction is achieved using the Baker-Campbell-Hausdorff Formula.
  • the kinematic mapping application further directs the processor to utilize a generative topographic mapping, where the Baker-Campbell-Hausdorff Formula is used as the basis functions for the mapping.
  • a method for subject specific kinematic measurement includes acquiring sensor data from an array of inertial measurement units (IMUs), where each IMU in the array of IMUs is associated with a body in a kinematic linkage, pre-processing the sensor data, defining a numerical model representing the kinematic linkage, estimating optimal nonlinear mapping parameters by performing a nonlinear dimensionality reduction on the pre-processed sensor data, using the numerical model, extracting joint axis orientation for each axis from the mapping parameters, estimating the joint axis position of each body, and generating a control data based on the joint axis orientations and the joint axis positions, where the control data can be used to manipulate a prosthetic.
  • the prosthetic is a virtual representation of the kinematic linkage.
  • the prosthetic is a robotic arm.
  • the prosthetic is a virtual avatar.
  • each IMU in the array of IMUs is attached to a unique body in the kinematic linkage.
  • the IMU array is arranged in a wearable configuration.
  • the IMU array includes a glove fitted to a human hand, where each phalange and metacarpal in the human hand is associated with one IMU in the IMU array.
  • the method further includes providing kinematic diagnostic information regarding a human wearing the IMU array.
  • the nonlinear dimensionality reduction is achieved using the Baker-Campbell-Hausdorff Formula.
  • estimating the optimal nonlinear mapping parameters includes utilizing a generative topographic mapping, where the Baker-Campbell-Hausdorff Formula is used as the basis functions for the mapping.
  • a wearable kinematic control system for prosthetics including a glove mounted inertial measurement unit (IMU) array, where each IMU in the IMU array is associated with a unique kinematic body in the human hand, a processor, a prosthetic, and a memory including a kinematic mapping application, where the kinematic mapping application directs the processor to acquire sensor data from the array of IMUs, pre-process the sensor data to normalize to zero mean, define a numerical model representing the kinematic linkage to be the Baker-Campbell-Hausdorff Formula, estimate optimal model parameters by performing a nonlinear dimensionality reduction on the pre-processed sensor data using a generative topographic mapping, where the Baker-Campbell-Hausdorff Formula is used as the basis functions for the mapping, extract joint axis orientation for each axis from the mapping parameters, estimate the joint axis position of each body, and generate a control data based on the joint axis orientations and the joint axis positions, where the control data can be used to manipulate the prosthetic.
  • FIG. 1 illustrates a kinematic mapping system in accordance with an embodiment of the invention.
  • FIG. 2 conceptually illustrates a kinematic mapping processing system in accordance with an embodiment of the invention.
  • FIG. 3 is a flowchart illustrating a kinematic mapping process for estimating joint axis positions in accordance with an embodiment of the invention.
  • FIG. 4 is a diagram demonstrating the exponential representation of rotation of an IMU in accordance with an embodiment of the invention.
  • FIG. 5 is a flowchart illustrating a kinematic mapping process for preprocessing sensor data in accordance with an embodiment of the invention.
  • FIG. 6 is a visualization of an example of an arbitrary sample sensor data vs the preprocessed sensor data in accordance with an embodiment of the invention.
  • FIG. 7 is a flowchart illustrating a kinematic mapping process for estimating joint axis orientation values in accordance with an embodiment of the invention.
  • FIG. 8 is a visualization of a linkage system in accordance with an embodiment of the invention.
  • FIG. 9 is a visualization of a linkage system in accordance with an embodiment of the invention.
  • FIG. 10 is the Baker-Campbell-Hausdorff Formula in accordance with an embodiment of the invention.
  • FIG. 11 is a graphical representation of the mapping of a 2D surface to a 2D manifold in the 3D space of all possible rotations, so(3) in accordance with an embodiment of the invention.
  • FIG. 12 is a visual illustration of generative topographic mapping in accordance with an embodiment of the invention.
  • FIG. 13A is an illustration of a model of an arbitrary set of sensor data at the start of a modified generative topographic mapping algorithm in accordance with an embodiment of the invention.
  • FIG. 13B is an illustration of a model of an arbitrary set of sensor data after convergence of the model parameters in the modified generative topographic mapping algorithm in accordance with an embodiment of the invention.
  • FIG. 13C is an illustration of a model of an arbitrary set of sensor data after convergence of the model parameters in the modified generative topographic mapping algorithm with overlaid estimated and ground truth axes in accordance with an embodiment of the invention.
  • FIG. 14 is a high level flow chart of a kinematic mapping process for estimating axis positions utilizing either simulated data (left “simulation path”) or real data (right “real data path”) in accordance with an embodiment of the invention.
  • FIG. 15 illustrates the real and estimated axes of 16 arbitrary sample simulated data sets where the estimations were performed utilizing kinematic mapping processes similar to those described herein, in accordance with an embodiment of the invention.
  • kinematic information specific to the subject can be used for rehabilitation monitoring, sports training, or for any other activity where fine motor control is useful. Therefore, a solution that enables a lightweight, transportable, and wearable measurement system could provide value in the medical space.
  • Systems and methods described herein utilize a wearable array of inertial measurement units (IMUs) to capture sensor data describing kinematic information regarding the human body (as a “body” in the kinematic lexicon refers to a generic object, human body parts will be referred to using anatomical terms, or collectively as “the human body” for the purposes of this Application).
  • While sensors discussed herein are described in terms of IMUs, any number of different devices capable of capturing inertial and gyroscopic data can be utilized as appropriate to the requirements of specific applications of embodiments of the invention.
  • Kinematic mapping processes can convert captured sensor data into control data that approximates the movement of bodies in contact with the sensors.
  • control data reflects human kinematic information regarding the movement of the subject.
  • control data can not only be used for medical diagnostics, but can also be used to control prosthetic devices.
  • prosthetic devices can be medical wearable prosthetics.
  • prosthetic devices can include non-human and/or human-like robotic systems.
  • the movement of a user’s hands can be translated into precise movement of a robotic surgical instrument.
  • prosthetic devices are digital avatars that can be directed to move in a virtual space in accordance with the real world movements of a user. While the control data is often used to control prosthetics that mimic the form of human body parts that are attached to the sensors, translational software can be used to convert the human motion to motions of any arbitrary kinematic system of a prosthetic device.
  • IMU arrays can be utilized in accordance with embodiments of the invention to capture sensor data.
  • IMU arrays are configured to capture data regarding a specific set of human body parts. Consequently, IMUs in the IMU array do not need to have a uniform or geometric distribution.
  • IMUs in the IMU array are arranged in such a way that each body has at least one sensor attached.
  • a body in the kinematic system of interest may have no sensors or more than one sensor attached.
  • any number of human body parts can be the subject of sensor data collection in accordance with various embodiments of the invention. Indeed, while the discussion of subject specific kinematic mapping herein is primarily discussed with reference to human subjects, the systems and methods described herein are widely applicable to any number of organisms or inanimate objects. Subject specific kinematic mapping systems are discussed below.
  • kinematic mapping systems can take many forms depending on the specific intended usage as appropriate to the requirements of specific applications of embodiments of the invention. In numerous embodiments, kinematic mapping systems are capable of performing kinematic mapping processes. Turning now to FIG. 1 , a kinematic mapping system in accordance with an embodiment of the invention is illustrated.
  • Kinematic mapping system 100 includes a kinematic processing system.
  • kinematic processing systems are capable of performing kinematic mapping processes, as well as other computational requirements as appropriate to the requirements of specific applications of embodiments of the invention.
  • a display and/or an interface is in communication with the kinematic processing system to enable the transfer of information between the system and a human operator.
  • System 100 further includes an IMU array 120.
  • the IMU array is a glove constructed in such a way that at least one IMU is located on each of the phalanges of interest. IMUs can be distributed over the metacarpals or carpals.
  • At least one IMU is placed over the radius and/or ulna.
  • any number of IMU arrangements can be utilized in the construction of an IMU array glove, including arrangements that do not include an IMU over each bone as appropriate to the requirements of specific applications of embodiments of the invention.
  • IMU array gloves can be constructed in such a way that both the fingertips and palms are left exposed.
  • in contrast, some conventional glove systems utilize flex sensors as opposed to IMUs.
  • These types of gloves are referred to as “cybergloves,” and in many instances completely or mostly cover the skin of the hand. This construction reduces the immersion of the user, who loses the sense of touch with respect to any objects they are physically interfacing with.
  • these systems often are unable to track 3D motion without the assistance of external sensors such as camera arrays.
  • IMU arrays can be fabricated with sufficient number and placement of IMUs to enable the system to track 3D motion.
  • any number of different types of IMU configuration can be used including, but not limited to those without particular sensor components and/or those that measure fewer than 9DOF as appropriate to the requirements of specific applications of embodiments of the invention.
  • 9DOF: nine degrees of freedom (e.g. three-axis accelerometer, gyroscope, and magnetometer).
  • the system 100 further includes a prosthetic device 130.
  • Prosthetic devices can be any number of devices, including, but not limited to, robotic arms, surgery robots, manufacturing robots, virtual prosthetics, or any other prosthetic device as appropriate to the requirements of specific applications of embodiments of the invention.
  • System 100 includes a virtual prosthetic system 140.
  • Virtual prosthetic systems enable the usage of IMU arrays as interface devices to control a virtual avatar. In numerous embodiments, these virtual avatars are cursors.
  • virtual avatars can be any number of virtual constructs, including, but not limited to, approximations of the bodies attached to the IMU array (e.g. a virtual hand, or even an entire human body).
  • a virtual reality interface 150 is a component of system 100.
  • Virtual reality interfaces can be alternate reality interfaces, mixed reality interfaces, pure virtual reality interface devices, or any other type of immersive display system as appropriate to the requirements of specific applications of embodiments of the invention.
  • a network 160 connects components of system 100.
  • Networks can be any type of wired and/or wireless networking solutions, including, but not limited to, the Internet, local access networks, wide access networks, intranets, and/or any other network type and combinations thereof.
  • While a specific system in accordance with an embodiment of the invention is illustrated in FIG. 1, any number of system configurations including, but not limited to, those that utilize fewer or more instances of various components can be utilized. In a variety of embodiments, various components are implemented on the same hardware platforms. For example, virtual prosthetic systems can be implemented on the same computing hardware as a kinematic processing system. Kinematic processing systems are discussed in more detail below.
  • Kinematic processing systems are computing platforms capable of performing kinematic mapping processes.
  • Kinematic processing systems can be implemented into a wearable device also containing an IMU sensor array, and/or as a separate computing system.
  • Turning now to FIG. 2, a block diagram of a kinematic processing system in accordance with an embodiment of the invention is illustrated.
  • Kinematic processing system 200 includes a processor 210.
  • processors can be implemented using any of a variety of different processing components including, but not limited to, central processing units (CPUs), graphics processing units (GPUs), field- programmable gate arrays (FPGAs), application-specific integrated circuits (ASICs), or any other processing circuitry capable of performing logical operations.
  • An input/output (I/O) interface 220 is included in kinematic processing system 200. I/O interfaces can enable communication with external devices such as, but not limited to, IMU arrays, user interface devices (e.g. keyboards, computer mice, displays, etc.), and/or any other computing devices as appropriate to the requirements of specific applications of embodiments of the invention. In numerous embodiments I/O interfaces can enable wired and/or wireless connections.
  • System 200 further includes a memory 230.
  • Memory can be implemented using volatile and/or non-volatile storage media.
  • Memory 230 contains a kinematic mapping application which can be used to direct the processor to perform various kinematic mapping processes.
  • memory 230 further contains sensor data 234 and prosthetic data 236.
  • Sensor data 234 includes information obtained from the IMU array, as well as any metadata about the IMU array.
  • IMUs produce accelerometric data, gyroscopic data, and/or magnetometric data.
  • the metadata describes the particular configuration of sensors in the IMU array. This description can be directly encoded as an explicit set of locations, or indirectly encoded as a reference to a particular known IMU array identifier.
  • Metadata can further include any other types of data regarding the IMU array.
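  • As a purely illustrative sketch (the class, field names, and values below are hypothetical, not taken from the patent), the direct and indirect encodings of IMU array metadata described above could be represented as follows:

```python
from dataclasses import dataclass, field
from typing import Dict, Optional, Tuple

@dataclass
class IMUArrayMetadata:
    """Hypothetical container for IMU array configuration metadata."""
    array_id: Optional[str] = None  # indirect encoding: reference to a known IMU array identifier
    placements: Dict[str, Tuple[float, float, float]] = field(default_factory=dict)
    # direct encoding: IMU name -> nominal sensor location on the linkage (metres, body frame)

# Example: a glove-style array with one IMU per index-finger bone (values are illustrative only)
glove_meta = IMUArrayMetadata(
    array_id="glove_v1",
    placements={
        "index_metacarpal": (0.00, 0.00, 0.00),
        "index_proximal": (0.05, 0.00, 0.00),
        "index_middle": (0.09, 0.00, 0.00),
        "index_distal": (0.12, 0.00, 0.00),
    },
)
```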
  • Prosthetic data can include information regarding a particular set of prosthetic devices.
  • prosthetic data describes the type and model of prosthetic device.
  • prosthetic data describes a set of mappings from processed sensor data to a prosthetic in a way that enables the control of the prosthetic using the sensor data.
  • Kinematic mapping processes that enable the processing of sensor data and/or the mapping of processed sensor data to prosthetics are discussed below. Further, while a particular kinematic mapping system has been discussed above with reference to FIG. 2, any number of different arrangements of components and/or data can be utilized as appropriate to the requirements of specific applications of embodiments of the invention to enable performance of kinematic mapping processes.
  • Kinematic mapping processes enable the collection of sensor data from IMU arrays and the conversion of that data into actionable information.
  • kinematic mapping processes generate subject specific kinematic information about a user equipped with an IMU array.
  • the subject specific kinematic information is referred to as control data, as the information can be utilized to control the movement of a physical and/or virtual prosthetic.
  • control data can be utilized to observe the motion of an individual in a medical context and to derive diagnostic information.
  • a virtual prosthetic implemented as a model of the patient’s body part can indeed be manipulated and displayed to aid the diagnostic process.
  • control data can map subject specific kinematic information to a particular class of prosthetic that can involve the same or different numbers of bodies and/or joints as the subject.
  • a robotic arm may not have an identical number of joints and/or bodies as a human hand, but the manipulating appendages of the robotic arm and overall movement can still be controlled by the motion of the subject.
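  • As a purely illustrative sketch of such a translational mapping (this is not the patent's method; the joint selection, gains, and limits are hypothetical), estimated subject joint angles could be retargeted to a prosthetic with a different number of joints by selecting, scaling, and clamping them to the prosthetic's joint limits:

```python
import numpy as np

def retarget_joint_angles(subject_angles, selection, gains, limits):
    """Map subject joint angles (radians) onto a prosthetic with a different joint count.

    subject_angles: (n_subject,) estimated subject joint angles
    selection:      index of the subject joint driving each prosthetic joint
    gains:          per-prosthetic-joint scaling factor
    limits:         (n_prosthetic, 2) array of [min, max] prosthetic joint limits
    """
    raw = gains * subject_angles[selection]           # pick and scale the driving joints
    return np.clip(raw, limits[:, 0], limits[:, 1])   # respect the prosthetic's joint limits

# Example: four subject finger joints driving a two-joint gripper (illustrative values)
subject = np.array([0.20, 0.50, 0.30, 0.10])
command = retarget_joint_angles(
    subject,
    selection=np.array([1, 2]),
    gains=np.array([1.0, 0.8]),
    limits=np.array([[0.0, 1.2], [0.0, 1.0]]),
)
```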
  • Turning now to FIG. 3, a high-level process for generating control data in accordance with an embodiment of the invention is illustrated.
  • Process 300 includes obtaining (310) prosthetic data.
  • prosthetic data is acquired from prosthetic devices.
  • prosthetic data can be provided by a user by inputting an identifier associated with the type of prosthetic device to be controlled.
  • “none” or a lack of prosthetic data is a valid input, indicating that the system should merely export acquired sensor data and/or processed sensor data with respect to a generic model or a model associated with the type of IMU array utilized.
  • Process 300 further includes obtaining (320) sensor data.
  • Sensor data includes information from IMU arrays.
  • Information from the IMU array can include acceleration metrics and gyroscopic metrics for IMUs in the array (e.g., IMU-level orientation information and IMU-level linear acceleration data).
  • the sensor data is provided as a stream of sequential data points during operation. However, sensor data can be logged and provided with all or many of the points in a single data transfer.
  • the sensor data can further include metadata describing the particular IMU array, including, but not limited to, the arrangement of IMUs in the array.
  • sensor data is preprocessed (330).
  • Preprocessing of sensor data can enable normalization of the acquired sensor data and/or conversion of the sensor data into a more usable format.
  • Kinematic mapping processes for preprocessing sensor data are discussed in a below section.
  • Process 300 further includes estimating (340) joint axis orientation values for the IMUs in the sensor array. Due to the complexity of this computation, it is discussed in detail below. Given joint angle values, joint axis position can be estimated (350) utilizing analytical equations of relative motion in the corrected reference frame. Using the completed kinematic model of the linkage, control data can be generated (360) by mapping the kinematic model to a prosthetic kinematic model.
  • Preprocessing sensor data can be performed to normalize the sensor data in order to increase efficiency and accuracy of other kinematic mapping processes.
  • a kinematic mapping process for preprocessing sensor data in accordance with an embodiment of the invention is illustrated in FIG. 5.
  • Process 500 includes calculating (510) the relative rotation of each IMU. Converting sensor data from different IMUs into relative rotations can enable an understanding of the underlying structure of the data that can be leveraged during computation of joint axis orientation values.
  • An example of arbitrary sample sensor data versus the preprocessed sensor data in accordance with an embodiment of the invention is illustrated in FIG. 6. In many embodiments, this preprocessing is valuable because generative topographic mapping style processes work more efficiently with zero-mean data.
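  • As a minimal illustrative sketch (interfaces and data layout are assumptions, not the patent's implementation), relative rotations and zero-mean normalization of the sensor data could be computed as follows:

```python
import numpy as np
from scipy.spatial.transform import Rotation as R

def preprocess_imu_orientations(quats_base, quats_body):
    """Express a body IMU's orientation relative to a base IMU and remove the mean.

    quats_base, quats_body: (N, 4) orientation quaternions in (x, y, z, w) order.
    Returns zero-mean rotation vectors (so(3) coordinates) of the relative rotation.
    """
    r_base = R.from_quat(quats_base)
    r_body = R.from_quat(quats_body)
    r_rel = r_base.inv() * r_body           # relative rotation of the body IMU w.r.t. the base IMU
    rotvecs = r_rel.as_rotvec()             # log map: 3D rotation vectors (axis * angle)
    return rotvecs - rotvecs.mean(axis=0)   # normalize to zero mean for GTM-style fitting

# Toy usage with an identity base orientation and small random body rotations
base = R.identity(50).as_quat()
body = R.from_rotvec(0.1 * np.random.default_rng(0).standard_normal((50, 3))).as_quat()
centered = preprocess_imu_orientations(base, body)
```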
  • Process 700 includes defining (710) a numerical model representing the rotations of the bodies in the linkage.
  • a linkage representation of a kinematic system can be used to define the orientation and position (referred to as a “pose” of the linkage) by the following equation (the standard product-of-exponentials form):

    $T = e^{[S_1] q_1} \, e^{[S_2] q_2} \cdots e^{[S_n] q_n} \, M$

  • T represents the pose of the end frame.
  • M represents the “home position”, i.e. the pose if all joint angles were zero.
  • q_n represents the joint angle of the n-th joint.
  • S_n represents the “screw axis” of the n-th joint.
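  • A minimal numerical sketch of this pose computation, restricted for simplicity to the rotational part of the pose (the screw axes then reduce to rotation axes) and using hypothetical example values, could be:

```python
import numpy as np
from scipy.linalg import expm

def skew(w):
    """Skew-symmetric matrix [w] of a 3-vector w."""
    return np.array([[0.0, -w[2], w[1]],
                     [w[2], 0.0, -w[0]],
                     [-w[1], w[0], 0.0]])

def pose_from_exponentials(axes, angles, M):
    """Rotational part of T = exp([S_1] q_1) ... exp([S_n] q_n) M for revolute joints.

    axes:   unit rotation axes w_i (3-vectors)
    angles: joint angles q_i (radians)
    M:      3x3 'home position' orientation of the end frame
    """
    T = np.eye(3)
    for w, q in zip(axes, angles):
        T = T @ expm(skew(w) * q)   # accumulate each joint's rotation
    return T @ M

# Hypothetical two-joint example (e.g. the two degrees of freedom w_1, w_2 of FIG. 9)
w1, w2 = np.array([0.0, 0.0, 1.0]), np.array([0.0, 1.0, 0.0])
T = pose_from_exponentials([w1, w2], [0.3, -0.5], np.eye(3))
```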
  • A visual representation of this type of linkage system in accordance with an embodiment of the invention is illustrated in FIG. 8.
  • a “simplified” version of a linkage in accordance with an embodiment of the invention is illustrated in FIG. 9, where w_1 and w_2 represent two degrees of freedom.
  • kinematic mapping processes assume a simplified version of the linkage with a fixed base IMU and an unfixed body IMU. To compute the joint axis orientation values for the two IMUs, a simplified version of the above equation can be used, written in terms of the skew-symmetric matrix [w] of each axis w.
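  • A plausible form of this simplified relation, assuming the two-degree-of-freedom linkage of FIG. 9 with axes $w_1$ and $w_2$, joint angles $q_1$ and $q_2$, and an identity home pose, is the relative rotation

    $R(q_1, q_2) = e^{[w_1] q_1} \, e^{[w_2] q_2}$

whose logarithm (a point in so(3)) can be expanded using the Baker-Campbell-Hausdorff formula of FIG. 10.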
  • This Formula can take a 2D surface (e.g., a grid of joint-angle values (q_1, q_2)) and map it into a manifold in 3D space.
  • the 3D space is the sphere of all possible 3D rotations referred to as the Lie algebra of all rotations denoted so(3).
  • a visualization of this mapping in accordance with an embodiment of the invention is illustrated in FIG. 11. However, in the instant situation, the manifold is derived from the sensor data, and therefore the mapping must be reversed by identifying the parameters of the mapping.
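  • For reference, the Baker-Campbell-Hausdorff formula shown in FIG. 10 combines two exponentials in the Lie algebra; its leading terms are

    $\log\!\left(e^{X} e^{Y}\right) = X + Y + \tfrac{1}{2}[X, Y] + \tfrac{1}{12}[X, [X, Y]] - \tfrac{1}{12}[Y, [X, Y]] + \cdots$

where $[X, Y] = XY - YX$ is the matrix commutator; here, $X$ and $Y$ correspond to $[w_1] q_1$ and $[w_2] q_2$ in the mapping above.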
  • Process 700 further includes performing (720) a nonlinear dimensionality reduction from the input vector space of the sensor data to the latent variable space.
  • the nonlinear dimensionality reduction is achieved using techniques similar to generative topographic mapping (GTM).
  • A visual illustration of the classic GTM algorithm in accordance with an embodiment of the invention is illustrated in FIG. 12.
  • the GTM class of algorithms generate a probability distribution in the latent space, p(x).
  • a mapping y(x, W) is utilized to transform the distribution into data space to create a transformed distribution p(t | W).
  • the model parameters include the axes of rotation.
  • the classic GTM algorithm can be modified using the BCH formula as the basis for the mapping: y(x,W).
  • Process 700 further includes estimating (730) optimal nonlinear mapping parameters using the model.
  • the transformed distribution can be compared to the actual data, and likelihood metrics can be generated based on the model of the system y(x, W).
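  • The following is a compact, purely illustrative sketch of a classic GTM-style fit via expectation-maximization (it uses standard Gaussian RBF basis functions and hypothetical parameter values; the patent instead uses the Baker-Campbell-Hausdorff formula as the basis for the mapping):

```python
import numpy as np

def gtm_fit(data, n_grid=10, n_rbf=4, sigma=0.3, n_iter=50, seed=0):
    """Simplified classic GTM fit via EM; returns the latent grid mapped into data space.

    data: (N, D) observations. The BCH-based variant would replace the RBF basis Phi.
    """
    rng = np.random.default_rng(seed)
    N, D = data.shape
    g = np.linspace(-1, 1, n_grid)
    X = np.array([[a, b] for a in g for b in g])               # (K, 2) latent grid points
    c = np.linspace(-1, 1, n_rbf)
    C = np.array([[a, b] for a in c for b in c])               # (M, 2) RBF centres
    Phi = np.exp(-((X[:, None, :] - C[None, :, :]) ** 2).sum(-1) / (2 * sigma ** 2))
    Phi = np.hstack([Phi, np.ones((len(X), 1))])               # (K, M+1) basis with bias term
    W = 0.1 * rng.standard_normal((Phi.shape[1], D))           # mapping weights y(x, W) = Phi W
    beta = 1.0                                                  # inverse noise variance
    for _ in range(n_iter):
        Y = Phi @ W                                             # grid mapped into data space
        d2 = ((Y[:, None, :] - data[None, :, :]) ** 2).sum(-1)  # (K, N) squared distances
        R = np.exp(-0.5 * beta * (d2 - d2.min(0)))              # E-step: responsibilities
        R /= R.sum(0, keepdims=True)
        G = np.diag(R.sum(1))                                   # M-step: update W and beta
        W = np.linalg.solve(Phi.T @ G @ Phi + 1e-6 * np.eye(Phi.shape[1]), Phi.T @ R @ data)
        beta = data.size / (R * d2).sum()
    return Phi @ W

# Toy usage: fit a grid to noisy 3D data lying near a curved 2D surface
t = np.random.default_rng(1).standard_normal((200, 3)) * [1.0, 1.0, 0.1]
mapped = gtm_fit(t)
```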
  • FIGs. 13A, 13B, and 13C illustrate a modified GTM style algorithm applied to an arbitrary set of sensor data at the start of the algorithm, after convergence of the model parameters, and after convergence of the model parameters with the axes estimates overlaid on the ground truth measured axes, respectively.
  • Turning now to FIG. 14, a high level flow chart of a kinematic mapping process for estimating axis positions utilizing either simulated data (left “simulation path”) or real data (right “real data path”) in accordance with an embodiment of the invention is illustrated.
  • Turning now to FIG. 15, the real and estimated axes of 16 arbitrary sample simulated data sets, where the estimations were performed utilizing kinematic mapping processes similar to those described herein, are illustrated in accordance with an embodiment of the invention.

Landscapes

  • Health & Medical Sciences (AREA)
  • Life Sciences & Earth Sciences (AREA)
  • Engineering & Computer Science (AREA)
  • Public Health (AREA)
  • Biomedical Technology (AREA)
  • Veterinary Medicine (AREA)
  • General Health & Medical Sciences (AREA)
  • Animal Behavior & Ethology (AREA)
  • Heart & Thoracic Surgery (AREA)
  • Oral & Maxillofacial Surgery (AREA)
  • Physics & Mathematics (AREA)
  • Biophysics (AREA)
  • Medical Informatics (AREA)
  • Molecular Biology (AREA)
  • Surgery (AREA)
  • Physiology (AREA)
  • Pathology (AREA)
  • Dentistry (AREA)
  • Cardiology (AREA)
  • Vascular Medicine (AREA)
  • Transplantation (AREA)
  • Artificial Intelligence (AREA)
  • Geometry (AREA)
  • Psychiatry (AREA)
  • Signal Processing (AREA)
  • Computer Vision & Pattern Recognition (AREA)
  • Prostheses (AREA)

Abstract

Systems and methods for subject specific kinematic mapping in accordance with embodiments of the invention are illustrated. One embodiment includes a subject specific kinematic measurement system including an array of inertial measurement units (IMUs), where each IMU is attached to a body in a kinematic linkage, a prosthetic, a processor, and a memory including a kinematic mapping application, where the kinematic mapping application directs the processor to acquire sensor data from the array of IMUs, pre-process the sensor data, define a numerical model representing the kinematic linkage, estimate optimal nonlinear mapping parameters by performing a nonlinear dimensionality reduction on the pre-processed sensor data using the numerical model, extract a joint axis orientation for each axis from the mapping parameters, estimate the joint axis position of each body, and generate control data based on the joint axis orientations and the joint axis positions, where the control data can be used to manipulate the prosthetic.
PCT/US2019/015923 2018-01-30 2019-01-30 Systems and methods for subject specific kinematic mapping WO2019152566A1 (fr)

Applications Claiming Priority (2)

Application Number Priority Date Filing Date Title
US201862624029P 2018-01-30 2018-01-30
US62/624,029 2018-01-30

Publications (1)

Publication Number Publication Date
WO2019152566A1 true WO2019152566A1 (fr) 2019-08-08

Family

ID=67479888

Family Applications (1)

Application Number Title Priority Date Filing Date
PCT/US2019/015923 WO2019152566A1 (fr) Systems and methods for subject specific kinematic mapping

Country Status (1)

Country Link
WO (1) WO2019152566A1 (fr)

Cited By (2)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN114248266A (zh) * 2021-09-17 2022-03-29 之江实验室 Anthropomorphic motion trajectory generation method and apparatus for a dual-arm robot, and electronic device
DE102022121662B3 (de) 2022-08-26 2023-09-28 Deutsches Zentrum für Luft- und Raumfahrt e.V. Method and measuring device for determining the position of an object

Citations (3)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US20140135658A1 (en) * 2008-12-02 2014-05-15 Avenir Medical Inc. Method And System For Aligning A Prosthesis During Surgery Using Active Sensors
US9114030B2 (en) * 2007-02-06 2015-08-25 Deka Products Limited Partnership System for control of a prosthetic device
US9642572B2 (en) * 2009-02-02 2017-05-09 Joint Vue, LLC Motion Tracking system with inertial-based sensing units

Patent Citations (3)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US9114030B2 (en) * 2007-02-06 2015-08-25 Deka Products Limited Partnership System for control of a prosthetic device
US20140135658A1 (en) * 2008-12-02 2014-05-15 Avenir Medical Inc. Method And System For Aligning A Prosthesis During Surgery Using Active Sensors
US9642572B2 (en) * 2009-02-02 2017-05-09 Joint Vue, LLC Motion Tracking system with inertial-based sensing units

Non-Patent Citations (3)

* Cited by examiner, † Cited by third party
Title
KENT, B. A. ET AL.: "Electromyogram synergy control of a dexterous artificial hand to unscrew and screw objects", JOURNAL OF NEUROENGINEERING AND REHABILITATION, vol. 11, 2014, XP021183491, doi:10.1186/1743-0003-11-41 *
LIU, H. ET AL.: "A Glove-based System for Studying Hand-Object Manipulation via Joint Pose and Force Sensing", IEEE /RSJ INTERNATIONAL CONFERENCE ON INTELLIGENT ROBOTS AND SYSTEMS (IROS, September 2017 (2017-09-01), pages 6617 - 6624, XP033266726, ISBN: 978-1-5386-2682-5, DOI: 10.1109/IROS.2017.8206575 *
PATEL, V. ET AL.: "Linear and Nonlinear Kinematic Synergies in the Grasping Hand", JOURNAL OF BIOENGINEERING & BIOMEDICAL SCIENCE, vol. 5, no. 3, 2015, pages 1 - 8, XP055627986, DOI: 10.4172/2155-9538.1000163 *

Cited By (3)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN114248266A (zh) * 2021-09-17 2022-03-29 之江实验室 Anthropomorphic motion trajectory generation method and apparatus for a dual-arm robot, and electronic device
CN114248266B (zh) * 2021-09-17 2024-03-26 之江实验室 Anthropomorphic motion trajectory generation method and apparatus for a dual-arm robot, and electronic device
DE102022121662B3 (de) 2022-08-26 2023-09-28 Deutsches Zentrum für Luft- und Raumfahrt e.V. Method and measuring device for determining the position of an object

Similar Documents

Publication Publication Date Title
Slade et al. An open-source and wearable system for measuring 3D human motion in real-time
CN107616898B (zh) Upper limb wearable rehabilitation robot based on daily movements and rehabilitation evaluation method
CN109243575B (zh) Virtual acupuncture method and system based on mobile interaction and augmented reality
US11998460B2 (en) Systems and methods for approximating musculoskeletal dynamics
Mortazavi et al. Continues online exercise monitoring and assessment system with visual guidance feedback for stroke rehabilitation
Liu et al. A new IMMU-based data glove for hand motion capture with optimized sensor layout
Cohen et al. Hand rehabilitation via gesture recognition using leap motion controller
Bumacod et al. Image-processing-based digital goniometer using OpenCV
WO2019152566A1 (fr) Systems and methods for subject specific kinematic mapping
Houston et al. Evaluation of a multi-sensor Leap Motion setup for biomechanical motion capture of the hand
Callejas-Cuervo et al. Capture and analysis of biomechanical signals with inertial and magnetic sensors as support in physical rehabilitation processes
CN112183316B (zh) Athlete body posture measurement method
Liu et al. A reconfigurable data glove for reconstructing physical and virtual grasps
Chen et al. An inertial-based human motion tracking system with twists and exponential maps
Xu et al. A Low-Cost Wearable Hand Gesture Detecting System Based on IMU and Convolutional Neural Network
Ji et al. Motion trajectory of human arms based on the dual quaternion with motion tracker
Jiang et al. Variability analysis on gestures for people with quadriplegia
Prado et al. Artificial neural networks to solve forward kinematics of a wearable parallel robot with semi-rigid links
García-de-Villa et al. Inertial sensors for human motion analysis: A comprehensive review
Spasojević et al. Kinect-based approach for upper body movement assessment in stroke
Sun et al. Development of lower limb motion detection based on LPMS
Zhou et al. Toward Human Motion Digital Twin: A Motion Capture System for Human-Centric Applications
Sun et al. Hip, knee and ankle motion angle detection based on inertial sensor
Gavier et al. VirtualIMU: Generating Virtual Wearable Inertial Data from Video for Deep Learning Applications
Franchi et al. A numerical hand model for a virtual glove rehabilitation system

Legal Events

Date Code Title Description
121 Ep: the epo has been informed by wipo that ep was designated in this application

Ref document number: 19747978

Country of ref document: EP

Kind code of ref document: A1

NENP Non-entry into the national phase

Ref country code: DE

122 Ep: pct application non-entry in european phase

Ref document number: 19747978

Country of ref document: EP

Kind code of ref document: A1