WO2014114967A1 - Self-calibrating motion capture system - Google Patents

Self-calibrating motion capture system

Info

Publication number
WO2014114967A1
WO2014114967A1 (PCT/IB2013/000093)
Authority
WO
WIPO (PCT)
Prior art keywords
nodes
data
node
cmp
movements
Prior art date
Application number
PCT/IB2013/000093
Other languages
French (fr)
Inventor
Rolf Adelsberger
Original Assignee
WENNER, Fabian
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by WENNER, Fabian
Priority to PCT/IB2013/000093
Publication of WO2014114967A1

Classifications

    • A HUMAN NECESSITIES
    • A61 MEDICAL OR VETERINARY SCIENCE; HYGIENE
    • A61B DIAGNOSIS; SURGERY; IDENTIFICATION
    • A61B5/00 Measuring for diagnostic purposes; Identification of persons
    • A61B5/103 Detecting, measuring or recording devices for testing the shape, pattern, colour, size or movement of the body or parts thereof, for diagnostic purposes
    • A61B5/11 Measuring movement of the entire body or parts thereof, e.g. head or hand tremor, mobility of a limb
    • A61B5/1113 Local tracking of patients, e.g. in a hospital or private home
    • A61B5/1114 Tracking parts of the body
    • A HUMAN NECESSITIES
    • A61 MEDICAL OR VETERINARY SCIENCE; HYGIENE
    • A61B DIAGNOSIS; SURGERY; IDENTIFICATION
    • A61B5/00 Measuring for diagnostic purposes; Identification of persons
    • A61B5/0002 Remote monitoring of patients using telemetry, e.g. transmission of vital signals via a communication network
    • A61B5/0015 Remote monitoring of patients using telemetry, e.g. transmission of vital signals via a communication network characterised by features of the telemetry system
    • A61B5/0024 Remote monitoring of patients using telemetry, e.g. transmission of vital signals via a communication network characterised by features of the telemetry system for multiple sensor units attached to the patient, e.g. using a body or personal area network
    • G PHYSICS
    • G01 MEASURING; TESTING
    • G01S RADIO DIRECTION-FINDING; RADIO NAVIGATION; DETERMINING DISTANCE OR VELOCITY BY USE OF RADIO WAVES; LOCATING OR PRESENCE-DETECTING BY USE OF THE REFLECTION OR RERADIATION OF RADIO WAVES; ANALOGOUS ARRANGEMENTS USING OTHER WAVES
    • G01S5/00 Position-fixing by co-ordinating two or more direction or position line determinations; Position-fixing by co-ordinating two or more distance determinations
    • G01S5/18 Position-fixing by co-ordinating two or more direction or position line determinations; Position-fixing by co-ordinating two or more distance determinations using ultrasonic, sonic, or infrasonic waves
    • G01S5/186 Determination of attitude
    • G PHYSICS
    • G06 COMPUTING; CALCULATING OR COUNTING
    • G06F ELECTRIC DIGITAL DATA PROCESSING
    • G06F1/00 Details not covered by groups G06F3/00 - G06F13/00 and G06F21/00
    • G06F1/16 Constructional details or arrangements
    • G06F1/1613 Constructional details or arrangements for portable computers
    • G06F1/163 Wearable computers, e.g. on a belt
    • G PHYSICS
    • G06 COMPUTING; CALCULATING OR COUNTING
    • G06F ELECTRIC DIGITAL DATA PROCESSING
    • G06F3/00 Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
    • G06F3/01 Input arrangements or combined input and output arrangements for interaction between user and computer
    • G06F3/011 Arrangements for interaction with the human body, e.g. for user immersion in virtual reality
    • A HUMAN NECESSITIES
    • A61 MEDICAL OR VETERINARY SCIENCE; HYGIENE
    • A61B DIAGNOSIS; SURGERY; IDENTIFICATION
    • A61B2560/00 Constructional details of operational features of apparatus; Accessories for medical measuring apparatus
    • A61B2560/02 Operational features
    • A61B2560/0223 Operational features of calibration, e.g. protocols for calibrating sensors
    • A HUMAN NECESSITIES
    • A61 MEDICAL OR VETERINARY SCIENCE; HYGIENE
    • A61B DIAGNOSIS; SURGERY; IDENTIFICATION
    • A61B2562/00 Details of sensors; Constructional details of sensor housings or probes; Accessories for sensors
    • A61B2562/02 Details of sensors specially adapted for in-vivo measurements
    • A61B2562/0219 Inertial sensors, e.g. accelerometers, gyroscopes, tilt switches
    • A HUMAN NECESSITIES
    • A61 MEDICAL OR VETERINARY SCIENCE; HYGIENE
    • A61B DIAGNOSIS; SURGERY; IDENTIFICATION
    • A61B5/00 Measuring for diagnostic purposes; Identification of persons
    • A61B5/103 Detecting, measuring or recording devices for testing the shape, pattern, colour, size or movement of the body or parts thereof, for diagnostic purposes
    • A61B5/11 Measuring movement of the entire body or parts thereof, e.g. head or hand tremor, mobility of a limb
    • A61B5/1116 Determining posture transitions
    • G PHYSICS
    • G01 MEASURING; TESTING
    • G01S RADIO DIRECTION-FINDING; RADIO NAVIGATION; DETERMINING DISTANCE OR VELOCITY BY USE OF RADIO WAVES; LOCATING OR PRESENCE-DETECTING BY USE OF THE REFLECTION OR RERADIATION OF RADIO WAVES; ANALOGOUS ARRANGEMENTS USING OTHER WAVES
    • G01S11/00 Systems for determining distance or velocity not using reflection or reradiation
    • G01S11/14 Systems for determining distance or velocity not using reflection or reradiation using ultrasonic, sonic, or infrasonic waves

Definitions

  • This invention relates to the field of capturing, analyzing and interpreting the motion of one or several objects through a plurality of sensors and recommending actions to the object(s) based on the interpretation of the motion data.
  • Motion capture is used extensively in computer animation; see Bruderlin et al., Motion signal processing, Proceedings of SIGGRAPH 95, pp. 105-108, 1995; Gleicher, Retargetting motion to new characters, Proceedings of SIGGRAPH 98, pp. 33-42, 1998; Kovar et al., Motion graphs, ACM Transactions on Graphics 21, 3, pp. 473-482, 2002; and Arikan et al., Motion synthesis from annotations, ACM Transactions on Graphics 22, 3, pp. 402-408, 2003.
  • US 7,628,074 (D1) describes a motion capture system that comprises units sending ultrasonic pulses and units containing respective receivers as well as IMUs (Inertial Measurement Units). Emitters and receivers form a pair (cf. D1, page 3, line 56 f.) that is connected to a so-called driver module by cable.
  • Prior art motion capture systems (like the one described in D1) failed to be truly autonomous - either because they were not reliable in their representation of the object's underlying movements due to significant drift, or because they were too complex for a non-expert to use (and neither modular nor scalable), as they consisted of different types of units (i.e. sensors, ultrasonic sources, driver modules and similar), often were wired (connected to each other by cable), or only stored motion data rather than processing and transmitting data in real-time.
  • Prior art systems are not able to self-calibrate or to recalibrate (themselves) during use - for example to correct for drift - which still represents a major disadvantage of state-of-the-art systems and causes significant problems in real-life applications, since measurement errors accumulate over time, resulting in unacceptable aberrations of the analyzed movement patterns even after only a few minutes.
  • Prior art systems describe automatic error corrections on a theoretical basis only, and either just store the motion data and later correct it using joint and other restrictions, or they fail to capture movements precisely in real-time, as indicated by the publications below:
  • Prior art systems consisted of different types of units (rather than one single type of node) that could not operate autonomously but required (power and computational support from) a central driver module to be able to communicate with each other.
  • Ultrasonic signal sources (i.e. senders) as well as sensors were usually connected to this driver module by cables (see also D1).
  • Prior art systems convert the analog ultrasonic signal into a digital one, transmit it together with the IMU data and decipher it on an external microprocessor. Such a process represents the traditional approach.
  • The major disadvantage of such a handling is its speed (i.e. for real-time applications).
  • This invention shows a solution in which the distance measurements between the sensors are evaluated on board each respective sensor, rather than converting the data and transmitting it together with the IMU data. Consequently, data processing and ultrasonic signal processing are significantly more efficient, and therefore faster and more precise.
  • Prior art consists of and uses in particular the following devices and techniques which are disadvantageous: • Prior art systems used and differentiated two or more types of nodes: sources and sensors (and driver modules and other units).
  • Prior art systems consisted of a combination of a master device and multiple slave devices that were physically connected to the master node for power supply and communication.
  • Each sensor comprises its own microcontroller that communicates with that of the other nodes;
  • In autonomous mode, the nodes are capable of capturing, processing and storing all data onboard for later evaluation. In dependent mode, the nodes communicate, capture, process and transmit all data to a central microprocessor (CMP) that then triggers feedback for actions to the user;
  • the invention refers to a system that captures and transmits movements of objects (humans, animals, etc.) in real-time, without drift and wirelessly, to a computing device (PC, mobile device or other) which further processes the data - for example to analyze, visualize or animate the movements in a virtual 3D environment and/or to trigger signals depending on the pattern of movement.
  • the system is fully mobile, not stationary, and thus can be used anywhere. It consists of wearable, unobtrusive (small) nodes (sensors) which are autonomous (independent) of each other with regard to energy supply and computational power. Their CPUs (microcontrollers) allow them to react to specific events and calculate recommendations. They are fully controllable from a master device (CMP, central microprocessor).
  • the system consists of a number of nodes (1a, 1b, ...) arranged on an object (human, animal, other object).
  • the nodes (1a, 1b, ...) are placed on those extremities of the object whose movements are intended to be captured (refer to Figure 1).
  • the system can operate in two modes: In a) (dependent mode) it can detect, analyze and transmit that object's movements over an extended period of time to a central microprocessor (CMP; i.e. a stationary or mobile computing device). In b) (autonomous mode) it can record that object's motion data over time, store it onboard the node and give feedback and/or specific advice on the evaluated input.
  • the system is also capable of extending the above functionality to monitor, analyze, store several objects' movements simultaneously (through each object's sensors serving as an independent body sensor network that is tracked separately).
  • the system is scalable with regards to the number of nodes used per object.
  • the nodes (1a, 1b, ...) and the overall system compute data transformations on-board, in real-time, and are able to self-calibrate and detect their position on the underlying physical structure automatically.
  • the nodes (1a, 1b, ...) are able to adjust for drift during use, by comparing movements calculated from a digital filter algorithm that depends on node attitude data, inter-node distance data and raw IMU values.
  • the digital filter module estimates with high accuracy the relative position of the nodes to each other by correcting drift using inter-node distance values.
  • Each node contains an IMU (inertial measurement unit comprising a 3-axis gyroscope, a 3-axis accelerometer, and a magnetometer) to determine each node's 3-dimensional shift in positioning and a piezoelectric transducer (refer to Figure 5) that emits ultrasonic pulses in a sequence of x ms to measure the one-dimensional distance to all other nodes (by means of the ultrasonic signal emitted by one transducer and received by all others per unit in time).
  • This system no longer relies on microphones, pre-amplifiers or multiplexing filters; it only deploys one type of unit (sensor/node) rather than several different types, as most prior art does.
  • the IMU data in combination with the distances allow the system to determine the 3D locations of all the nodes (1a, 1b, ...) with respect to a local coordinate system. Recovering the global position of the system is also feasible with added externally fixed beacon nodes. Given the 3D location and orientation, it is possible to determine joint angles of an object (articulated skeleton model) using inverse kinematics processes, or to determine the joint angles directly from an incomplete distance matrix. (We extend inverse kinematics methods to work with distances instead of 3D locations.)
  • the system requires two different programs: one for the central processing unit to control data flow and interpret the results, the other for the nodes to process the IMU and ultrasound data as well as transmit or store it and give specific advice based on the acquired information (on the CMP device).
  • the system works as follows: At startup the system's central microprocessor (CMP, i.e. PC or mobile computing device) sends an initiation signal (heartbeat) over its wireless communication chip to the nodes (1a, 1b, ...), establishing communication that allows each node to transfer data back to the CMP / wireless chip at a clocking pattern defined by the communication protocol (ANT+, WiFi, Bluetooth, UWB or similar). As part of that initialization, the CMP detects 1) how many nodes make up the active configuration and 2) which node is attached to which part of the body (refer to Figure 3).
  • In state (I) the system can measure the distance of each node to all others; in states (II and III) the system detects which nodes are fixated on the arm that is currently being moved.
  • the joint restrictions for that particular object help the system determine a) which way the object faces and b) approximately where on the arm the node(s) are fixated, since a wider movement radius is associated with a positioning closer towards the hand rather than the shoulder.
  • In states (IV and V) the system detects which nodes (1a, 1b, ...) are fixated on the legs and where exactly each node is fixated. Just as with the arms, a wider motion radius suggests a position closer towards the foot rather than the thigh.
  • the wireless channel protocol determines the clocking pattern for the communication between the nodes (1a, 1b, ...) and the CMP. It also determines the periodicity of the ultrasonic signal for each node. In other words, the transducers of all nodes are clocked by the protocol of the communication circuit on board each node, whereby the sequence pattern depends on the number of nodes simultaneously used by the individual system of one object.
  • the invention utilizes the ultrasonic signal to assess the one-dimensional distances of each node to all others in the system and to correct the attitude data gathered by each node's IMUs (time of flight measurements complemented by linear accelerations and angular velocities).
  • Measured IMU (attitude) data and distances are further complemented utilizing constraints based on biomechanical characteristics of the underlying object (human body for example).
  • the object's underlying skeleton model implies that body segments are linked by joints and that the nodes (1a, 1b, ...) are attached to the object's body segments.
  • This model uses different constraints for knee and shoulder joints (as described in V.M. Zatsiorsky, Kinematics of Human Motion, Human Kinetics, 1998). By continuously correcting the kinematics using the joint relation (and distance measurements), unbounded integration drift is prevented.
  • Each node's (1a, 1b, ...) motion data is processed online onboard that node by the microcontroller through a digital filter (e.g., inspired by a complementary filter or an extended Kalman filter) to determine joint configurations for the body.
  • the ultrasonic signal of the transducer is processed by the microcontroller and features (time stamp, time of flight and power data) are fed into the processing pipeline for the digital filter.
  • a specific node sends one set of data to the CMP consisting of the node's ID, timestamp, IMU (attitude) data and distance to each other sensor.
  • the IMU data from the accelerometer and gyroscope are passed through a digital filter on each node to generate the node's orientation relative to the earth's coordinate system. Since the inertial signal processing (from the IMUs) is digital and no longer analogue - as was the case for prior art systems - the (amount of) data communicated is very efficient relative to prior art systems and - for a system of 10 sensors - corresponds to a data transfer package of only 8 KB/s (a factor of 1000 less than prior art systems that transmitted analogue data).
  • the pose is defined as including location and orientation.
  • the six degree of freedom pose can be determined with respect to a global or world coordinate system.
  • the central microprocessor processes all object pose operations in real-time.
  • In determining object pose at each point in time, the program considers body structure constraints that help in the recovery of joint configurations.
  • the configuration of an articulated body is specified by the joint angles that describe configurations of shoulders, elbows, and other body joints.
  • the invention computes position and orientation of body points as a function of the joint angles. Joint configurations depend on the underlying physical structure and - if different from a human body structure - can be modified by the software on the user's computing device (PC, mobile computing device or similar).
  • the configuration is specified by the joint angles that describe configurations of all body joints.
  • each node transmits, at a defined sequence in time (clocking pattern), a set of data (node id, time stamp, IMU data, distance matrix to all other nodes) to the CMP.
  • the system works as follows: Since there is no central microprocessor employed in autonomous mode, the nodes (1a, 1b, ...) determine one of their own to serve as the 'master' node, which then takes on the role of the CMP (1x) as depicted in Figure 2: it sends an initiation signal (heartbeat) over its wireless communication chip to the other nodes (1a, 1b, ...), establishing communication that allows each node to transfer data back to the 'master' node's wireless chip at a clocking pattern defined by the communication protocol (ANT+, WiFi, Bluetooth, UWB or similar). As part of that initialization the 'master' node detects 1) how many nodes make up the active configuration and 2) which node is attached to which part of the body.
  • the system can, in autonomous mode, be programmed to use the data to generate feedback and specific advice for the user's behavior that is to be displayed on a mobile computing device (trigger function).
  • Security applications such as authentication, authorization, safety
  • an external system needs to verify a user's authenticity or the physical state of the user.
  • Figure 1 shows the invention operating in autonomous mode with a possible arrangement of the nodes (1a, 1b, ...) attached to each moveable body segment of a human being, where an arbitrary node 1x takes the role of a central microprocessor (CMP) that manages the system's data processing rather than relying on a separate processing device such as a stationary or mobile computing device.
  • the number of nodes is variable (scalable).
  • the nodes analyze and interpret the motion data to formulate recommendations or trigger other actions.
  • Figure 2 shows schematically the invention in autonomous mode: sensor nodes (1a, 1b, ...) communicate with a CMP-enabled node of the same type (1x).
  • Figure 3 shows schematically the invention in dependent mode: sensor nodes (1a, 1b, ...) communicate with a CMP node of a different type (2).
  • Figure 4 shows the invention at an early stage after startup: the nodes (1a, 1b, etc.) do not yet know their exact position on the body. This uncertainty is visualized by multiple circles representing the most likely position of a node.
  • Figure 5 shows schematically the block diagram of a node (sensor).
  • the arrows represent data flow between the modules of a sensor node.
  • Figure 6 shows how drift is prevented:
  • Each node sends an ultrasonic signal at a specific point in time predetermined by the clocking pattern fixed at start-up by the wireless communication protocol. The signal is received by all other nodes that measure their distance to that specific node. These distance measurements enable the system to generate a graph structure where the edges are the inter-node distances and the vertices are the nodes.
  • This graph structure together with dynamic attitude data of each node, allows the system to estimate the node positions based on an object- dependent motion profile.
  • a motion profile consists of statistics about IMU raw data, attitude data and inter-node distance data. After a short time period during which the system has been used dynamically (i.e. in motion), the system has estimated the exact node positions relative to the underlying skeletal (body or object) structure.
  • the system performs this position optimization also during regular run-time in order to prevent the typical drift seen in prior art systems that were able to operate in real-time.
  • Trigger-Function to provide feedback and advice
  • This function can be adapted for use in rehabilitation, sports, engineering, animation, simulation, and commercial- as well as entertainment applications.
  • System recommendations may refer to the wearer's posture, his level of activity, the motion property of his limbs, the distribution of his weight, the position of his feet, knees and head, the synchronization and general flow of his movements and the range of his motions among others.
  • the software on the CMP evaluates the motion data (or features calculated therefrom) transmitted by the nodes and - apart from visualizing the movement in 3D - provides either statistics or characteristics of (body) posture, or returns instructions to adapt or change certain movements, postures or other specifics. It does so by comparing the estimated body posture or the recorded movement patterns with normative static posture templates or motion sequences for certain activities. It may also suggest performing certain exercises or a change of posture to prevent pain or strain of muscles and/or other body parts. Posture templates depend on the physical object currently being tracked. They can be implemented as static target poses or body alignments, or as statistics on the dynamic characteristics of object movement.
  • a target pose is a defined arrangement of sensor nodes in terms of inter-node distances and individual node attitude (up to an arbitrary, but defined, accuracy).
  • Statistics on the dynamic characteristics are time dynamic features calculated within a specific time window. They are calculated on a (sub-) set of the nodes and incorporate distance data, attitude, but also raw IMU data over multiple sampling periods.
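The bullets above describe building a graph of inter-node distances and estimating node positions from it. One standard way to recover relative 3D coordinates from a complete distance matrix is classical multidimensional scaling; the patent does not name a specific algorithm, so the following is an illustrative sketch of the principle, not the claimed method:

```python
import numpy as np

def positions_from_distances(D, dim=3):
    """Recover relative node coordinates from a full inter-node distance
    matrix D (n x n) via classical multidimensional scaling. The result
    is unique only up to a rigid transform (rotation/translation/reflection)."""
    n = D.shape[0]
    J = np.eye(n) - np.ones((n, n)) / n      # centering matrix
    B = -0.5 * J @ (D ** 2) @ J              # double-centered Gram matrix
    w, V = np.linalg.eigh(B)                 # eigenvalues in ascending order
    idx = np.argsort(w)[::-1][:dim]          # keep the 'dim' largest
    return V[:, idx] * np.sqrt(np.maximum(w[idx], 0.0))

# Example: four hypothetical nodes on a unit square
X = np.array([[0, 0, 0], [1, 0, 0], [1, 1, 0], [0, 1, 0]], float)
D = np.linalg.norm(X[:, None] - X[None, :], axis=-1)
P = positions_from_distances(D)
# the pairwise distances of P match D up to numerical precision
```

In the system described here the distance matrix may be incomplete and is fused with IMU attitude data, so a least-squares or filter-based variant would be used in practice.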

Landscapes

  • Engineering & Computer Science (AREA)
  • Health & Medical Sciences (AREA)
  • Life Sciences & Earth Sciences (AREA)
  • Physics & Mathematics (AREA)
  • Theoretical Computer Science (AREA)
  • Animal Behavior & Ethology (AREA)
  • Public Health (AREA)
  • Pathology (AREA)
  • General Physics & Mathematics (AREA)
  • Biomedical Technology (AREA)
  • Heart & Thoracic Surgery (AREA)
  • Medical Informatics (AREA)
  • Molecular Biology (AREA)
  • Surgery (AREA)
  • General Engineering & Computer Science (AREA)
  • General Health & Medical Sciences (AREA)
  • Biophysics (AREA)
  • Veterinary Medicine (AREA)
  • Computer Hardware Design (AREA)
  • Human Computer Interaction (AREA)
  • Dentistry (AREA)
  • Oral & Maxillofacial Surgery (AREA)
  • Physiology (AREA)
  • Radar, Positioning & Navigation (AREA)
  • Remote Sensing (AREA)
  • Computer Networks & Wireless Communication (AREA)
  • Measurement Of Velocity Or Position Using Acoustic Or Ultrasonic Waves (AREA)
  • Position Fixing By Use Of Radio Waves (AREA)

Abstract

The invention relates to the field of capturing, analyzing and interpreting the motion of one or several objects through a plurality of sensors and recommending actions to the object(s) based on the interpretation of the motion data.

Description

Self-Calibrating Motion Capture System
Field of the Invention
This invention relates to the field of capturing, analyzing and interpreting the motion of one or several objects through a plurality of sensors and recommending actions to the object(s) based on the interpretation of the motion data.
Background Information and Prior Art
Motion capture is used extensively in computer animation; see Bruderlin et al., Motion signal processing, Proceedings of SIGGRAPH 95, pp. 105-108, 1995; Gleicher, Retargetting motion to new characters, Proceedings of SIGGRAPH 98, pp. 33-42, 1998; Kovar et al., Motion graphs, ACM Transactions on Graphics 21, 3, pp. 473-482, 2002; and Arikan et al., Motion synthesis from annotations, ACM Transactions on Graphics 22, 3, pp. 402-408, 2003.
Several motion systems have been described. Their advantages and disadvantages are presented in several surveys: Meyer et al., A survey of position trackers, Presence 1, 2, pp. 173-200, 1992; Hightower et al., Location systems for ubiquitous computing, IEEE Computer 34, 8, pp. 57-66, 2001; and Welch et al., Motion tracking: No silver bullet, but a respectable arsenal, IEEE Computer Graphics and Applications, special issue on Tracking, 22, 6, pp. 24-38, 2002.
US 7,628,074 (D1) describes a motion capture system that comprises units sending ultrasonic pulses and units containing respective receivers as well as IMUs (Inertial Measurement Units). Emitters and receivers form a pair (cf. D1, page 3, line 56 f.) that is connected to a so-called driver module by cable. A significant handicap of that system is its dependency on preamplifiers and multiplexers, and its data intensity: due to the abundance of data accumulated and transmitted (1.33 MB/s for a 10-node system and more; 140,000 B/s * 10 = 1.33 MB/s), that system would not have been suitable for real-time use. (Wireless-G 802.11g has a bandwidth of 54 Mbit/s, equivalent to 6.59 MB/s.)
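The data-rate comparison can be checked with a few lines of arithmetic, using the figures as quoted in the text (the quoted MB/s values line up with the per-node figure if "MB" is read as 2^20 bytes):

```python
# Back-of-the-envelope check of the quoted data rates (figures from the text).
prior_art = 140_000 * 10        # bytes/s for a 10-node analogue system (D1)
wifi_g = 54_000_000 // 8        # Wireless-G 802.11g ceiling, 54 Mbit/s in bytes/s

print(prior_art / 2**20)        # ~1.34, matching the quoted 1.33 MB/s
print(prior_art / wifi_g)       # ~0.21: one object already uses a fifth of the channel
```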
Also, prior art motion capture systems (like the one described in D1) failed to be truly autonomous - either because they were not reliable in their representation of the object's underlying movements due to significant drift, or because they were too complex for a non-expert to use (and neither modular nor scalable), as they consisted of different types of units (i.e. sensors, ultrasonic sources, driver modules and similar), often were wired (connected to each other by cable), or only stored motion data rather than processing and transmitting data in real-time.
Prior art systems are not able to self-calibrate or to recalibrate (themselves) during use - for example to correct for drift - which still represents a major disadvantage of state-of-the-art systems and causes significant problems in real-life applications, since measurement errors accumulate over time, resulting in unacceptable aberrations of the analyzed movement patterns even after only a few minutes. Prior art systems describe automatic error corrections on a theoretical basis only, and either just store the motion data and later correct it using joint and other restrictions, or they fail to capture movements precisely in real-time, as indicated by the publications below:
• Damgrave et al., 2009: The Drift of the Xsens Moven Motion Capturing Suit during Common Movement in a Working Environment, Proceedings of the 19th CIRP Design Conference Competitive Design, 30-31 March, p. 338
• Sun et al., 2010: Adaptive Sensor Data Fusion in Motion Capture, Proceedings of the 13th International Conference on Information Fusion, 26-29 July
• Zhou, H. and H. Hu, 2010: Reducing Drifts in the Inertial Measurements of Wrist and Elbow Positions, IEEE Transactions on Instrumentation and Measurement, Vol. 59, No. 3, March 2010
Prior art systems consisted of different types of units (rather than one single type of node) that could not operate autonomously but required (power and computational support from) a central driver module to be able to communicate with each other. (Ultrasonic signal sources, i.e. senders, as well as sensors were usually connected to this driver module by cables; see also D1.) Prior art systems convert the analog ultrasonic signal into a digital one, transmit it together with the IMU data and decipher it on an external microprocessor. Such a process represents the traditional approach. The major disadvantage of such a handling is its speed (i.e. for real-time applications). This invention shows a solution in which the distance measurements between the sensors are evaluated on board each respective sensor, rather than converting the data and transmitting it together with the IMU data. Consequently, data processing and ultrasonic signal processing are significantly more efficient, and therefore faster and more precise.
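The on-board distance evaluation described above reduces, at its core, to a time-of-flight calculation on the receiving node's microcontroller. A minimal sketch of that step (the function name, the assumption of a protocol-synchronised clock, and the fixed speed of sound are illustrative assumptions, not the patent's implementation):

```python
SPEED_OF_SOUND = 343.0  # m/s in air at ~20 degrees C (assumed; varies with temperature)

def distance_from_tof(t_emit, t_receive):
    """One-dimensional inter-node distance from the time of flight of an
    ultrasonic pulse, evaluated on the receiving node itself rather than
    on an external microprocessor. Assumes emit and receive timestamps
    share a clock synchronised by the wireless protocol's clocking pattern."""
    tof = t_receive - t_emit          # seconds of flight
    return SPEED_OF_SOUND * tof       # metres

# a pulse emitted at t=0 s and received 2.5 ms later: nodes ~0.86 m apart
d = distance_from_tof(0.0, 0.0025)
```

Only the resulting distance (plus a timestamp and signal features) needs to be transmitted, which is what keeps the per-node data rate small.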
Prior art consists of and uses in particular the following devices and techniques which are disadvantageous: • Prior art systems used and differentiated two or more types of nodes: sources and sensors (and driver modules and other units).
• Prior art systems deployed a driver module to compile, record and store the sensor data.
• Prior art systems deployed separate A/D converters and transducer drivers etc.
• Prior art systems:
— either deployed no distance measurement through ultrasonic signal communication thereby exhibiting significant drift in pose estimation over a very short period of time.
- or accumulated and transmitted too much data so that after some time the 'simulated' or captured data deviates from and lags the underlying, correct data. These systems thus fail to qualify as real-time systems.
• Prior art systems were not able to calibrate themselves automatically and continuously while in use.
• Prior art systems did not provide feedback functions for the user to establish corrective and adaptive use of the system.
• Prior art systems consisted of a combination of a master device and multiple slave devices that were physically connected to the master node for power supply and communication.
• Prior art did not allow automatic processing of sensory data, but relied on human post-processing and parameter optimization in order to recover node attitude values and object poses.
On this background it is the intended goal of the invention to provide a simple, easy-to-use, self-calibrating, mobile, cost-efficient, versatile (sensor) system that captures movements of an object with utmost precision.
The invention describes the technical solution to:
• eliminate drift inherent in prior art (IMU) motion capture systems through automatic calibration of the IMU data dependent on the underlying physical structure (rig) of the object, by utilizing a [16-bit] integrated microcontroller-based ultrasound distance measuring system;
• be less obtrusive for the wearing object by being smaller, communicating wirelessly and deploying only one type of node that can send, receive, store and trigger signals (the system deploys no separate driver module). In addition the system offers the following novel benefits:
• Each sensor comprises its own microcontroller that communicates with that of the other nodes;
• it represents a fully modular, and thus scalable motion capture system (a flexible number of nodes can be used per object, additional different sensor types such as finger and sole sensors can seamlessly be integrated into the system);
• it is much simpler to use due to automatic (re)calibration during use;
• it allows for real-time processing, transmission and representation of the motion capture data;
• it allows the tracking of several objects, each with its own set of nodes through one central microprocessor (CMP);
• it operates in two types of modes - dependent and autonomous - ie with and without an external and separate central microprocessor (CMP). In autonomous mode, the nodes are capable of capturing, processing and storing all data onboard for later evaluation. In dependent mode nodes communicate, capture, process and transmit all data to a CMP that then triggers feedback for actions to the user;
• it allows wireless recharging of each sensor through a built-in inductive power-supply-unit (PSU).
Detailed description of embodiments and functions
Self-calibrating, wireless and autonomous Motion Capture Sensor System
The invention refers to a system that captures and transmits movements of objects (humans, animals, etc.) in real-time, without drift and wirelessly, to a computing device (PC, mobile device or other) which further processes the data - for example to analyze, visualize or animate the movements in a virtual 3D environment and/or to trigger signals depending on the pattern of movement.
The system is fully mobile, not stationary, and thus can be used anywhere. It consists of wearable, unobtrusive (small) nodes (sensors) which are autonomous (independent) of each other with regard to energy supply and computational power. Their microcontrollers (CPUs) allow them to react to specific events and calculate recommendations. They are fully controllable from a master device (CMP, central microprocessor).
The system consists of a number of nodes (la, lb, ...) arranged on an object (human, animal, other object). The nodes (la, lb, ...) are placed on those extremities of the object whose movements are intended to be captured (refer to Figure 1).
The system can operate in two modes: In a) (dependent mode) it can detect, analyze and transmit that object's movements over an extended period of time to a central microprocessor (CMP; ie stationary or mobile computing device). In b) (autonomous mode) it can record that object's motion data over time, store it onboard the node and give feedback and / or specific advice on the evaluated input. The system is also capable of extending the above functionality to monitor, analyze, store several objects' movements simultaneously (through each object's sensors serving as an independent body sensor network that is tracked separately).
The system is scalable with regards to the number of nodes used per object. The nodes (la, lb, ...) and the overall system compute data transformations on-board, in real-time and are able to self-calibrate and detect their position on the underlying physical structure automatically. The nodes (la, lb, ...) are able to adjust for drift during use, by comparing movements calculated from a digital filter algorithm that depends on node attitude data, inter-node distance data and raw IMU values. The digital filter module estimates with high accuracy the relative position of the nodes to each other by correcting drift using inter-node distance values.
Each node (la, lb, ...) contains an IMU (inertial measurement unit comprising a 3-axis gyroscope, a 3-axis accelerometer, and a magnetometer) to determine each node's 3-dimensional shift in positioning and a piezoelectric transducer (refer to Figure 5) that emits ultrasonic pulses in a sequence of x ms to measure the one-dimensional distance to all other nodes (by means of the ultrasonic signal emitted by one transducer and received by all others per unit in time). This system no longer relies on microphones, pre-amplifiers or multiplexing filters; it only deploys one type of unit (sensor/node) rather than the several different types found in most prior art systems.
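As an illustration of the transducer-based ranging described above, the sketch below converts an ultrasonic time-of-flight measurement into an inter-node distance. The function name, the timestamp convention and the assumed speed of sound are illustrative, not part of the claimed system.

```python
# Minimal sketch: one-way ultrasonic time of flight -> inter-node distance.
# The speed of sound (343 m/s in air at roughly 20 degrees C) is an assumption;
# a deployed system would compensate for temperature and clock-offset effects.
SPEED_OF_SOUND_M_S = 343.0

def tof_to_distance(t_emit_s: float, t_receive_s: float) -> float:
    """Convert emission/reception timestamps (seconds) into metres."""
    tof = t_receive_s - t_emit_s
    if tof < 0:
        raise ValueError("reception timestamp precedes emission")
    return SPEED_OF_SOUND_M_S * tof
```

For example, a pulse received 2 ms after emission corresponds to a distance of about 0.69 m.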
The IMU data in combination with the distances make it possible to determine the 3D locations of all the nodes (la, lb, ...) with respect to a local coordinate system. Recovering the global position of the system is also feasible with added, externally fixed beacon nodes. Given the 3D location and orientation, it is possible to determine joint angles according to an (articulated skeleton model) object using inverse kinematics processes, or to determine the joint angles directly from an incomplete distance matrix. (We extend inverse kinematics methods to work with distances instead of 3D locations.)
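To illustrate how pairwise distances can pin down node positions, the following sketch solves the planar three-anchor case, a simplified analogue of the beacon-based localisation mentioned above. The function and its inputs are hypothetical; the actual system works in 3D with many more distance constraints and IMU data.

```python
def trilaterate_2d(anchors, dists):
    """Recover the (x, y) position of one node from its measured distances
    to three fixed anchor nodes at known planar positions.
    Subtracting the circle equations pairwise yields a linear 2x2 system."""
    (x1, y1), (x2, y2), (x3, y3) = anchors
    r1, r2, r3 = dists
    a11, a12 = 2 * (x2 - x1), 2 * (y2 - y1)
    a21, a22 = 2 * (x3 - x1), 2 * (y3 - y1)
    b1 = r1**2 - r2**2 + x2**2 - x1**2 + y2**2 - y1**2
    b2 = r1**2 - r3**2 + x3**2 - x1**2 + y3**2 - y1**2
    det = a11 * a22 - a12 * a21
    if abs(det) < 1e-12:
        raise ValueError("anchors are collinear; position is ambiguous")
    x = (b1 * a22 - b2 * a12) / det
    y = (a11 * b2 - a21 * b1) / det
    return x, y
```

With anchors at (0,0), (4,0) and (0,4) and distances measured from the point (1,1), the solver returns (1, 1) up to floating-point error.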
The system requires two different programs: one for the central processing unit to control data flow and interpret the results, the other for the nodes to process the IMU and ultrasound data as well as transmit or store it and give specific advice based on the acquired information (on the CMP device).
Dependent mode (a)
In dependent mode the system works as follows: At startup the system's central microprocessor (CMP ie PC or mobile computing device) sends an initiation signal (heartbeat) over its wireless communication chip to the nodes (la, lb, ...), establishing communication that allows each node to transfer data back to the CMP / wireless chip at a clocking pattern defined by the communication protocol (ANT+, WiFi, Bluetooth, UWB or similar). As part of that initialization, the CMP detects 1) how many nodes make up the active configuration 2) which node is attached to which part of the body (refer to Figure 3).
Description of calibration:
Subsequently an example of a calibration procedure is given for a person. This calibration is performed to assess the exact position of each node on the respective object: following startup of the communication between nodes and CMP, the second step is an automatic calibration algorithm in which the user briefly performs a series of separate, short movements of different limbs; for example he has to I) stand still in a T-position, II) move one arm, III) move the other arm, IV) move one leg, V) move the other leg. The CMP can then determine each node's position on the underlying physical structure (articulated skeleton model) as depicted in Figure 4. (A lack of major movements during this calibration process, for example, indicates that a node is fixated on the torso.)
In state (I) the system can measure the distance of each node to all others; in states (II and III) the system detects which nodes are fixated on the arm that is currently being moved. The joint restrictions for that particular object help the system determine a) which way the object faces and b) approximately where on the arm the node(s) are fixated, since a wider movement radius is associated with a positioning closer towards the hand rather than the shoulder. The same logic applies to states (IV and V), in which the system detects which nodes (la, lb, ...) are fixated on the legs and where exactly each node is fixated. Just as with the arms, a wider motion radius suggests a position closer towards the foot rather than the thigh.
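The limb-assignment logic of calibration phases II-V can be caricatured as follows, under the assumption that a per-node motion-energy score for each phase has already been computed from the IMU streams; the threshold value and phase labels are invented for illustration.

```python
PHASES = ("arm one (II)", "arm two (III)", "leg one (IV)", "leg two (V)")

def assign_nodes(motion_energy, torso_threshold=0.1):
    """motion_energy: {node_id: [energy in phase II, III, IV, V]}.
    A node is assigned to the movement phase in which it moved most;
    nodes that barely move in any phase are assumed to sit on the torso."""
    assignment = {}
    for node, energies in motion_energy.items():
        peak = max(energies)
        if peak < torso_threshold:
            assignment[node] = "torso"  # no major movement in any phase
        else:
            assignment[node] = PHASES[energies.index(peak)]
    return assignment
```

A node that moves strongly only while one arm is waved is attributed to that arm; a near-motionless node is attributed to the torso, matching the heuristic stated above.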
The wireless channel protocol determines the clocking pattern for the communication between the nodes (la, lb, ...) and the CMP. It also determines the periodicity of the ultrasonic signal for each node. In other words the transducers of all nodes are clocked by the protocol of the communication circuit on board each node whereby the sequence pattern depends on the number of nodes simultaneously used by the individual system of one object. The invention utilizes the ultrasonic signal to assess the one-dimensional distances of each node to all others in the system and to correct the attitude data gathered by each node's IMUs (time of flight measurements complemented by linear accelerations and angular velocities).
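One plausible realisation of such a clocking pattern is a round-robin slot schedule in which each node's transducer fires only in its own slot, so pulses never collide and the cycle length grows with the number of nodes. The slot length and helper names below are assumptions, not the protocol actually used.

```python
def ping_schedule(node_ids, slot_ms):
    """First-cycle emission time (ms) for each node: one slot per node."""
    return {node: i * slot_ms for i, node in enumerate(node_ids)}

def next_ping_time(node_index, n_nodes, slot_ms, now_ms):
    """Time of the node's next emission slot at or after now_ms.
    The schedule repeats with a cycle of n_nodes * slot_ms."""
    cycle = n_nodes * slot_ms
    offset = node_index * slot_ms
    k = max(0, -(-(now_ms - offset) // cycle))  # ceil((now - offset) / cycle)
    return offset + k * cycle
```

With five nodes and 10 ms slots, node index 2 emits at t = 20 ms, 70 ms, 120 ms, and so on.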
Measured IMU (attitude) data and distances are further complemented utilizing constraints based on biomechanical characteristics of the underlying object (the human body, for example). The object's underlying skeleton model implies that body segments are linked by joints and that the nodes (la, lb, ...) are attached to the object's body segments. This model uses different constraints for knee and shoulder joints (as described in V.M. Zatsiorsky, Kinematics of Human Motion, Human Kinetics, 1998). By continuously correcting the kinematics using the joint relation (and distance measurements), unbounded integration drift is prevented.
Each node's (la, lb, ...) motion data is processed online onboard that node by the microcontroller through a digital filter (e.g., inspired by a complementary filter or an extended Kalman filter) to determine joint configurations for the body. The ultrasonic signal of the transducer is processed by the microcontroller and features (time stamp, time of flight and power data) are fed into the processing pipeline for the digital filter. At each unit in time - predetermined by the clocking pattern - a specific node sends one set of data to the CMP consisting of the node's ID, timestamp, IMU (attitude) data and distance to each other sensor. The IMU data from the accelerometer and gyroscope are passed through a digital filter on board each node to generate the node's orientation relative to the earth's coordinate system. Since the inertial signal processing (from the IMUs) is digital and no longer analogue - as was the case for prior art systems - the amount of data communicated is very small relative to prior art systems and - for a system of 10 sensors - corresponds to a data transfer package of only 8 KB/s (a factor of 1000 less than prior art systems that transmitted analogue data).
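The order of magnitude of the bandwidth figure above can be made plausible with assumed field widths; the actual packet layout is not specified in this document, so every byte count below is an assumption.

```python
def packet_bytes(n_nodes, id_b=1, ts_b=4, quat_b=8, dist_b=2):
    """Bytes per sample packet under assumed field widths: node ID,
    timestamp, attitude quaternion (four int16 components), and one
    2-byte distance entry per peer node."""
    return id_b + ts_b + quat_b + (n_nodes - 1) * dist_b

def system_rate(n_nodes, sample_hz):
    """Aggregate payload rate in bytes per second for the whole system."""
    return n_nodes * sample_hz * packet_bytes(n_nodes)
```

With these assumed widths, ten nodes each sending 31-byte packets at 25 Hz produce 7 750 B/s, i.e. close to the 8 KB/s quoted above.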
Recovering object pose: The pose is defined as including location and orientation. The six-degree-of-freedom pose can be determined with respect to a global or world coordinate system. To recover the pose of a tracked subject, a standard extended Kalman filter (EKF) is used to combine all the measurements the invention provides: accelerations from the accelerometer, angular velocities from the gyroscopes, and distances from the ultrasonic sensors.
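As a drastically simplified stand-in for the EKF described above, a one-axis complementary filter shows the underlying fusion idea: the fast but drifting gyroscope integral is continuously pulled towards a drift-free reference (here an accelerometer tilt angle; in the invention, inter-node distances play the corrective role). The gain value is an assumed tuning constant, not a parameter of the claimed system.

```python
def complementary_filter(gyro_rate, accel_angle, angle_prev, dt, alpha=0.98):
    """One-axis attitude estimate: integrate the gyroscope rate for the
    high-frequency part and blend in the accelerometer-derived angle as a
    low-frequency, drift-free correction."""
    return alpha * (angle_prev + gyro_rate * dt) + (1 - alpha) * accel_angle
```

Run repeatedly, the estimate tracks fast rotations via the gyroscope while the small (1 - alpha) term keeps long-term drift bounded.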
In dependent mode the central microprocessor (CMP) processes all object pose operations in real-time. In determining object pose at each point in time, the program considers body structure constraints that help in the recovery of joint configurations. The configuration of an articulated body is specified by the joint angles that describe the configurations of shoulders, elbows, and other body joints. The invention computes position and orientation of body points as a function of the joint angles. Joint configurations depend on the underlying physical structure and - if different from a human body structure - can be modified by the software on the user's computing device (PC, mobile computing device or similar).
The system adapts to position shifts by calibrating itself and automatically correcting for detected deviations: the distance measurements delivered by the ultrasonic signal processing allow the system to correct the attitude data gathered from the IMUs, which are exposed to significant drift and deviations over time (refer to Figure 6). To do so, each node transmits at a defined sequence in time (clocking pattern) a set of data (node ID, time stamp, IMU data, distance matrix to all other nodes) to the CMP.
Autonomous mode (b)
In autonomous mode the system works as follows: since there is no central microprocessor employed in autonomous mode, the nodes (la, lb, ...) determine one of their own to serve as the 'master' node, which then takes on the role of the CMP (lx) as depicted in Figure 2: it sends an initiation signal (heartbeat) over its wireless communication chip to the other nodes (la, lb, ...), establishing communication that allows each node to transfer data back to the 'master' node's wireless chip at a clocking pattern defined by the communication protocol (ANT+, WiFi, Bluetooth, UWB or similar). As part of that initialization the 'master' node detects 1) how many nodes make up the active configuration and 2) which node is attached to which part of the body.
Otherwise the workflow follows the 'Description of calibration' outlined above, with the key difference being that the data gathered is not stored and processed on a central microprocessor (of a computing device) but stored for later evaluation on a small data storage medium that is present on one or each node (a mini-SD card for example).
Depending on the application, the system can, in autonomous mode, be programmed to use the data to generate feedback and specific advice for the user's behavior that is to be displayed on a mobile computing device (trigger function).
The system's trigger function lends itself to use in:
• Medical applications such as orthopedic rehabilitation, diagnosis and monitoring of neurological disorders;
• Sports, where the course of movement, paths of motion and posture are analyzed and continuously optimized;
• Military training where subjects may receive direct recommendations from the system itself about how to improve their tactics;
• Security applications (such as authentication, authorization, safety) where an external system needs to verify a user's authenticity or the physical state of the user.
Figure 1 shows the invention operating in autonomous mode with a possible arrangement of the nodes (la, lb, ...) attached to each moveable body segment of a human being where an arbitrary node lx is taking the role of a central microprocessor (CMP) that is managing the system's data processing rather than relying on a separate processing device such as a stationary or mobile computing device. The number of nodes is variable (scalable). In autonomous mode the nodes analyze and interpret the motion data to formulate recommendations or trigger other actions.
Figure 2 shows schematically the invention in autonomous mode: sensor nodes (la, lb, ...) communicate to a CMP-enabled node of the same type (lx).
Figure 3 shows schematically the invention in dependent mode: sensor nodes (la, lb, ...) communicate to a CMP node of a different type (2).
Figure 4 shows the invention at an early stage after startup: the nodes (la, lb, etc.) do not know their exact position on the body yet. This uncertainty is visualized by multiple circles representing the most likely position of a node.
Figure 5 shows schematically the block diagram of a node (sensor). The arrows represent data flow between the modules of a sensor node.
Figure 6 shows how drift is prevented: each node sends an ultrasonic signal at a specific point in time predetermined by the clocking pattern fixed at start-up by the wireless communication protocol. The signal is received by all other nodes, which measure their distance to that specific node. These distance measurements enable the system to generate a graph structure where the edges are the inter-node distances and the vertices are the nodes. This graph structure, together with dynamic attitude data of each node, allows the system to estimate the node positions based on an object-dependent motion profile. A motion profile consists of statistics about IMU raw data, attitude data and inter-node distance data. After a short period during which the system has been used dynamically, i.e. the nodes have been moved, the system has estimated the exact node positions relative to the underlying skeletal (body or object) structure. The system performs this position optimization also during regular run-time in order to prevent the typical drift seen in prior art systems that were able to operate in real-time.
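The graph structure of Figure 6 can be sketched as a symmetric adjacency map keyed by node IDs; this representation is illustrative only, and a real implementation would also timestamp and filter each measurement.

```python
def distance_graph(measurements):
    """Build the graph of Figure 6: vertices are nodes, edge weights are
    measured inter-node distances.
    measurements: iterable of (sender_id, receiver_id, distance) tuples."""
    graph = {}
    for a, b, d in measurements:
        graph.setdefault(a, {})[b] = d
        graph.setdefault(b, {})[a] = d  # inter-node distance is symmetric
    return graph
```

Each new ultrasonic ping simply updates one symmetric edge, so the graph stays current as the clocking pattern cycles through the nodes.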
Trigger-Function to provide feedback and advice
This function can be adapted for use in rehabilitation, sports, engineering, animation, simulation, and commercial- as well as entertainment applications.
System recommendations may refer to the wearer's posture, his level of activity, the motion property of his limbs, the distribution of his weight, the position of his feet, knees and head, the synchronization and general flow of his movements and the range of his motions among others.
Depending on the application, the software on the CMP (app on smartphone, program on PC, etc.) evaluates the motion data (or features calculated from it) transmitted by the nodes and - apart from visualizing the movement in 3D - provides either statistics or characteristics of (body) posture, or returns instructions to adapt or change certain movements, postures or other specifics. It does so by comparing the estimated body posture or the recorded movement patterns with normative static posture templates or motion sequences for certain activities. It may also suggest performing certain exercises or a change of posture to prevent pain or strain of muscles and/or other body parts. Posture templates depend on the physical object currently being tracked. They can be implemented as static target poses or body alignments, or as statistics on the dynamic characteristics of object movement. A target pose is a defined arrangement of sensor nodes in terms of inter-node distances and individual node attitude (up to an arbitrary, but defined, accuracy). Statistics on the dynamic characteristics are time-dynamic features calculated within a specific time window. They are calculated on a (sub)set of the nodes and incorporate distance data and attitude, but also raw IMU data, over multiple sampling periods.
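A minimal sketch of comparing a measured pose against a target-pose template expressed as inter-node distances follows; the pair-keyed dictionary format and the idea of thresholding the largest deviation are assumptions for illustration, not the claimed feedback mechanism.

```python
def pose_deviation(measured, template):
    """Both arguments map node pairs (node_a, node_b) to inter-node
    distances. Returns the largest absolute deviation from the template,
    which trigger-function logic could compare against a tolerance."""
    return max(abs(measured[pair] - template[pair]) for pair in template)
```

Feedback code might then flag the pose as off-template whenever the returned deviation exceeds the accuracy defined for that target pose.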

Claims

Claims:
1. A system for capturing 3D-movement of at least one vertex, characterized in such a way that at each vertex a node (sensor unit, la, lb, ...) is located which comprises an IMU (at least one accelerometer, gyroscope and magnetometer), an ultrasonic transducer, a storage capacity and a microcontroller, so that this node continuously tracks, maps, computes, stores and transmits the movement data (attitude plus raw sensor data) of at least one vertex cooperatively through exchange of information.
2. A system for capturing movements according to claim 1, characterized in such a way that the distances between the nodes (la, lb, lc, ...) are continuously and pairwise measured (time of flight of ultrasonic pulse) so that this information serves as a basis for drift and error corrections.
3. The system of claim 1 or 2, in which nodes are stand-alone, interchangeable entities.
4. A system according to one of the claims 1-3 where nodes are autonomous with regard to energy and computational power support and work cooperatively.
5. A system according to one of the claims 1-4 where nodes form a scalable system of modular sensor units that capture movements in realtime.
6. A system according to one of the claims 1-5 where nodes can operate independent of a central microprocessor (CMP).
7. A system according to one of the claims 1-6 in which the nodes or the CMP continuously utilize movements of the object to (self-calibrate ie) detect the nodes' positions relative to each other on the underlying model (human or animal body, animate or inanimate articulated object etc.) whose physical structure is stored among others in a database onboard the nodes and which is recognized (automatically) by the system (of nodes) at initial startup (initial 'calibration').
8. A system according to one of the claims 1-7 in which these nodes can store, analyze, interpret and process the data in real-time.
9. A system according to one of the claims 1-8 in which nodes are said to be in self-sustained mode meaning that no separate control device needs to be present. The nodes store the data onboard for later upload to and analysis by a CMP.
10. A system according to one of the claims 1-9 in which the nodes capture, store, analyze, interpret, process and transmit the motion data in realtime to a central microprocessor (CMP) unit (PC, mobile computing device) that further processes and interprets this data for the user (for example: visual representation) continuously.
11. A system according to one of the claims 1-10 in which the system is tracking, analyzing, storing, interpreting motion data of more than one object simultaneously through at least one node attached to each object [and one additional node placed somewhere else serving as a reference point for the objects' distance and orientation to each other] .
12. A system according to one of the claims 1-11 in which the nodes can trigger feedback (visual, acoustic, electronic etc.) to the user to allow for a real-time correction in the object's movement, posture or position.
13. A system according to one of the claims 1-12 in which at least one node is both sender and receiver (lx in Figure 1) of data within the system.
14. A system according to one of the claims 1-13 in which the nodes can be switched to a state where they store raw data 'onboard' or transmit raw data to the CMP.
15. A system according to one of the claims 1-14 in which nodes are wireless.
16. A system according to one of the claims 1-15 in which the system allows the inclusion of other types of sensors such as sole pressure- or finger nodes (sensors).
17. A system according to one of the claims 1-16 in which each node contains an inductor to re-charge the node's battery wirelessly.
PCT/IB2013/000093 2013-01-25 2013-01-25 Self-calibrating motion capture system WO2014114967A1 (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
PCT/IB2013/000093 WO2014114967A1 (en) 2013-01-25 2013-01-25 Self-calibrating motion capture system

Applications Claiming Priority (1)

Application Number Priority Date Filing Date Title
PCT/IB2013/000093 WO2014114967A1 (en) 2013-01-25 2013-01-25 Self-calibrating motion capture system

Publications (1)

Publication Number Publication Date
WO2014114967A1 true WO2014114967A1 (en) 2014-07-31

Family

ID=47844407

Family Applications (1)

Application Number Title Priority Date Filing Date
PCT/IB2013/000093 WO2014114967A1 (en) 2013-01-25 2013-01-25 Self-calibrating motion capture system

Country Status (1)

Country Link
WO (1) WO2014114967A1 (en)


Citations (4)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US20020103610A1 (en) * 2000-10-30 2002-08-01 Government Of The United States Method and apparatus for motion tracking of an articulated rigid body
US20030182077A1 (en) * 2002-03-25 2003-09-25 Emord Nicholas Jon Seamless sensory system
US7628074B2 (en) 2007-03-15 2009-12-08 Mitsubishi Electric Research Laboratories, Inc. System and method for motion capture in natural environments
US20120046901A1 (en) * 2009-01-21 2012-02-23 Birmingham City University Motion capture apparatus


Non-Patent Citations (11)

* Cited by examiner, † Cited by third party
Title
ARIKAN ET AL., MOTION SYNTHESIS FROM ANNOTATIONS, 2003, pages 402 - 408
BRUDERLIN ET AL.: "Motion signal processing", PROCEEDINGS OF SIGGRAPH, vol. 95, 1995, pages 105 - 108
DAMGRAVE ET AL.: "The Drift of the Xsens Moven Motion Capturing Suit during Common Movement in a Working Environment", PROCEEDINGS OF THE 19TH CIRP DESIGN CONFERENCE COMPETITIVE DESIGN, 30 March 2009 (2009-03-30), pages 338
GLEICHER: "Retargetting motion to new characters", PROCEEDINGS OF SIGGRAPH, vol. 9, 1998, pages 33 - 42
HIGHTOWER ET AL.: "Location systems for ubiquitous computing", IEEE COMPUTER, vol. 34, no. 8, 2001, pages 57 - 66
KOVAR ET AL.: "Motion graphs", ACM TRANSACTIONS ON GRAPHICS, vol. 21, 2002, pages 473 - 482
MEYER ET AL.: "A survey of position- trackers", PRESENCE, vol. 1, no. 2, 1992, pages 173 - 200
SUN ET AL.: "2010: Adaptive Sensor Data Fusion in Motion Capture", PROCEEDINGS OF THE 13TH INTERNATIONAL CONFERENCE ON INFORMATION FUSION, 26 July 2010 (2010-07-26)
VLASIC D ET AL: "Practical motion capture in everyday surroundings", ACM TRANSACTIONS ON GRAPHICS (TOG), ACM, US, vol. 26, no. 3, 29 July 2007 (2007-07-29), pages 35/1 - 35/10, XP007910935, ISSN: 0730-0301, DOI: 10.1145/1276377.1276421 *
WELCH ET AL.: "Motion tracking: No silver bullet, but a respectable arsenal", IEEE COMPUTER GRAPHICS AND APPLICATIONS, SPECIAL ISSUE ON TRACKING, vol. 22, no. 6, 2002, pages 24 - 38
ZHOU, H.; H. HU: "Reducing Drifts in the Inertial Measurements of Wrist and Elbow Positions", IEEE TRANSACTIONS ON INSTRUMENTATION AND MEASUREMENT, vol. 59, no. 3, March 2010 (2010-03-01)

Cited By (29)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
JP2016112108A (en) * 2014-12-12 2016-06-23 カシオ計算機株式会社 Exercise information display system, exercise information display method, and exercise information display program
EP3278321A4 (en) * 2015-03-31 2018-09-26 CAE Inc. Multifactor eye position identification in a display system
ES2615167A1 (en) * 2015-12-04 2017-06-05 José María GARCÍA RIELO Device for registering the natural position of the head of an individual through unit of inertial measurement (imu) and associated method (Machine-translation by Google Translate, not legally binding)
JP2017164376A (en) * 2016-03-17 2017-09-21 株式会社東芝 Behavior estimation device, behavior estimation method and behavior estimation program
US10642368B2 (en) 2016-11-21 2020-05-05 Htc Corporation Body posture detection system, suit and method
TWI647595B (en) * 2016-11-21 2019-01-11 宏達國際電子股份有限公司 Body posture detection system, wearable device and method
EP3324204A1 (en) * 2016-11-21 2018-05-23 HTC Corporation Body posture detection system, suit and method
CN108089699A (en) * 2016-11-21 2018-05-29 宏达国际电子股份有限公司 Human posture's detecting system, clothes and method
CN110023884B (en) * 2016-11-25 2022-10-25 森索里克斯股份公司 Wearable motion tracking system
RU2746686C2 (en) * 2016-11-25 2021-04-19 Сенсорикс Аг Wearable motion tracking system
CN110023884A (en) * 2016-11-25 2019-07-16 森索里克斯股份公司 Wearable motion tracking system
WO2018095804A1 (en) * 2016-11-25 2018-05-31 Sensoryx AG Wearable motion tracking system
US10768691B2 (en) 2016-11-25 2020-09-08 Sensoryx AG Wearable motion tracking system
AU2017365223B2 (en) * 2016-11-25 2022-07-07 Sensoryx AG Wearable motion tracking system
DE102017120741A1 (en) * 2017-09-08 2019-03-14 Tim Millhoff Device, system and method for decoupling a VR system from infrastructure and localized hardware
CN108170268A (en) * 2017-12-26 2018-06-15 浙江大学 A kind of Whole Body motion capture devices based on Inertial Measurement Unit
EP3843619B1 (en) * 2018-08-29 2023-04-26 Pulsion Medical Systems SE Method and apparatus for correcting a blood pressure measurement taken at a measurement location
CN109269483B (en) * 2018-09-20 2020-12-15 国家体育总局体育科学研究所 Calibration method, calibration system and calibration base station for motion capture node
CN109269483A (en) * 2018-09-20 2019-01-25 国家体育总局体育科学研究所 A kind of scaling method of motion capture node, calibration system and calibration base station
WO2020200673A1 (en) * 2019-03-29 2020-10-08 Nokia Technologies Oy Haptic feedback
US20220187917A1 (en) * 2019-03-29 2022-06-16 Nokia Technologies Oy Haptic feedback
EP3716017A1 (en) * 2019-03-29 2020-09-30 Nokia Technologies Oy Haptic feedback
US11841990B2 (en) 2019-03-29 2023-12-12 Nokia Technologies Oy Haptic feedback
CN110044377A (en) * 2019-04-08 2019-07-23 南昌大学 A kind of IMU off-line calibration method based on Vicon
CN110517750A (en) * 2019-08-21 2019-11-29 兰州交通大学 A kind of more human action method for catching of fusion WIFI positioning and inertia sensing
EP4084936A4 (en) * 2019-12-31 2024-01-10 Human Mode Llc Proxy controller suit with optional dual range kinematics
CN111915943A (en) * 2020-08-18 2020-11-10 营口巨成教学科技开发有限公司 Unmanned public nursing and first-aid training system and training method
WO2023277952A1 (en) * 2021-06-28 2023-01-05 Google Llc System and method for motion capture
WO2023100565A1 (en) * 2021-11-30 2023-06-08 リオモ インク Running form evaluation system, program, and method

Similar Documents

Publication Publication Date Title
WO2014114967A1 (en) Self-calibrating motion capture system
US8165844B2 (en) Motion tracking system
Xu et al. Geometrical kinematic modeling on human motion using method of multi-sensor fusion
Roetenberg et al. Xsens MVN: Full 6DOF human motion tracking using miniature inertial sensors
CN106648116B (en) Virtual reality integrated system based on motion capture
US20180089841A1 (en) Mixed motion capture system and method
Peppoloni et al. A novel 7 degrees of freedom model for upper limb kinematic reconstruction based on wearable sensors
US20180132761A1 (en) Use of epidermal electronic devices to measure orientation
JP6526026B2 (en) Method and system for determining motion of object
JP2016511400A (en) Position detection apparatus and method
US9021712B2 (en) Autonomous system and method for determining information representative of the movement of an articulated chain
Hindle et al. Inertial‐Based Human Motion Capture: A Technical Summary of Current Processing Methodologies for Spatiotemporal and Kinematic Measures
CN110609621B (en) Attitude calibration method and human motion capture system based on micro-sensors
CN109284006A (en) A human motion capture device and method
US20180216959A1 (en) A Combined Motion Capture System
Taunyazov et al. A novel low-cost 4-DOF wireless human arm motion tracker
Qiu et al. Heterogeneous data fusion for three-dimensional gait analysis using wearable MARG sensors
CN105628028A (en) Indoor three-dimensional positioning system and positioning method based on mobile phone built-in sensor
Ahmadi et al. Human gait monitoring using body-worn inertial sensors and kinematic modelling
CN109453505B (en) Multi-joint tracking method based on wearable device
Zhang et al. Ubiquitous human body motion capture using micro-sensors
Zhang et al. 3D upper limb motion modeling and estimation using wearable micro-sensors
Seel et al. IMU-based joint angle measurement made practical
Low et al. A wearable wireless sensor network for human limbs monitoring
JP6205387B2 (en) Method and apparatus for acquiring position information of virtual marker, and operation measurement method

Legal Events

Code Title Description

121 Ep: the EPO has been informed by WIPO that EP was designated in this application
(Ref document number: 13708522; Country of ref document: EP; Kind code of ref document: A1)

NENP Non-entry into the national phase
(Ref country code: DE)

122 Ep: PCT application non-entry in European phase
(Ref document number: 13708522; Country of ref document: EP; Kind code of ref document: A1)