WO2009112281A1 - Garment-integrated apparatus for the detection, analysis and online feedback of/concerning body posture and movements - Google Patents

Garment-integrated apparatus for the detection, analysis and online feedback of/concerning body posture and movements

Info

Publication number
WO2009112281A1
Authority
WO
WIPO (PCT)
Prior art keywords
garment
user
terminals
orientation
sensors
Prior art date
Application number
PCT/EP2009/001864
Other languages
English (en)
Inventor
Holger Harms
Daniel Roggen
Gerhard Tröster
Original Assignee
Eth Zurich
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by Eth Zurich filed Critical Eth Zurich
Publication of WO2009112281A1

Classifications

    • A HUMAN NECESSITIES
    • A61 MEDICAL OR VETERINARY SCIENCE; HYGIENE
    • A61B DIAGNOSIS; SURGERY; IDENTIFICATION
    • A61B5/00 Measuring for diagnostic purposes; Identification of persons
    • A61B5/103 Detecting, measuring or recording devices for testing the shape, pattern, colour, size or movement of the body or parts thereof, for diagnostic purposes
    • A61B5/11 Measuring movement of the entire body or parts thereof, e.g. head or hand tremor, mobility of a limb
    • A61B5/1116 Determining posture transitions
    • A HUMAN NECESSITIES
    • A61 MEDICAL OR VETERINARY SCIENCE; HYGIENE
    • A61B DIAGNOSIS; SURGERY; IDENTIFICATION
    • A61B5/00 Measuring for diagnostic purposes; Identification of persons
    • A61B5/103 Detecting, measuring or recording devices for testing the shape, pattern, colour, size or movement of the body or parts thereof, for diagnostic purposes
    • A61B5/11 Measuring movement of the entire body or parts thereof, e.g. head or hand tremor, mobility of a limb
    • A61B5/1118 Determining activity level
    • A HUMAN NECESSITIES
    • A61 MEDICAL OR VETERINARY SCIENCE; HYGIENE
    • A61B DIAGNOSIS; SURGERY; IDENTIFICATION
    • A61B5/00 Measuring for diagnostic purposes; Identification of persons
    • A61B5/68 Arrangements of detecting, measuring or recording means, e.g. sensors, in relation to patient
    • A61B5/6801 Arrangements of detecting, measuring or recording means, e.g. sensors, in relation to patient specially adapted to be attached to or worn on the body surface
    • A61B5/6802 Sensor mounted on worn items
    • A61B5/6804 Garments; Clothes
    • A HUMAN NECESSITIES
    • A61 MEDICAL OR VETERINARY SCIENCE; HYGIENE
    • A61B DIAGNOSIS; SURGERY; IDENTIFICATION
    • A61B5/00 Measuring for diagnostic purposes; Identification of persons
    • A61B5/72 Signal processing specially adapted for physiological signals or for diagnostic purposes
    • A61B5/7235 Details of waveform analysis
    • A61B5/7264 Classification of physiological signals or data, e.g. using neural networks, statistical classifiers, expert systems or fuzzy systems
    • G PHYSICS
    • G06 COMPUTING; CALCULATING OR COUNTING
    • G06F ELECTRIC DIGITAL DATA PROCESSING
    • G06F1/00 Details not covered by groups G06F3/00 - G06F13/00 and G06F21/00
    • G06F1/16 Constructional details or arrangements
    • G06F1/1613 Constructional details or arrangements for portable computers
    • G06F1/163 Wearable computers, e.g. on a belt
    • A HUMAN NECESSITIES
    • A41 WEARING APPAREL
    • A41D OUTERWEAR; PROTECTIVE GARMENTS; ACCESSORIES
    • A41D13/00 Professional, industrial or sporting protective garments, e.g. surgeons' gowns or garments protecting against blows or punches
    • A41D13/12 Surgeons' or patients' gowns or dresses
    • A41D13/1236 Patients' garments
    • A41D13/1281 Patients' garments with incorporated means for medical monitoring
    • A HUMAN NECESSITIES
    • A61 MEDICAL OR VETERINARY SCIENCE; HYGIENE
    • A61B DIAGNOSIS; SURGERY; IDENTIFICATION
    • A61B2505/00 Evaluating, monitoring or diagnosing in the context of a particular type of medical care
    • A61B2505/09 Rehabilitation or training
    • A HUMAN NECESSITIES
    • A61 MEDICAL OR VETERINARY SCIENCE; HYGIENE
    • A61B DIAGNOSIS; SURGERY; IDENTIFICATION
    • A61B5/00 Measuring for diagnostic purposes; Identification of persons
    • A61B5/72 Signal processing specially adapted for physiological signals or for diagnostic purposes
    • A61B5/7235 Details of waveform analysis
    • A61B5/7264 Classification of physiological signals or data, e.g. using neural networks, statistical classifiers, expert systems or fuzzy systems
    • A61B5/7267 Classification of physiological signals or data, e.g. using neural networks, statistical classifiers, expert systems or fuzzy systems involving training the classification device
    • G PHYSICS
    • G16 INFORMATION AND COMMUNICATION TECHNOLOGY [ICT] SPECIALLY ADAPTED FOR SPECIFIC APPLICATION FIELDS
    • G16H HEALTHCARE INFORMATICS, i.e. INFORMATION AND COMMUNICATION TECHNOLOGY [ICT] SPECIALLY ADAPTED FOR THE HANDLING OR PROCESSING OF MEDICAL OR HEALTHCARE DATA
    • G16H50/00 ICT specially adapted for medical diagnosis, medical simulation or medical data mining; ICT specially adapted for detecting, monitoring or modelling epidemics or pandemics
    • G16H50/20 ICT specially adapted for medical diagnosis, medical simulation or medical data mining; ICT specially adapted for detecting, monitoring or modelling epidemics or pandemics for computer-aided diagnosis, e.g. based on medical expert systems

Definitions

  • the invention is in the field of smart textiles for posture classification.
  • Posture classification is an essential basis for activity recognition in various health-related applications. These include virtual assistants for movement rehabilitation to regain movement flexibility, or coaching support to maintain favorable upper body postures during daily activities.
  • the common vision for virtual movement assistants is to empower the user with preventive coaching to reduce risk of hospitalization and chronic diseases and track rehabilitation progress.
  • Textile-based posture sensing solutions have been investigated for different sensors and target applications.
  • Major research areas include the reconstruction of hand gestures with glove-attached accelerometers [1], conductive elastomers [2] or cameras [3, 4].
  • One pioneering project for upper body monitoring was the Georgia Tech Wearable Motherboard [5].
  • The system uses optical fibers to detect bullet wounds and other sensor modalities to monitor a soldier's vital signs in combat.
  • Strain sensors restrict the application to specific body regions.
  • off-body computers are utilized for the processing of sensor data.
  • monitoring applications in rehabilitation and sports require free movement and on-body processing capabilities.
  • PadNET is a wearable data acquisition platform for the upper body.
  • PadNET consists of a multistage sensor network with large-area sensors that was integrated into a jacket. Sensors were connected by woven wires and placed into pockets. A coupling between the sensors and the body segments is achieved by the jacket being rather stiff and the sensors having a large area covering a substantial part of the jacket's area.
  • A crucial drawback is the bandwidth limitation of the system bus. The system proposed here shrinks the area of garment-attached sensing platforms by a factor of 12.5 compared to PadNET.
  • Gyroscopes and accelerometer-magnetometer pairs are well-established sensing solutions for posture tracking.
  • the complementary nature of the sensors was exploited in academia and industry.
  • Several approaches were made to determine the orientation of stand-alone sensor units using Kalman filtering [11, 12, 13]. None of these academic investigations is textile-based, and all rely on a defined number and location of sensors.
  • a further object of the invention is to provide a method for measuring of the orientation of at least one body segment, preferably for detection of postures and/or body movements and/or activities, that enables daylong, unobtrusive recordings of subjects in everyday life situations.
  • The garment according to the invention is in particular an upper-body garment, e.g. a long-sleeve shirt, but could also be trousers or an all-in-one suit. It comprises an apparatus for measurement of body segment orientation which is integrated into the garment. This apparatus comprises a plurality of sensing terminals and at least one processing unit in communication with the terminals.
  • the garment is loose-fitting and flexible.
  • The garment substrate is made of a standard textile material, e.g. knitted or woven fabrics or fleece.
  • The method for measurement of body segment orientation comprises the steps of providing such a loose-fitting garment with an apparatus for measurement of body segment orientation integrated into the garment, and determining an orientation of at least one body segment by processing and analyzing the sensor signals in the processing unit. From the orientation information, postures and/or body movements and/or activities can be determined. Contrary to the posture sensing garments according to the prior art, where sensors are either attached to the skin or to a tight-fitting textile in order to establish a direct coupling between sensor and skin, the present invention does not rely on direct coupling between the sensor and the user's body.
  • the invention is based on a loose fitting garment, which may be an everyday piece of clothing.
  • the sensors are thus only loosely coupled to the user's body; variations in the exact positioning of the sensors with respect to the user's body occur.
  • these uncertainties caused by the loose fit generally do not affect classification of several main postures.
  • the loose fit may even make discrimination between postures easier, because a typical response of the garment can be taken into account, e.g. stretching of certain garment areas and thus displacement of corresponding sensors.
  • several approaches can be made, e.g. using redundant sensors and/or knowledge about the system (model calculations, reference measurements).
  • a posture sensing garment is integrated into a loose fitting upper body garment, e.g. an everyday life garment.
  • The garment is unobtrusive, wearable and user-friendly, and hence allows long-term recordings in everyday life situations. That the garment is loose-fitting means that the electronic components, in particular the sensors, do not exert pressure on the skin and are not fixedly coupled to the body. The user is thus not hindered in his or her movements. This is especially helpful for motion/posture detection with children or in sports.
  • The invention may be used as a user interface for online control of a computer or similar device (e.g. a PlayStation) by means of detection of certain movements or postures of the user.
  • Miniature sensors with dimensions less than 10 mm x 10 mm x 5 mm (e.g. having an area of 8x8 mm² and a thickness of 5 mm) are used. They are not obviously visible and do not affect the garment's properties or orientation.
  • Sensors with dimensions of 8 mm x 10 mm x 5 mm are used, which exert a pressure of approx. 100 N/m².
  • a reference for a maximum size/weight may be a typical button.
  • A loose-fitting garment in the context of the invention preferably means that the total area or certain dimensions of a garment of a given size exceed the corresponding body area or body dimensions of a standard user of that size by a predetermined value, e.g. by at least 10%.
  • the predetermined body area is for example covered by the garment when worn.
  • Predetermined garment or body dimensions can be arm length, chest length/width, wrist/neck circumference.
  • a larger surface or larger dimensions of the garment mean that the garment is not tight fitting. There is thus enough elbowroom for the user. If the difference is not too large, e.g. less than 20-30%, there is still a sufficient spatial relation between the parts of the body and the sensors assigned thereto.
  • An upper limit for the "looseness" of the fit may be defined by means of the deviations between the sensor signals of sensors applied to the garment (textile sensor) and reference sensors directly coupled to the associated body segment: For example, a difference in the detected angles of the gravitational vectors of 10-40°, e.g. 15°, may for certain applications still be tolerated while a larger deviation may not.
  • loose-fitting in the context of the invention preferably means that the garment does not exert compressive forces on the user's body when worn.
  • the pressure exerted by the garment when worn does not exceed the pressure caused by the weight of the garment.
  • the sensors and other electronic components are thus not pressed against the user's skin for the purpose of fixation.
  • The garment allows mobility of garment-attached sensing terminals - they are not specifically fixed to the body. There is thus only a loose coupling between the body segment whose orientation is to be measured and the respective sensor on/in the garment.
  • An upper limit for the tolerance between the position of the textile sensor and a reference position on the body segment may be 0.5-5 cm. It may also be determined depending on the result of a reference measurement as mentioned above.
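  • Illustrative only: the following minimal Python sketch (not part of the original disclosure) shows one way such a looseness criterion could be checked, by comparing the gravity vector reported by a garment-mounted sensor with that of a skin-mounted reference sensor. The function names and the 15° default threshold are assumptions for illustration.

```python
import numpy as np

def gravity_angle_deg(a_textile, a_reference):
    """Angle in degrees between two measured gravity vectors."""
    a, b = np.asarray(a_textile, float), np.asarray(a_reference, float)
    cos = np.dot(a, b) / (np.linalg.norm(a) * np.linalg.norm(b))
    return np.degrees(np.arccos(np.clip(cos, -1.0, 1.0)))

def fit_is_acceptable(a_textile, a_reference, max_deviation_deg=15.0):
    """Loose-fit check: the deviation between the textile sensor and the
    body-attached reference sensor must stay below an application-specific
    tolerance (e.g. 15 degrees)."""
    return gravity_angle_deg(a_textile, a_reference) <= max_deviation_deg

# Example: textile sensor tilted roughly 10 degrees relative to the reference.
print(fit_is_acceptable([0.17, 0.0, 0.98], [0.0, 0.0, 1.0]))  # True
```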
  • The system according to the invention is preferably autonomous: power supply, recording, processing and feedback hardware, software for measuring and analyzing sensor data, compensation of errors caused by the loose fit, detection and/or classification of the orientation of body segments, postures and activities, and provision of feedback are integrated into the garment. These components are preferably also miniaturized.
  • The proposed shirt performs sensing, classification and feedback autonomously.
  • An interface for communication with an external unit, e.g. a data processing unit, is preferably provided.
  • the apparatus is preferably able to provide an online feedback regarding recorded body orientation measurement, e.g. adopted postures.
  • Postures trained by a user, e.g. in physiotherapy or sports, can be classified online by the garment according to the invention.
  • Feedback, e.g. an optical, vibrational and/or acoustic signal, may be provided to the user by means of a signal unit depending on the adopted posture.
  • the architecture of the data acquisition and processing apparatus is flexible. It can be chosen to meet the needs of a specific application.
  • the garment according to the invention thus provides a certain flexibility of the measuring setup, i.e. number, type and positioning of the sensors.
  • sensors can generally be placed anywhere at the garment; preferably, their location is not fixed but can be adjusted to the purpose of the measurement.
  • the sensors can preferably be attached or removed during runtime (plug 'n play capable). For error reduction, redundant sensors can be placed all over the garment, and a subset of sensors can be selected by feature ranking and multi-sensor fusion.
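  • Illustrative only: the sketch below (not part of the original disclosure) shows one possible way to select a subset of redundant garment sensors by feature ranking, here by ranking each sensor's orientation signal against a body-attached reference. The correlation criterion and all names are assumptions; any other ranking measure could be substituted.

```python
import numpy as np

def rank_sensors_by_reference(sensor_signals, reference_signal, keep=3):
    """Rank redundant garment sensors by how strongly their orientation
    signal correlates with a body-attached reference, and keep the best.

    sensor_signals: dict name -> 1-D array of measured angles over time
    reference_signal: 1-D array of reference angles over the same window
    """
    scores = {
        name: abs(np.corrcoef(sig, reference_signal)[0, 1])
        for name, sig in sensor_signals.items()
    }
    ranked = sorted(scores, key=scores.get, reverse=True)
    return ranked[:keep], scores

# Example with two informative sensors and one uninformative one (synthetic data).
t = np.linspace(0, 2 * np.pi, 200)
ref = np.sin(t)
signals = {"wrist": ref + 0.05 * np.random.randn(200),
           "upper_arm": 0.8 * ref + 0.1 * np.random.randn(200),
           "back": np.random.randn(200)}
best, scores = rank_sensors_by_reference(signals, ref, keep=2)
print(best)  # typically ['wrist', 'upper_arm']
```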
  • The data acquisition and processing apparatus is integrated into the textile, preferably such that the electronic components and their connecting lines are fixedly connected to the garment. Preferably, they are encapsulated by an insulating and waterproof housing, for example a hot-melted silicone gel housing.
  • the terminals and/or the power supply may be removably integrated, e.g. by providing receiving pockets.
  • Posture can be detected by sensors such as strain gauges, accelerometers, gyroscopes, magnetic field sensors, optical fibers, goniometers, pressure sensors, sensors for ultrasonic distance measurements and/or optical systems.
  • Small and low-power inertial sensing units are used.
  • Orientation errors of sensors introduced by loose fitting clothing are preferably compensated inherently by a classifier and/or classifier fusion and/or multi-sensor fusion.
  • For example, generally known pattern recognition methods can also be used to classify a posture or movement using a set of sensor signals or measured orientations.
  • Sensor orientation errors introduced by attaching terminals to the loose fitting garment can be compensated by the fusion of multiple sensing terminals.
  • Sensor information can be fused either on signal and/or feature and/or recognition level.
  • One sensor or a subset of many sensors is selected according to the quality of the information they deliver. For example, only sensors that are less affected by garment-induced orientation errors, delivering an orientation measurement highly correlated to the actual body-segment orientation, are considered. A corresponding reference measurement with reference sensors attached to the skin can be carried out to determine this subset. Alternatively, the subset of sensors can be determined by evaluating the sensors' output information content. In a fusion of sensor information on classification level, each individual sensor's data are analyzed and used for recognition of postures or activity. In a subsequent step the individual sensors' classification results are fused into one final result, e.g. by majority vote (see the illustrative sketch below).
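  • Illustrative only: a minimal Python sketch (not part of the original disclosure) of classification-level fusion by majority vote as described above. The per-sensor posture labels are placeholders; any recognizer could produce them.

```python
from collections import Counter

def fuse_by_majority_vote(per_sensor_labels):
    """Fuse individual sensor classification results into one final label.

    per_sensor_labels: list of posture labels, one per sensing terminal,
    e.g. ['sitting', 'sitting', 'standing'].
    Ties are broken by the order in which labels first appear.
    """
    counts = Counter(per_sensor_labels)
    return counts.most_common(1)[0][0]

print(fuse_by_majority_vote(["arm_raised", "arm_raised", "normal"]))  # arm_raised
```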
  • An alternative method for compensating garment drape includes the simulation of the garment drape using e.g. discrete or continuous particle models.
  • the simulation of the garment drape can be performed e.g. inside the integrated processing unit.
  • Measured body-segment orientation data can be enhanced with simulation results for improving measurement accuracy.
  • the effective errors can be measured directly using body-attached reference sensors. Again, their information can be utilized for compensating the measured orientation errors.
  • The shirt is loose-fitting. Subjects can move without hindrance, and the shirt is suitable for unsupervised use in everyday life. Invisible sensing and processing of data will increase system acceptance.
  • the errors introduced by the placement of sensors onto a loose fitting textile can be compensated inherently, e.g. by a classifier and a multi-sensor fusion approach.
  • Fig. 1a A garment according to the invention (inside out) showing the central data processing unit (Konnex) and three decentral preprocessing units (Gateways; each is marked with a circle, the fourth Gateway is hidden).
  • Fig. 1b Placement and routing of a system according to the invention with two Terminal units.
  • The system bus was embedded on the shirt using silicone gel. Dotted lines indicate sewing lines where the system bus was routed to relieve it of mass and garment strain. Double lines perpendicular to the shirt's stretching zones indicate textile strain that was accommodated by routing the bus in winding paths.
  • Fig. 2a Dependence between the information flow and the system architecture.
  • Fig. 2b An exemplary system architecture designed in three processing layers of standard activity recognition tasks. Sensor data is acquired and preprocessed by Terminals (first layer), features are computed by Gateways that fuse sensor data (second layer), and recognition tasks are performed by a Konnex (third layer).
  • Fig. 3 The worn garment. White bars at the upper arm, back and lower waist are the Gateway's connectors to Terminals (marked with a circle).
  • Fig. 4 The left picture shows an ADC Terminal (8x8 mm) and an acceleration Terminal (8x10 mm). On the right, a silicone-integrated Gateway is shown.
  • Fig. 5 Experimental procedure for the system evaluation.
  • Fig. 6 Three different repetitions of the exercise, performed by an exemplary user.
  • Fig. 7 Average error in the orientation of the arm during the repetition of the experiment.
  • Fig. 8 Twelve classified postures.
  • Fig. 9 Sensor placement at the wrist and upper arm (marked with a circle).
  • Fig. 10 Posture classification accuracy for the rehabilitation exercises and the three training modes: user-specific, user-adapted and user-independent.
  • Fig. 11 Posture classification confusion matrix for the rehabilitation exercises in user-specific training mode.
  • Fig. 12 Posture classification confusion matrix for the rehabilitation exercises in user-independent training mode.
  • Fig. 13 User-specific classification accuracy variation (maximum-minimum) in relation to user arm length and body size. The dashed line indicates the result of a linear fitting.
  • SMASH posture and movement sensing platform
  • The sensing platform is integrated into a textile that is not specifically tightened.
  • SMASH is designed to operate as a comfortable monitoring garment for everyday use in movement rehabilitation or sports coaching. Sensing and processing tasks can be efficiently implemented using a distributed system architecture. Other architectures are also possible.
  • SMASH is introduced and the garment's sensing and hierarchical data processing architecture is described.
  • The latter comprises three processing layers: sensor data acquisition, feature processing and classification.
  • Sensing terminals are connected to a textile-integrated core system, consisting of decentral preprocessing units (interface gateways) and a central system master.
  • the master performs classification tasks on the preprocessed data from the gateways.
  • a characterization procedure to analyze the SMASH system is presented. With this approach, the system's resolution is evaluated for arm postures with five users. The posture resolution is derived from the absolute measurement error and verified by classifying 37 arm angle postures.
  • Section 2 outlines the distributed system architecture along with its garment implementation. Sections 3 and 4 present characterization and movement rehabilitation investigations respectively. Finally, Section 5 summarizes the results of this work.
  • The sensing apparatus integrated into a loose-fitting garment consists of a hierarchical processing network.
  • A picture of the garment is shown in Fig. 1a; a schematic drawing of the garment as well as the electronic components is shown in Fig. 1b.
  • the system architecture is shown in Figs. 2a+b:
  • the SMASH hierarchy is in this example aligned to a three-layer stack of standard tasks performed for activity recognition: signal sampling and sensor data preprocessing, feature processing, and pattern recognition.
  • the corresponding functions are implemented in the SMASH units Terminals, Gateways, and a Konnex.
  • A central system master K, the Konnex, is connected to four decentral preprocessing units, the Gateways G, by a wired system bus 1. Together they form the core system 2, which is fully integrated into the textile 3.
  • Each of the four Gateways G provides standardized interfaces 4 to outer peripheral platforms, called Terminals T (sensors).
  • One kind of Terminal is the acceleration Terminal.
  • SMASH System Design
  • SMASH was designed to acquire, evaluate and process signal data, perform an online classification and report results to the user. In this flow, information is represented at different levels of abstraction, such as physical observations, electrical signals, meaningful features and interpreted classification outputs.
  • One design goal for SMASH was the distribution of processing tasks onto different computation units in a hierarchical way. Hence, data are separated and processed in three layers according to their level of abstraction: signals, features, classification. An overview is depicted in Fig. 2a+b.
  • Terminals convert physical observations into an electrical signal representation. Signals are filtered and translated into a standardized format for further processing. Terminals thus act on the signal level and form the first processing layer. Afterwards, a Terminal sends collected data to one of the Gateways over a wired connection (terminal bus 5). Gateways acquire data from several attached Terminals and fuse them in order to extract meaningful features. Gateways are the second processing layer, on the feature level. Gateways send features to the Konnex, which is the last processing layer. It takes the features as an input for an online classification and reports the results to the user. Especially in the case of the Terminals, the aim was to achieve a minimal PCB size; hence, all electronic components were selected according to their chip area.
  • Posture classes are discriminated on the Konnex by a Nearest Centroid Classifier.
  • The unit is able to perform a sample-wise real-time classification with seven active acceleration Terminals and a sampling frequency of 16 Hz.
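  • Illustrative only: the patent names a Nearest Centroid Classifier on the Konnex. The Python sketch below (not part of the original disclosure) shows the general technique on feature vectors: training computes one mean vector per posture class, and each new sample is assigned to the nearest centroid. Class and function names are assumptions.

```python
import numpy as np

class NearestCentroidClassifier:
    """Assign each feature vector to the class whose centroid is closest."""

    def fit(self, X, y):
        X, y = np.asarray(X, float), np.asarray(y)
        self.classes_ = np.unique(y)
        self.centroids_ = np.array([X[y == c].mean(axis=0) for c in self.classes_])
        return self

    def predict(self, X):
        X = np.atleast_2d(np.asarray(X, float))
        # Euclidean distance of every sample to every class centroid.
        d = np.linalg.norm(X[:, None, :] - self.centroids_[None, :, :], axis=2)
        return self.classes_[np.argmin(d, axis=1)]

# Toy example: two posture classes described by 3-axis gravity features.
X_train = [[0, 0, 1], [0.1, 0, 0.9], [1, 0, 0], [0.9, 0.1, 0]]
y_train = ["arm_down", "arm_down", "arm_horizontal", "arm_horizontal"]
clf = NearestCentroidClassifier().fit(X_train, y_train)
print(clf.predict([[0.05, 0.0, 0.95]]))  # ['arm_down']
```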
  • SMASH can alternatively be configured to operate as a pure data acquisition hardware system. Gathered sensor data can either be stored in non-volatile memory or sent to an external host via an integrated IEEE 802.15.4 compatible Bluetooth module. Locally developed data recording software receives, visualizes and stores the data.
  • 2.3 Terminals
  • Terminals are miniature and lightweight sensors or actuators, connected to the core system via a two-wire I²C bus (see Fig. 4).
  • SMASH is able to detect plugged Terminals and identify their services during runtime. For this, the available I²C address space was segmented; each segment is reserved for one type of Terminal. Gateways poll the available address space by sending a short ping message to every address. Newly connected Terminals respond to that ping message and are thereby identified.
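  • Illustrative only: a simplified Python sketch (not part of the original disclosure) of the runtime discovery described above, where a Gateway pings every address in a segment reserved for one Terminal type. The segment boundaries and the `i2c_ping` callable are invented for illustration; the real bus access is hardware-specific.

```python
# Hedged sketch of Terminal discovery by polling a segmented I2C address space.
ADDRESS_SEGMENTS = {
    "acceleration": range(0x10, 0x20),
    "adc": range(0x20, 0x30),
    "io": range(0x30, 0x38),
}

def discover_terminals(i2c_ping):
    """Poll every reserved address; Terminals that answer the ping are
    registered together with the type implied by their address segment."""
    found = {}
    for terminal_type, addresses in ADDRESS_SEGMENTS.items():
        for addr in addresses:
            if i2c_ping(addr):          # short ping message, True if acknowledged
                found[addr] = terminal_type
    return found

# Example with a fake bus where two Terminals are plugged in.
plugged = {0x11, 0x21}
print(discover_terminals(lambda addr: addr in plugged))
# {17: 'acceleration', 33: 'adc'}
```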
  • Each implemented kind of Terminal is equipped with an ATmega48 8-bit microprocessor for basic preprocessing. It offers all required functionality and interfaces with a size of 5x5 mm. The following types of Terminals have been implemented:
  • 3D-accelerometer Terminals with a size of 8x10 mm are used to calculate the orientation of body segments by means of gravity vectors.
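  • Illustrative only: one common way to derive a segment's inclination from a static 3-axis accelerometer reading is to treat the measured vector as gravity and compute tilt angles, as in the Python sketch below (not part of the original disclosure). The axis conventions are assumptions, not the exact computation claimed here.

```python
import math

def inclination_from_gravity(ax, ay, az):
    """Tilt of the sensor's Z axis with respect to gravity, plus the
    rotation of the measured gravity vector in the X-Y plane.

    ax, ay, az: static accelerometer readings in g.
    Returns (inclination_deg, rotation_deg).
    """
    inclination = math.degrees(math.acos(az / math.sqrt(ax * ax + ay * ay + az * az)))
    rotation = math.degrees(math.atan2(ay, ax))
    return inclination, rotation

# Arm hanging down (sensor Z axis aligned with gravity): ~0 degrees inclination.
print(inclination_from_gravity(0.0, 0.0, 1.0))
# Arm abducted to horizontal: ~90 degrees inclination.
print(inclination_from_gravity(1.0, 0.0, 0.0))
```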
  • An ADC Terminal contains four analog-digital converters with a resolution of 10 bit.
  • Each ADC Terminal has a total size of 8x8 mm and can be equipped with custom pull-down resistors to be configured for specific measuring ranges.
  • ADC Terminals are used to gather the temperature and resistance of the user's skin or light conditions.
  • An I/O interface with four input buttons has four additional LEDs to signal events.
  • the LEDs are currently configured to shine if a button is pressed.
  • the main task of the Gateways is to provide an interface between the core system and remote Terminals.
  • A special issue was the placement of the units on the garment. The goal was to permit a balanced distribution of Terminals over the whole body with a maximal cable length of 85 cm.
  • Two Gateways are located at the right and left upper arm to reach the upper body and limbs.
  • a third Gateway was placed at the back in order to reach upper body locations and the head.
  • a final Gateway was placed at the lower waist, to reach the legs.
  • The positions of the Gateways are indicated in Figs. 1a+b and 3.
  • Each Gateway is equipped with four sockets where Terminals can be connected. Hubs extend the number of Terminals attachable to a single Gateway to 127; with four Gateways, the system can hence be equipped with about 500 Terminals (4 x 127 = 508).
  • a 3D-accelerometer is mounted on every Gateway, to give the core system a basic data gathering capability.
  • The device is internally handled as a virtual acceleration Terminal. The Gateway is equipped with an MSP430F1611 16-bit microcontroller, since it offers a good compromise between available computation power, necessary peripheral interfaces and power consumption.
  • the selected model is equipped with 10KB of RAM, which allows a later porting of an operating system.
  • Gateways and Konnex are connected by a 4-wire Serial Peripheral Interface (SPI) bus in a redundant star topology.
  • The Konnex, as the logical bus master, is able to detect broken signal wires and to restore the connection to unreachable devices via an associated Gateway using static routing.
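  • Illustrative only: a schematic Python sketch (not part of the original disclosure) of the static rerouting idea, where the master falls back to a pre-configured neighbouring Gateway when the direct link is broken. The fallback table and the `direct_link_ok` check are assumptions for illustration.

```python
# Sketch of static re-routing in a redundant star topology.
FALLBACK_ROUTE = {          # unreachable unit -> Gateway used as relay
    "gateway_left_arm": "gateway_back",
    "gateway_right_arm": "gateway_back",
    "gateway_waist": "gateway_back",
    "gateway_back": "gateway_waist",
}

def route_to(unit, direct_link_ok):
    """Return the communication path from the Konnex to a unit,
    using the redundant link if the direct wire is broken."""
    if direct_link_ok(unit):
        return ["konnex", unit]
    relay = FALLBACK_ROUTE[unit]
    return ["konnex", relay, unit]

# Example: the wire to the left-arm Gateway is broken.
print(route_to("gateway_left_arm", lambda u: u != "gateway_left_arm"))
# ['konnex', 'gateway_back', 'gateway_left_arm']
```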
  • the Konnex is the system master for communication, power and data processing. The latter is done by an additional MSP430 microprocessor. For reasons of comfort, the Konnex and the battery are located near to the body's center of mass, at the lower back where the extra weight is hardly noticeable for the wearer.
  • A central power supply generates a system-wide distributed voltage of 3.3 V. It is sourced by a flat, detachable lithium polymer battery. A virtual Terminal is located on the Konnex in order to observe the system voltage. SMASH disables all communication modules and Terminals if the system voltage drops below a critical level. A long-term test to estimate the system's runtime was performed. SMASH was equipped with three acceleration Terminals and performed an online classification of three randomly trained posture classes. The results were sent continuously to an external host PC via the integrated Bluetooth module. For three runs, a battery life far in excess of 14 hours was measured.
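  • Illustrative only: a minimal Python sketch (not part of the original disclosure) of the supply-voltage supervision described above. The cut-off value and the shutdown hooks are assumptions; the patent does not specify the critical level.

```python
CRITICAL_VOLTAGE = 3.0   # assumed cut-off; the nominal supply is 3.3 V

def supervise_supply(read_voltage, disable_terminals, disable_radio):
    """Disable all communication modules and Terminals when the
    system voltage drops below a critical level."""
    v = read_voltage()
    if v < CRITICAL_VOLTAGE:
        disable_radio()
        disable_terminals()
        return "low-power shutdown at %.2f V" % v
    return "ok (%.2f V)" % v

print(supervise_supply(lambda: 3.28, lambda: None, lambda: None))  # ok (3.28 V)
print(supervise_supply(lambda: 2.90, lambda: None, lambda: None))  # low-power shutdown at 2.90 V
```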
  • Table 1 SMASH Terminals that are currently implemented and tested.
  • Table 2 SMASH system units.
  • The plain core system is integrated into the inside of a long-sleeve garment, see Fig. 1a.
  • the base layer of SMASH is a commercial off-the-shelf long-sleeve shirt.
  • The shirt design was chosen as it allows sensors to be attached at various locations, even at the wrists, while being conveniently worn as casual clothing during daily life.
  • the following table shows an example of a SMASH substrate garment:
  • The accuracy of the system is limited by the electrical properties of the ADXL330 accelerometer used.
  • The output of each Terminal's X, Y and Z axis was measured three times for -1 g, 0 g and 1 g, respectively. All acceleration Terminals showed a similar accuracy for the different axes.
  • Table 5 shows the average deviations for the outputs of the axes (for clarity, the deviation was converted from acceleration in g to degrees [°]).
  • the system's resolution is electrically limited to approx. 1°.
  • Table 5 Stability of the Terminals' output, averaged over three readings for -90°, 0° and 90°.
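  • Illustrative only: the conversion from an output deviation in g to an equivalent angle, as mentioned for Table 5, can be approximated near a horizontal axis orientation by the arcsine of the deviation relative to 1 g. The Python sketch below (not part of the original disclosure) shows this standard approximation; it is not claimed to be the exact conversion used.

```python
import math

def deviation_g_to_degrees(delta_g):
    """Convert a small accelerometer output deviation (in g) into the
    equivalent tilt-angle error near a horizontal axis orientation,
    where the axis output changes fastest with angle."""
    return math.degrees(math.asin(max(-1.0, min(1.0, delta_g))))

# A 0.017 g output deviation corresponds to roughly 1 degree.
print(round(deviation_g_to_degrees(0.017), 2))  # ~0.97
```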
  • the accuracy of the overall system was evaluated in a study.
  • A typical movement, the abduction of the right arm, was selected and divided into equidistant steps of 5 degrees.
  • The movement can be exercised in a range from approx. 0° to 180°, which results in 37 different posture classes.
  • A poster showing a semicircle and eccentric beams at angular intervals of 5 degrees was printed.
  • Two female and three male subjects put on SMASH and were equipped with an acceleration Terminal at the right wrist. Subjects were instructed to stand with their back to the poster so that the shoulder joint was aligned in front of the semicircle's center (see Fig. 5). Starting with pointing to the bottom, the right arm was abducted in steps of 5 degrees from 0° to 180°.
  • Each of the 37 postures was held for at least one second and labeled by an assistant. The whole exercise was repeated three times. The subjects were asked to perform random activities between the repetitions in order to realign the garment in a natural way.
  • Fig. 7 shows the average orientation difference (error), depending on the arm's abduction angle (dotted line).
  • the rotation reaches a maximum of about 20° for a flexion of approx. 105° and decreases to 3° if the arm points up.
  • This effect can be explained in two ways. On one hand the sleeve is shifted, if the arm is raised. It becomes tight fitting and aligns the acceleration Terminal in a similar way to the body. On the other hand the influence of the arm's rotation decreases if the arm is raised. In case the arm points up, the gravity vector is orthogonal to the rotation-sensitive Y-axis.
  • Fig. 7 depicts the averaged angular deviations, if the Y-axis is masked. Hence, the system's average angular accuracy in the poster's plane is better than 7.5°. The classification was repeated without both Y-axes in order to avoid the rotation's influence. A final user-specific accuracy of 88% was reached.
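  • Illustrative only: a Python sketch (not part of the original disclosure) of how the rotation-sensitive Y axis can be excluded when estimating the abduction angle in the poster's plane, as described above. The mapping of accelerometer axes to arm directions is an assumption for illustration.

```python
import math

def abduction_angle_in_plane(ax, az):
    """Arm abduction angle in the frontal (poster) plane, computed from the
    X and Z accelerometer axes only, so that rotation about the arm's long
    axis (reflected mainly in the Y axis) does not influence the estimate."""
    return math.degrees(math.atan2(ax, az)) % 360.0

print(abduction_angle_in_plane(0.0, 1.0))   # 0 degrees: arm pointing down
print(abduction_angle_in_plane(1.0, 0.0))   # 90 degrees: arm horizontal
```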
  • Fig. 9 shows the sensor positioning.
  • A set of exercises commonly used for rehabilitation training of the shoulder and elbow joint was selected for the investigation. Table 3 summarizes the individual postures. Each exercise begins with the user standing upright, arms relaxed. This is indicated as the normal position (class 1 in Tab. 6). Fig.
  • Table 6 Rehabilitation exercises included in the study. Each exercise consists of one or several postures.
  • The classification performance for the three training modes was analyzed: user-specific, user-adapted and user-independent.
  • training and testing was performed on the posture instances from each user individually.
  • a leave-one-out cross-validation was used.
  • Posture instances from all users were selected for the training and testing sets.
  • a threefold cross-validation was used.
  • The user-adapted evaluation includes instances from more than one subject and is an intermediate step between the user-specific and user-independent modes.
  • The user-independent mode is typically the hardest test.
  • The classification performance is analyzed for postures from a user whose postures were not included in the algorithm training. This test indicates the performance of the system when used with a new person. For this mode, a cross-validation over the number of users (eight) was used.
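  • Illustrative only: the three training modes correspond to standard cross-validation splits, sketched below in Python (not part of the original disclosure). The splitters are implemented by hand so no external API is presumed; user IDs and fold counts are placeholders.

```python
import numpy as np

def user_specific_folds(user_ids, user):
    """Leave-one-out over the posture instances of a single user."""
    idx = np.where(user_ids == user)[0]
    for i in idx:
        yield np.setdiff1d(idx, [i]), np.array([i])

def user_adapted_folds(n_samples, n_folds=3, seed=0):
    """Threefold cross-validation over instances pooled from all users."""
    rng = np.random.default_rng(seed)
    folds = np.array_split(rng.permutation(n_samples), n_folds)
    for k in range(n_folds):
        train = np.concatenate([folds[j] for j in range(n_folds) if j != k])
        yield train, folds[k]

def user_independent_folds(user_ids):
    """Leave-one-user-out: test on a user whose postures were never trained on."""
    for user in np.unique(user_ids):
        yield np.where(user_ids != user)[0], np.where(user_ids == user)[0]

# Example with 3 users and 2 instances each.
users = np.array([1, 1, 2, 2, 3, 3])
print(sum(1 for _ in user_independent_folds(users)))  # 3 folds, one per user
```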
  • Fig. 10 depicts the performances of all training modes along with minimum and maximum values. The values indicate the range of results for each user in the user-specific and user-independent evaluations. For the user-adapted case, the result variance from the cross-validation folds is shown. As expected, this variance is very low, since instances from all users are used for training and testing in this mode.
  • Figs. 11 and 12 show the classifier confusion matrices for the user-specific and user-independent cases, respectively. This class confusion provides an indication of the misclassified postures. The overall good classification of each class is shown by the main diagonal being close to one.
  • the rehabilitation exercises can be successfully classified with good accuracy, even for the more challenging user-adapted and user-independent training modes. This indicates that the movement of the textile does not prevent the accurate classification of the selected exercises with simple acceleration sensors.
  • SMASH is a novel sensing and processing platform integrated into an upper-body garment.
  • The system processes sensor data in a three-layer distributed architecture.
  • remote sensing terminals are used, features are extracted by local gateways and a central system master performs classification tasks.
  • Sensor data processing and classification as well as hot-plug capabilities were implemented in the system.
  • accelerometer sensors were used. However, the system is designed to support different sensing modalities.
  • the system performance was evaluated in two independent studies.
  • The first evaluation addressed the sensor resolution in a characterization experiment with five users. It was found that the system was able to detect angle changes at a resolution of 7.5°. This result was confirmed by posture classification at 5° and 10° resolution. The procedure has potential for the validation of similar textile systems in the future.

Landscapes

  • Health & Medical Sciences (AREA)
  • Life Sciences & Earth Sciences (AREA)
  • Engineering & Computer Science (AREA)
  • Physics & Mathematics (AREA)
  • Veterinary Medicine (AREA)
  • Pathology (AREA)
  • Biomedical Technology (AREA)
  • Heart & Thoracic Surgery (AREA)
  • Medical Informatics (AREA)
  • Molecular Biology (AREA)
  • Surgery (AREA)
  • Animal Behavior & Ethology (AREA)
  • General Health & Medical Sciences (AREA)
  • Public Health (AREA)
  • Biophysics (AREA)
  • Physiology (AREA)
  • Oral & Maxillofacial Surgery (AREA)
  • Dentistry (AREA)
  • Artificial Intelligence (AREA)
  • Theoretical Computer Science (AREA)
  • Computer Hardware Design (AREA)
  • Psychiatry (AREA)
  • Computer Vision & Pattern Recognition (AREA)
  • Mathematical Physics (AREA)
  • Signal Processing (AREA)
  • Fuzzy Systems (AREA)
  • Human Computer Interaction (AREA)
  • General Engineering & Computer Science (AREA)
  • General Physics & Mathematics (AREA)
  • Evolutionary Computation (AREA)
  • Professional, Industrial, Or Sporting Protective Garments (AREA)

Abstract

The invention is in the field of smart textiles for posture classification. The invention concerns a garment, in particular an upper-body garment, comprising an apparatus for detecting the orientation of at least one body segment, integrated into the garment. The apparatus comprises a plurality of sensing terminals and at least one processing unit in communication with the terminals. The garment is loose-fitting.
PCT/EP2009/001864 2008-03-14 2009-03-13 Garment-integrated apparatus for the detection, analysis and online feedback of/concerning body posture and movements WO2009112281A1 (fr)

Applications Claiming Priority (2)

Application Number Priority Date Filing Date Title
EP08004866 2008-03-14
EP08004866.3 2008-03-14

Publications (1)

Publication Number Publication Date
WO2009112281A1 true WO2009112281A1 (fr) 2009-09-17

Family

ID=40848563

Family Applications (1)

Application Number Title Priority Date Filing Date
PCT/EP2009/001864 WO2009112281A1 (fr) 2008-03-14 2009-03-13 Appareil intégré à un vêtement pour la détection, l'analyse et le retour d'informations en ligne de/concernant la position et les mouvements du corps

Country Status (1)

Country Link
WO (1) WO2009112281A1 (fr)

Cited By (31)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
EP2508127A1 (fr) 2011-04-06 2012-10-10 Nederlandse Organisatie voor toegepast -natuurwetenschappelijk onderzoek TNO Procédé et système d'évaluation de la posture
US8384551B2 (en) 2008-05-28 2013-02-26 MedHab, LLC Sensor device and method for monitoring physical stresses placed on a user
WO2013040294A1 (fr) * 2011-09-16 2013-03-21 Veldman Bernie T Méthode et appareil de prévention contre la posture en w
US8565109B1 (en) 2010-01-29 2013-10-22 University Of Washington Through Its Center Of Commercialization Optimization of polling protocols in sensor networks
US8945328B2 (en) 2012-09-11 2015-02-03 L.I.F.E. Corporation S.A. Methods of making garments having stretchable and conductive ink
US8948839B1 (en) 2013-08-06 2015-02-03 L.I.F.E. Corporation S.A. Compression garments having stretchable and conductive ink
WO2015017712A1 (fr) * 2013-07-31 2015-02-05 Sensoria Inc Procédés et systèmes de collecte de données, d'analyse et de formulation de retour d'informations spécifique d'utilisateur, et utilisation de systèmes de détection en tant que dispositifs d'entrée
US9078478B2 (en) 2012-07-09 2015-07-14 Medlab, LLC Therapeutic sleeve device
JP2015159950A (ja) * 2014-02-27 2015-09-07 陽陽 任 呼吸及び/又は脈拍測定装置
US9265641B2 (en) 2011-09-16 2016-02-23 Bernie T. Veldman Method and apparatus for discouraging W-sitting
US9282897B2 (en) 2012-02-13 2016-03-15 MedHab, LLC Belt-mounted movement sensor system
US9282893B2 (en) 2012-09-11 2016-03-15 L.I.F.E. Corporation S.A. Wearable communication platform
JP5891286B1 (ja) * 2014-10-28 2016-03-22 東芝電波プロダクツ株式会社 状態情報収集システム
WO2016097655A1 (fr) 2014-12-18 2016-06-23 Universite Grenoble Alpes Systeme et procede de controle du mouvement cyclique d'un segment corporel d'un individu
WO2016112126A1 (fr) * 2015-01-06 2016-07-14 Asensei, Inc. Entretien de la forme physique à base de mouvements et gestion de produit de forme physique
CN105877757A (zh) * 2016-03-30 2016-08-24 哈尔滨理工大学 多传感器集成的人体运动姿态捕获与识别装置
WO2017140537A1 (fr) 2016-02-16 2017-08-24 Koninklijke Philips N.V. Dispositif de cordon, procédé et système de surveillance de cordon personnel
US9817440B2 (en) 2012-09-11 2017-11-14 L.I.F.E. Corporation S.A. Garments having stretchable and conductive ink
US9858773B2 (en) 2014-02-19 2018-01-02 Microsoft Technology Licensing, Llc Wearable computer having a skin-stimulating interface
US10154791B2 (en) 2016-07-01 2018-12-18 L.I.F.E. Corporation S.A. Biometric identification by garments having a plurality of sensors
US10159440B2 (en) 2014-03-10 2018-12-25 L.I.F.E. Corporation S.A. Physiological monitoring garments
US10201310B2 (en) 2012-09-11 2019-02-12 L.I.F.E. Corporation S.A. Calibration packaging apparatuses for physiological monitoring garments
US20190175068A1 (en) * 2016-08-19 2019-06-13 Gowerlabs Limited Measuring Apparatus and Device for Measuring Changes in Chromophore Concentration
US10366593B2 (en) 2017-02-08 2019-07-30 Google Llc Ergonomic assessment garment
US10462898B2 (en) 2012-09-11 2019-10-29 L.I.F.E. Corporation S.A. Physiological monitoring garments
US10467744B2 (en) 2014-01-06 2019-11-05 L.I.F.E. Corporation S.A. Systems and methods to automatically determine garment fit
JP2019208623A (ja) * 2018-05-31 2019-12-12 トヨタ自動車株式会社 生体センサが取り付けられた被服
CN110998561A (zh) * 2017-08-30 2020-04-10 R-e株式会社 穿着物设计信息检索系统及套装设计信息检索系统
US11013275B2 (en) 2012-09-11 2021-05-25 L.I.F.E. Corporation S.A. Flexible fabric ribbon connectors for garments with sensors and electronics
US11246213B2 (en) 2012-09-11 2022-02-08 L.I.F.E. Corporation S.A. Physiological monitoring garments
JP2022148816A (ja) * 2021-03-24 2022-10-06 株式会社日立製作所 姿勢認識システム、姿勢認識方法及びプログラム

Citations (1)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US20070089800A1 (en) * 2005-10-24 2007-04-26 Sensatex, Inc. Fabrics and Garments with Information Infrastructure

Patent Citations (1)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US20070089800A1 (en) * 2005-10-24 2007-04-26 Sensatex, Inc. Fabrics and Garments with Information Infrastructure

Non-Patent Citations (6)

* Cited by examiner, † Cited by third party
Title
EDMISON J ET AL: "E-Textile Based Automatic Activity Diary for Medical Annotation and Analysis", WEARABLE AND IMPLANTABLE BODY SENSOR NETWORKS, 2006. BSN 2006. INTERNA TIONAL WORKSHOP ON CAMBRIDGE, MA, USA 03-05 APRIL 2006, PISCATAWAY, NJ, USA,IEEE, 3 April 2006 (2006-04-03), pages 131 - 134, XP010911493, ISBN: 978-0-7695-2547-1 *
JUNKER H ET AL: "PadNET: wearable physical activity detection network", WEARABLE COMPUTERS, 2003. PROCEEDINGS. SEVENTH IEEE INTERNATIONAL SYMP OSIUM ON 21-23 OCT. 2003, PISCATAWAY, NJ, USA,IEEE, 21 October 2003 (2003-10-21), pages 244 - 245, XP010673792, ISBN: 978-0-7695-2034-6 *
MATTMANN C ET AL: "Recognizing Upper Body Postures using Textile Strain Sensors", WEARABLE COMPUTERS, 2007 11TH IEEE INTERNATIONAL SYMPOSIUM ON, IEEE, PI, 11 October 2007 (2007-10-11), pages 29 - 36, XP031219949, ISBN: 978-1-4244-1452-9 *
PIERO ZAPPI ET AL: "Activity recognition from on-body sensors by classifier fusion: sensor scalability and robustness", INTELLIGENT SENSORS, SENSOR NETWORKS AND INFORMATION, 2007. ISSNIP 2007. 3RD INTERNATIONAL CONFERENCE ON, IEEE, PISCATAWAY, NJ, USA, 3 December 2007 (2007-12-03), pages 281 - 286, XP031245752, ISBN: 978-1-4244-1501-4 *
VAN LAERHOVEN K ET AL: "Towards a wearable inertial sensor network", IEE EUROWEARABLE '03 IEE LONDON, UK, 2004, pages 125 - 130, XP002538247, ISBN: 0-85296-282-7 *
WADE E ET AL: "Cable-free wearable systems using conductive fabrics transmitting signals and power", PROCEEDINGS OF THE SPIE - THE INTERNATIONAL SOCIETY FOR OPTICAL ENGINEERING SPIE-INT. SOC. OPT. ENG USA, vol. 5758, no. 1, 2005, pages 285 - 295, XP002538248, ISSN: 0277-786X *

Cited By (47)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US8384551B2 (en) 2008-05-28 2013-02-26 MedHab, LLC Sensor device and method for monitoring physical stresses placed on a user
US8565109B1 (en) 2010-01-29 2013-10-22 University Of Washington Through Its Center Of Commercialization Optimization of polling protocols in sensor networks
US9078259B2 (en) 2010-01-29 2015-07-07 University Of Washington Through Its Center For Commercialization Optimization of polling protocols in sensor networks
EP2508127A1 (fr) 2011-04-06 2012-10-10 Nederlandse Organisatie voor toegepast -natuurwetenschappelijk onderzoek TNO Procédé et système d'évaluation de la posture
WO2013040294A1 (fr) * 2011-09-16 2013-03-21 Veldman Bernie T Méthode et appareil de prévention contre la posture en w
US9265641B2 (en) 2011-09-16 2016-02-23 Bernie T. Veldman Method and apparatus for discouraging W-sitting
US9282897B2 (en) 2012-02-13 2016-03-15 MedHab, LLC Belt-mounted movement sensor system
US9078478B2 (en) 2012-07-09 2015-07-14 Medlab, LLC Therapeutic sleeve device
US10045439B2 (en) 2012-09-11 2018-08-07 L.I.F.E. Corporation S.A. Garments having stretchable and conductive ink
US9986771B2 (en) 2012-09-11 2018-06-05 L.I.F.E. Corporation S.A. Garments having stretchable and conductive ink
US9282893B2 (en) 2012-09-11 2016-03-15 L.I.F.E. Corporation S.A. Wearable communication platform
US11246213B2 (en) 2012-09-11 2022-02-08 L.I.F.E. Corporation S.A. Physiological monitoring garments
US11013275B2 (en) 2012-09-11 2021-05-25 L.I.F.E. Corporation S.A. Flexible fabric ribbon connectors for garments with sensors and electronics
US10736213B2 (en) 2012-09-11 2020-08-04 L.I.F.E. Corporation S.A. Physiological monitoring garments
US10462898B2 (en) 2012-09-11 2019-10-29 L.I.F.E. Corporation S.A. Physiological monitoring garments
US10258092B2 (en) 2012-09-11 2019-04-16 L.I.F.E. Corporation S.A. Garments having stretchable and conductive ink
US9817440B2 (en) 2012-09-11 2017-11-14 L.I.F.E. Corporation S.A. Garments having stretchable and conductive ink
US10201310B2 (en) 2012-09-11 2019-02-12 L.I.F.E. Corporation S.A. Calibration packaging apparatuses for physiological monitoring garments
US8945328B2 (en) 2012-09-11 2015-02-03 L.I.F.E. Corporation S.A. Methods of making garments having stretchable and conductive ink
WO2015017712A1 (fr) * 2013-07-31 2015-02-05 Sensoria Inc Procédés et systèmes de collecte de données, d'analyse et de formulation de retour d'informations spécifique d'utilisateur, et utilisation de systèmes de détection en tant que dispositifs d'entrée
US8948839B1 (en) 2013-08-06 2015-02-03 L.I.F.E. Corporation S.A. Compression garments having stretchable and conductive ink
US10699403B2 (en) 2014-01-06 2020-06-30 L.I.F.E. Corporation S.A. Systems and methods to automatically determine garment fit
US10467744B2 (en) 2014-01-06 2019-11-05 L.I.F.E. Corporation S.A. Systems and methods to automatically determine garment fit
US9858773B2 (en) 2014-02-19 2018-01-02 Microsoft Technology Licensing, Llc Wearable computer having a skin-stimulating interface
JP2015159950A (ja) * 2014-02-27 2015-09-07 陽陽 任 呼吸及び/又は脈拍測定装置
US10159440B2 (en) 2014-03-10 2018-12-25 L.I.F.E. Corporation S.A. Physiological monitoring garments
JP5891286B1 (ja) * 2014-10-28 2016-03-22 東芝電波プロダクツ株式会社 状態情報収集システム
WO2016097655A1 (fr) 2014-12-18 2016-06-23 Universite Grenoble Alpes Systeme et procede de controle du mouvement cyclique d'un segment corporel d'un individu
GB2551062A (en) * 2015-01-06 2017-12-06 Asensei Inc Movement based fitness and fitness product management
WO2016112126A1 (fr) * 2015-01-06 2016-07-14 Asensei, Inc. Entretien de la forme physique à base de mouvements et gestion de produit de forme physique
US10360811B2 (en) 2015-01-06 2019-07-23 Asensei, Inc. Movement based fitness and fitness product management
US11978355B2 (en) 2015-01-06 2024-05-07 Asensei, Inc Movement based fitness and fitness product management
GB2551062B (en) * 2015-01-06 2019-11-13 Asensei Inc Movement based fitness and fitness product management
US11302214B2 (en) 2015-01-06 2022-04-12 Asensei, Inc. Movement based fitness and fitness product management
WO2017140537A1 (fr) 2016-02-16 2017-08-24 Koninklijke Philips N.V. Dispositif de cordon, procédé et système de surveillance de cordon personnel
US11205335B2 (en) 2016-02-16 2021-12-21 Lifeline Systems Company Lanyard device, method and personal lanyard monitoring system
CN105877757A (zh) * 2016-03-30 2016-08-24 哈尔滨理工大学 多传感器集成的人体运动姿态捕获与识别装置
US10869620B2 (en) 2016-07-01 2020-12-22 L.I.F.E. Corporation S.A. Biometric identification by garments having a plurality of sensors
US10154791B2 (en) 2016-07-01 2018-12-18 L.I.F.E. Corporation S.A. Biometric identification by garments having a plurality of sensors
US20190175068A1 (en) * 2016-08-19 2019-06-13 Gowerlabs Limited Measuring Apparatus and Device for Measuring Changes in Chromophore Concentration
US10600304B2 (en) 2017-02-08 2020-03-24 Google Llc Ergonomic assessment garment
US11151857B2 (en) 2017-02-08 2021-10-19 Google Llc Ergonomic assessment garment
US10366593B2 (en) 2017-02-08 2019-07-30 Google Llc Ergonomic assessment garment
CN110998561A (zh) * 2017-08-30 2020-04-10 R-e株式会社 穿着物设计信息检索系统及套装设计信息检索系统
JP2019208623A (ja) * 2018-05-31 2019-12-12 トヨタ自動車株式会社 生体センサが取り付けられた被服
JP2022148816A (ja) * 2021-03-24 2022-10-06 株式会社日立製作所 姿勢認識システム、姿勢認識方法及びプログラム
JP7495370B2 (ja) 2021-03-24 2024-06-04 株式会社日立製作所 姿勢認識システム、姿勢認識方法及びプログラム

Similar Documents

Publication Publication Date Title
Harms et al. Smash: A distributed sensing and processing garment for the classification of upper body postures
WO2009112281A1 (fr) Appareil intégré à un vêtement pour la détection, l'analyse et le retour d'informations en ligne de/concernant la position et les mouvements du corps
US10716510B2 (en) Smart clothing with converging/diverging bend or stretch sensors for measuring body motion or configuration
Nguyen et al. A wearable sensing system for tracking and monitoring of functional arm movement
Cho Smart clothing: technology and applications
Esfahani et al. A “smart” undershirt for tracking upper body motions: Task classification and angle estimation
US20170042454A1 (en) Modular physical activity monitoring system
Jin et al. Soft sensing shirt for shoulder kinematics estimation
Abdelhady et al. A high-fidelity wearable system for measuring lower-limb kinetics and kinematics
Veltink et al. Wearable technology for biomechanics: e-textile or micromechanical sensors?[conversations in bme]
Kumar et al. Electronics in textiles and clothing: design, products and applications
Yang et al. Smart wearable monitoring system based on multi-type sensors for motion recognition
Harms et al. Rapid prototyping of smart garments for activity-aware applications
Huang et al. Sensor-Based wearable systems for monitoring human motion and posture: A review
Mohammadzadeh et al. Feasibility of a wearable, sensor-based motion tracking system
Saggio et al. Sensory systems for human body gesture recognition and motion capture
Harms et al. Influence of a loose-fitting sensing garment on posture recognition in rehabilitation
Ma et al. A soft capacitive wearable sensing system for lower-limb motion monitoring
Tognetti et al. Wearable kinesthetic systems for capturing and classifying body posture and gesture
Paradiso et al. Smart textile suit
Nesenbergs Architecture of smart clothing for standardized wearable sensor systems
Shenoy et al. Design and validation of an IMU based full hand kinematic measurement system
Rezaei et al. Towards user-friendly wearable platforms for monitoring unconstrained indoor and outdoor activities
Younes et al. The design of smart garments for motion capture and activity classification
Borghetti et al. Validation of a modular and wearable system for tracking fingers movements

Legal Events

Date Code Title Description
121 Ep: the epo has been informed by wipo that ep was designated in this application

Ref document number: 09720689

Country of ref document: EP

Kind code of ref document: A1

NENP Non-entry into the national phase

Ref country code: DE

122 Ep: pct application non-entry in european phase

Ref document number: 09720689

Country of ref document: EP

Kind code of ref document: A1