SE1950879A1 - Torso-mounted accelerometer signal reconstruction - Google Patents

Torso-mounted accelerometer signal reconstruction

Info

Publication number
SE1950879A1
Authority
SE
Sweden
Prior art keywords
motion data
recorded
individual
accelerometer
torso
Prior art date
Application number
SE1950879A
Other languages
Swedish (sv)
Inventor
Mohammed El-Beltagy
Original Assignee
Wememove Ab
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by Wememove Ab filed Critical Wememove Ab
Priority to SE1950879A priority Critical patent/SE1950879A1/en
Priority to EP20744177.5A priority patent/EP4061215A1/en
Priority to PCT/SE2020/050619 priority patent/WO2021006790A1/en
Publication of SE1950879A1 publication Critical patent/SE1950879A1/en

Classifications

    • A HUMAN NECESSITIES
    • A61 MEDICAL OR VETERINARY SCIENCE; HYGIENE
    • A61B DIAGNOSIS; SURGERY; IDENTIFICATION
    • A61B 5/00 Measuring for diagnostic purposes; Identification of persons
    • A61B 5/103 Detecting, measuring or recording devices for testing the shape, pattern, colour, size or movement of the body or parts thereof, for diagnostic purposes
    • A61B 5/11 Measuring movement of the entire body or parts thereof, e.g. head or hand tremor, mobility of a limb
    • A HUMAN NECESSITIES
    • A61 MEDICAL OR VETERINARY SCIENCE; HYGIENE
    • A61B DIAGNOSIS; SURGERY; IDENTIFICATION
    • A61B 5/00 Measuring for diagnostic purposes; Identification of persons
    • A61B 5/68 Arrangements of detecting, measuring or recording means, e.g. sensors, in relation to patient
    • A61B 5/6801 Arrangements of detecting, measuring or recording means, e.g. sensors, in relation to patient specially adapted to be attached to or worn on the body surface
    • A61B 5/6813 Specially adapted to be attached to a specific body part
    • A61B 5/6814 Head
    • A61B 5/6815 Ear
    • A HUMAN NECESSITIES
    • A61 MEDICAL OR VETERINARY SCIENCE; HYGIENE
    • A61B DIAGNOSIS; SURGERY; IDENTIFICATION
    • A61B 5/00 Measuring for diagnostic purposes; Identification of persons
    • A61B 5/72 Signal processing specially adapted for physiological signals or for diagnostic purposes
    • A61B 5/7271 Specific aspects of physiological measurement analysis
    • A61B 5/7278 Artificial waveform generation or derivation, e.g. synthesising signals from measured signals
    • G PHYSICS
    • G06 COMPUTING; CALCULATING OR COUNTING
    • G06F ELECTRIC DIGITAL DATA PROCESSING
    • G06F 3/00 Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
    • G06F 3/01 Input arrangements or combined input and output arrangements for interaction between user and computer
    • G06F 3/011 Arrangements for interaction with the human body, e.g. for user immersion in virtual reality
    • A HUMAN NECESSITIES
    • A61 MEDICAL OR VETERINARY SCIENCE; HYGIENE
    • A61B DIAGNOSIS; SURGERY; IDENTIFICATION
    • A61B 5/00 Measuring for diagnostic purposes; Identification of persons
    • A61B 5/68 Arrangements of detecting, measuring or recording means, e.g. sensors, in relation to patient
    • A61B 5/6801 Arrangements of detecting, measuring or recording means, e.g. sensors, in relation to patient specially adapted to be attached to or worn on the body surface
    • A61B 5/6813 Specially adapted to be attached to a specific body part
    • A61B 5/6823 Trunk, e.g., chest, back, abdomen, hip
    • G PHYSICS
    • G06 COMPUTING; CALCULATING OR COUNTING
    • G06V IMAGE OR VIDEO RECOGNITION OR UNDERSTANDING
    • G06V 40/00 Recognition of biometric, human-related or animal-related patterns in image or video data
    • G06V 40/20 Movements or behaviour, e.g. gesture recognition
    • G06V 40/23 Recognition of whole body movements, e.g. for sport training
    • G PHYSICS
    • G16 INFORMATION AND COMMUNICATION TECHNOLOGY [ICT] SPECIALLY ADAPTED FOR SPECIFIC APPLICATION FIELDS
    • G16H HEALTHCARE INFORMATICS, i.e. INFORMATION AND COMMUNICATION TECHNOLOGY [ICT] SPECIALLY ADAPTED FOR THE HANDLING OR PROCESSING OF MEDICAL OR HEALTHCARE DATA
    • G16H 50/00 ICT specially adapted for medical diagnosis, medical simulation or medical data mining; ICT specially adapted for detecting, monitoring or modelling epidemics or pandemics
    • G16H 50/50 ICT specially adapted for medical diagnosis, medical simulation or medical data mining; ICT specially adapted for detecting, monitoring or modelling epidemics or pandemics for simulation or modelling of medical disorders

Landscapes

  • Health & Medical Sciences (AREA)
  • Life Sciences & Earth Sciences (AREA)
  • Engineering & Computer Science (AREA)
  • Physics & Mathematics (AREA)
  • Molecular Biology (AREA)
  • Surgery (AREA)
  • Veterinary Medicine (AREA)
  • Public Health (AREA)
  • General Health & Medical Sciences (AREA)
  • Biophysics (AREA)
  • Pathology (AREA)
  • Biomedical Technology (AREA)
  • Heart & Thoracic Surgery (AREA)
  • Medical Informatics (AREA)
  • Animal Behavior & Ethology (AREA)
  • Physiology (AREA)
  • Oral & Maxillofacial Surgery (AREA)
  • Dentistry (AREA)
  • General Engineering & Computer Science (AREA)
  • Theoretical Computer Science (AREA)
  • Computer Vision & Pattern Recognition (AREA)
  • Artificial Intelligence (AREA)
  • Signal Processing (AREA)
  • Psychiatry (AREA)
  • Human Computer Interaction (AREA)
  • General Physics & Mathematics (AREA)
  • Otolaryngology (AREA)
  • Measurement Of The Respiration, Hearing Ability, Form, And Blood Characteristics Of Living Organisms (AREA)

Abstract

TORSO-MOUNTED ACCELEROMETER SIGNAL RECONSTRUCTION

The present disclosure relates to a device and method for torso-mounted accelerometer signal reconstruction. In an aspect, a method of acquiring recorded motion data of an individual (100) is provided. The method comprises acquiring (S101) motion data of the individual (100) recorded with a first torso-attached accelerometer (101), acquiring (S102) motion data of the individual (100) recorded with a second accelerometer (103) attached to a different part of the individual (100) than the torso, and determining (S103) a mapping function configured to map the motion data recorded with the second accelerometer (103) to the motion data recorded with the first accelerometer (101) for subsequently reconstructing torso-recorded motion data from motion data being recorded with the second accelerometer (103) and processed by the determined mapping function.

Description

TORSO-MOUNTED ACCELEROMETER SIGNAL RECONSTRUCTION

TECHNICAL FIELD
[001] The present disclosure relates to a device and method for torso-mounted accelerometer signal reconstruction.
BACKGROUND

[002] Torso-mounted accelerometers are used in sports for motion detection and analysis with the aim to improve the movement pattern of a wearer of the accelerometer during exercising such as for instance running or cross-country skiing. Motion data is recorded and subsequently processed to provide an animation of the wearer's motion pattern during the exercise for review by the wearer, possibly with feedback to the wearer on improving actions to be taken, such as a proposed change of stride length, an instruction to run in a more upright manner, to more aggressively pivot the arms back and forth, etc.
[003] There is a good rationale to have a torso-mounted accelerometer for motion detection and analysis. Torso-mounted accelerometers tend to sit quite closely to the centre of mass of the wearer's body, and hence can capture a great deal of motion nuances.
[004] However, torso-mounted accelerometers are rather inconvenient as they typically must be fastened and adjusted with a strap over the wearer's chest and maintained in that position throughout the exercise.
SUMMARY
[005] One object is to solve, or at least mitigate, this problem in the art and thus to provide an improved method of acquiring recorded motion signals of an individual.
[006] This object is attained in a first aspect by a method of acquiring recorded motion data of an individual. The method comprises acquiring motion data of the individual recorded with a first torso-attached accelerometer, acquiring motion data of the individual recorded with a second accelerometer attached to a different part of the individual than the torso, and determining a mapping function configured to map the motion data recorded with the second accelerometer to the motion data recorded with the first accelerometer for subsequently reconstructing torso-recorded motion data from motion data being recorded with the second accelerometer and processed by the determined mapping function.
[007] This object is attained in a second aspect by a device configured to acquire recorded motion data of an individual. The device comprises a processing unit and a memory, said memory containing instructions executable by the processing unit, whereby the device is operative to acquire motion data of the individual recorded with a first torso-attached accelerometer, acquire motion data of the individual recorded with a second accelerometer attached to a different part of the individual than the torso, and to determine a mapping function configured to map the motion data recorded with the second accelerometer to the motion data recorded with the first accelerometer for subsequently reconstructing torso-recorded motion data from motion data being recorded with the second accelerometer and processed by the determined mapping function.
[008] This object is attained in a third aspect by a method of reconstructing motion data of an individual. The method comprises acquiring motion data of the individual recorded with a second accelerometer attached to a different part of the individual than a torso, and processing the acquired motion data in a mapping function having been previously determined by mapping the motion data recorded with the second accelerometer (103) to motion data of the individual (100) recorded with a first torso-attached accelerometer (101), thereby reconstructing (S203) torso-recorded motion data.
[009] This object is attained in a fourth aspect by a device configured to reconstruct motion data of an individual. The device comprises a processing unit and a memory, the memory containing instructions executable by the processing unit, whereby the device is operative to acquire motion data of the individual recorded with a second accelerometer attached to a different part of the individual than a torso, and to process the acquired motion data in a mapping function having been previously determined by mapping the motion data recorded with the second accelerometer to motion data of the individual recorded with a first torso-attached accelerometer, thereby reconstructing torso-recorded motion data.
[0010] In an embodiment, when reconstructing torso-recorded motion data, the mapping function used is that determined using the method of the first aspect.
[0011] A solution to the previously described problem in the art is proposed by allowing placement of the accelerometer elsewhere on the wearer's body, somewhere more convenient, such as in a holder fastened around an upper part of the wearer's arm or around the wrist, or in a pair of in-ear headphones or over-ear headphones to be fastened to the wearer's head. It is also envisaged that the accelerometer is implemented in for instance a smart watch or in a smart phone placed in a holder fastened around an upper part of the wearer's arm or around the wrist.
[0012] However, a problem then arises that the accelerometer no longer is placed at the centre of mass of the wearer's body, which e.g. has the consequence that algorithms developed in this particular technical field no longer will produce reliable motion analysis data as they assume torso-based motion signals for processing, while motion signals originating from other locations of the body will differ from the torso-based motion signals.
[0013] This is solved in that signals recorded by an accelerometer at another mounted location (AML) will be adapted and matched to signals of a torso-mounted (TM) accelerometer by utilizing a mapping function, after which process TM accelerometer signals may be reconstructed from recorded AML accelerometer signals having been processed by said mapping function. Advantageously, the reconstructed TM accelerometer signals may be processed in the already available torso-based motion detection and analysis algorithms.
[0014] In an embodiment, the mapping function is determined such that an error between the reconstructed torso-recorded motion data and the corresponding motion data recorded with the first accelerometer is minimized.
[0015] In an embodiment, a user profile is acquired of the individual for which the motion data is acquired, the user profile being associated with the determined mapping function.
[0016] In an embodiment, the user profile comprises information including at least one of weight, height, sex, chest width, placement of the second accelerometer.
[0017] In an embodiment, the mapping function is determined based on motion data of a plurality of individuals having a similar user profile.
[0018] In an embodiment, a mapping function having been previously determined for a first individual is used to reconstruct torso-recorded motion data of a second individual.
[0019] In an embodiment, a request to use a determined mapping function is received, the request comprising a user profile of the requesting individual and motion data of the requesting individual recorded with a second accelerometer attached to a different part of the individual than a torso, and the motion data of the requesting individual is processed using a mapping function associated with a user profile best matching the user profile of the requesting individual to reconstruct torso-recorded motion data of the requesting individual.
[0020] In an embodiment, a motion pattern of the individual is detected based on the reconstructed torso-recorded motion data.
[0021] Generally, all terms used in the claims are to be interpreted according to their ordinary meaning in the technical field, unless explicitly defined otherwise herein. All references to "a/an/the element, apparatus, component, means, step, etc." are to be interpreted openly as referring to at least one instance of the element, apparatus, component, means, step, etc., unless explicitly stated otherwise. The steps of any method disclosed herein do not have to be performed in the exact order disclosed, unless explicitly stated.
BRIEF DESCRIPTION OF THE DRAWINGS
[0022] Aspects and embodiments are now described, by way of example, with reference to the accompanying drawings, in which:

[0023] Figure 1 illustrates a user wearing a torso-mounted accelerometer;

[0024] Figure 2 illustrates a user wearing a head-mounted accelerometer;
[0025] Figures 3a-c illustrate motion data recorded in three dimensions X, Y and Z by a torso-mounted accelerometer and a headphone-mounted accelerometer, respectively, according to an embodiment;
[0026] Figure 4 illustrates recording of motion data of a wearer, deriving of a mapping function, and reconstruction of torso-based motion data according to an embodiment;
[0027] Figure 5 illustrates recorded and reconstructed accelerometer motion signals according to an embodiment;
[0028] Figure 6 illustrates a user requesting to use a previously determined mapping function associated with a user profile matching her own user profile according to an embodiment; and
[0029] Figure 7 illustrates a device configured to acquire recorded motion data of an individual according to an embodiment.
DETAILED DESCRIPTION
[0030] The aspects of the present disclosure will now be described more fully hereinafter with reference to the accompanying drawings, in which certain embodiments are shown.
[0031] These aspects may, however, be embodied in many different forms and should not be construed as limiting; rather, these embodiments are provided by way of example so that this disclosure will be thorough and complete, and to fully convey the scope of all aspects to those skilled in the art. Like numbers refer to like elements throughout the description.
[0032] With reference to Figure 1, as previously mentioned, a torso-mounted accelerometer 101 has the advantage of being placed close to the centre of mass of the body of the wearer 100, and hence can capture a great deal of motion nuances. The torso is widely considered to be the best location for placement of an accelerometer and available algorithms utilized for the motion detection and analysis are typically adapted for processing torso-based motion data. However, the torso is not the most convenient position for wearing the accelerometer 101, and a strap 102 is required to fasten the accelerometer 101 over the wearer's chest.
[0033] A solution is proposed to this problem by allowing placement of the accelerometer elsewhere on the wearer's body, somewhere more convenient, such as in a holder fastened around an upper part of the wearer's arm or around the wrist, or in a pair of in-ear headphones or over-ear headphones to be fastened to the wearer's head. It is also envisaged that the accelerometer is implemented in for instance a smart phone placed in a holder fastened around an upper part of the wearer's arm or around the wrist.
[0034] Figure 2 illustrates an accelerometer 103 being part of an in-ear headphone attached to the wearer 100.
[0035] However, the problem then arises that the accelerometer 103 no longer is placed at the centre of mass of the wearer's body, which e.g. has the consequence that algorithms developed in this particular technical field no longer will produce reliable motion analysis data as they assume torso-based motion signals for processing, while motion signals originating from other locations of the body will differ from the torso-based motion signals.
[0036] Hence, data representing motion key point indicators (KPIs) such as for example vertical oscillation (VO), cadence, ground contact time (GCT), etc., recorded for instance by a headphone-mounted accelerometer 103 will not match data recorded by a torso-mounted accelerometer 101, and the existing torso-based algorithms utilized for the motion detection and analysis will thus not produce accurate results if being supplied with motion data recorded by a headphone-mounted accelerometer. That is, KPIs computed using torso-based algorithms for processing torso-recorded motion signals will not match KPIs computed using available torso-based algorithms for processing motion signals recorded by accelerometers located at a part of the body of the wearer 100 different from the torso.
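By way of a non-limiting illustration (not part of the original disclosure), the sketch below shows the kind of KPI computation such torso-based algorithms perform, here a simple cadence estimate from a vertical acceleration trace. The sampling rate, peak threshold and function names are assumptions chosen only to make the example runnable.

```python
# Minimal sketch: estimating the cadence KPI from a vertical acceleration
# trace by peak counting. Thresholds and names are illustrative assumptions.
import numpy as np
from scipy.signal import find_peaks

def estimate_cadence(acc_vertical: np.ndarray, fs: float = 100.0) -> float:
    """Return an approximate cadence in steps per minute.

    acc_vertical: vertical acceleration samples (m/s^2), gravity removed.
    fs: sampling frequency in Hz.
    """
    # Each foot strike tends to show up as a pronounced acceleration peak.
    # Require peaks to be at least 0.25 s apart (at most ~240 steps/min).
    peaks, _ = find_peaks(acc_vertical, height=1.0, distance=int(0.25 * fs))
    duration_min = len(acc_vertical) / fs / 60.0
    return len(peaks) / duration_min if duration_min > 0 else 0.0
```

Because such computations are tuned for torso-recorded input, feeding them head- or wrist-recorded data directly would distort the resulting KPIs, which motivates the mapping described next.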
[0037] An embodiment solves this problem in that signals recorded by an accelerometer 103 at another mounted location (AML) will be adapted and matched to signals of a torso-mounted (TM) accelerometer 101 by utilizing a mapping function, after which process TM accelerometer signals may be reconstructed from recorded AML accelerometer signals having been processed by said mapping function. Advantageously, the reconstructed TM accelerometer signals may be processed in the already available torso-based motion detection and analysis algorithms. In the following, "motion signals" or "motion data" will denote signals produced by the accelerometers 101, 103 from which motion of the wearer 100 is detected and possibly even reconstructed by means of animation to be presented on a suitable display for review by the wearer.

[0038] Such signals are illustrated in the following with reference to Figures 3a-c.
[0039] Thus, during a derivation phase, TM motion signals will be recorded by a torso-mounted accelerometer 101, while AML motion signals will be recorded by for instance a headphone-mounted accelerometer 103, both being attached to the wearer 100.
[0040] Figures 3a-c illustrate accelerometer data recorded in all three dimensions X, Y and Z, respectively, for the TM accelerometer and the AML accelerometer. The TM motion signals are illustrated with continuous lines, while the AML motion signals are illustrated with dotted lines. There is typically a non-linear relationship between the TM motion signals and the AML motion signals.
[0041] Figure 4 thus illustrates recording of motion data of the wearer 100 using the TM accelerometer 101 in step S101 and recording of motion data of the wearer 100 using the AML accelerometer 103 in step S102 during the derivation phase. In a next step S103, the AML accelerometer data is adjusted such that it matches the TM accelerometer data by using an appropriately elaborated mapping function M. Optionally, a user profile of the wearer 100 may be acquired and associated with the mapping function M, as shown in step S104. This will be discussed further hereinbelow.
[0042] Hence, from the acquired AML accelerometer motion data and the acquired TM accelerometer motion data, a mathematical mapping function M is derived, thereby mapping the two sets of motion data to each other such that during a subsequent TM accelerometer motion data reconstruction phase, AML accelerometer motion data being recorded in step S201 by the AML accelerometer 103 and processed in step S202 by the mapping function M will result in (hypothetically recorded) TM accelerometer motion data being reconstructed:

TM accelerometer motion data = M(AML accelerometer motion data)

In other words, TM accelerometer motion data may subsequently be reconstructed in step S203 from AML accelerometer motion data having been recorded in step S201 and processed by the determined mapping function M in step S202.
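Purely as an illustrative sketch (not the patented implementation), the reconstruction phase of steps S201-S203 can be expressed as follows, assuming a previously derived mapping function M is available as a callable that maps a short window of AML samples to one reconstructed TM sample; all names and the window convention are assumptions.

```python
# Sketch of the reconstruction phase (steps S201-S203), assuming a mapping
# function M has already been derived in the derivation phase.
import numpy as np
from typing import Callable

def reconstruct_tm(aml: np.ndarray,
                   M: Callable[[np.ndarray], np.ndarray],
                   lb: int, ub: int) -> np.ndarray:
    """Reconstruct TM motion data from AML motion data.

    aml: array of shape (n_samples, 3) with recorded AML data (S201).
    M:   learned mapping; maps a (lb + ub + 1, 3) window to a (3,) TM sample.
    lb, ub: number of samples before/after time t included in the window.
    """
    tm_hat = np.zeros_like(aml, dtype=float)
    for t in range(lb, len(aml) - ub):
        window = aml[t - lb : t + ub + 1]   # S202: process the window by M
        tm_hat[t] = M(window)               # S203: reconstructed TM sample
    return tm_hat
```

The windowing used here anticipates the formal definition given in paragraph [0048] below.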
[0043] Advantageously, once the mapping function M has been determined for the wearer 100 in the derivation phase using motion data of the TM accelerometer 101 and the AML accelerometer 103, TM accelerometer motion data can be reconstructed by processing recorded AML accelerometer motion data in the mapping function M.
[0044] Hence, the wearer 100 will only initially - during the derivation phase - have to wear the TM accelerometer 101. Once the mapping function M has been determined, the signals of the AML accelerometer 103 can be utilized to reconstruct the TM accelerometer signals and the TM accelerometer 101 is thus advantageously no longer needed to obtain motion analysis information.
[0045] In an embodiment, the composite three-dimensional motion signal of the AML accelerometer 103 is denoted a(t) = (ax(t), ay(t), az(t)).
[0046] Similarly, the composite three-dimensional motion signal of the TM accelerometer 101 is denoted τ(t) = (τx(t), τy(t), τz(t)).
[0047] When determining the mapping function M, the aim is to find an accurate predictor M which results in the following mapping:

τ̂x(t), τ̂y(t), τ̂z(t) = M({ax(t̃), ay(t̃), az(t̃) ∀ t̃ : t - δlb < t̃ < t + δub})
[0048] That is, at any given time t, it should be possible to reconstruct the TM accelerometer motion signal by creating an estimate τ̂(t) of the TM accelerometer motion signal based on the AML accelerometer motion signal over a time window defined by a lower bound and an upper bound t - δlb and t + δub, respectively.
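To make the windowing concrete, the following sketch (an assumed helper, not taken from the patent) builds training pairs from concurrently recorded AML and TM signals for the derivation phase: for each time index t, the input is the AML window from t - lb to t + ub and the target is the concurrent TM sample τ(t).

```python
# Sketch: building windowed training pairs for the derivation phase.
import numpy as np

def make_training_pairs(aml: np.ndarray, tm: np.ndarray, lb: int, ub: int):
    """aml, tm: arrays of shape (n_samples, 3), sampled concurrently.

    Returns X of shape (n, (lb + ub + 1) * 3) and y of shape (n, 3).
    """
    X, y = [], []
    for t in range(lb, len(aml) - ub):
        X.append(aml[t - lb : t + ub + 1].ravel())  # flattened AML window
        y.append(tm[t])                             # concurrent TM sample
    return np.asarray(X), np.asarray(y)
```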
[0049] During the previously described derivation phase, where the mapping function M is determined, the mapping function M may have to be fine-tuned - i.e. calibrated - such that an error between the estimates τ̂(t) of the TM accelerometer motion signals (produced by processing the AML accelerometer motion signals a(t) with the mapping function M) and the actually measured values τ(t) of the TM accelerometer motion signals is minimized. As is understood, during the derivation phase the respective accelerometer motion signals τ(t) and a(t) are concurrently measured.
[0050] Hence, in this embodiment, the mapping function M is determined such that the error between the reconstructed value τ̂(t) - i.e. the estimated value - of the TM accelerometer motion signal at a given point in time t, which is based on the AML accelerometer motion signal a(t) processed by the mapping function M, and the measured value τ(t) of the TM accelerometer signal at said given point in time is minimized.
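The disclosure leaves the exact form of M open. Purely to make the error-minimization step concrete, the sketch below fits a linear ridge-regression mapping to windowed pairs such as those built in the earlier sketch; this is an assumed baseline for illustration, not the neural-network mappings discussed in the following paragraphs.

```python
# Assumed linear baseline for M, fitted by minimizing the squared error
# between reconstructed and measured TM samples (closed-form ridge regression).
import numpy as np

def fit_linear_mapping(X: np.ndarray, y: np.ndarray, ridge: float = 1e-3):
    """X: (n, d) flattened AML windows, y: (n, 3) TM targets.

    Returns a weight matrix W of shape (d + 1, 3) including a bias row.
    """
    Xb = np.hstack([X, np.ones((len(X), 1))])  # append a bias column
    d = Xb.shape[1]
    # Closed-form ridge solution: W = (Xb^T Xb + r I)^-1 Xb^T y
    return np.linalg.solve(Xb.T @ Xb + ridge * np.eye(d), Xb.T @ y)

def apply_linear_mapping(W: np.ndarray, window: np.ndarray) -> np.ndarray:
    """Map one AML window (any shape) to one reconstructed TM sample."""
    x = np.append(window.ravel(), 1.0)
    return x @ W
```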
[0051] Different methods of determining M are envisaged, such as the utilization of convolutional neural networks, recurrent neural networks, neural differential equations, etc.
[0052] Each one of these approaches represents a rich class of functions that can be utilized to embody the spatio-temporal mapping provided by M. The selection of the particular approach to be utilized depends on a number of factors, such as for instance the desired level of accuracy required for the mapping or the computational load required to reconstruct the TM motion data.
[0053] As an example, a recurrent neural network, such as a so-called "long short-term memory" network, may be used to derive a highly accurate mapping model M, but it may computationally be too demanding for e.g. smart watch deployment, in which case a convolutional neural network may be considered, being less accurate but on the other hand requiring less computational power.
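One possible convolutional realization of M is sketched below in PyTorch. The patent does not prescribe an architecture, so the layer sizes, kernel widths and training hyperparameters are arbitrary assumptions; an LSTM-based variant would replace the convolutional layers with recurrent ones.

```python
# Illustrative 1D convolutional mapping network (PyTorch), with a small
# calibration loop that minimizes the MSE between reconstructed and measured
# TM samples. All sizes are assumptions for illustration only.
import torch
import torch.nn as nn

class ConvMapping(nn.Module):
    """Maps an AML window of shape (batch, 3, window_len) to a TM sample (batch, 3)."""

    def __init__(self, window_len: int):
        super().__init__()
        self.net = nn.Sequential(
            nn.Conv1d(3, 16, kernel_size=5, padding=2), nn.ReLU(),
            nn.Conv1d(16, 16, kernel_size=5, padding=2), nn.ReLU(),
            nn.Flatten(),
            nn.Linear(16 * window_len, 3),  # one reconstructed TM sample
        )

    def forward(self, x: torch.Tensor) -> torch.Tensor:
        return self.net(x)

def calibrate(model: nn.Module, aml_windows: torch.Tensor,
              tm_targets: torch.Tensor, epochs: int = 50) -> nn.Module:
    """Fine-tune M so the reconstruction error against measured TM data shrinks."""
    opt = torch.optim.Adam(model.parameters(), lr=1e-3)
    loss_fn = nn.MSELoss()
    for _ in range(epochs):
        opt.zero_grad()
        loss = loss_fn(model(aml_windows), tm_targets)
        loss.backward()
        opt.step()
    return model
```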
[0054] Figure 5 illustrates in an upper view a motion signal recorded by a chest-mounted accelerometer 101 and a motion signal recorded by an ear-mounted accelerometer 103 over a short span of time (for the Y dimension). The motion signal of the chest-mounted accelerometer 101 is illustrated with a continuous line, while the motion signal of the ear-mounted accelerometer 103 is illustrated with a dotted line.
[0055] In a lower view, the motion signal recorded by a chest-mounted accelerometer 101 (continuous line) is shown together with the chest-recorded motion signal reconstructed/estimated from the motion signal recorded by the ear-mounted accelerometer 103 (dotted line) and processed by the mapping function M. As can be concluded, it is possible to reconstruct a chest-recorded motion signal which is quite accurate as compared to the actual chest-recorded motion signal.
[0056] As an example, the accuracy of the reconstruction may be determined by computing a so-called mean-squared error (MSE), wherein the mapping function M may be adjusted such that the reconstructed chest-recorded motion signals correspond to the actually recorded chest signals in such a manner that the MSE is small (or even minimized).
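A minimal check of that accuracy measure could look as follows; the variable names are assumptions for illustration.

```python
# Minimal MSE check between reconstructed and actually recorded chest (TM) signals.
import numpy as np

def reconstruction_mse(tm_true: np.ndarray, tm_hat: np.ndarray) -> float:
    """Mean-squared error over all samples and the X, Y, Z axes."""
    return float(np.mean((tm_true - tm_hat) ** 2))
```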
[0057] Now, after the derivation process is finished, and the function M has been derived, for instance by a smart phone or computer of the wearer or even by a cloud server to which accelerometer data is transferred, the wearer no longer needs to use the TM accelerometer but can henceforth use her headphone-mounted accelerometer to record motion data while the algorithms utilized for motion detection and analysis - possibly hosted on a computer or smart phone to which the AML accelerometer data is supplied after exercise (or in real-time) and on a display of which an animation may be presented - are the available algorithms utilized to process TM motion data. It may further be envisaged that the derivation process is performed by one device, such as a cloud server, while the reconstruction process is performed by another, e.g. a smart phone to which the determined mapping function M has been transferred from the cloud server.
[0058] It may be envisaged that each individual user will undergo the calibration process for the best and most accurate end result. However, it could alternatively be envisaged in an embodiment that a generic model M is derived using AML and TM acceleration motion data of one or more users and that future users can adopt the derived model M with good result.
[0059] In a further embodiment, for each determined mapping function M, a corresponding user profile is stored containing information such as weight, height, sex, chest width, etc., wherein a new user can select a function M based on the closest matching user profile.
[0060] In yet an alternative embodiment, TM and AML accelerometer data of a great number of users having an identical or similar user profile is acquired to determine a corresponding mapping function M. That would have the advantage that a great amount of data originating from users having identical, or at least similar, user profiles will be used to derive a mathematical mapping model M.
[0061] With reference to Figure 6, a user 100 having for instance an earphone-mounted accelerometer 103 and a smart phone 104 may download a selected mapping function M from a server 105 located in the cloud 106 to a motion analysis app on the phone 104. The user 100 may select a mapping function M which best matches her individual user profile. Alternatively, the user 100 may supply her user profile, along with motion data recorded by the AML accelerometer 103 for the user 100, to the cloud server 105, which subsequently reconstructs TM motion data based on the received user profile and the AML accelerometer motion data and possibly returns the reconstructed TM motion data to the smart phone 104 for presentation to the user 100.
[0062] Thus, upon determining a mapping function M based on the recorded TM accelerometer motion data and the recorded AML accelerometer motion data as described in step S103 in Figure 4, the cloud server 105 will also in step S104 acquire a user profile of the user for which the function M is derived. Hence, a specific user profile may specify:

height: 162 cm,
weight: 55 kg,
sex: female, and
AML accelerometer placement: ear.
[0063] Now, the user profile is stored at the server 105 as profile1 and the corresponding determined mapping function is stored as M1.
[0064] For a second user, a second specific user profile profile2 is stored along with the corresponding determined mapping function M2. For a third user, a third specific user profile profile3 is stored along with the corresponding determined mapping function M3, and so on. Hence, the cloud server 105 may contain a great number of determined mapping functions and associated user profiles.
[0065] Again, with reference to Figure 6, a user 100 wishing to acquire a suitable mapping function may thus make such a request to the server 105 via her smart phone 104 and include her user profile with the request. Assuming that the user 100 provides identical or similar information as that contained in the above exemplified user profile profile1, the server 105 will return the associated mapping function M1.
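A server-side selection of the best-matching profile could, for instance, look like the sketch below. The profile fields, weights and distance metric are assumptions made only for illustration; the patent does not specify how the closest match is computed.

```python
# Sketch of server-side profile matching: select the stored mapping function
# whose user profile best matches the requesting user's profile.
from dataclasses import dataclass

@dataclass
class Profile:
    height_cm: float
    weight_kg: float
    sex: str
    placement: str  # e.g. "ear", "wrist", "upper arm"

def profile_distance(p: Profile, q: Profile) -> float:
    """Smaller is a better match; categorical mismatches add a penalty."""
    d = abs(p.height_cm - q.height_cm) / 10.0 + abs(p.weight_kg - q.weight_kg) / 5.0
    d += 0.0 if p.sex == q.sex else 5.0
    d += 0.0 if p.placement == q.placement else 100.0  # placement should match
    return d

def select_mapping(request: Profile, stored: dict):
    """stored: {profile_id: (Profile, mapping_function)} -> best mapping_function."""
    best_id = min(stored, key=lambda pid: profile_distance(request, stored[pid][0]))
    return stored[best_id][1]
```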
[0066] Alternatively, as previously discussed, the server 105 may itself use the mapping function M1 for reconstruction of TM accelerometer motion data, given that the user 100 also provides her AML accelerometer motion data on which the reconstruction is to be based. The server 105 may thus subsequently supply the smart phone 104 with the reconstructed TM accelerometer motion data, or possibly even an animation illustrating the motion pattern of the requesting individual 100.
[0067] The mapping function M1 is likely to be able to accurately reconstruct TM accelerometer motion signals from the motion signals recorded by the ear-mounted accelerometer 103 given that the user 100 has the same user profile as the user associated with profile1. Thus, there is advantageously no need for the user 100 to wear a torso-mounted accelerometer to derive a mapping function, but an already determined mapping function M1 can be used with good accuracy.
[0068] Figure 7 illustrates a device 105, e.g. a cloud server, according to an embodiment. The steps of the method performed by the device 105, being embodied e.g. in the form of a computer, server, smart phone, etc., of recording motion data of an individual according to embodiments are in practice performed by a processing unit 120 embodied in the form of one or more microprocessors arranged to execute a computer program 121 downloaded to a suitable volatile storage medium 122 associated with the microprocessor, such as a Random Access Memory (RAM), or a non-volatile storage medium such as a Flash memory or a hard disk drive. The processing unit 120 is arranged to cause the device 105 to carry out the method according to embodiments when the appropriate computer program 121 comprising computer-executable instructions is downloaded to the storage medium 122 and executed by the processing unit 120. The storage medium 122 may also be a computer program product comprising the computer program 121. Alternatively, the computer program 121 may be transferred to the storage medium 122 by means of a suitable computer program product, such as a Digital Versatile Disc (DVD) or a memory stick. As a further alternative, the computer program 121 may be downloaded to the storage medium 122 over a network. The processing unit 120 may alternatively be embodied in the form of a digital signal processor (DSP), an application specific integrated circuit (ASIC), a field-programmable gate array (FPGA), a complex programmable logic device (CPLD), etc.
[0069] The aspects of the present disclosure have mainly been described above with reference to a few embodiments and examples thereof. However, as is readily appreciated by a person skilled in the art, other embodiments than the ones disclosed above are equally possible within the scope of the disclosure, as defined by the appended patent claims.
[0070] Thus, while various aspects and embodiments have been disclosed herein, other aspects and embodiments will be apparent to those skilled in the art. The various aspects and embodiments disclosed herein are for purposes of illustration and are not intended to be limiting, with the true scope and spirit being indicated by the following claims.

Claims (22)

1. A method of acquiring recorded motion data of an individual (100) comprising: acquiring (S101) motion data of the individual (100) recorded with a first torso-attached accelerometer (101); acquiring (S102) motion data of the individual (100) recorded with a second accelerometer (103) attached to a different part of the individual (100) than the torso; determining (S103) a mapping function configured to map the motion data recorded with the second accelerometer (103) to the motion data recorded with the first accelerometer (101) for subsequently reconstructing torso-recorded motion data from motion data being recorded with the second accelerometer (103) and processed by the determined mapping function.
2. The method of claim 1, wherein the mapping function is determined such that an error between the reconstructed torso-recorded motion data and the corresponding motion data recorded with the first accelerometer (101) is minimized.
3. The method of claims 1 or 2, further comprising: acquiring (S104) a user profile of the individual (100) for which the motion data is acquired, the user profile being associated with the determined mapping function.
4. The method of claim 3, the user profile comprising information including at least one of weight, height, sex, chest width, placement of the second accelerometer (103).
5. The method of any one of claims 3 or 4, wherein the mapping function is determined based on motion data of a plurality of individuals having a similar user profile.
6. A method of reconstructing motion data of an individual (100) comprising: acquiring (S201) motion data of the individual (100) recorded with a second accelerometer (103) attached to a different part of the individual (100) than a torso; processing (S202) the acquired motion data in a mapping function having been previously determined by mapping the motion data recorded with the second accelerometer (103) to motion data of the individual (100) recorded with a first torso-attached accelerometer (101), thereby reconstructing (S203) torso-recorded motion data.
7. The method of claim 6, wherein the mapping function is determined using the method of claim 1.
8. The method of claims 6 or 7, wherein a mapping function having been previously determined for a first individual is used to reconstruct torso-recorded motion data of a second individual.
9. The method of claim 8, further comprising: receiving a request to use a determined mapping function, the request comprising a user profile of the requesting individual and motion data of the requesting individual (100) recorded with a second accelerometer (103) attached to a different part of the individual (100) than a torso; and processing the motion data of the requesting individual using a mapping function associated with a user profile best matching the user profile of the requesting individual (100) to reconstruct torso-recorded motion data of the requesting individual.
10. The method of any one of claims 6-9, further comprising: detecting a motion pattern of the individual based on the reconstructed torso-recorded motion data.
11. A device (105) configured to acquire recorded motion data of an individual (100), the device (105) comprising a processing unit (120) and a memory (122), said memory containing instructions (121) executable by said processing unit (120), whereby the device (105) is operative to: acquire motion data of the individual (100) recorded with a first torso-attached accelerometer (101); acquire motion data of the individual (100) recorded with a second accelerometer (103) attached to a different part of the individual (100) than the torso; determine a mapping function configured to map the motion data recorded with the second accelerometer (103) to the motion data recorded with the first accelerometer (101) for subsequently reconstructing torso-recorded motion data from motion data being recorded with the second accelerometer (103) and processed by the determined mapping function.
12. The device (105) of claim 11, being operative to determine the mapping function such that an error between the reconstructed torso-recorded motion data and the corresponding motion data recorded with the first accelerometer (101) is minimized.
13. The device (105) of claims 11 or 12, further being operative to: acquire a user profile of the individual (100) for which the motion data is acquired, the user profile being associated with the determined mapping function.
14. The device (105) of claim 13, the user profile comprising information including at least one of weight, height, sex, chest width, placement of the second accelerometer (103).
15. The device (105) of any one of claims 13 or 14, being operative to determine the mapping function based on motion data of a plurality of individuals having a similar user profile.
16. A device (105) configured to reconstruct motion data of an individual (100), the device (105) comprising a processing unit (120) and a memory (122), said memory containing instructions (121) executable by said processing unit (120), whereby the device (105) is operative to: acquire motion data of the individual (100) recorded with a second accelerometer (103) attached to a different part of the individual (100) than a torso; process the acquired motion data in a mapping function having been previously determined by mapping the motion data recorded with the second accelerometer (103) to motion data of the individual (100) recorded with a first torso-attached accelerometer (101), thereby reconstructing (S203) torso-recorded motion data.
17. The device (105) of claim 16, further being configured to determine the mapping function using the method of claim 1.
18. The device (105) of claims 16 or 17, being configured to use a mapping function having been previously determined for a first individual to reconstruct torso-recorded motion data of a second individual.
19. The device (105) of claim 18, further being operative to: receive a request to use a determined mapping function, the request comprising a user profile of the requesting individual and motion data of the requesting individual (100) recorded with a second accelerometer (103) attached to a different part of the individual (100) than a torso; and process the motion data of the requesting individual using a mapping function associated with a user profile best matching the user profile of the requesting individual (100) to reconstruct torso-recorded motion data of the requesting individual.
20. The device (105) of any one of claims 16-19, further being operative to: detect a motion pattern of the individual based on the reconstructed torso-recorded motion data.
21. A computer program (121) comprising computer-executable instructions for causing a device (105) to perform steps recited in any one of claims 1-10 when the computer-executable instructions are executed on a processing unit (120) included in the device (105).
22. A computer program product comprising a computer readable medium (122), the computer readable medium having the computer program (121) according to claim 21 embodied thereon.
SE1950879A 2019-07-10 2019-07-10 Torso-mounted accelerometer signal reconstruction SE1950879A1 (en)

Priority Applications (3)

Application Number Priority Date Filing Date Title
SE1950879A SE1950879A1 (en) 2019-07-10 2019-07-10 Torso-mounted accelerometer signal reconstruction
EP20744177.5A EP4061215A1 (en) 2019-07-10 2020-06-15 Torso-mounted accelerometer signal reconstruction
PCT/SE2020/050619 WO2021006790A1 (en) 2019-07-10 2020-06-15 Torso-mounted accelerometer signal reconstruction

Applications Claiming Priority (1)

Application Number Priority Date Filing Date Title
SE1950879A SE1950879A1 (en) 2019-07-10 2019-07-10 Torso-mounted accelerometer signal reconstruction

Publications (1)

Publication Number Publication Date
SE1950879A1 true SE1950879A1 (en) 2021-01-11

Family

ID=71741868

Family Applications (1)

Application Number Title Priority Date Filing Date
SE1950879A SE1950879A1 (en) 2019-07-10 2019-07-10 Torso-mounted accelerometer signal reconstruction

Country Status (3)

Country Link
EP (1) EP4061215A1 (en)
SE (1) SE1950879A1 (en)
WO (1) WO2021006790A1 (en)

Citations (6)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US20170332946A1 (en) * 2016-05-17 2017-11-23 Harshavardhana Narayana Kikkeri Method and program product for multi-joint tracking combining embedded sensors and an external sensor
US20180020978A1 (en) * 2016-07-25 2018-01-25 Patrick Kaifosh System and method for measuring the movements of articulated rigid bodies
US20180070864A1 (en) * 2016-06-02 2018-03-15 Matthew Schuster Methods and devices for assessing a captured motion
US20180153444A1 (en) * 2016-12-05 2018-06-07 Intel Corporation Body movement tracking
WO2019014238A1 (en) * 2017-07-10 2019-01-17 Georgia Tech Research Corporation Systems and methods for tracking body movement
US20190038187A1 (en) * 2017-08-03 2019-02-07 Latella Sports Technologies, LLC Systems and methods for evaluating body motion

Family Cites Families (2)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
EP2636361B1 (en) * 2012-03-06 2022-03-30 Polar Electro Oy Exercise monitoring using acceleration measurement
WO2019043601A1 (en) * 2017-08-29 2019-03-07 Myotest Sa A method and device for retrieving biomechanical parameters of a stride

Patent Citations (6)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US20170332946A1 (en) * 2016-05-17 2017-11-23 Harshavardhana Narayana Kikkeri Method and program product for multi-joint tracking combining embedded sensors and an external sensor
US20180070864A1 (en) * 2016-06-02 2018-03-15 Matthew Schuster Methods and devices for assessing a captured motion
US20180020978A1 (en) * 2016-07-25 2018-01-25 Patrick Kaifosh System and method for measuring the movements of articulated rigid bodies
US20180153444A1 (en) * 2016-12-05 2018-06-07 Intel Corporation Body movement tracking
WO2019014238A1 (en) * 2017-07-10 2019-01-17 Georgia Tech Research Corporation Systems and methods for tracking body movement
US20190038187A1 (en) * 2017-08-03 2019-02-07 Latella Sports Technologies, LLC Systems and methods for evaluating body motion

Also Published As

Publication number Publication date
WO2021006790A1 (en) 2021-01-14
EP4061215A1 (en) 2022-09-28

Similar Documents

Publication Publication Date Title
US10313818B2 (en) HRTF personalization based on anthropometric features
WO2020069116A1 (en) Techniques for generating media content
US11783335B2 (en) Transaction confirmation and authentication based on device sensor data
CN110348543A (en) Eye fundus image recognition methods, device, computer equipment and storage medium
CN110121118A (en) Video clip localization method, device, computer equipment and storage medium
CN110147745B (en) Video key frame detection method and device
CN110222728B (en) Training method and system of article identification model and article identification method and equipment
CN107833219A (en) Image-recognizing method and device
Kuhad et al. Using distance estimation and deep learning to simplify calibration in food calorie measurement
CN113192536B (en) Training method of voice quality detection model, voice quality detection method and device
US11595772B2 (en) Information processing device, information processing method, and information processing program
CN113473201A (en) Audio and video alignment method, device, equipment and storage medium
Beecy et al. Development of novel machine learning model for right ventricular quantification on echocardiography—a multimodality validation study
SE1950879A1 (en) Torso-mounted accelerometer signal reconstruction
Engan et al. Chest compression rate measurement from smartphone video
SE1950996A1 (en) Advancement manager in a handheld user device
CN110134902A (en) Data information generation method, device and storage medium
CN112559794A (en) Song quality identification method, device, equipment and storage medium
CN110443841A (en) The measurement method of ground depth, apparatus and system
KR20240043488A (en) A multimodal deep learning model for predicting future visual field in glaucoma patients
Asada et al. A System for Facial Expression Analysis of a Person While Using Video Phone.
KR20220118737A (en) Apparatus and method for sarcopenia diagnosis and training
CN115727870A (en) Data processing method and device and terminal equipment
KR20220010244A (en) Method, system and non-transitory computer-readable recording medium for providing golf-related contents
CN113496243A (en) Background music obtaining method and related product

Legal Events

Date Code Title Description
NAV Patent application has lapsed