WO2022144929A1 - Modular body-motion analysis system, device and method thereof - Google Patents

Publication number
WO2022144929A1
Authority
WO
WIPO (PCT)
Prior art keywords
engine
person
kinematic
motion data
motion
Application number
PCT/IN2022/050005
Other languages
French (fr)
Inventor
Anant Sharma
Shwetank SHREY
Ayush KUSHWAHA
Aman Parnami
Original Assignee
Tweek Labs Private Limited
Application filed by Tweek Labs Private Limited filed Critical Tweek Labs Private Limited
Publication of WO2022144929A1 publication Critical patent/WO2022144929A1/en

Classifications

    • A: HUMAN NECESSITIES
    • A61: MEDICAL OR VETERINARY SCIENCE; HYGIENE
    • A61B: DIAGNOSIS; SURGERY; IDENTIFICATION
    • A61B 5/00: Measuring for diagnostic purposes; Identification of persons
    • A61B 5/0002: Remote monitoring of patients using telemetry, e.g. transmission of vital signals via a communication network
    • A61B 5/0015: Remote monitoring characterised by features of the telemetry system
    • A61B 5/0022: Monitoring a patient using a global network, e.g. telephone networks, internet
    • A61B 5/0024: Telemetry for multiple sensor units attached to the patient, e.g. using a body or personal area network
    • A61B 5/103: Detecting, measuring or recording devices for testing the shape, pattern, colour, size or movement of the body or parts thereof, for diagnostic purposes
    • A61B 5/11: Measuring movement of the entire body or parts thereof, e.g. head or hand tremor, mobility of a limb
    • A61B 5/1113: Local tracking of patients, e.g. in a hospital or private home
    • A61B 5/1114: Tracking parts of the body
    • A61B 5/1116: Determining posture transitions
    • A61B 5/48: Other medical applications
    • A61B 5/4848: Monitoring or testing the effects of treatment, e.g. of medication
    • A61B 5/486: Bio-feedback
    • A61B 5/72: Signal processing specially adapted for physiological signals or for diagnostic purposes
    • A61B 5/7235: Details of waveform analysis
    • A61B 5/7264: Classification of physiological signals or data, e.g. using neural networks, statistical classifiers, expert systems or fuzzy systems
    • A61B 5/74: Details of notification to user or communication with user or patient; user input means
    • A61B 5/742: Notification using visual displays
    • A61B 5/744: Displaying an avatar, e.g. an animated cartoon character
    • A61B 5/7465: Arrangements for interactive communication between patient and care services, e.g. by using a telephone network
    • A61B 2503/00: Evaluating a particular growth phase or type of persons or animals
    • A61B 2503/10: Athletes
    • A61B 2505/00: Evaluating, monitoring or diagnosing in the context of a particular type of medical care
    • A61B 2505/09: Rehabilitation or training
    • A61B 2562/00: Details of sensors; Constructional details of sensor housings or probes; Accessories for sensors
    • A61B 2562/04: Arrangements of multiple sensors of the same type
    • G: PHYSICS
    • G06: COMPUTING; CALCULATING OR COUNTING
    • G06N: COMPUTING ARRANGEMENTS BASED ON SPECIFIC COMPUTATIONAL MODELS
    • G06N 20/00: Machine learning

Definitions

  • the present disclosure relates to the monitoring of body motions. More particularly, the present disclosure relates to a modular and contextual body-motion analysis system, device and method for providing a comprehensive analysis of the biomechanics of any sport or activity being performed.
  • a few methods for motion capture include optical and inertial motion capture systems.
  • Optical motion capture systems require one or more cameras that track a number of markers placed at appropriate points on the user's body, or deduce the pose directly from the image feed without the need for any markers. Such systems usually require complex set-up and calibration and are typically used in laboratories.
  • inertial motion capture systems involve placing inertial measurement units (IMU) on appropriate parts of the user's body and reconstructing the motion through the sensor feed.
  • a modular body-motion analysis system for monitoring and analysing the movement of a body part of a person during a real-world action
  • the modular body-motion analysis system including: a sensing unit configured to sense motion data associated with the body part; a communication unit coupled to the sensing unit and configured to transmit the motion data sensed by the sensing unit to a processing unit; the processing unit coupled to the sensing unit and configured to process the motion data, the processing unit further including: a kinematic engine and an analytics engine; such that the kinematic engine is configured to simulate the motion data on a digital rig to obtain the person’s movement for extracting a plurality of kinematic features; and the analytics engine is configured to process the extracted kinematic features to provide a context of the biomechanics associated with the person’s movement; and a presentation unit coupled to the processing unit and configured to output the motion data processed by the processing unit.
  • the modular body-motion analysis system further including: a database coupled to the processing unit and configured to store the motion data received from the sensing unit and an intermediate output associated with the processing unit.
  • the data stored in the database enables training of a machine learning model associated with the analytics engine in an evolving manner to generate a potential resolution for a range of body movements performed by the person.
  • a number of sensors of the sensing unit are adapted to be placed on the body part of the person as well as on the extended body/augmentation held by the person.
  • the processing unit further including: a pre-processing engine configured to perform a set of pre-processing algorithms to decompress and segregate the motion data from the communication unit to standardize the motion data; a synchronization engine coupled to the pre-processing engine and configured to enable the motion data from the number of sensors of the sensing unit to be synchronised and generate aggregated time-series motion data frames with motion data from the body part at every specific instant; and the kinematic engine coupled to the synchronization engine and configured to use the time-series motion data frames and run four-dimensional simulations (space and time) on a human rig in a physics-based environment to generate the kinematic features of the entire motion, the kinematic features including biomechanical raw data like linear and angular positions, speeds, accelerations, forces, etc.
  • the analytics engine is further configured to identify a key activity from a kinematic time-series data stream using segmentation algorithms and a plurality of key events within the key activity using event detection algorithms, to generate biomechanical insights from the kinematic features obtained from the kinematic engine at the identified key events by using biomechanical algorithms, and to summarize the biomechanical insights generated for the key activities over time periods to track progress.
  • a method for analysing movement associated with a body part of a person, including: placing the sensing unit on the body parts of the person to sense motion data associated with the person; calibrating a kinematic engine by having the person stand in a standard pose; performing a regular real-world sporting action during a training session; generating real-time feedback of biomechanical insights from a key activity associated with a sport; and presenting feedback based on the generated biomechanical insights on a presentation unit.
  • the method further includes, training a machine learning model associated with an analytics engine in an evolving manner to generate a potential resolution for a range of body movements performed by the person by using the data stored in a database.
  • the processing unit further includes: a pre-processing engine configured to perform a set of pre-processing algorithms to clean the motion data from the communication unit and bring them to a standardized form; a synchronization engine coupled to the pre-processing engine and configured to enable the motion data from the different sensor units in the sensing unit to be synchronised and generate aggregated time-series motion data frames with individual motion data from the full body at every specific instant; a kinematic engine coupled to the synchronization engine and configured to use time-series motion data frames and run four-dimensional simulations (space and time) on a human rig in a physics-based environment to generate kinematic features of the body movement performed by the person, the kinematic features including biomechanical raw data like linear and angular positions, speeds, accelerations, forces, etc.
  • the analytics engine further includes: identification of key activities from a kinematic time-series data stream using segmentation algorithms; identification of key events within the key activity using event detection algorithms; biomechanical insight generation from the kinematic features obtained from the kinematic engine at these key events using biomechanical algorithms; and summarization of the various biomechanical insights generated for key activities over some time periods to track progress.
  • a wearable device for analysing motion associated with a body part of a person, the device including: a sensing unit configured to sense motion data associated with the body part; a processing unit coupled to the sensing unit and configured to process the motion data, the processing unit further including: a kinematic engine and an analytics engine; and the kinematic engine is configured to simulate the motion data on a digital rig to obtain the person’s movement for extracting a plurality of kinematic features; and the analytics engine is configured to process the extracted kinematic features to provide a context of the biomechanics associated with the person’s movement.
  • the wearable device further includes: a database coupled to the processing unit and configured to store the motion data received from the sensing unit and an intermediate output associated with the processing unit.
  • the data stored in the database enables training of a machine learning model associated with the analytics engine in an evolving manner to generate a potential resolution for a range of body movements performed by the person.
  • a number of sensors of the sensing unit are adapted to be placed on the body part of the person as well as on the extended body/augmentation held by the person.
  • FIG. 1 illustrates a schematic view of the modular body-motion analysis system for monitoring motion associated with the body part of a person and deducing improper movement of the body part of the person, according to an embodiment herein;
  • FIG. 2 illustrates an indicator diagram for indicating the kinematic features generated in a kinematic engine are handled in the analytics engine, according to an embodiment herein;
  • FIG. 3 illustrates a flowchart of a method for a coach to monitor body motions of a sportsperson through analysis of motion data by the body-motion analysis system and provide actionable feedback in real-time, according to an embodiment herein;
  • FIG. 4 illustrates a wearable device for analysing motion associated with a body part of a person, according to an embodiment herein.
  • the term “initializing” indicates a pre-processing of the raw data by the raw data unit.
  • the pre-processing includes filtering, fusion for sensor fused orientation data, synchronization and encryption.
  • the terms “rig”, “human rig” and “digital rig” indicate a digital model (simulation) designed to mimic the skeleton of a person's body.
  • the terms “raw data” and “sensed data” indicate the information relating to the movement and orientation of a body part of a person, as sensed by the sensors.
  • the terms “variable engines” or “engines”, as used in this description in association with a unit of the system of the present disclosure, indicate all the sub-components of the corresponding layer of the system for which the term is used.
  • the aspect herein overcomes the limitations of the prior art by providing a system and method for recording body motion along with the extended body or augmentations such as a bat, a racket or a hockey stick held by the person, as needed in an activity on the field or a real-world sporting action.
  • the system and method as per the present disclosure perform an automated and detailed biomechanical analysis in the context of the activity or sport being conducted, and provide a real-time comprehensive analysis of the body and the augmentations in an accessible and user-friendly form, without a need for manual intervention.
  • FIG. 1 illustrates a schematic view of a modular body-motion analysis system (100) for monitoring the motion associated with the body part of a person and deducing improper movement of the body part of the person.
  • the modular body-motion analysis system (100) includes a sensing unit (101), a communication unit (102), a processing unit (103), a presentation unit (104), and a database (109).
  • the sensing unit (101) further includes a motion capture unit (not shown).
  • the processing unit (103) further includes a processor.
  • the presentation unit (104) further includes a tablet.
  • the processor of the processing unit (103) further includes a pre-processing engine (105), a synchronization engine (106), a kinematic engine (107), and an analytics engine (108).
  • the database (109) is coupled to the processor of the processing unit (103).
  • the motion capture unit of the sensing unit (101) is coupled to the pre-processing engine (105) of the processing unit (103) through the communication unit (102).
  • the pre- processing engine (105) of the processing unit (103) is coupled to the synchronization engine (106).
  • the synchronization engine (106) is coupled to the kinematic engine (107).
  • the kinematic engine (107) is coupled to the analytics engine (108).
  • the processing unit (103) is coupled to the tablet of the presentation unit (104).
  • the motion capture unit of the sensing unit (101) includes a number of sensors (herein referred to as a sensor for single component) that are placed on the body part of the person.
  • the modular body-motion analysis system (100) enables the motion capture unit of the sensing unit (101) to extend up to the extended body or augmentations such as a bat, a hockey stick, a racket held by the person while being involved in a sporting activity i.e. the number of sensors are fitted on the extended body or augmentations held by the person.
  • the extension of the motion capture unit (101) up to the extended body held by the person enables a complete coverage for the modular body-motion analysis system (100) to collectively analyse the body motions exhibited by the person along-with the motions imparted on the extended body that is held by the person.
  • the number of sensors of the motion capture unit of the sensing unit (101) are coupled on the body part through attachments provided on a skintight clothing and are adapted to sense the movement from the orientation data associated with the person.
  • the number of sensors of the motion capture unit of the sensing unit (101) are Inertial Measurement Units (IMU’s).
  • the presentation unit (104) further includes a mobile device, a computer, or a personal digital assistant (PDA).
  • the processor of the processing unit (103) can be deployed in the cloud.
  • the processor of the processing unit (103) can be any or a combination of a microprocessor, a microcontroller, an Arduino Uno, an ATmega328, a Raspberry Pi or other similar processing unit, and the like.
  • the processor of the processing unit (103) can include one or more processors coupled with the database (109), the database (109) storing instructions executable by the one or more processors.
  • the processor of the processing unit (103) can be implemented as one or more microprocessors, microcomputers, microcontrollers, digital signal processors, central processing units, logic circuitries, and/or any devices that manipulate data based on operational instructions.
  • the database (109) can store one or more computer-readable instructions or routines, which may be fetched and executed to create or share the data units over a network service.
  • the database (109) can include any non-transitory storage device including, for example, volatile memory such as RAM, or non-volatile memory such as EPROM, flash memory, and the like.
  • the processor of the processing unit (103) can be implemented as a combination of hardware and programming (for example, programmable instructions) to implement one or more functionalities of the processor.
  • programming for the processor may be processor-executable instructions stored on a non-transitory machine-readable storage medium, and the hardware for the processor may include a processing resource (for example, one or more processors) to execute such instructions.
  • the machine-readable storage medium may store instructions that, when executed by the processing resource, implement the processor.
  • the processor can include the machine-readable storage medium storing the instructions and the processing resource to execute the instructions, or the machine-readable storage medium may be separate but accessible to the processor and the processing resource.
  • the processor may be implemented by an electronic circuitry.
  • the database (109) can include data that is either stored or generated as a result of functionalities implemented by any of the components of the processor.
  • the sensing unit (101) transmits the motion data (raw data) sensed by the motion capture unit of the sensing unit (101) to the processing unit (103).
  • the processing unit (103) receives, analyses and processes the motion data received from the motion capture unit of the sensing unit (101) to generate an output, which is overlaid onto an end user application.
  • the tablet of the presentation unit (104) displays the overlaid output generated in the processing unit (103).
  • the database (109) is adapted to store the sensed raw data received at the processing unit (103) and the intermediate data received from the pre-processing engine (105), the synchronization engine (106), the kinematic engine (107) and the analytics engine (108) in order to enable training of a machine learning model associated with the analytics engine (108) in an evolving manner to accommodate the biases of the various engines of the processing unit (103) and further improve the capabilities of the engines of the processing unit (103).
  • the machine learning model of the analytics engine (108) is trained in an evolving manner such that a potential solution for a range of non-optimal posture and body movement during any action is generated.
  • the sensed raw data is processed in the engines of the processing unit (103) as described herein below:
  • Pre-processing engine (105) filters the raw data received from the motion capture unit of the sensing unit (101).
  • the accelerometer, magnetometer, and gyroscope data captured by the motion capture unit is filtered into quaternions or Euler angles and directly used to provide the orientation of the various parts of the body.
  • a sensor fusion system is implemented to process the sensor data, for example as a software sensor fusion solution on the processing unit, through algorithms like AHRS filters (Madgwick or Mahony filters), Kalman filters, etc.
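To make the fusion step concrete, the following is a much-simplified sketch in Python, using a basic complementary filter as a stand-in for the Madgwick/Mahony or Kalman filters named above; all function and variable names here are illustrative, not from the disclosure:

```python
import math

def complementary_pitch(gyro_y, accel_x, accel_z, dt, pitch_prev, alpha=0.98):
    """Fuse gyroscope and accelerometer readings into a pitch estimate.

    The gyro integral supplies the fast-moving component; the
    accelerometer-derived tilt (from the gravity vector) supplies the
    slow drift correction. A production system would use a full AHRS
    or Kalman filter over all three axes.
    """
    pitch_gyro = pitch_prev + gyro_y * dt        # integrate angular rate
    pitch_accel = math.atan2(-accel_x, accel_z)  # tilt from gravity
    return alpha * pitch_gyro + (1.0 - alpha) * pitch_accel

# A stationary sensor lying flat (gravity along +z): even starting from a
# wrong estimate, the accelerometer term pulls the pitch back to zero.
pitch = 0.5  # deliberately wrong initial estimate (radians)
for _ in range(500):
    pitch = complementary_pitch(gyro_y=0.0, accel_x=0.0, accel_z=9.81,
                                dt=0.01, pitch_prev=pitch)
```

The `alpha` weight trades gyro responsiveness against accelerometer drift correction; 0.98 is a common illustrative choice, not a value from the disclosure.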
  • the pre-processing engine (105) performs filtering and further pose detection algorithms to perform segmentation of the human body from the video stream of the sensing unit (101) and identify the various body parts to estimate the three-dimensional spatial coordinates of these body parts.
  • Convolutional Neural Networks might be trained for this purpose, or available algorithms might be employed such as Detectron, OpenPose or AlphaPose.
  • Additional security measures are implemented on the hardware level to encrypt the data at the individual sensors of the motion capture unit of the sensing unit (101).
  • Synchronization Engine (106) a) The intermediate data available after pre-processing through the pre-processing engine (105) consists of orientation data of various body parts in case of an inertial motion capture unit, or position data of various body parts in case of an optical motion capture unit in the sensing unit (101). These individual data streams are time-synchronized in the synchronization engine (106) of the processing unit (103), which employs asynchronous modes of communication. b) The synchronization engine (106) generates a time-series data stream where these individual motion data streams are associated with timestamps and aggregated to create time-series data-frames.
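The aggregation of asynchronous per-sensor streams into time-series frames can be sketched as follows; a sample-and-hold scheme is assumed here purely for illustration (the disclosure does not specify the alignment strategy), and all names are invented:

```python
import bisect

def synchronize(streams, frame_rate_hz=100):
    """Aggregate asynchronously arriving per-sensor samples into
    time-series data frames.

    `streams` maps a sensor name to a time-sorted list of
    (timestamp, value) tuples. Each frame carries the latest reading
    from every sensor at or before the frame instant (sample-and-hold).
    """
    start = max(samples[0][0] for samples in streams.values())
    end = min(samples[-1][0] for samples in streams.values())
    dt = 1.0 / frame_rate_hz
    frames, t = [], start
    while t <= end:
        frame = {"t": round(t, 6)}
        for name, samples in streams.items():
            stamps = [ts for ts, _ in samples]
            i = bisect.bisect_right(stamps, t) - 1  # latest sample <= t
            frame[name] = samples[i][1]
        frames.append(frame)
        t += dt
    return frames

# Two sensors reporting at slightly different, unaligned times:
streams = {
    "wrist": [(0.000, 1), (0.013, 2), (0.021, 3)],
    "elbow": [(0.005, 10), (0.018, 11), (0.030, 12)],
}
frames = synchronize(streams, frame_rate_hz=100)
```

Frames are only emitted over the interval where every sensor has data, which is one simple way to handle streams that start and stop at different times.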
  • a digital rig is adapted to conform to the skeleton of the human.
  • the rig is made of cylinders or sticks joined to each other to mimic a skeleton of the body of the person and implement kinematics from the motion data to recreate the motion of the user on the human rig, thereby running four-dimensional simulations (space and time) on the human rig in a physics-based environment.
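The rig reconstruction described above can be illustrated with a deliberately reduced two-dimensional stick-figure sketch: each bone is a rigid segment whose absolute orientation would come from the synchronized sensor data, and joint positions are chained from a root. The actual system would use quaternions on a full 3-D skeleton; everything below is illustrative only:

```python
import math

def forward_kinematics(bone_lengths, segment_angles):
    """Place the joints of a planar stick-figure rig.

    `segment_angles` are absolute orientations of each bone (radians,
    measured from the x-axis); joints chain from a root at the origin.
    """
    joints = [(0.0, 0.0)]  # root joint
    for length, angle in zip(bone_lengths, segment_angles):
        x, y = joints[-1]
        joints.append((x + length * math.cos(angle),
                       y + length * math.sin(angle)))
    return joints

# A two-bone "arm": upper arm (0.3 m) pointing straight up, forearm
# (0.25 m) horizontal. The elbow lands at (0, 0.3), the wrist at (0.25, 0.3).
joints = forward_kinematics([0.3, 0.25], [math.pi / 2, 0.0])
```

Running such a chain at every time-series frame yields the moving skeleton on which the four-dimensional (space and time) simulation operates.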
  • a mathematical model of the user’s movement is obtained, from which a set of kinematic features is extracted, including biomechanical raw data like linear and angular positions, speeds, accelerations, forces, etc.
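The simplest of these kinematic features (speeds and accelerations from a position trace) can be derived by finite differences, as in this one-dimensional sketch; the sampling rate and motion profile here are invented for the example:

```python
def kinematic_features(positions, dt):
    """Derive speed and acceleration series from a sampled 1-D position
    trace by first differences — the most basic form of the
    'biomechanical raw data' a kinematic engine would extract.
    """
    speeds = [(b - a) / dt for a, b in zip(positions, positions[1:])]
    accels = [(b - a) / dt for a, b in zip(speeds, speeds[1:])]
    return speeds, accels

# Uniformly accelerated motion x = t^2 (i.e. a = 2 m/s^2), sampled at 100 Hz:
dt = 0.01
xs = [(i * dt) ** 2 for i in range(5)]
speeds, accels = kinematic_features(xs, dt)
```

The recovered accelerations are constant at 2 m/s², matching the analytic model; in practice the differentiated signal would also be low-pass filtered, since differencing amplifies sensor noise.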
  • the analytics engine (108) is configured to provide a context of the biomechanics associated with a sport or activity undertaken by the person to the kinematic features (204) that are obtained from the kinematic engine (107).
  • the analytics engine (108) includes a set of units for each such sport or activity, and each unit is configured with a set of algorithms that takes some kinematic features (204), identifies a number of key events (203) and combines these events and features during key activities (202) to provide contextual insights.
  • the positions and orientations of the various bones on the human rig are further used for the animation of a 3D model to present in the presentation unit (104), mirroring the actions of the user.
  • in case of an inertial motion capture unit being employed in the sensing unit (101), the user needs to undergo a calibration process to map the incoming orientation data with a standard pose (reference data) such as the T-pose, N-pose, etc., so that a reference is provided for the transformation of the quaternions as part of the projection.
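The quaternion re-mapping at the heart of such a calibration can be sketched for a single sensor as follows: an offset is computed once while the user holds the reference pose, then applied to every subsequent reading. Function names and the (w, x, y, z) convention are assumptions for the example:

```python
import math

def q_mul(a, b):
    """Hamilton product of two quaternions in (w, x, y, z) order."""
    aw, ax, ay, az = a
    bw, bx, by, bz = b
    return (aw*bw - ax*bx - ay*by - az*bz,
            aw*bx + ax*bw + ay*bz - az*by,
            aw*by - ax*bz + ay*bw + az*bx,
            aw*bz + ax*by - ay*bx + az*bw)

def q_conj(q):
    """Conjugate, which is the inverse for unit quaternions."""
    w, x, y, z = q
    return (w, -x, -y, -z)

def calibration_offset(q_measured_in_pose, q_reference_pose):
    """Offset mapping the sensor's reading, taken while the user holds
    the standard pose, onto the rig's reference orientation."""
    return q_mul(q_reference_pose, q_conj(q_measured_in_pose))

def apply_offset(offset, q_raw):
    """Project a raw sensor quaternion into the rig frame."""
    return q_mul(offset, q_raw)

# The sensor reads a 90-degree rotation about z while the user holds the
# reference pose (identity); after calibration that reading maps to identity.
half = math.sqrt(0.5)
offset = calibration_offset((half, 0.0, 0.0, half), (1.0, 0.0, 0.0, 0.0))
corrected = apply_offset(offset, (half, 0.0, 0.0, half))
```

In the full system one such offset would be maintained per sensor, and recomputed whenever recalibration (manual or automated) is triggered.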
  • the calibration process is a continuous process and needs to be repeated with the involvement of a user to recalibrate or through an automated system based on the posture of the user at a given instance.
  • a Markov Chain or Recurrent Neural Network based model can be used to predict the user’s future actions and the sensor values could be calibrated based on this model.
  • FIG. 2 illustrates an indicator diagram (200) to indicate a set of kinematic features (204) generated in the kinematic engine (107) of the processing unit (103) such that the kinematic features (204) are handled in the analytics engine (108).
  • the analytics engine (108) provides a context of the biomechanics associated with a sport or activity undertaken by the person to the kinematic features (204) that are obtained from the kinematic engine (107).
  • the analytics engine (108) includes a set of units for each such sport or activity, and each unit is configured with a set of algorithms that takes some kinematic features (204), identifies some key events (203) and combines these events and features during key activities (202) to provide contextual insights.
  • the kinematic features (204) obtained from the kinematic engine (107) are a time-series data stream (201).
  • c) The analytics engine (108) first identifies key activities (202) using segmentation algorithms from the appropriate units, for example, in case of a bowler in cricket, this would be a ball bowled or in case of a tennis player, this would be a stroke by the racket. d) The analytics engine (108) then runs event detection algorithms from the appropriate units to identify key events (203) within the key activity (202), for example, in case of a bowler in cricket, these would be the run-up, the pre-delivery and the follow-through or in case of a batsman in cricket, these would be the down-swing, the impact and the follow-through.
  • the analytics engine (108) then runs the biomechanical algorithms from the appropriate units to create biomechanical insights from the kinematic features (204) obtained from the kinematic engine (107) at these key events (203). f) The analytics engine (108) thereby runs summarization algorithms from the appropriate units to create session summaries of the various biomechanical insights generated for key activities over some time periods to track progress. g) The analytics engine (108) stores all the data comprehensively for each key activity in the database (109) to serve to the presentation unit (104).
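As an illustration of the segmentation step described above, here is a minimal threshold-based segmenter over a kinematic time-series. Real segmentation and event-detection algorithms would be sport-specific; the signal, threshold and minimum run length below are invented for the example:

```python
def segment_key_activities(speed, threshold=2.0, min_len=3):
    """Split a kinematic time-series into key activities: contiguous
    runs where the signal stays at or above a threshold (e.g. hand
    speed during a bowling delivery). Returns (start, end) index pairs,
    end exclusive; runs shorter than `min_len` samples are discarded.
    """
    activities, start = [], None
    for i, v in enumerate(speed):
        if v >= threshold and start is None:
            start = i
        elif v < threshold and start is not None:
            if i - start >= min_len:
                activities.append((start, i))
            start = None
    if start is not None and len(speed) - start >= min_len:
        activities.append((start, len(speed)))
    return activities

# A trace with two bursts of fast movement separated by near-rest:
speed = [0.1, 0.3, 2.5, 3.1, 2.8, 2.2, 0.4, 0.2, 2.1, 2.6, 2.4, 0.3]
activities = segment_key_activities(speed)
```

Each returned segment would then be handed to the event-detection and biomechanical algorithms, e.g. locating the peak-speed instant within the segment as a key event.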
  • the number of sensors of the motion capture unit of the sensing unit (101) are attached to a skin-tight clothing in order to provide an accurate motion data from the person’s body.
  • the number of sensors of the motion capture unit of the sensing unit (101) are attached to the body of the person and the extended body through a fitment.
  • the fitment is either directly mounted on a cavity designed over a clothing worn by the person or on the attachment means provided on a belt.
  • the belt is wrapped tightly around different portions of the body.
  • the fitment is tightly maintained to the skin in order to suppress or completely avoid a noise generated through intermediate cloth movement and an air friction leading to false data identification.
  • such noises lead to non-optimal sensor data, and filtration of such noises is therefore required; however, this filtration is a tricky and tedious task.
  • the attachment of the fitment of the present disclosure thus enables attachment of the number of sensors of the motion capture unit of the sensing unit (101) tightly to the body of the person and to the extended body held by the person that eliminates the noise generation.
  • the fitment for attaching the number of sensors of the motion capture unit of the sensing unit (101) is modular, which enables attachment of the number of sensors to the body of the person as well as to the extended body held by the person.
  • the number of sensors of the motion capture unit of the sensing unit (101) can be mechanical motion sensor or magnetic sensor.
  • the number of sensors of the motion capture unit of the sensing unit (101) are mountable or embedded under the skin or over the bone of the person.
  • the communication unit (102) enables the communication between the sensing unit (101) and the processing unit (103) through a wireless (ANT, Bluetooth, BLE, Wi-Fi, etc.) or wired mode (UART, SPI, I2C, etc.).
  • each sensor of the motion capture unit of the sensing unit (101) is coupled individually to a gateway to the processing unit (103) and sends data through the wired protocols or wireless communication protocols in predefined manner via the communication unit (102).
  • the number of sensors of the motion capture unit of the sensing unit (101) aggregate their respective sensed data at the communication unit (102) before sending the aggregated sensed data through the wired protocols or wireless communication protocols in predefined manner via the communication unit (102).
  • the communication from the sensors is synchronous in nature, in which the sender waits for a response from the receiver to continue further computation. Further, both the sender and the receiver coupled through the communication unit (102) should be in an active state.
  • the communication from the sensors is asynchronous in nature in which the sender does not wait for a response from the receiver.
  • the receiver can be inactive and once the receiver is active, then the sensed data is received and processed accordingly.
  • the sender puts the sensed data in a message queue in the communication unit (102) and does not require an immediate response to continue processing.
  • the sensed data can be sent either at a very high rate in small packets or at a very low rate in large packets.
  • the data transmission rate determines the latency of the motion-capture playback capability.
  • FIG. 3 illustrates a flowchart of a method (300) to monitor motions associated with the body part of a sportsperson by a coach through analysis of the motion data captured by a motion capture unit of a sensing unit (101) of the modular body-motion analysis system (100) and provide actionable feedback in real-time.
  • the method (300) is able to prevent a potential injury or improve efficiency of movement of body parts.
  • the method (300) includes the following steps: placing (302) the sensing unit (101) on the specified body parts of the person to sense motion data associated with the person; calibrating (304) a kinematic engine (107) by having the person stand in a standard pose; performing (306) a regular real-world sporting action during a training session; generating (308) real-time feedback of biomechanical insights for the coach from a number of key activities (202) associated with a sport; and presenting (310) feedback, based on the generated biomechanical insights, on a presentation unit (104) to improve the performance of the sporting activity executed by the sportsperson.
  • the motion capture unit (101) is configured to transmit biomechanical insights to a presentation unit (104) coupled to the processing unit (103).
  • the method (300) is further configured to train a machine learning model in an evolving manner to generate a potential resolution for a range of non-optimal postures and body movements during a sporting action by use of the stored data associated with the database (109).
  • the processing unit (103) further comprises: a pre-processing engine (105) configured to perform a set of pre-processing algorithms to clean the motion data from the communication unit (102) and bring them to a standardized form; a synchronization engine (106) coupled to the pre-processing engine (105) and configured to enable motion data from the different sensor units in the sensing unit (101) to be synchronised and generate aggregated time-series motion data frames with individual motion data from the full body at every specific instant; a kinematic engine (107) coupled to the synchronization engine (106) and configured to use the time-series motion data frames and run four-dimensional simulations (space and time) on a human rig in a physics-based environment to generate the kinematic features of the entire motion, the kinematic features including biomechanical raw data like linear and angular positions, speeds, accelerations, forces, etc. of various points of the body.
  • an analytics engine (108) coupled to the kinematic engine (107) and configured for each sport and activity to run the biomechanical algorithms involved in the particular sport or activity.
  • the analytics engine (108) is further configured to identify key activities (202) from a kinematic time-series data stream (201) using segmentation algorithms; identify key events (203) within the key activities (202) using event detection algorithms; generate biomechanical insights from the kinematic features (204) obtained from the kinematic engine (107) at these key events (203) using biomechanical algorithms; and summarize the biomechanical insights generated for the key activities (202) over time periods to track progress.
  • the device (400) includes a sensing unit (101) configured to sense motion data associated with the body part; a processing unit (103) coupled to the sensing unit (101) and configured to process the motion data, the processing unit (103) further including: a kinematic engine (107) and an analytics engine (108); and the kinematic engine (107) is configured to simulate the motion data on a digital rig to obtain the person’s movement for extracting a plurality of kinematic features; and the analytics engine (108) is configured to process the extracted kinematic features to provide a context of the biomechanics associated with the person’s movement.
  • the wearable device (400) further includes: a database (109) coupled to the processing unit (103) and configured to store the motion data received from the sensing unit (101) and an intermediate output associated with the processing unit (103).
  • the data stored in the database (109) enables training of a machine learning model associated with the analytics engine (108) in an evolving manner to generate a potential resolution for a range of body movements performed by the person.
  • a number of sensors of the sensing unit (101) are adapted to be placed on the body part of the person as well as on the extended body/augmentation held by the person.
  • The biomechanics of the sport are easily detected, which assists the person in taking accurate sporting actions while playing.
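
The asynchronous communication mode summarized above (a sender places sensed data in a message queue and continues without waiting, while the receiver may be inactive and drains the queue once active) can be sketched in a few lines. The following Python sketch is illustrative only; class and variable names are not from the disclosure.

```python
import queue
import time

class SensorSender:
    """Illustrative asynchronous sender: puts sensed data in a message
    queue and continues without waiting for the receiver's response."""
    def __init__(self, q: queue.Queue):
        self.q = q

    def send(self, sensor_id: int, sample: tuple) -> None:
        # Non-blocking enqueue: the sender does not wait for the receiver.
        self.q.put((sensor_id, time.monotonic(), sample))

def drain(q: queue.Queue) -> list:
    """Receiver side: may be inactive while data accumulates; once it
    becomes active, it processes everything queued so far."""
    items = []
    while not q.empty():
        items.append(q.get())
    return items

q = queue.Queue()
sender = SensorSender(q)
for _ in range(5):
    sender.send(sensor_id=1, sample=(0.0, 0.0, 9.8))   # e.g. accelerometer
received = drain(q)
print(len(received))  # → 5: all queued samples arrive once the receiver is active
```

Sending many small packets keeps the queue short (low playback latency), while batching into large packets trades latency for fewer transmissions, matching the rate/latency trade-off noted above.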


Abstract

Disclosed is a modular body-motion analysis system (100) including: a sensing unit (101) configured to sense motion data associated with the body part; a processing unit (103) coupled to the sensing unit (101) and configured to process the motion data, the processing unit (103) further includes: a kinematic engine (107) and an analytics engine (108); such that the kinematic engine (107) is configured to simulate the motion data on a digital rig to obtain the person's movement for extracting a plurality of kinematic features; and the analytics engine (108) is configured to process the extracted kinematic features to provide a context of the biomechanics associated with the person's movement; and a presentation unit (104) coupled to the processing unit (103) and configured to output the motion data processed by the processing unit (103).

Description

MODULAR BODY-MOTION ANALYSIS SYSTEM, DEVICE AND METHOD THEREOF
TECHNICAL FIELD
The present disclosure relates to the monitoring of body motions. More particularly, the present disclosure relates to a modular and contextual body-motion analysis system, device and method for providing a comprehensive analysis pertaining to the biomechanics of any sport or activity being performed.
BACKGROUND
Conventionally, it has been desired to analyse body motions for a range of purposes. A few of these applications include sports training, ortho diagnostics, gaming, animation, virtual reality/mixed reality/augmented reality, filmmaking, research on the body's movements/ergonomics, human-computer interaction and robotics.

A few methods for motion capture include optical and inertial motion capture systems. Optical motion capture systems require one or more cameras that might track a number of markers placed on appropriate points on the user's body, or deduce the pose through the image feed without the need of any markers. Such systems typically require complex set-up and calibration and are usually used in laboratories. On the other hand, inertial motion capture systems involve placing inertial measurement units (IMUs) on appropriate parts of the user's body and reconstructing the motion through the sensor feed. The prior-art methods of placement of IMUs are not efficient, and the IMUs so placed may get dislocated from their designated positions. The dislocation of IMUs results in inaccurate capturing of body motion, which impedes the purpose of the body-motion analysis systems. Such inertial systems might not have the high precision of the optical motion capture systems but are usable outside of laboratories.

Some systems for sports training available today are specialised for a particular activity or an analysis data point. They capture only the movement of particular body parts, or the movement of the bat, racket or just the arm, which is not scalable across activities or sports. Some systems use a single GPS sensor to pinpoint the location of the body globally and perform spatial analysis using the location to calculate speed, reach or other spatial parameters that are general across sports or activities. Motion capture systems still require manual intervention to provide the context of the biomechanics of the sport or activity.
Therefore, the player, coach, analyst or physiotherapist using the system has to enlist the aid of a technical expert to use the system effectively in the training process.
There is therefore a need in the art to provide a system and method to solve the problems as mentioned hereinabove.
SUMMARY
In view of the foregoing, provided is a modular body-motion analysis system for monitoring and analysing the movement of a body part of a person during a real-world action, the modular body-motion analysis system including: a sensing unit configured to sense motion data associated with the body part; a communication unit coupled to the sensing unit and configured to transmit the motion data sensed by the sensing unit to a processing unit; the processing unit coupled to the sensing unit and configured to process the motion data, the processing unit further including: a kinematic engine and an analytics engine; such that the kinematic engine is configured to simulate the motion data on a digital rig to obtain the person’s movement for extracting a plurality of kinematic features; and the analytics engine is configured to process the extracted kinematic features to provide a context of the biomechanics associated with the person’s movement; and a presentation unit coupled to the processing unit and configured to output the motion data processed by the processing unit. The modular body-motion analysis system further includes: a database coupled to the processing unit and configured to store the motion data received from the sensing unit and an intermediate output associated with the processing unit. The data stored in the database enables training of a machine learning model associated with the analytics engine in an evolving manner to generate a potential resolution for a range of body movements performed by the person. A number of sensors of the sensing unit are adapted to be placed on the body part of the person as well as on the extended body/augmentation held by the person.
The processing unit further including: a pre-processing engine configured to perform a set of pre-processing algorithms to decompress and segregate the motion data from the communication unit to standardize the motion data; a synchronization engine coupled to the pre-processing engine and configured to enable the motion data from the number of sensors of the sensing unit to be synchronised and generate aggregated time-series motion data frames with motion data from the body part at every specific instant; and the kinematic engine coupled to the synchronization engine and configured to use the time-series motion data frames and run four-dimensional simulations (space and time) on a human rig in a physics-based environment to generate the kinematic features of the entire motion, the kinematic features including biomechanical raw data like linear and angular positions, speeds, accelerations, forces, etc. of various points of the body. The analytics engine is further configured to identify a key activity from a kinematic time-series data stream using segmentation algorithms and a plurality of key events within the key activity using event detection algorithms to generate biomechanical insight from the kinematic features obtained from the kinematic engine at the identified key events by using biomechanical algorithms; and summarizing the generated biomechanical insights generated for the key activities over time periods to track progress.
In an aspect, provided is a method for analysing movement associated with a body part of a person, the method including: placing the sensing unit on the body parts of the person to sense motion data associated with the person; calibrating a kinematic engine by having the person stand in a standard pose; performing a regular real-world sporting action during a training session; generating real-time feedback of biomechanical insights from a key activity associated with a sport; and presenting feedback based on the generated biomechanical insights on a presentation unit. The method further includes training a machine learning model associated with an analytics engine in an evolving manner to generate a potential resolution for a range of body movements performed by the person by using the data stored in a database. The processing unit further includes: a pre-processing engine configured to perform a set of pre-processing algorithms to clean the motion data from the communication unit and bring them to a standardized form; a synchronization engine coupled to the pre-processing engine and configured to enable the motion data from the different sensor units in the sensing unit to be synchronised and generate aggregated time-series motion data frames with individual motion data from the full body at every specific instant; a kinematic engine coupled to the synchronization engine and configured to use time-series motion data frames and run four-dimensional simulations (space and time) on a human rig in a physics-based environment to generate kinematic features of the body movement performed by the person, the kinematic features including biomechanical raw data like linear and angular positions, speeds, accelerations, forces, etc. of various points of the body; and an analytics engine coupled to the kinematic engine and configured to process the extracted kinematic features to provide a context of the biomechanics associated with the person’s movement.
The analytics engine further includes: identification of key activities from a kinematic time-series data stream using segmentation algorithms; identification of key events within the key activity using event detection algorithms; biomechanical insight generation from the kinematic features obtained from the kinematic engine at these key events using biomechanical algorithms; and summarization of the various biomechanical insights generated for key activities over some time periods to track progress.
In another aspect, provided is a wearable device for analysing motion associated with a body part of a person, the device including: a sensing unit configured to sense motion data associated with the body part; a processing unit coupled to the sensing unit and configured to process the motion data, the processing unit further including: a kinematic engine and an analytics engine; and the kinematic engine is configured to simulate the motion data on a digital rig to obtain the person’s movement for extracting a plurality of kinematic features; and the analytics engine is configured to process the extracted kinematic features to provide a context of the biomechanics associated with the person’s movement. The wearable device further includes: a database coupled to the processing unit and configured to store the motion data received from the sensing unit and an intermediate output associated with the processing unit. The data stored in the database enables training of a machine learning model associated with the analytics engine in an evolving manner to generate a potential resolution for a range of body movements performed by the person. A number of sensors of the sensing unit are adapted to be placed on the body part of the person as well as on the extended body/augmentation held by the person.
BRIEF DESCRIPTION OF THE DRAWINGS
Other objects, features, and advantages of the aspect will be apparent from the following description when read with reference to the accompanying drawings. In the drawings, wherein like reference numerals denote corresponding parts throughout the several views:
The diagrams are for illustration only, which thus is not a limitation of the present disclosure, and wherein:
FIG. 1 illustrates a schematic view of the modular body-motion analysis system for monitoring motion associated with the body part of a person and deducing improper movement of the body part of the person, according to an embodiment herein;
FIG. 2 illustrates an indicator diagram indicating how the kinematic features generated in a kinematic engine are handled in the analytics engine, according to an embodiment herein;
FIG. 3 illustrates a flowchart of a method for a coach to monitor body motions of a sportsperson through analysis of motion data by the body-motion analysis system and provide actionable feedback in real-time, according to an embodiment herein; and
FIG. 4 illustrates a wearable device for analysing motion associated with a body part of a person, according to an embodiment herein.
To facilitate understanding, like reference numerals have been used, where possible to designate like elements common to the figures.
DETAILED DESCRIPTION
The aspects herein and the various features and advantageous details thereof are explained more fully with reference to the non-limiting aspects that are illustrated in the accompanying drawings and detailed in the following description. Descriptions of well-known components and processing techniques are omitted so as to not unnecessarily obscure the aspects herein. The examples used herein are intended merely to facilitate an understanding of ways in which the aspects herein may be practiced and to further enable those of skill in the art to practice the aspects herein. Accordingly, the examples should not be construed as limiting the scope of the aspects herein.
The term “initializing” indicates a pre-processing of the raw data by the raw data unit. The pre-processing includes filtering, fusion for sensor fused orientation data, synchronization and encryption.
The terms “rig”, “human rig” or “digital rig” indicate a digital model (simulation) designed to mimic the skeleton of a person's body.
The terms “raw data” and “sensed data” indicate the information relating to the movement and orientation of a body part of a person, sensed by the sensors.
The terms “various engines” or “engines” as used herein in the description, when associated with a unit of the system of the present disclosure, indicate all the sub-components of the layers of the system of the present disclosure for which the term is used.
As mentioned, there remains a need for the development of a system to monitor and analyse the movement of the body during a real-world action, to allow an in-depth analysis as per the context of the biomechanics of the sport or activity in action, for the deduction of improper movement of various body parts and the effects of such action on other body parts that can hamper performance or potentially lead to an injury. The aspect herein overcomes the limitations of the prior art by providing a system and method for recording body motion along with the extended body or augmentations, such as a bat, a racket or a hockey stick held by the person, as needed in an activity on the field or a real-world sporting action. The system and method as per the present disclosure perform an automated and detailed biomechanical analysis in the context of the activity or sport being conducted and provide a real-time comprehensive analysis of the body and the augmentations in an accessible and user-friendly form, without a need for manual intervention.
FIG. 1 illustrates a schematic view of a modular body-motion analysis system (100) for monitoring the motion associated with the body part of a person and deducing improper movement of the body part of the person. The modular body-motion analysis system (100) includes a sensing unit (101), a communication unit (102), a processing unit (103), a presentation unit (104), and a database (109).
The sensing unit (101) further includes a motion capture unit (not shown). The processing unit (103) further includes a processor. The presentation unit (104) further includes a tablet.
The processor of the processing unit (103) further includes a pre-processing engine (105), a synchronization engine (106), a kinematic engine (107), and an analytics engine (108).
The database (109) is coupled to the processor of the processing unit (103). The motion capture unit of the sensing unit (101) is coupled to the pre-processing engine (105) of the processing unit (103) through the communication unit (102). The pre- processing engine (105) of the processing unit (103) is coupled to the synchronization engine (106). The synchronization engine (106) is coupled to the kinematic engine (107). The kinematic engine (107) is coupled to the analytics engine (108). The processing unit (103) is coupled to the tablet of the presentation unit (104).
The motion capture unit of the sensing unit (101) includes a number of sensors (herein referred to as a sensor for a single component) that are placed on the body part of the person. The modular body-motion analysis system (100) enables the motion capture unit of the sensing unit (101) to extend up to the extended body or augmentations such as a bat, a hockey stick, or a racket held by the person while being involved in a sporting activity, i.e., the number of sensors are fitted on the extended body or augmentations held by the person. The extension of the motion capture unit (101) up to the extended body held by the person enables a complete coverage for the modular body-motion analysis system (100) to collectively analyse the body motions exhibited by the person along with the motions imparted on the extended body that is held by the person.
In an embodiment, the number of sensors of the motion capture unit of the sensing unit (101) are coupled on the body part through attachments provided on a skintight clothing and are adapted to sense the movement from the orientation data associated with the person.
In another embodiment, the number of sensors of the motion capture unit of the sensing unit (101) are Inertial Measurement Units (IMUs).
In another embodiment, the presentation unit (104) further includes a mobile device, a computer, or a personal digital assistant (PDA).
In another embodiment, the processor of the processing unit (103) can be deployed in a cloud environment.
In another embodiment, the processor of the processing unit (103) can be any or a combination of a microprocessor, a microcontroller, an Arduino Uno, an ATmega328, a Raspberry Pi, or other similar processing units, and the like. In yet another embodiment, the processor of the processing unit (103) can include one or more processors coupled with the database (109), the database (109) storing instructions executable by the one or more processors.
In an embodiment, the processor of the processing unit (103) can be implemented as one or more microprocessors, microcomputers, microcontrollers, digital signal processors, central processing units, logic circuitries, and/or any devices that manipulate data based on operational instructions. The database (109) can store one or more computer-readable instructions or routines, which may be fetched and executed to create or share the data units over a network service. The database (109) can include any non-transitory storage device including, for example, volatile memory such as RAM, or non-volatile memory such as EPROM, flash memory, and the like.
In another embodiment, the processor of the processing unit (103) can be implemented as a combination of hardware and programming (for example, programmable instructions) to implement one or more functionalities of the processor. In examples described herein, such combinations of hardware and programming may be implemented in several different ways. For example, the programming for the processor may be processor-executable instructions stored on a non-transitory machine-readable storage medium, and the hardware for the processor may include a processing resource (for example, one or more processors) to execute such instructions. In the present examples, the machine-readable storage medium may store instructions that, when executed by the processing resource, implement the processor. In such examples, the processor can include the machine-readable storage medium storing the instructions and the processing resource to execute the instructions, or the machine-readable storage medium may be separate but accessible to the processor and the processing resource. In other examples, the processor may be implemented by electronic circuitry. The database (109) can include data that is either stored or generated as a result of functionalities implemented by any of the components of the processor.

The sensing unit (101) transmits the motion data (raw data) sensed by the motion capture unit of the sensing unit (101) to the processing unit (103). The processing unit (103) receives, analyses and processes the motion data received from the motion capture unit of the sensing unit (101) to generate an output, which is overlaid onto an end-user application. The tablet of the presentation unit (104) displays the overlaid output generated in the processing unit (103).
The database (109) is adapted to store the sensed raw data received at the processing unit (103) and the intermediate data received from the pre-processing engine (105), the synchronization engine (106), the kinematic engine (107) and the analytics engine (108) in order to enable training of a machine learning model associated with the analytics engine (108) in an evolving manner to accommodate the biases of the various engines of the processing unit (103) and further improve the capabilities of the engines of the processing unit (103). The machine learning model of the analytics engine (108) is trained in an evolving manner such that a potential solution for a range of non-optimal posture and body movement during any action is generated.
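
The "evolving" training described above is not specified further in the disclosure. As a minimal stand-in, the following Python sketch maintains an online estimate of a kinematic feature across stored sessions, updating incrementally as each new data point arrives; the class name, the feature (elbow angle) and the values are illustrative assumptions, not from the disclosure.

```python
class EvolvingBaseline:
    """Minimal sketch of evolving training: an online (incremental)
    estimate of a kinematic feature's mean and variance, updated as new
    sessions accumulate in the database. A stand-in for the disclosure's
    unspecified machine learning model, using Welford's algorithm."""
    def __init__(self):
        self.n = 0
        self.mean = 0.0
        self.m2 = 0.0   # running sum of squared deviations

    def update(self, x: float) -> None:
        self.n += 1
        delta = x - self.mean
        self.mean += delta / self.n
        self.m2 += delta * (x - self.mean)

    def variance(self) -> float:
        return self.m2 / (self.n - 1) if self.n > 1 else 0.0

baseline = EvolvingBaseline()
for elbow_angle in [148.0, 151.0, 150.0, 149.0]:   # degrees, per key activity
    baseline.update(elbow_angle)
print(round(baseline.mean, 1))   # → 149.5
```

A production system would replace this running statistic with a model retrained on the stored raw and intermediate data, but the incremental-update pattern is the same.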
The sensed raw data is processed in the engines of the processing unit (103) as described herein below:
Pre-processing engine (105):
a) The pre-processing engine (105) filters the raw data received from the motion capture unit of the sensing unit (101). The accelerometer, magnetometer, and gyroscope data captured by the motion capture unit is filtered into quaternions or Euler angles and directly used to provide the orientation of the various parts of the body.
b) A sensor fusion system is implemented to process the sensor data in the following manner:
a. A software sensor fusion solution on the processing unit, through algorithms like AHRS filters (Madgwick or Mahony filters), Kalman filters, etc.
b. Using smarter orientation measurement units with a coprocessor performing sensor fusion in the motion capture unit of the sensing unit (101) itself, providing sensor-fused orientation data in the form of quaternions or Euler angles. By integrating the sensors and sensor fusion in a single device, this approach avoids complex multi-vendor solutions, making the process easier.
c) Aside from the sensor fusion, further filtering is also implemented to clean the incoming data.
d) If the data is being aggregated and compressed at the communication unit (102) before reaching the processing unit (103), then the data must be decompressed and segregated (cleaned) to mark the individual sensors of the motion capture unit of the sensing unit (101) that were the source of the various quaternions or Euler angles.
e) For an optical motion capture unit, the pre-processing engine (105) performs filtering and further pose detection algorithms to perform segmentation of the human body from the video stream of the sensing unit (101) and identify the various body parts to estimate the three-dimensional spatial coordinates of these body parts. Convolutional Neural Networks might be trained for the same, or available algorithms such as Detectron, OpenPose or AlphaPose might be employed.
f) Additional security measures are implemented on the hardware level to encrypt the data at the individual sensors of the motion capture unit of the sensing unit (101).
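
The sensor fusion options in item b) can be illustrated with a deliberately simplified one-axis complementary filter. This is a minimal stand-in for the Madgwick, Mahony or Kalman filters named above, not an implementation of them; the rates, sample values and parameter names are illustrative assumptions.

```python
import math

def complementary_filter(gyro_rates, accel_samples, dt=0.01, alpha=0.98):
    """Minimal 1-axis sensor-fusion sketch: the gyroscope integral tracks
    fast motion while the accelerometer's gravity vector corrects
    long-term drift. Returns the fused pitch estimate per sample."""
    pitch = 0.0
    history = []
    for omega, (ax, az) in zip(gyro_rates, accel_samples):
        pitch_gyro = pitch + omega * dt           # integrate angular rate
        pitch_accel = math.atan2(ax, az)          # gravity-referenced angle
        pitch = alpha * pitch_gyro + (1 - alpha) * pitch_accel
        history.append(pitch)
    return history

# Stationary sensor with a biased (drifting) gyro: the fused estimate
# stays bounded instead of accumulating the full gyro drift.
est = complementary_filter(
    gyro_rates=[0.05] * 200,                      # rad/s constant bias
    accel_samples=[(0.0, 9.8)] * 200)
print(abs(est[-1]) < 0.05 * 0.01 * 200)           # → True: fused drift < raw integral
```

The disclosure's filters work the same way in principle but operate on full 3D orientations (quaternions) and weight the correction adaptively.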
Synchronization Engine (106):
a) The intermediate data obtained after pre-processing through the pre-processing engine (105) consists of orientation data of various body parts in the case of an inertial motion capture unit, or position data of various body parts in the case of an optical motion capture unit in the sensing unit (101). These individual data streams are time-synchronized in the synchronization engine (106) of the processing unit (103), which employs asynchronous modes of communication.
b) The synchronization engine (106) generates a time-series data stream in which these individual motion data streams are associated with timestamps and aggregated to create time-series data-frames.
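
A minimal sketch of the aggregation step, assuming per-sensor streams of (timestamp, quaternion) samples resampled onto a common frame clock; the sensor names, rates and the nearest-sample policy are illustrative assumptions, not from the disclosure.

```python
def aggregate_frames(streams, frame_rate=100.0):
    """Sketch of the synchronization step: per-sensor (timestamp, sample)
    streams are resampled onto a common clock and aggregated into
    time-series frames, one entry per sensor per instant."""
    start = max(s[0][0] for s in streams.values())    # latest first sample
    end = min(s[-1][0] for s in streams.values())     # earliest last sample
    period = 1.0 / frame_rate
    frames = []
    t = start
    while t <= end:
        frame = {"t": round(t, 6)}
        for sensor_id, samples in streams.items():
            # pick the sample whose timestamp is closest to the frame time
            frame[sensor_id] = min(samples, key=lambda s: abs(s[0] - t))[1]
        frames.append(frame)
        t += period
    return frames

streams = {
    "pelvis":  [(0.000, (1, 0, 0, 0)), (0.010, (1, 0, 0, 0)), (0.020, (1, 0, 0, 0))],
    "forearm": [(0.005, (0, 1, 0, 0)), (0.015, (0, 1, 0, 0)), (0.025, (0, 1, 0, 0))],
}
frames = aggregate_frames(streams, frame_rate=100.0)
print(len(frames))   # → 2 frames over the overlapping interval
```

Only the interval covered by every sensor is emitted, so each frame carries one sample per body part at every instant, as required by the kinematic engine downstream.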
Kinematic Engine (107):
a) In the kinematic engine (107), a digital rig is adapted to conform to the skeleton of the human. The rig is made of cylinders or sticks joined to each other to mimic the skeleton of the body of the person, and kinematics are implemented from the motion data to recreate the motion of the user on the human rig, thereby running four-dimensional simulations (space and time) on the human rig in a physics-based environment. By running these four-dimensional simulations in a physics-based environment, a mathematical model of the user’s movement is obtained, from which a set of kinematic features including biomechanical raw data like linear and angular positions, speeds, accelerations, forces, etc. of various points of the body can be extracted for further analysis in the analytics engine (108).
b) The positions and orientations of the various bones on the human rig are further used for the animation of a 3D model presented in the presentation unit (104), mirroring the actions of the user.
c) In case of an inertial motion capture unit being employed in the sensing unit (101), the user needs to undergo a calibration process to map the incoming orientation data with a standard pose (reference data) such as the T-pose, N-pose, etc., so that a reference is provided for the transformation of the quaternions as part of the projection.
d) The calibration process is a continuous process and needs to be repeated, either with the involvement of the user to recalibrate or through an automated system based on the posture of the user at a given instance. As the motion of a person is a temporal sequence in which upcoming movements can be predicted from the movements performed before, a Markov Chain or Recurrent Neural Network based model can be used to predict the user’s future actions, and the sensor values could be calibrated based on this model.
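
The rig simulation above amounts, at its core, to forward kinematics: bone orientations from the sensors are applied to segments of known length to recover joint positions. A minimal sketch follows, assuming unit quaternions in (w, x, y, z) order; the bone names and lengths are illustrative, and the disclosure's physics-based environment is far richer than this.

```python
def rotate(q, v):
    """Rotate vector v by unit quaternion q = (w, x, y, z) using the
    expanded form of q * v * q_conjugate for a pure-vector quaternion."""
    w, x, y, z = q
    vx, vy, vz = v
    tx = 2 * (y * vz - z * vy)
    ty = 2 * (z * vx - x * vz)
    tz = 2 * (x * vy - y * vx)
    return (vx + w * tx + y * tz - z * ty,
            vy + w * ty + z * tx - x * tz,
            vz + w * tz + x * ty - y * tx)

def forward_kinematics(segments, orientations):
    """Minimal stick-rig sketch: each bone is a stick of known length
    whose world orientation comes from its sensor; joint positions are
    accumulated along the chain from a root at the origin."""
    pos = (0.0, 0.0, 0.0)
    joints = {"root": pos}
    for (name, length), q in zip(segments, orientations):
        tip = rotate(q, (0.0, 0.0, length))   # bone axis is local z
        pos = tuple(p + d for p, d in zip(pos, tip))
        joints[name] = pos
    return joints

# Upper arm then forearm, both pointing along z (identity quaternions).
segments = [("upper_arm", 0.30), ("forearm", 0.25)]
identity = (1.0, 0.0, 0.0, 0.0)
joints = forward_kinematics(segments, [identity, identity])
print(joints["forearm"])   # forearm tip: both bone lengths stacked along z
```

Differentiating such joint positions over the time-series frames is what yields the speeds, accelerations and forces listed among the kinematic features.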
FIG. 2 illustrates an indicator diagram (200) to indicate a set of kinematic features (204) generated in the kinematic engine (107) of the processing unit (103) such that the kinematic features (204) are handled in the analytics engine (108).
Analytics Engine (108): a) The analytics engine (108) provides a context of the biomechanics associated with a sport or activity undertaken by the person to the kinematic features (204) obtained from the kinematic engine (107). The analytics engine (108) includes a set of units for each such sport or activity, and each unit is configured with a set of algorithms that takes some kinematic features (204), identifies key events (203) and combines these events and features during key activities (202) to provide contextual insights. b) The kinematic features (204) obtained from the kinematic engine (107) form a time-series data stream (201). The analytics engine (108) first identifies key activities (202) using segmentation algorithms from the appropriate units; for example, for a bowler in cricket this would be a ball bowled, or for a tennis player a stroke of the racket. c) The analytics engine (108) then runs event detection algorithms from the appropriate units to identify key events (203) within the key activity (202); for example, for a bowler in cricket these would be the run-up, the pre-delivery and the follow-through, or for a batsman in cricket the down-swing, the impact and the follow-through. d) The analytics engine (108) then runs the biomechanical algorithms from the appropriate units to create biomechanical insights from the kinematic features (204) obtained from the kinematic engine (107) at these key events (203). e) The analytics engine (108) then runs summarization algorithms from the appropriate units to create session summaries of the various biomechanical insights generated for key activities over time periods to track progress. f) The analytics engine (108) stores all the data comprehensively for each key activity in the database (109) to serve to the presentation unit (104).
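The segmentation, event-detection and summarization steps described above can be sketched as a toy pipeline. Everything below — the speed threshold, the peak-speed "event", the feature names — is a hypothetical stand-in for the sport-specific algorithms, which the disclosure leaves unspecified:

```python
# Toy analytics pipeline: segment a kinematic time series into key
# activities, detect an illustrative key event inside each, summarize.
# Threshold and feature names are hypothetical placeholders.

def segment_activities(stream, threshold=5.0):
    """Group consecutive high-speed frames into 'key activities'
    (e.g. a ball bowled, a racket stroke)."""
    activities, current = [], []
    for frame in stream:
        if frame["speed"] > threshold:
            current.append(frame)
        elif current:
            activities.append(current)
            current = []
    if current:
        activities.append(current)
    return activities

def detect_events(activity):
    """Tag the peak-speed frame as an illustrative 'key event'."""
    return {"peak": max(activity, key=lambda f: f["speed"])}

def summarize(activities):
    """Session summary: one insight (peak speed) per key activity."""
    return [detect_events(a)["peak"]["speed"] for a in activities]

stream = [{"t": i, "speed": s} for i, s in
          enumerate([0, 2, 8, 12, 7, 1, 0, 3, 9, 15, 6, 0])]
print(summarize(segment_activities(stream)))  # → [12, 15]
```

Two bursts of movement in the sample stream yield two key activities, and the summary reports one peak-speed insight per activity, mirroring the per-activity storage described for the database (109).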
In an embodiment, the number of sensors of the motion capture unit of the sensing unit (101) are attached to a skin-tight clothing in order to provide an accurate motion data from the person’s body.
In another embodiment, the number of sensors of the motion capture unit of the sensing unit (101) are attached to the body of the person and to the extended body through a fitment. The fitment is either directly mounted in a cavity designed over the clothing worn by the person or on the attachment means provided on a belt. The belt is wrapped tightly around different portions of the body. The fitment is held tightly against the skin in order to suppress or completely avoid noise generated through intermediate cloth movement and air friction, which leads to false data identification. Such noise leads to non-optimal sensor data that would otherwise need to be filtered out, which is a tricky and tedious task. The fitment of the present disclosure thus enables the number of sensors of the motion capture unit of the sensing unit (101) to be attached tightly to the body of the person and to the extended body held by the person, which eliminates the noise generation.
In another embodiment, the fitment for attaching the number of sensors of the motion capture unit of the sensing unit (101) is modular, enabling attachment of the number of sensors to the body of the person as well as to the extended body held by the person.
In another embodiment, the number of sensors of the motion capture unit of the sensing unit (101) can be mechanical motion sensors or magnetic sensors.
In another embodiment, the number of sensors of the motion capture unit of the sensing unit (101) are mountable or embedded under the skin or over the bone of the person.
In an embodiment, the communication unit (102) enables the communication between the sensing unit (101) and the processing unit (103) through a wireless (ANT, Bluetooth, BLE, Wi-Fi, etc.) or wired mode (UART, SPI, I2C, etc.).
In an embodiment, each sensor of the motion capture unit of the sensing unit (101) is coupled individually to a gateway to the processing unit (103) and sends data through the wired or wireless communication protocols in a predefined manner via the communication unit (102).
In another embodiment, the number of sensors of the motion capture unit of the sensing unit (101) aggregate their respective sensed data at the communication unit (102) before the aggregated sensed data is sent onward through the wired or wireless communication protocols in a predefined manner.
In another embodiment, the communication from the sensors is synchronous in nature, in which the sender waits for a response from the receiver before continuing further computation. Further, both the sender and the receiver coupled through the communication unit (102) must be in an active state.
In another embodiment, the communication from the sensors is asynchronous in nature in which the sender does not wait for a response from the receiver. The receiver can be inactive and once the receiver is active, then the sensed data is received and processed accordingly. The sender puts the sensed data in a message queue in the communication unit (102) and does not require an immediate response to continue processing.
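The asynchronous mode described above — the sender enqueues and continues, the receiver drains the queue whenever it becomes active — can be sketched with a standard in-memory queue; the names below are illustrative:

```python
# Sketch of the asynchronous sensor-to-processor path: the sensor side
# enqueues packets without awaiting a response, and the receiver
# processes whatever accumulated while it was inactive.

import queue

packet_queue = queue.Queue()

def sensor_send(packet):
    """Sender: enqueue and return immediately (no response awaited)."""
    packet_queue.put(packet)

def receiver_drain():
    """Receiver: drain everything queued while it was inactive."""
    received = []
    while True:
        try:
            received.append(packet_queue.get_nowait())
        except queue.Empty:
            return received

# The sender keeps producing even though no receiver is running yet.
for i in range(3):
    sensor_send({"seq": i, "quat": (1.0, 0.0, 0.0, 0.0)})

print(len(receiver_drain()))  # → 3
```

The queue decouples the two ends exactly as the embodiment requires: the sender never blocks, and packets survive until the receiver wakes up.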
In another embodiment, the sensed data can be sent either at a very high rate in small packets or at a very low rate in big packets. The data transmission rate determines the latency of the motion-capture playback.
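The rate/latency trade-off can be made concrete with back-of-the-envelope arithmetic: batching more samples per packet lowers the packet rate but delays the earliest sample in each batch. The 100 Hz sampling rate below is an assumed figure for illustration, not a value from the disclosure:

```python
# Buffering delay introduced by batching samples into packets, assuming
# a hypothetical 100 Hz per-sensor sampling rate.

SAMPLE_RATE_HZ = 100  # assumed sensor sampling rate

def buffering_latency_ms(samples_per_packet):
    """Worst-case wait for the first sample in a packet, in ms."""
    return samples_per_packet / SAMPLE_RATE_HZ * 1000

print(buffering_latency_ms(1))   # small packets, high rate → 10.0 ms
print(buffering_latency_ms(50))  # big packets, low rate   → 500.0 ms
```

At 50 samples per packet the playback lags by half a second before any radio or processing delay is counted, which is why near-real-time feedback pushes toward small, frequent packets.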
FIG. 3 illustrates a flowchart of a method (300) to monitor motions associated with the body part of a sportsperson by a coach through analysis of the motion data captured by a motion capture unit of a sensing unit (101) of the modular body-motion analysis system (100) and provide actionable feedback in real-time. The method (300) is able to prevent a potential injury or improve the efficiency of movement of body parts. With respect to FIG. 3, the method (300) includes the following steps: placing (302) the sensing unit (101) on the specified body parts of the person to sense motion data associated with the person; calibrating (304) a kinematic engine (107) by having the person stand in a standard pose; performing (306) a regular real-world sporting action during a training session; generating (308) real-time feedback of biomechanical insights for the coach from a number of key activities (202) associated with a sport; and presenting (310) feedback, based on the generated biomechanical insights, on a presentation unit (104) to improve the performance of the sporting activity executed by the sportsperson.
The processing unit (103) is configured to transmit the biomechanical insights to a presentation unit (104) coupled to the processing unit (103). The method (300) further comprises training a machine learning model in an evolving manner to generate a potential resolution for a range of non-optimal postures and body movements during a sporting action by using the data stored in the database (109). The processing unit (103) further comprises: a pre-processing engine (105) configured to perform a set of pre-processing algorithms to clean the motion data from the communication unit (102) and bring it to a standardized form; a synchronization engine (106) coupled to the pre-processing engine (105) and configured to enable motion data from the different sensor units in the sensing unit (101) to be synchronised and to generate aggregated time-series motion data frames with individual motion data from the full body at every specific instant; a kinematic engine (107) coupled to the synchronization engine (106) and configured to use the time-series motion data frames and run four-dimensional simulations (space and time) on a human rig in a physics-based environment to generate the kinematic features of the entire motion, the kinematic features including biomechanical raw data such as linear and angular positions, speeds, accelerations, forces, etc. of various points of the body; and an analytics engine (108) coupled to the kinematic engine (107) and configured, for each sport and activity, to run the biomechanical algorithms involved in that particular sport or activity.
The analytics engine (108) is further configured to identify key activities (202) from a kinematic time-series data stream (201) using segmentation algorithms; to identify key events (203) within the key activity (202) using event detection algorithms; to generate biomechanical insights from the kinematic features (204) obtained from the kinematic engine (107) at these key events (203) using biomechanical algorithms; and to summarize the biomechanical insights generated for key activities (202) over time periods to track progress.
FIG. 4 illustrates a wearable device (400) for analysing motion associated with a body part of a person. The device (400) includes a sensing unit (101) configured to sense motion data associated with the body part, and a processing unit (103) coupled to the sensing unit (101) and configured to process the motion data, the processing unit (103) further including a kinematic engine (107) and an analytics engine (108). The kinematic engine (107) is configured to simulate the motion data on a digital rig to obtain the person's movement for extracting a plurality of kinematic features, and the analytics engine (108) is configured to process the extracted kinematic features to provide a context of the biomechanics associated with the person's movement. The wearable device (400) further includes a database (109) coupled to the processing unit (103) and configured to store the motion data received from the sensing unit (101) and an intermediate output associated with the processing unit (103). The data stored in the database (109) enables training of a machine learning model associated with the analytics engine (108) in an evolving manner to generate a potential resolution for a range of body movements performed by the person. A number of sensors of the sensing unit (101) are adapted to be placed on the body part of the person as well as on the extended body/augmentation held by the person.
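The engine chain described above (pre-processing → synchronization → kinematic → analytics) can be sketched as four composed stages. Each stage body below is a hypothetical stand-in for the corresponding engine, reduced to one representative operation:

```python
# Illustrative engine chain: each stage consumes the previous stage's
# output. Stage internals are simplified placeholders.

def preprocess(raw_packets):
    """Pre-processing engine (105): clean raw packets
    (here: drop malformed ones)."""
    return [p for p in raw_packets if "t" in p and "value" in p]

def synchronize(packets):
    """Synchronization engine (106): order packets into
    time-aligned frames."""
    return sorted(packets, key=lambda p: p["t"])

def kinematics(frames):
    """Kinematic engine (107): derive a simple kinematic feature
    (finite-difference speed between consecutive frames)."""
    feats = []
    for prev, cur in zip(frames, frames[1:]):
        dt = cur["t"] - prev["t"] or 1  # guard against zero interval
        feats.append({"t": cur["t"],
                      "speed": (cur["value"] - prev["value"]) / dt})
    return feats

def analytics(features):
    """Analytics engine (108): reduce features to a session-level
    insight (peak speed)."""
    return max(f["speed"] for f in features)

raw = [{"t": 2, "value": 4}, {"t": 1, "value": 1},
       {"bad": True}, {"t": 3, "value": 9}]
print(analytics(kinematics(synchronize(preprocess(raw)))))  # → 5.0
```

The composition mirrors the coupling recited in the method: out-of-order and malformed packets are repaired before any kinematic quantity is computed, so the analytics stage only ever sees clean, time-ordered features.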
Certain advantages of the modular body-motion analysis system (100) are listed below:
Accuracy in the sporting action executed by the person is increased.
Career longevity pertaining to sports.
Prevents the risk of potential injury that could occur while executing a sporting action.
User-friendly output of the modular body-motion analysis system (100) that enables a physiotherapist to analyse the body motions easily.
Biomechanics of the sports are easily detected that assists the person to take accurate sporting actions while playing.

Claims

WE CLAIM
1. A modular body-motion analysis system (100) for monitoring and analysing the movement of a body part of a person during a real-world action, the modular body-motion analysis system (100) comprising: a sensing unit (101) configured to sense motion data associated with the body part; a communication unit (102) coupled to the sensing unit (101) and configured to transmit the motion data sensed by the sensing unit (101) to a processing unit (103); the processing unit (103) coupled to the sensing unit (101) and configured to process the motion data, the processing unit (103) further comprising: a kinematic engine (107) and an analytics engine (108); such that the kinematic engine (107) is configured to simulate the motion data on a digital rig to obtain the person's movement for extracting a plurality of kinematic features; and the analytics engine (108) is configured to process the extracted kinematic features to provide a context of the biomechanics associated with the person's movement; and a presentation unit (104) coupled to the processing unit (103) and configured to output the motion data processed by the processing unit (103).
2. The modular body-motion analysis system (100) as claimed in claim 1, wherein the modular body-motion analysis system (100) further comprising: a database (109) coupled to the processing unit (103) and configured to store the motion data received from the sensing unit (101) and an intermediate output associated with the processing unit (103).
3. The modular body-motion analysis system (100) as claimed in claim 1, wherein the data stored in the database (109) enables training of a machine learning model associated with the analytics engine (108) in an evolving manner to generate a potential resolution for a range of body movements performed by the person.
4. The modular body-motion analysis system (100) as claimed in claim 1, wherein a plurality of sensors of the sensing unit (101) are adapted to be placed on the body part of the person as well as on the extended body/augmentation held by the person.
5. The modular body-motion analysis system (100) as claimed in claim 1, wherein the processing unit (103) further comprising: a pre-processing engine (105) configured to perform a set of pre-processing algorithms to decompress and segregate the motion data from the communication unit (102) to standardize the motion data; a synchronization engine (106) coupled to the pre-processing engine (105) and configured to enable the motion data from the plurality of sensors of the sensing unit (101) to be synchronised and generate aggregated time- series motion data frames with motion data from the body part at every specific instant; and the kinematic engine (107) coupled to the synchronization engine (106) and configured to use the time-series motion data frames and run four-dimensional simulations (space and time) on a human rig in a physics-based environment to generate the kinematic features of the entire motion, the kinematic features including biomechanical raw data like linear and angular positions, speeds, accelerations, forces, etc. of various points of the body.
6. The modular body-motion analysis system (100) as claimed in claim 1, wherein the analytics engine (108) is further configured to identify a key activity (202) from a kinematic time-series data stream (201) using segmentation algorithms and a plurality of key events (203) within the key activity (202) using event detection algorithms, to generate biomechanical insights from the kinematic features (204) obtained from the kinematic engine (107) at the identified key events (203) by using biomechanical algorithms, and to summarize the biomechanical insights generated for the key activities (202) over time periods to track progress.
7. A method (300) for analysing movement associated with a body part of a person, the method (300) comprising: placing (302) a sensing unit (101) on the body parts of the person to sense motion data associated with the person; calibrating (304) a kinematic engine (107) by enabling the person to stand in a standard pose; performing (306) a regular real-world sporting action during a training session; generating (308) real-time feedback of biomechanical insights from a key activity (202) associated with a sport; and presenting (310) feedback based on the generated biomechanical insights on a presentation unit (104).
8. The method (300) as claimed in claim 7, wherein the method (300) further comprises, training a machine learning model associated with an analytics engine (108) in an evolving manner to generate a potential resolution for a range of body movements performed by the person by using the data stored in a database (109).
9. The method (300) as claimed in claim 7, wherein the processing unit (103) further comprises: a pre-processing engine (105) configured to perform a set of pre-processing algorithms to clean the motion data from the communication unit (102) and bring it to a standardized form; a synchronization engine (106) coupled to the pre-processing engine (105) and configured to enable the motion data from the different sensor units in the sensing unit (101) to be synchronised and generate aggregated time-series motion data frames with individual motion data from the full body at every specific instant; a kinematic engine (107) coupled to the synchronization engine (106) and configured to use time-series motion data frames and run four-dimensional simulations (space and time) on a human rig in a physics-based environment to generate kinematic features (204) of the body movement performed by the person, the kinematic features including biomechanical raw data such as linear and angular positions, speeds, accelerations, forces, etc. of various points of the body; and an analytics engine (108) coupled to the kinematic engine (107) and configured to process the extracted kinematic features to provide a context of the biomechanics associated with the person's movement.
10. The method (300) as claimed in claim 7, wherein the analytics engine (108) further comprises: identification of key activities (202) from a kinematic time- series data stream (201) using segmentation algorithms; identification of key events (203) within the key activity (202) using event detection algorithms; biomechanical insight generation from the kinematic features (204) obtained from the kinematic engine (107) at these key events (203) using biomechanical algorithms; and summarization of the various biomechanical insights generated for key activities (202) over some time periods to track progress.
11. A wearable device (400) for analysing motion associated with a body part of a person, the device (400) comprising: a sensing unit (101) configured to sense motion data associated with the body part; a processing unit (103) coupled to the sensing unit (101) and configured to process the motion data, the processing unit (103) further comprising: a kinematic engine (107) and an analytics engine (108); wherein the kinematic engine (107) is configured to simulate the motion data on a digital rig to obtain the person's movement for extracting a plurality of kinematic features; and the analytics engine (108) is configured to process the extracted kinematic features to provide a context of the biomechanics associated with the person's movement.
12. The wearable device (400) as claimed in claim 11, wherein the device (400) further comprises: a database (109) coupled to the processing unit (103) and configured to store the motion data received from the sensing unit (101) and an intermediate output associated with the processing unit (103).
13. The wearable device (400) as claimed in claim 11, wherein the data stored in the database (109) enables training of a machine learning model associated with the analytics engine (108) in an evolving manner to generate a potential resolution for a range of body movements performed by the person.
14. The wearable device (400) as claimed in claim 11, wherein a plurality of sensors of the sensing unit (101) are adapted to be placed on the body part of the person as well as on the extended body/augmentation held by the person.
PCT/IN2022/050005 2021-01-03 2022-01-03 Modular body-motion analysis system, device and method thereof WO2022144929A1 (en)

Applications Claiming Priority (2)

Application Number Priority Date Filing Date Title
IN202011045572 2021-01-03
IN202011045572 2021-01-03

Publications (1)

Publication Number Publication Date
WO2022144929A1 true WO2022144929A1 (en) 2022-07-07

Family

ID=82261141

Family Applications (1)

Application Number Title Priority Date Filing Date
PCT/IN2022/050005 WO2022144929A1 (en) 2021-01-03 2022-01-03 Modular body-motion analysis system, device and method thereof

Country Status (1)

Country Link
WO (1) WO2022144929A1 (en)

Citations (3)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US20180177450A1 (en) * 2014-03-17 2018-06-28 Ben Hansen Method and system for delivering biomechanical feedback to human and object motion
US20190388728A1 (en) * 2018-06-21 2019-12-26 City University Of Hong Kong Systems and methods using a wearable sensor for sports action recognition and assessment
KR102173335B1 (en) * 2019-12-30 2020-11-03 주식회사 그림에스앤씨 A method and an apparatus for analyzing personal physical ability based on motion recognition



Legal Events

Date Code Title Description
121 Ep: the epo has been informed by wipo that ep was designated in this application

Ref document number: 22734786

Country of ref document: EP

Kind code of ref document: A1

NENP Non-entry into the national phase

Ref country code: DE

122 Ep: pct application non-entry in european phase

Ref document number: 22734786

Country of ref document: EP

Kind code of ref document: A1