WO2023097447A1 - Motion Data Calibration Method and System (运动数据标定方法和系统) - Google Patents
- Publication number: WO2023097447A1
- Application: PCT/CN2021/134421 (CN2021134421W)
- Authority: WIPO (PCT)
- Prior art keywords
- data
- user
- coordinate system
- attitude
- sensor
- Prior art date
Classifications
- G06T3/067 — Reshaping or unfolding 3D tree structures onto 2D planes
- G06F3/011 — Arrangements for interaction with the human body, e.g. for user immersion in virtual reality
- G01P21/00 — Testing or calibrating of apparatus or devices covered by the preceding groups
- A61B5/1116 — Determining posture transitions
- A61B5/1121 — Determining geometric values, e.g. centre of rotation or angular range of movement
- A61B5/6805 — Vests
- G01P3/44 — Devices characterised by the use of electric or magnetic means for measuring angular speed
- G06T17/00 — Three dimensional [3D] modelling, e.g. data description of 3D objects
- G06T7/20 — Analysis of motion
- G06T7/33 — Determination of transform parameters for the alignment of images (image registration) using feature-based methods
- G06T7/70 — Determining position or orientation of objects or cameras
- G06T7/73 — Determining position or orientation of objects or cameras using feature-based methods
- G16H20/30 — ICT specially adapted for therapies or health-improving plans relating to physical therapies or activities, e.g. physiotherapy, acupressure or exercising
- A61B2503/10 — Athletes
- A61B2560/0223 — Operational features of calibration, e.g. protocols for calibrating sensors
- A63B2220/34 — Angular speed
- A63B2220/40 — Acceleration
- A63B2220/836 — Sensors arranged on the body of the user
- A63B2230/62 — Measuring physiological parameters of the user: posture
- G06T2207/30196 — Human being; Person
Definitions
- the present application relates to the technical field of wearable devices, in particular to a method and system for calibration of motion data.
- motion monitoring equipment mainly monitors motion parameters of users during motion through sensors.
- when calibrating the coordinates of the sensor, the user needs to perform a series of calibration actions (such as raising both hands to the front or raising both hands to the side), so that the motion monitoring equipment can adjust the coordinate system of the sensor according to the calibration action.
- in this way, the posture data in the sensor's own coordinate system is converted into posture data in the human body coordinate system.
- the present application provides a motion data calibration method and system that can calibrate motion data without requiring a user to perform a calibration action.
- the present application provides a motion data calibration method, including: acquiring motion data of the user during movement, the motion data including at least one attitude signal corresponding to at least one measurement position on the user, where each attitude signal in the at least one attitude signal comprises three-dimensional attitude data of its corresponding measurement position in an original coordinate system; building a target coordinate system that includes three mutually perpendicular coordinate axes, the X axis, the Y axis, and the Z axis; and converting each attitude signal into two-dimensional attitude data in the target coordinate system.
- each attitude signal includes data measured by an attitude sensor
- the original coordinate system includes a coordinate system where the attitude sensor is located.
- the attitude sensor includes at least one of an acceleration sensor, an angular velocity sensor and a magnetic force sensor.
- each attitude signal includes data measured by an image sensor
- the original coordinate system includes a coordinate system in which the image sensor is located.
- the three-dimensional attitude data includes angle data and angular velocity data on three mutually perpendicular coordinate axes.
- the converting of each attitude signal into two-dimensional attitude data in the target coordinate system includes: acquiring a pre-stored conversion relationship between the target coordinate system and the original coordinate system; based on the conversion relationship, converting each attitude signal into three-dimensional motion data in the target coordinate system, the three-dimensional motion data including at least the angular velocity data on the X axis, the angular velocity data on the Y axis, and the angular velocity data on the Z axis; and converting the three-dimensional motion data in the target coordinate system into the two-dimensional attitude data in the target coordinate system.
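The conversion step described above can be sketched in code. This is an illustrative example, not the patent's implementation: it assumes, purely for demonstration, that the pre-stored conversion relationship between the original (sensor) coordinate system and the target coordinate system is a 3×3 rotation matrix.

```python
# Illustrative sketch of converting an attitude signal from the original
# (sensor) coordinate system into the target coordinate system.
# Assumption (not specified in the source): the pre-stored conversion
# relationship is a 3x3 rotation matrix R such that v_target = R * v_original.

def apply_conversion(R, v):
    """Multiply a 3x3 matrix R by a 3-vector v (pure-Python matrix product)."""
    return [sum(R[i][j] * v[j] for j in range(3)) for i in range(3)]

# Example: a 90-degree rotation about the Z axis as a stand-in for the
# pre-stored conversion relationship between the two coordinate systems.
R = [[0.0, -1.0, 0.0],
     [1.0,  0.0, 0.0],
     [0.0,  0.0, 1.0]]

angular_velocity_original = [1.0, 2.0, 3.0]   # (wx, wy, wz) in the sensor frame
angular_velocity_target = apply_conversion(R, angular_velocity_original)
# Note that the Z component is unchanged by a rotation about the Z axis.
```

A real device would obtain the conversion relationship from its calibration storage rather than hard-coding it as here.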
- the Z axis of the target coordinate system points in the direction opposite to the vertical direction of gravitational acceleration.
- the two-dimensional attitude data in the target coordinate system includes: horizontal attitude data, including horizontal angle data and horizontal angular velocity data when moving in the horizontal plane perpendicular to the Z axis; and vertical attitude data, including vertical angle data and vertical angular velocity data when moving in any vertical plane perpendicular to the horizontal plane.
- the converting of the three-dimensional motion data in the target coordinate system into the two-dimensional attitude data includes: converting the angular velocity data on the X axis and the angular velocity data on the Y axis into the vertical angular velocity data by the vector law; time-integrating the vertical angular velocity data, based on the times corresponding to the start position and end position of the user's movement, to obtain the vertical angle data; using the Z-axis angular velocity data as the horizontal angular velocity data; and time-integrating the horizontal angular velocity data, based on the times corresponding to the start position and end position of the user's movement, to obtain the horizontal angle data.
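The steps above can be sketched as follows. This is a hedged illustration under two assumptions not stated in the source: the angular velocities are uniformly sampled at interval `dt`, "vector law" is read as taking the Euclidean magnitude of the (X, Y) angular-velocity components, and the time integration uses the trapezoidal rule.

```python
import math

def to_two_dimensional(wx, wy, wz, dt):
    """Convert 3D angular velocities in the target frame into 2D attitude data.

    wx, wy, wz: equal-length lists of angular velocities (rad/s), sampled at a
    fixed interval dt (s) from the movement's start to its end.
    Returns (vertical_w, vertical_angle, horizontal_w, horizontal_angle).
    """
    # Vertical angular velocity: combine the X and Y components by the
    # vector law (here interpreted as the magnitude of (wx, wy)).
    vertical_w = [math.hypot(x, y) for x, y in zip(wx, wy)]
    # Horizontal angular velocity: the Z-axis component directly.
    horizontal_w = list(wz)

    def integrate(w):
        # Trapezoidal time integration over the span of the movement.
        angle = [0.0]
        for a, b in zip(w, w[1:]):
            angle.append(angle[-1] + 0.5 * (a + b) * dt)
        return angle

    return vertical_w, integrate(vertical_w), horizontal_w, integrate(horizontal_w)
```

For example, a constant horizontal angular velocity of 1 rad/s sampled three times at dt = 0.5 s integrates to a final horizontal angle of 1.0 rad.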
- the motion data calibration method further includes: determining the relative motion between the at least one measurement position based on the two-dimensional attitude data corresponding to each attitude signal.
- this specification also provides a motion data calibration system, including at least one storage medium and at least one processor; the at least one storage medium stores at least one instruction set for motion data calibration, and the at least one processor is communicatively connected with the at least one storage medium. When the motion data calibration system runs, the at least one processor reads the at least one instruction set and implements the motion data calibration method described in the first aspect of this specification.
- the motion data calibration method and system provided by the present application can convert the user's motion data from three-dimensional posture data on three mutually perpendicular coordinate axes into two-dimensional posture data in the target coordinate system, namely posture data in the horizontal plane and posture data in the vertical plane. The user's movement can thus be divided into horizontal movement and vertical movement, avoiding the data discrepancies caused by users facing different directions.
- the method and system can remove the influence of user orientation on motion data, and can therefore calibrate the motion data without requiring the user to perform a calibration action.
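The claim that user orientation drops out can be checked numerically: rotating the angular-velocity vector about the vertical Z axis (i.e., the same movement performed while facing a different direction) leaves both two-dimensional quantities unchanged. A small sketch under the interpretation used above (vertical component as the magnitude of (wx, wy), horizontal component as wz):

```python
import math

def two_d(wx, wy, wz):
    """Per-sample 2D attitude quantities: (vertical, horizontal) angular velocity."""
    return math.hypot(wx, wy), wz

def rotate_about_z(wx, wy, wz, theta):
    """The same angular velocity, expressed for a user facing theta radians away."""
    c, s = math.cos(theta), math.sin(theta)
    return c * wx - s * wy, s * wx + c * wy, wz

w = (0.3, 0.4, 1.2)                       # angular velocity in the target frame
w_rotated = rotate_about_z(*w, theta=1.0) # same movement, different heading

# Both headings yield identical 2D attitude data: a rotation about Z
# preserves the (wx, wy) magnitude and does not touch wz.
```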
- Fig. 1 is a schematic diagram of an application scenario of a motion monitoring system according to some embodiments of the present application.
- Fig. 2 is a schematic diagram of exemplary hardware and/or software of a wearable device according to some embodiments of the present application.
- Fig. 3 is a schematic diagram of exemplary hardware and/or software of a computing device according to some embodiments of the present application.
- Fig. 4 is an exemplary structural diagram of a wearable device according to some embodiments of the present application.
- Fig. 5 is an exemplary flowchart of a motion monitoring method according to some embodiments of the present application.
- Fig. 6 is an exemplary flow chart of a motion data calibration method according to some embodiments of the present application.
- Fig. 7 is a schematic diagram of a target coordinate system according to some embodiments of the present application.
- Fig. 8 is an exemplary flow chart of converting into two-dimensional pose data according to some embodiments of the present application.
- Fig. 9 is a coordinate system diagram when a user moves according to some embodiments of the present application.
- the word "system" is used herein to distinguish different components, elements, parts, or assemblies at different levels; it may be replaced by other expressions that achieve the same purpose.
- flow charts are used in this application to illustrate the operations performed by the system according to embodiments of this application. It should be understood that the operations are not necessarily performed exactly in the order shown; various steps may instead be processed in reverse order or simultaneously. Other operations may also be added to these procedures, or one or more steps may be removed from them.
- the present application provides a motion monitoring system, which can acquire action signals of a user during exercise, wherein the action signals include at least myoelectric signals, posture signals, electrocardiographic signals, respiratory frequency signals, and the like.
- the system can monitor the user's movement based at least on the feature information corresponding to the electromyography signal or the feature information corresponding to the posture signal, for example, the user's action type, number of actions, action quality, action time, or physiological parameter information when the user performs an action.
- the exercise monitoring system can also generate feedback on the user's exercise based on the analysis results of the user's exercise, so as to guide the user's exercise.
- the exercise monitoring system can send prompt information (for example, voice prompt, vibration prompt, electric stimulation, etc.) to the user.
- the motion monitoring system can be applied to wearable devices (for example, clothing, wristbands, helmets), medical testing equipment (for example, electromyography tester), fitness equipment, etc.
- by accurately monitoring and providing feedback on the user's actions without the participation of professionals, the motion monitoring system can improve the user's fitness efficiency while reducing the cost of fitness.
- Fig. 1 is a schematic diagram of an application scenario of a motion monitoring system according to some embodiments of the present application.
- the motion monitoring system 100 may include a processing device 110 , a network 120 , a wearable device 130 and a mobile terminal device 140 .
- the motion monitoring system 100 may also include a motion data calibration system 180 .
- the motion monitoring system 100 can acquire motion signals (such as myoelectric signals, posture signals, electrocardiogram signals, respiratory rate signals, etc.) used to characterize user motions, and monitor and give feedback on user motions according to the user's motion signals.
- the exercise monitoring system 100 can monitor and give feedback on the actions of the user when exercising.
- the wearable device 130 may acquire the user's motion signal.
- the processing device 110 or the mobile terminal device 140 may receive and analyze the user's motion signal to determine whether the user's fitness motion is standardized, so as to monitor the user's motion.
- monitoring the user's action may include determining the action type, action quantity, action quality, action time, or physiological parameter information when the user performs the action.
- the exercise monitoring system 100 can generate feedback on the user's exercise action according to the analysis result of the user's exercise action, so as to guide the user's exercise.
- the exercise monitoring system 100 may monitor and give feedback on the actions of the user while running. For example, when the user wears the wearable device 130 for running, the exercise monitoring system 100 can monitor whether the user's running action is normal, whether the running time meets health standards, and the like. When the user runs for too long or the running action is incorrect, the fitness device can feed back the user's exercise status to remind the user that the running action or running time needs to be adjusted.
- the processing device 110 may be configured to process information and/or data related to user motion.
- the processing device 110 may receive the user's action signal (for example, myoelectric signal, posture signal, electrocardiogram signal, respiratory frequency signal, etc.), and further extract the feature information corresponding to the action signal (for example, the feature information corresponding to the myoelectric signal in the action signal, or the feature information corresponding to the attitude signal).
- the processing device 110 may perform specific signal processing on the EMG signal or posture signal collected by the wearable device 130, such as signal segmentation, signal preprocessing (eg, signal correction processing, filtering processing, etc.), and the like.
- the processing device 110 may also determine whether the user's action is correct based on the user's action signal. For example, the processing device 110 may determine whether the user's action is correct based on feature information (eg, amplitude information, frequency information, etc.) corresponding to the electromyography signal. For another example, the processing device 110 may determine whether the user action is correct based on the feature information corresponding to the attitude signal (eg, angular velocity, angular velocity direction, angular velocity acceleration, angle, displacement information, stress, etc.). For another example, the processing device 110 may determine whether the user's action is correct based on the feature information corresponding to the EMG signal and the feature information corresponding to the gesture signal.
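As one hypothetical illustration of this kind of correctness check (the source does not specify the decision rule), the extracted feature information could be compared against reference ranges for the action. The feature names and ranges below are invented for illustration only:

```python
# Hypothetical sketch: judging an action by checking whether features
# extracted from the EMG and attitude signals fall within reference ranges.
# These feature names and numeric ranges are illustrative assumptions,
# not values from the patent.

REFERENCE = {
    "emg_amplitude_mv": (0.5, 2.0),     # expected EMG amplitude range
    "elbow_angle_deg": (30.0, 150.0),   # expected joint-angle range
}

def action_is_correct(features):
    """Return True if every known feature lies within its reference range."""
    return all(
        lo <= features[name] <= hi
        for name, (lo, hi) in REFERENCE.items()
        if name in features
    )
```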
- the processing device 110 may also determine whether the physiological parameter information of the user during exercise meets the health standard. In some embodiments, the processing device 110 may also issue corresponding instructions to feed back the user's exercise situation. For example, when the user is running, the exercise monitoring system 100 monitors that the user's running time is too long. At this time, the processing device 110 may send an instruction to the mobile terminal device 140 to prompt the user to adjust the running time.
- the characteristic information corresponding to the attitude signal is not limited to the above-mentioned angular velocity, angular velocity direction, angular acceleration, angle, displacement information, stress, etc.; any other parameter information that can reflect the relative movement of the user's body can also serve as the feature information corresponding to the attitude signal.
- when the posture sensor is a strain sensor, the bending angle and bending direction at the user's joints can be obtained by measuring the change in the strain sensor's resistance as its stretching length changes.
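A minimal sketch of that measurement, assuming (purely for illustration) a linear relation between the strain sensor's resistance change and the bending angle; a real device would use a per-sensor calibration rather than the constants invented here:

```python
# Illustrative only: map a strain sensor's resistance change to a bending
# angle via a linear calibration. Both calibration constants are assumptions.

R_REST_OHM = 1000.0        # resistance with the joint straight (assumed)
OHMS_PER_DEGREE = 2.5      # sensitivity from a hypothetical calibration

def bending_angle_deg(resistance_ohm):
    """Estimate the joint bending angle from the measured resistance.

    The sign of the resistance change indicates the bending direction
    (positive: stretching; negative: compression).
    """
    return (resistance_ohm - R_REST_OHM) / OHMS_PER_DEGREE
```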
- processing device 110 may be local or remote.
- the processing device 110 may access information and/or data stored in the wearable device 130 and/or the mobile terminal device 140 through the network 120 .
- the processing device 110 may be directly connected to the wearable device 130 and/or the mobile terminal device 140 to access information and/or data stored therein.
- the processing device 110 may be located in the wearable device 130 and implement information interaction with the mobile terminal device 140 through the network 120 .
- the processing device 110 may be located in the mobile terminal device 140, and realize information interaction with the wearable device 130 through a network.
- the processing device 110 may execute on a cloud platform.
- processing device 110 may process data and/or information related to motion monitoring to perform one or more of the functions described herein.
- the processing device 110 may acquire motion signals collected by the wearable device 130 during the user's exercise.
- the processing device 110 may send control instructions to the wearable device 130 or the mobile terminal device 140 .
- the control instruction can control the switch state of the wearable device 130 and its sensors, and can also control the mobile terminal device 140 to send out prompt information.
- the processing device 110 may include one or more sub-processing devices (eg, a single-core or multi-core processing device).
- Network 120 may facilitate the exchange of data and/or information within athletic monitoring system 100 .
- one or more components of the athletic monitoring system 100 may send data and/or information to other components of the athletic monitoring system 100 over the network 120 .
- the motion signal collected by the wearable device 130 may be transmitted to the processing device 110 through the network 120 .
- the confirmation result of the action signal in the processing device 110 may be transmitted to the mobile terminal device 140 through the network 120 .
- network 120 may be any type of wired or wireless network.
- the wearable device 130 refers to clothing or equipment with a wearable function.
- the wearable device 130 may include, but not limited to, an upper garment device 130-1, a trouser device 130-2, a wrist device 130-3, a shoe 130-4, and the like.
- wearable device 130 may include multiple sensors.
- the sensor can acquire various action signals (eg, myoelectric signals, posture signals, temperature information, heartbeat frequency, electrocardiographic signals, etc.) of the user during exercise.
- the sensor may include, but is not limited to, one or more of a myoelectric sensor, attitude sensor, temperature sensor, humidity sensor, ECG sensor, blood oxygen saturation sensor, Hall sensor, electrodermal sensor, rotation sensor, and the like.
- an electromyographic sensor can be set at a muscle position of the human body (for example, biceps, triceps, latissimus dorsi, trapezius, etc.) in the upper garment device 130-1, where it can collect the user's myoelectric signals during exercise.
- an electrocardiographic sensor may be installed near the left pectoral muscle of the human body in the upper garment device 130-1, and the electrocardiographic sensor may collect the user's electrocardiographic signal.
- a posture sensor can be set at the muscle position of the human body (eg, gluteus maximus, vastus lateralis, vastus medialis, gastrocnemius, etc.) in the trousers device 130-2, and the posture sensor can collect user's posture signals.
- the wearable device 130 can also provide feedback on the user's actions. For example, when the movement of a certain part of the user's body does not meet the standard, the myoelectric sensor corresponding to that part can generate a stimulation signal (for example, a current stimulation or a striking signal) to remind the user.
- the wearable device 130 is not limited to the upper garment device 130-1, trouser device 130-2, wrist support device 130-3, and shoe device 130-4 shown in the figure; it may also include other devices for motion monitoring, such as helmet devices and knee pads, which are not limited here. Any device that can use the motion monitoring method contained in this application is within the scope of protection of this application.
- the mobile terminal device 140 can acquire information or data in the exercise monitoring system 100 .
- the mobile terminal device 140 may receive the exercise data processed by the processing device 110, and feed back exercise records and the like based on the processed exercise data.
- Exemplary feedback methods may include, but are not limited to, voice prompts, image prompts, video presentations, text prompts, and the like.
- the user can obtain the action record during his exercise through the mobile terminal device 140 .
- the mobile terminal device 140 can be connected with the wearable device 130 through the network 120 (for example, via a wired or wireless connection). The user can obtain records of actions during exercise through the mobile terminal device 140, and these action records can be transmitted through the mobile terminal device 140 to the processing device 110.
- the mobile terminal device 140 may include one of a mobile device 140-1, a tablet computer 140-2, a notebook computer 140-3, etc., or any combination thereof.
- the mobile device 140-1 may include a mobile phone, a smart home device, a smart mobile device, a virtual reality device, an augmented reality device, etc., or any combination thereof.
- the smart home device may include a control device for smart appliances, a smart monitoring device, a smart TV, a smart camera, etc., or any combination thereof.
- a smart mobile device may include a smart phone, a personal digital assistant (PDA), a gaming device, a navigation device, a POS device, etc., or any combination thereof.
- the virtual reality device and/or the augmented reality device may include a virtual reality helmet, virtual reality glasses, virtual reality goggles, augmented reality helmets, augmented reality glasses, augmented reality goggles, etc., or any combination thereof.
- the motion monitoring system 100 may also include a motion data calibration system 180.
- the motion data calibration system 180 can be used to process motion data related to the user's motion, and can implement the motion data calibration method described in this specification.
- the motion data calibration system 180 can receive the user's motion data during exercise and transform it from three-dimensional posture data on three mutually perpendicular coordinate axes into two-dimensional posture data in a target coordinate system, namely posture data in the horizontal plane and posture data in the vertical plane. In this way, the user's movement can be divided into horizontal movement and vertical movement, avoiding data differences caused by different user orientations.
- the motion data calibration system 180 can remove the influence of the user's orientation on the motion data, so the motion data can be calibrated without requiring the user to perform calibration actions.
- the motion data may be three-dimensional posture data of a measured position on the user's body when the user is in motion. The motion data and the calibration method for the motion data will be described in detail later.
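The horizontal/vertical decomposition described above can be illustrated with a minimal sketch (this is not the patent's specific conversion algorithm; the function name and the assumption that a briefly static accelerometer reading gives the gravity direction are illustrative): a 3-D angular-velocity sample is split into a component about the vertical axis and a magnitude in the horizontal plane.

```python
import numpy as np

def decompose_by_gravity(omega, accel):
    """Split a 3-D angular-velocity sample into a vertical component
    (rotation about the gravity axis) and a horizontal magnitude.

    omega: angular velocity in the sensor's original coordinate system (rad/s)
    accel: accelerometer reading while roughly static, used to
           estimate the gravity direction (m/s^2)
    """
    g = np.asarray(accel, dtype=float)
    g = g / np.linalg.norm(g)          # unit gravity direction
    w = np.asarray(omega, dtype=float)
    vertical = float(np.dot(w, g))     # component about the vertical axis
    horizontal_vec = w - vertical * g  # remainder lies in the horizontal plane
    horizontal = float(np.linalg.norm(horizontal_vec))
    return vertical, horizontal

# A rotation purely about the vertical axis decomposes the same way
# regardless of which direction the user is facing:
v, h = decompose_by_gravity(omega=[0.0, 0.0, 1.2], accel=[0.0, 0.0, 9.8])
```

Because only the gravity direction anchors the decomposition, the same motion produces the same vertical/horizontal split for users facing different directions, which is the orientation-independence the calibration aims for.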
- the motion data calibration system 180 may be integrated on the processing device 110. In some embodiments, the motion data calibration system 180 can also be integrated on the mobile terminal device 140. In some embodiments, the motion data calibration system 180 may also exist independently of the processing device 110 and the mobile terminal device 140.
- the motion data calibration system 180 may be communicatively connected with the processing device 110, the wearable device 130 and the mobile terminal device 140 for information and/or data transmission and exchange. In some embodiments, the motion data calibration system 180 can access information and/or data stored in the processing device 110, the wearable device 130 and/or the mobile terminal device 140 through the network 120.
- the wearable device 130 can be directly connected to the processing device 110 and/or the mobile terminal device 140 to access information and/or data stored therein.
- the motion data calibration system 180 may be located in the processing device 110 and realize information exchange with the wearable device 130 and the mobile terminal device 140 through the network 120 .
- the motion data calibration system 180 may be located in the mobile terminal device 140, and realize information exchange with the processing device 110 and the wearable device 130 through the network.
- the motion data calibration system 180 can be executed on a cloud platform, and realize information interaction with the processing device 110, the wearable device 130 and the mobile terminal device 140 through the network.
- the motion monitoring system 100 may also include a database.
- the database may store data (eg, initially set threshold conditions, etc.) and/or instructions (eg, feedback instructions).
- the database can store data acquired from the wearable device 130 and/or the mobile terminal device 140 .
- the database may store information and/or instructions for execution or use by the processing device 110 to perform the example methods described herein.
- the database may be connected to the network 120 to communicate with one or more components of the motion monitoring system 100 (eg, the processing device 110, the wearable device 130, the mobile terminal device 140, etc.).
- One or more components of the motion monitoring system 100 may access data or instructions stored in the database via the network 120.
- the database may be directly connected to or communicate with one or more components in the motion monitoring system 100.
- the database may be part of the processing device 110 .
- Fig. 2 is a schematic diagram of exemplary hardware and/or software of the wearable device 130 according to some embodiments of the present application.
- the wearable device 130 may include an acquisition module 210, a processing module 220 (also called a processor), a control module 230 (also called a main control, MCU, controller), a communication module 240, a power supply module 250 and input/output module 260 .
- the acquiring module 210 can be used to acquire motion signals of the user when exercising.
- the acquisition module 210 may include a sensor unit, and the sensor unit may be used to acquire one or more types of action signals when the user is exercising.
- the sensor unit may include, but is not limited to, one or more of a myoelectric sensor, an attitude sensor, an electrocardiogram sensor, a respiration sensor, a temperature sensor, a humidity sensor, an inertial sensor, a blood oxygen saturation sensor, a Hall sensor, an electrodermal sensor, a rotation sensor, and the like.
- the action signal may include one or more of myoelectric signal, posture signal, electrocardiogram signal, respiratory rate, temperature signal, humidity signal, etc.
- the sensor unit can be placed in different positions of the wearable device 130 according to the type of motion signal to be acquired.
- a myoelectric sensor (also referred to as an electrode element) may be disposed at a muscle position of a human body, and the myoelectric sensor may be configured to collect myoelectric signals when the user moves.
- the EMG signal and its corresponding feature information (for example, frequency information, amplitude information, etc.) can reflect the state of the user's muscles when exercising.
- the posture sensor can be set at different positions of the human body (for example, positions corresponding to the torso, limbs, and joints in the wearable device 130), and the posture sensor can be configured to collect posture signals when the user moves.
- the posture signal and its corresponding feature information can reflect the posture of the user's movement.
- the electrocardiogram sensor can be arranged around the chest of the human body, and the electrocardiogram sensor can be configured to collect electrocardiogram data when the user is exercising.
- the respiration sensor may be arranged around the chest of the human body, and the respiration sensor may be configured to collect respiration data (eg, respiration frequency, respiration amplitude, etc.) of the user during exercise.
- the temperature sensor may be configured to collect temperature data (eg, body surface temperature) of the user while exercising.
- the humidity sensor may be configured to collect humidity data of the external environment when the user is exercising.
- the processing module 220 may process data from the acquisition module 210 , the control module 230 , the communication module 240 , the power supply module 250 and/or the input/output module 260 .
- the processing module 220 may process the action signal from the acquisition module 210 during the user's exercise.
- the processing module 220 may preprocess the action signals (eg, myoelectric signals, posture signals) acquired by the acquisition module 210 .
- the processing module 220 performs segmentation processing on the myoelectric signal or gesture signal when the user is exercising.
- the processing module 220 may perform pre-processing (for example, filter processing, signal correction processing) on the electromyographic signal when the user is exercising, so as to improve the quality of the electromyographic signal.
- processing module 220 may determine feature information corresponding to the gesture signal based on the gesture signal when the user is exercising.
- processing module 220 may process instructions or operations from input/output module 260 .
- the processed data can be stored in memory or hard disk.
- the processing module 220 can transmit the processed data to one or more components in the motion monitoring system 100 through the communication module 240 or the network 120 .
- the processing module 220 may send the monitoring result of the user's movement to the control module 230, and the control module 230 may execute subsequent operations or instructions according to the action determination result.
- the control module 230 can be connected with other modules in the wearable device 130 .
- the control module 230 can control the running status of other modules in the wearable device 130 .
- the control module 230 can control the power supply state (eg, normal mode, power saving mode), power supply time, etc. of the power supply module 250 .
- the control module 230 may control the input/output module 260 according to the determination result of the user's motion, and then may control the mobile terminal device 140 to send the feedback result of its motion to the user.
- the control module 230 can control the input/output module 260, and then can control the mobile terminal device 140 to give feedback to the user, so that the user can know his or her own exercise state in real time and adjust the action accordingly.
- the control module 230 may also control one or more sensors or other modules in the acquisition module 210 to provide feedback to the human body. For example, when a certain muscle exerts too much force during the user's exercise, the control module 230 may control the electrode module at the position of the muscle to provide electrical stimulation to the user to prompt the user to adjust the action in time.
- the communication module 240 may be used for information or data exchange. In some embodiments, the communication module 240 can be used for communication between internal components of the wearable device 130 . For example, the acquisition module 210 may send user action signals (eg, myoelectric signals, gesture signals, etc.) to the communication module 240 , and the communication module 240 may send the action signals to the processing module 220 . In some embodiments, the communication module 240 can also be used for communication between the wearable device 130 and other components in the exercise monitoring system 100 . For example, the communication module 240 may send status information (for example, switch status) of the wearable device 130 to the processing device 110, and the processing device 110 may monitor the wearable device 130 based on the status information.
- the communication module 240 can adopt wired, wireless, and wired/wireless mixed technologies.
- the power supply module 250 may provide power to other components in the motion monitoring system 100 .
- the input/output module 260 can acquire, transmit and send signals.
- the input/output module 260 may connect or communicate with other components in the motion monitoring system 100 .
- Other components in the motion monitoring system 100 may be connected or communicated through the input/output module 260 .
- the above description of the motion monitoring system 100 and its modules is only for convenience of description, and does not limit one or more embodiments of the present application to the scope of the illustrated embodiments. It can be understood that, for those skilled in the art, after understanding the principle of the system, it is possible to arbitrarily combine the various modules, form a subsystem to connect with other modules, or omit one or more of the modules.
- the acquisition module 210 and the processing module 220 may be one module, and this module may have the function of acquiring and processing user action signals.
- the processing module 220 may also be integrated in the processing device 110 instead of being disposed in the wearable device 130 . Such modifications are within the protection scope of one or more embodiments of the present application.
- FIG. 3 is a schematic diagram of exemplary hardware and/or software of a computing device 300 according to some embodiments of the present application.
- the processing device 110 and/or the mobile terminal device 140 may be implemented on the computing device 300 .
- the motion data calibration system 180 may be implemented on the computing device 300.
- computing device 300 may include internal communication bus 310 , at least one processor 320 , at least one storage medium, communication port 350 , input/output interface 360 , and user interface 380 .
- Internal communication bus 310 enables data communication between components in computing device 300 .
- at least one processor 320 may send data to at least one storage medium or other hardware such as the input/output interface 360 through the internal communication bus 310.
- the internal communication bus 310 may be an Industry Standard (ISA) bus, an Extended Industry Standard (EISA) bus, a Video Electronics Standard (VESA) bus, a Peripheral Component Interconnect (PCI) bus, or the like.
- the internal communication bus 310 can be used to connect the various modules in the motion monitoring system 100 shown in the figure.
- At least one storage medium of the computing device 300 may include a data storage device.
- the data storage device may be a non-transitory storage medium or a temporary storage medium.
- the data storage device may include one or more of a read only memory (ROM) 330, a random access memory (RAM) 340, a hard disk 370, and the like.
- Exemplary ROMs may include mask ROM (MROM), programmable ROM (PROM), erasable programmable ROM (EPROM), electrically erasable programmable ROM (EEPROM), compact disc ROM (CD-ROM), digital versatile disc ROM (DVD-ROM), and the like.
- Exemplary RAMs may include dynamic RAM (DRAM), double data rate synchronous dynamic RAM (DDR SDRAM), static RAM (SRAM), thyristor RAM (T-RAM), zero-capacitance RAM (Z-RAM), and the like.
- the storage medium may store data/information obtained from any other component of the motion monitoring system 100 .
- the storage medium also includes at least one instruction set stored in the data storage device.
- the instructions are computer program codes, and the computer program codes may include programs, routines, objects, components, data structures, procedures, modules, etc. for executing the motion data calibration method provided in this specification.
- the storage medium of the computing device 300 may be located in the wearable device 130 or in the processing device 110 .
- At least one processor 320 may be communicatively coupled to at least one storage medium. The at least one processor 320 is configured to execute the above at least one instruction set. When the computing device 300 is running, the at least one processor 320 can read the at least one instruction set and execute computing instructions (program codes) according to the at least one instruction set, thereby implementing the functions of the motion monitoring system 100 described in this application. The processor 320 can execute all the steps included in the data processing method.
- the computing instructions may include programs, objects, components, data structures, procedures, modules, and functions (the functions refer to specific functions described in this application).
- the processor 320 can process the action signals (for example, myoelectric signals, posture signals) obtained from the wearable device 130 and/or the mobile terminal device 140 of the motion monitoring system 100 when the user is exercising, and monitor the user's movement based on the action signals.
- the processor 320 can be used to process the motion data obtained from the wearable device 130 and/or the mobile terminal device 140 of the motion monitoring system 100 during the user's exercise, and, according to the at least one instruction set, execute the motion data calibration method described in this specification to convert the motion data into two-dimensional posture data.
- the processor 320 may include a microcontroller, a microprocessor, a reduced instruction set computer (RISC), an application specific integrated circuit (ASIC), an application specific instruction set processor (ASIP), a central processing unit (CPU), a graphics processing unit (GPU), a physical processing unit (PPU), a microcontroller unit, a digital signal processor (DSP), a field programmable gate array (FPGA), an advanced reduced instruction set computer (ARM), a programmable logic device (PLD), any circuit or processor capable of performing one or more functions, or any combination thereof.
- It should be noted that FIG. 3 depicts only one processor 320, but the computing device 300 in the present application may also include multiple processors.
- Hard disk 370 may be used to store information and data generated by or received from processing device 110 .
- the hard disk 370 may store user identification information of the user.
- the hard disk 370 may be disposed in the processing device 110 or in the wearable device 130 .
- User interface 380 may enable interaction and information exchange between computing device 300 and a user.
- the user interface 380 may be used to present the exercise records generated by the motion monitoring system 100 to the user.
- user interface 380 may include a physical display, such as a display with speakers, LCD display, LED display, OLED display, electronic ink display (E-Ink), and the like.
- the input/output interface 360 may be used to input or output signals, data or information. In some embodiments, the input/output interface 360 may enable a user to interact with the motion monitoring system 100.
- Fig. 4 is an exemplary structural diagram of a wearable device according to some embodiments of the present application.
- a top garment is used as an example.
- wearable device 400 may include upper garment 410 .
- the upper garment 410 may include an upper garment base 4110, at least one upper garment processing module 4120, at least one upper garment feedback module 4130, at least one upper garment acquisition module 4140, and the like.
- the upper garment base 4110 may refer to clothing worn on the upper body of a human body.
- the upper garment base 4110 may include a short-sleeved T-shirt, a long-sleeved T-shirt, a shirt, a coat, or the like.
- At least one upper garment processing module 4120 and at least one upper garment acquisition module 4140 may be located on the upper garment base 4110 in areas that fit different parts of the human body.
- At least one upper garment feedback module 4130 can be located at any position on the upper garment base 4110, and the at least one upper garment feedback module 4130 can be configured to feed back information about the user's upper body movement state.
- Exemplary feedback methods may include, but are not limited to, voice prompts, text prompts, pressure prompts, electrical stimulation, and the like.
- At least one upper garment acquisition module 4140 may include, but is not limited to, one or more of a posture sensor, an electrocardiogram sensor, an electromyography sensor, a temperature sensor, a humidity sensor, an inertial sensor, an acid-base sensor, an acoustic wave transducer, and the like.
- the sensors in the upper garment acquisition module 4140 can be placed at different positions on the user's body according to the different signals to be measured. For example, when the posture sensor is used to acquire the posture signal during the user's movement, the posture sensor can be placed in the upper garment base 4110 at positions corresponding to the torso, arms, and joints of the human body.
- when the myoelectric sensor is used to acquire the myoelectric signal during the user's exercise, the myoelectric sensor may be located near the user's muscle to be measured.
- the attitude sensor may include, but not limited to, an acceleration three-axis sensor, an angular velocity three-axis sensor, a magnetic force sensor, etc., or any combination thereof.
- an attitude sensor may include an acceleration three-axis sensor and an angular velocity three-axis sensor.
- the attitude sensor may also include a strain gauge sensor.
- the strain sensor may refer to a sensor that measures strain based on the deformation of the object under test.
- the strain gauge sensor may include, but not limited to, one or more of strain gauge force sensors, strain gauge pressure sensors, strain gauge torque sensors, strain gauge displacement sensors, and strain gauge acceleration sensors.
- the strain gauge sensor can be set at the joint position of the user, and by measuring the magnitude of the resistance in the strain gauge sensor that changes with the stretching length, the bending angle and bending direction at the joint of the user can be obtained.
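As a hedged illustration of this principle (the function name and the linear two-point calibration are assumptions for the sketch, not taken from this application), the resistance-to-angle conversion could look like:

```python
def bend_angle_from_resistance(r_measured, r_flat, r_bent_90):
    """Estimate a joint bend angle (degrees) from a strain-gauge reading.

    Assumes the gauge resistance varies roughly linearly with bend angle
    between two calibration points:
      r_flat    -- resistance with the joint straight (0 degrees)
      r_bent_90 -- resistance with the joint bent to 90 degrees
    """
    if r_bent_90 == r_flat:
        raise ValueError("calibration points must differ")
    return 90.0 * (r_measured - r_flat) / (r_bent_90 - r_flat)

# A reading halfway between the calibration resistances maps to 45 degrees;
# the sign of (r_measured - r_flat) indicates the bending direction.
angle = bend_angle_from_resistance(r_measured=1100.0, r_flat=1000.0, r_bent_90=1200.0)
```

In practice the resistance-angle relation of a real gauge may be nonlinear, in which case a lookup table or polynomial fit would replace the linear map.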
- the upper garment 410 may also include other modules, such as a power supply module, a communication module, an input/output module, etc.
- the upper garment processing module 4120 is similar to the processing module 220 in FIG. 2, and the upper garment acquisition module 4140 is similar to the acquisition module 210 in FIG. 2; for details of these modules, reference may be made to the relevant description above, which will not be repeated here.
- Fig. 5 is an exemplary flowchart of a motion monitoring method according to some embodiments of the present application. As shown in Figure 5, the process 500 may include:
- In step 510, an action signal of the user during exercise is acquired.
- this step 510 may be performed by the acquisition module 210 .
- the motion signal refers to the human body parameter information when the user is exercising.
- the human body parameter information may include, but not limited to, one or more of myoelectric signals, posture signals, electrocardiographic signals, temperature signals, humidity signals, blood oxygen concentration, respiratory rate, and the like.
- the myoelectric sensor in the acquiring module 210 can collect the myoelectric signal of the user during exercise. For example, when the user performs a seated chest fly, the myoelectric sensors in the wearable device corresponding to the positions of the human pectoral muscles and latissimus dorsi can collect the myoelectric signals of the user's corresponding muscle positions.
- the myoelectric sensor corresponding to the position of the human gluteus maximus and quadriceps in the wearable device can collect the myoelectric signal of the corresponding muscle position of the user.
- the myoelectric sensor corresponding to the human gastrocnemius muscle and other positions in the wearable device can collect the myoelectric signals of the human gastrocnemius muscle and other positions.
- the posture sensor in the acquisition module 210 can collect posture signals of the user during motion.
- the posture sensor corresponding to the position of the triceps of the human body in the wearable device can collect the posture signals of the position of the user's triceps.
- the posture sensor installed at the deltoid muscle of the human body can collect posture signals of the deltoid muscle of the user.
- the number of posture sensors in the acquisition module 210 can be multiple, and multiple posture sensors can obtain posture signals of multiple parts when the user is moving; the posture signals of multiple parts can reflect the relative movement between different parts of the human body.
- a pose signal at the arm and a pose signal at the torso may reflect the motion of the arm relative to the torso.
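One conventional way to express such relative motion, sketched here under the assumption that each attitude sensor reports an orientation as a rotation matrix (the patent does not specify this representation), is to compose the arm's orientation with the inverse of the torso's orientation:

```python
import numpy as np

def rot_z(deg):
    """Rotation matrix for a rotation of `deg` degrees about the z axis."""
    t = np.radians(deg)
    c, s = np.cos(t), np.sin(t)
    return np.array([[c, -s, 0.0], [s, c, 0.0], [0.0, 0.0, 1.0]])

def relative_rotation(torso_R, arm_R):
    """Orientation of the arm expressed in the torso's frame:
    R_rel = R_torso^T @ R_arm (rotation matrices are orthogonal,
    so the transpose is the inverse)."""
    return torso_R.T @ arm_R

# If the torso has turned 30 degrees and the arm 75 degrees (both about
# the vertical axis), the arm has moved 45 degrees relative to the torso.
R_rel = relative_rotation(rot_z(30.0), rot_z(75.0))
rel_deg = np.degrees(np.arctan2(R_rel[1, 0], R_rel[0, 0]))
```

The same composition works with quaternions; the key point is that the shared-frame orientations of two sensors are combined so that the torso's own movement cancels out.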
- the attitude signal is associated with the type of the attitude sensor. When the attitude sensor is an angular velocity three-axis sensor, the acquired attitude signal is angular velocity information. When the attitude sensor is a combination of an angular velocity three-axis sensor and an acceleration three-axis sensor, the acquired attitude signal is angular velocity information and acceleration information. When the attitude sensor is a strain sensor, the strain sensor can be set at the joint position of the user; by measuring the resistance of the strain sensor, which changes with the stretching length, the acquired attitude signals can be displacement information, stress, etc., and the bending angle and bending direction at the user's joints can be characterized by these attitude signals.
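A common way to combine angular-velocity and acceleration readings into an attitude estimate, shown here only as an illustrative sketch rather than the method claimed in this application, is a complementary filter: the gyro term tracks fast motion while the accelerometer-derived tilt corrects slow drift.

```python
import math

def complementary_filter(angle_prev, gyro_rate, accel_angle, dt, alpha=0.98):
    """One update step of a complementary filter fusing a gyro rate
    (rad/s) with an accelerometer-derived tilt angle (rad).
    `alpha` weights fast gyro tracking against slow drift correction."""
    return alpha * (angle_prev + gyro_rate * dt) + (1.0 - alpha) * accel_angle

def tilt_from_accel(ax, az):
    """Tilt angle (radians) estimated from two accelerometer axes,
    assuming the sensor is not accelerating much besides gravity."""
    return math.atan2(ax, az)

# With no rotation and gravity along z, the estimate stays at zero:
angle = 0.0
for _ in range(100):
    angle = complementary_filter(angle, gyro_rate=0.0,
                                 accel_angle=tilt_from_accel(0.0, 9.8),
                                 dt=0.01)
```

More elaborate fusion (e.g., a Kalman filter) follows the same idea: each sensor type contributes the part of the attitude it measures well.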
- parameter information that reflects the relative movement of the user's body can serve as the feature information corresponding to the attitude signal, and different types of attitude sensors can be used to obtain it depending on the type of feature information required.
- the action signal may include a myoelectric signal of a specific part of the user's body and a gesture signal of the specific part.
- Myoelectric signals and posture signals can reflect the movement state of specific parts of the user's body from different angles. Simply put, the posture signal of a specific part of the user's body can reflect the action type, action amplitude, action frequency, etc. of that part.
- the EMG signal can reflect the muscle state of the specific part during exercise. In some embodiments, by using the electromyography signal and/or posture signal of the same body part, it is possible to better evaluate whether the movement of the part is normal.
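As an illustration of extracting such feature information from an EMG window (the specific features below — RMS amplitude and a zero-crossing frequency estimate — are common EMG features assumed for the sketch, not specified by this application):

```python
import math

def emg_features(samples, fs):
    """Extract two simple features from a window of EMG samples:
    RMS amplitude (signal intensity) and a mean frequency estimated
    from the zero-crossing rate. `fs` is the sampling rate in Hz."""
    n = len(samples)
    rms = math.sqrt(sum(x * x for x in samples) / n)
    crossings = sum(1 for a, b in zip(samples, samples[1:]) if a * b < 0)
    # each full cycle of an oscillation contributes two zero crossings
    mean_freq = crossings * fs / (2.0 * n)
    return rms, mean_freq

# A 47 Hz sine sampled at 1000 Hz for one second: RMS is about 0.707
# and the estimated frequency is close to 47 Hz.
fs = 1000
sig = [math.sin(2 * math.pi * 47 * i / fs) for i in range(fs)]
rms, f = emg_features(sig, fs)
```

Real EMG processing would band-pass filter and rectify the raw signal first; the point here is only that amplitude and frequency features are cheap to compute per window.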
- In step 520, the user's movement is monitored based at least on the feature information corresponding to the EMG signal or the feature information corresponding to the posture signal.
- the feature information corresponding to the EMG signal may include but not limited to one or more of frequency information, amplitude information, and the like.
- the feature information corresponding to the posture signal refers to parameter information used to characterize the relative movement of the user's body.
- the feature information corresponding to the attitude signal may include, but is not limited to, one or more of angular velocity direction, angular velocity value, angular acceleration value, and the like.
- the feature information corresponding to the attitude signal may also include angle, displacement information (such as stretching length in a strain gauge sensor), stress, and the like.
- when the attitude sensor is a strain sensor, the strain sensor can be set at the joint position of the user; by measuring the resistance of the strain sensor, which changes with the stretching length, the acquired attitude signals can be displacement information, stress, etc., and the bending angle and bending direction at the user's joints can be characterized by these posture signals.
- the processing module 220 and/or the processing device 110 can extract feature information corresponding to the EMG signal (for example, frequency information, amplitude information) or feature information corresponding to the attitude signal (for example, angular velocity direction, angular velocity value, angular acceleration value, angle, displacement information, stress, etc.), and monitor the user's movement based on the feature information corresponding to the electromyographic signal or the feature information corresponding to the attitude signal.
- monitoring the motion of the user includes monitoring information related to the user's motion.
- the action-related information may include one or more of user action type, action quantity, action quality (eg, whether the user action meets a standard), action time, and the like.
- the action type refers to the fitness action taken by the user when exercising.
- the type of action may include, but is not limited to, one or more of seated chest fly, squat, deadlift, plank, running, swimming, and the like.
- the number of actions refers to the number of times the user performs an action during exercise. For example, the user performed a seated chest fly 10 times during exercise, where 10 is the number of actions.
- Movement quality refers to the standard degree of the fitness movement performed by the user relative to the standard fitness movement.
- the processing device 110 can determine the type of the user's action based on the feature information corresponding to the action signals (myoelectric signals and posture signals) at specific muscle positions (gluteus maximus, quadriceps, etc.), and judge the quality of the user's squat motion based on the action signal of a standard squat motion.
- the action time refers to the time corresponding to one or more action types of the user or the total time of the exercise process.
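Counting the number of actions from a motion-signal envelope can be sketched, for illustration only, with hysteresis thresholding (the thresholds, the function name, and the envelope values below are assumptions, not taken from this application):

```python
def count_reps(signal, high, low):
    """Count action repetitions in a 1-D posture/EMG envelope using
    hysteresis thresholding: a rep is registered each time the signal
    rises above `high` after having dropped below `low`, which avoids
    double-counting noisy samples near a single threshold."""
    reps, armed = 0, True
    for x in signal:
        if armed and x > high:
            reps += 1
            armed = False
        elif not armed and x < low:
            armed = True
    return reps

# Three clean peaks in the envelope correspond to three repetitions:
envelope = [0, 2, 9, 3, 0, 1, 8, 2, 0, 9, 1, 0]
n = count_reps(envelope, high=7, low=2)
```

Combined with a per-rep timestamp, the same loop also yields the action time for each repetition and the total exercise duration.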
- the feature information corresponding to the posture signal may include parameter information that can be used to reflect the relative movement of the user's body.
- the exercise monitoring system 100 needs to obtain relative movements between different parts of the user's body.
- the attitude signal may be attitude data measured by an attitude sensor.
- the posture sensors can be distributed in many different parts of the user's body.
- the motion monitoring system 100 may calibrate gesture signals between multiple different parts of the user's body.
- Fig. 6 is an exemplary flowchart of a motion data calibration method 3000 according to some embodiments of the present application.
- the motion data calibration system 180 can execute the motion data calibration method 3000 provided in this application.
- the motion data calibration system 180 may execute the motion data calibration method 3000 on the processing device 110, and may also execute the motion data calibration method 3000 on the mobile terminal device 140.
- the processor 320 in the processing device 110 can read the instruction set stored in its local storage medium, and then execute the motion data calibration method 3000 provided in this application according to the specification of the instruction set.
- the method 3000 of motion data calibration may include:
- the motion data refers to information about human body motion parameters when the user is exercising.
- the action data may include at least one gesture signal corresponding to at least one measurement position on the user's body.
- the posture signal and its corresponding feature information (for example, angular velocity direction, angular velocity value, angular acceleration value, angle, displacement information, stress, etc.) can reflect the posture of the user's motion.
- At least one measurement position is in one-to-one correspondence with at least one attitude signal.
- the measurement locations may be different parts on the user's body.
- the at least one posture signal corresponds to the actual posture of the at least one measured location on the user's body when the user moves.
- Each attitude signal of the at least one attitude signal may include three-dimensional attitude data of its corresponding measurement position in the original coordinate system.
- the original coordinate system may be the coordinate system where the attitude signal is located.
- the change of the original coordinate system may also cause the change of the posture signal.
- Each attitude signal may include one or more types of three-dimensional attitude data. For example, three-dimensional angle data, three-dimensional angular velocity data, three-dimensional angular acceleration data, three-dimensional velocity data, three-dimensional displacement data, three-dimensional stress data, and so on.
- the attitude signal can be acquired by an attitude sensor on the wearable device 130 .
- the sensor unit of the acquisition module 210 in the wearable device 130 may include an attitude sensor.
- the wearable device 130 may include at least one attitude sensor.
- At least one posture sensor may be located at at least one measurement location on the user's body.
- the attitude sensor can collect the attitude signal of the corresponding measurement position on the user's body.
- the attitude sensors on the wearable device 130 may be distributed in the limbs of the human body (eg, arms, legs, etc.), the torso of the human body (eg, chest, abdomen, back, waist, etc.), and the head of the human body.
- the attitude sensors can acquire attitude signals of body parts such as the limbs and torso of the human body.
- the posture sensor can be placed at different positions of the wearable device 130 according to the posture signals to be acquired, so as to measure posture signals corresponding to different positions of the human body.
- the attitude sensor may also be an attitude and heading reference system (AHRS) sensor with an attitude fusion algorithm.
- the attitude fusion algorithm can fuse the data of a nine-axis inertial measurement unit (IMU), which comprises a three-axis acceleration sensor, a three-axis angular velocity sensor, and a three-axis geomagnetic sensor, into Euler angles or quaternions to obtain the attitude signal of the part of the user's body where the attitude sensor is located.
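- As an illustration of the attitude-fusion idea, the sketch below blends gyroscope and accelerometer readings with a complementary filter to estimate a single pitch angle. The function name, the blend factor `alpha`, and the reduction to one axis are our assumptions; a real nine-axis AHRS also fuses the magnetometer and outputs full Euler angles or quaternions.

```python
import math

def fuse_pitch(pitch_prev, gyro_y, acc_x, acc_z, dt, alpha=0.98):
    """Complementary-filter sketch: blend the short-term gyro integration
    with the long-term tilt angle implied by the accelerometer's gravity
    reading. All names here are illustrative, not the application's API."""
    pitch_gyro = pitch_prev + gyro_y * dt  # integrate angular rate (drifts over time)
    pitch_acc = math.atan2(acc_x, acc_z)   # gravity direction (noisy but drift-free)
    return alpha * pitch_gyro + (1 - alpha) * pitch_acc
```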
- the processing module 220 and/or the processing device 110 may determine the feature information corresponding to the attitude signal based on the attitude signal.
- the feature information corresponding to the attitude signal may include, but is not limited to, angular velocity value, angular velocity direction, angular acceleration value, and the like.
- the posture sensor may be a strain sensor, and the strain sensor may obtain the bending direction and bending angle of the joints of the user, so as to obtain the posture signal of the user during motion.
- the strain sensor can be set at the user's knee joint. When the user moves, the user's body parts act on the strain sensor, and the bending direction and bending angle at the user's knee joint can be calculated based on the resistance or length change of the strain sensor.
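- A minimal sketch of how a strain sensor's resistance change could be mapped to a bending angle, assuming a hypothetical linear calibration model (the function and the constant `k_deg_per_ohm` are illustrative, not the application's actual method):

```python
def bend_angle_deg(r_measured, r_flat, k_deg_per_ohm):
    """Hypothetical linear model: the joint's bending angle is assumed
    proportional to the strain sensor's resistance change relative to
    its flat (unbent) resistance. k_deg_per_ohm would come from a
    per-device calibration."""
    return (r_measured - r_flat) * k_deg_per_ohm
```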
- the attitude sensor may also include an optical fiber sensor, and the attitude signal may be represented by a direction change of light of the optical fiber sensor after being bent.
- the attitude sensor can also be a magnetic flux sensor, and the attitude signal can be characterized by the change of the magnetic flux. It should be noted that the type of the posture sensor is not limited to the sensors above; any sensor capable of acquiring the user's posture signals falls within the scope of the posture sensor of the present application.
- each attitude signal may include one or more types of three-dimensional attitude data.
- Attitude sensors may also include multiple types of sensors.
- the attitude sensor may include at least one of an acceleration sensor, an angular velocity sensor, and a magnetic force sensor.
- the original coordinate system may be the coordinate system where the attitude sensor is located.
- the original coordinate system refers to the coordinate system corresponding to the posture sensor disposed on the human body.
- the posture sensors on the wearable device 130 are distributed on different parts of the human body, so their installation angles on the human body differ, and each sensor's coordinate system follows the body part to which it is attached;
- attitude sensors at different parts therefore have different original coordinate systems.
- the attitude signals acquired by each attitude sensor may be expressed in its corresponding original coordinate system.
- the attitude signal acquired by the attitude sensor may express, in the original coordinate system, the attitude relative to the preset fixed coordinate system.
- the preset fixed coordinate system may be a geodetic coordinate system, or any other preset coordinate system.
- the conversion relationship between the original coordinate system and the preset fixed coordinate system may be pre-stored in the motion data calibration system 180 .
- the attitude signal can be a signal directly acquired by the attitude sensor, an attitude signal produced by signal processing such as conventional filtering, rectification, wavelet transform, or glitch removal, or a signal resulting from any combination of one or more of the above processing flows.
- the posture signal may be data measured by an image sensor.
- the image sensor may be an image sensor capable of acquiring depth information, such as a 3D structured light camera, a binocular camera, and the like.
- the image sensor can be installed at any position where an image of the user's movement can be captured.
- the number of image sensors can be one or more. When there are multiple image sensors, they may be installed at multiple different positions.
- the image sensor can acquire a depth image of the user's motion.
- the depth image may contain depth information of at least one measurement position on the user's body relative to the coordinate system where the image sensor is located.
- the motion data calibration system 180 may calculate an attitude signal of each measurement position in the at least one measurement position based on changes in multiple frames of depth images. As mentioned above, each attitude signal may include one or more types of three-dimensional attitude data.
- the motion data calibration system 180 can calculate different three-dimensional pose data.
- the original coordinate system may be the coordinate system where the image sensor itself is located.
- the posture signal acquired by the image sensor may be expressed in the image sensor's own coordinate system (the original coordinate system).
- Image sensors can be pre-calibrated. That is, the motion data calibration system 180 may pre-store the conversion relationship between the image sensor's own coordinate system and the aforementioned preset fixed coordinate system.
- For ease of description, we will take the action data measured by at least one attitude sensor as an example.
- For ease of description, we define the original coordinate system as the o-xyz coordinate system.
- o is the coordinate origin of the original coordinate system o-xyz, and the x-axis, y-axis, and z-axis are its three mutually perpendicular coordinate axes.
- each attitude signal may be three-dimensional attitude data whose corresponding measurement position is in the original coordinate system o-xyz.
- the three-dimensional pose data may be pose data on three mutually perpendicular coordinate axes in the coordinate system where it is located.
- attitude data may include angle data and angular velocity data.
- the three-dimensional attitude data in the original coordinate system o-xyz may include angle data and angular velocity data on three mutually perpendicular coordinate axes x-axis, y-axis and z-axis.
- the 3D attitude data of each attitude signal in the original coordinate system o-xyz may include 3D angle (Euler) data E sens and 3D angular velocity (Gyro) data G sens .
- the three-dimensional angular data E sens may include angular data E sens_x on the x-axis, angular data E sens_y on the y-axis, and angular data E sens_z on the z-axis.
- the three-dimensional angular velocity data G sens may include angular velocity data G sens_x on the x-axis, angular velocity data G sens_y on the y-axis, and angular velocity data G sens_z on the z-axis.
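- The per-sensor data described above could be held in a small container like the following (the class and field names are illustrative, not part of the application):

```python
from dataclasses import dataclass

@dataclass
class AttitudeSignal:
    """One attitude signal in its sensor's original coordinate system o-xyz:
    3D Euler-angle data E_sens and 3D angular-velocity data G_sens.
    Field names follow the description; the container itself is a sketch."""
    e_sens: tuple[float, float, float]  # (E_sens_x, E_sens_y, E_sens_z)
    g_sens: tuple[float, float, float]  # (G_sens_x, G_sens_y, G_sens_z)
```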
- this step may be performed by the processing module 220 and/or the processing device 110 .
- the motion data calibration system 180 can transform the motion data into posture data in the same known coordinate system (for example, a target coordinate system defined as O-XYZ).
- the target coordinate system O-XYZ may include three mutually perpendicular coordinate axes of X axis, Y axis and Z axis.
- the target coordinate system O-XYZ may be any calibrated coordinate system. In some embodiments, the target coordinate system O-XYZ may be the aforementioned preset fixed coordinate system. In some embodiments, the target coordinate system O-XYZ and the aforementioned preset fixed coordinate system may be different coordinate systems.
- the motion data calibration system 180 may pre-store the aforementioned conversion relationship between the preset fixed coordinate system and the target coordinate system O-XYZ.
- the exercise data calibration system 180 is used to calibrate the user's motion data during exercise, and the measurement object is the user. Therefore, in some embodiments, the target coordinate system O-XYZ may take the length direction of the torso when the human body is standing as the Z axis. That is, the Z axis points opposite to the vertical direction of the acceleration of gravity: it is perpendicular to the ground and points toward the sky. The plane formed by the X axis and the Y axis is the horizontal plane perpendicular to the Z axis.
- the X axis and the Y axis may be any two mutually perpendicular coordinate axes in a horizontal plane perpendicular to the Z axis.
- the X-axis may be an east-west oriented coordinate axis, such as one pointing east;
- the Y-axis may be a north-south oriented coordinate axis, such as one pointing north.
- Fig. 7 shows a schematic diagram of a target coordinate system O-XYZ provided according to an embodiment of the present specification, in which the X-axis direction is directly in front of the user 001 when standing.
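- A sketch of how such a target frame could be constructed from a gravity measurement, with Z opposing gravity and X/Y an arbitrary perpendicular pair in the horizontal plane (the helper-vector choice and function name are our assumptions):

```python
import numpy as np

def target_axes(gravity_vec):
    """Build an orthonormal target frame O-XYZ from a measured gravity vector:
    Z opposes gravity (points skyward); X and Y are any perpendicular pair
    in the horizontal plane. Which horizontal pair is picked is arbitrary."""
    z = -np.asarray(gravity_vec, dtype=float)
    z /= np.linalg.norm(z)
    # pick a helper vector not parallel to z, then orthogonalize it against z
    helper = np.array([1.0, 0.0, 0.0]) if abs(z[0]) < 0.9 else np.array([0.0, 1.0, 0.0])
    x = helper - helper.dot(z) * z
    x /= np.linalg.norm(x)
    y = np.cross(z, x)  # completes the right-handed frame
    return x, y, z
```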
- S3060 Transform each attitude signal into two-dimensional attitude data in the target coordinate system.
- FIG. 8 is an exemplary flow chart of converting into two-dimensional pose data according to some embodiments of the present application.
- FIG. 8 corresponds to step S3060.
- step S3060 may include:
- the attitude signal acquired by each attitude sensor can be expressed in its corresponding original coordinate system.
- the attitude signal acquired by the attitude sensor may express, in the original coordinate system, the attitude of the measurement position corresponding to the current attitude signal relative to a preset fixed coordinate system.
- based on each attitude signal, the motion data calibration system 180 can obtain, through inverse conversion, the expression of the attitude sensor's corresponding original coordinate system o-xyz in the preset fixed coordinate system.
- the motion data calibration system 180 may pre-store the conversion relationship between the preset fixed coordinate system and the target coordinate system O-XYZ.
- the motion data calibration system 180 can calculate and determine the conversion relationship between each original coordinate system o-xyz and the target coordinate system O-XYZ based on the conversion relationship between the preset fixed coordinate system and the target coordinate system O-XYZ.
- the motion data calibration system 180 can convert the posture information in the original coordinate system o-xyz into the posture information of O-XYZ in the target coordinate system based on the conversion relationship.
- the transformation relationship may be expressed as one or more rotation matrices.
- the rotation matrix may be pre-stored in the motion data calibration system 180 .
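- The chained conversion described above can be sketched with rotation matrices: a sample expressed in a sensor's original frame o-xyz is first taken to the preset fixed frame and then to the target frame O-XYZ (function names are illustrative):

```python
import numpy as np

def original_to_target(R_fixed_to_target, R_original_to_fixed):
    """Chain the two rotations: original frame -> preset fixed frame,
    then preset fixed frame -> target frame O-XYZ."""
    return R_fixed_to_target @ R_original_to_fixed

def to_target_frame(R_original_to_target, g_sens):
    """Re-express a 3D angular-velocity sample (G_sens_x, G_sens_y, G_sens_z)
    in the target coordinate system O-XYZ."""
    return R_original_to_target @ np.asarray(g_sens, dtype=float)
```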
- each attitude signal may include three-dimensional attitude data E sens_x , E sens_y , E sens_z , G sens_x , G sens_y and G sens_z in its corresponding original coordinate system o-xyz.
- based on the conversion relationship between the target coordinate system O-XYZ and the original coordinate system o-xyz, the motion data calibration system 180 can determine the three-dimensional motion data in the target coordinate system O-XYZ for the measurement position corresponding to each attitude signal.
- the three-dimensional motion data at least includes angular velocity data G global_X on the X axis, angular velocity data G global_Y on the Y axis, and angular velocity data G global_Z on the Z axis.
- S3066 Transform the three-dimensional motion data in the target coordinate system O-XYZ into two-dimensional posture data in the target coordinate system O-XYZ.
- the two-dimensional posture data is data in a two-dimensional coordinate system.
- the two-dimensional coordinate system in the target coordinate system O-XYZ may include the motion plane in which the limbs of user 001 swing and the motion plane in which the torso of user 001 rotates.
- the two-dimensional attitude data in the target coordinate system O-XYZ may include horizontal attitude data and vertical attitude data.
- the horizontal attitude data may include horizontal angle data E global_Z and horizontal angular velocity data G global_Z when moving in a horizontal plane perpendicular to the Z axis.
- the vertical attitude data may include vertical angle data E global_XY and vertical angular velocity data G global_XY when moving in any vertical plane perpendicular to the horizontal plane.
- the vertical plane may be any plane perpendicular to the horizontal plane.
- the horizontal angle data E global_Z may be the rotation angle of the measurement position within the horizontal plane in the target coordinate system O-XYZ.
- the horizontal angular velocity data G global_Z may be the rotational angular velocity of the measurement position within the horizontal plane in the target coordinate system O-XYZ.
- the vertical angle data E global_XY may be the rotation angle of the measurement position in any vertical plane in the target coordinate system O-XYZ.
- the vertical angular velocity data G global_XY may be the rotational angular velocity of the measurement position in any vertical plane in the target coordinate system O-XYZ.
- the main part of the actions of user 001 can be decomposed into movements on two planes, for example, movement in the horizontal plane in place and movement in any vertical plane.
- the different actions performed by the user 001 when exercising can be distinguished only by the movement in the horizontal plane and the movement in the vertical plane.
- the running action of the user 001 mainly consists of rotational movement around joint axes parallel to the horizontal plane. This rotational movement occurs in a vertical plane perpendicular to the horizontal plane, and that vertical plane extends along the direction the body of user 001 faces.
- the center of gravity of user 001 moves linearly up and down, and the user's limbs swing with the center of gravity.
- the biceps curling movement only has movement in the vertical plane.
- the chest clamping action in the sitting position only moves in the horizontal plane.
- Fig. 9 is a coordinate system diagram of user 001 when exercising according to some embodiments of the present application.
- the orientation of user 001 is basically unchanged.
- the action of the biceps curl is mainly concentrated in the vertical plane P formed by the forearm AO' and the upper arm O'B. Because the action overcomes the gravity of the dumbbell, the plane P is generally perpendicular to the horizontal X-Y plane.
- the pose of the upper arm O'B is basically unchanged, and the forearm AO' swings back and forth around the elbow O'.
- the vector direction of the swing is the normal direction of the vertical plane P.
- user 001's bicep curl movement has very small components in other directions and can be ignored.
- the motion data calibration system 180 converts the 3D motion data into 2D posture data.
- step S3066 may include:
- S3066-2 Convert the angular velocity data G global_X on the X-axis and the angular velocity data G global_Y on the Y-axis into vertical angular velocity data G global_XY by vector law.
- the vertical angular velocity data G global_XY can be expressed as the following formula: G_global_XY = √(G_global_X² + G_global_Y²)
- S3066-4 Time-integrate the vertical angular velocity data G global_XY based on the time corresponding to the start position and end position of the user 001's motion to obtain the vertical angle data E global_XY: E_global_XY = ∫_{startpos}^{endpos} G_global_XY dt
- startpos and endpos are the start time and end time corresponding to the start position (start position) and end position (end position) of an action.
- S3066-6 Take the angular velocity data G global_Z on the Z axis as the horizontal angular velocity data.
- S3066-8 Time-integrate the horizontal angular velocity data G global_Z based on the time corresponding to the start position and end position of the user 001's motion to obtain the horizontal angle data E global_Z.
- the horizontal angle data E global_Z can be expressed as the following formula: E_global_Z = ∫_{startpos}^{endpos} G_global_Z dt
- startpos and endpos are the start time and end time corresponding to the start position (start position) and end position (end position) of an action.
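- The steps of S3066 above can be sketched as follows; the trapezoidal integration scheme and the function signature are our assumptions:

```python
import numpy as np

def to_2d_pose(g_X, g_Y, g_Z, t, startpos, endpos):
    """Sketch of step S3066: combine the X/Y angular velocities by the
    vector law into the vertical angular velocity G_global_XY, keep the Z
    component as the horizontal angular velocity G_global_Z, and integrate
    both over the action's [startpos, endpos] interval to get the angles."""
    g_X, g_Y, g_Z, t = (np.asarray(a, dtype=float) for a in (g_X, g_Y, g_Z, t))
    sel = (t >= startpos) & (t <= endpos)            # samples inside the action
    g_vert = np.sqrt(g_X[sel] ** 2 + g_Y[sel] ** 2)  # vector law -> G_global_XY
    g_horz = g_Z[sel]                                # G_global_Z used as-is
    dt = np.diff(t[sel])
    # trapezoidal time integration (an assumption; any quadrature would do)
    e_vert = float(np.sum((g_vert[1:] + g_vert[:-1]) * 0.5 * dt))  # E_global_XY
    e_horz = float(np.sum((g_horz[1:] + g_horz[:-1]) * 0.5 * dt))  # E_global_Z
    return g_vert, e_vert, g_horz, e_horz
```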
- the motion data calibration method 3000 may also include:
- the motion data calibration system 180 can determine the relative motion between different moving parts of the body of user 001 from the at least one piece of two-dimensional posture data corresponding to the at least one posture signal of the at least one measurement position on the body of user 001. For example, the relative movement between the arm and the torso of user 001 during exercise can be judged from the feature information corresponding to the attitude sensor on the arm of user 001 and the feature information corresponding to the attitude sensor on the torso of user 001.
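- Once both measurement positions are expressed in the same target frame, a relative-motion estimate can be as simple as differencing their angle data (the function is illustrative only):

```python
def relative_vertical_angle(e_arm, e_torso):
    """With the arm and torso sensors calibrated into the same target frame
    O-XYZ, their relative vertical rotation is the difference of their
    vertical angle data E_global_XY."""
    return e_arm - e_torso
```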
- the motion data calibration method 3000 and system 180 provided by this application can convert the motion data of user 001 during exercise from three-dimensional posture data on three mutually perpendicular coordinate axes into two-dimensional data in the target coordinate system, that is, posture data in the horizontal plane and posture data in the vertical plane. The actions of user 001 during exercise are thus decomposed into horizontal movement and vertical movement, avoiding the data discrepancies that would result from user 001 facing different directions.
- the method 3000 and system 180 can remove the influence of the orientation of user 001 on the motion data. Therefore, the method 3000 and the system 180 can calibrate the exercise data without requiring user 001 to perform a calibration action.
Landscapes
- Engineering & Computer Science (AREA)
- Physics & Mathematics (AREA)
- Health & Medical Sciences (AREA)
- General Physics & Mathematics (AREA)
- Life Sciences & Earth Sciences (AREA)
- Theoretical Computer Science (AREA)
- Computer Vision & Pattern Recognition (AREA)
- Biophysics (AREA)
- General Health & Medical Sciences (AREA)
- Medical Informatics (AREA)
- Public Health (AREA)
- Veterinary Medicine (AREA)
- Animal Behavior & Ethology (AREA)
- Surgery (AREA)
- Molecular Biology (AREA)
- Pathology (AREA)
- Biomedical Technology (AREA)
- Heart & Thoracic Surgery (AREA)
- Oral & Maxillofacial Surgery (AREA)
- Dentistry (AREA)
- Physiology (AREA)
- Geometry (AREA)
- General Engineering & Computer Science (AREA)
- Multimedia (AREA)
- Computer Graphics (AREA)
- Software Systems (AREA)
- Physical Education & Sports Medicine (AREA)
- Epidemiology (AREA)
- Primary Health Care (AREA)
- Human Computer Interaction (AREA)
- Measurement Of The Respiration, Hearing Ability, Form, And Blood Characteristics Of Living Organisms (AREA)
Abstract
Description
Claims (11)
- A motion data calibration method, characterized by comprising: acquiring action data of a user during exercise, the action data including at least one posture signal corresponding to at least one measurement position on the user's body, each of the at least one posture signal including three-dimensional posture data of its corresponding measurement position in an original coordinate system; constructing a target coordinate system, the target coordinate system including three mutually perpendicular coordinate axes: an X axis, a Y axis, and a Z axis; and converting each posture signal into two-dimensional posture data in the target coordinate system.
- The motion data calibration method of claim 1, wherein each posture signal includes data measured by a posture sensor, and the original coordinate system includes the coordinate system in which the posture sensor is located.
- The motion data calibration method of claim 2, wherein the posture sensor includes at least one of an acceleration sensor, an angular velocity sensor, and a magnetic force sensor.
- The motion data calibration method of claim 1, wherein each posture signal includes data measured by an image sensor, and the original coordinate system includes the coordinate system in which the image sensor is located.
- The motion data calibration method of claim 1, wherein the three-dimensional posture data includes angle data and angular velocity data on three mutually perpendicular coordinate axes.
- The motion data calibration method of claim 1, wherein converting each posture signal into two-dimensional posture data in the target coordinate system comprises: acquiring a pre-stored conversion relationship between the target coordinate system and the original coordinate system; converting, based on the conversion relationship, each posture signal into three-dimensional motion data in the target coordinate system, the three-dimensional motion data including at least angular velocity data on the X axis, angular velocity data on the Y axis, and angular velocity data on the Z axis; and converting the three-dimensional motion data in the target coordinate system into the two-dimensional posture data in the target coordinate system.
- The motion data calibration method of claim 6, wherein the Z axis of the target coordinate system is opposite to the vertical direction of gravitational acceleration.
- The motion data calibration method of claim 7, wherein the two-dimensional posture data in the target coordinate system includes: horizontal posture data, including horizontal angle data and horizontal angular velocity data when moving in a horizontal plane perpendicular to the Z axis; and vertical posture data, including vertical angle data and vertical angular velocity data when moving in any vertical plane perpendicular to the horizontal plane.
- The motion data calibration method of claim 8, wherein converting the three-dimensional motion data in the target coordinate system into the two-dimensional posture data in the target coordinate system comprises: converting the angular velocity data on the X axis and the angular velocity data on the Y axis into the vertical angular velocity data by the vector law; time-integrating the vertical angular velocity data based on the times corresponding to the start position and the end position of the user's motion to obtain the vertical angle data; taking the angular velocity data on the Z axis as the horizontal angular velocity data; and time-integrating the horizontal angular velocity data based on the times corresponding to the start position and the end position of the user's motion to obtain the horizontal angle data.
- The motion data calibration method of claim 1, further comprising: determining the relative motion between the at least one measurement position based on the two-dimensional posture data corresponding to each posture signal.
- A motion data calibration system, characterized by comprising: at least one storage medium storing at least one instruction set for motion data calibration; and at least one processor communicatively connected to the at least one storage medium, wherein, when the motion data calibration system runs, the at least one processor reads the at least one instruction set and implements the motion data calibration method of any one of claims 1-10.
Priority Applications (5)
Application Number | Priority Date | Filing Date | Title |
---|---|---|---|
EP21965891.1A EP4365850A1 (en) | 2021-11-30 | 2021-11-30 | Movement data calibration method and system |
CN202180100382.3A CN117651847A (zh) | 2021-11-30 | 2021-11-30 | 运动数据标定方法和系统 |
KR1020237044869A KR20240013224A (ko) | 2021-11-30 | 2021-11-30 | 운동 데이터 캘리브레이션 방법과 시스템 |
PCT/CN2021/134421 WO2023097447A1 (zh) | 2021-11-30 | 2021-11-30 | 运动数据标定方法和系统 |
US18/421,955 US20240168051A1 (en) | 2021-11-30 | 2024-01-24 | Motion data calibration method and system |
Applications Claiming Priority (1)
Application Number | Priority Date | Filing Date | Title |
---|---|---|---|
PCT/CN2021/134421 WO2023097447A1 (zh) | 2021-11-30 | 2021-11-30 | 运动数据标定方法和系统 |
Related Child Applications (1)
Application Number | Title | Priority Date | Filing Date |
---|---|---|---|
US18/421,955 Continuation US20240168051A1 (en) | 2021-11-30 | 2024-01-24 | Motion data calibration method and system |
Publications (1)
Publication Number | Publication Date |
---|---|
WO2023097447A1 true WO2023097447A1 (zh) | 2023-06-08 |
Family
ID=86611295
Family Applications (1)
Application Number | Title | Priority Date | Filing Date |
---|---|---|---|
PCT/CN2021/134421 WO2023097447A1 (zh) | 2021-11-30 | 2021-11-30 | 运动数据标定方法和系统 |
Country Status (5)
Country | Link |
---|---|
US (1) | US20240168051A1 (zh) |
EP (1) | EP4365850A1 (zh) |
KR (1) | KR20240013224A (zh) |
CN (1) | CN117651847A (zh) |
WO (1) | WO2023097447A1 (zh) |
Citations (7)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
- JP2013242790A (ja) * | 2012-05-22 | 2013-12-05 | Nippon Telegr & Teleph Corp <Ntt> | Moving-object pose calculation device, moving-object pose calculation method, and program |
- US20150294597A1 (en) * | 2012-10-23 | 2015-10-15 | New York University | Somatosensory feedback wearable object |
- CN108939512A (zh) * | 2018-07-23 | 2018-12-07 | 大连理工大学 | Swimming posture measurement method based on wearable sensors |
- CN110517336A (zh) * | 2019-08-28 | 2019-11-29 | 北京理工大学 | Human motion data compression method and device based on primary force-exerting joint points |
- CN110705496A (zh) * | 2019-10-11 | 2020-01-17 | 成都乐动信息技术有限公司 | Swimming posture recognition method based on a nine-axis sensor |
- CN111353941A (zh) * | 2018-12-21 | 2020-06-30 | 广州幻境科技有限公司 | Spatial coordinate conversion method |
- CN111767932A (zh) * | 2019-04-02 | 2020-10-13 | 北京深蓝长盛科技有限公司 | Action determination method and device, computer equipment, and computer-readable storage medium |
2021
- 2021-11-30 WO PCT/CN2021/134421 patent/WO2023097447A1/zh active Application Filing
- 2021-11-30 EP EP21965891.1A patent/EP4365850A1/en active Pending
- 2021-11-30 CN CN202180100382.3A patent/CN117651847A/zh active Pending
- 2021-11-30 KR KR1020237044869A patent/KR20240013224A/ko unknown
2024
- 2024-01-24 US US18/421,955 patent/US20240168051A1/en active Pending
Also Published As
Publication number | Publication date |
---|---|
CN117651847A (zh) | 2024-03-05 |
EP4365850A1 (en) | 2024-05-08 |
KR20240013224A (ko) | 2024-01-30 |
US20240168051A1 (en) | 2024-05-23 |
Similar Documents
Publication | Publication Date | Title |
---|---|---|
WO2018196227A1 (zh) Human motion capability evaluation method, apparatus, and system | |
Slade et al. | An open-source and wearable system for measuring 3D human motion in real-time | |
JP6772276B2 (ja) | 運動認識装置及び運動認識方法 | |
WO2022193425A1 (zh) Motion data display method and system | |
Najafi et al. | Estimation of center of mass trajectory using wearable sensors during golf swing | |
JP2017520336A (ja) | 人体および物体運動への生体力学フィードバックを送達するための方法およびシステム | |
Du et al. | An IMU-compensated skeletal tracking system using Kinect for the upper limb | |
Sanders et al. | An approach to identifying the effect of technique asymmetries on body alignment in swimming exemplified by a case study of a breaststroke swimmer | |
WO2021041823A1 (en) | Systems and methods for wearable devices that determine balance indices | |
Wang et al. | Motion analysis of deadlift for trainers with different levels based on body sensor network | |
Chen et al. | Development of an upper limb rehabilitation system using inertial movement units and kinect device | |
Steijlen et al. | Smart sensor tights: Movement tracking of the lower limbs in football | |
KR20180031610A (ko) | 밴드형 운동 및 생체정보 측정 장치 | |
Postolache et al. | Postural balance analysis using force platform for K-theragame users | |
CN116304544A (zh) Motion data calibration method and system | |
WO2023097447A1 (zh) Motion data calibration method and system | |
JP2024523942A (ja) Motion data calibration method and system | |
Mangin et al. | An instrumented glove for swimming performance monitoring | |
WO2022145563A1 (ko) User-customized exercise training method and system | |
CN113017615A (zh) Virtual interactive exercise assistance system and method based on inertial motion capture devices | |
Tannus de Souza et al. | A virtual reality exergame with a low-cost 3d motion tracking for at-home post-stroke rehabilitation | |
TWM584698U (zh) Human joint training feedback device and human joint training feedback system | |
TW201701223A (zh) Fitness record sharing system and method | |
US20230337989A1 (en) | Motion data display method and system | |
TWI745839B (zh) Real-time feedback trunk symmetry coordination training system combined with augmented reality |
Legal Events
Date | Code | Title | Description |
---|---|---|---|
ENP | Entry into the national phase |
Ref document number: 20237044869 Country of ref document: KR Kind code of ref document: A |
|
WWE | Wipo information: entry into national phase |
Ref document number: 1020237044869 Country of ref document: KR |
|
ENP | Entry into the national phase |
Ref document number: 2024500107 Country of ref document: JP Kind code of ref document: A |
|
WWE | Wipo information: entry into national phase |
Ref document number: 202180100382.3 Country of ref document: CN |
|
WWE | Wipo information: entry into national phase |
Ref document number: 2021965891 Country of ref document: EP |
|
ENP | Entry into the national phase |
Ref document number: 2021965891 Country of ref document: EP Effective date: 20240130 |