WO2023097447A1 - Motion data calibration method and system - Google Patents

Motion data calibration method and system

Info

Publication number
WO2023097447A1
WO2023097447A1 (PCT/CN2021/134421)
Authority: WIPO (PCT)
Prior art keywords: data, user, coordinate system, attitude, sensor
Application number: PCT/CN2021/134421
Other languages: English (en), French (fr)
Inventors: 黎美琪, 苏雷, 周鑫, 廖风云, 齐心
Original Assignee: 深圳市韶音科技有限公司
Application filed by 深圳市韶音科技有限公司
Priority to EP21965891.1A (published as EP4365850A1)
Priority to CN202180100382.3A (published as CN117651847A)
Priority to KR1020237044869A (published as KR20240013224A)
Priority to PCT/CN2021/134421 (published as WO2023097447A1)
Publication of WO2023097447A1
Priority to US18/421,955 (published as US20240168051A1)

Classifications

    • G06T 3/067 Reshaping or unfolding 3D tree structures onto 2D planes
    • G06F 3/011 Input arrangements for interaction with the human body, e.g. for user immersion in virtual reality
    • G01P 21/00 Testing or calibrating of apparatus or devices covered by the preceding groups
    • A61B 5/1116 Measuring movement of the entire body or parts thereof: determining posture transitions
    • A61B 5/1121 Measuring movement of the entire body or parts thereof: determining geometric values, e.g. centre of rotation or angular range of movement
    • A61B 5/6805 Sensors mounted on worn items: garments; vests
    • G01P 3/44 Measuring angular speed by the use of electric or magnetic means
    • G06T 17/00 Three-dimensional [3D] modelling, e.g. data description of 3D objects
    • G06T 7/20 Image analysis: analysis of motion
    • G06T 7/33 Determination of transform parameters for the alignment of images (image registration) using feature-based methods
    • G06T 7/70 Determining position or orientation of objects or cameras
    • G06T 7/73 Determining position or orientation of objects or cameras using feature-based methods
    • G16H 20/30 ICT specially adapted for therapies or health-improving plans relating to physical therapies or activities, e.g. physiotherapy, acupressure or exercising
    • A61B 2503/10 Evaluating a particular growth phase or type of persons or animals: athletes
    • A61B 2560/0223 Operational features of calibration, e.g. protocols for calibrating sensors
    • A63B 2220/34 Measuring of physical parameters relating to sporting activity: angular speed
    • A63B 2220/40 Measuring of physical parameters relating to sporting activity: acceleration
    • A63B 2220/836 Sensors arranged on the body of the user
    • A63B 2230/62 Measuring physiological parameters of the user: posture
    • G06T 2207/30196 Subject of image: human being; person

Definitions

  • the present application relates to the technical field of wearable devices, in particular to a method and system for calibration of motion data.
  • motion monitoring equipment mainly monitors a user's motion parameters during exercise through sensors.
  • when calibrating the sensor's coordinates, the user needs to perform a series of calibration actions (such as raising both hands to the front, raising both hands to the sides, etc.) so that the motion monitoring equipment can adjust the sensor's coordinate system according to the calibration actions, that is, convert the posture data in the sensor's coordinate system into posture data in the human body coordinate system.
  • the present application provides a motion data calibration method and system that can calibrate motion data without requiring a user to perform a calibration action.
  • the present application provides a motion data calibration method, including: acquiring motion data of the user during motion, the motion data including at least one posture signal corresponding to at least one measurement position on the user, where each posture signal in the at least one posture signal includes three-dimensional posture data of its corresponding measurement position in an original coordinate system; constructing a target coordinate system, the target coordinate system including three mutually perpendicular coordinate axes, namely an X axis, a Y axis, and a Z axis; and converting each posture signal into two-dimensional posture data in the target coordinate system.
  • each attitude signal includes data measured by an attitude sensor
  • the original coordinate system includes a coordinate system where the attitude sensor is located.
  • the attitude sensor includes at least one of an acceleration sensor, an angular velocity sensor and a magnetic force sensor.
  • each pose signal includes data measured by an image sensor
  • the original coordinate system includes a coordinate system in which the image sensor is located.
  • the three-dimensional pose data includes angle data and angular velocity data on three mutually perpendicular coordinate axes.
  • the converting of each posture signal into two-dimensional posture data in the target coordinate system includes: acquiring a pre-stored conversion relationship between the target coordinate system and the original coordinate system; converting, based on the conversion relationship, each posture signal into three-dimensional motion data in the target coordinate system, the three-dimensional motion data including at least the angular velocity data on the X axis, the angular velocity data on the Y axis, and the angular velocity data on the Z axis; and converting the three-dimensional motion data in the target coordinate system into the two-dimensional posture data in the target coordinate system.
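A minimal sketch of the first two of these steps (illustrative only, not the patent's implementation): the pre-stored conversion relationship is assumed here to be a fixed rotation matrix R that maps measurements from the sensor's original coordinate system into the target coordinate system; the matrix values and function names are hypothetical.

```python
import numpy as np

# Hypothetical pre-stored conversion relationship: a rotation matrix that maps
# vectors from the posture sensor's original coordinate system into the
# target coordinate system (values are placeholders for illustration).
R = np.array([[0.0, 1.0, 0.0],
              [-1.0, 0.0, 0.0],
              [0.0, 0.0, 1.0]])

def to_target_frame(omega_original):
    """Convert one angular-velocity sample (rad/s, shape (3,)) from the
    original coordinate system into the target coordinate system."""
    return R @ np.asarray(omega_original)

omega_target = to_target_frame([0.1, 0.5, -0.2])
# omega_target now holds the X-, Y- and Z-axis angular velocity data
# in the target coordinate system.
```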
  • the Z axis of the target coordinate system points in the direction opposite to the vertical direction in which the acceleration of gravity acts.
  • the two-dimensional posture data in the target coordinate system includes: horizontal posture data, including the horizontal angle data and horizontal angular velocity data for movement in the horizontal plane perpendicular to the Z axis; and vertical posture data, including the vertical angle data and vertical angular velocity data for movement in any vertical plane perpendicular to the horizontal plane.
  • the converting of the three-dimensional motion data in the target coordinate system into the two-dimensional posture data includes: combining the angular velocity data on the X axis and the angular velocity data on the Y axis into the vertical angular velocity data by the vector law; time-integrating the vertical angular velocity data, based on the times corresponding to the start position and end position of the user's movement, to obtain the vertical angle data; taking the angular velocity data on the Z axis as the horizontal angular velocity data; and time-integrating the horizontal angular velocity data, based on the times corresponding to the start position and end position of the user's movement, to obtain the horizontal angle data.
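The decomposition just described can be sketched as follows. Two details are hedged assumptions, since the text does not fix them: "the vector law" is read as the Euclidean combination of the X- and Y-axis components, and the time integration is approximated with an explicit trapezoidal rule.

```python
import numpy as np

def to_two_dimensional(omega_xyz, t):
    """Reduce target-frame 3D angular velocity to 2D posture data.

    omega_xyz: (N, 3) angular velocities (rad/s) on the target X/Y/Z axes,
               sampled from the start position to the end position.
    t:         (N,) sample times in seconds.
    """
    omega_xyz = np.asarray(omega_xyz, dtype=float)
    t = np.asarray(t, dtype=float)
    # Combine the X- and Y-axis components into the vertical angular
    # velocity data (Euclidean norm assumed for "the vector law").
    vertical_rate = np.hypot(omega_xyz[:, 0], omega_xyz[:, 1])
    # The Z-axis angular velocity data serves directly as the horizontal
    # angular velocity data.
    horizontal_rate = omega_xyz[:, 2]
    # Time-integrate between the start and end of the movement
    # (trapezoidal rule) to obtain the vertical and horizontal angle data.
    dt = np.diff(t)
    vertical_angle = np.sum(0.5 * (vertical_rate[1:] + vertical_rate[:-1]) * dt)
    horizontal_angle = np.sum(0.5 * (horizontal_rate[1:] + horizontal_rate[:-1]) * dt)
    return horizontal_angle, horizontal_rate, vertical_angle, vertical_rate
```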
  • the motion data calibration method further includes: determining the relative movement between the at least one measurement position based on the two-dimensional posture data corresponding to each posture signal.
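One plausible reading of this step (an assumption; the summary does not spell it out) is that, once every measurement position is expressed in the shared target coordinate system, relative movement reduces to comparing their angles:

```python
def relative_rotation(angle_a, angle_b):
    """Relative rotation between two measurement positions, given their
    angles (radians) in the shared target coordinate system."""
    return angle_a - angle_b

# e.g., forearm vs. torso horizontal angles from the previous step:
print(relative_rotation(1.20, 0.35))  # -> 0.85 rad of arm-relative-to-torso motion
```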
  • this specification also provides a motion data calibration system, including at least one storage medium and at least one processor; the at least one storage medium stores at least one instruction set for motion data calibration, and the at least one processor is communicatively connected with the at least one storage medium, wherein when the motion data calibration system is running, the at least one processor reads the at least one instruction set and implements the motion data calibration method described in the first aspect of this specification.
  • the motion data calibration method and system provided by the present application can convert the user's motion data during exercise from three-dimensional posture data on three mutually perpendicular coordinate axes into two-dimensional posture data in the target coordinate system, namely posture data in the horizontal plane and posture data in the vertical plane, so that the user's movement can be divided into horizontal movement and vertical movement, thereby avoiding data discrepancies caused by different user orientations.
  • in this way, the method and system can remove the influence of the user's orientation on the motion data, and can therefore calibrate the motion data without requiring the user to perform a calibration action.
  • FIG. 1 is a schematic diagram of an application scenario of a motion monitoring system according to some embodiments of the present application
  • FIG. 2 is a schematic diagram of exemplary hardware and/or software of a wearable device according to some embodiments of the present application
  • FIG. 3 is a schematic diagram of exemplary hardware and/or software of a computing device according to some embodiments of the present application.
  • Fig. 4 is an exemplary structural diagram of a wearable device according to some embodiments of the present application.
  • Fig. 5 is an exemplary flowchart of a motion monitoring method according to some embodiments of the present application.
  • Fig. 6 is an exemplary flow chart of a motion data calibration method according to some embodiments of the present application.
  • Fig. 7 is a schematic diagram of a target coordinate system according to some embodiments of the present application.
  • Fig. 8 is an exemplary flow chart of converting into two-dimensional pose data according to some embodiments of the present application.
  • Fig. 9 is a coordinate system diagram when a user moves according to some embodiments of the present application.
  • terms such as "system", "device", "unit" and/or "module" used in this application are means for distinguishing different components, elements, parts or assemblies at different levels.
  • however, these words may be replaced by other expressions if those expressions achieve the same purpose.
  • the flow chart is used in this application to illustrate the operations performed by the system according to the embodiment of this application. It should be understood that the preceding or following operations are not necessarily performed in the exact order. Instead, various steps may be processed in reverse order or simultaneously. At the same time, other operations can be added to these procedures, or a certain step or steps can be removed from these procedures.
  • the present application provides a motion monitoring system, which can acquire action signals of a user during exercise, wherein the action signals include at least myoelectric signals, posture signals, electrocardiographic signals, respiratory frequency signals, and the like.
  • the system can monitor the user's movement based at least on the feature information corresponding to the myoelectric signal or the feature information corresponding to the posture signal, for example, the user's action type, number of actions, action quality, action time, or physiological parameter information when the user performs the action.
  • the exercise monitoring system can also generate feedback on the user's exercise based on the analysis results of the user's exercise, so as to guide the user's exercise.
  • the exercise monitoring system can send prompt information (for example, voice prompt, vibration prompt, electric stimulation, etc.) to the user.
  • the motion monitoring system can be applied to wearable devices (for example, clothing, wristbands, helmets), medical testing equipment (for example, electromyography tester), fitness equipment, etc.
  • the motion monitoring system can accurately monitor and give feedback on the user's actions without the participation of professionals, which can improve the user's fitness efficiency while reducing the cost of fitness.
  • Fig. 1 is a schematic diagram of an application scenario of a motion monitoring system according to some embodiments of the present application.
  • the motion monitoring system 100 may include a processing device 110 , a network 120 , a wearable device 130 and a mobile terminal device 140 .
  • the athletic monitoring system 100 may also include an athletic data calibration system 180 .
  • the motion monitoring system 100 can acquire motion signals (such as myoelectric signals, posture signals, electrocardiogram signals, respiratory rate signals, etc.) used to characterize the user's motions, and monitor and give feedback on the user's motions according to those motion signals.
  • the exercise monitoring system 100 can monitor and give feedback on the actions of the user when exercising.
  • the wearable device 130 may acquire the user's motion signal.
  • the processing device 110 or the mobile terminal device 140 may receive and analyze the user's motion signal to determine whether the user's fitness motion is standardized, so as to monitor the user's motion.
  • monitoring the user's action may include determining the action type, action quantity, action quality, action time, or physiological parameter information when the user performs the action.
  • the exercise monitoring system 100 can generate feedback on the user's exercise action according to the analysis result of the user's exercise action, so as to guide the user's exercise.
  • the exercise monitoring system 100 may monitor and give feedback on the actions of the user while running. For example, when the user wears the wearable device 130 for running, the exercise monitoring system 100 can monitor whether the user's running action is normal, whether the running time meets health standards, and the like. When the user runs for too long or the running action is incorrect, the fitness device can feed back the user's exercise status to remind the user that the running action or running time needs to be adjusted.
  • the processing device 110 may be configured to process information and/or data related to user motion.
  • the processing device 110 may receive the user's action signals (for example, myoelectric signals, posture signals, electrocardiogram signals, respiratory frequency signals, etc.), and further extract the feature information corresponding to the action signals (for example, the feature information corresponding to the myoelectric signal in the action signals, or the feature information corresponding to the posture signal).
  • the processing device 110 may perform specific signal processing on the EMG signal or posture signal collected by the wearable device 130, such as signal segmentation, signal preprocessing (eg, signal correction processing, filtering processing, etc.), and the like.
  • the processing device 110 may also determine whether the user's action is correct based on the user's action signal. For example, the processing device 110 may determine whether the user's action is correct based on the feature information corresponding to the myoelectric signal (e.g., amplitude information, frequency information, etc.). For another example, the processing device 110 may determine whether the user's action is correct based on the feature information corresponding to the posture signal (e.g., angular velocity, angular velocity direction, angular acceleration, angle, displacement information, stress, etc.). For another example, the processing device 110 may determine whether the user's action is correct based on both the feature information corresponding to the myoelectric signal and the feature information corresponding to the posture signal.
  • the processing device 110 may also determine whether the physiological parameter information of the user during exercise meets the health standard. In some embodiments, the processing device 110 may also issue corresponding instructions to feed back the user's exercise situation. For example, when the user is running, the exercise monitoring system 100 monitors that the user's running time is too long. At this time, the processing device 110 may send an instruction to the mobile terminal device 140 to prompt the user to adjust the running time.
  • the feature information corresponding to the posture signal is not limited to the above-mentioned angular velocity, angular velocity direction, angular acceleration, angle, displacement information, stress, etc.; any parameter information that can be used to reflect the relative movement of the user's body can serve as feature information corresponding to the posture signal.
  • for example, when the posture sensor is a strain sensor, the bending angle and bending direction at the user's joints can be obtained by measuring how the resistance in the strain sensor changes with its stretched length.
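The text gives no resistance-to-angle model, so the sketch below assumes a simple linear calibration between the strain sensor's resistance and the joint's bending angle; both constants are hypothetical.

```python
def bend_angle_deg(resistance_ohm, r_straight_ohm=350.0, ohm_per_degree=0.8):
    """Estimate the joint bending angle from strain-sensor resistance,
    assuming a linear calibration (both constants are illustrative)."""
    return (resistance_ohm - r_straight_ohm) / ohm_per_degree

print(bend_angle_deg(386.0))  # -> 45.0 degrees of flexion
```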
  • processing device 110 may be local or remote.
  • the processing device 110 may access information and/or data stored in the wearable device 130 and/or the mobile terminal device 140 through the network 120 .
  • the processing device 110 may be directly connected to the wearable device 130 and/or the mobile terminal device 140 to access information and/or materials stored therein.
  • the processing device 110 may be located in the wearable device 130 and implement information interaction with the mobile terminal device 140 through the network 120 .
  • the processing device 110 may be located in the mobile terminal device 140, and realize information interaction with the wearable device 130 through a network.
  • the processing device 110 may execute on a cloud platform.
  • processing device 110 may process data and/or information related to motion monitoring to perform one or more of the functions described herein.
  • the processing device 110 may acquire motion signals collected by the wearable device 130 during the user's exercise.
  • the processing device 110 may send control instructions to the wearable device 130 or the mobile terminal device 140 .
  • the control instruction can control the switch state of the wearable device 130 and its sensors, and can also control the mobile terminal device 140 to send out prompt information.
  • the processing device 110 may include one or more sub-processing devices (e.g., a single-core processing device or a multi-core processing device).
  • Network 120 may facilitate the exchange of data and/or information within athletic monitoring system 100 .
  • one or more components of the athletic monitoring system 100 may send data and/or information to other components of the athletic monitoring system 100 over the network 120 .
  • the motion signal collected by the wearable device 130 may be transmitted to the processing device 110 through the network 120 .
  • the confirmation result of the action signal in the processing device 110 may be transmitted to the mobile terminal device 140 through the network 120 .
  • network 120 may be any type of wired or wireless network.
  • the wearable device 130 refers to clothing or equipment with a wearable function.
  • the wearable device 130 may include, but not limited to, an upper garment device 130-1, a trouser device 130-2, a wrist device 130-3, a shoe 130-4, and the like.
  • wearable device 130 may include multiple sensors.
  • the sensor can acquire various action signals (eg, myoelectric signals, posture signals, temperature information, heartbeat frequency, electrocardiographic signals, etc.) of the user during exercise.
  • the sensors may include, but are not limited to, one or more of a myoelectric sensor, a posture sensor, a temperature sensor, a humidity sensor, an ECG sensor, a blood oxygen saturation sensor, a Hall sensor, an electrodermal sensor, a rotation sensor, and the like.
  • for example, a myoelectric sensor can be set at muscle positions of the human body (for example, biceps, triceps, latissimus dorsi, trapezius, etc.) in the upper garment device 130-1, so that it can collect the user's myoelectric signals at those muscle positions during exercise.
  • an electrocardiographic sensor may be installed near the left pectoral muscle of the human body in the upper garment device 130-1, and the electrocardiographic sensor may collect the user's electrocardiographic signal.
  • a posture sensor can be set at the muscle position of the human body (eg, gluteus maximus, vastus lateralis, vastus medialis, gastrocnemius, etc.) in the trousers device 130-2, and the posture sensor can collect user's posture signals.
  • the wearable device 130 can also provide feedback on the user's actions. For example, when the movement of a certain part of the user's body does not meet the standard, the myoelectric sensor corresponding to that part can generate a stimulation signal (for example, a current stimulation or a striking signal) to remind the user.
  • the wearable device 130 is not limited to the upper garment device 130-1, trousers device 130-2, wrist support device 130-3 and shoe device 130-4 shown in FIG. 1; it may also include other devices for motion monitoring, such as helmet devices, knee pads, etc., which are not limited here, and any device that can use the motion monitoring method contained in this application is within the scope of protection of this application.
  • the mobile terminal device 140 can acquire information or data in the exercise monitoring system 100 .
  • the mobile terminal device 140 may receive the exercise data processed by the processing device 110, and feed back exercise records and the like based on the processed exercise data.
  • Exemplary feedback methods may include, but are not limited to, voice prompts, image prompts, video presentations, text prompts, and the like.
  • the user can obtain the action record during his exercise through the mobile terminal device 140 .
  • the mobile terminal device 140 can be connected with the wearable device 130 through the network 120 (for example, a wired or wireless connection); the user can obtain the action records of the user's exercise through the mobile terminal device 140, and the action records can be transmitted through the mobile terminal device 140 to the processing device 110.
  • the mobile terminal device 140 may include one of a mobile device 140-1, a tablet computer 140-2, a notebook computer 140-3, etc., or any combination thereof.
  • the mobile device 140-1 may include a mobile phone, a smart home device, a smart mobile device, a virtual reality device, an augmented reality device, etc., or any combination thereof.
  • the smart home device may include a control device for smart appliances, a smart monitoring device, a smart TV, a smart camera, etc., or any combination thereof.
  • a smart mobile device may include a smart phone, a personal digital assistant (PDA), a gaming device, a navigation device, a POS device, etc., or any combination thereof.
  • the virtual reality device and/or the augmented reality device may include a virtual reality helmet, virtual reality glasses, virtual reality goggles, augmented reality helmets, augmented reality glasses, augmented reality goggles, etc., or any combination thereof.
  • the athletic monitoring system 100 may also include an athletic data calibration system 180 .
  • the motion data calibration system 180 can be used to process motion data related to the user's motion, and can implement the motion data calibration method described in this specification.
  • the motion data calibration system 180 can receive the user's motion data during exercise and transform the motion data from three-dimensional posture data on three mutually perpendicular coordinate axes into two-dimensional posture data in the target coordinate system, namely posture data in the horizontal plane and posture data in the vertical plane, so that the user's movement can be divided into horizontal movement and vertical movement, avoiding data discrepancies caused by different user orientations.
  • the exercise data calibration system 180 can remove the influence of the user's orientation on the exercise data, and the exercise data can be calibrated without the need for the user to perform calibration actions.
  • the motion data may be the three-dimensional posture data of the measured positions on the user's body during motion. The motion data and its calibration method will be described in detail later.
  • the motion data calibration system 180 may be integrated on the processing device 110. In some embodiments, the motion data calibration system 180 can also be integrated on the mobile terminal device 140. In some embodiments, the motion data calibration system 180 may also exist independently of the processing device 110 and the mobile terminal device 140.
  • the motion data calibration system 180 may be communicatively connected with the processing device 110, the wearable device 130 and the mobile terminal device 140 for information and/or data transmission and exchange. In some embodiments, the exercise data calibration system 180 can access information and/or data stored in the processing device 110 , the wearable device 130 and/or the mobile terminal device 140 through the network 120 .
  • the wearable device 130 can be directly connected to the processing device 110 and/or the mobile terminal device 140 to access information and/or data stored therein.
  • the motion data calibration system 180 may be located in the processing device 110 and realize information exchange with the wearable device 130 and the mobile terminal device 140 through the network 120 .
  • the sports data calibration system 180 may be located in the mobile terminal device 140, and realize information exchange with the processing device 110 and the wearable device 130 through the network.
  • the exercise data calibration system 180 can be executed on a cloud platform, and realize information interaction with the processing device 110 , the wearable device 130 and the mobile terminal device 140 through the network.
  • the motion monitoring system 100 may also include a database.
  • the database may store data (eg, initially set threshold conditions, etc.) and/or instructions (eg, feedback instructions).
  • the database can store data acquired from the wearable device 130 and/or the mobile terminal device 140 .
  • the database may store information and/or instructions for execution or use by the processing device 110 to perform the example methods described herein.
  • the database may be connected to the network 120 to communicate with one or more components of the exercise monitoring system 100 (eg, the processing device 110 , the wearable device 130 , the mobile terminal device 140 , etc.).
  • One or more components of the athletic monitoring system 100 may access data or instructions stored in the database via the network 120 .
  • the database may be directly connected or communicated with one or more components in the athletic monitoring system 100 .
  • the database may be part of the processing device 110 .
  • Fig. 2 is a schematic diagram of exemplary hardware and/or software of the wearable device 130 according to some embodiments of the present application.
  • the wearable device 130 may include an acquisition module 210, a processing module 220 (also called a processor), a control module 230 (also called a main control, MCU, controller), a communication module 240, a power supply module 250 and input/output module 260 .
  • the acquiring module 210 can be used to acquire motion signals of the user when exercising.
  • the acquisition module 210 may include a sensor unit, and the sensor unit may be used to acquire one or more types of action signals when the user is exercising.
  • the sensor unit may include, but is not limited to, one or more of a myoelectric sensor, a posture sensor, an electrocardiogram sensor, a respiration sensor, a temperature sensor, a humidity sensor, an inertial sensor, a blood oxygen saturation sensor, a Hall sensor, an electrodermal sensor, a rotation sensor, and the like.
  • the action signal may include one or more of myoelectric signal, posture signal, electrocardiogram signal, respiratory rate, temperature signal, humidity signal, etc.
  • the sensor unit can be placed in different positions of the wearable device 130 according to the type of motion signal to be acquired.
  • a myoelectric sensor (also referred to as an electrode element) may be disposed at a muscle position of a human body, and the myoelectric sensor may be configured to collect myoelectric signals when the user moves.
  • the EMG signal and its corresponding feature information (for example, frequency information, amplitude information, etc.) can reflect the state of the user's muscles when exercising.
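As one way to obtain such feature information (an illustration, not the patent's method), the amplitude and frequency content of one EMG window can be summarized by its RMS value and mean power frequency:

```python
import numpy as np

def emg_features(emg, fs):
    """RMS amplitude and mean power frequency of one EMG window.

    emg: 1-D array of EMG samples; fs: sampling rate in Hz.
    """
    emg = np.asarray(emg, dtype=float)
    rms_amplitude = np.sqrt(np.mean(emg ** 2))
    power = np.abs(np.fft.rfft(emg)) ** 2
    freqs = np.fft.rfftfreq(emg.size, d=1.0 / fs)
    mean_frequency = np.sum(freqs * power) / np.sum(power)
    return rms_amplitude, mean_frequency
```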
  • the posture sensor can be set at different positions of the human body (for example, positions corresponding to the torso, limbs, and joints in the wearable device 130), and the posture sensor can be configured to collect posture signals when the user moves.
  • the posture signal and its corresponding feature information can reflect the posture of the user's movement.
  • the electrocardiogram sensor can be arranged around the chest of the human body, and the electrocardiogram sensor can be configured to collect electrocardiogram data when the user is exercising.
  • the respiration sensor may be arranged around the chest of the human body, and the respiration sensor may be configured to collect respiration data (eg, respiration frequency, respiration amplitude, etc.) of the user during exercise.
  • the temperature sensor may be configured to collect temperature data (eg, body surface temperature) of the user while exercising.
  • the humidity sensor may be configured to collect humidity data of the external environment when the user is exercising.
  • the processing module 220 may process data from the acquisition module 210 , the control module 230 , the communication module 240 , the power supply module 250 and/or the input/output module 260 .
  • the processing module 220 may process the action signal from the acquisition module 210 during the user's exercise.
  • the processing module 220 may preprocess the action signals (eg, myoelectric signals, posture signals) acquired by the acquisition module 210 .
  • the processing module 220 performs segmentation processing on the myoelectric signal or gesture signal when the user is exercising.
  • the processing module 220 may perform pre-processing (for example, filter processing, signal correction processing) on the electromyographic signal when the user is exercising, so as to improve the quality of the electromyographic signal.
  • processing module 220 may determine feature information corresponding to the gesture signal based on the gesture signal when the user is exercising.
  • processing module 220 may process instructions or operations from input/output module 260 .
  • the processed data can be stored in memory or hard disk.
  • the processing module 220 can transmit the processed data to one or more components in the motion monitoring system 100 through the communication module 240 or the network 120 .
  • the processing module 220 may send the monitoring result of the user's movement to the control module 230, and the control module 230 may execute subsequent operations or instructions according to the action determination result.
  • the control module 230 can be connected with other modules in the wearable device 130 .
  • the control module 230 can control the running status of other modules in the wearable device 130 .
  • the control module 230 can control the power supply state (eg, normal mode, power saving mode), power supply time, etc. of the power supply module 250 .
  • the control module 230 may control the input/output module 260 according to the determination result of the user's motion, and then may control the mobile terminal device 140 to send the feedback result of its motion to the user.
  • the control module 230 can control the input/output module 260, and thereby the mobile terminal device 140, to give feedback to the user, so that the user can know their exercise state in real time and adjust their actions accordingly.
  • the control module 230 may also control one or more sensors or other modules in the acquisition module 210 to provide feedback to the human body. For example, when a certain muscle exerts too much force during the user's exercise, the control module 230 may control the electrode module at the position of the muscle to provide electrical stimulation to the user to prompt the user to adjust the action in time.
  • the communication module 240 may be used for information or data exchange. In some embodiments, the communication module 240 can be used for communication between internal components of the wearable device 130 . For example, the acquisition module 210 may send user action signals (eg, myoelectric signals, gesture signals, etc.) to the communication module 240 , and the communication module 240 may send the action signals to the processing module 220 . In some embodiments, the communication module 240 can also be used for communication between the wearable device 130 and other components in the exercise monitoring system 100 . For example, the communication module 240 may send status information (for example, switch status) of the wearable device 130 to the processing device 110, and the processing device 110 may monitor the wearable device 130 based on the status information.
  • the communication module 240 can adopt wired, wireless, and wired/wireless mixed technologies.
  • the power supply module 250 may provide power to other components in the motion monitoring system 100 .
  • the input/output module 260 can acquire, transmit and send signals.
  • the input/output module 260 may connect or communicate with other components in the motion monitoring system 100 .
  • Other components in the motion monitoring system 100 may be connected or communicated through the input/output module 260 .
  • the above description of the motion monitoring system 100 and its modules is only for convenience of description, and does not limit one or more embodiments of the present application to the scope of the illustrated embodiments. It can be understood that, for those skilled in the art, after understanding the principle of the system, it is possible to arbitrarily combine the various modules, form a subsystem to connect with other modules, or omit one or more of the modules.
  • the acquisition module 210 and the processing module 220 may be one module, and this module may have the function of acquiring and processing user action signals.
  • the processing module 220 may also be integrated in the processing device 110 instead of being disposed in the wearable device 130 . Such modifications are within the protection scope of one or more embodiments of the present application.
  • FIG. 3 is a schematic diagram of exemplary hardware and/or software of a computing device 300 according to some embodiments of the present application.
  • the processing device 110 and/or the mobile terminal device 140 may be implemented on the computing device 300 .
  • the motion data calibration system 180 may be implemented on the computing device 300.
  • computing device 300 may include internal communication bus 310 , at least one processor 320 , at least one storage medium, communication port 350 , input/output interface 360 , and user interface 380 .
  • Internal communication bus 310 enables data communication between components in computing device 300 .
  • at least one processor 320 may send data to at least one storage medium or other hardware such as input/output port 360 through the internal communication bus 310 .
  • the internal communication bus 310 may be an Industry Standard Architecture (ISA) bus, an Extended Industry Standard Architecture (EISA) bus, a Video Electronics Standards Association (VESA) bus, a Peripheral Component Interconnect (PCI) bus, or the like.
  • the internal communication bus 310 can be used to connect the various modules in the motion monitoring system 100 shown in FIG. 1.
  • At least one storage medium of computing device 300 may include data storage.
  • the data storage device may be a non-transitory storage medium or a temporary storage medium.
  • the data storage device may include one or more of a read only memory (ROM) 330, a random access memory (RAM) 340, a hard disk 370, and the like.
  • Exemplary ROM may include mask ROM (MROM), programmable ROM (PROM), erasable programmable ROM (EPROM), electrically erasable programmable ROM (EEPROM), compact disc ROM (CD-ROM), digital versatile disc ROM (DVD-ROM), and the like.
  • Exemplary RAM may include dynamic RAM (DRAM), double data rate synchronous dynamic RAM (DDR SDRAM), static RAM (SRAM), thyristor RAM (T-RAM), zero-capacitor RAM (Z-RAM), and the like.
  • the storage medium may store data/information obtained from any other component of the motion monitoring system 100 .
  • the storage medium also includes at least one instruction set stored in the data storage device.
  • the instructions are computer program codes, and the computer program codes may include programs, routines, objects, components, data structures, procedures, modules, etc. for executing the motion data calibration method provided in this specification.
  • the storage medium of the computing device 300 may be located in the wearable device 130 or in the processing device 110 .
  • At least one processor 320 may be communicatively coupled to the at least one storage medium and configured to execute the above at least one instruction set. When the computing device 300 is running, the at least one processor 320 can read the at least one instruction set and execute computing instructions (program codes) according to the instruction set, thereby implementing the functions of the motion monitoring system 100 described in this application. The processor 320 can execute all the steps included in the motion data calibration method.
  • the computing instructions may include programs, objects, components, data structures, procedures, modules, and functions (the functions refer to specific functions described in this application).
  • for example, the processor 320 can process the action signals (for example, myoelectric signals, posture signals) obtained from the wearable device 130 and/or the mobile terminal device 140 of the motion monitoring system 100 when the user is exercising, and monitor the user's movement based on those action signals.
  • for another example, the processor 320 can be used to process the motion data obtained from the wearable device 130 and/or the mobile terminal device 140 during the user's exercise and, according to the at least one instruction set, execute the motion data calibration method described in this specification to convert the motion data into two-dimensional posture data.
  • the processor 320 may include a microcontroller, a microprocessor, a reduced instruction set computer (RISC), an application-specific integrated circuit (ASIC), an application-specific instruction-set processor (ASIP), a central processing unit (CPU), a graphics processing unit (GPU), a physics processing unit (PPU), a microcontroller unit (MCU), a digital signal processor (DSP), a field programmable gate array (FPGA), an advanced RISC machine (ARM), a programmable logic device (PLD), any circuit or processor capable of performing one or more functions, or the like, or any combination thereof.
  • for ease of illustration, the computing device 300 in FIG. 3 only depicts one processor 320; the computing device 300 in the present application may also include multiple processors.
  • Hard disk 370 may be used to store information and data generated by or received from processing device 110 .
  • the hard disk 370 may store user identification information of the user.
  • the hard disk 370 may be disposed in the processing device 110 or in the wearable device 130 .
  • User interface 380 may enable interaction and information exchange between computing device 300 and a user.
  • the user interface 380 may be used to present the athletic records generated by the athletic monitoring system 100 to the user.
  • user interface 380 may include a physical display, such as a display with speakers, LCD display, LED display, OLED display, electronic ink display (E-Ink), and the like.
  • the input/output interface 360 may be used to input or output signals, data or information. In some embodiments, input/output interface 360 may enable a user to interact with athletic monitoring system 100 .
  • Fig. 4 is an exemplary structural diagram of a wearable device according to some embodiments of the present application.
  • an upper garment is used as an example.
  • wearable device 400 may include upper garment 410 .
  • the upper garment 410 may include an upper garment base 4110, at least one upper garment processing module 4120, at least one upper garment feedback module 4130, at least one upper garment acquisition module 4140, and the like.
  • the upper clothing base 4110 may refer to clothing worn on the upper body of a human body.
  • the upper garment substrate 4110 may include a short-sleeved T-shirt, a long-sleeved T-shirt, a shirt, a coat, or the like.
  • At least one upper garment processing module 4120 and at least one upper garment acquisition module 4140 may be located on the upper garment base 4110 in areas that fit different parts of the human body.
  • At least one upper garment feedback module 4130 can be located at any position of the upper garment garment base 4110, and the at least one upper garment feedback module 4130 can be configured to feed back information about the user's upper body movement state.
  • Exemplary feedback methods may include, but are not limited to, voice prompts, text prompts, pressure prompts, electrical stimulation, and the like.
  • At least one upper garment acquisition module 4140 may include, but not limited to, one of a posture sensor, an electrocardiogram sensor, an electromyography sensor, a temperature sensor, a humidity sensor, an inertial sensor, an acid-base sensor, an acoustic wave transducer, etc. or more.
  • the sensors in the upper garment acquisition module 4140 can be placed on different positions of the user's body according to different signals to be measured. For example, when the posture sensor is used to acquire the posture signal during the user's movement, the posture sensor can be placed in the upper clothing base 4110 at positions corresponding to the torso, arms, and joints of the human body.
  • the myoelectric sensor when used to acquire the myoelectric signal during the user's exercise, the myoelectric sensor may be located near the user's muscle to be measured.
  • the attitude sensor may include, but not limited to, an acceleration three-axis sensor, an angular velocity three-axis sensor, a magnetic force sensor, etc., or any combination thereof.
  • an attitude sensor may include an acceleration three-axis sensor and an angular velocity three-axis sensor.
  • the attitude sensor may also include a strain gauge sensor.
  • a strain sensor may refer to a sensor whose measurement is based on the strain generated by the deformation of the object under test.
  • the strain gauge sensor may include, but not limited to, one or more of strain gauge force sensors, strain gauge pressure sensors, strain gauge torque sensors, strain gauge displacement sensors, and strain gauge acceleration sensors.
  • the strain gauge sensor can be set at the joint position of the user, and by measuring the magnitude of the resistance in the strain gauge sensor that changes with the stretching length, the bending angle and bending direction at the joint of the user can be obtained.
  • the upper garment 410 may also include other modules, such as a power supply module, a communication module, an input/output module, etc.
  • the upper garment processing module 4120 is similar to the processing module 220 in FIG. 2, and the upper garment acquisition module 4140 is similar to the acquisition module 210 in FIG. 2; for details, refer to the relevant descriptions above, which will not be repeated here.
  • Fig. 5 is an exemplary flowchart of a motion monitoring method according to some embodiments of the present application. As shown in Figure 5, the process 500 may include:
  • in step 510, an action signal of the user during exercise is acquired.
  • this step 510 may be performed by the acquisition module 210 .
  • the motion signal refers to the human body parameter information when the user is exercising.
  • the human body parameter information may include, but not limited to, one or more of myoelectric signals, posture signals, electrocardiographic signals, temperature signals, humidity signals, blood oxygen concentration, respiratory rate, and the like.
  • the myoelectric sensor in the acquisition module 210 can collect the user's myoelectric signals during exercise. For example, when the user performs a seated chest fly, the myoelectric sensors in the wearable device corresponding to the positions of the pectoral muscles and latissimus dorsi can collect the myoelectric signals at those muscle positions.
  • the myoelectric sensor corresponding to the position of the human gluteus maximus and quadriceps in the wearable device can collect the myoelectric signal of the corresponding muscle position of the user.
  • the myoelectric sensor corresponding to the human gastrocnemius muscle and other positions in the wearable device can collect the myoelectric signals of the human gastrocnemius muscle and other positions.
  • the posture sensor in the acquisition module 210 can collect posture signals of the user during motion.
  • the posture sensor corresponding to the position of the triceps of the human body in the wearable device can collect the posture signals of the position of the user's triceps.
  • the posture sensor installed at the deltoid muscle of the human body can collect posture signals of the deltoid muscle of the user.
  • there can be multiple posture sensors in the acquisition module 210; the multiple posture sensors can obtain posture signals of multiple body parts when the user is moving, and the posture signals of those parts can reflect the relative movement between different parts of the human body.
  • a pose signal at the arm and a pose signal at the torso may reflect the motion of the arm relative to the torso.
  • the posture signal is associated with the type of posture sensor. For example, when the posture sensor is a three-axis angular velocity sensor, the acquired posture signal is angular velocity information; when the posture sensor is a three-axis angular velocity sensor combined with a three-axis acceleration sensor, the acquired posture signal is angular velocity information and acceleration information.
  • for another example, when the posture sensor is a strain sensor, the strain sensor can be set at a joint position of the user; by measuring the resistance in the strain sensor as it changes with the stretched length, the acquired posture signal can be displacement information, stress, etc., which can characterize the bending angle and bending direction at the user's joints.
  • any parameter information that can be used to reflect the relative movement of the user's body can serve as the feature information corresponding to the posture signal, and different types of posture sensors can be used to obtain it depending on the type of feature information required.
  • the action signal may include a myoelectric signal of a specific part of the user's body and a gesture signal of the specific part.
  • Myoelectric signals and posture signals can reflect the movement state of specific parts of the user's body from different angles. To put it simply, the posture signal of a specific part of the user's body can reflect the type of movement, range of movement, frequency of movement, etc. of the specific part.
  • the EMG signal can reflect the muscle state of the specific part during exercise. In some embodiments, by using the electromyography signal and/or posture signal of the same body part, it is possible to better evaluate whether the movement of the part is normal.
  • in step 520, the user's movement is monitored based at least on the feature information corresponding to the myoelectric signal or the feature information corresponding to the posture signal.
  • the feature information corresponding to the EMG signal may include but not limited to one or more of frequency information, amplitude information, and the like.
  • the feature information corresponding to the posture signal refers to parameter information used to characterize the relative movement of the user's body.
  • the feature information corresponding to the attitude signal may include, but not limited to, one or more of angular velocity direction, angular velocity value, angular velocity acceleration value, and the like.
  • the feature information corresponding to the attitude signal may also include angle, displacement information (such as stretching length in a strain gauge sensor), stress, and the like.
  • the strain sensor when the attitude sensor is a strain sensor, the strain sensor can be set at the joint position of the user. By measuring the resistance of the strain sensor that changes with the stretching length, the acquired attitude signal can be displacement information, stress, etc. , the bending angle and bending direction at the user's joints can be characterized by these posture signals.
  • the processing module 220 and/or the processing device 110 can extract the feature information corresponding to the EMG signal (for example, frequency information, amplitude information) or the feature information corresponding to the attitude signal (for example, angular velocity direction, angular velocity value, angular acceleration value, angle, displacement information, stress, etc.), and monitor the user's movement based on that feature information.
  • monitoring the motion of the user includes monitoring information related to the user's motion.
  • the action-related information may include one or more of the user's action type, action count, action quality (e.g., whether the user's action meets a standard), action time, and the like.
  • the action type refers to the fitness action taken by the user when exercising.
  • the action type may include, but is not limited to, one or more of the seated chest fly, squat, deadlift, plank, running, swimming, and the like.
  • the action count refers to the number of times the user performs an action during exercise. For example, if the user performed 10 seated chest flies during a workout, 10 is the action count (an illustrative counting sketch follows this group of definitions).
  • action quality refers to how closely the fitness action performed by the user matches the standard version of that action.
  • for example, when the user performs a squat, the processing device 110 can determine the action type based on the feature information corresponding to the action signals (EMG signals and posture signals) of specific muscle positions (gluteus maximus, quadriceps, etc.), and judge the quality of the user's squat against the action signal of a standard squat.
  • the action time refers to the time corresponding to one or more action types of the user or the total time of the exercise process.
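Where the bullets above define action count and action quality, the sketch below shows one plausible way to count repetitions from an EMG amplitude envelope. It is only a minimal illustration: the function name, the 200 ms moving-average window, the relative threshold, and the minimum gap between repetitions are all assumptions, not details taken from this application.

```python
import numpy as np

def count_repetitions(emg_amplitude, fs, threshold=0.5, min_gap_s=1.0):
    """Count action repetitions as threshold crossings of a smoothed
    EMG amplitude envelope (an illustrative assumption, not the method
    specified by this application)."""
    # Smooth with a 200 ms moving average to obtain an envelope.
    win = max(1, int(0.2 * fs))
    kernel = np.ones(win) / win
    envelope = np.convolve(np.abs(emg_amplitude), kernel, mode="same")

    # Detect rising crossings of a relative threshold, enforcing a
    # minimum gap so one action is not counted twice.
    above = envelope > threshold * envelope.max()
    rising = np.flatnonzero(~above[:-1] & above[1:])
    min_gap = int(min_gap_s * fs)
    reps, last = 0, -min_gap
    for idx in rising:
        if idx - last >= min_gap:
            reps += 1
            last = idx
    return reps
```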
  • the feature information corresponding to the posture signal may include parameter information that can be used to reflect the relative movement of the user's body.
  • the exercise monitoring system 100 needs to obtain relative movements between different parts of the user's body.
  • the attitude signal may be attitude data measured by an attitude sensor.
  • the posture sensors can be distributed at many different parts of the user's body.
  • the motion monitoring system 100 may calibrate the posture signals between multiple different parts of the user's body.
  • Fig. 6 is an exemplary flowchart of a motion data calibration method 3000 according to some embodiments of the present application.
  • the motion data calibration system 180 can execute the motion data calibration method 3000 provided in this application.
  • the motion data calibration system 180 may execute the motion data calibration method 3000 on the processing device 110, and may also execute the motion data calibration method 3000 on the mobile terminal device 140.
  • the processor 320 in the processing device 110 can read the instruction set stored in its local storage medium, and then execute the motion data calibration method 3000 provided in this application as specified by the instruction set.
  • the method 3000 of motion data calibration may include:
  • the motion data refers to information about human body motion parameters when the user is exercising.
  • the action data may include at least one posture signal corresponding to at least one measurement position on the user's body.
  • the posture signal and its corresponding feature information (for example, angular velocity direction, angular velocity value, angular velocity acceleration value, angle, displacement information, stress, etc.) can reflect the posture of the user's motion.
  • the at least one measurement position corresponds one-to-one with the at least one attitude signal.
  • the measurement locations may be different parts on the user's body.
  • the at least one posture signal corresponds to the actual posture of the at least one measured location on the user's body when the user moves.
  • Each attitude signal of the at least one attitude signal may include three-dimensional attitude data of its corresponding measurement position in the original coordinate system.
  • the original coordinate system may be the coordinate system where the attitude signal is located.
  • even when the user's posture does not change, a change of the original coordinate system may cause a change in the posture signal.
  • each attitude signal may include one or more types of three-dimensional attitude data, for example three-dimensional angle data, three-dimensional angular velocity data, three-dimensional angular acceleration data, three-dimensional velocity data, three-dimensional displacement data, three-dimensional stress data, and so on.
  • the attitude signal can be acquired by an attitude sensor on the wearable device 130 .
  • the sensor unit of the acquisition module 210 in the wearable device 130 may include an attitude sensor.
  • the wearable device 130 may include at least one attitude sensor.
  • At least one posture sensor may be located at at least one measurement location on the user's body.
  • the attitude sensor can collect the attitude signal of the corresponding measurement position on the user's body.
  • the attitude sensors on the wearable device 130 may be distributed on the limbs of the human body (e.g., arms, legs), the torso (e.g., chest, abdomen, back, waist), the head, and so on.
  • the attitude sensors can thus acquire attitude signals from the limbs, the torso, and other parts of the human body.
  • the posture sensor can be placed at different positions of the wearable device 130 according to the posture signals to be acquired, so as to measure posture signals corresponding to different positions of the human body.
  • the attitude sensor may also be an attitude and heading reference system (AHRS) sensor with an attitude fusion algorithm.
  • the attitude fusion algorithm can fuse the data of a nine-axis inertial measurement unit (IMU), which includes a three-axis acceleration sensor, a three-axis angular velocity sensor, and a three-axis geomagnetic sensor, into Euler angles or quaternions, so as to obtain the attitude signal of the body part where the attitude sensor is located.
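As a concrete companion to the fusion step described above, the sketch below converts a unit quaternion, such as one produced by an AHRS-style fusion algorithm, into Euler angles. The (w, x, y, z) component order and the Z-Y-X rotation convention are assumptions for illustration; AHRS implementations vary and this application does not fix a convention.

```python
import math

def quaternion_to_euler(w, x, y, z):
    """Convert a unit quaternion (w, x, y, z) to Euler angles in radians,
    using the common aerospace Z-Y-X (yaw-pitch-roll) convention
    (an assumed convention for this sketch)."""
    # Roll: rotation about the x-axis.
    roll = math.atan2(2 * (w * x + y * z), 1 - 2 * (x * x + y * y))
    # Pitch: rotation about the y-axis (clamped to avoid domain errors).
    s = max(-1.0, min(1.0, 2 * (w * y - z * x)))
    pitch = math.asin(s)
    # Yaw: rotation about the z-axis.
    yaw = math.atan2(2 * (w * z + x * y), 1 - 2 * (y * y + z * z))
    return roll, pitch, yaw
```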
  • the processing module 220 and/or the processing device 110 may determine the feature information corresponding to the posture based on the posture signal.
  • the feature information corresponding to the attitude signal may include, but is not limited to, angular velocity value, angular velocity direction, angular acceleration value, and the like.
  • the posture sensor may be a strain sensor, and the strain sensor may obtain the bending direction and bending angle of the joints of the user, so as to obtain the posture signal of the user during motion.
  • the strain sensor can be set at the user's knee joint. When the user moves, the user's body parts act on the strain sensor, and the bending direction and bending angle at the user's knee joint can be calculated based on the resistance or length change of the strain sensor.
  • the attitude sensor may also include an optical fiber sensor, and the attitude signal may be represented by a direction change of light of the optical fiber sensor after being bent.
  • the attitude sensor can also be a magnetic flux sensor, and the attitude signal can be characterized by changes in magnetic flux. It should be noted that the type of the posture sensor is not limited to the sensors above; any sensor capable of acquiring user posture signals falls within the scope of the posture sensor of the present application.
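For the strain-sensor case described above, a minimal sketch of turning a resistance reading into a joint bending angle might look as follows. The linear resistance-to-angle model and both calibration constants are hypothetical; a real device would need a per-sensor calibration curve.

```python
def strain_bend_angle(resistance_ohm, r_flat_ohm, ohm_per_degree):
    """Estimate the bending angle at a joint from a strain sensor's
    resistance, assuming a linear resistance-to-angle calibration.
    Both calibration constants are assumed, not from this application."""
    delta = resistance_ohm - r_flat_ohm
    angle_deg = delta / ohm_per_degree
    # Under this model, the sign of the change indicates bending direction.
    direction = "flexion" if delta >= 0 else "extension"
    return angle_deg, direction
```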
  • each attitude signal may include one or more types of three-dimensional attitude data.
  • Attitude sensors may also include multiple types of sensors.
  • the attitude sensor may include at least one of an acceleration sensor, an angular velocity sensor, and a magnetic force sensor.
  • the original coordinate system may be the coordinate system where the attitude sensor is located.
  • the original coordinate system refers to the coordinate system corresponding to the posture sensor disposed on the human body.
  • the posture sensors on the wearable device 130 are distributed at different parts of the human body, so their installation angles on the body differ; each posture sensor takes its own body frame as its original coordinate system, so attitude sensors at different parts have different original coordinate systems.
  • the attitude signals acquired by each attitude sensor may be expressed in its corresponding original coordinate system.
  • the attitude signal acquired by the attitude sensor may be the attitude of a preset fixed coordinate system expressed in the original coordinate system.
  • the preset fixed coordinate system may be a geodetic coordinate system, or any other preset coordinate system.
  • the conversion relationship between the original coordinate system and the preset fixed coordinate system may be pre-stored in the motion data calibration system 180 .
  • the attitude signal can be a signal directly acquired by the attitude sensor, an attitude signal formed by signal processing flows such as conventional filtering, rectification, wavelet transformation, and glitch removal, or a signal obtained from any permutation and combination of one or more of the above processing flows.
  • the gesture signal may be data measured by an image sensor.
  • the image sensor may be an image sensor capable of acquiring depth information, such as a 3D structured light camera, a binocular camera, and the like.
  • the image sensor can be installed at any position where an image of the user's movement can be captured.
  • the number of image sensors can be one or more; when there are multiple image sensors, they may be installed at multiple different positions.
  • the image sensor can acquire a depth image of the user's motion.
  • the depth image may contain depth information of at least one measurement position on the user's body relative to the coordinate system where the image sensor is located.
  • the motion data calibration system 180 may calculate an attitude signal of each measurement position in the at least one measurement position based on changes in multiple frames of depth images. As mentioned above, each attitude signal may include one or more types of three-dimensional attitude data.
  • the motion data calibration system 180 can calculate different three-dimensional pose data.
  • the original coordinate system may be the coordinate system where the image sensor itself is located.
  • the gesture signal acquired by the image sensor may be expressed in the corresponding coordinate system (original coordinate system) of the image sensor itself.
  • Image sensors can be pre-calibrated. That is, the motion data calibration system 180 may pre-store the conversion relationship between the image sensor's own coordinate system and the aforementioned preset fixed coordinate system.
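Where the text above describes computing attitude signals from changes across multiple depth frames, the sketch below estimates motion of one tracked measurement position from two consecutive 3D positions in the camera's own (original) coordinate system. The known joint center and the small-angle treatment between frames are illustrative assumptions.

```python
import numpy as np

def point_velocity(p_prev, p_curr, dt):
    """Linear velocity of a tracked measurement position between two
    consecutive depth frames, in the camera's own (original) frame.
    Inputs are 3D points in meters; dt is the frame interval in seconds."""
    return (np.asarray(p_curr, float) - np.asarray(p_prev, float)) / dt

def joint_angular_velocity(p_prev, p_curr, joint, dt):
    """Angular velocity magnitude of the same position about an assumed
    joint center, from the angle swept between consecutive frames."""
    u0 = np.asarray(p_prev, float) - np.asarray(joint, float)
    u1 = np.asarray(p_curr, float) - np.asarray(joint, float)
    u0 /= np.linalg.norm(u0)
    u1 /= np.linalg.norm(u1)
    axis = np.cross(u0, u1)  # norm equals sin(swept angle) for unit vectors
    angle = np.arcsin(np.clip(np.linalg.norm(axis), 0.0, 1.0))
    return angle / dt  # rad/s
```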
  • for convenience of presentation, the following takes action data measured by at least one attitude sensor as an example.
  • we define the original coordinate system as the o-xyz coordinate system.
  • o is the coordinate origin of the original coordinate system o-xyz
  • the x-axis, y-axis, and z-axis are the three mutually perpendicular coordinate axes of the original coordinate system o-xyz.
  • each attitude signal may be three-dimensional attitude data whose corresponding measurement position is in the original coordinate system o-xyz.
  • the three-dimensional pose data may be pose data on three mutually perpendicular coordinate axes in the coordinate system where it is located.
  • attitude data may include angle data and angular velocity data.
  • the three-dimensional attitude data in the original coordinate system o-xyz may include angle data and angular velocity data on three mutually perpendicular coordinate axes x-axis, y-axis and z-axis.
  • the 3D attitude data of each attitude signal in the original coordinate system o-xyz may include 3D angle (Euler) data E_sens and 3D angular velocity (Gyro) data G_sens.
  • the three-dimensional angle data E_sens may include angle data E_sens_x on the x-axis, angle data E_sens_y on the y-axis, and angle data E_sens_z on the z-axis.
  • the three-dimensional angular velocity data G_sens may include angular velocity data G_sens_x on the x-axis, angular velocity data G_sens_y on the y-axis, and angular velocity data G_sens_z on the z-axis.
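To keep the notation above concrete, a minimal container for one attitude signal sample in its sensor's original frame might look like the sketch below. The class layout, timestamp field, and units are illustrative assumptions; only the E_sens/G_sens component names mirror the text.

```python
from dataclasses import dataclass

@dataclass
class AttitudeSample:
    """One attitude signal sample in the sensor's original frame o-xyz.

    euler holds E_sens = (E_sens_x, E_sens_y, E_sens_z), assumed degrees;
    gyro holds G_sens = (G_sens_x, G_sens_y, G_sens_z), assumed deg/s."""
    t: float                             # timestamp in seconds (assumed)
    euler: tuple[float, float, float]    # 3D angle (Euler) data
    gyro: tuple[float, float, float]     # 3D angular velocity (Gyro) data
```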
  • this step may be performed by the processing module 220 and/or the processing device 110 .
  • the motion data calibration system 180 can transform the action data into posture data in one shared, known coordinate system (for example, a target coordinate system defined as O-XYZ).
  • the target coordinate system O-XYZ may include three mutually perpendicular coordinate axes of X axis, Y axis and Z axis.
  • the target coordinate system O-XYZ may be any calibrated coordinate system. In some embodiments, the target coordinate system O-XYZ may be the aforementioned preset fixed coordinate system. In some embodiments, the target coordinate system O-XYZ and the aforementioned preset fixed coordinate system may be different coordinate systems.
  • the motion data calibration system 180 may pre-store the aforementioned conversion relationship between the preset fixed coordinate system and the target coordinate system O-XYZ.
  • the exercise data calibration system 180 is used to calibrate the motion data of the user when exercising, and its measurement object is the user. Therefore, in some embodiments, the target coordinate system O-XYZ may take the length direction of the torso of a standing human body as the Z axis; that is, the Z axis points opposite to the vertical direction of gravitational acceleration, perpendicular to the ground and toward the sky. The plane formed by the X axis and the Y axis is the horizontal plane perpendicular to the Z axis.
  • the X axis and the Y axis may be any two mutually perpendicular coordinate axes in a horizontal plane perpendicular to the Z axis.
  • the X-axis may be an east-west oriented coordinate axis, such as an axis pointing east
  • the Y-axis may be a north-south oriented coordinate axis, such as an axis pointing north.
  • Fig. 7 shows a schematic diagram of a target coordinate system O-XYZ provided according to an embodiment of the present specification, in which the X-axis direction is directly in front of the user 001 when standing. A sketch of constructing such a gravity-aligned frame follows.
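The sketch below builds one such target frame from a measured gravity direction (for example, a resting accelerometer reading expressed in some known frame). The Z axis is taken opposite to gravity as described above; the particular choice of reference direction for the X axis is an assumption, since any two perpendicular horizontal axes are allowed.

```python
import numpy as np

def build_target_frame(gravity_vec):
    """Axes of a target frame O-XYZ from a gravity vector: Z points
    opposite to gravity; X is an assumed horizontal reference; Y
    completes a right-handed set. Returns a 3x3 array with rows X, Y, Z."""
    g = np.asarray(gravity_vec, dtype=float)
    z_axis = -g / np.linalg.norm(g)           # up, opposite to gravity
    ref = np.array([1.0, 0.0, 0.0])
    if abs(np.dot(ref, z_axis)) > 0.9:        # reference too close to Z
        ref = np.array([0.0, 1.0, 0.0])
    x_axis = ref - np.dot(ref, z_axis) * z_axis   # project into horizontal
    x_axis /= np.linalg.norm(x_axis)
    y_axis = np.cross(z_axis, x_axis)
    return np.stack([x_axis, y_axis, z_axis])
```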
  • S3060 Transform each attitude signal into two-dimensional attitude data in the target coordinate system.
  • FIG. 8 is an exemplary flow chart of converting into two-dimensional pose data according to some embodiments of the present application.
  • FIG. 8 corresponds to step S3060.
  • step S3060 may include:
  • the attitude signal acquired by each attitude sensor can be expressed in its corresponding original coordinate system.
  • specifically, the attitude signal acquired by the attitude sensor may be the attitude of the preset fixed coordinate system expressed in the original coordinate system of the measurement position corresponding to the current attitude signal.
  • by inverse transformation, the motion data calibration system 180 can obtain, from each attitude signal, the expression of the corresponding attitude sensor's original coordinate system o-xyz in the preset fixed coordinate system.
  • the motion data calibration system 180 may pre-store the conversion relationship between the preset fixed coordinate system and the target coordinate system O-XYZ.
  • the motion data calibration system 180 can calculate and determine the conversion relationship between each original coordinate system o-xyz and the target coordinate system O-XYZ based on the conversion relationship between the preset fixed coordinate system and the target coordinate system O-XYZ.
  • the motion data calibration system 180 can convert the posture information in the original coordinate system o-xyz into the posture information of O-XYZ in the target coordinate system based on the conversion relationship.
  • the transformation relationship may be expressed as one or more rotation matrices.
  • the rotation matrix may be pre-stored in the motion data calibration system 180 .
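Since the conversion relationship can be expressed as rotation matrices, the two pre-stored relationships described above can be chained into a single per-sensor rotation, as the minimal sketch below shows; the matrix argument names and the v_dst = R @ v_src convention are assumptions for illustration.

```python
import numpy as np

def original_to_target(R_target_from_fixed, R_fixed_from_original):
    """Chain two pre-stored relationships into one rotation taking
    vectors from a sensor's original frame o-xyz to the target frame
    O-XYZ. Both inputs are assumed 3x3 rotation matrices with the
    convention v_dst = R @ v_src."""
    return R_target_from_fixed @ R_fixed_from_original
```

With this convention, the matrix relating the preset fixed coordinate system to the target coordinate system only needs to be stored once, while the per-sensor matrix can be re-derived from each attitude signal by the inverse conversion described above.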
  • each attitude signal may include three-dimensional attitude data E_sens_x, E_sens_y, E_sens_z, G_sens_x, G_sens_y, and G_sens_z in its corresponding original coordinate system o-xyz.
  • based on the conversion relationship between the target coordinate system O-XYZ and the original coordinate system o-xyz, the motion data calibration system 180 can determine, for the measurement position corresponding to each attitude signal, the three-dimensional motion data of its original coordinate system o-xyz in the target coordinate system O-XYZ.
  • the three-dimensional motion data at least includes angular velocity data G_global_X on the X axis, angular velocity data G_global_Y on the Y axis, and angular velocity data G_global_Z on the Z axis.
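Applying the chained rotation to a sensor-frame angular velocity sample then yields the target-frame components named above. The sketch assumes a pure rotation (no scaling or residual misalignment); the 90-degree mounting angle in the usage example is made up for illustration.

```python
import numpy as np

def gyro_to_target(R_target_from_original, g_sens):
    """Express a sensor-frame angular velocity (G_sens_x, G_sens_y,
    G_sens_z) in the target frame, yielding (G_global_X, G_global_Y,
    G_global_Z)."""
    return R_target_from_original @ np.asarray(g_sens, dtype=float)

# Illustrative usage with a hypothetical sensor mounted 90 degrees about Z:
R = np.array([[0.0, -1.0, 0.0],
              [1.0,  0.0, 0.0],
              [0.0,  0.0, 1.0]])
g_global = gyro_to_target(R, (10.0, 0.0, 2.0))  # -> [0., 10., 2.]
```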
  • S3066 Transform the three-dimensional motion data in the target coordinate system O-XYZ into two-dimensional posture data in the target coordinate system O-XYZ.
  • the two-dimensional posture data is data in a two-dimensional coordinate system.
  • the two-dimensional coordinate systems in the target coordinate system O-XYZ may include the motion plane in which the limbs of the user 001 swing and the motion plane in which the torso of the user 001 rotates.
  • the two-dimensional attitude data in the target coordinate system O-XYZ may include horizontal attitude data and vertical attitude data.
  • the horizontal attitude data may include horizontal angle data E_global_Z and horizontal angular velocity data G_global_Z for movement in the horizontal plane perpendicular to the Z axis.
  • the vertical attitude data may include vertical angle data E_global_XY and vertical angular velocity data G_global_XY for movement in any vertical plane perpendicular to the horizontal plane.
  • the vertical plane may be any plane perpendicular to the horizontal plane.
  • the horizontal angle data E_global_Z may be the rotation angle of the measurement position within the horizontal plane in the target coordinate system O-XYZ.
  • the horizontal angular velocity data G_global_Z may be the rotational angular velocity of the measurement position within the horizontal plane in the target coordinate system O-XYZ.
  • the vertical angle data E_global_XY may be the rotation angle of the measurement position within any vertical plane in the target coordinate system O-XYZ.
  • the vertical angular velocity data G_global_XY may be the rotational angular velocity of the measurement position within any vertical plane in the target coordinate system O-XYZ.
  • for most gym exercises, the main part of user 001's action can be decomposed into movement in two planes, for example movement in the horizontal plane while staying in place and movement in some vertical plane.
  • the different actions performed by the user 001 when exercising can be distinguished by the horizontal-plane movement and the vertical-plane movement alone.
  • for example, when user 001 runs on a treadmill, the running action mainly consists of rotational movement around joints whose axes are parallel to the horizontal plane. The rotational movement then occurs in a vertical plane perpendicular to the horizontal plane, and that vertical plane extends along the direction user 001's body is facing.
  • in this vertical plane, the center of gravity of user 001 moves linearly up and down, and the user's limbs swing along with the center of gravity.
  • a biceps curl involves movement only in a vertical plane.
  • a seated chest fly involves movement only in the horizontal plane.
  • Fig. 9 is a coordinate system diagram of user 001 when exercising according to some embodiments of the present application.
  • the orientation of user 001 is basically unchanged.
  • the action of the biceps curl is mainly concentrated in the vertical plane P formed by the forearm AO' and the upper arm O'B. Because the action works against the gravity of the dumbbell, the plane P is generally perpendicular to the horizontal X-Y plane.
  • the pose of the upper arm O'B is basically unchanged, and the forearm AO' swings back and forth around the elbow O'.
  • the angular velocity vector of the swing points along the normal direction of the vertical plane P.
  • user 001's bicep curl movement has very small components in other directions and can be ignored.
  • the motion data calibration system 180 converts the 3D motion data into 2D posture data.
  • step S3066 may include:
  • S3066-2 Convert the angular velocity data G_global_X on the X-axis and the angular velocity data G_global_Y on the Y-axis into the vertical angular velocity data G_global_XY by the vector law.
  • the vertical angular velocity data G_global_XY can be expressed as the following formulas:

    |G_global_XY| = √(G_global_X² + G_global_Y²)    Formula (1)

    G_global_XY = |G_global_XY| · a    Formula (2)

    where a is a sign factor indicating the direction of rotation within the vertical plane.
  • S3066-4 Time-integrate the vertical angular velocity data G_global_XY over the time between the start position and end position of user 001's action, obtaining the vertical angle data E_global_XY:

    E_global_XY = ∫_startpos^endpos G_global_XY dt    Formula (3)

    where startpos and endpos are the start time and end time corresponding to the start position and end position of an action.
  • S3066-6 Take the angular velocity data G_global_Z on the Z axis as the horizontal angular velocity data G_global_Z.
  • S3066-8 Time-integrate the horizontal angular velocity data G_global_Z over the time between the start position and end position of user 001's action, obtaining the horizontal angle data E_global_Z:

    E_global_Z = ∫_startpos^endpos G_global_Z dt    Formula (4)

    where startpos and endpos are defined as above; a sketch implementing steps S3066-2 through S3066-8 follows.
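The compact sketch below mirrors Formulas (1) through (4) and step S3066-6 over one action's samples. The array layout, the trapezoidal integration, and treating the sign factor a as a plain scalar input are implementation assumptions; the exact definition of a is not fixed here.

```python
import numpy as np

def to_two_dimensional(g_global, t, a=1.0):
    """Reduce target-frame angular velocities to the 2D attitude data of
    step S3066. g_global is an (N, 3) array of (G_global_X, G_global_Y,
    G_global_Z) samples covering one action from startpos to endpos, and
    t holds the matching timestamps in seconds."""
    g = np.asarray(g_global, dtype=float)
    t = np.asarray(t, dtype=float)
    # Formulas (1)-(2): vertical angular velocity from the X/Y components.
    g_vert = a * np.hypot(g[:, 0], g[:, 1])    # G_global_XY
    # Step S3066-6: the Z component is the horizontal angular velocity.
    g_horiz = g[:, 2]                          # G_global_Z
    # Formulas (3)-(4): trapezoidal time integration over the action.
    dt = np.diff(t)
    e_vert = float(np.sum(0.5 * (g_vert[1:] + g_vert[:-1]) * dt))
    e_horiz = float(np.sum(0.5 * (g_horiz[1:] + g_horiz[:-1]) * dt))
    return {"G_global_XY": g_vert, "G_global_Z": g_horiz,
            "E_global_XY": e_vert, "E_global_Z": e_horiz}
```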
  • the motion data calibration method 3000 may also include:
  • the motion data calibration system 180 can determine the relative motion between different moving parts of user 001's body from the two-dimensional posture data corresponding to the posture signals of the at least one measurement position on user 001's body. For example, from the feature information corresponding to the attitude sensor on user 001's arm and the feature information corresponding to the attitude sensor on user 001's torso, the relative movement between the arm and the torso during exercise can be judged.
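As a small illustration of the comparison just described, one could difference the calibrated two-dimensional angle data of two measurement positions. The sketch below assumes the arm and torso values were produced by a pipeline like the one above; using a plain angle difference as the indicator of relative motion is an assumption, not a scoring rule from this application.

```python
def relative_vertical_angle(e_arm_xy, e_torso_xy):
    """Difference of two calibrated vertical angle values E_global_XY,
    e.g. arm versus torso, as one simple indicator of their relative
    motion during an action."""
    return e_arm_xy - e_torso_xy
```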
  • in summary, the motion data calibration method 3000 and system 180 provided by this application can convert the action data of user 001 during exercise from three-dimensional posture data on three mutually perpendicular coordinate axes into two-dimensional data in the target coordinate system, namely posture data in the horizontal plane and posture data in a vertical plane, so that user 001's actions are divided into horizontal movement and vertical movement, avoiding the data discrepancies caused by different orientations of user 001.
  • the method 3000 and system 180 can remove the influence of user 001's orientation on the motion data. Therefore, the method 3000 and the system 180 can calibrate the exercise data without requiring user 001 to perform a calibration action.


Abstract

The motion data calibration method and system provided by this application can convert a user's action data during exercise from three-dimensional posture data on three mutually perpendicular coordinate axes into two-dimensional data in a target coordinate system, namely posture data in the horizontal plane and posture data in a vertical plane, thereby dividing the user's actions during exercise into horizontal movement and vertical movement and avoiding the data discrepancies caused by different user orientations. The method and system can remove the influence of the user's orientation on the motion data, and can therefore calibrate the motion data without requiring the user to perform a calibration action.

Description

运动数据标定方法和系统 技术领域
本申请涉及可穿戴设备技术领域,尤其涉及一种运动数据标定方法和系统。
背景技术
随着人们对科学运动和身体健康的关注,运动监控设备正在极大的发展。目前运动监控设备主要是通过传感器对用户运动过程中的动作参数进行监控。现有技术中的运动监控设备,在对传感器进行坐标标定时,需要用户做一系列标定动作(如双手前平举、双手侧平举等),从而运动监控设备才能根据标定动作将传感器坐标系下的姿态数据转换成人体坐标系下的姿态数据。每当用户调整运动监控设备的位置,都需要重新进行坐标标定,否则计算结果将受到影响,导致用户体验不好。
因此,需要提供一种不需要用户做标定动作,即可对运动数据进行标定的运动数据标定方法和系统。
发明内容
本申请提供一种不需要用户做标定动作,即可对运动数据进行标定的运动数据标定方法和系统。
第一方面,本申请提一种运动数据标定方法,包括:获取用户运动时的动作数据,所述动作数据包括所述用户身上的至少一个测量位置对应的至少一个姿态信号,所述至少一个姿态信号中的每个姿态信号包括其对应的测量位置在原始坐标系下的三维姿态数据;构建目标坐标系,所述目标坐标系包括X轴、Y轴、Z轴三个互相垂直的坐标轴;以及将所述每个姿态信号转换成在所述目标坐标系下的二维姿态数据。
在一些实施例中,所述每个姿态信号包括由姿态传感器测量得到的数据,所述原始坐标系包括所述姿态传感器所在的坐标系。
在一些实施例中,所述姿态传感器包括加速度传感器、角速度传感器以及磁力传感器中的至少一种。
在一些实施例中,所述每个姿态信号包括由图像传感器测量的数据,所述原始坐标系包括所述图像传感器所在的坐标系。
在一些实施例中,所述三维姿态数据包括三个相互垂直的坐标轴上的角度数据和角速度数据。
在一些实施例中,所述将所述每个姿态信号转换成在所述目标坐标系下的二维姿态数据,包括:获取预先存储的所述目标坐标系与所述原始坐标系之间的转换关系;基于所述转换关系,将所述每个姿态信号转换为所述目标坐标系下的三维运动数据,所述三维运动数据至少包括所述X轴上的角速度数据、所述Y轴上的角速度数据以及所述Z轴上的角速度数据;以及将所述目标坐标系中的所述三维运动数据转换为所述目标坐标系中的所述二维姿态数据。
在一些实施例中,所述目标坐标系的所述Z轴为重力加速度所在的竖直方向的反方向。
在一些实施例中,所述目标坐标系下的所述二维姿态数据包括:水平姿态数据,包括在垂直于所述Z轴的水平平面内运动时的水平角度数据和水平角速度数据;以及竖直姿态数据,包括在垂直于所述水平平面的任意竖直平面内运动时的竖直角度数据和竖直角速度数据。
在一些实施例中,所述将所述目标坐标系中的所述三维运动数据转换为所述目标坐标系中的所述二维姿态数据,包括:将所述X轴上的角速度数据和所述Y轴上的角速度数据通过矢量法则转换成所述竖直角速度数据;基于所述用户运动时的起始位置和结束位置对应的时间,对所述竖直角速度数据进行时间积分,获取所述竖直角度数据;将所述Z轴的角速度数据作为所述水平角速度数据;以及基于所述用户运动时的起始位置和结束位置对应的时间,对所述水平角速度数据进行时间积分,获取所述水平角度数据。
在一些实施例中,所述运动数据标定方法还包括:基于所述每个姿态信号对应的所述二维姿态数据,确定所述至少一个测量位置之间的相对运动。
第二方面,本说明书还提供一种运动数据标定系统,包括至少一个存储介质以及至少一个处理器,所述至少一个存储介质存储有至少一个指令集用于运动数据标定;所述至少一个处理器同所述至少一个存储介质通信连接,其中当所述运动数据标定系统运行时,所述至少一个处理器读取所述至少一个指令集并实施本说明书第一方面所述的运动数据标定方法。
由以上技术方案可知,本申请提供的运动数据标定方法和系统,能够将用户运动时 的动作数据从三个相互垂直的坐标轴中的三维姿态数据转化为目标坐标系下的二维数据,即在水平平面内的姿态数据和竖直平面内的姿态数据,从而将用户运动时的动作分为在水平方向上的运动和在竖直方向上的运动,从而避免因用户朝向不同所带来的数据差异。所述方法和系统可以去除用户朝向对运动数据的影响。因此,所述方法和系统无需用户做标定动作,即可对运动数据进行标定。
附图说明
本申请将以示例性实施例的方式进一步说明,这些示例性实施例将通过附图进行详细描述。这些实施例并非限制性的,在这些实施例中,相同的编号表示相同的结构,其中:
图1是根据本申请一些实施例所示的运动监控系统的应用场景示意图;
图2是根据本申请一些实施例所示的可穿戴设备的示例性硬件和/或软件的示意图;
图3是根据本申请一些实施例所示的计算设备的示例性硬件和/或软件的示意图;
图4是根据本申请一些实施例所示的可穿戴设备的示例性结构图;
图5是根据本申请一些实施例所示的运动监控方法的示例性流程图;
图6是根据本申请一些实施例所示的运动数据标定方法的示例性流程图;
图7是根据本申请一些实施例所示的目标坐标系的示意图;
图8是根据本申请一些实施例所示的转化为二维姿态数据的示例性流程图;以及
图9是根据本申请一些实施例所示的用户运动时的坐标系图。
具体实施方式
为了更清楚地说明本申请实施例的技术方案,下面将对实施例描述中所需要使用的附图作简单的介绍。显而易见地,下面描述中的附图仅仅是本申请的一些示例或实施例,对于本领域的普通技术人员来讲,在不付出创造性劳动的前提下,还可以根据这些附图将本申请应用于其它类似情景。除非从语言环境中显而易见或另做说明,图中相同标号代表相同结构或操作。
应当理解,本文使用的“系统”、“装置”、“单元”和/或“模组”是用于区分不同级别的不同组件、元件、部件、部分或装配的一种方法。然而,如果其他词语可实现相同的目的,则可通过其他表达来替换所述词语。
如本申请和权利要求书中所示,除非上下文明确提示例外情形,“一”、“一个”、“一 种”和/或“该”等词并非特指单数,也可包括复数。一般说来,术语“包括”与“包含”仅提示包括已明确标识的步骤和元素,而这些步骤和元素不构成一个排它性的罗列,方法或者设备也可能包含其它的步骤或元素。
本申请中使用了流程图用来说明根据本申请的实施例的系统所执行的操作。应当理解的是,前面或后面操作不一定按照顺序来精确地执行。相反,可以按照倒序或同时处理各个步骤。同时,也可以将其他操作添加到这些过程中,或从这些过程移除某一步或数步操作。
本申请中提供一种运动监控系统,该运动监控系统可以获取用户运动时的动作信号,其中,动作信号至少包括肌电信号、姿态信号、心电信号、呼吸频率信号等。该系统可以至少基于肌电信号对应的特征信息或姿态信号对应的特征信息对用户运动的动作进行监控。例如,通过肌电信号对应的频率信息、幅值信息和姿态信号对应的角速度、角速度方向和角速度的角速度值、角度、位移信息、应力等确定用户的动作类型、动作数量、动作质量动作时间、或者用户实施动作时的生理参数信息等。在一些实施例中,运动监控系统还可以根据对用户健身动作的分析结果,生成对用户健身动作的反馈,以对用户的健身进行指导。例如,用户的健身动作不标准时,运动监控系统可以对用户发出提示信息(例如,语音提示、振动提示、电流刺激等)。该运动监控系统可以应用于可穿戴设备(例如,服装、护腕、头盔)、医学检测设备(例如,肌电测试仪)、健身设备等,该运动监控系统通过获取用户运动时的动作信号可以对用户的动作进行精准地监控和反馈,而不需要专业人员的参与,可以在提高用户的健身效率的同时降低用户健身的成本。
图1是根据本申请一些实施例所示的运动监控系统的应用场景示意图。如图1所示,运动监控系统100可以包括处理设备110、网络120、可穿戴设备130和移动终端设备140。在一些实施例中,运动监控系统100还可以包括运动数据标定系统180。运动监控系统100可以获取用于表征用户运动动作的动作信号(例如,肌电信号、姿态信号、心电信号、呼吸频率信号等)并根据用户的动作信号对用户运动时的动作进行监控和反馈。
例如,运动监控系统100可以对用户健身时的动作进行监控和反馈。当用户穿戴可穿戴设备130进行健身运动时,可穿戴设备130可以获取用户的动作信号。处理设备110或移动终端设备140可以接收并对用户的动作信号进行分析,以判断用户的健身动作是否规范,从而对用户的动作进行监控。具体地,对用户的动作进行监控可以包括确定动作的动作类型、动作数量、动作质量、动作时间、或者用户实施动作时的生理参数信息 等。进一步地,运动监控系统100可以根据对用户健身动作的分析结果,生成对用户健身动作的反馈,以对用户的健身进行指导。
再例如,运动监控系统100可以对用户跑步时的动作进行监控和反馈。例如,当用户穿戴可穿戴设备130进行跑步运动时,运动监控系统100可以监控用户跑步动作是否规范,跑步时间是否符合健康标准等。当用户跑步时间过长或者跑步动作不正确时,健身设备可以向用户反馈其运动状态,以提示用户需要调整跑步动作或者跑步时间。
在一些实施例中,处理设备110可以用于处理与用户运动相关的信息和/或数据。例如,处理设备110可以接收用户的动作信号(例如,肌电信号、姿态信号、心电信号、呼吸频率信号等),并进一步提取动作信号对应的特征信息(例如,动作信号中的肌电信号对应的特征信息、姿态信号对应的特征信息)。在一些实施例中,处理设备110可以对可穿戴设备130采集的肌电信号或姿态信号进行特定的信号处理,例如信号分段、信号预处理(例如,信号校正处理、滤波处理等)等。在一些实施例中,处理设备110也可以基于用户的动作信号判断用户动作是否正确。例如,处理设备110可以基于肌电信号对应的特征信息(例如,幅值信息、频率信息等)判断用户动作是否正确。又例如,处理设备110可以基于姿态信号对应的特征信息(例如,角速度、角速度方向、角速度的加速度、角度、位移信息、应力等)判断用户动作是否正确。再例如,处理设备110可以基于肌电信号对应的特征信息和姿态信号对应的特征信息判断用户动作是否正确。在一些实施例中,处理设备110还可以判断用户运动时的生理参数信息是否符合健康标准。在一些实施例中,处理设备110还可以发出相应指令,用以反馈用户的运动情况。例如,用户进行跑步运动时,运动监控系统100监控到用户跑步时间过长,此时处理设备110可以向移动终端设备140发出指令以提示用户调整跑步时间。需要注意的是,姿态信号对应的特征信息并不限于上述的角速度、角速度方向、角速度的加速度、角度、位移信息、应力等,还可以为其它特征信息,凡是能够用于体现用户身体发生相对运动的参数信息都可以为姿态信号对应的特征信息。例如,当姿态传感器为应变式传感器时,通过测量应变式传感器中随着拉伸长度而变化的电阻的大小,可以获取用户关节处的弯曲角度和弯曲方向。
在一些实施例中,处理设备110可以是本地的或者远程的。例如,处理设备110可以通过网络120访问存储于可穿戴设备130和/或移动终端设备140中的信息和/或资料。在一些实施例中,处理设备110可以直接与可穿戴设备130和/或移动终端设备140连接以访问存储于其中的信息和/或资料。例如,处理设备110可以位于可穿戴设备130中, 并通过网络120实现与移动终端设备140的信息交互。再例如,处理设备110可以位于移动终端设备140中,并通过网络实现与可穿戴设备130的信息交互。在一些实施例中,处理设备110可以在云平台上执行。
在一些实施例中,处理设备110可以处理与运动监控有关的数据和/或信息以执行一个或多个本申请中描述的功能。在一些实施例中,处理设备110可以获取可穿戴设备130采集的用户运动时的动作信号。在一些实施例中,处理设备110可以向可穿戴设备130或移动终端设备140发送控制指令。控制指令可以控制可穿戴设备130及其各传感器的开关状态,还可以控制移动终端设备140发出提示信息。在一些实施例中,处理设备110可以包含一个或多个子处理设备(例如,单芯处理设备或多核多芯处理设备)。
网络120可以促进运动监控系统100中数据和/或信息的交换。在一些实施例中,运动监控系统100中的一个或多个组件可以通过网络120发送数据和/或信息给运动监控系统100中的其他组件。例如,可穿戴设备130采集的动作信号可以通过网络120传输至处理设备110。又例如,处理设备110中关于动作信号的确认结果可以通过网络120传输至移动终端设备140。在一些实施例中,网络120可以是任意类型的有线或无线网络。
可穿戴设备130是指具有穿戴功能的服装或设备。在一些实施例中,可穿戴设备130可以包括但不限于上衣装置130-1、裤子装置130-2、护腕装置130-3和鞋子130-4等。在一些实施例中,可穿戴设备130可以包括多个传感器。传感器可以获取用户运动时的各种动作信号(例如,肌电信号、姿态信号、温度信息、心跳频率、心电信号等)。在一些实施例中,传感器可以包括但不限于肌电传感器、姿态传感器、温度传感器、湿度传感器、心电传感器、血氧饱和度传感器、霍尔传感器、皮电传感器、旋转传感器等中的一种或多种。例如,上衣装置130-1中人体肌肉位置(例如,肱二头肌、肱三头肌、背阔肌、斜方肌等)处可以设置肌电传感器,肌电传感器可以贴合用户皮肤并采集用户运动时的肌电信号。又例如,上衣装置130-1中人体左侧胸肌附近可以设置心电传感器,心电传感器可以采集用户的心电信号。再例如,裤子装置130-2中人体肌肉位置(例如,臀大肌、股外侧肌、股内侧肌、腓肠肌等)处可以设置姿态传感器,姿态传感器可以采集用户的姿态信号。在一些实施例中,可穿戴设备130还可以对用户的动作进行反馈。例如,用户运动时身体某一部位的动作不符合标准时,该部位对应的肌电传感器可以产生刺激信号(例如,电流刺激或者击打信号)以提醒用户。
需要注意的是,可穿戴设备130并不限于图1中所示的上衣装置130-1、裤子装置130-2、护腕装置130-3和鞋子装置130-4,还可以包括应用在其他需要进行运动监控的 设备,例如、头盔装置、护膝装置等,在此不做限定,任何可以使用本申请所包含的运动监控方法的设备都在本申请的保护范围内。
在一些实施例中,移动终端设备140可以获取运动监控系统100中的信息或数据。在一些实施例中,移动终端设备140可以接收处理设备110处理后的运动数据,并基于处理后的运动数据反馈运动记录等。示例性的反馈方式可以包括但不限于语音提示、图像提示、视频展示、文字提示等。在一些实施例中,用户可以通过移动终端设备140获取自身运动过程中的动作记录。例如,移动终端设备140可以与可穿戴设备130通过网络120连接(例如,有线连接、无线连接),用户可以通过移动终端设备140获取用户运动过程中的动作记录,该动作记录可通过移动终端设备140传输至处理设备110。在一些实施例中,移动终端设备140可以包括移动装置140-1、平板电脑140-2、笔记本电脑140-3等中的一种或其任意组合。在一些实施例中,移动装置140-1可以包括手机、智能家居装置、智能行动装置、虚拟实境装置、增强实境装置等,或其任意组合。在一些实施例中,智能家居装置可以包括智能电器的控制装置、智能监测装置、智能电视、智能摄像机等,或其任意组合。在一些实施例中,智能行动装置可以包括智能电话、个人数字助理(PDA)、游戏装置、导航装置、POS装置等,或其任意组合。在一些实施例中,虚拟实境装置和/或增强实境装置可以包括虚拟实境头盔、虚拟实境眼镜、虚拟实境眼罩、增强实境头盔、增强实境眼镜、增强实境眼罩等,或其任意组合。
在一些实施例中,运动监控系统100还可以包括运动数据标定系统180。运动数据标定系统180可以用于处理与用户运动相关的动作数据,并能够执行本说明书所述的运动数据标定方法。具体地,运动数据标定系统180可以接收用户运动时的动作数据,并能够将所述动作数据从三个相互垂直的坐标轴中的三维姿态数据转化为目标坐标系下的二维姿态数据,即在水平平面内的姿态数据和竖直平面内的姿态数据,从而将用户运动时的动作分为在水平方向上的运动和在竖直方向上的运动,避免因用户朝向不同所带来的数据差异。运动数据标定系统180可以去除用户朝向对运动数据的影响,无需用户做标定动作,即可对运动数据进行标定。所述动作数据可以是所述用户身上的测量位置在运动时的三维姿态数据。关于所述动作数据以及所述运动数据标定方法将在后面的描述中详细介绍。
在一些实施例中,运动数据标定系统180可以集成在处理设备110上。在一些实施例中,运动数据标定系统180也可以集成在移动终端设备140上。在一些实施例中,运动数据标定系统180也可以独立于处理设备110和移动终端设备140单独存在。运动数 据标定系统180可以与处理设备110、可穿戴设备130和移动终端设备140通信连接,以进行信息和/或数据的传输和交换。在一些实施例中,运动数据标定系统180可以通过网络120访问存储于处理设备110、可穿戴设备130和/或移动终端设备140中的信息和/或资料。在一些实施例中,可穿戴设备130可以直接与处理设备110和/或移动终端设备140连接以访问存储于其中的信息和/或资料。例如,运动数据标定系统180可以位于处理设备110中,并通过网络120实现与可穿戴设备130和移动终端设备140的信息交互。再例如,运动数据标定系统180可以位于移动终端设备140中,并通过网络实现与处理设备110和可穿戴设备130的信息交互。在一些实施例中,运动数据标定系统180可以在云平台上执行,并通过网络实现与处理设备110、可穿戴设备130和移动终端设备140的信息交互。
为了方便展示,下面的描述中我们将以运动数据标定系统180位于处理设备110中为例进行描述。
在一些实施例中,运动监控系统100还可以包括数据库。数据库可以存储资料(例如,初始设置的阈值条件等)和/或指令(例如,反馈指令)。在一些实施例中,数据库可以存储从可穿戴设备130和/或移动终端设备140获取的资料。在一些实施例中,数据库可以存储供处理设备110执行或使用的信息和/或指令,以执行本申请中描述的示例性方法。在一些实施例中,数据库可以与网络120连接以与运动监控系统100的一个或多个组件(例如,处理设备110、可穿戴设备130、移动终端设备140等)通讯。运动监控系统100的一个或多个组件可以通过网络120访问存储于数据库中的资料或指令。在一些实施例中,数据库可以直接与运动监控系统100中的一个或多个组件连接或通讯。在一些实施例中,数据库可以是处理设备110的一部分。
图2是根据本申请一些实施例所示的可穿戴设备130的示例性硬件和/或软件的示意图。如图2所示,可穿戴设备130可以包括获取模块210、处理模块220(也被称为处理器)、控制模块230(也被称为主控、MCU、控制器)、通讯模块240、供电模块250以及输入/输出模块260。
获取模块210可以用于获取用户运动时的动作信号。在一些实施例中,获取模块210可以包括传感器单元,传感器单元可以用于获取用户运动时的一种或多种动作信号。在一些实施例中,传感器单元可以包括但不限于肌电传感器、姿态传感器、心电传感器、呼吸传感器、温度传感器、湿度传感器、惯性传感器、血氧饱和度传感器、霍尔传感器、皮电传感器、旋转传感器等中的一种或多种。在一些实施例中,动作信号可以包括肌电 信号、姿态信号、心电信号、呼吸频率、温度信号、湿度信号等中的一种或多种。传感器单元可以根据所要获取的动作信号类型放置在可穿戴设备130的不同位置。例如,在一些实施例中,肌电传感器(也被称为电极元件)可以设置于人体肌肉位置,肌电传感器可以被配置为采集用户运动时的肌电信号。肌电信号及其对应的特征信息(例如,频率信息、幅值信息等)可以反映用户运动时肌肉的状态。姿态传感器可以设置于人体的不同位置(例如,可穿戴设备130中与躯干、四肢、关节对应的位置),姿态传感器可以被配置为采集用户运动时的姿态信号。姿态信号及其对应的特征信息(例如,角速度方向、角速度值、角速度加速度值、角度、位移信息、应力等)可以反映用户运动的姿势。心电传感器可以设置于人体胸口周侧的位置,心电传感器可以被配置为采集用户运动时的心电数据。呼吸传感器可以设置于人体胸口周侧的位置,呼吸传感器可以被配置为采集用户运动时的呼吸数据(例如,呼吸频率、呼吸幅度等)。温度传感器可以被配置为采集用户运动时的温度数据(例如,体表温度)。湿度传感器可以被配置为采集用户运动时的外部环境的湿度数据。
处理模块220可以处理来自获取模块210、控制模块230、通讯模块240、供电模块250和/或输入/输出模块260的数据。例如,处理模块220可以处理来自获取模块210的用户运动过程中的动作信号。在一些实施例中,处理模块220可以将获取模块210获取的动作信号(例如,肌电信号、姿态信号)进行预处理。例如,处理模块220对用户运动时的肌电信号或姿态信号进行分段处理。又例如,处理模块220可以对用户运动时的肌电信号进行预处理(例如,滤波处理、信号校正处理),以提高肌电信号质量。再例如,处理模块220可以基于用户运动时的姿态信号确定与姿态信号对应的特征信息。在一些实施例中,处理模块220可以处理来自输入/输出模块260的指令或操作。在一些实施例中,处理后的数据可以存储到存储器或硬盘中。在一些实施例中,处理模块220可以将其处理后的数据通过通讯模块240或网络120传送到运动监控系统100中的一个或者多个组件中。例如,处理模块220可以将用户运动的监控结果发送给控制模块230,控制模块230可以根据动作确定结果执行后续的操作或指令。
控制模块230可以与可穿戴设备130中其他模块相连接。在一些实施例中,控制模块230可以控制可穿戴设备130中其它模块的运行状态。例如,控制模块230可以控制供电模块250的供电状态(例如,正常模式、省电模式)、供电时间等。又例如,控制模块230可以根据用户的动作确定结果控制输入/输出模块260,进而可以控制移动终端设备140向用户发送其运动的反馈结果。当用户运动时的动作出现问题(例如,动作不 符合标准)时,控制模块230可以控制输入/输出模块260,进而可以控制移动终端设备140向用户进行反馈,使得用户可以实时了解自身运动状态并对动作进行调整。在一些实施例中,控制模块230还可以控制获取模块210中的一个或多个传感器或者其它模块对人体进行反馈。例如,当用户运动过程中某块肌肉发力强度过大,控制模块230可以控制该肌肉位置处的电极模块对用户进行电刺激以提示用户及时调整动作。
在一些实施例中,通讯模块240可以用于信息或数据的交换。在一些实施例中,通讯模块240可以用于可穿戴设备130内部组件之间的通信。例如,获取模块210可以发送用户动作信号(例如,肌电信号、姿态信号等)到通讯模块240,通讯模块240可以将所述动作信号发送给处理模块220。在一些实施例中,通讯模块240还可以用于可穿戴设备130和运动监控系统100中的其他组件之间的通信。例如,通讯模块240可以将可穿戴设备130的状态信息(例如,开关状态)发送到处理设备110,处理设备110可以基于所述状态信息对可穿戴设备130进行监控。通讯模块240可以采用有线、无线以及有线/无线混合技术。
在一些实施例中,供电模块250可以为运动监控系统100中的其他组件提供电力。
输入/输出模块260可以获取、传输和发送信号。输入/输出模块260可以与运动监控系统100中的其他组件进行连接或通信。运动监控系统100中的其他组件可以通过输入/输出模块260实现连接或通信。
需要注意的是,以上对于运动监控系统100及其模块的描述,仅为描述方便,并不能把本申请的一个或多个实施例限制在所举实施例范围之内。可以理解,对于本领域的技术人员来说,在了解该系统的原理后,可能在不背离这一原理的情况下,对各个模块进行任意组合,或者构成子系统与其他模块连接,或者对其中的一个或多个模块进行省略。例如,获取模块210和处理模块220可以为一个模块,该模块可以具有获取和处理用户动作信号的功能。又例如,处理模块220还可以不设置于可穿戴设备130中,而集成在处理设备110中。诸如此类的变形,均在本申请的一个或多个实施例的保护范围之内。
图3是根据本申请一些实施例所示的计算设备300的示例性硬件和/或软件的示意图。在一些实施例中,处理设备110和/或移动终端设备140可以在计算设备300上实现。在一些实施例中,运动数据标定系统180可以在计算设备300上实现。如图3所示,计算设备300可以包括内部通信总线310、至少一个处理器320、至少一个存储介质、通信端口350、输入/输出接口360以及用户界面380。
内部通信总线310可以实现计算设备300中各组件间的数据通信。例如,至少一个处理器320可以通过内部通信总线310将数据发送到至少一个存储介质或输入/输出端口360等其它硬件中。在一些实施例中,内部通信总线310可以为工业标准(ISA)总线、扩展工业标准(EISA)总线、视频电子标准(VESA)总线、外部部件互联标准(PCI)总线等。在一些实施例中,内部通信总线310可以用于连接图1所示的运动监控系统100中的各个模块(例如,获取模块210、处理模块220、控制模块230、通讯模块240、输入输出模块260)。
计算设备300的至少一个存储介质可以包括数据存储装置。所述数据存储装置可以是非暂时性存储介质,也可以是暂时性存储介质。比如,所述数据存储装置可以包括只读存储器(ROM)330、随机存储器(RAM)340、硬盘370等中的一种或多种。示例性的ROM可以包括掩模ROM(MROM)、可编程ROM(PROM)、可擦除可编程ROM(PEROM)、电可擦除可编程ROM(EEPROM)、光盘ROM(CD-ROM)和数字通用盘ROM等。示例性的RAM可以包括动态RAM(DRAM)、双倍速率同步动态RAM(DDR SDRAM)、静态RAM(SRAM)、晶闸管RAM(T-RAM)和零电容(Z-RAM)等。存储介质可以存储从运动监控系统100的任何其他组件中获取的数据/信息。存储介质还包括存储在所述数据存储装置中的至少一个指令集。所述指令是计算机程序代码,所述计算机程序代码可以包括执行本说明书提供的运动数据标定方法的程序、例程、对象、组件、数据结构、过程、模块等等。在一些实施例中,计算设备300的存储介质可以位于可穿戴设备130中,也可以位于处理设备110中。
至少一个处理器320可以同至少一个存储介质通信连接。至少一个处理器320用以执行上述至少一个指令集。当计算设备300运行时,至少一个处理器320可以读取所述至少一个指令集,并且根据所述至少一个指令集的指示执行计算指令(程序代码),从而执行本申请描述的运动监控系统100的功能。处理器320可以执行数据处理方法包含的所有步骤。所述计算指令可以包括程序、对象、组件、数据结构、过程、模块和功能(所述功能指本申请中描述的特定功能)。例如,处理器320可以处理从运动监控系统100的可穿戴设备130或/和移动终端设备140中获取的用户运动时的动作信号(例如,肌电信号、姿态信号),并根据用户运动时的动作信号对用户的运动的动作进行监控。例如,处理器320可以用于处理从运动监控系统100的可穿戴设备130或/和移动终端设备140中获取的用户运动时的动作数据,并根据所述至少一个指令集的指示执行本说明书所述的运动数据标定方法,将所述动作数据转化为二维姿态数据。在一些实施例中, 处理器320可以包括微控制器、微处理器、精简指令集计算机(RISC)、专用集成电路(ASIC)、应用特定指令集处理器(ASIP)、中央处理器(CPU)、图形处理单元(GPU)、物理处理单元(PPU)、微控制器单元、数字信号处理器(DSP)、现场可编程门阵列(FPGA)、高级精简指令集计算机(ARM)、可编程逻辑器件以及能够执行一个或多个功能的任何电路和处理器等,或其任意组合。仅为了说明,图3中的计算设备300只描述了一个处理器320,但需要注意的是,本申请中的计算设备300还可以包括多个处理器320。
硬盘370可以用于存储处理设备110所产生的或从处理设备110所接收到的信息及数据。例如,硬盘370可以储存用户的用户确认信息。在一些实施例中,硬盘370可以设置于处理设备110中或可穿戴设备130中。
用户界面380可以实现计算设备300和用户之间的交互和信息交换。在一些实施例中,用户界面380可以用于将运动监控系统100生成的运动记录呈现给用户。在一些实施例中,用户界面380可以包括一个物理显示器,如带扬声器的显示器、LCD显示器、LED显示器、OLED显示器、电子墨水显示器(E-Ink)等。
输入/输出接口360可以用于输入或输出信号、数据或信息。在一些实施例中,输入/输出接口360可以使用户与运动监控系统100进行交互。
图4是根据本申请一些实施例所示的可穿戴设备的示例性结构图。为了进一步对可穿戴设备进行描述,将上衣服装作为示例性说明。如图4所示,可穿戴设备400可以包括上衣服装410。上衣服装410可以包括上衣服装基底4110、至少一个上衣处理模块4120、至少一个上衣反馈模块4130、至少一个上衣获取模块4140等。上衣服装基底4110可以是指穿戴于人体上身的衣物。在一些实施例中,上衣服装基底4110可以包括短袖T恤、长袖T恤、衬衫、外套等。至少一个上衣处理模块4120、至少一个上衣获取模块4140可以位于上衣服装基底4110上与人体不同部位贴合的区域。至少一个上衣反馈模块4130可以位于上衣服装基底4110的任意位置,至少一个上衣反馈模块4130可以被配置为反馈用户上身运动状态信息。示例性的反馈方式可以包括但不限于语音提示、文字提示、压力提示、电流刺激等。在一些实施例中,至少一个上衣获取模块4140可以包括但不限于姿态传感器、心电传感器、肌电传感器、温度传感器、湿度传感器、惯性传感器、酸碱传感器、声波换能器等中的一种或多种。上衣获取模块4140中的传感器可以根据待测量的信号不同而放置在用户身体的不同位置。例如,姿态传感器用于获取用户运动过程中的姿态信号时,姿态传感器可以放置于上衣服装基底4110中与人体躯干、双臂、关节对应的位置。又例如,肌电传感器用于获取用户运动过程中的肌电信号时, 肌电传感器可以位于用户待测量的肌肉附近。在一些实施例中,姿态传感器可以包括但不限于加速度三轴传感器、角速度三轴传感器、磁力传感器等,或其任意组合。例如,一个姿态传感器可以包含加速度三轴传感器、角速度三轴传感器。在一些实施例中,姿态传感器还可以包括应变式传感器。应变式传感器可以是指可以基于待测物受力变形产生的应变的传感器。在一些实施例中,应变式传感器可以包括但不限于应变式测力传感器、应变式压力传感器、应变式扭矩传感器、应变式位移传感器、应变式加速度传感器等中的一种或多种。例如,应变式传感器可以设置在用户的关节位置,通过测量应变式传感器中随着拉伸长度而变化的电阻的大小,可以获取用户关节处的弯曲角度和弯曲方向。需要注意的是,上衣服装410除了上述的上衣服装基底4110、上衣处理模块4120、上衣反馈模块4130、上衣获取模块4140之外,还可以包括其它模块,例如,供电模块、通讯模块、输入/输出模块等。上衣处理模块4120与图2中的处理模块220相类似、上衣获取模块4140与图2中的获取模块210相类似,关于上衣服装410中的各个模块的具体描述可以参考本申请图2中的相关描述,在此不做赘述。
图5是根据本申请一些实施例所示的运动监控方法的示例性流程图。如图5所示,流程500可以包括:
在步骤510中,获取用户运动时的动作信号。
在一些实施例中,该步骤510可以由获取模块210执行。动作信号是指用户运动时的人体参数信息。在一些实施例中,人体参数信息可以包括但不限于肌电信号、姿态信号、心电信号、温度信号、湿度信号、血氧浓度、呼吸频率等中的一种或多种。在一些实施例中,获取模块210中的肌电传感器可以采集用户在运动过程中的肌电信号。例如,当用户进行坐姿夹胸时,可穿戴设备中与人体胸肌、背阔肌等位置对应的肌电传感器可以采集用户相应肌肉位置的肌电信号。又例如,当用户进行深蹲动作时,可穿戴设备中与人体臀大肌、股四头肌等位置对应的肌电传感器可以采集用户相应肌肉位置的肌电信号。再例如,用户进行跑步运动时,可穿戴设备中与人体腓肠肌等位置对应的肌电传感器可以采集人体腓肠肌等位置的肌电信号。在一些实施例中,获取模块210中的姿态传感器可以采集用户运动时的姿态信号。例如,当用户进行杠铃卧推运动时,可穿戴设备中与人体肱三头肌等位置对应的姿态传感器可以采集用户肱三头肌等位置的姿态信号。又例如,当用户进行哑铃飞鸟动作时,设置在人体三角肌等位置处的姿态传感器可以采集用户三角肌等位置的姿态信号。在一些实施例中,获取模块210中的姿态传感器的数量可以为多个,多个姿态传感器可以获取用户运动时多个部位的姿态信号,多个部位姿 态信号可以反映人体不同部位之间的相对运动情况。例如,手臂处的姿态信号和躯干处的姿态信号可以反映手臂相对于躯干的运动情况。在一些实施例中,姿态信号与姿态传感器的类型相关联。例如,当姿态传感器为角速度三轴传感器时,获取的姿态信号为角速度信息。又例如,当姿态传感器为角速度三轴传感器和加速度三轴传感器,获取的姿态信号为角速度信息和加速度信息。再例如,姿态传感器为应变式传感器时,应变式传感器可以设置在用户的关节位置,通过测量应变式传感器中随着拉伸长度而变化的电阻的大小,获取的姿态信号可以为位移信息、应力等,通过这些姿态信号可以表征用户关节处的弯曲角度和弯曲方向。需要注意的是,能够用于体现用户身体发生相对运动的参数信息都可以为姿态信号对应的特征信息,根据特征信息的类型可以采用不同类型的姿态传感器进行获取。
在一些实施例中,所述动作信号可以包括用户身体特定部位的肌电信号以及该特定部位的姿态信号。肌电信号和姿态信号可以从不同角度反映出用户身体特定部位的运动状态。简单来说,用户身体特定部位的姿态信号可以反映该特定部位的动作类型、动作幅度、动作频率等。肌电信号可以反映出该特定部位在运动时的肌肉状态。在一些实施例中,通过相同身体部位的肌电信号和/或姿态信号,可以更好地评估该部位的动作是否规范。
在步骤520中,至少基于肌电信号对应的特征信息或姿态信号对应的特征信息对用户运动的动作进行监控。
在一些实施例中,该步骤可以由处理模块220和/或处理设备110执行。在一些实施例中,肌电信号对应的特征信息可以包括但不限于频率信息、幅值信息等中的一种或多种。姿态信号对应的特征信息是指用于表征用户身体发生相对运动的参数信息。在一些实施例中,姿态信号对应的特征信息可以包括但不限于角速度方向、角速度值、角速度的加速度值等中的一种或多种。在一些实施例中,姿态信号对应的特征信息还可以包括角度、位移信息(例如应变式传感器中的拉伸长度)、应力等。例如,姿态传感器为应变式传感器时,应变式传感器可以设置在用户的关节位置,通过测量应变式传感器中随着拉伸长度而变化的电阻的大小,获取的姿态信号可以为位移信息、应力等,通过这些姿态信号可以表征用户关节处的弯曲角度和弯曲方向。在一些实施例中,处理模块220和/或处理设备110可以提取肌电信号对应的特征信息(例如,频率信息、幅值信息)或姿态信号对应的特征信息(例如,角速度方向、角速度值、角速度的加速度值、角度、位移信息、应力等),并基于肌电信号对应的特征信息或姿态信号对应的特征信息对用 户运动的动作进行监控。这里对用户运动的动作进行监控包括对用户动作相关的信息进行监控。在一些实施例中,动作相关的信息可以包括用户动作类型、动作数量、动作质量(例如,用户动作是否符合标准)、动作时间等中的一个或多个。动作类型是指用户运动时采取的健身动作。在一些实施例中,动作类型可以包括但不限于坐姿夹胸、深蹲运动、硬拉运动、平板支撑、跑步、游泳等中的一种或多种。动作数量是指用户运动过程中执行动作的次数。例如,用户在运动过程中进行了10次坐姿夹胸,这里的10次为动作次数。动作质量是指用户执行的健身动作相对于标准健身动作的标准度。例如,当用户进行深蹲动作时,处理设备110可以基于特定肌肉位置(臀大肌、股四头肌等)的动作信号(肌电信号和姿态信号)对应的特征信息判断用户动作的动作类型,并基于标准深蹲动作的动作信号判断用户深蹲动作的动作质量。动作时间是指用户一个或多个动作类型对应的时间或运动过程的总时间。
如前所述,姿态信号对应的特征信息可以包括能够用于体现用户身体发生相对运动的参数信息。为了实现对用户运动时的动作进行监控,运动监控系统100需获取用户身体的不同部位之间的相对运动。如前,姿态信号可以是姿态传感器测得的姿态数据。姿态传感器可以分布于用户身体的多个不同部位。为了获取用户身体的多个不同部位之间的相对运动,运动监控系统100可以对用户身上的多个不同部位之间的姿态信号进行标定。
图6是根据本申请一些实施例所示的运动数据标定方法3000的示例性流程图。如前,运动数据标定系统180可以执行本申请提供的运动数据标定的方法3000。具体地,运动数据标定系统180可以在处理设备110上执行运动数据标定的方法3000,也可以在移动终端设备140上执行运动数据标定的方法3000。为了方便展示,我们将以运动数据标定系统180在处理设备110上执行运动数据标定的方法3000为例进行描述。具体地,处理设备110中的处理器320可以读取存储在其本地存储介质中的指令集,然后根据指令集的规定,执行本申请提供的运动数据标定的方法3000。运动数据标定的方法3000可以包括:
S3020:获取用户运动时的动作数据。
在一些实施例中,该步骤可以由处理模块220和/或处理设备110执行。动作数据是指用户在运动时的人体运动参数信息。在一些实施例中,动作数据可以包括用户身上的至少一个测量位置对应的至少一个姿态信号。姿态信号及其对应的特征信息(例如,角速度方向、角速度值、角速度加速度值、角度、位移信息、应力等)可以反映用户运动 的姿势。至少一个测量位置与至少一个姿态信号一一对应。测量位置可以是用户身体上的不同部位。至少一个姿态信号对应用户身上的至少一个测量位置在用户运动时的实际姿态。至少一个姿态信号中的每个姿态信号可以包括其对应的测量位置在原始坐标系下的三维姿态数据。原始坐标系可以是姿态信号所处的坐标系。当用户的姿态不发生变化时,原始坐标系的变化也可能导致姿态信号的变化。每个姿态信号可以包括一种或多种三维姿态数据。比如,三维角度数据、三维角速度数据、三维角加速度数据、三维速度数据、三维位移数据、三维应力数据,等等。
在一些实施例中,姿态信号可以由可穿戴设备130上的姿态传感器进行获取。如前所述,可穿戴设备130中的获取模块210的传感器单元可以包括姿态传感器。具体地,可穿戴设备130可以包括至少一个姿态传感器。至少一个姿态传感器可以位于用户身体的至少一个测量位置。姿态传感器可以采集用户身上与其对应的测量位置的姿态信号。可穿戴设备130上的姿态传感器可以分布在人体四肢部位(例如,手臂、腿部等)、人体的躯干部位(例如,胸部、腹部、背部、腰部等)和人体的头部等。姿态传感器可以实现人体的四肢部位、躯干部位等其它部位的姿态信号采集。姿态传感器可以根据所要获取的姿态信号放置在可穿戴设备130的不同位置,以测量人体不同的位置对应的姿态信号。在一些实施例中,姿态传感器还可以为具有姿态融合算法的姿态测量单元(AHRS)的传感器。姿态融合算法可以将具有三轴加速度传感器、三轴角速度传感器、三轴地磁传感器的九轴惯性测量单元(IMU)的数据融合为欧拉角或四元数,以获取姿态传感器所在用户身体部位的姿态信号。在一些实施例中,处理模块220和/或处理设备110可以基于姿态信号确定姿态对应的特征信息。在一些实施例中,姿态信号对应的特征信息可以包括但不限于角速度值、角速度方向、角速度的加速度值等。在一些实施例中,姿态传感器可以为应变传感器,应变传感器可以获取用户关节处的弯曲方向和弯曲角度,从而获取用户运动时的姿态信号。例如,应变传感器可以设置于用户的膝关节处,当用户运动时,用户的身体部位作用于应变传感器,基于应变传感器的电阻或长度变化情况可以计算出用户膝关节处的弯曲方向和弯曲角度,从而获取用户腿部的姿态信号。在一些实施例中,姿态传感器还可以包括光纤传感器,姿态信号可以由光纤传感器的光线弯曲后的方向变化来表征。在一些实施例中,姿态传感器还可以为磁通量传感器,姿态信号可以由磁通量的变换情况进行表征。需要注意的是,姿态传感器的类型不限于上述的传感器,还可以为其它传感器,能够获取用户姿态信号的传感器均在本申请的姿态传感器的范围内。
如前所述,每个姿态信号可以包括一种或多种三维姿态数据。姿态传感器也可以包括多种类型的传感器。在一些实施例中,姿态传感器可以包括加速度传感器、角速度传感器以及磁力传感器中的至少一种。
当姿态信号是由姿态传感器测量得到的数据时,原始坐标系可以是姿态传感器所在的坐标系。在一些实施例中,原始坐标系是指设置在人体上的姿态传感器对应的坐标系。当用户使用可穿戴设备130时,可穿戴设备130上的各姿态传感器分布于人体的不同部位,使得各姿态传感器在人体上的安装角度不同,而不同部位的姿态传感器分别以各自本体的坐标系作为原始坐标系,因此不同部位的姿态传感器具有不同的原始坐标系。在一些实施例中,各个姿态传感器获取的姿态信号可以是在其对应的原始坐标系下的表达。姿态传感器获取的姿态信号可以是预设固定坐标系在原始坐标系中的姿态信号。所述预设固定坐标系可以是大地坐标系,也可以是其他任意预设坐标系。运动数据标定系统180中可以预先存储有原始坐标系相对于所述预设固定坐标系的转换关系。
在一些实施例中,姿态信号可以是姿态传感器直接获取的信号,也可以是姿态传感器直接获取的信号经过常规滤波、整流、小波变换、毛刺处理等信号处理过程形成的姿态信号,或者是以上任意一个或多个处理流程的排列组合得到的信号。
在一些实施例中,姿态信号可以是由图像传感器测量的数据。图像传感器可以是能够获取深度信息的图像传感器,例如3D结构光相机、双目摄像机,等等。图像传感器可以安装在能够拍摄到用户运动的图像的任意位置。图像传感器的数量可以是一个,也可以是多个。当图像传感器的数量为多个时,多个图像传感器可以安装在多个不同位置。图像传感器可以获取用户运动时的深度图像。深度图像中可以包含用户身上的至少一个测量位置相对于图像传感器所在的坐标系的深度信息。当用户运动时,运动数据标定系统180可以基于多帧深度图像的变化,计算得到至少一个测量位置中的每个测量位置的姿态信号。如前所述,每个姿态信号可以包括一种或多种三维姿态数据。运动数据标定系统180可以计算得到不同的三维姿态数据。
当姿态信号是由图像传感器测量得到的数据时,原始坐标系可以是图像传感器自身所在的坐标系。在一些实施例中,图像传感器获取的姿态信号可以是在其对应的图像传感器自身的坐标系(原始坐标系)下的表达。图像传感器可以预先进行标定。即运动数据标定系统180中可以预先存储有图像传感器自身坐标系相对于前述的预设固定坐标系的转换关系。
为了方便展示,下面的描述中我们将以动作数据是由至少一个姿态传感器测量得到的数据为例进行描述。为了方便展示,我们将原始坐标系定义为o-xyz坐标系。其中,o为原始坐标系o-xyz的坐标圆点,x轴、y轴以及z轴分别为原始坐标系o-xyz的三个相互垂直的坐标轴。
如前所述，每个姿态信号可以是其对应的测量位置在原始坐标系o-xyz下的三维姿态数据。在一些实施例中，三维姿态数据可以是其所在坐标系中的三个相互垂直的坐标轴上的姿态数据。在一些实施例中，姿态数据可以包括角度数据和角速度数据。在一些实施例中，在原始坐标系o-xyz下的三维姿态数据可以包括三个相互垂直的坐标轴x轴、y轴以及z轴上的角度数据和角速度数据。为了方便展示，我们将每个姿态信号在原始坐标系o-xyz下的三维姿态数据分别标记为三维角度(Euler)数据E_sens和三维角速度(Gyro)数据G_sens。三维角度数据E_sens可以包括x轴上的角度数据E_sens_x、y轴上的角度数据E_sens_y以及z轴上的角度数据E_sens_z。三维角速度数据G_sens可以包括x轴上的角速度数据G_sens_x、y轴上的角速度数据G_sens_y以及z轴上的角速度数据G_sens_z。
S3040:构建目标坐标系。
在一些实施例中,该步骤可以由处理模块220和/或处理设备110执行。为了方便展示,我们将目标坐标系定义为O-XYZ。为了便于确定用户不同部位之间的相对运动,运动数据标定系统180可以将动作数据转化为同一个已知坐标系(例如,目标坐标系定义为O-XYZ)下的姿态数据。目标坐标系O-XYZ可以包括X轴、Y轴、Z轴三个互相垂直的坐标轴。
在一些实施例中,目标坐标系O-XYZ可以是任意经过标定的坐标系。在一些实施例中,目标坐标系O-XYZ可以是前述的预设固定坐标系。在一些实施例中,目标坐标系O-XYZ与前述的预设固定坐标系可以为不同的坐标系。运动数据标定系统180中可以预先存储有前述的预设固定坐标系与目标坐标系O-XYZ的转换关系。
运动数据标定系统180用于对用户运动时的动作数据进行标定,其测量对象为用户。因此,在一些实施例中,目标坐标系O-XYZ可以以人体站立时躯干的长度方向为Z轴。即Z轴为重力加速度所在的竖直方向的反方向。即Z轴为垂直于地面指向天空的坐标轴。X轴和Y轴构成的平面为垂直于Z轴的水平平面。在一些实施例中,X轴和Y轴可以是垂直于Z轴的水平面内的任意两个互相垂直的坐标轴。在一些实施例中,X轴可以是东西朝向的坐标,比如指向东边方向的坐标轴,Y轴可以是南北朝向的坐标,比如指向 北边方向的坐标轴。图7示出了根据本说明书实施例提供的目标坐标系O-XYZ的示意图。其中,X轴方向正好为用户001站立时的前方。
S3060:将所述每个姿态信号转换成在所述目标坐标系下的二维姿态数据。
图8是根据本申请一些实施例所示的转化为二维姿态数据的示例性流程图。图8对应步骤S3060。如图8所示,步骤S3060可以包括:
S3062:获取预先存储的目标坐标系O-XYZ与原始坐标系o-xyz之间的转换关系。
如前所述,每个姿态传感器获取的姿态信号可以是在其对应的原始坐标系下的表达。具体地,姿态传感器获取的姿态信号可以是预设固定坐标系在当前姿态信号对应的测量位置的原始坐标系中的姿态信号。运动数据标定系统180可以通过逆向转换的方式,基于每个姿态信号获取其对应的姿态传感器的原始坐标系o-xyz在预设固定坐标系中的表达。如前所述,运动数据标定系统180中可以预先存储有预设固定坐标系与目标坐标系O-XYZ的转换关系。运动数据标定系统180可以基于预设固定坐标系与目标坐标系O-XYZ的转换关系,计算确定每个原始坐标系o-xyz与目标坐标系O-XYZ的转换关系。运动数据标定系统180可以基于所述转换关系将原始坐标系o-xyz中的姿态信息转换为目标坐标系中O-XYZ的姿态信息。在一些实施例中,该转换关系可以表示为一个或多个旋转矩阵。运动数据标定系统180中可以预先存储有所述旋转矩阵。
S3064:基于目标坐标系O-XYZ与原始坐标系o-xyz之间的转换关系,将所述每个姿态信号转换为所述目标坐标系O-XYZ下的三维运动数据。
如前所述，每个姿态信号可以包括在其对应的原始坐标系o-xyz下的三维姿态数据E_sens_x、E_sens_y、E_sens_z、G_sens_x、G_sens_y以及G_sens_z。运动数据标定系统180可以基于目标坐标系O-XYZ与原始坐标系o-xyz之间的转换关系，确定每个姿态信号对应的测量位置所在的原始坐标系o-xyz在目标坐标系O-XYZ下的三维运动数据。在一些实施例中，所述三维运动数据至少包括X轴上的角速度数据G_global_X、Y轴上的角速度数据G_global_Y以及Z轴上的角速度数据G_global_Z。
S3066:将目标坐标系O-XYZ中的所述三维运动数据转换为目标坐标系O-XYZ中的二维姿态数据。
所述二维姿态数据为二维坐标系下的数据。目标坐标系O-XYZ中的二维坐标系可以包括用户001肢体摆动时的运动平面和用户001躯干转动时的运动平面。在一些实施例中，目标坐标系O-XYZ下的二维姿态数据可以包括水平姿态数据以及竖直姿态数据。水平姿态数据可以包括在垂直于Z轴的水平平面内运动时的水平角度数据E_global_Z和水平角速度数据G_global_Z。竖直姿态数据可以包括在垂直于水平平面的任意竖直平面内运动时的竖直角度数据E_global_XY和竖直角速度数据G_global_XY。所述竖直平面可以是垂直于水平平面的任意一个平面。水平角度数据E_global_Z可以是测量位置在目标坐标系O-XYZ中的水平面内的旋转角度。水平角速度数据G_global_Z可以是测量位置在目标坐标系O-XYZ中的水平面内的旋转角速度。竖直角度数据E_global_XY可以是测量位置在目标坐标系O-XYZ中的任意一个竖直平面内的旋转角度。竖直角速度数据G_global_XY可以是测量位置在目标坐标系O-XYZ中的任意一个竖直平面内的旋转角速度。
因为对大多数健身房内的健身动作来说,用户001的动作的主要部分都可以被分解成在两个平面上的运动。比如原地在水平平面上的运动以及任意一个竖直平面上的运动。用户001运动时所作的不同动作仅通过水平平面的运动和竖直平面的运动即可区分开。比如,当用户001在跑步机上跑步的时,用户001的跑步动作主要集中在绕着平行于水平平面的关节的旋转运动。此时,旋转运动发生在垂直于所述水平平面的竖直平面内,且所述竖直平面沿用户001的身体朝向方向延伸。在这个竖直平面上用户001的重心做上下的线性移动,用户的四肢随着重心各摆动。比如,用户001进行二头弯举动作时,二头弯举的动作只有竖直平面内的运动。再比如,用户001进行坐姿夹胸动作时,坐姿夹胸的动作只有水平平面内运动。
图9是根据本申请一些实施例所示的用户001运动时的坐标系图。图9中当用户001做二头弯举的时候,用户001的朝向基本不变。二头弯举的动作主要集中在前臂AO’同上臂O’B形成的竖直平面P上。由于是克服哑铃重力的动作,因此平面P大体是垂直于水平平面X-Y平面的。在竖直平面P内,上臂O’B的位姿基本不变,前臂AO’围绕肘部O’做往复摆动。摆动的矢量方向为竖直平面P的法线方向。除此之外,用户001的二头弯举动作在其他方向上的分量很小,可以忽略。基于上述分析,接下来,运动数据标定系统180便将所述三维运动数据转换成二维姿态数据。
如图8所示,步骤S3066可以包括:
S3066-2：将X轴上的角速度数据G_global_X和Y轴上的角速度数据G_global_Y通过矢量法则转换成竖直角速度数据G_global_XY。
竖直角速度数据G_global_XY可以表示为以下公式：
|G_global_XY| = √(G_global_X² + G_global_Y²)    公式(1)
G_global_XY = |G_global_XY| · a    公式(2)
其中，a为表示竖直平面内旋转方向的符号因子。
S3066-4：基于用户001运动时的起始位置和结束位置对应的时间，对竖直角速度数据G_global_XY进行时间积分，获取竖直角度数据E_global_XY。
其中，竖直角度数据E_global_XY可以表示为以下公式：
E_global_XY = ∫_startpos^endpos G_global_XY dt    公式(3)
其中startpos和endpos是一个动作的开始位置（start position）和结束位置（end position）对应的开始时间和结束时间。
S3066-6：将Z轴的角速度数据G_global_Z作为所述水平角速度数据G_global_Z。
S3066-8：基于用户001运动时的起始位置和结束位置对应的时间，对水平角速度数据G_global_Z进行时间积分，获取水平角度数据E_global_Z。
其中，水平角度数据E_global_Z可以表示为以下公式：
E_global_Z = ∫_startpos^endpos G_global_Z dt    公式(4)
其中startpos和endpos是一个动作的开始位置（start position）和结束位置（end position）对应的开始时间和结束时间。
在一些实施例中,所述运动数据标定方法3000还可以包括:
S3080:基于所述每个姿态信号对应的所述二维姿态数据,确定所述至少一个测量位置之间的相对运动。
运动数据标定系统180可以通过用户001身体至少一个测量位置对应的至少一个姿态信号对应的至少一个二维姿态数据,判断用户001身体不同运动部位之间的相对运动。例如,通过用户001手臂处的姿态传感器对应的特征信息和用户001躯干部位的姿态传感器对应的特征信息,可以判断用户001运动过程中手臂与躯干之间的相对运动。
综上所述,本申请提供的运动数据标定方法3000和系统180,能够将用户001运动时的动作数据从三个相互垂直的坐标轴中的三维姿态数据转化为目标坐标系下的二维数据,即在水平平面内的姿态数据和竖直平面内的姿态数据,从而将用户001运动时的动作分为在水平方向上的运动和在竖直方向上的运动,从而避免因用户001朝向不同所带来的数据差异。所述方法3000和系统180可以去除用户001朝向对运动数据的影响。因此,所述方法3000和系统180无需用户001做标定动作,即可对运动数据进行标定。
上文已对基本概念做了描述,显然,对于本领域技术人员来说,上述详细披露仅仅 作为示例,而并不构成对本申请的限定。虽然此处并没有明确说明,本领域技术人员可能会对本申请进行各种修改、改进和修正。该类修改、改进和修正在本申请中被建议,所以该类修改、改进、修正仍属于本申请示范实施例的精神和范围。
同时,本申请使用了特定词语来描述本申请的实施例。如“一个实施例”、“一实施例”、和/或“一些实施例”意指与本申请至少一个实施例相关的某一特征、结构或特点。因此,应强调并注意的是,本申请中在不同位置两次或多次提及的“一实施例”或“一个实施例”或“一个替代性实施例”并不一定是指同一实施例。此外,本申请的一个或多个实施例中的某些特征、结构或特点可以进行适当的组合。

Claims (11)

  1. A motion data calibration method, comprising:
    acquiring action data of a user during exercise, the action data including at least one attitude signal corresponding to at least one measurement position on the user's body, each attitude signal of the at least one attitude signal including three-dimensional attitude data of its corresponding measurement position in an original coordinate system;
    constructing a target coordinate system, the target coordinate system including three mutually perpendicular coordinate axes: an X axis, a Y axis, and a Z axis; and
    converting each attitude signal into two-dimensional attitude data in the target coordinate system.
  2. The motion data calibration method of claim 1, wherein each attitude signal includes data measured by an attitude sensor, and the original coordinate system includes a coordinate system in which the attitude sensor is located.
  3. The motion data calibration method of claim 2, wherein the attitude sensor includes at least one of an acceleration sensor, an angular velocity sensor, and a magnetic sensor.
  4. The motion data calibration method of claim 1, wherein each attitude signal includes data measured by an image sensor, and the original coordinate system includes a coordinate system in which the image sensor is located.
  5. The motion data calibration method of claim 1, wherein the three-dimensional attitude data includes angle data and angular velocity data on three mutually perpendicular coordinate axes.
  6. The motion data calibration method of claim 1, wherein converting each attitude signal into the two-dimensional attitude data in the target coordinate system includes:
    acquiring a pre-stored conversion relationship between the target coordinate system and the original coordinate system;
    converting, based on the conversion relationship, each attitude signal into three-dimensional motion data in the target coordinate system, the three-dimensional motion data including at least angular velocity data on the X axis, angular velocity data on the Y axis, and angular velocity data on the Z axis; and
    converting the three-dimensional motion data in the target coordinate system into the two-dimensional attitude data in the target coordinate system.
  7. The motion data calibration method of claim 6, wherein the Z axis of the target coordinate system points opposite to the vertical direction of gravitational acceleration.
  8. The motion data calibration method of claim 7, wherein the two-dimensional attitude data in the target coordinate system includes:
    horizontal attitude data, including horizontal angle data and horizontal angular velocity data for movement in a horizontal plane perpendicular to the Z axis; and
    vertical attitude data, including vertical angle data and vertical angular velocity data for movement in any vertical plane perpendicular to the horizontal plane.
  9. The motion data calibration method of claim 8, wherein converting the three-dimensional motion data in the target coordinate system into the two-dimensional attitude data in the target coordinate system includes:
    converting the angular velocity data on the X axis and the angular velocity data on the Y axis into the vertical angular velocity data by the vector law;
    time-integrating the vertical angular velocity data based on times corresponding to a start position and an end position of the user's movement to obtain the vertical angle data;
    taking the angular velocity data on the Z axis as the horizontal angular velocity data; and
    time-integrating the horizontal angular velocity data based on the times corresponding to the start position and the end position of the user's movement to obtain the horizontal angle data.
  10. The motion data calibration method of claim 1, further comprising:
    determining relative motion between the at least one measurement position based on the two-dimensional attitude data corresponding to each attitude signal.
  11. A motion data calibration system, comprising:
    at least one storage medium storing at least one instruction set for motion data calibration; and
    at least one processor communicatively connected to the at least one storage medium,
    wherein, when the motion data calibration system runs, the at least one processor reads the at least one instruction set and implements the motion data calibration method of any one of claims 1-10.
PCT/CN2021/134421 2021-11-30 2021-11-30 运动数据标定方法和系统 WO2023097447A1 (zh)

Priority Applications (5)

Application Number Priority Date Filing Date Title
EP21965891.1A EP4365850A1 (en) 2021-11-30 2021-11-30 Movement data calibration method and system
CN202180100382.3A CN117651847A (zh) 2021-11-30 2021-11-30 运动数据标定方法和系统
KR1020237044869A KR20240013224A (ko) 2021-11-30 2021-11-30 운동 데이터 캘리브레이션 방법과 시스템
PCT/CN2021/134421 WO2023097447A1 (zh) 2021-11-30 2021-11-30 运动数据标定方法和系统
US18/421,955 US20240168051A1 (en) 2021-11-30 2024-01-24 Motion data calibration method and system

Applications Claiming Priority (1)

Application Number Priority Date Filing Date Title
PCT/CN2021/134421 WO2023097447A1 (zh) 2021-11-30 2021-11-30 运动数据标定方法和系统

Related Child Applications (1)

Application Number Title Priority Date Filing Date
US18/421,955 Continuation US20240168051A1 (en) 2021-11-30 2024-01-24 Motion data calibration method and system

Publications (1)

Publication Number Publication Date
WO2023097447A1 true WO2023097447A1 (zh) 2023-06-08

Family

ID=86611295

Family Applications (1)

Application Number Title Priority Date Filing Date
PCT/CN2021/134421 WO2023097447A1 (zh) 2021-11-30 2021-11-30 运动数据标定方法和系统

Country Status (5)

Country Link
US (1) US20240168051A1 (zh)
EP (1) EP4365850A1 (zh)
KR (1) KR20240013224A (zh)
CN (1) CN117651847A (zh)
WO (1) WO2023097447A1 (zh)


Patent Citations (7)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
JP2013242790A (ja) * 2012-05-22 2013-12-05 Nippon Telegr & Teleph Corp <Ntt> 移動物体ポーズ算出装置、移動物体ポーズ算出方法、及びプログラム
US20150294597A1 (en) * 2012-10-23 2015-10-15 New York University Somatosensory feedback wearable object
CN108939512A (zh) * 2018-07-23 2018-12-07 大连理工大学 一种基于穿戴式传感器的游泳姿态测量方法
CN111353941A (zh) * 2018-12-21 2020-06-30 广州幻境科技有限公司 一种空间坐标转换方法
CN111767932A (zh) * 2019-04-02 2020-10-13 北京深蓝长盛科技有限公司 动作判定方法及装置、计算机设备及计算机可读存储介质
CN110517336A (zh) * 2019-08-28 2019-11-29 北京理工大学 一种基于主发力关节点的人体运动数据压缩方法及设备
CN110705496A (zh) * 2019-10-11 2020-01-17 成都乐动信息技术有限公司 一种基于九轴传感器的游泳姿势识别方法

Also Published As

Publication number Publication date
CN117651847A (zh) 2024-03-05
EP4365850A1 (en) 2024-05-08
KR20240013224A (ko) 2024-01-30
US20240168051A1 (en) 2024-05-23


Legal Events

Date Code Title Description
ENP Entry into the national phase: Ref document number 20237044869; Country of ref document: KR; Kind code of ref document: A
WWE WIPO information, entry into national phase: Ref document number 1020237044869; Country of ref document: KR
ENP Entry into the national phase: Ref document number 2024500107; Country of ref document: JP; Kind code of ref document: A
WWE WIPO information, entry into national phase: Ref document number 202180100382.3; Country of ref document: CN
WWE WIPO information, entry into national phase: Ref document number 2021965891; Country of ref document: EP
ENP Entry into the national phase: Ref document number 2021965891; Country of ref document: EP; Effective date: 20240130