WO2022193330A1 - Motion monitoring method and system - Google Patents

Motion monitoring method and system

Info

Publication number
WO2022193330A1
WO2022193330A1 (PCT/CN2021/081931)
Authority
WO
WIPO (PCT)
Prior art keywords
signal
action
user
coordinate system
information corresponding
Prior art date
Application number
PCT/CN2021/081931
Other languages
English (en)
French (fr)
Inventor
苏雷
周鑫
黎美琪
廖风云
Original Assignee
深圳市韶音科技有限公司
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by 深圳市韶音科技有限公司 filed Critical 深圳市韶音科技有限公司
Priority to KR1020237016055A priority Critical patent/KR20230086750A/ko
Priority to PCT/CN2021/081931 priority patent/WO2022193330A1/zh
Priority to JP2023528497A priority patent/JP2023549242A/ja
Priority to EP21930919.2A priority patent/EP4201323A4/en
Priority to CN202180070833.3A priority patent/CN116981401A/zh
Priority to CN202110516387.6A priority patent/CN115115751A/zh
Priority to CN202180064627.1A priority patent/CN116963807A/zh
Priority to PCT/CN2021/093302 priority patent/WO2022193425A1/zh
Priority to EP22770210.7A priority patent/EP4167129A4/en
Priority to CN202210103211.2A priority patent/CN115105100A/zh
Priority to KR1020237007354A priority patent/KR20230044297A/ko
Priority to PCT/CN2022/074379 priority patent/WO2022193851A1/zh
Priority to CN202280005956.3A priority patent/CN116261749A/zh
Priority to CN202210103219.9A priority patent/CN115105056A/zh
Priority to EP22743434.7A priority patent/EP4085834A4/en
Priority to JP2023514098A priority patent/JP2023540286A/ja
Priority to PCT/CN2022/074377 priority patent/WO2022193850A1/zh
Priority to KR1020227032041A priority patent/KR20220142495A/ko
Priority to JP2022560093A priority patent/JP7455995B2/ja
Priority to JP2023535549A priority patent/JP2023553625A/ja
Priority to KR1020237016947A priority patent/KR20230091961A/ko
Priority to PCT/CN2022/081718 priority patent/WO2022194281A1/zh
Priority to EP22770633.0A priority patent/EP4202667A1/en
Priority to TW111110179A priority patent/TWI837620B/zh
Priority to CN202280006986.6A priority patent/CN117157622A/zh
Priority to US17/815,567 priority patent/US20220365600A1/en
Publication of WO2022193330A1 publication Critical patent/WO2022193330A1/zh
Priority to US18/155,703 priority patent/US20230154607A1/en
Priority to US18/182,373 priority patent/US20230210402A1/en
Priority to US18/183,923 priority patent/US20230233103A1/en

Classifications

    • A61B 5/11 Measuring movement of the entire body or parts thereof, e.g. head or hand tremor, mobility of a limb
    • A61B 5/1118 Determining activity level
    • G06F 3/011 Arrangements for interaction with the human body, e.g. for user immersion in virtual reality
    • A61B 5/0022 Monitoring a patient using a global network, e.g. telephone networks, internet
    • A61B 5/02055 Simultaneously evaluating both cardiovascular condition and temperature
    • A61B 5/02438 Detecting, measuring or recording pulse rate or heart rate with portable devices, e.g. worn by the patient
    • A61B 5/08 Detecting, measuring or recording devices for evaluating the respiratory organs
    • A61B 5/0816 Measuring devices for examining respiratory frequency
    • A61B 5/1116 Determining posture transitions
    • A61B 5/1121 Determining geometric values, e.g. centre of rotation or angular range of movement
    • A61B 5/1123 Discriminating type of movement, e.g. walking or running
    • A61B 5/14542 Measuring characteristics of blood in vivo for measuring blood gases
    • A61B 5/14551 Measuring blood gases using optical sensors, e.g. spectral photometrical oximeters
    • A61B 5/24 Detecting, measuring or recording bioelectric or biomagnetic signals of the body or parts thereof
    • A61B 5/296 Bioelectric electrodes specially adapted for electromyography [EMG]
    • A61B 5/318 Heart-related electrical modalities, e.g. electrocardiography [ECG]
    • A61B 5/389 Electromyography [EMG]
    • A61B 5/397 Analysis of electromyograms
    • A61B 5/6801 Detecting, measuring or recording means specially adapted to be attached to or worn on the body surface
    • A61B 5/6802 Sensor mounted on worn items
    • A61B 5/6803 Head-worn items, e.g. helmets, masks, headphones or goggles
    • A61B 5/6804 Garments; Clothes
    • A61B 5/7207 Signal processing for noise prevention, reduction or removal of noise induced by motion artifacts
    • A61B 5/7225 Details of analog processing, e.g. isolation amplifier, gain or sensitivity adjustment, filtering, baseline or drift compensation
    • A61B 5/7246 Details of waveform analysis using correlation, e.g. template matching or determination of similarity
    • A61B 5/725 Details of waveform analysis using specific filters therefor, e.g. Kalman or adaptive filters
    • A61B 5/7253 Details of waveform analysis characterised by using transforms
    • A61B 5/7275 Determining trends in physiological measurement data; Predicting development of a medical condition based on physiological measurements, e.g. determining a risk factor
    • A61B 5/7278 Artificial waveform generation or derivation, e.g. synthesising signals from measured signals
    • A61B 5/7455 Notification to user characterised by tactile indication, e.g. vibration or electrical stimulation
    • A61B 5/746 Alarms related to a physiological condition, e.g. details of setting alarm thresholds or avoiding false alarms
    • A63B 71/06 Indicating or scoring devices for games or players, or for other sports activities
    • G06F 1/163 Wearable computers, e.g. on a belt
    • G06F 1/1684 Constructional details or arrangements related to integrated I/O peripherals
    • G06F 18/2415 Classification techniques based on parametric or probabilistic models, e.g. based on likelihood ratio or false acceptance rate versus a false rejection rate
    • G06F 3/015 Input arrangements based on nervous system activity detection, e.g. brain waves [EEG] detection, electromyograms [EMG] detection, electrodermal response detection
    • G06F 3/017 Gesture based interaction, e.g. based on a set of recognized hand gestures
    • G16H 40/67 ICT specially adapted for the operation of medical equipment or devices for remote operation
    • G16H 50/20 ICT specially adapted for computer-aided diagnosis, e.g. based on medical expert systems
    • A61B 2505/09 Rehabilitation or training
    • A61B 2560/0223 Operational features of calibration, e.g. protocols for calibrating sensors
    • A61B 2562/0219 Inertial sensors, e.g. accelerometers, gyroscopes, tilt switches
    • A61B 5/0205 Simultaneously evaluating both cardiovascular conditions and different types of body conditions, e.g. heart and respiratory condition
    • A61B 5/113 Measuring movement of the entire body or parts thereof occurring during breathing
    • A61B 5/1455 Measuring characteristics of blood in vivo using optical sensors, e.g. spectral photometrical oximeters
    • A61B 5/7257 Details of waveform analysis characterised by using Fourier transforms
    • A61B 5/726 Details of waveform analysis characterised by using Wavelet transforms
    • A61B 5/7405 Notification to user or communication with user using sound
    • G06F 17/18 Complex mathematical operations for evaluating statistical data, e.g. average values, frequency distributions, probability functions, regression analysis
    • G06F 2218/08 Feature extraction (aspects of pattern recognition specially adapted for signal processing)

Definitions

  • the present application relates to the technical field of wearable devices, and in particular, to a motion monitoring method and system.
  • existing exercise monitoring equipment mainly monitors physiological parameter information (eg, heart rate, body temperature, stride frequency, blood oxygen, etc.) during the user's exercise, but cannot accurately monitor and provide feedback on the user's movements.
  • the process of monitoring and providing feedback on user actions often requires the participation of professionals.
  • users generally can only continuously correct their fitness movements under the guidance of a fitness coach.
  • One aspect of the present application provides a motion monitoring method, including: acquiring an action signal when a user is exercising, wherein the action signal at least includes an electromyographic signal or a posture signal; and monitoring the movement of the user based at least on feature information corresponding to the electromyographic signal or feature information corresponding to the posture signal.
  • in some embodiments, monitoring the movement of the user based at least on the feature information corresponding to the EMG signal or the feature information corresponding to the gesture signal includes: segmenting the action signal based on the feature information corresponding to the EMG signal or the feature information corresponding to the gesture signal; and monitoring the movement of the user based on at least one segment of the action signal.
  • the characteristic information corresponding to the electromyographic signal at least includes frequency information or amplitude information, and the characteristic information corresponding to the attitude signal at least includes one of angular velocity direction, angular velocity value, acceleration of angular velocity, angle, displacement information, and stress.
  • segmenting the action signal based on the feature information corresponding to the EMG signal or the feature information corresponding to the gesture signal includes: determining, based on a time domain window of the EMG signal or the gesture signal, at least one target feature point from the time domain window according to a preset condition; and segmenting the action signal based on the at least one target feature point.
  • the at least one target feature point includes one of an action start point, an action middle point, and an action end point.
  • the preset conditions include one or more of the following: the direction of the angular velocity corresponding to the attitude signal changes; the angular velocity corresponding to the attitude signal is greater than or equal to an angular velocity threshold; the change of the angular velocity value corresponding to the attitude signal is an extreme value; the angle corresponding to the attitude signal reaches an angle threshold; and the amplitude information corresponding to the myoelectric signal is greater than or equal to a myoelectric threshold.
  • the preset condition further includes that the acceleration of the angular velocity corresponding to the attitude signal continues to be greater than or equal to the acceleration threshold of the angular velocity within a first specific time range.
  • the preset condition further includes that the amplitude corresponding to the electromyography signal is continuously greater than the electromyography threshold within a second specific time range.
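  • As a concrete illustration of the preset conditions above, the following is a minimal sketch (not the patented implementation) of locating an action start point from an angular velocity trace and an EMG amplitude envelope; the threshold values, hold length, and function name are illustrative assumptions:

```python
import numpy as np

def find_action_start(angular_velocity, emg_amplitude,
                      w_thresh=0.5, emg_thresh=0.2, hold=10):
    """Return the index of a candidate action start point, or None.

    A sample qualifies when the angular velocity value reaches the
    angular velocity threshold and the EMG amplitude stays above the
    EMG threshold for `hold` consecutive samples, mirroring the
    "greater than or equal to a threshold within a specific time
    range" conditions described above. All numbers are placeholders.
    """
    for i in range(len(angular_velocity) - hold):
        if angular_velocity[i] >= w_thresh and \
                np.all(emg_amplitude[i:i + hold] > emg_thresh):
            return i
    return None
```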
  • in some embodiments, monitoring the movement of the user based at least on the feature information corresponding to the myoelectric signal or the feature information corresponding to the gesture signal includes: preprocessing the myoelectric signal in the frequency domain or the time domain; obtaining feature information corresponding to the myoelectric signal based on the preprocessed myoelectric signal; and monitoring the user's movement according to the feature information corresponding to the myoelectric signal or the feature information corresponding to the posture signal.
  • the preprocessing of the EMG signal in the frequency domain or the time domain includes: filtering the EMG signal in the frequency domain to select components of the EMG signal within a specific frequency range.
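  • For instance, such frequency-domain selection is commonly done with a band-pass filter; the sketch below keeps a typical surface-EMG band, where the 20-450 Hz passband and the 1000 Hz sampling rate are assumptions rather than values taken from this application:

```python
from scipy.signal import butter, sosfiltfilt

def bandpass_emg(emg, fs=1000.0, low=20.0, high=450.0, order=4):
    """Select components of the EMG signal within [low, high] Hz."""
    sos = butter(order, [low, high], btype="bandpass", fs=fs, output="sos")
    return sosfiltfilt(sos, emg)  # zero-phase, so no time shift is introduced
```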
  • the preprocessing of the EMG signal in the frequency domain or the time domain includes performing signal correction processing on the EMG signal in the time domain.
  • performing the signal correction processing on the EMG signal in the time domain includes: determining a singular point in the EMG signal, where the singular point corresponds to a sudden change in the EMG signal; and performing signal correction processing on the singular point of the EMG signal.
  • performing the signal correction processing on the singular point of the EMG signal includes removing the singular point or correcting it according to the signals around the singular point.
  • in some embodiments, the singular point includes a spur signal, and determining the singular point in the myoelectric signal includes: selecting, based on a time domain window of the myoelectric signal, different time windows from within the time domain window, wherein the different time windows respectively cover different time ranges; and determining the spur signal based on the feature information corresponding to the myoelectric signals within the different time windows.
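  • A minimal sketch of such a windowed spur check, assuming (as one plausible reading) that a spur is flagged when a short window's mean amplitude far exceeds that of a longer surrounding window; the window lengths, the 3x ratio, and the linear-interpolation correction are illustrative choices:

```python
import numpy as np

def remove_spurs(emg, short_win=5, long_win=100, ratio=3.0):
    """Flag spur samples via two nested time windows and interpolate over them."""
    emg = np.asarray(emg, dtype=float).copy()
    amp = np.abs(emg)
    i = long_win
    while i < len(emg) - long_win - short_win:
        short_mean = amp[i:i + short_win].mean()
        long_mean = amp[i - long_win:i + long_win].mean() + 1e-12
        if short_mean / long_mean > ratio:
            # correct the singular point from the surrounding signal
            emg[i:i + short_win] = np.linspace(
                emg[i - 1], emg[i + short_win], short_win + 2)[1:-1]
        i += 1
    return emg
```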
  • in some embodiments, the method further includes determining feature information corresponding to the attitude signal based on the attitude signal, wherein the attitude signal includes coordinate information in at least one original coordinate system. Determining the feature information corresponding to the attitude signal includes: acquiring a target coordinate system and a conversion relationship between the target coordinate system and the at least one original coordinate system; converting the coordinate information in the at least one original coordinate system into coordinate information in the target coordinate system based on the conversion relationship; and determining the feature information corresponding to the attitude signal based on the coordinate information in the target coordinate system.
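  • If the conversion relationship is represented as a rotation matrix (one possible representation; the application does not fix one), converting coordinate information into the target coordinate system reduces to a matrix product, as in this sketch:

```python
import numpy as np

def to_target_frame(coords_original, rot_orig_to_target):
    """Convert Nx3 coordinate samples from an original coordinate system
    into the target coordinate system via a 3x3 rotation matrix."""
    return np.asarray(coords_original) @ np.asarray(rot_orig_to_target).T
```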
  • in some embodiments, the gesture signal includes coordinate information generated by at least two sensors, the at least two sensors being located at different moving parts of the user and corresponding to different original coordinate systems. Determining the feature information corresponding to the attitude signal includes: determining the feature information corresponding to the at least two sensors based on the conversion relationships between the different original coordinate systems and the target coordinate system; and determining the relative motion between the different moving parts of the user based on the feature information corresponding to each sensor.
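  • Once both sensors' coordinate information is expressed in the shared target coordinate system, the relative motion between two moving parts can be quantified directly; a sketch computing the angle between two limb direction vectors (the vectors are hypothetical inputs, eg, forearm and upper-arm directions):

```python
import numpy as np

def joint_angle(vec_part_a, vec_part_b):
    """Angle in radians between two limb direction vectors in the target frame."""
    a, b = np.asarray(vec_part_a, float), np.asarray(vec_part_b, float)
    cos = np.dot(a, b) / (np.linalg.norm(a) * np.linalg.norm(b))
    return np.arccos(np.clip(cos, -1.0, 1.0))  # clip guards rounding error
```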
  • in some embodiments, the conversion relationship between the at least one original coordinate system and the target coordinate system is obtained through a calibration process. The calibration process includes: constructing a specific coordinate system, the specific coordinate system being related to the orientation of the user during the calibration process; acquiring first coordinate information of the at least one original coordinate system when the user is in a first posture; acquiring second coordinate information of the at least one original coordinate system when the user is in a second posture; and determining the conversion relationship between the at least one original coordinate system and the specific coordinate system according to the first coordinate information, the second coordinate information, and the specific coordinate system.
  • the calibration process further includes: acquiring a conversion relationship between the specific coordinate system and the target coordinate system; and determining the conversion relationship between the at least one original coordinate system and the target coordinate system according to the conversion relationship between the at least one original coordinate system and the specific coordinate system and the conversion relationship between the specific coordinate system and the target coordinate system.
  • the target coordinate system changes as the user's orientation changes.
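  • The two calibration steps compose naturally: the original-to-target relationship is the product of the original-to-specific and specific-to-target relationships. A sketch, again assuming rotation-matrix representations:

```python
import numpy as np

def compose_orig_to_target(rot_orig_to_specific, rot_specific_to_target):
    """Chain the calibrated transforms into the original-to-target relation."""
    return np.asarray(rot_specific_to_target) @ np.asarray(rot_orig_to_specific)
```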
  • Another aspect of the present application provides a training method for an action recognition model, including: acquiring sample information, where the sample information includes an action signal when a user is exercising, and the action signal at least includes feature information corresponding to an electromyographic signal and feature information corresponding to a posture signal; and training the action recognition model based on the sample information.
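  • As a hedged illustration of such training (the application does not prescribe a model family), one could fit an off-the-shelf classifier on per-action feature vectors; the feature layout and the random-forest choice below are assumptions:

```python
import numpy as np
from sklearn.ensemble import RandomForestClassifier

def train_action_model(feature_vectors, action_labels):
    """Fit a classifier mapping per-action feature vectors to action types.

    feature_vectors: (n_samples, n_features) array combining EMG features
    (eg, amplitude, frequency) and posture features (eg, angles) per action.
    """
    model = RandomForestClassifier(n_estimators=100, random_state=0)
    model.fit(np.asarray(feature_vectors), np.asarray(action_labels))
    return model
```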
  • Another aspect of the present application provides a motion monitoring and feedback method, including: acquiring an action signal when a user is exercising, wherein the action signal at least includes a myoelectric signal and a posture signal; monitoring the user's action, through an action recognition model, based on the feature information corresponding to the myoelectric signal and the feature information corresponding to the posture signal; and performing action feedback based on the output result of the action recognition model.
  • the action recognition model includes a trained machine learning model or a preset model.
  • the motion feedback includes at least one of sending out prompt information, stimulating the user's moving part, and outputting a motion record of the user's movement.
  • FIG. 1 is a schematic diagram of an application scenario of a motion monitoring system according to some embodiments of the present application.
  • FIG. 2 is a schematic diagram of exemplary hardware and/or software of a wearable device according to some embodiments of the present application
  • FIG. 3 is a schematic diagram of exemplary hardware and/or software of a computing device according to some embodiments of the present application.
  • FIG. 4 is an exemplary structural diagram of a wearable device according to some embodiments of the present application.
  • FIG. 5 is an exemplary flowchart of a motion monitoring method according to some embodiments of the present application.
  • FIG. 6 is an exemplary flowchart of monitoring a user's motion according to some embodiments of the present application.
  • FIG. 7 is an exemplary flowchart of motion signal segmentation according to some embodiments of the present application.
  • FIG. 8 is an exemplary normalized result graph of motion signal segmentation according to some embodiments of the present application.
  • FIG. 9 is an exemplary flowchart of EMG signal preprocessing according to some embodiments of the present application.
  • FIG. 10 is an exemplary flowchart of removing a glitch signal according to some embodiments of the present application.
  • FIG. 11 is an exemplary flowchart of determining feature information corresponding to a gesture signal according to some embodiments of the present application.
  • FIG. 12 is an exemplary flowchart of determining relative motion between different moving parts of a user according to some embodiments of the present application.
  • FIG. 13 is an exemplary flowchart of determining the conversion relationship between the original coordinate system and a specific coordinate system according to some embodiments of the present application;
  • FIG. 14 is an exemplary flowchart of determining the transformation relationship between the original coordinate system and the target coordinate system according to some embodiments of the present application.
  • FIG. 15A is an exemplary vector coordinate diagram of Euler angle data in the original coordinate system at the position of the human forearm according to some embodiments of the present application;
  • FIG. 15B is an exemplary vector coordinate diagram of Euler angle data in another original coordinate system at the position of the human forearm according to some embodiments of the present application;
  • FIG. 16A is an exemplary vector coordinate diagram of Euler angle data in a target coordinate system at a position of a human forearm according to some embodiments of the present application;
  • FIG. 16B is an exemplary vector coordinate diagram of Euler angle data in a target coordinate system at another position of the human forearm according to some embodiments of the present application;
  • FIG. 17 is an exemplary vector coordinate diagram of Euler angle data of multiple sensors in a target coordinate system according to some embodiments of the present application.
  • FIG. 18A is an exemplary result graph of raw angular velocity according to some embodiments of the present application.
  • FIG. 18B is an exemplary result graph of filtered angular velocity according to some embodiments of the present application.
  • FIG. 19 is an exemplary flowchart of a motion monitoring and feedback method according to some embodiments of the present application.
  • FIG. 20 is an exemplary flowchart of the application of model training according to some embodiments of the present application.
  • the terms "system", "device", "unit" and/or "module" used in this specification are one means for distinguishing different components, elements, parts, portions, or assemblies at different levels.
  • This specification provides a motion monitoring system, which can acquire motion signals of a user when exercising, wherein the motion signals at least include EMG signals, posture signals, ECG signals, respiratory rate signals, and the like.
  • the system can monitor the movement of the user based on at least the feature information corresponding to the electromyographic signal or the feature information corresponding to the gesture signal.
  • For example, the user's action type, action quantity, action quality, and action time, as well as the physiological parameter information when the user performs the action, can be determined from the frequency information and amplitude information corresponding to the EMG signal, and from the angular velocity, angular velocity direction, angular velocity value, angle, displacement information, and stress corresponding to the posture signal.
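  • A sketch of deriving such feature information from one segmented EMG action (RMS amplitude plus mean frequency from an FFT); the sampling rate is an assumed parameter:

```python
import numpy as np

def emg_features(emg_segment, fs=1000.0):
    """Return (rms_amplitude, mean_frequency) for one EMG action segment."""
    x = np.asarray(emg_segment, dtype=float)
    rms = np.sqrt(np.mean(x ** 2))                      # amplitude information
    spectrum = np.abs(np.fft.rfft(x)) ** 2              # power spectrum
    freqs = np.fft.rfftfreq(len(x), d=1.0 / fs)
    mean_freq = np.sum(freqs * spectrum) / (np.sum(spectrum) + 1e-12)
    return rms, mean_freq                               # frequency information
```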
  • the exercise monitoring system may also generate feedback on the user's fitness action according to the analysis result of the user's fitness action, so as to guide the user's fitness.
  • the motion monitoring system may issue prompt information (eg, voice prompt, vibration prompt, electrical stimulation, etc.) to the user.
  • the motion monitoring system can be applied to wearable devices (for example, clothing, wristbands, helmets), medical detection equipment (for example, an electromyography tester), fitness equipment, and the like. By accurately monitoring and providing feedback on the user's movements without the participation of professionals, the system can improve the user's fitness efficiency while reducing the user's fitness cost.
  • FIG. 1 is a schematic diagram of an application scenario of a motion monitoring system according to some embodiments of the present application.
  • the motion monitoring system 100 may include a processing device 110 , a network 120 , a wearable device 130 and a mobile terminal device 140 .
  • the motion monitoring system 100 can acquire motion signals (eg, EMG signals, posture signals, ECG signals, respiratory rate signals, etc.) used to characterize the user's motion, and monitor and provide feedback on the user's movement during exercise according to the user's motion signal.
  • the exercise monitoring system 100 can monitor and provide feedback on the user's movements during exercise.
  • the wearable device 130 may acquire the user's motion signal.
  • the processing device 110 or the mobile terminal device 140 may receive and analyze the user's action signal to determine whether the user's fitness action is standardized, so as to monitor the user's action.
  • monitoring the user's actions may include determining the action type, action quantity, action quality, action time, or physiological parameter information when the user performs the action, and the like.
  • the motion monitoring system 100 may generate feedback on the user's fitness action according to the analysis result of the user's fitness action, so as to guide the user's fitness.
  • the motion monitoring system 100 can monitor and provide feedback on the user's actions when running. For example, when the user wears the wearable device 130 while running, the motion monitoring system 100 can monitor whether the user's running action is standardized, whether the running time meets the health standard, and the like. When the user's running time is too long or the running action is incorrect, the fitness device can feed back the exercise state to the user to prompt the user to adjust the running action or running time.
  • processing device 110 may be used to process information and/or data related to user movement.
  • the processing device 110 may receive the user's action signal (eg, an EMG signal, a posture signal, an ECG signal, a breathing frequency signal, etc.), and further extract feature information corresponding to the action signal (eg, the feature information corresponding to the EMG signal in the action signal, or the feature information corresponding to the attitude signal).
  • the processing device 110 may perform specific signal processing, such as signal segmentation, signal preprocessing (eg, signal correction processing, filtering processing, etc.), on the EMG signal or gesture signal collected by the wearable device 130 .
  • the processing device 110 may also determine whether the user's action is correct based on the user's action signal. For example, the processing device 110 may determine whether the user's action is correct based on characteristic information (eg, amplitude information, frequency information, etc.) corresponding to the electromyographic signal. For another example, the processing device 110 may determine whether the user's action is correct based on feature information corresponding to the gesture signal (eg, angular velocity, direction of angular velocity, acceleration of angular velocity, angle, displacement information, stress, etc.). For another example, the processing device 110 may determine whether the user's action is correct based on both the feature information corresponding to the EMG signal and the feature information corresponding to the gesture signal.
  • the processing device 110 may also determine whether the physiological parameter information of the user when exercising meets the health standard. In some embodiments, the processing device 110 may also issue corresponding instructions to feed back the user's movement situation. For example, when the user is running and the motion monitoring system 100 detects that the user's running time is too long, the processing device 110 may issue an instruction to the mobile terminal device 140 to prompt the user to adjust the running time.
  • the characteristic information corresponding to the attitude signal is not limited to the above-mentioned angular velocity, angular velocity direction, angular velocity acceleration, angle, displacement information, stress, etc.; any other parameter information that can reflect the relative movement of the user's body can also be feature information corresponding to the attitude signal.
  • For example, when the posture sensor is a strain gauge sensor, the bending angle and bending direction of the user's joint can be obtained by measuring the resistance of the strain gauge sensor, which changes with its stretched length.
  • processing device 110 may be local or remote.
  • the processing device 110 may access information and/or data stored in the wearable device 130 and/or the mobile terminal device 140 through the network 120 .
  • the processing device 110 may connect directly with the wearable device 130 and/or the mobile terminal device 140 to access information and/or data stored therein.
  • the processing device 110 may be located in the wearable device 130 and realize information interaction with the mobile terminal device 140 through the network 120 .
  • the processing device 110 may be located in the mobile terminal device 140 and realize information interaction with the wearable device 130 through a network.
  • processing device 110 may execute on a cloud platform.
  • the cloud platform may include one or any combination of a private cloud, a public cloud, a hybrid cloud, a community cloud, a decentralized cloud, an internal cloud, and the like.
  • processing device 110 may process data and/or information related to motion monitoring to perform one or more of the functions described herein.
  • the processing device may acquire motion signals collected by the wearable device 130 when the user moves.
  • the processing device may send control instructions to the wearable device 130 or the mobile terminal device 140 .
  • the control instructions can control the switch states of the wearable device 130 and its sensors, and can also control the mobile terminal device 140 to send out prompt information.
  • processing device 110 may include one or more sub-processing devices (eg, a single-core processing device or a multi-core processing device).
  • the processing device 110 may include a central processing unit (CPU), an application-specific integrated circuit (ASIC), an application-specific instruction processor (ASIP), a graphics processing unit (GPU), a physics processing unit (PPU), a digital signal processor (DSP), a field programmable gate array (FPGA), a programmable logic device (PLD), a controller, a microcontroller unit, a reduced instruction set computer (RISC), a microprocessor, etc., or any combination of the above.
  • Network 120 may facilitate the exchange of data and/or information in motion monitoring system 100 .
  • one or more components in the motion monitoring system 100 (eg, the processing device 110, the wearable device 130, the mobile terminal device 140) may exchange data and/or information with the other components through the network 120.
  • the motion signal collected by the wearable device 130 may be transmitted to the processing device 110 through the network 120 .
  • the confirmation result of the action signal in the processing device 110 may be transmitted to the mobile terminal device 140 through the network 120 .
  • network 120 may be any type of wired or wireless network.
  • the network 120 may include a cable network, a wired network, a fiber optic network, a telecommunications network, an internal network, the Internet, a local area network (LAN), a wide area network (WAN), a wireless local area network (WLAN), a metropolitan area network (MAN), a public switched telephone network (PSTN), a Bluetooth network, a ZigBee network, a near field communication (NFC) network, etc., or any combination of the above.
  • network 120 may include one or more network entry and exit points.
  • network 120 may contain wired or wireless network entry and exit points, such as base stations and/or Internet exchange points 120-1, 120-2, . . . , through which one or more components of motion monitoring system 100 may connect to network 120 to exchange data and/or information.
  • the wearable device 130 refers to a garment or device with a wearable function.
  • the wearable device 130 may include, but is not limited to, an upper garment assembly 130-1, a pants assembly 130-2, a wrist support assembly 130-3, shoes 130-4, and the like.
  • wearable device 130 may include multiple sensors. The sensors can acquire various motion signals (eg, myoelectric signals, posture signals, temperature information, heartbeat frequency, electrocardiogram signals, etc.) when the user is exercising.
  • the sensors may include, but are not limited to, one or more of an electromyographic sensor, a posture sensor, a temperature sensor, a humidity sensor, an electrocardiogram sensor, a blood oxygen saturation sensor, a Hall sensor, a skin electrical sensor, a rotation sensor, and the like.
  • an EMG sensor may be provided at the position of a human body muscle (eg, biceps brachii, triceps, latissimus dorsi, trapezius, etc.) in the upper garment device 130-1, and the EMG sensor may fit the user's skin and collect EMG signals when the user is exercising.
  • an electrocardiogram sensor may be provided near the left pectoral muscle of the human body in the upper garment device 130-1, and the electrocardiogram sensor may collect the electrocardiographic signal of the user.
  • a posture sensor may be provided at the position of a human body muscle (eg, gluteus maximus, vastus lateralis, vastus medialis, gastrocnemius, etc.) in the pants device 130-2, and the posture sensor may collect a user's posture signal.
  • the wearable device 130 may also provide feedback on the user's actions. For example, when the action of a certain part of the body does not meet the standard while the user is exercising, the myoelectric sensor corresponding to that part can generate a stimulation signal (eg, a current stimulation or a striking signal) to remind the user.
  • the wearable device 130 is not limited to the upper garment device 130-1, the pants device 130-2, the wristband device 130-3, and the shoe device 130-4 shown in FIG. 1; it may also include other equipment for motion monitoring, such as helmet devices and knee pads, which are not limited here, and any equipment that can use the motion monitoring method contained in this specification is within the scope of protection of this application.
  • the mobile terminal device 140 may acquire information or data in the motion monitoring system 100 .
  • the mobile terminal device 140 may receive the motion data processed by the processing device 110, and feed back motion records and the like based on the processed motion data.
  • Exemplary feedback methods may include, but are not limited to, voice prompts, image prompts, video presentations, text prompts, and the like.
  • the user may obtain the action record during the movement of the user through the mobile terminal device 140 .
  • the mobile terminal device 140 can be connected with the wearable device 130 through the network 120 (eg, a wired connection or a wireless connection), and the user can obtain the action record during the user's exercise through the mobile terminal device 140.
  • the mobile terminal device 140 may include one or any combination of a mobile device 140-1, a tablet computer 140-2, a notebook computer 140-3, and the like.
  • the mobile device 140-1 may include a cell phone, a smart home device, a smart mobile device, a virtual reality device, an augmented reality device, etc., or any combination thereof.
  • the smart home devices may include control devices for smart appliances, smart monitoring devices, smart TVs, smart cameras, etc., or any combination thereof.
  • the smart mobile device may include a smart phone, a personal digital assistant (PDA), a gaming device, a navigation device, a POS device, etc., or any combination thereof.
  • the virtual reality device and/or the augmented reality device may include a virtual reality headset, virtual reality glasses, virtual reality eyewear, augmented reality helmet, augmented reality glasses, augmented reality eyewear, etc., or any combination thereof.
  • the motion monitoring system 100 may also include a database.
  • the database may store data (eg, initially set threshold conditions, etc.) and/or instructions (eg, feedback instructions).
  • the database may store data obtained from the wearable device 130 and/or the mobile terminal device 140 .
  • a database may store information and/or instructions for execution or use by processing device 110 to perform the example methods described in this application.
  • the database may include mass storage, removable storage, volatile read-write memory (eg, random access memory RAM), read only memory (ROM), etc., or any combination thereof.
  • the database may be implemented on a cloud platform.
  • the cloud platform may include a private cloud, public cloud, hybrid cloud, community cloud, decentralized cloud, internal cloud, etc., or any combination thereof.
  • the database may be connected to network 120 to communicate with one or more components of motion monitoring system 100 (eg, processing device 110, wearable device 130, mobile terminal device 140, etc.). One or more components of the motion monitoring system 100 may access data or instructions stored in the database via the network 120 . In some embodiments, the database may be directly connected or communicated with one or more components in the motion monitoring system 100 (eg, the processing device 110 , the wearable device 130 , the mobile terminal device 140 ). In some embodiments, the database may be part of the processing device 110 .
  • FIG. 2 is a schematic diagram of exemplary hardware and/or software of a wearable device according to some embodiments of the present application.
  • the wearable device 130 may include an acquisition module 210, a processing module 220 (also referred to as a processor), a control module 230 (also referred to as a main control, an MCU, a controller), a communication module 240, a power supply module 250 and input/output module 260.
  • the acquiring module 210 may be used to acquire the motion signal of the user when exercising.
  • the acquisition module 210 may include a sensor unit, and the sensor unit may be used to acquire one or more motion signals when the user moves.
  • the sensor unit may include, but is not limited to, one or more of a myoelectric sensor, an attitude sensor, an ECG sensor, a respiration sensor, a temperature sensor, a humidity sensor, an inertial sensor, a blood oxygen saturation sensor, a Hall sensor, an electrical skin sensor, a rotation sensor, and the like.
  • the motion signal may include one or more of an electromyography signal, a posture signal, an electrocardiogram signal, a respiratory rate, a temperature signal, a humidity signal, and the like.
  • the sensor units can be placed in different positions of the wearable device 130 according to the type of motion signal to be acquired.
  • an EMG sensor (also referred to as an electrode element) may be configured to collect EMG signals when the user moves.
  • the EMG signal and its corresponding feature information (eg, frequency information, amplitude information, etc.) can reflect the state of the user's muscles when exercising.
  • the gesture sensor may be disposed at different positions of the human body (eg, positions corresponding to the trunk, limbs, and joints in the wearable device 130 ), and the gesture sensor may be configured to collect gesture signals when the user moves.
  • the gesture signal and its corresponding feature information can reflect the gesture of the user's movement.
  • the ECG sensor can be arranged at a position around the chest of the human body, and the ECG sensor can be configured to collect ECG data when the user is exercising.
  • the respiration sensor can be arranged at a position around the chest of the human body, and the respiration sensor can be configured to collect respiration data (eg, respiration frequency, respiration amplitude, etc.) when the user moves.
  • the temperature sensor may be configured to collect temperature data (eg, body surface temperature) while the user is exercising.
  • the humidity sensor may be configured to collect humidity data of the external environment when the user is exercising.
  • the processing module 220 may process data from the acquisition module 210 , the control module 230 , the communication module 240 , the power supply module 250 and/or the input/output module 260 .
  • the processing module 220 may process the motion signal from the acquisition module 210 during the user's movement.
  • the processing module 220 may preprocess the motion signals (eg, myoelectric signals, gesture signals) obtained by the obtaining module 210 .
  • the processing module 220 performs segmentation processing on the EMG signal or the gesture signal when the user moves.
  • the processing module 220 may perform preprocessing (for example, filtering processing, signal correction processing) on the EMG signal when the user is exercising, so as to improve the quality of the EMG signal.
  • processing module 220 may determine feature information corresponding to the gesture signal based on the gesture signal when the user moves.
  • processing module 220 may process instructions or operations from input/output module 260 .
  • the processed data may be stored in memory or hard disk.
  • the processing module 220 may transmit its processed data to one or more components in the motion monitoring system 100 via the communication module 240 or the network 120 .
  • the processing module 220 may send the monitoring result of the user's movement to the control module 230, and the control module 230 may execute subsequent operations or instructions according to the action determination result.
  • the control module 230 can be connected with other modules in the wearable device 130 .
  • the control module 230 may control the operation status of other modules (eg, the communication module 240 , the power supply module 250 , the input/output module 260 ) in the wearable device 130 .
  • the control module 230 may control the power supply state (eg, normal mode, power saving mode), power supply time, and the like of the power supply module 250 .
  • the control module 230 may control the power supply module 250 to enter a power saving mode or issue a prompt message about replenishing power.
  • control module 230 may control the input/output module 260 according to the user's action determination result, and then may control the mobile terminal device 140 to send the user's motion feedback result.
  • the control module 230 can control the input/output module 260, and then control the mobile terminal device 140 to give feedback to the user, so that the user can know his own movement state in real time and adjust the action accordingly.
  • the control module 230 may also control one or more sensors or other modules in the acquisition module 210 to provide feedback to the human body. For example, when a certain muscle exerts too much force during the user's exercise, the control module 230 may control the electrode module at the position of the muscle to electrically stimulate the user to prompt the user to adjust the action in time.
  • the communication module 240 may be used for the exchange of information or data.
  • communication module 240 may be used for communication between wearable device 130 internal components (eg, acquisition module 210, processing module 220, control module 230, power supply module 250, input/output module 260).
  • the acquisition module 210 may send a user action signal (eg, an EMG signal, a gesture signal, etc.) to the communication module 240
  • the communication module 240 may send the action signal to the processing module 220 .
  • the communication module 240 may also be used for communication between the wearable device 130 and other components in the motion monitoring system 100 (eg, the processing device 110 , the mobile terminal device 140 ).
  • the communication module 240 may send the state information (eg, switch state) of the wearable device 130 to the processing device 110, and the processing device 110 may monitor the wearable device 130 based on the state information.
  • the communication module 240 may adopt wired, wireless, or wired/wireless hybrid technologies. Wired technology may be based on one or a combination of cables such as metallic cables, hybrid cables, fiber optic cables, and the like. Wireless technologies may include Bluetooth, Wi-Fi, ZigBee, Near Field Communication (NFC), Radio Frequency Identification (RFID), cellular networks (including GSM, CDMA, 3G, 4G, 5G, etc.), cellular-based Narrow Band Internet of Things (NB-IoT), etc.
  • the communication module 240 may use one or more encoding methods to encode the transmitted information; for example, the encoding methods may include phase encoding, non-return-to-zero code, differential Manchester code, and the like. In some embodiments, the communication module 240 may select different transmission and encoding modes according to the data type or network type to be transmitted. In some embodiments, the communication module 240 may include one or more communication interfaces for different communication methods. In some embodiments, the modules of the motion monitoring system 100 may be dispersed on multiple devices; in this case, each module may include one or more communication modules 240 for inter-module information transfer. In some embodiments, the communication module 240 may include a receiver and a transmitter. In other embodiments, the communication module 240 may be a transceiver.
  • power module 250 may provide power to other components in motion monitoring system 100 (eg, acquisition module 210, processing module 220, control module 230, communication module 240, input/output module 260).
  • the power supply module 250 may receive control signals from the processing module 220 to control the power output of the wearable device 130 .
  • the power supply module 250 may supply power only to the memory, so as to put the wearable device 130 into standby mode.
  • the power supply module 250 can disconnect the power supply to other components, and the data in the motion monitoring system 100 can be transferred to the hard disk, so that the wearable device 130 enters a standby mode or a sleep mode.
  • the power supply module 250 may include at least one battery.
  • the battery may include one or a combination of dry cells, lead storage batteries, lithium batteries, solar cells, wind power generation cells, mechanical power generation cells, thermal power generation cells, and the like.
  • the solar cell can convert light energy into electrical energy and store it in the power supply module 250 .
  • the wind power generation battery can convert wind energy into electrical energy and store it in the power supply module 250 .
  • the mechanical energy generating battery may convert mechanical energy into electrical energy and store it in the power supply module 250 .
  • the solar cells may include silicon solar cells, thin film solar cells, nanocrystalline chemical solar cells, dye-sensitized solar cells, plastic solar cells, and the like. The solar cells may be distributed on the wearable device 130 in the form of panels.
  • the thermal power generation battery can convert the user's body temperature into electrical energy and store it in the power supply module 250 . In some embodiments, when the power of the power supply module 250 is less than a power threshold (eg, 10% of the total power), the processing module 220 may send a control signal to the power supply module 250 .
  • the control signal may include information that the power supply module 250 is insufficient in power.
  • the power supply module 250 may contain a backup power source.
  • the power supply module 250 may further include a charging interface.
  • the power supply module 250 can be temporarily charged by an electronic device (such as a mobile phone or a tablet computer) or a power bank that the user carries with him.
  • the input/output module 260 can acquire, transmit and send signals. Input/output module 260 may interface or communicate with other components in motion monitoring system 100 . Other components in the motion monitoring system 100 may be connected or communicated through the input/output module 260 .
  • the input/output module 260 can be a wired interface such as a USB interface, a serial communication interface, or a parallel communication interface, or a wireless interface such as Bluetooth, infrared, radio-frequency identification (RFID), WLAN Authentication and Privacy Infrastructure (WAPI), General Packet Radio Service (GPRS), Code Division Multiple Access (CDMA), etc., or any combination thereof.
  • the input/output module 260 may be connected to the network 120 and obtain information through the network 120 .
  • the input/output module 260 may obtain the motion signal during the user's movement from the obtaining module 210 through the network 120 or the communication module 240 and output the user's movement information.
  • the input/output module 260 may include VCC, GND, RS-232, RS-485 (eg, RS485-A, RS485-B), general network interfaces, etc., or any combination thereof.
  • the input/output module 260 may transmit the obtained user movement information to the obtaining module 210 through the network 120 .
  • the input/output module 260 may use one or more encoding methods to encode the transmitted signal.
  • the encoding manner may include phase encoding, non-return-to-zero code, differential Manchester code, etc., or any combination thereof.
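  • As an illustrative aside, the following is a minimal Python sketch of one of the encoding methods named above, differential Manchester coding. The polarity convention (a mandatory mid-cell transition for every bit, plus an extra boundary transition for a 0) and the two-half-cell output format are assumptions, since the application does not specify them.

```python
def differential_manchester_encode(bits, start_level=1):
    """Encode bits as two half-cell levels (+1/-1) per bit.

    Convention (assumed): every bit cell has a mid-cell transition;
    a 0 bit adds an extra transition at the cell boundary.
    """
    level = start_level
    out = []
    for bit in bits:
        if bit == 0:
            level = -level      # extra transition at the cell boundary
        out.append(level)       # first half of the cell
        level = -level          # mandatory mid-cell transition
        out.append(level)       # second half of the cell
    return out

print(differential_manchester_encode([1, 0, 0, 1]))
# [1, -1, 1, -1, 1, -1, -1, 1]
```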
  • the system and its modules of one or more embodiments of this specification may be implemented not only by hardware circuits such as very large scale integrated circuits or gate arrays, semiconductors such as logic chips and transistors, or programmable hardware devices such as field programmable gate arrays and programmable logic devices, but also by software executed by various types of processors, or by a combination of the above hardware circuits and software (eg, firmware).
  • the above description of the motion monitoring system and its modules is only for convenience of description, and does not limit one or more embodiments of this specification to the scope of the illustrated embodiments. It can be understood that, for those skilled in the art, after understanding the principle of the system, it is possible to arbitrarily combine the modules, form a subsystem to connect with other modules, or omit one or more of the modules, without departing from this principle.
  • the acquiring module 210 and the processing module 220 may be one module, and the module may have the function of acquiring and processing user action signals.
  • the processing module 220 may not be provided in the wearable device 130 but integrated in the processing device 110 . Variations such as these are within the scope of protection of one or more embodiments of this specification.
  • computing device 300 may include internal communication bus 310 , processor 320 , read only memory 330 , random access memory 340 , communication port 350 , input/output interface 360 , hard disk 370 , and user interface 380 .
  • the internal communication bus 310 may enable data communication among the various components in the computing device 300 .
  • processor 320 may send data to memory or other hardware such as input/output interface 360 via internal communication bus 310 .
  • the internal communication bus 310 may be an Industry Standard Architecture (ISA) bus, an Extended Industry Standard Architecture (EISA) bus, a Video Electronics Standards Association (VESA) bus, a Peripheral Component Interconnect (PCI) bus, or the like.
  • the internal communication bus 310 may be used to connect the various modules in the motion monitoring system 100 shown in the figure.
  • the processor 320 may execute computational instructions (program code) and perform the functions of the motion monitoring system 100 described herein.
  • the computing instructions may include programs, objects, components, data structures, procedures, modules, and functions (the functions refer to the specific functions described in this application).
  • the processor 320 may process motion signals (eg, myoelectric signals, gesture signals) obtained from the wearable device 130 and/or the mobile terminal device 140 of the motion monitoring system 100 when the user is exercising, and monitor the motion of the user's movement based on the motion signals.
  • processor 320 may include a microcontroller, a microprocessor, a reduced instruction set computer (RISC), an application specific integrated circuit (ASIC), an application specific instruction set processor (ASIP), a central processing unit (CPU), a graphics processing unit (GPU), a physics processing unit (PPU), a microcontroller unit, a digital signal processor (DSP), a field programmable gate array (FPGA), an advanced RISC machine (ARM), a programmable logic device, and any circuit, processor, etc. capable of performing one or more functions, or any combination thereof.
  • the computing device 300 in FIG. 3 depicts only one processor, but it should be noted that the computing device 300 may also include multiple processors.
  • the memory (eg, read only memory (ROM) 330 , random access memory (RAM) 340 , hard disk 370 , etc.) of computing device 300 may store data/information obtained from any other component of motion monitoring system 100 .
  • the memory of computing device 300 may be located in wearable device 130 as well as in processing device 110 .
  • Exemplary ROMs may include mask ROM (MROM), programmable ROM (PROM), erasable programmable ROM (EPROM), electrically erasable programmable ROM (EEPROM), compact disc ROM (CD-ROM), digital versatile disc ROM, etc.
  • Exemplary RAMs may include dynamic RAM (DRAM), double data rate synchronous dynamic RAM (DDR SDRAM), static RAM (SRAM), thyristor RAM (T-RAM), zero-capacitor RAM (Z-RAM), and the like.
  • Exemplary display devices may include liquid crystal displays (LCDs), light emitting diode (LED) based displays, flat panel displays, curved displays, television equipment, cathode ray tubes (CRTs), the like, or any combination thereof.
  • Communication port 350 may connect to a network for data communication.
  • the connection may be a wired connection, a wireless connection, or a combination of both.
  • Wired connections may include electrical cables, fiber optic cables, or telephone lines, among others, or any combination thereof.
  • Wireless connections may include Bluetooth, Wi-Fi, WiMax, WLAN, ZigBee, mobile networks (eg, 3G, 4G or 5G, etc.), etc., or any combination thereof.
  • the communication port 350 may be a standardized port such as RS232, RS485, or the like.
  • communication port 350 may be a specially designed port.
  • Hard disk 370 may be used to store information and data generated by or received from processing device 110 .
  • the hard disk 370 may store the user's confirmation information.
  • the hard disk 370 may include a mechanical hard disk (HDD), a solid state disk (SSD), a hybrid hard disk (HHD), or the like.
  • hard disk 370 may be provided in processing device 110 or in wearable device 130 .
  • User interface 380 may enable interaction and exchange of information between computing device 300 and a user.
  • user interface 380 may be used to present motion recordings generated by motion monitoring system 100 to a user.
  • the user interface 380 may include a physical display, such as a display with speakers, an LCD display, an LED display, an OLED display, an electronic ink display (E-Ink), and the like.
  • FIG. 4 is an exemplary structural diagram of a wearable device according to some embodiments of the present application.
  • the upper garment is used as an exemplary illustration; as shown in FIG. 4, the wearable device may include a top garment 410.
  • the top garment 410 may include a top garment base 4110, at least one top garment processing module 4120, at least one top garment feedback module 4130, at least one top garment acquisition module 4140, and the like.
  • the upper clothing base 4110 may refer to clothing worn on the upper body of the human body.
  • the upper garment substrate 4110 may comprise a short-sleeved T-shirt, a long-sleeved T-shirt, a shirt, a jacket, and the like.
  • At least one upper garment processing module 4120 and at least one upper garment acquisition module 4140 may be located on the upper garment base 4110 in areas that fit with different parts of the human body.
  • the at least one upper garment feedback module 4130 may be located at any position on the upper garment base 4110, and the at least one upper garment feedback module 4130 may be configured to feed back the motion state information of the upper body of the user.
  • Exemplary feedback methods may include, but are not limited to, voice prompts, text prompts, pressure prompts, electrical stimulation, and the like.
  • the at least one shirt acquisition module 4140 may include, but is not limited to, one or more of a posture sensor, an electrocardiogram sensor, an electromyography sensor, a temperature sensor, a humidity sensor, an inertial sensor, an acid-base sensor, a sound wave transducer, and the like.
  • the sensors in the upper garment acquisition module 4140 may be placed at different positions on the user's body according to different signals to be measured. For example, when the posture sensor is used to acquire the posture signal during the user's movement, the posture sensor may be placed in the position corresponding to the torso, arms, and joints of the upper garment base 4110 .
  • the myoelectric sensor when used to acquire the myoelectric signal during the user's exercise, the myoelectric sensor may be located near the muscle to be measured by the user.
  • the attitude sensor may include, but is not limited to, an acceleration triaxial sensor, an angular velocity triaxial sensor, a magnetic force sensor, etc., or any combination thereof.
  • an attitude sensor can include an acceleration triaxial sensor and an angular velocity triaxial sensor.
  • the attitude sensor may also include a strain gauge sensor.
  • the strain gauge sensor may refer to a sensor that measures the strain generated by the force-induced deformation of the object to be measured.
  • strain gauge sensors may include, but are not limited to, one or more of strain gauge load cells, strain gauge pressure sensors, strain gauge torque sensors, strain gauge displacement sensors, strain gauge acceleration sensors, and the like.
  • a strain gauge sensor can be set at the user's joint position, and by measuring the resistance of the strain gauge sensor that changes with the stretched length, the bending angle and bending direction of the user's joint can be obtained.
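  • As a hedged sketch of the measurement just described, the following Python fragment maps a strain-gauge resistance reading to a joint bending angle. The base resistance, gauge factor, gauge length, and the linear degrees-per-millimetre calibration are illustrative assumptions, not values from this application.

```python
BASE_RESISTANCE_OHM = 350.0  # unstrained gauge resistance (assumed)
GAUGE_FACTOR = 2.0           # relative resistance change per unit strain (typical foil gauge)
GAUGE_LENGTH_MM = 60.0       # active gauge length over the joint (assumed)
DEG_PER_MM = 2.5             # joint degrees per millimetre of stretch (assumed calibration)

def joint_bend_angle(measured_resistance_ohm):
    """Estimate the joint bending angle (degrees) from a resistance reading.

    The sign indicates the bending direction: positive when the joint bends
    in the direction that stretches the gauge, negative otherwise.
    """
    delta_r = measured_resistance_ohm - BASE_RESISTANCE_OHM
    strain = delta_r / (BASE_RESISTANCE_OHM * GAUGE_FACTOR)
    stretch_mm = strain * GAUGE_LENGTH_MM
    return stretch_mm * DEG_PER_MM

print(joint_bend_angle(352.1))  # ~0.45 degrees for a slightly stretched gauge
```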
  • the top clothing 410 may also include other modules, such as a power supply module, a communication module, an input/output module, etc.
  • the jacket processing module 4120 is similar to the processing module 220 in FIG. 2
  • the jacket acquisition module 4140 is similar to the acquisition module 210 in FIG. 2 .
  • for the various modules in the jacket 410, please refer to the relevant descriptions of FIG. 2 of the present application, which are not repeated here.
  • FIG. 5 is an exemplary flowchart of a motion monitoring method according to some embodiments of the present application. As shown in FIG. 5, the process 500 may include:
  • step 510 a motion signal of the user when exercising is acquired.
  • this step 510 may be performed by the acquisition module 210 .
  • the action signal refers to the human body parameter information when the user is exercising.
  • the human body parameter information may include, but is not limited to, one or more of electromyographic signals, posture signals, electrocardiographic signals, temperature signals, humidity signals, blood oxygen concentration, respiratory rate, and the like.
  • the EMG sensor in the acquisition module 210 can collect EMG signals of the user during exercise. For example, when the user clamps the chest in a sitting position, the EMG sensor in the wearable device corresponding to the position of the human chest muscle, latissimus dorsi, etc. can collect the EMG signal of the user's corresponding muscle position.
  • the EMG sensor in the wearable device corresponding to the position of the human gluteus maximus, quadriceps femoris, etc. can collect the EMG signal of the user's corresponding muscle position.
  • the EMG sensor in the wearable device corresponding to the position of the gastrocnemius muscle of the human body can collect the EMG signals of the position of the gastrocnemius muscle of the human body.
  • the gesture sensor in the acquisition module 210 can collect gesture signals when the user moves.
  • the posture sensor in the wearable device corresponding to the position of the triceps brachii of the human body can collect the posture signals of the position of the triceps brachii of the user.
  • the posture sensor disposed at the position of the human deltoid muscle and the like can collect the posture signal of the position of the user's deltoid muscle and the like.
  • the number of gesture sensors in the acquiring module 210 may be multiple, the multiple gesture sensors may acquire gesture signals of multiple parts when the user moves, and the gesture signals of multiple parts may reflect the relative relationship between different parts of the human body sports situation.
  • the posture signal at the arm and the posture signal at the torso can reflect the movement of the arm relative to the torso.
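  • The relative-movement point above can be illustrated with a small sketch: given orientation readings from a posture sensor on the arm and one on the torso, the arm's motion relative to the torso is obtained by differencing the two readings. Treating the readings as componentwise-subtractable Euler-angle tuples is a simplifying assumption.

```python
def relative_orientation(arm_euler_deg, torso_euler_deg):
    """Per-axis orientation of the arm relative to the torso, in degrees."""
    return tuple(a - t for a, t in zip(arm_euler_deg, torso_euler_deg))

# arm raised and rotated while the torso is nearly still
print(relative_orientation((30.0, 5.0, 80.0), (2.0, 1.0, 10.0)))  # (28.0, 4.0, 70.0)
```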
  • the gesture signal is associated with a type of gesture sensor.
  • when the attitude sensor is an angular velocity triaxial sensor, the acquired attitude signal is angular velocity information.
  • when the attitude sensor is an angular velocity triaxial sensor and an acceleration triaxial sensor, the acquired attitude signals are angular velocity information and acceleration information.
  • when the attitude sensor is a strain sensor, the strain sensor can be set at the joint position of the user; by measuring the resistance of the strain sensor that changes with the stretched length, the obtained attitude signals can be displacement information, stress, etc., and the bending angle and bending direction at the joints of the user can be characterized by these attitude signals.
  • parameter information that can reflect the relative movement of the user's body can serve as the feature information corresponding to the gesture signal, and different types of gesture sensors can be used to acquire it according to the type of the feature information.
  • the motion signal may include an electromyographic signal of a specific part of the user's body and a gesture signal of the specific part.
  • the EMG signal and posture signal can reflect the motion state of a specific part of the user's body from different angles. Simply put, the gesture signal of a specific part of the user's body can reflect the action type, action amplitude, action frequency, etc. of the specific part.
  • the EMG signal can reflect the muscle state of the specific part during exercise. In some embodiments, combining the EMG signal and/or the posture signal of the same body part makes it possible to better assess whether the movement of that part is normal.
  • step 520 the motion of the user's movement is monitored based on at least the feature information corresponding to the myoelectric signal or the feature information corresponding to the gesture signal.
  • this step may be performed by processing module 220 and/or processing device 110 .
  • the feature information corresponding to the EMG signal may include, but is not limited to, one or more of frequency information, amplitude information, and the like.
  • the feature information corresponding to the gesture signal refers to parameter information used to characterize the relative movement of the user's body.
  • the feature information corresponding to the attitude signal may include, but is not limited to, one or more of an angular velocity direction, an angular velocity value, an acceleration value of the angular velocity, and the like.
  • the feature information corresponding to the attitude signal may further include angle, displacement information (for example, the stretched length in the strain gauge sensor), stress, and the like.
  • when the attitude sensor is a strain sensor, the strain sensor can be set at the joint position of the user; by measuring the resistance of the strain sensor that changes with the stretched length, the obtained attitude signals can be displacement information, stress, etc., and the bending angle and bending direction of the user's joints can be characterized by these attitude signals.
  • the processing module 220 and/or the processing device 110 may extract the feature information corresponding to the EMG signal (eg, frequency information, amplitude information) or the feature information corresponding to the attitude signal (eg, angular velocity direction, angular velocity value, acceleration value of the angular velocity, angle, displacement information, stress, etc.), and monitor the user's movement based on the feature information corresponding to the EMG signal or the feature information corresponding to the attitude signal.
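  • A minimal sketch of extracting the feature information listed above, assuming the EMG arrives as a 1-D NumPy array sampled at a known rate and the attitude signal as per-sample angular-velocity vectors; the function names and the exact feature set are illustrative, not from this application.

```python
import numpy as np

def emg_features(emg, fs_emg):
    """Amplitude and frequency features of one EMG window."""
    amplitude = np.sqrt(np.mean(emg ** 2))                 # RMS amplitude
    spectrum = np.abs(np.fft.rfft(emg))
    freqs = np.fft.rfftfreq(len(emg), d=1.0 / fs_emg)
    mean_frequency = np.sum(freqs * spectrum) / np.sum(spectrum)
    return {"amplitude": amplitude, "mean_frequency": mean_frequency}

def attitude_features(angular_velocity, fs_imu):
    """Angular velocity value, direction, and acceleration of one window."""
    speed = np.linalg.norm(angular_velocity, axis=1)       # rad/s per sample
    direction = angular_velocity[-1] / (speed[-1] + 1e-9)  # unit vector at the last sample
    angular_acceleration = np.gradient(speed, 1.0 / fs_imu)
    return {"angular_velocity": float(speed.mean()),
            "direction": direction,
            "angular_acceleration": float(angular_acceleration.max())}
```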
  • monitoring the action of the user's movement includes monitoring the information related to the user's action.
  • the action-related information may include one or more of user action type, action quantity, action quality (eg, whether the user action meets a standard), action time, and the like.
  • Action type refers to the fitness action that the user takes when exercising.
  • the movement types may include, but are not limited to, one or more of seated chest clips, squats, deadlifts, planks, running, swimming, and the like.
  • the number of actions refers to the number of times the user performs an action during exercise. For example, the user performs 10 seated chest clips during the exercise, where the 10 times are the number of movements.
  • Action quality refers to the standard degree of the fitness action performed by the user relative to the standard fitness action.
  • for example, when the user performs a squat, the processing device 110 may determine that the action type of the user's action is a squat based on the feature information corresponding to the action signals (the EMG signal and the posture signal) at specific muscle positions (gluteal muscles, quadriceps, etc.), and judge the motion quality of the user's squat motion based on the motion signal of the standard squat motion.
  • Action time refers to the time corresponding to one or more action types of the user or the total time of the exercise process.
  • the processing device 110 may utilize one or more motion recognition models to identify and monitor the motion of the user's motion. For example, the processing device 110 may input the feature information corresponding to the EMG signal and/or the feature information corresponding to the gesture signal into the action recognition model, and the action recognition model outputs information related to the user's action.
  • the motion recognition model may include different types of motion recognition models, eg, a model for identifying the type of user's motion, or a model for identifying the quality of the user's motion, and the like.
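  • A hedged sketch of the kind of action recognition model described above: a generic classifier over the feature information of a signal segment. The choice of a random forest (via scikit-learn) and the toy training rows are assumptions for illustration; the application does not fix a model type.

```python
import numpy as np
from sklearn.ensemble import RandomForestClassifier

# each row: [EMG amplitude, EMG mean frequency (Hz), mean angular velocity (rad/s)]
X_train = np.array([[0.8, 95.0, 1.2],
                    [0.2, 60.0, 0.3],
                    [0.9, 90.0, 1.1]])
y_train = ["squat", "rest", "squat"]          # action type labels

action_type_model = RandomForestClassifier(n_estimators=50, random_state=0)
action_type_model.fit(X_train, y_train)

segment_features = np.array([[0.85, 92.0, 1.15]])   # one monitored segment
print(action_type_model.predict(segment_features))  # -> ['squat']
```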
  • the above description about the process 500 is only for example and illustration, and does not limit the scope of application of this specification.
  • various modifications and changes can be made to the process 500 under the guidance of this specification.
  • these corrections and changes are still within the scope of this specification.
  • the extraction of the feature information corresponding to the electromyogram signal or the feature information corresponding to the gesture signal in step 520 may be completed by the processing device 110 , and in some embodiments, may also be completed by the processing module 220 .
  • the user's action signal is not limited to the above-mentioned EMG signal, posture signal, ECG signal, temperature signal, humidity signal, blood oxygen concentration, and respiratory rate; other human physiological parameter signals involved in human movement can also be regarded as action signals in the embodiments of this specification.
  • FIG. 6 is an exemplary flowchart of monitoring a user's motion according to some embodiments of the present application. As shown in FIG. 6, the process 600 may include:
  • step 610 the action signal is segmented based on the feature information corresponding to the myoelectric signal or the feature information corresponding to the gesture signal.
  • this step may be performed by processing module 220 and/or processing device 110 .
  • the collection process of action signals (eg, myoelectric signals, posture signals) is continuous during the user's exercise, and the actions of the user when exercising may be a combination of multiple groups of actions or a combination of actions of different action types.
  • the processing module 220 may segment the user's action signal based on the feature information corresponding to the EMG signal or the feature information corresponding to the gesture signal during the user's movement. Segmenting the action signal here refers to dividing the action signal into signal segments with the same or different durations, or extracting one or more signal segments with a specific duration from the action signal.
  • each segment of the action signal may correspond to one or more complete actions of the user. For example, when the user performs a squat, the user goes from a standing posture to a squatting posture, and then gets up and returns to a standing posture, which can be regarded as the user completing a squatting movement, and the action signal collected by the acquisition module 210 during this process can be regarded as a segment (or One cycle) action signal, after that, the action signal generated by the user completing the next squat action collected by the acquisition module 210 is regarded as another action signal.
  • each action signal may also correspond to a partial action of the user, and the partial action here may be understood as a partial action in a complete action.
  • when the user is exercising, the EMG signal and posture signal of the corresponding part will change; for example, during an action, the EMG signals and posture signals at the muscles of the corresponding body parts (eg, arms, legs, buttocks, abdomen) will fluctuate greatly.
  • the processing module 220 may segment the user's action signal based on the feature information corresponding to the EMG signal or the feature information corresponding to the gesture signal. For details on segmenting the action signal based on the feature information corresponding to the EMG signal or the feature information corresponding to the gesture signal, please refer to FIG. 7 and FIG. 8 and their related descriptions in this specification.
  • step 620 the motion of the user's movement is monitored based on at least one motion signal.
  • monitoring the motion of the user's movement based on the at least one segment of the motion signal may include matching the at least one segment of the motion signal with the at least one segment of the preset motion signal, and determining the motion type when the user is exercising.
  • the at least one segment of preset action signals refers to standard action signals corresponding to different actions preset in the database.
  • the action type of the user when exercising can be determined by judging the degree of matching between at least one segment of the motion signal and at least one segment of the preset motion signal.
  • monitoring the motion of the user's movement based on the at least one segment of the motion signal may further include matching the feature information corresponding to the at least one segment of the EMG signal with the feature information corresponding to the EMG signal in the at least one segment of the preset motion signal, to determine the action type of the user when exercising.
  • for example, the matching degree between one or more pieces of feature information (eg, frequency information, amplitude information) in a segment of the EMG signal and the corresponding feature information in a segment of the preset motion signal is calculated respectively; it is then determined whether the weighted matching degree or average matching degree of the one or more pieces of feature information is within the first matching threshold range, and if so, the action type of the user when exercising is determined according to the action type corresponding to the preset action signal.
  • monitoring the motion of the user's movement based on the at least one segment of the motion signal may further include matching the feature information corresponding to the at least one segment of the gesture signal with the feature information corresponding to the gesture signal in the at least one segment of the preset motion signal, to determine the action type when the user is exercising.
  • for example, the matching degree between one or more pieces of feature information (for example, angular velocity value, angular velocity direction, acceleration value of the angular velocity, angle, displacement information, stress, etc.) in a segment of the attitude signal and the corresponding feature information in a segment of the preset motion signal is calculated respectively; it is then determined whether the weighted matching degree or average matching degree of the one or more pieces of feature information is within the first matching threshold range, and if so, the action type of the user when exercising is determined according to the action type corresponding to the preset action signal.
  • monitoring the motion of the user's movement based on at least one segment of the motion signal may further include matching the feature information corresponding to the EMG signal and the feature information corresponding to the gesture signal in the at least one segment of the motion signal with the feature information corresponding to the EMG signal and the feature information corresponding to the posture signal in the at least one segment of the preset motion signal, to determine the action type of the user when exercising.
  • monitoring the motion of the user's movement based on the at least one segment of the motion signal may include matching the at least one segment of the motion signal with the at least one segment of the preset motion signal to determine the quality of the user's motion when exercising. Further, if the matching degree between the motion signal and the preset motion signal is within the second matching threshold range (eg, greater than 90%), the motion quality of the user when exercising meets the standard.
  • determining the action of the user's movement based on the at least one segment of the motion signal may include matching one or more pieces of feature information in the at least one segment of the motion signal with one or more pieces of feature information in the at least one segment of the preset motion signal, to determine the quality of the user's motion when exercising.
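  • A minimal sketch of the threshold-based matching described above. The per-feature similarity measure and the weights are assumptions; the 0.90 cut-off mirrors the "greater than 90%" second matching threshold example given earlier.

```python
def feature_match(measured, preset):
    """Similarity in [0, 1] between one measured and one preset feature value."""
    return 1.0 - abs(measured - preset) / max(abs(preset), 1e-9)

def weighted_matching_degree(measured_features, preset_features, weights):
    """Weighted average of per-feature similarities."""
    total = sum(feature_match(measured_features[k], preset_features[k]) * w
                for k, w in weights.items())
    return total / sum(weights.values())

measured = {"amplitude": 0.82, "mean_frequency": 91.0}
preset = {"amplitude": 0.80, "mean_frequency": 95.0}
weights = {"amplitude": 0.6, "mean_frequency": 0.4}

degree = weighted_matching_degree(measured, preset, weights)
if degree > 0.90:  # hypothetical second matching threshold
    print(f"motion quality meets the standard (matching degree {degree:.2f})")
```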
  • an action signal can be an action signal of a complete action, or an action signal of a part of an action in a complete action.
  • monitoring an action signal corresponding to a partial action of the user can improve the real-time performance of monitoring the user's action.
  • the user's action may also be determined through an action recognition model or a manually preset model.
  • FIG. 7 is an exemplary flowchart of motion signal segmentation according to some embodiments of the present application. As shown in FIG. 7, process 700 may include:
  • step 710 based on the time domain window of the myoelectric signal or the attitude signal, at least one target feature point is determined from the time domain window according to a preset condition.
  • this step may be performed by processing module 220 and/or processing device 110 .
  • the time domain window of the EMG signal includes the EMG signal within a period of time
  • the time domain window of the attitude signal includes the attitude signal within the same period of time.
  • the target feature point refers to a signal with target features in the action signal, which can represent the stage of the user's action. For example, when a user performs a seated chest clamping, at the beginning, the user's arms are stretched left and right in the horizontal direction, then the arms begin to rotate inward, then the arms are closed, and finally the arms return to the stretched state in the horizontal direction again, this process is as follows: A full seated chest clip.
  • the feature information (eg, angular velocity value, angular velocity direction, acceleration value of the angular velocity, angle, displacement information, stress, etc.) corresponding to the EMG signal or the posture signal is different at each stage of the action.
  • one or more target feature points may be determined from within a time domain window according to preset conditions.
  • the preset conditions may include one or more of: the direction of the angular velocity corresponding to the attitude signal changes; the angular velocity corresponding to the attitude signal is greater than or equal to an angular velocity threshold; the angle corresponding to the attitude signal reaches an angle threshold; the change value of the angular velocity value corresponding to the attitude signal is an extreme value; and the amplitude information corresponding to the EMG signal is greater than or equal to an EMG threshold.
  • target feature points in different stages of an action may correspond to different preset conditions.
  • target feature points of different actions may correspond to different preset conditions.
  • the actions of the seated chest clipping action and the bicep curling action are different, and the preset conditions corresponding to the respective preset target points in the two actions are also different.
  • for the content of the preset conditions, reference may be made to the descriptions of the action start point, the action middle point, and the action end point in this specification.
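  • A hedged sketch of evaluating the preset conditions listed above at one sample of the synchronized attitude/EMG stream; all threshold values are placeholders, since the application states that different actions and different action stages use different preset conditions.

```python
def check_preset_conditions(prev_w, w, angle, emg_amplitude,
                            w_threshold=10.0,       # angular velocity threshold (assumed, deg/s)
                            angle_threshold=-30.0,  # angle threshold (assumed, deg)
                            emg_threshold=0.5):     # EMG amplitude threshold (assumed, a.u.)
    """Report which of the preset conditions hold at this sample."""
    return {
        "angular_velocity_direction_changes": prev_w * w < 0,
        "angular_velocity_reaches_threshold": abs(w) >= w_threshold,
        "angle_reaches_threshold": angle >= angle_threshold,
        "emg_amplitude_reaches_threshold": emg_amplitude >= emg_threshold,
    }

print(check_preset_conditions(prev_w=-3.0, w=12.0, angle=-20.0, emg_amplitude=0.7))
```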
  • At least one target feature point may also be determined from the time domain window according to a preset condition based on the time domain window of the myoelectric signal and the attitude signal at the same time.
  • the time domain window of the EMG signal and the attitude signal corresponds to a time range containing the EMG signal and the attitude signal.
  • the time of the EMG signal corresponds to the time of the attitude signal. For example, the time point of the myoelectric signal when the user starts to exercise is the same as the time point of the gesture signal when the user starts to exercise.
  • the target feature point can be determined by combining the feature information (eg, amplitude information) corresponding to the EMG signal and the feature information (eg, angular velocity value, angular velocity direction, angular velocity acceleration value, angle, etc.) corresponding to the attitude signal.
  • step 720 the motion signal is segmented based on the at least one target feature point.
  • this step 720 may be performed by the processing module 220 and/or the processing device 110 .
  • there may be one or more target feature points in the myoelectric signal or the posture signal and the action signal may be divided into multiple segments by one or more target feature points.
  • the target feature point can divide the EMG signal into two segments, where the two segments can include the EMG signal before the target feature point and the EMG signal after the target feature point .
  • the processing module 220 and/or the processing device 110 may extract an electromyographic signal within a certain time range around the target feature point as a section of electromyographic signal.
  • when the EMG signal has multiple target feature points (for example, n, where the first target feature point is not the start point of the time domain window and the nth target feature point is not the end point of the time domain window), the EMG signal can be divided into n+1 segments according to the n target feature points.
  • when the EMG signal has multiple target feature points (for example, n, where the first target feature point is the start point of the time domain window and the nth target feature point is not the end point of the time domain window), the EMG signal can be divided into n segments according to the n target feature points.
  • when the EMG signal has multiple target feature points (for example, n, where the first target feature point is the start point of the time domain window and the nth target feature point is the end point of the time domain window), the EMG signal can be divided into n-1 segments according to the n target feature points.
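  • The three cases above reduce to one rule: pad the list of target feature points with the window boundaries when they are missing, then cut between consecutive boundaries. A small sketch, treating the signal as a plain list of samples:

```python
def split_by_feature_points(signal, feature_indices):
    """Split `signal` at the given sample indices (window boundaries padded in)."""
    bounds = sorted(set(feature_indices))
    if not bounds or bounds[0] != 0:
        bounds = [0] + bounds            # keep the leading segment
    if bounds[-1] != len(signal):
        bounds = bounds + [len(signal)]  # keep the trailing segment
    return [signal[a:b] for a, b in zip(bounds, bounds[1:])]

emg = list(range(10))
print(len(split_by_feature_points(emg, [3, 7])))   # 3 segments: n + 1 for n = 2
print(len(split_by_feature_points(emg, [0, 7])))   # 2 segments: n
print(len(split_by_feature_points(emg, [0, 10])))  # 1 segment:  n - 1
```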
  • the action stages corresponding to the target feature points may include one or more.
  • the action signals may be segmented using multiple target feature points as benchmarks.
  • the action stage corresponding to the target feature point may include an action start point and an action end point; the action start point is before the action end point, and the action signal between the action start point and the next action start point can be regarded as a segment of action signal.
  • the target feature point may include one or more of an action start point, an action midpoint, or an action end point.
  • the target feature point includes an action start point, an action middle point and an action end point as an exemplary illustration, wherein the action start point can be regarded as the start point of the user action cycle.
  • different actions may correspond to different preset conditions.
  • the preset condition may be that the angular velocity direction of the action after the action start point changes relative to the angular velocity direction of the action before the action start point, the angular velocity value at the action start point is approximately 0, and the acceleration value of the angular velocity at the action start point is greater than 0.
  • the starting point of the action can be set as the time point when the arms extend left and right in the horizontal direction and start to internally rotate.
  • the preset condition may be that the angle of arm lift is greater than or equal to the angle threshold. Specifically, when the user performs the bicep curling action, the lifting angle when the user's arm is horizontal is 0°, when the arm is drooping, the angle is negative, and when the arm is raised, the angle is positive. When the user's arm is raised from the horizontal position, the angle of arm lift is greater than 0.
  • the time point when the angle of the user's arm raising reaches the angle threshold can be regarded as the action start point.
  • the angle threshold may be -70° to -20°, or preferably, the angle threshold may be -50° to -25°.
  • the preset condition may further include: the angular velocity of the arm within a certain time range after the action start point may be greater than or equal to the angular velocity threshold.
  • the angular velocity threshold may range from 5°/s to 50°/s; preferably, the angular velocity threshold may range from 10°/s to 30°/s.
  • within that time range after the action start point, the angular velocity of the arm continues to be greater than the angular velocity threshold.
  • otherwise, the judgment according to the preset condition continues until an action start point is determined.
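  • A hedged sketch of the start-point rule for the bicep-curl example above: the arm angle reaches the angle threshold and the angular velocity then stays at or above the angular velocity threshold for a short time. The sampling rate, hold duration, and the concrete thresholds (picked inside the preferred ranges quoted above) are assumptions.

```python
FS = 100                           # samples per second (assumed)
ANGLE_THRESHOLD = -30.0            # deg, inside the preferred -50..-25 range
ANGULAR_VELOCITY_THRESHOLD = 20.0  # deg/s, inside the preferred 10..30 range
HOLD_SAMPLES = int(0.2 * FS)       # how long the velocity must stay high (assumed)

def find_action_start(angles, angular_velocities):
    """Return the index of the first sample satisfying the start-point rule."""
    for i in range(len(angles) - HOLD_SAMPLES):
        if angles[i] < ANGLE_THRESHOLD:
            continue                       # arm not raised far enough yet
        window = angular_velocities[i:i + HOLD_SAMPLES]
        if all(w >= ANGULAR_VELOCITY_THRESHOLD for w in window):
            return i                       # candidate action start point
    return None

angles = [-40.0] * 10 + [-28.0] * 30
angular_velocities = [0.0] * 10 + [25.0] * 30
print(find_action_start(angles, angular_velocities))  # 10
```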
  • the midpoint of the action may be a point within one action cycle from the start point.
  • the starting point of the action can be set as the time point when the arms extend left and right in the horizontal direction and start internal rotation, and the time point when the arms are closed can be set as the middle point of the user's action.
  • the preset condition may be that the direction of the angular velocity at the time point after the middle point of the action changes relative to the direction of the angular velocity at the time point before the middle point of the action, and the angular velocity value at the middle point of the action is approximately 0.
  • the change speed of the angular velocity (the acceleration of the angular velocity) within the first specific time range (eg, 0.05s, 0.1s, 0.5s) after the midpoint of the action may be greater than the angular velocity acceleration threshold (for example, 0.05rad/s).
  • the amplitude information corresponding to the middle point of the action in the EMG signal is greater than the EMG threshold.
  • the EMG threshold is related to the user action and the target EMG signal.
  • the EMG signal at the pectoral muscle is the target EMG signal.
  • the position corresponding to the middle point of the action (also referred to as the "middle position") can be approximately regarded as the point of maximum muscle exertion, and the EMG signal will have a larger value at this time.
  • when the user's action reaches the middle position, the EMG signal at the corresponding part of the user's body is greatly increased relative to the EMG signal of the corresponding part when the user does not exercise (the muscles of the specific part can be regarded as being in a resting state at this time); for example, the amplitude of the EMG signal of the corresponding part when the user's action reaches the middle position can be 10 times that of the resting state.
  • for different actions, the relationship between the amplitude of the EMG signal at the corresponding part when the movement reaches the middle position (the middle point of the action) and the amplitude of the EMG signal in the resting state will be different, and this relationship can be adaptively adjusted according to the actual movement.
  • the amplitude of the EMG signal within a second specific time range (eg, 0.05s, 0.1s, 0.5s) after the midpoint of the action may be required to remain greater than the EMG threshold.
  • in addition to satisfying the above-mentioned preset conditions (for example, the angular velocity conditions and the amplitude condition of the EMG signal), the determination of the middle point of the action may also require that the Euler angle (also called an angle) between the middle point of the action and the starting position satisfies certain conditions.
  • the Euler angle of the middle point of the action relative to the start point of the action can be required to satisfy one or more Euler angle thresholds (also called angle thresholds); for example, taking the front-rear direction of the human body as the X axis, the left-right direction of the human body as the Y axis, and the height direction of the human body as the Z axis, the change of the Euler angle in the X and Y directions can be required to be less than 25°, and the change of the Euler angle in the Z direction can be required to be greater than 40°.
  • the EMG threshold and/or the Euler angle threshold may be pre-stored in the memory or hard disk of the wearable device 130, or may be stored in the processing device 110, or calculated and adjusted in real time according to actual conditions .
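  • A minimal sketch of the Euler-angle condition on the action middle point described above, using the X/Y/Z axis convention and the 25°/40° values quoted for this example; representing an orientation as a plain (x, y, z) Euler-angle tuple is a simplifying assumption.

```python
def midpoint_euler_ok(euler_start, euler_mid, xy_max_deg=25.0, z_min_deg=40.0):
    """Check the middle point's Euler-angle change relative to the start point."""
    dx = abs(euler_mid[0] - euler_start[0])  # front-rear axis
    dy = abs(euler_mid[1] - euler_start[1])  # left-right axis
    dz = abs(euler_mid[2] - euler_start[2])  # height axis
    return dx < xy_max_deg and dy < xy_max_deg and dz > z_min_deg

print(midpoint_euler_ok((0.0, 5.0, 10.0), (10.0, 15.0, 60.0)))  # True
```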
  • the processing module 220 may determine the middle point of the action from the time domain window after the action start point according to a preset condition, based on the time domain window of the EMG signal or the gesture signal. In some embodiments, after the middle point of the action is determined, it is possible to re-verify whether there are other time points that meet the preset conditions within the time range from the action start point to the action middle point; if so, the action start point closest to the action middle point is selected as the best action start point.
  • otherwise, the action middle point is invalid, and the action start point and the action middle point are re-determined according to the preset conditions.
  • the action end point may be a time point within one action cycle from the action start point and after the middle point of the action; for example, the action end point may be set as the point one action cycle away from the action start point, and this time point can be considered as the end point of one action cycle of the user.
  • the starting point of the action can be set as the time point when the arms extend left and right in the horizontal direction and start internal rotation, the time point when the arms are folded can be used as the middle point of the user's action, and the time point when the arms return to the stretched state in the horizontal direction may correspond to the end point of the user's action.
  • the preset condition may be that the change value of the angular velocity value corresponding to the attitude signal is an extreme value.
  • the change of the Euler angle should exceed a certain Euler angle threshold, for example, 20°.
  • the processing module 220 may determine the end point of the action from the time domain window after the middle point of the action according to a preset condition based on the time domain window of the EMG signal and the posture signal.
  • otherwise, the action start point and the action middle point are both invalid, and the action start point, the action middle point, and the action end point are re-determined according to the preset conditions.
  • at least one group of action start point, action middle point, and action end point in the action signal can be determined repeatedly, and the action signal can be segmented based on the at least one group of action start point, action middle point, and action end point as target feature points. This step may be performed by processing module 220 and/or processing device 110. It should be noted that the segmentation of the action signal is not limited to the above-mentioned action start point, action middle point, and action end point, and may also include other time points; for example, five time points can be selected according to the above steps for the seated chest-clamping action.
  • the first time point can be the start point of the action
  • the second time point can be the time when the internal rotation angular velocity is the largest
  • the third time point can be the mid-action point.
  • the fourth time point can be the moment when the external rotation angular velocity is the largest
  • the fifth time point can be the moment when the arms return to the left and right extension, and the angular velocity is 0, that is, the end point of the action.
  • compared with the previous embodiment, the second time point is added as the 1/4 mark point of the action cycle, the action end point described in the previous embodiment serves as the 3/4 mark point, and the fifth time point is added as the end point of the complete action.
  • the identification of the action quality can be completed based on the signals in the first 3/4 of the action cycle (that is, the identification of the action quality of a single cycle does not depend on analyzing the complete signal of the entire cycle), so the monitoring and feedback of the user's action can be completed before the action of the current cycle ends; at the same time, all the signals of the whole action process can be recorded completely and uploaded to the cloud or the mobile terminal device, so that more methods can be used to monitor the user's actions.
  • for some actions, the cycle of one action may be very long, and each stage has a different force-generating mode.
  • the above method for determining each time point may be used to divide the action into multiple stages, and the signals of each stage are identified and fed back individually to improve the real-time performance of feedback on user actions.
  • any one or more of the action start point, the action middle point and the action end point may also be used as the target feature point to segment and monitor the user's action signal.
  • the action signal can also be segmented and monitored by taking the action start point as the target feature point.
  • the action start point and the action end point can also be used as a set of target feature points to segment and monitor the action signal, and other time points or time ranges that can serve as target feature points are all within the protection scope of this specification.
  • steps 710 and 720 may be performed in the processing module 220 at the same time.
  • steps 710 and 720 may be performed simultaneously in the processing module 220 and the processing device 110, respectively.
  • FIG. 8 is a schematic diagram of action signal segmentation according to some embodiments of the present application.
  • the abscissa in FIG. 8 may represent the time when the user is exercising, and the ordinate may represent the amplitude information of the EMG signal of the corresponding muscle part (for example, the pectoralis major) while the user performs seated chest clamp training.
  • Figure 8 also includes the angular velocity change curve and the Euler angle change curve corresponding to the posture signal at the wrist position during the user's movement, where the angular velocity change curve represents the speed change of the user during the movement, and the Euler angle curve represents the position of body parts during the movement.
  • as shown in FIG. 8, point A1 is determined as the action start point according to preset conditions. Specifically, the angular velocity direction at the time point after the action start point A1 changes with respect to the angular velocity direction at the time point before the action start point A1. Further, the angular velocity value at the action start point A1 is approximately 0, and the acceleration value of the angular velocity at the action start point A1 is greater than 0.
  • the point B1 is determined as the middle point of the action according to the preset condition. Specifically, the angular velocity direction at the time point after the user's action midpoint B1 changes relative to the angular velocity direction at the time point before the action midpoint B1, and the angular velocity value at the action midpoint B1 is approximately 0.
  • the angular velocity direction at the action middle point B1 is opposite to the angular velocity direction at the action start point A1.
  • the amplitude of the EMG signal (shown as "EMG signal" in FIG. 8) corresponding to the action middle point B1 is greater than the EMG threshold.
  • point C1 is determined as the action end point according to preset conditions. Specifically, the change value of the angular velocity value at the action end point C1 is the extreme value from the action start point A1 to the action end point C1.
  • the process 700 can complete the action segmentation shown in FIG. 8, and the action signal from the action start point A1 to the action end point C1 shown in FIG. 8 can be regarded as one segment of the user's motion signal.
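  • as an illustration only, the following Python sketch applies the preset conditions described above (zero angular velocity with a direction change for A1 and B1, an EMG amplitude check for B1, and an extreme angular velocity change for C1) to sampled signals; the function name, the array layout and the threshold value are assumptions for illustration rather than the implementation of this application:

```python
import numpy as np

def find_action_points(angular_velocity, emg_amplitude, emg_threshold=0.05):
    """Locate candidate action start (A1), middle (B1) and end (C1) points.

    angular_velocity and emg_amplitude are 1-D arrays sampled at the same
    rate; emg_threshold is an assumed EMG amplitude threshold for B1.
    """
    sign = np.sign(angular_velocity)
    # candidate points: angular velocity is ~0 and its direction flips
    crossings = np.where(np.diff(sign) != 0)[0]

    start = middle = None
    for idx in crossings:
        if start is None:
            # A1: direction changes and the angular velocity is increasing
            if angular_velocity[idx + 1] - angular_velocity[idx] > 0:
                start = idx
        elif emg_amplitude[idx] > emg_threshold:
            # B1: next direction flip whose EMG amplitude exceeds the threshold
            middle = idx
            break
    if start is None or middle is None:
        return None

    # C1: the point after B1 where the change of the angular velocity is extreme
    change = np.abs(np.diff(angular_velocity[middle:]))
    end = middle + int(np.argmax(change))
    return start, middle, end
```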
  • the processing module 220 may re-determine the action start point to ensure the accuracy of action segmentation.
  • the characteristic time threshold here may be stored in the memory or hard disk of the wearable device 130, or may be stored in the processing device 110, or calculated or adjusted according to the actual situation of the user's movement.
  • the processing module 220 can re-determine the action start point, thereby improving the accuracy of action segmentation.
  • the segmentation of the action signal is not limited to the above-mentioned action start point A1, action middle point B1 and action end point C1, but can also include other time points, and the selection of time points can be performed according to the complexity of the action.
  • an abrupt change in the EMG signal may be described by singular points; exemplary singular points may include glitch signals, discontinuous signals, and the like.
  • monitoring the motion of the user based on at least the feature information corresponding to the EMG signal or the feature information corresponding to the gesture signal may further include: preprocessing the EMG signal in the frequency domain or the time domain, The feature information corresponding to the EMG signal is acquired based on the preprocessed EMG signal, and the user's movement action is monitored according to the feature information corresponding to the EMG signal or the feature information corresponding to the posture signal.
  • preprocessing the EMG signal in the frequency domain or the time domain may include filtering the EMG signal in the frequency domain to select or retain components of the EMG signal within a specific frequency range.
  • for example, if the frequency range of the EMG signal acquired by the acquisition module 210 is 1Hz-1000Hz, the signal can be filtered and the components within a specific frequency range (e.g., 30Hz-150Hz) can be selected for subsequent processing.
  • the specific frequency range may be 10 Hz-500 Hz.
  • preferably, the specific frequency range may be 15Hz-300Hz; more preferably, the specific frequency range may be 30Hz-150Hz.
  • the filtering process may include low-pass filter processing.
  • the low-pass filter may include an LC passive filter, an RC passive filter, an RC active filter, or a passive filter composed of special components.
  • passive filters composed of special elements may include one or more of piezoelectric ceramic filters, crystal filters, and surface acoustic wave filters. It should be noted that the specific frequency range is not limited to the above-mentioned range and may also be other ranges, which may be selected according to actual conditions.
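  • as a minimal sketch of the band selection described above, the following Python snippet keeps the 30Hz-150Hz components of a raw EMG trace; the sampling rate, the filter order and the zero-phase filtering choice are illustrative assumptions, and the band edges can be widened to 10Hz-500Hz or 15Hz-300Hz as mentioned above:

```python
import numpy as np
from scipy import signal

def bandlimit_emg(emg, fs=1000, low=30.0, high=150.0, order=4):
    """Retain only the low-high Hz components of a raw EMG trace."""
    nyq = 0.5 * fs
    b, a = signal.butter(order, [low / nyq, high / nyq], btype="bandpass")
    # filtfilt filters forward and backward, so no phase delay is introduced
    return signal.filtfilt(b, a, emg)
```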
  • preprocessing the EMG signal in the frequency domain or the time domain may further include performing signal correction processing on the EMG signal in the time domain.
  • Signal correction processing refers to the correction of singular points (for example, spur signals, discontinuous signals, etc.) in the EMG signal.
  • performing signal correction processing on the EMG signal in the time domain may include determining singular points in the EMG signal, i.e., determining sudden changes in the EMG signal. A singular point can be a sudden change in the amplitude of the EMG signal at a certain moment, resulting in a discontinuity of the signal.
  • alternatively, the EMG signal may be relatively smooth in shape with no sudden change in amplitude, while the first-order derivative of the EMG signal changes suddenly and is discontinuous.
  • the method of determining singular points in the EMG signal may include, but is not limited to, one or more of Fourier transform, wavelet transform, fractal dimension, and the like.
  • performing signal correction processing on the EMG signal in the time domain may include removing singular points in the EMG signal, eg, removing the singular point and signals within a period of time near the singular point.
  • performing signal correction processing on the EMG signal in the time domain may include correcting the singular point of the EMG signal according to the characteristic information of the EMG signal within a specific time range, for example, adjusting the amplitude of the singular point according to the signals around it.
  • the characteristic information of the EMG signal may include one or more of amplitude information and statistical information of the amplitude information.
  • the statistical information of the amplitude information (also called the amplitude entropy) refers to the distribution of the amplitude information of the EMG signal in the time domain.
  • after the position of the singular point (for example, the corresponding time point) in the EMG signal is determined through a signal processing algorithm (for example, Fourier transform, wavelet transform, fractal dimension), the singular point can be corrected according to the EMG signal within a specific time range before or after that position.
  • for example, when the singular point is a sudden trough, the EMG signal at the trough can be complemented based on the characteristic information (for example, amplitude information, statistical information of the amplitude information) of the EMG signal within a specific time range (for example, 5ms-60ms) around the trough.
  • FIG. 9 is an exemplary flowchart of EMG signal preprocessing according to some embodiments of the present application. As shown in FIG. 9, the process 900 may include:
  • in step 910, based on the time domain window of the EMG signal, different time windows are selected from the time domain window of the EMG signal, where the different time windows cover different time ranges respectively.
  • this step may be performed by processing module 220 and/or processing device 110 .
  • the different time windows may include at least one specific window.
  • a specific window refers to a window with a specific time length selected in the time domain window. For example, when the time length of the time domain window of the EMG signal is 3s, the time length of the specific window may be 100ms.
  • a specific window may include multiple different time windows.
  • the specific window may include a first time window and a second time window, and the first time window may refer to a window corresponding to a partial time length within the specific window.
  • the first time window may be 80ms.
  • the second time window may refer to another window corresponding to a partial time length within the specific window.
  • for example, when the specific window is 100ms, the second time window may be 20ms.
  • the first time window and the second time window may be consecutive time windows within the same specific window.
  • the first time window and the second time window may also be two discontinuous or overlapping time windows within the same specific window. For example, when the time length of the window in a specific time range is 100ms, the time length of the first time window may be 80ms, and the time length of the second time window may be 25ms.
  • the processing module 220 may, based on the time domain window of the EMG signal, slide and update the specific window sequentially from the time starting point of the time domain window according to a specific time length, and may divide the updated specific window into the first time window and the second time window.
  • the specific time length mentioned here can be less than 1s, 2s, 3s and so on.
  • the processing module 220 may select a specific window with a specific time length of 100 ms, and divide the specific window into a first time window of 80 ms and a second time window of 20 ms. Further, the specific window can be updated by sliding in the time direction.
  • the sliding distance here may be the time length of the second time window (eg, 20ms), or may be other suitable time lengths, such as 30ms, 40ms, and so on.
  • in step 920, the glitch signal is determined based on the characteristic information corresponding to the EMG signal within the different time windows.
  • this step may be performed by processing module 220 and/or processing device 110 .
  • the feature information corresponding to the electromyographic signal may include at least one of amplitude information and statistical information of the amplitude information.
  • the processing module 220 may obtain amplitude information corresponding to the EMG signal or statistical information of the amplitude information in different time windows (eg, the first time window, the second time window) to determine the location of the spur signal. For a specific description of determining the position of the spur signal based on the feature information corresponding to the EMG signal in different time windows, reference may be made to FIG. 10 and its related descriptions.
  • the above description about the process 900 is only for example and illustration, and does not limit the scope of application of this specification.
  • various modifications and changes can be made to the process 900 under the guidance of this specification.
  • the specific window is not limited to include the above-mentioned first time window and second time window, and may also include other time windows, such as a third time window, a fourth time window, and the like.
  • the specific range of the time before or after the position of the glitch signal can be adaptively adjusted according to the length of the glitch signal, which is not further limited herein. However, these corrections and changes are still within the scope of this specification.
  • FIG. 10 is an exemplary flowchart of glitch signal removal according to some embodiments of the present application. As shown in FIG. 10, the process 1000 may include:
  • in step 1010, first amplitude information corresponding to the EMG signal within the first time window and second amplitude information corresponding to the EMG signal within the second time window are determined.
  • this step may be performed by processing module 220 and/or processing device 110 .
  • the processing module 220 may select the time lengths of the first time window and the second time window, and extract the first amplitude information corresponding to the EMG signal within the time length of the first time window and the second amplitude information corresponding to the EMG signal within the time length of the second time window.
  • the first amplitude information may include an average amplitude of the EMG signal within the first time window
  • the second amplitude information may include the average amplitude of the EMG signal within the second time window.
  • for example, the processing module 220 can select the time length of the first time window to be 80ms and extract the first amplitude information corresponding to the EMG signal within the first time window, and can select the time length of the second time window to be 20ms and extract the second amplitude information corresponding to the EMG signal within the second time window.
  • the selection of the time length of the first time window and the time length of the second time window is related to the shortest glitch signal length and the calculation amount of the system.
  • the first time window duration and the second time window duration may be selected according to the characteristics of the glitch signal.
  • for example, the time length of an ECG glitch signal is 40ms-100ms, the time interval between two glitch signals in the ECG signal may be about 1s, the two sides of the peak point of a glitch signal are basically symmetrical, and the amplitude distribution on both sides of the glitch signal is relatively even.
  • based on these characteristics, a time length less than the glitch signal length (for example, half of the glitch signal length) may be selected as the time length of the second time window, and the time length of the first time window may be greater than that of the second time window, for example, 4 times the length of the second time window. In some embodiments, the time length of the first time window only needs to be within the range of the glitch interval (about 1s) minus the length of the second time window.
  • the time length of the first time window and the time length of the second time window are not limited to the above description; it is sufficient that the sum of the two time lengths is less than the time interval of two adjacent glitch signals, that the time length of the second time window is less than the length of a single glitch signal, and that the EMG signal amplitude in the second time window is well distinguishable from the EMG signal amplitude in the first time window.
  • in step 1020, it is determined whether the ratio of the second amplitude information to the first amplitude information is greater than a threshold.
  • this step may be performed by processing module 220 and/or processing device 110 .
  • the processing module 220 may determine whether the ratio of the second amplitude information corresponding to the EMG signal in the second time window to the first amplitude information corresponding to the EMG signal in the first time window is greater than a threshold.
  • the threshold here may be stored in the memory or hard disk of the wearable device 130, or may be stored in the processing device 110, or adjusted according to the actual situation.
  • if the ratio is greater than the threshold, step 1020 may proceed to step 1030.
  • if the ratio is not greater than the threshold, step 1020 may proceed to step 1040.
  • in step 1030, signal correction processing is performed on the EMG signal within the second time window.
  • this step may be performed by processing module 220 and/or processing device 110 .
  • the processing module 220 may perform signal correction processing on the EMG signal within the second time window according to the judgment result in step 1020 on the relationship between the ratio of the second amplitude information to the first amplitude information and the threshold. For example, in some embodiments, if the ratio of the second amplitude information to the first amplitude information is greater than the threshold, the EMG signal in the second time window corresponding to the second amplitude information is a glitch signal.
  • processing the EMG signal within the second time window may include performing signal correction processing on the EMG signal within the second time window based on the EMG signal within a specific time range before or after the second time window.
  • the manner of performing signal correction processing on the EMG signal within the second time window may include, but is not limited to, filling, interpolation, and the like.
  • the specific time range may be 5ms-60ms.
  • the specific time range may be 10ms-50ms.
  • the specific time range may be 20ms-40ms. It should be noted that the specific time range is not limited to the above-mentioned range, for example, the specific time range may also be greater than 60ms, or less than 5ms and other ranges. In practical application scenarios, adaptive adjustment can be made according to the time length of the glitch signal.
  • in step 1040, the EMG signal within the second time window is retained.
  • this step may be performed by processing module 220 and/or processing device 110 .
  • the processing module 220 may retain the EMG signal within the second time window according to the judgment result in step 1020 on the relationship between the ratio of the second amplitude information to the first amplitude information and the threshold. For example, in some embodiments, if the ratio of the second amplitude information to the first amplitude information is not greater than the threshold, the EMG signal in the second time window corresponding to the second amplitude information is a normal EMG signal and can be retained, that is, the EMG signal within the second time window is retained.
  • judging and removing the glitch signal in the EMG signal based on the process 1000 can realize real-time processing of the glitch signal, so that the wearable device 130 or the mobile terminal device 140 can feed back the user's motion state in real time and help the user exercise more scientifically.
  • the time length corresponding to the first time window may be greater than the time length corresponding to the second time window.
  • the specific time length corresponding to the specific window may be less than 1 s.
  • the ratio of the time length corresponding to the first time window to the time length corresponding to the second time window may be greater than 2.
  • a proper selection of the time length corresponding to the first time window, the time length corresponding to the second time window, and the specific time length corresponding to the specific window can, on the one hand, ensure that the shortest glitch signal (for example, 40ms) can be removed, and on the other hand keep the calculation amount of the system relatively small, reducing repeated calculation and time complexity and thereby improving the calculation efficiency and calculation accuracy of the system.
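  • a minimal sketch of the two-window judgment in process 1000 is shown below; the 80ms/20ms window lengths follow the example above, while the ratio threshold, the sliding step and the linear-interpolation correction are illustrative assumptions:

```python
import numpy as np

def remove_glitches(emg, fs=1000, win1_ms=80, win2_ms=20, ratio_threshold=3.0):
    """Slide a specific window over the EMG trace and correct glitch segments."""
    emg = emg.astype(float).copy()
    n1 = int(win1_ms * fs / 1000)    # first (reference) time window
    n2 = int(win2_ms * fs / 1000)    # second (tested) time window
    for begin in range(0, len(emg) - n1 - n2, n2):   # slide by n2 each step
        first = emg[begin:begin + n1]
        second = emg[begin + n1:begin + n1 + n2]
        mean1 = np.mean(np.abs(first)) + 1e-12       # avoid division by zero
        mean2 = np.mean(np.abs(second))
        if mean2 / mean1 > ratio_threshold:
            # treat the second window as a glitch and rebuild it by linear
            # interpolation between the samples on either side of the window
            left = emg[begin + n1 - 1]
            right = emg[begin + n1 + n2]
            emg[begin + n1:begin + n1 + n2] = np.linspace(left, right, n2)
    return emg
```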
  • the above description about the process 1000 is only for example and illustration, and does not limit the scope of application of this specification.
  • various modifications and changes can be made to the process 1000 under the guidance of this specification.
  • the above process 1000 is only an example in which the singular point is a glitch signal; the above steps (e.g., step 1010, step 1020, step 1030, etc.) and their solutions can be adjusted, or other methods can be adopted to perform the signal correction processing.
  • these corrections and changes are still within the scope of this specification.
  • other methods may also be used to perform signal correction processing on singular points of the EMG signal, for example, a high-pass method, a low-pass method, a band-pass method, a wavelet transform reconstruction method, and the like.
  • a 100 Hz high-pass filter can be used to remove spur signals.
  • other signal processing methods may also be performed on the EMG signal, such as filtering, signal amplification, phase adjustment, and the like.
  • the user's EMG signal collected by the EMG sensor can be converted into a digital EMG signal through an analog-to-digital converter (ADC), and the converted digital EMG signal can be filtered; the filtering can remove the power frequency signal and its harmonic signals, etc.
  • the processing of the electromyographic signal may also include removing motion artifacts of the user.
  • the motion artifact here refers to the signal noise generated by the relative movement of the muscle at the position to be measured relative to the EMG module when the user moves during the acquisition of the EMG signal.
  • the gesture signal may be acquired by a gesture sensor on the wearable device 130 .
  • the gesture sensors on the wearable device 130 may be distributed on the limbs (eg, arms, legs, etc.) of the human body, the torso (eg, chest, abdomen, back, waist, etc.) of the human body, the head of the human body, and the like.
  • the attitude sensor can realize the acquisition of attitude signals of other parts of the human body, such as limbs and torso.
  • the attitude sensor may also be a sensor of an attitude measurement unit (AHRS) with attitude fusion algorithm.
  • the attitude fusion algorithm can fuse the data of a nine-axis inertial measurement unit (IMU) comprising a three-axis acceleration sensor, a three-axis angular velocity sensor, and a three-axis geomagnetic sensor into Euler angles or quaternions to obtain the attitude signal of the user's body part where the attitude sensor is located.
  • the processing module 220 and/or the processing device 110 may determine the feature information corresponding to the attitude signal based on the attitude signal.
  • the feature information corresponding to the attitude signal may include, but is not limited to, an angular velocity value, a direction of the angular velocity, an acceleration value of the angular velocity, and the like.
  • the gesture sensor may be a strain sensor, and the strain sensor may acquire the bending direction and bending angle of the joint of the user, thereby acquiring the gesture signal when the user moves.
  • a strain sensor can be set at the user's knee joint. When the user moves, the user's body part acts on the strain sensor, and the bending direction and bending angle of the user's knee joint can be calculated based on the resistance or length change of the strain sensor. Thereby, the posture signal of the user's leg is obtained.
  • the attitude sensor may further comprise a fiber optic sensor, and the attitude signal may be characterized by a change in the direction of the light ray of the fiber optic sensor after bending.
  • the attitude sensor can also be a magnetic flux sensor, and the attitude signal can be characterized by the transformation of the magnetic flux.
  • the type of the attitude sensor is not limited to the above-mentioned sensors, and may also be other sensors, and the sensors that can obtain the user's attitude signal are all within the scope of the attitude sensor in this specification.
  • FIG. 11 is an exemplary flowchart of determining feature information corresponding to a gesture signal according to some embodiments of the present application. As shown in FIG. 11, the process 1100 may include:
  • in step 1110, a target coordinate system and a conversion relationship between the target coordinate system and at least one original coordinate system are acquired.
  • this step may be performed by processing module 220 and/or processing device 110 .
  • the original coordinate system refers to the coordinate system corresponding to the gesture sensor disposed on the human body.
  • the posture sensors on the wearable device 130 are distributed at different parts of the human body, so the installation angles of the posture sensors on the human body are different, and the posture sensors at different parts each take the coordinate system of their own body as a reference.
  • the attitude sensors of different parts have different original coordinate systems.
  • the gesture signals obtained by each gesture sensor may be expressed in its corresponding original coordinate system.
  • the target coordinate system refers to a human body coordinate system established based on the human body.
  • for example, the length direction of the human torso (that is, the direction perpendicular to the transverse plane of the human body) can be used as the Z axis, the front-back direction of the human torso (that is, the direction perpendicular to the coronal plane of the human body) can be used as the X axis, and the left-right direction of the human torso (that is, the direction perpendicular to the sagittal plane of the human body) can be used as the Y axis.
  • a conversion relationship exists between the target coordinate system and the original coordinate system, and the coordinate information in the original coordinate system can be converted into coordinate information in the target coordinate system through the conversion relationship.
  • the transformation relationship may be represented as one or more rotation matrices.
  • in step 1120, based on the conversion relationship, the coordinate information in the at least one original coordinate system is converted into coordinate information in the target coordinate system.
  • this step may be performed by processing module 220 and/or processing device 110 .
  • the coordinate information in the original coordinate system refers to the three-dimensional coordinate information in the original coordinate system.
  • the coordinate information in the target coordinate system refers to the three-dimensional coordinate information in the target coordinate system.
  • the coordinate information v1 in the original coordinate system can be converted into the coordinate information v2 in the target coordinate system according to the conversion relationship.
  • the coordinate information v1 and the coordinate information v2 can be converted through rotation matrices, and the rotation matrices here can be understood as the conversion relationship between the original coordinate system and the target coordinate system.
  • specifically, the coordinate information v1 in the original coordinate system can be converted into coordinate information v1-1 through a first rotation matrix, the coordinate information v1-1 can be converted into coordinate information v1-2 through a second rotation matrix, the coordinate information v1-2 can be converted into coordinate information v1-3 through a third rotation matrix, and the coordinate information v1-3 is the coordinate information v2 in the target coordinate system.
  • the rotation matrix is not limited to the above-mentioned first rotation matrix, second rotation matrix and third rotation matrix, and may also include fewer or more rotation matrices. In some alternative embodiments, the rotation matrix may also be one rotation matrix or a combination of multiple rotation matrices.
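  • by way of illustration, the chain of rotation matrices can be written directly in Python with numpy/scipy; the three Euler angles below are hypothetical calibration results, and the point is only that the chain collapses into a single rotation matrix:

```python
import numpy as np
from scipy.spatial.transform import Rotation as R

# hypothetical calibration output: three successive rotations mapping the
# original (sensor) coordinate system onto the target (body) coordinate system
R1 = R.from_euler("x", -12.0, degrees=True).as_matrix()
R2 = R.from_euler("y", -7.5, degrees=True).as_matrix()
R3 = R.from_euler("z", 30.0, degrees=True).as_matrix()

v1 = np.array([0.0, 0.0, 1.0])   # coordinate information in the original frame
v1_1 = R1 @ v1                   # after the first rotation matrix
v1_2 = R2 @ v1_1                 # after the second rotation matrix
v2 = R3 @ v1_2                   # coordinate information in the target frame

# the three matrices can equivalently be combined into one rotation matrix
R_total = R3 @ R2 @ R1
assert np.allclose(v2, R_total @ v1)
```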
  • in step 1130, the feature information corresponding to the posture signal is determined based on the coordinate information in the target coordinate system.
  • this step may be performed by processing module 220 and/or processing device 110 .
  • determining the feature information corresponding to the user's posture signal based on the coordinate information in the target coordinate system may include determining the feature information based on multiple pieces of coordinate information in the target coordinate system during the user's movement. For example, when the user performs a seated chest clamp, the arms raised in front of the body may correspond to first coordinate information in the target coordinate system, and the arms opened into the same plane as the torso may correspond to second coordinate information in the target coordinate system; based on the first coordinate information and the second coordinate information, the feature information corresponding to the user's posture signal, for example, the angular velocity value, the direction of the angular velocity, the acceleration value of the angular velocity, etc., can be calculated.
  • the relative motion between different moving parts of the user's body can also be determined by using the feature information corresponding to the gesture sensors located at different positions of the user's body.
  • the relative movement between the arm and the torso during the user's movement can be determined by using the feature information corresponding to the posture sensor on the user's arm and the feature information corresponding to the posture sensor on the user's torso.
  • FIG. 12 is an exemplary flowchart of determining relative motion between different moving parts of a user according to some embodiments of the present application. As shown in Figure 12, process 1200 may include:
  • in step 1210, the feature information corresponding to at least two sensors is determined based on the conversion relationships between the original coordinate systems and the target coordinate system.
  • this step may be performed by processing module 220 and/or processing device 110 .
  • different sensors have different conversion relationships between the original coordinate system corresponding to the sensor and the target coordinate system due to different installation positions on the human body.
  • the processing device 110 may convert the coordinate information in the original coordinate systems corresponding to the sensors at different parts of the user (for example, the forearm, the upper arm, the torso, etc.) into coordinate information in the target coordinate system, so as to determine the feature information corresponding to the at least two sensors respectively. Relevant descriptions of converting coordinate information in the original coordinate system into coordinate information in the target coordinate system can be found elsewhere in this application, for example, in FIG. 11, and will not be repeated here.
  • in step 1220, the relative motion between different moving parts of the user is determined based on the feature information corresponding to the at least two sensors respectively.
  • this step may be performed by processing module 220 and/or processing device 110 .
  • a moving part may refer to a limb of the human body that can move independently, for example, the forearm, the upper arm, the lower leg, the thigh, and the like. As an example only, when the user lifts a dumbbell with the arm, combining the coordinate information in the target coordinate system corresponding to the sensor on the forearm with the coordinate information in the target coordinate system corresponding to the sensor on the upper arm can determine the relative motion between the user's forearm and upper arm, so that the user's dumbbell-lifting action can be determined.
  • multiple sensors of the same or different types can also be placed on the same moving part of the user, and the coordinate information in the original coordinate systems corresponding to these sensors can be converted into coordinate information in the target coordinate system respectively. For example, multiple sensors of the same or different types can be placed at different positions of the user's forearm, and the multiple pieces of coordinate information in the target coordinate system corresponding to these sensors can jointly represent the movement of the user's forearm.
  • the coordinate information in the target coordinate system corresponding to a plurality of sensors of the same type can be averaged, thereby improving the accuracy of the coordinate information of the moving part during the user's movement.
  • the coordinate information in the target coordinate system may be obtained by using a fusion algorithm (eg, Kalman filter, etc.) for the coordinate information in the coordinate systems corresponding to multiple different types of sensors.
  • FIG. 13 is an exemplary flowchart of determining the conversion relationship between the original coordinate system and a specific coordinate system according to some embodiments of the present application.
  • the process of determining the transformation relationship between the original coordinate system and the specific coordinate system may also be called a calibration process. As shown in Figure 13, process 1300 may include:
  • in step 1310, a specific coordinate system is constructed.
  • this step may be performed by processing module 220 and/or processing device 110 .
  • the transformation relationship between the at least one original coordinate system and the target coordinate system may be obtained through a calibration process.
  • the specific coordinate system refers to the reference coordinate system used to determine the transformation relationship between the original coordinate system and the target coordinate system during the calibration process.
  • the constructed specific coordinate system may take the length direction of the torso as the Z axis when the human body is standing, the X axis in the front and rear direction of the human body, and the Y axis in the left and right directions of the human torso.
  • the particular coordinate system is related to the orientation of the user during calibration.
  • for example, during calibration the front of the user's body faces a certain fixed direction (for example, north); then the front (north) direction of the human body is the X axis.
  • the direction of the X-axis is fixed.
  • in step 1320, the first coordinate information in the at least one original coordinate system when the user is in the first posture is acquired.
  • this step may be performed by acquisition module 210 .
  • the first posture may be a posture in which the user maintains an approximately standing posture.
  • the acquisition module 210 (e.g., a sensor) may acquire the first coordinate information in the original coordinate system based on the user's first posture.
  • in step 1330, the second coordinate information in the at least one original coordinate system when the user is in the second posture is acquired.
  • this step may be performed by acquisition module 210 .
  • the second gesture may be a gesture in which the part of the user's body (eg, the arm) on which the sensor is located leans forward.
  • the acquisition module 210 (e.g., a sensor) may acquire the second coordinate information in the original coordinate system based on the user's second posture (e.g., a forward-leaning posture).
  • in step 1340, the conversion relationship between the at least one original coordinate system and the specific coordinate system is determined according to the first coordinate information, the second coordinate information, and the specific coordinate system.
  • this step may be performed by processing module 220 and/or processing device 110 .
  • the first rotation matrix may be determined according to the first coordinate information corresponding to the first posture. In the first posture, the Euler angles of the specific coordinate system in the X and Y directions under the ZYX rotation order are 0, while the Euler angles of the original coordinate system in the X and Y directions are not necessarily 0; the first rotation matrix is therefore the rotation matrix obtained by reversely rotating the original coordinate system around the X axis and then reversely rotating the result around the Y axis.
  • the second rotation matrix may be determined from second coordinate information of the second gesture (eg, the body part where the sensor is located leaning forward).
  • the second rotation matrix is the rotation matrix obtained by reversely rotating the original coordinate system around the Y direction and then reversely rotating it around the Z direction.
  • the conversion relationship between the original coordinate system and the specific coordinate system can be determined through the above-mentioned first rotation matrix and second rotation matrix.
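  • the following Python sketch shows how the two calibration postures might yield the two rotation matrices; the ZYX rotation order, the quaternion input format and the sign conventions are assumptions that would need to match the actual sensor, so this is a sketch rather than the calibration algorithm of this application:

```python
from scipy.spatial.transform import Rotation as R

def calibration_rotation(quat_standing, quat_leaning):
    """Derive the original-to-specific frame rotation from two postures.

    quat_standing / quat_leaning: sensor orientations (x, y, z, w) sampled
    while the user holds the first (standing) and second (leaning) postures.
    """
    # first rotation matrix: undo the X and Y Euler angles observed while
    # standing, since they should be 0 in the specific coordinate system
    _, y1, x1 = R.from_quat(quat_standing).as_euler("ZYX")
    rot1 = R.from_euler("Y", -y1) * R.from_euler("X", -x1)

    # second rotation matrix: after rot1, undo the residual Y and Z Euler
    # angles observed in the forward-leaning posture
    z2, y2, _ = (rot1 * R.from_quat(quat_leaning)).as_euler("ZYX")
    rot2 = R.from_euler("Z", -z2) * R.from_euler("Y", -y2)

    # combined conversion from the original to the specific coordinate system
    return rot2 * rot1
```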
  • the above-mentioned method can be used to determine the conversion relationship between each original coordinate system and a specific coordinate system.
  • the first posture is not limited to a posture in which the user maintains an approximately standing position, and the second posture is not limited to a posture in which the user's body part (for example, an arm) where the sensor is located leans forward.
  • the pose can be approximated as a stationary pose during the calibration process.
  • the first gesture and/or the second gesture may also be a dynamic gesture during the calibration process.
  • the walking posture of the user is a relatively fixed posture
  • the angles and angular velocities of the arms, legs, and feet can be extracted during the walking process, and actions such as stepping forward and swinging the arms forward can be identified.
  • the second gesture is not limited to one action, and multiple actions may be extracted as the second gesture.
  • the coordinate information of multiple actions is fused to obtain a more accurate rotation matrix.
  • some signal processing algorithms can be used to dynamically correct the rotation matrix to obtain a better transformation matrix in the whole calibration process.
  • machine learning algorithms or other algorithms can be used to automatically identify certain specific actions and update the rotation matrix in real time. For example, when the machine learning algorithm recognizes that the user is currently walking or standing, the calibration process starts automatically; in this case the wearable device does not need an explicit calibration process, and the rotation matrix is dynamically updated while the user uses the wearable device.
  • the installation position of the attitude sensor can be relatively fixed, and a rotation matrix can be preset in the corresponding algorithm, which can make the recognition process of a specific action more accurate. Further, in the process of using the wearable device, the user continues to correct the rotation matrix, so that the obtained rotation matrix is closer to the real situation.
  • FIG. 14 is an exemplary flowchart of determining the transformation relationship between the original coordinate system and the target coordinate system according to some embodiments of the present application. As shown in Figure 14, process 1400 may include:
  • in step 1410, the conversion relationship between the specific coordinate system and the target coordinate system is acquired.
  • this step may be performed by processing module 220 and/or processing device 110 .
  • both the specific coordinate system and the target coordinate system take the length direction of the human torso as the Z axis, so the conversion relationship between the specific coordinate system and the target coordinate system can be obtained through the conversion relationship between the X axis of the specific coordinate system and the X axis of the target coordinate system and the conversion relationship between the Y axes of the two coordinate systems.
  • for the principle of acquiring the conversion relationship between the specific coordinate system and the target coordinate system, reference may be made to FIG. 13 and its related contents.
  • the specific coordinate system may take the length direction of the human torso as the Z axis and the front-rear direction of the human body during calibration as the X axis. Since the front-rear direction of the user's body changes during movement (e.g., a turning movement) and cannot stay aligned with the calibrated coordinate system, it is necessary to determine a coordinate system that rotates with the human body, that is, the target coordinate system. In some embodiments, the target coordinate system may change with the orientation of the user, with the X axis of the target coordinate system always pointing directly in front of the human torso.
  • in step 1420, the conversion relationship between the at least one original coordinate system and the target coordinate system is determined according to the conversion relationship between the at least one original coordinate system and the specific coordinate system and the conversion relationship between the specific coordinate system and the target coordinate system.
  • this step may be performed by processing module 220 and/or processing device 110 .
  • the processing device 110 may, according to the conversion relationship between the at least one original coordinate system and the specific coordinate system determined in the process 1300, and the conversion relationship between the specific coordinate system and the target coordinate system determined in step 1410, A conversion relationship between at least one original coordinate system and the target coordinate system is determined, so that coordinate information in the original coordinate system can be converted into coordinate information in the target coordinate system.
  • the position of the posture sensor on the wearable device 130 may change and/or the installation angle of the posture sensor on the human body may differ, so when the user performs the same action, the posture data returned by the posture sensor may differ significantly.
  • in FIG. 15A, the frame line part may represent the Euler angle data (coordinate information) in the original coordinate system corresponding to the forearm position when the user performs an action.
  • the result of the Euler angle vector in the Z-axis direction (shown as "Z" in FIG. 15A) within the frame line portion is approximately in the range of -180° to -80°, the Euler angle vector result in the Y-axis direction (shown as "Y" in FIG. 15A) fluctuates approximately around 0°, and the Euler angle vector result in the X-axis direction (shown as "X" in FIG. 15A) fluctuates approximately around -80°.
  • the fluctuation range here can be 20°.
  • FIG. 15B is an exemplary vector coordinate diagram of Euler angle data in another original coordinate system at the position of the human forearm according to some embodiments of the present application.
  • the frame line part can represent the Euler angle data in the original coordinate system corresponding to another position of the forearm when the user performs the same action (the same action as the action shown in FIG. 15A ).
  • the result of the Euler angle vector in the Z-axis direction (shown as "Z'" in FIG. 15B) within the frame line portion is approximately in the range of -180° to 180°, and the results in the Y-axis and X-axis directions likewise differ from those in FIG. 15A.
  • the Euler angle data shown in FIG. 15A and FIG. 15B are Euler angle data (coordinate information) in the original coordinate systems obtained when the user performs the same action with sensors at different positions of the forearm (which can also be understood as different installation angles of the attitude sensor at the forearm position). Comparing FIG. 15A and FIG. 15B shows that, depending on the installation angle of the attitude sensor on the human body, the Euler angle data in the original coordinate system returned by the attitude sensor can be quite different when the user performs the same action; for example, the Euler angle vector in the Z-axis direction is approximately in the range of -180° to -80° in FIG. 15A but approximately in the range of -180° to 180° in FIG. 15B.
  • the Euler angle data in the original coordinate system corresponding to the sensors with different installation angles can be converted into Euler angle data in the target coordinate system, so as to facilitate the analysis of the attitude signals of the sensors at different positions.
  • the straight line on which the left arm is located can be abstracted as a unit vector from the elbow to the wrist, where the unit vector is a coordinate value in the target coordinate system.
  • the target coordinate system here is defined as the axis pointing to the back of the human body is the X axis, the axis pointing to the right side of the human body is the Y axis, and the axis pointing above the human body is the Z axis, which conforms to the right-handed coordinate system.
  • the coordinate value [-1, 0, 0] in the target coordinate system indicates that the arm is raised horizontally forward; the coordinate value [0, -1, 0] of the target coordinate system indicates that the arm is raised horizontally to the left.
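  • under that convention, a unit limb vector can be recovered from target-frame Euler angles by rotating the resting arm direction; the numbers below are hypothetical and only demonstrate the convention stated above:

```python
import numpy as np
from scipy.spatial.transform import Rotation as R

# convention from the text: X points to the back, Y to the right, Z upward
# (right-handed); an arm hanging straight down is then [0, 0, -1]
arm_rest = np.array([0.0, 0.0, -1.0])

# hypothetical target-frame Euler angles of the forearm (ZYX order)
forearm_rot = R.from_euler("ZYX", [0.0, 90.0, 0.0], degrees=True)

arm_vector = forearm_rot.apply(arm_rest)
print(np.round(arm_vector, 2))   # [-1, 0, 0]: arm raised horizontally forward
```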
  • FIG. 16A is an exemplary vector coordinate diagram of Euler angle data in the target coordinate system at a position of the human forearm according to some embodiments of the present application.
  • FIG. 16A is a graph obtained after the Euler angle data in the original coordinate system of the forearm in FIG. 15A are converted into vector coordinates in the target coordinate system, where the frame line part may represent the Euler angle data in the target coordinate system at the forearm position when the user performs an action.
  • as shown in FIG. 16A, the forearm vector [x, y, z] in the frame line portion reciprocates between a first position and a second position, where the first position is [0.2, -0.9, -0.38] and the second position is [0.1, -0.95, -0.3]. It should be noted that each time the forearm reciprocates, there is a small deviation between the first position and the second position.
  • FIG. 16B is an exemplary vector coordinate diagram of Euler angle data in the target coordinate system at another position of the human forearm according to some embodiments of the present application.
  • FIG. 16B is a graph obtained after the Euler angle data in the original coordinate system of the forearm in FIG. 15B are converted into vector coordinates in the target coordinate system, where the frame line part may represent the Euler angle data in the target coordinate system at another forearm position when the user performs the same action as in FIG. 16A.
  • as shown in FIG. 16B, the forearm vector [x, y, z] also reciprocates between the first position [0.2, -0.9, -0.38] and the second position [0.1, -0.95, -0.3], which shows that after conversion into the target coordinate system the posture data of sensors at different positions are highly consistent.
  • FIG. 17 is a vector coordinate diagram of a limb vector in a target coordinate system according to some embodiments of the present application.
  • curves 17-1 to 17-5 may respectively represent different limb positions of the human body, for example, the left forearm (17-1), the right forearm (17-2), the left upper arm (17-3), the right upper arm (17-4), and so on; FIG. 17 shows the vector coordinates of each position (e.g., 17-1, 17-2, 17-3, 17-4, 17-5) in the target coordinate system when the human body moves.
  • the first 4200 points in FIG. 17 correspond to the calibration actions required to calibrate the limbs, such as standing, leaning the trunk forward, extending the arms, and raising the arms laterally.
  • the raw data collected by the attitude sensor can be converted into Euler angles in the target coordinate system, and can be further converted into the coordinate vector of the arm vector in the target coordinate system.
  • the target coordinate system here is the X-axis pointing to the front of the torso, the Y-axis pointing to the left side of the torso, and the Z-axis pointing to the top of the torso.
  • the reciprocating movements in FIG. 17 are, from left to right, Action 1 to Action 6: seated chest clamp, lat pull-down, seated chest press, seated shoulder press, barbell biceps curl, and seated chest clamp.
  • Action 1 and Action 6 both represent the seated chest clamp action, and the curves of these two movements show good repeatability.
  • the attitude data (e.g., Euler angles, angular velocity, etc.) directly output by the module in the original coordinate system can be converted into attitude data in the target coordinate system through the processes 1300 and 1400, so that highly consistent attitude data (e.g., Euler angles, angular velocity, limb vector coordinates, etc.) can be obtained.
  • FIG. 18A is an exemplary vector coordinate diagram of raw angular velocity according to some embodiments of the present application.
  • the original angular velocity can be understood as the angular velocity obtained after the data in the original coordinate systems of sensors with different installation angles are converted into the target coordinate system, before any filtering is applied.
  • factors such as shaking during the user's movement may affect the result of the angular velocity in the gesture data.
  • the vector coordinate curve of the original angular velocity therefore presents a noticeably uneven shape; for example, sudden changes appear in the curve, so that the vector coordinate curve of the original angular velocity is not smooth.
  • FIG. 18B is a graph of exemplary results of filtered angular velocities according to some embodiments of the present application. As shown in FIG. 18B , after low-pass filtering of 1Hz-3Hz is performed on the original angular velocity, the influence of jitter on the angular velocity (for example, sudden change signal) can be eliminated, so that the vector coordinate graph corresponding to the angular velocity can present a relatively smooth curve.
  • the low-pass filtering processing of 1Hz-3Hz on the angular velocity can effectively avoid the influence of jitter on the attitude data (for example, Euler angles, angular velocity, etc.), which is more convenient for the subsequent process of signal segmentation.
  • the filtering process may also filter out the power frequency signal and its harmonic signal, spur signal, etc. in the action signal. It should be noted that the low-pass filtering processing of 1Hz-3Hz will introduce a system delay, so that the action point obtained by the attitude signal and the action point of the real EMG signal are temporally misaligned.
  • the system delay generated in the low-pass filtering process is subtracted to ensure the synchronization of the attitude signal and the EMG signal in time.
  • the system delay is associated with the center frequency of the filter.
  • the system delay is adaptively adjusted according to the center frequency of the filter.
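  • a minimal sketch of this filtering and delay compensation is given below; the 2Hz cutoff sits in the 1Hz-3Hz range mentioned above, while the filter order and the group-delay estimate are illustrative assumptions:

```python
import numpy as np
from scipy import signal

def smooth_angular_velocity(omega, fs=100, cutoff=2.0, order=2):
    """Low-pass a raw angular velocity trace and compensate the filter delay."""
    b, a = signal.butter(order, cutoff / (0.5 * fs), btype="low")
    filtered = signal.lfilter(b, a, omega)
    # estimate the group delay (in samples) near DC and shift the output back
    # so the posture signal stays aligned with the EMG signal in time
    _, gd = signal.group_delay((b, a))
    delay = int(round(np.mean(gd[:5])))
    return np.roll(filtered, -delay)   # wrapped tail samples should be dropped
```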
  • since the angle range of Euler angles is [-180°, +180°], the obtained Euler angles may jump from -180° to +180° or from +180° to -180°. For example, when the angle changes to -181°, the Euler angle jumps to 179°. In practical applications, such jumps affect the calculation of angle differences, so the jumps need to be corrected first.
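  • one common way to correct such jumps before differencing, shown here only as an illustration, is to unwrap the angle sequence:

```python
import numpy as np

def unwrap_euler(angles_deg):
    """Remove the +/-180 degree jumps so angle differences stay continuous."""
    return np.degrees(np.unwrap(np.radians(angles_deg)))

trace = np.array([-175.0, -179.0, 179.0, 175.0])  # raw trace with a jump
print(unwrap_euler(trace))                        # [-175. -179. -181. -185.]
```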
  • a motion recognition model may also be used to analyze the user's motion signal or feature information corresponding to the motion signal, so as to recognize the user's motion.
  • the action recognition model includes a machine learning model trained to recognize user actions.
  • the action recognition model may include one or more machine learning models.
  • the action recognition model may include, but is not limited to, one or more of a machine learning model for classifying user action signals, a machine learning model for identifying the quality of user actions, a machine learning model for identifying the number of user actions, and a machine learning model for identifying the fatigue level of the user when performing actions.
  • the machine learning model may include one or more of a linear classification model (LR), a support vector machine model (SVM), a naive Bayes model (NB), a K-nearest neighbor model (KNN), a decision tree model (DT), ensemble models (RF/GBDT, etc.), and the like.
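  • purely as an illustration of such a classifier, the sketch below trains a support vector machine on hand-crafted features; the feature layout, the numbers and the labels are fabricated placeholders, not data from this application:

```python
import numpy as np
from sklearn.pipeline import make_pipeline
from sklearn.preprocessing import StandardScaler
from sklearn.svm import SVC

# hypothetical feature rows per segmented action: [EMG mean amplitude,
# EMG amplitude entropy, peak angular velocity, Euler angle range]
X = np.array([
    [0.42, 1.8, 2.1, 65.0],   # label 0: seated chest clamp
    [0.55, 2.0, 1.7, 48.0],   # label 1: biceps curl
    [0.40, 1.7, 2.2, 70.0],
    [0.60, 2.1, 1.6, 45.0],
])
y = np.array([0, 1, 0, 1])

clf = make_pipeline(StandardScaler(), SVC(kernel="rbf"))
clf.fit(X, y)
print(clf.predict([[0.41, 1.75, 2.15, 67.0]]))   # -> [0]
```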
  • FIG. 19 is an exemplary flowchart of a motion monitoring and feedback method according to some embodiments of the present application. As shown in Figure 19, process 1900 may include:
  • in step 1910, an action signal of the user during exercise is acquired.
  • this step may be performed by acquisition module 210 .
  • the action signal at least includes feature information corresponding to the myoelectric signal and feature information corresponding to the gesture signal.
  • the action signal refers to the human body parameter information when the user is exercising.
  • the human body parameter information may include, but is not limited to, one or more of electromyographic signals, posture signals, heart rate signals, temperature signals, humidity signals, blood oxygen concentration, and the like.
  • the motion signal may include at least an EMG signal and a posture signal.
  • the electromyography sensor in the acquisition module 210 can collect the electromyographic signal when the user moves, and the posture sensor in the acquisition module 210 can collect the posture signal when the user moves.
  • in step 1920, the user's motion actions are monitored based on the action signal through the action recognition model, and action feedback is performed based on the output result of the action recognition model.
  • this step may be performed by processing module 220 and/or processing device 110 .
  • the output result of the action recognition model may include, but is not limited to, one or more of action type, action quality, action quantity, fatigue index, and the like.
  • the action recognition model can identify the user's action type as seated chest clamping according to the action signal.
  • one machine learning model in the action recognition model can first identify the user's action type as seated chest clamping according to the action signal, and another machine learning model in the action recognition model can then, based on feature information of the user's action signal (for example, amplitude information, frequency information and/or the angular velocity, angular velocity direction, and angular velocity acceleration value of the attitude signal), output the action quality of the user action as a standard action or an erroneous action.
  • the action feedback may include issuing prompt information.
  • the prompt information may include, but is not limited to, voice prompts, text prompts, image prompts, video prompts, and the like.
  • the processing device 110 may control the wearable device 130 or the mobile terminal device 140 to issue a voice prompt (for example, information such as "Irregular action") to the user to remind the user to adjust the fitness action in time.
  • when the user's action meets the standard, the wearable device 130 or the mobile terminal device 140 may not issue prompt information, or may generate prompt information similar to "action standard".
  • the motion feedback may also include the wearable device 130 stimulating a corresponding part of the user to move.
  • the elements of the wearable device 130 can stimulate the corresponding parts of the user's body through vibration feedback, electrical stimulation feedback, pressure feedback, and the like.
  • the processing device 110 can control the elements of the wearable device 130 to stimulate the corresponding part of the user's movement.
  • the motion feedback may further include outputting a motion record when the user is exercising.
  • the exercise record here may refer to one or more of the user's action type, exercise duration, action quantity, action quality, fatigue index, physiological parameter information during exercise, and the like.
  • FIG. 20 is an exemplary flowchart of the application of model training according to some embodiments of the present application. As shown in Figure 20, the process 2000 may include:
  • step 2010 sample information is obtained.
  • this step may be performed by acquisition module 210 .
  • the sample information may include motion signals during exercise by professionals (eg, fitness trainers) and/or non-professionals.
  • the sample information may include myoelectric signals and/or posture signals generated by professionals and/or non-professionals performing the same type of action (e.g., chest clamping in a sitting position).
  • the EMG signal and/or the posture signal in the sample information can be processed by the segmentation in process 700, the glitch removal in process 900, the conversion in process 1300, etc., to form at least one segment of the EMG signal and/or the posture signal.
  • the at least one segment of the electromyographic signal and/or the posture signal can be used as an input to the machine learning model to train the machine learning model.
  • the feature information corresponding to at least one segment of the EMG signal and/or the feature information corresponding to the posture signal can also be used as the input of the machine learning model to train the machine learning model.
  • the frequency information and amplitude information of the EMG signal can be used as the input of the machine learning model.
  • the angular velocity, angular velocity direction, and the acceleration value of the angular velocity of the attitude signal can be used as the input of the machine learning model.
  • the action start point, the action middle point, and the action end point of the action signal can be used as the input of the machine learning model.
  • the sample information may be obtained from a storage device of the processing device 110 . In some embodiments, the sample information may be obtained from the acquisition module 210 .
  • step 2020 an action recognition model is trained.
  • the action recognition model may include one or more machine learning models.
  • the action recognition model may include, but is not limited to, one or more of a machine learning model for classifying user action signals, a machine learning model for recognizing the quality of user actions, a machine learning model for recognizing the number of user actions, and a machine learning model for recognizing the fatigue level of the user performing the actions.
  • the machine learning model may include one or more of a linear classification model (LR), a support vector machine model (SVM), a naive Bayes model (NB), a K-nearest neighbor model (KNN), a decision tree model (DT), an ensemble model (e.g., RF/GBDT), etc.
  • training the machine learning model may include obtaining sample information.
  • the sample information may include motion signals during exercise by professionals (eg, fitness trainers) and/or non-professionals.
  • the sample information may include myoelectric signals and/or posture signals generated when professionals and/or non-professionals perform the same type of action (eg, chest clamping in a sitting position).
  • the EMG signal and/or the posture signal in the sample information can be processed by the segmentation in process 700, the glitch removal in process 900, the conversion in process 1300, etc., to form at least one segment of the EMG signal and/or the posture signal.
  • the at least one segment of the electromyographic signal and/or the posture signal can be used as an input to the machine learning model to train the machine learning model.
  • the feature information corresponding to at least one segment of the EMG signal and/or the feature information corresponding to the posture signal can also be used as the input of the machine learning model to train the machine learning model.
  • the frequency information and amplitude information of the EMG signal can be used as the input of the machine learning model.
  • the angular velocity, angular velocity direction, and the acceleration value of the angular velocity of the attitude signal can be used as the input of the machine learning model.
  • the signals corresponding to the action start point, the action middle point and/or the action end point of the action signal can be used as the input of the machine learning model.
  • the sample information (each segment of the EMG signal and/or posture signal) from different action types may be labeled.
  • for example, sample information from the EMG signal and/or posture signal generated when the user performs seated chest clamping can be marked as "1", where "1" is used to represent "seated chest clamping"; sample information from the EMG signal and/or posture signal generated when the user performs a biceps curl can be marked as "2", where "2" is used to represent "biceps curl".
  • the characteristic information corresponding to the EMG signal (for example, frequency information, amplitude information) and/or the posture signal (for example, angular velocity, angular velocity direction, and the acceleration value of the angular velocity) in the sample information is used as the input of the machine learning model to train the machine learning model, and an action recognition model for identifying the user's action type can be obtained.
  • inputting an action signal into the trained model can output the corresponding action type.
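As a hedged illustration of training such a type classifier, the sketch below uses scikit-learn's SVM (one of the model families listed above). The feature files and their layout are hypothetical placeholders; this specification does not fix a feature format.

```python
import numpy as np
from sklearn.svm import SVC
from sklearn.model_selection import train_test_split

# Hypothetical feature layout: each row is one segmented action, with EMG
# features (e.g., mean frequency, RMS amplitude) followed by posture features
# (e.g., peak angular velocity, its direction sign, peak angular acceleration).
X = np.load("action_features.npy")  # shape (n_samples, n_features); placeholder file
y = np.load("action_labels.npy")    # 1 = seated chest clamping, 2 = biceps curl

X_train, X_test, y_train, y_test = train_test_split(X, y, test_size=0.2, random_state=0)

clf = SVC(kernel="rbf")             # an SVM, one of the models mentioned above
clf.fit(X_train, y_train)
print("action-type accuracy:", clf.score(X_test, y_test))
```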
  • the action recognition model may further include a machine learning model for judging the quality of the user's action.
  • the sample information here may include standard action signals (also referred to as positive samples) and non-standard action signals (also referred to as negative samples).
  • Standard action signals may include action signals generated when the professional performs standard actions. For example, an action signal generated by a professional when performing a standard sitting posture chest-clamping exercise is a standard action signal.
  • the non-standard action signal may include an action signal generated by a user performing a non-standard action (eg, a wrong action).
  • the EMG signal and/or the posture signal in the sample information can be processed by the segmentation in process 700, the glitch removal in process 900, the conversion in process 1300, etc., to form at least one segment of the EMG signal and/or the posture signal.
  • the at least one segment of the electromyographic signal and/or the posture signal can be used as an input to the machine learning model to train the machine learning model.
  • the positive samples and negative samples in the sample information (each segment of the EMG signal and/or posture signal) can be labeled. For example, positive samples are marked with "1" and negative samples are marked with "0".
  • the trained machine learning model can output different labels according to the input sample information (eg, positive samples, negative samples).
  • the action recognition model may include one or more machine learning models for analyzing and recognizing the quality of user actions, and different machine learning models may analyze and recognize sample information from different action types respectively.
  • the motion recognition model may further include a model that recognizes the number of motions of the user's fitness motion.
  • the action signals (for example, the EMG signal and/or the posture signal) can undergo the segmentation processing of process 700 to obtain at least one group of action start points, action middle points, and action end points, and the start point, middle point, and end point of each group of actions are marked respectively.
  • the start point of the action is marked as 1
  • the middle point of the action is marked as 2
  • the end point of the action is marked as 3.
  • the mark is used as the input of the machine learning model.
  • inputting a group of consecutive "1", "2", "3" markers outputs 1 action. For example, inputting 3 consecutive groups of "1", "2", "3" into the machine learning model can output 3 actions.
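The counting rule can be sketched as follows; this is an illustrative reading of the marker scheme (1 = start, 2 = middle, 3 = end), not the exact algorithm of this application.

```python
def count_actions(markers):
    """Count complete actions from a stream of phase markers.

    1 = action start point, 2 = action middle point, 3 = action end point.
    One action is counted each time a full 1 -> 2 -> 3 sequence is observed.
    """
    count, expected = 0, 1
    for m in markers:
        if m == expected:
            if m == 3:
                count += 1
                expected = 1
            else:
                expected += 1
        elif m == 1:          # a new start resets an interrupted sequence
            expected = 2
    return count

print(count_actions([1, 2, 3, 1, 2, 3, 1, 2, 3]))  # -> 3
```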
  • the action recognition model may also include a machine learning model for identifying the user fatigue index.
  • the sample information here may also include other physiological parameter signals such as ECG signals, respiratory rate, temperature signals, humidity signals, and the like.
  • different frequency ranges of the ECG signal can be used as input data for the machine learning model: an ECG frequency between 60 beats/min and 100 beats/min is marked as "1" (normal), and a frequency of less than 60 beats/min or greater than 100 beats/min is marked as "2" (abnormal).
  • further segmentation can be performed according to the frequency of the user's ECG signal and different indices can be marked as input data, and the trained machine learning model can output the corresponding fatigue index according to the frequency of the ECG signal.
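A minimal sketch of this labeling rule is given below; the 60-100 beats/min band follows the text above, while any finer-grained fatigue bands would be illustrative extensions.

```python
def fatigue_label(beats_per_min: float) -> int:
    """Map ECG frequency to the coarse labels described above.

    60-100 beats/min -> 1 (normal); outside that range -> 2 (abnormal).
    Finer bands could be added as extra labels for model training.
    """
    return 1 if 60 <= beats_per_min <= 100 else 2

print([fatigue_label(bpm) for bpm in (55, 72, 110)])  # -> [2, 1, 2]
```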
  • the machine learning model can also be trained in combination with physiological parameter signals such as respiratory rate and temperature signal.
  • the sample information may be obtained from a storage device of the processing device 110 .
  • the sample information may be obtained from the acquisition module 210 .
  • the action recognition model may be any one of the above-mentioned machine learning models, or may be a combination of the above-mentioned multiple machine learning models, or include other machine learning models, which may be selected according to actual conditions.
  • the training input to the machine learning model is not limited to a segment (one cycle) of action signals, but may also be part of the action signals in a segment of signals, or multiple segments of action signals, and the like.
  • step 2030 the action recognition model is extracted.
  • this step may be performed by processing device 110.
  • the processing device 110 and/or the processing module 220 may extract the motion recognition model.
  • the motion recognition model may be stored in the processing device 110, the processing module 220 or the mobile terminal.
  • step 2040 a user action signal is acquired.
  • this step may be performed by acquisition module 210 .
  • the myoelectric sensor in the acquisition module 210 may acquire the user's myoelectric signal
  • the gesture sensor in the acquisition module 210 may acquire the user's gesture signal.
  • the user action signal may also include other physiological parameter signals, such as an electrocardiogram signal, a breathing signal, a temperature signal, a humidity signal, and the like when the user is exercising.
  • the motion signal (for example, the EMG signal and/or the gesture signal) may be subjected to the segmentation processing of process 700, the glitch processing of process 900, the conversion processing of process 1300, etc., to form at least one segment of the EMG signal and/or posture signal.
  • step 2050 the user action is determined based on the user action signal through the action recognition model.
  • This step may be performed by processing device 110 and/or processing module 220 .
  • the processing device 110 and/or the processing module 220 may determine the user action based on the action recognition model.
  • the trained action recognition model may include one or more machine learning models.
  • the action recognition model may include, but is not limited to, one or more of a machine learning model for classifying user action signals, a machine learning model for identifying the quality of user actions, a machine learning model for identifying the number of user actions, and a machine learning model for identifying the fatigue index of the user performing the actions. Different machine learning models can have different recognition effects. For example, a machine learning model for classifying user action signals can take the user action signals as input data and then output corresponding action types.
  • a machine learning model for identifying the quality of a user's action may take the user's action signal as input data and then output the quality of the action (eg, standard action, wrong action).
  • the machine learning model for identifying the fatigue index of the user performing actions may take the user's action signal (eg, the frequency of the ECG signal) as input data and then output the user's fatigue index.
  • the user action signal and the judgment result (output) of the machine learning model can also be used as sample information for training the action recognition model, and the action recognition model is trained to optimize the relevant parameters of the action recognition model.
  • the action recognition model is not limited to the above-mentioned trained machine learning model; it can also be a preset model, such as a manually preset condition judgment algorithm, or manually added parameters (e.g., confidence) on the basis of the trained machine learning model, etc.
  • step 2060 the user action is fed back based on the judgment result.
  • this step may be performed by the wearable device 130 and/or the mobile terminal device 140 .
  • the processing device 110 and/or the processing module 220 sends a feedback instruction to the wearable device 130 and/or the mobile terminal device 140 based on the judgment result of the user's action, and the wearable device 130 and/or the mobile terminal device 140 gives feedback to the user based on the feedback instruction.
  • the feedback may include issuing prompt information (e.g., text information, picture information, video information, voice information, indicator light information, etc.) and/or stimulating the user's body by performing corresponding actions (e.g., electrical stimulation, vibration, pressure change, heat change, etc.).
  • for example, the input/output module 260 in the wearable device 130 (for example, a vibration reminder) and the mobile terminal device 140 (for example, a smart watch, a smart phone, etc.) can perform feedback actions (for example, applying vibration to a part of the user's body, issuing voice prompts, etc.).
  • the mobile terminal device 140 can output the corresponding motion record, so that users can understand their movements during exercise.
  • the feedback when providing feedback to the user, can be matched to user perception. For example, when the user's action is not standard, vibration stimulation is applied to the area corresponding to the user's action, and the user can know that the action is not standard based on the vibration stimulation, and the vibration stimulation is within the acceptable range of the user. Further, a matching model can be established based on the user action signal and user perception, to find the best balance between user perception and real feedback.
  • the motion recognition model can also be trained according to the user motion signal.
  • training the motion recognition model based on the user motion signal may include evaluating the user motion signal to determine a confidence level of the user motion signal.
  • the magnitude of the confidence level can represent the quality of the user action signal. For example, the higher the confidence, the better the quality of the user action signal.
  • evaluating the user motion signal may be performed in stages of acquiring the motion signal, preprocessing, segmentation, and/or identification.
  • training the action recognition model according to the user action signal may further include judging whether the confidence level is greater than a confidence threshold (for example, 80): if the confidence level is greater than or equal to the confidence threshold, the user action signal corresponding to that confidence level is used as sample data to train the action recognition model; if the confidence level is less than the confidence threshold, the user action signal corresponding to that confidence level is not used as sample data to train the action recognition model.
  • the confidence level may include, but is not limited to, the confidence level of any stage of acquisition of motion signals, signal preprocessing, signal segmentation, or signal identification. For example, the confidence level of the action signal collected by the acquisition module 210 is used as the judgment criterion.
  • the confidence level may also be a joint confidence level of any stages of acquisition of motion signals, signal preprocessing, signal segmentation, or signal identification.
  • the joint confidence can be calculated based on the confidence of each stage and by averaging or weighting.
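The joint confidence and the threshold gate described above can be sketched as follows; the equal-weight default and the threshold of 80 are illustrative values taken from the example in the text, not fixed by this application.

```python
def joint_confidence(stage_confidences, weights=None):
    """Combine per-stage confidences (acquisition, preprocessing,
    segmentation, identification) by averaging or weighting."""
    if weights is None:
        return sum(stage_confidences) / len(stage_confidences)
    return sum(c * w for c, w in zip(stage_confidences, weights)) / sum(weights)

CONFIDENCE_THRESHOLD = 80  # example threshold mentioned above

def accept_as_training_sample(stage_confidences, weights=None) -> bool:
    """Keep a user action signal as training data only if its joint
    confidence reaches the threshold."""
    return joint_confidence(stage_confidences, weights) >= CONFIDENCE_THRESHOLD

print(accept_as_training_sample([90, 85, 80, 75]))  # True  (mean 82.5)
print(accept_as_training_sample([90, 60, 70, 75]))  # False (mean 73.75)
```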
  • training the action recognition model according to the user action signal may be performed in real time, periodically (e.g., every day, week, or month), or when a certain amount of data has been accumulated.
  • aspects of this application may be illustrated and described in several patentable categories or situations, including any new and useful process, machine, product, or composition of matter, or any new and useful improvement thereof. Accordingly, various aspects of the present application may be performed entirely by hardware, entirely by software (including firmware, resident software, microcode, etc.), or by a combination of hardware and software.
  • the above hardware or software may be referred to as a "data block”, “module”, “engine”, “unit”, “component” or “system”.
  • aspects of the present application may be embodied as a computer product comprising computer readable program code embodied in one or more computer readable media.
  • the computer program code required for the operation of the various parts of this application may be written in any one or more programming languages, including object-oriented programming languages such as Java, Scala, Smalltalk, Eiffel, JADE, Emerald, C++, C#, VB.NET, and Python; conventional procedural programming languages such as C, Visual Basic, Fortran 2003, Perl, COBOL 2002, PHP, and ABAP; dynamic programming languages such as Python, Ruby, and Groovy; or other programming languages.
  • the program code may run entirely on the user's computer, or as a stand-alone software package on the user's computer, or partly on the user's computer and partly on a remote computer, or entirely on the remote computer or processing device.
  • the remote computer can be connected to the user's computer through any network, such as a local area network (LAN) or wide area network (WAN), or connected to an external computer (e.g., through the Internet), or in a cloud computing environment, or used as a service, e.g., software as a service (SaaS).

Abstract

A motion monitoring method (500), comprising: acquiring an action signal of a user during exercise, wherein the action signal includes at least an electromyographic (EMG) signal or a posture signal (510); and monitoring the user's exercise action based at least on feature information corresponding to the EMG signal or feature information corresponding to the posture signal (520).

Description

A motion monitoring method and system thereof — Technical Field
This application relates to the technical field of wearable devices, and in particular to a motion monitoring method and system.
Background
With growing attention to scientific exercise and physical health, motion monitoring devices are developing rapidly. At present, motion monitoring devices mainly monitor some physiological parameter information during the user's exercise (e.g., heart rate, body temperature, cadence, blood oxygen, etc.) but cannot accurately monitor and give feedback on the user's actions. In practice, monitoring and giving feedback on user actions often requires the participation of professionals. For example, in a fitness scenario, a user can usually only correct fitness actions continuously under the guidance of a fitness coach.
Therefore, it is desirable to provide a motion monitoring device that can guide a person's exercise, thereby helping users exercise scientifically.
Summary of the Invention
One aspect of the present application provides a motion monitoring method, including: acquiring an action signal of a user during exercise, where the action signal includes at least an EMG signal or a posture signal; and monitoring the user's exercise action based at least on feature information corresponding to the EMG signal or feature information corresponding to the posture signal.
In some embodiments, monitoring the user's exercise action based at least on the feature information corresponding to the EMG signal or the posture signal includes: segmenting the action signal based on the feature information corresponding to the EMG signal or the feature information corresponding to the posture signal; and monitoring the user's exercise action based on at least one segment of the action signal.
In some embodiments, the feature information corresponding to the EMG signal includes at least frequency information or amplitude information, and the feature information corresponding to the posture signal includes at least one of angular velocity direction, angular velocity value and acceleration value of angular velocity, angle, displacement information, and stress.
In some embodiments, segmenting the action signal based on the feature information corresponding to the EMG signal or the posture signal includes: determining at least one target feature point from a time-domain window of the EMG signal or the posture signal according to a preset condition; and segmenting the action signal based on the at least one target feature point.
In some embodiments, the at least one target feature point includes one of an action start point, an action middle point, and an action end point.
In some embodiments, the preset condition includes one or more of: the angular velocity direction corresponding to the posture signal changing; the angular velocity corresponding to the posture signal being greater than or equal to an angular velocity threshold; the change of the angular velocity value corresponding to the posture signal being an extreme value; the angle corresponding to the posture signal reaching an angle threshold; and the amplitude information corresponding to the EMG signal being greater than or equal to an EMG threshold.
In some embodiments, the preset condition further includes the acceleration of the angular velocity corresponding to the posture signal being continuously greater than or equal to an acceleration threshold of the angular velocity within a first specific time range.
In some embodiments, the preset condition further includes the amplitude corresponding to the EMG signal being continuously greater than the EMG threshold within a second specific time range.
In some embodiments, monitoring the user's exercise action based at least on the feature information corresponding to the EMG signal or the posture signal includes: preprocessing the EMG signal in the frequency domain or the time domain; acquiring the feature information corresponding to the EMG signal based on the preprocessed EMG signal; and monitoring the user's exercise action according to the feature information corresponding to the EMG signal or the feature information corresponding to the posture signal.
In some embodiments, preprocessing the EMG signal in the frequency or time domain includes filtering the EMG signal to select components within a specific frequency range in the frequency domain.
In some embodiments, preprocessing the EMG signal in the frequency or time domain includes performing signal correction processing on the EMG signal in the time domain.
In some embodiments, performing signal correction processing on the EMG signal in the time domain includes: determining singular points in the EMG signal, where a singular point corresponds to an abrupt change in the EMG signal; and performing signal correction processing on the singular points of the EMG signal.
In some embodiments, performing signal correction processing on the singular points of the EMG signal includes removing the singular points or correcting them according to the signals around the singular points.
In some embodiments, the singular point includes a glitch signal, and determining the singular points in the EMG signal includes: based on the time-domain window of the EMG signal, selecting different time windows from the time-domain window of the EMG signal, where the different time windows respectively cover different time ranges; and determining the glitch signal based on feature information corresponding to the EMG signal in the different time windows.
In some embodiments, the method further includes determining feature information corresponding to the posture signal based on the posture signal, where the posture signal includes coordinate information in at least one original coordinate system; determining the feature information corresponding to the posture signal includes: acquiring a target coordinate system and a conversion relationship between the target coordinate system and the at least one original coordinate system; converting the coordinate information in the at least one original coordinate system into coordinate information in the target coordinate system based on the conversion relationship; and determining the feature information corresponding to the posture signal based on the coordinate information in the target coordinate system.
In some embodiments, the posture signal includes coordinate information generated by at least two sensors, which are located at different moving parts of the user and correspond to different original coordinate systems; determining the feature information corresponding to the posture signal includes: determining feature information corresponding to each of the at least two sensors based on the conversion relationships between the different original coordinate systems and the target coordinate system; and determining the relative motion between the user's different moving parts based on the feature information corresponding to each of the at least two sensors.
In some embodiments, the conversion relationship between the at least one original coordinate system and the target coordinate system is obtained through a calibration process, which includes: constructing a specific coordinate system related to the user's orientation during calibration; acquiring first coordinate information in the at least one original coordinate system when the user is in a first posture; acquiring second coordinate information in the at least one original coordinate system when the user is in a second posture; and determining the conversion relationship between the at least one original coordinate system and the specific coordinate system according to the first coordinate information, the second coordinate information, and the specific coordinate system.
In some embodiments, the calibration process further includes: acquiring the conversion relationship between the specific coordinate system and the target coordinate system; and determining the conversion relationship between the at least one original coordinate system and the target coordinate system according to the conversion relationship between the at least one original coordinate system and the specific coordinate system and the conversion relationship between the specific coordinate system and the target coordinate system.
In some embodiments, the target coordinate system changes as the user's orientation changes.
Another aspect of the present application provides a training method for an action recognition model, including: acquiring sample information, where the sample information includes an action signal of a user during exercise, the action signal including at least feature information corresponding to an EMG signal and feature information corresponding to a posture signal; and training the action recognition model based on the sample information.
Another aspect of the present application further provides a motion monitoring and feedback method, including: acquiring an action signal of a user during exercise, where the action signal includes at least an EMG signal and a posture signal; and monitoring the user's action through an action recognition model based on the feature information corresponding to the EMG signal and the feature information corresponding to the posture signal, and performing action feedback based on the output result of the action recognition model.
In some embodiments, the action recognition model includes a trained machine learning model or a preset model.
In some embodiments, the action feedback includes at least one of issuing prompt information, stimulating the user's moving parts, and outputting a motion record of the user's exercise.
Brief Description of the Drawings
The present application is further described by way of exemplary embodiments, which are described in detail with reference to the accompanying drawings. These embodiments are not restrictive; in these embodiments, the same numbers denote the same structures, wherein:
FIG. 1 is a schematic diagram of an application scenario of a motion monitoring system according to some embodiments of the present application;
FIG. 2 is a schematic diagram of exemplary hardware and/or software of a wearable device according to some embodiments of the present application;
FIG. 3 is a schematic diagram of exemplary hardware and/or software of a computing device according to some embodiments of the present application;
FIG. 4 is an exemplary structural diagram of a wearable device according to some embodiments of the present application;
FIG. 5 is an exemplary flowchart of a motion monitoring method according to some embodiments of the present application;
FIG. 6 is an exemplary flowchart of monitoring a user's exercise action according to some embodiments of the present application;
FIG. 7 is an exemplary flowchart of action signal segmentation according to some embodiments of the present application;
FIG. 8 is an exemplary normalized result graph of action signal segmentation according to some embodiments of the present application;
FIG. 9 is an exemplary flowchart of EMG signal preprocessing according to some embodiments of the present application;
FIG. 10 is an exemplary flowchart of glitch signal removal according to some embodiments of the present application;
FIG. 11 is an exemplary flowchart of determining feature information corresponding to a posture signal according to some embodiments of the present application;
FIG. 12 is an exemplary flowchart of determining relative motion between different moving parts of a user according to some embodiments of the present application;
FIG. 13 is an exemplary flowchart of determining the conversion relationship between an original coordinate system and a specific coordinate system according to some embodiments of the present application;
FIG. 14 is an exemplary flowchart of determining the conversion relationship between an original coordinate system and a target coordinate system according to some embodiments of the present application;
FIG. 15A is an exemplary vector coordinate graph of Euler angle data in an original coordinate system at a position of the human forearm according to some embodiments of the present application;
FIG. 15B is an exemplary vector coordinate graph of Euler angle data in an original coordinate system at another position of the human forearm according to some embodiments of the present application;
FIG. 16A is an exemplary vector coordinate graph of Euler angle data in the target coordinate system at a position of the human forearm according to some embodiments of the present application;
FIG. 16B is an exemplary vector coordinate graph of Euler angle data in the target coordinate system at another position of the human forearm according to some embodiments of the present application;
FIG. 17 is an exemplary vector coordinate graph of Euler angle data in the target coordinate system for multiple sensors according to some embodiments of the present application;
FIG. 18A is an exemplary result graph of raw angular velocity according to some embodiments of the present application;
FIG. 18B is an exemplary result graph of angular velocity after filtering according to some embodiments of the present application;
FIG. 19 is an exemplary flowchart of a motion monitoring and feedback method according to some embodiments of the present application;
FIG. 20 is an exemplary flowchart of an application of model training according to some embodiments of the present application.
Detailed Description
In order to more clearly illustrate the technical solutions of the embodiments of the present application, the accompanying drawings used in the description of the embodiments are briefly introduced below. Obviously, the drawings described below are only some examples or embodiments of the present application, and those of ordinary skill in the art can apply the present application to other similar scenarios based on these drawings without creative effort. Unless obvious from the context or otherwise stated, the same reference numerals in the figures denote the same structures or operations.
It should be understood that "system", "device", "unit" and/or "module" as used herein is a way of distinguishing different components, elements, parts, sections, or assemblies at different levels. However, these words may be replaced by other expressions that achieve the same purpose.
As used in this application and the claims, unless the context clearly indicates otherwise, the words "a", "an", "one" and/or "the" do not specifically refer to the singular and may include the plural. Generally speaking, the terms "comprise" and "include" only indicate the inclusion of clearly identified steps and elements, which do not constitute an exclusive list; a method or device may also include other steps or elements.
Flowcharts are used in this application to illustrate operations performed by a system according to embodiments of the present application. It should be understood that the preceding or following operations are not necessarily performed exactly in order; instead, the steps may be processed in reverse order or simultaneously. Meanwhile, other operations may be added to these processes, or one or more steps may be removed from them.
This specification provides a motion monitoring system that can acquire an action signal of a user during exercise, where the action signal includes at least an EMG signal, a posture signal, an ECG signal, a respiratory rate signal, and the like. The system can monitor the user's exercise action based at least on feature information corresponding to the EMG signal or feature information corresponding to the posture signal. For example, the user's action type, action quantity, action quality, action time, or physiological parameter information when the user performs the action can be determined through the frequency information and amplitude information corresponding to the EMG signal and the angular velocity, angular velocity direction and angular velocity value of the angular velocity, angle, displacement information, stress, etc. corresponding to the posture signal. In some embodiments, the motion monitoring system can also generate feedback on the user's fitness action according to the analysis result of the user's fitness action, so as to guide the user's fitness. For example, when the user's fitness action is not standard, the motion monitoring system can issue prompt information (e.g., a voice prompt, vibration prompt, electrical stimulation, etc.) to the user. The motion monitoring system can be applied to wearable devices (e.g., clothing, wristbands, helmets), medical testing equipment (e.g., EMG testers), fitness equipment, etc. By acquiring the user's action signal during exercise, the system can accurately monitor and give feedback on the user's actions without the participation of professionals, which can improve the user's fitness efficiency while reducing the cost of fitness.
FIG. 1 is a schematic diagram of an application scenario of a motion monitoring system according to some embodiments of the present application. As shown in FIG. 1, the motion monitoring system 100 may include a processing device 110, a network 120, a wearable device 130, and a mobile terminal device 140. The motion monitoring system 100 can acquire action signals used to characterize the user's motion actions (e.g., EMG signals, posture signals, ECG signals, respiratory rate signals, etc.) and monitor and give feedback on the user's actions during exercise according to the user's action signals.
For example, the motion monitoring system 100 can monitor and give feedback on the user's actions during fitness. When the user wears the wearable device 130 for fitness exercise, the wearable device 130 can acquire the user's action signal. The processing device 110 or the mobile terminal device can receive and analyze the user's action signal to judge whether the user's fitness action is standard, so as to monitor the user's action. Specifically, monitoring the user's action may include determining the action type, action quantity, action quality, action time, or physiological parameter information when the user performs the action. Further, the motion monitoring system 100 can generate feedback on the user's fitness action according to the analysis result, so as to guide the user's fitness.
For another example, the motion monitoring system 100 can monitor and give feedback on the user's actions while running. For example, when the user wears the wearable device 130 for running, the motion monitoring system 100 can monitor whether the user's running action is standard and whether the running time meets health standards. When the user runs too long or the running action is incorrect, the fitness device can feed back the exercise state to the user to prompt the user to adjust the running action or running time.
In some embodiments, the processing device 110 can be used to process information and/or data related to the user's exercise. For example, the processing device 110 can receive the user's action signal (e.g., EMG signal, posture signal, ECG signal, respiratory rate signal, etc.) and further extract feature information corresponding to the action signal (e.g., feature information corresponding to the EMG signal and feature information corresponding to the posture signal in the action signal). In some embodiments, the processing device 110 can perform specific signal processing on the EMG signal or posture signal collected by the wearable device 130, such as signal segmentation and signal preprocessing (e.g., signal correction processing, filtering processing, etc.). In some embodiments, the processing device 110 can also judge whether the user's action is correct based on the user's action signal. For example, the processing device 110 can judge whether the user's action is correct based on feature information corresponding to the EMG signal (e.g., amplitude information, frequency information, etc.). For another example, the processing device 110 can judge whether the user's action is correct based on feature information corresponding to the posture signal (e.g., angular velocity, angular velocity direction, acceleration of angular velocity, angle, displacement information, stress, etc.). For still another example, the processing device 110 can judge whether the user's action is correct based on both the feature information corresponding to the EMG signal and that corresponding to the posture signal. In some embodiments, the processing device 110 can also judge whether the user's physiological parameter information during exercise meets health standards. In some embodiments, the processing device 110 can also issue corresponding instructions to feed back the user's exercise situation. For example, when the user is running and the motion monitoring system 100 detects that the running time is too long, the processing device 110 can issue an instruction to the mobile terminal device 140 to prompt the user to adjust the running time. It should be noted that the feature information corresponding to the posture signal is not limited to the above angular velocity, angular velocity direction, acceleration of angular velocity, angle, displacement information, stress, etc.; any parameter information that can reflect the relative motion of the user's body can be feature information corresponding to the posture signal. For example, when the posture sensor is a strain sensor, the bending angle and bending direction at the user's joint can be obtained by measuring the resistance of the strain sensor that changes with the stretched length.
In some embodiments, the processing device 110 may be local or remote. For example, the processing device 110 can access information and/or data stored in the wearable device 130 and/or the mobile terminal device 140 through the network 120. In some embodiments, the processing device 110 can connect directly with the wearable device 130 and/or the mobile terminal device 140 to access information and/or data stored therein. For example, the processing device 110 may be located in the wearable device 130 and exchange information with the mobile terminal device 140 through the network 120; or it may be located in the mobile terminal device 140 and exchange information with the wearable device 130 through the network. In some embodiments, the processing device 110 may be executed on a cloud platform, which may include one or any combination of a private cloud, public cloud, hybrid cloud, community cloud, distributed cloud, internal cloud, etc.
In some embodiments, the processing device 110 can process data and/or information related to motion monitoring to perform one or more of the functions described in this application. In some embodiments, the processing device can acquire the action signals collected by the wearable device 130 during the user's exercise. In some embodiments, the processing device can send control instructions to the wearable device 130 or the mobile terminal device 140. The control instructions can control the on/off state of the wearable device 130 and its sensors, and can also control the mobile terminal device 140 to issue prompt information. In some embodiments, the processing device 110 may include one or more sub-processing devices (e.g., single-core or multi-core processing devices). Merely by way of example, the processing device 110 may include a central processing unit (CPU), an application-specific integrated circuit (ASIC), an application-specific instruction processor (ASIP), a graphics processing unit (GPU), a physical processing unit (PPU), a digital signal processor (DSP), a field-programmable gate array (FPGA), a programmable logic device (PLD), a controller, a microcontroller unit, a reduced instruction set computer (RISC), a microprocessor, etc., or any combination thereof.
The network 120 can facilitate the exchange of data and/or information in the motion monitoring system 100. In some embodiments, one or more components of the motion monitoring system 100 (e.g., the processing device 110, the wearable device 130, the mobile terminal device 140) can send data and/or information to other components through the network 120. For example, the action signal collected by the wearable device 130 can be transmitted to the processing device 110 through the network 120, and the confirmation result of the action signal in the processing device 110 can be transmitted to the mobile terminal device 140 through the network 120. In some embodiments, the network 120 may be any type of wired or wireless network, for example, a cable network, wired network, optical fiber network, telecommunications network, intranet, the Internet, local area network (LAN), wide area network (WAN), wireless local area network (WLAN), metropolitan area network (MAN), public switched telephone network (PSTN), Bluetooth network, ZigBee network, near field communication (NFC) network, etc., or any combination thereof. In some embodiments, the network 120 may include one or more network access points. For example, the network 120 may include wired or wireless network access points such as base stations and/or Internet exchange points 120-1, 120-2, ..., through which one or more components of the motion monitoring system 100 can connect to the network 120 to exchange data and/or information.
The wearable device 130 refers to clothing or equipment with a wearing function. In some embodiments, the wearable device 130 may include, but is not limited to, an upper garment device 130-1, a trouser device 130-2, a wristband device 130-3, shoes 130-4, and the like. In some embodiments, the wearable device 130 may include multiple sensors, which can acquire various action signals during the user's exercise (e.g., EMG signals, posture signals, temperature information, heart rate, ECG signals, etc.). In some embodiments, the sensors may include, but are not limited to, one or more of EMG sensors, posture sensors, temperature sensors, humidity sensors, ECG sensors, blood oxygen saturation sensors, Hall sensors, galvanic skin sensors, rotation sensors, and the like. For example, EMG sensors may be arranged at muscle positions of the human body (e.g., biceps brachii, triceps brachii, latissimus dorsi, trapezius, etc.) in the upper garment device 130-1; the EMG sensors can fit the user's skin and collect EMG signals during the user's exercise. For another example, an ECG sensor may be arranged near the left pectoral muscle in the upper garment device 130-1 to collect the user's ECG signal. For still another example, posture sensors may be arranged at muscle positions of the human body (e.g., gluteus maximus, vastus lateralis, vastus medialis, gastrocnemius, etc.) in the trouser device 130-2 to collect the user's posture signals. In some embodiments, the wearable device 130 can also give feedback on the user's actions. For example, when the action of a certain body part during exercise does not meet the standard, the EMG sensor corresponding to that part can generate a stimulation signal (e.g., electrical stimulation or a striking signal) to remind the user.
It should be noted that the wearable device 130 is not limited to the upper garment device 130-1, trouser device 130-2, wristband device 130-3, and shoe device 130-4 shown in FIG. 1; it may also include other devices requiring motion monitoring, such as helmet devices, knee pad devices, etc., which are not limited here. Any device that can use the motion monitoring method contained in this specification is within the protection scope of this application.
In some embodiments, the mobile terminal device 140 can acquire information or data in the motion monitoring system 100. In some embodiments, the mobile terminal device 140 can receive the motion data processed by the processing device 110 and feed back motion records based on the processed motion data. Exemplary feedback methods may include, but are not limited to, voice prompts, image prompts, video display, text prompts, and the like. In some embodiments, the user can obtain the action records of his/her own exercise through the mobile terminal device 140. For example, the mobile terminal device 140 can be connected with the wearable device 130 through the network 120 (e.g., wired connection, wireless connection), and the user can obtain the action records of the user's exercise through the mobile terminal device 140; the action records can be transmitted to the processing device 110 through the mobile terminal device 140. In some embodiments, the mobile terminal device 140 may include one or any combination of a mobile device 140-1, a tablet computer 140-2, a notebook computer 140-3, and the like. In some embodiments, the mobile device 140-1 may include a mobile phone, a smart home device, a smart mobile device, a virtual reality device, an augmented reality device, etc., or any combination thereof. In some embodiments, the smart home device may include a control device for smart appliances, a smart monitoring device, a smart TV, a smart camera, etc., or any combination thereof. In some embodiments, the smart mobile device may include a smart phone, a personal digital assistant (PDA), a gaming device, a navigation device, a POS device, etc., or any combination thereof. In some embodiments, the virtual reality device and/or augmented reality device may include a virtual reality helmet, virtual reality glasses, virtual reality goggles, an augmented reality helmet, augmented reality glasses, augmented reality goggles, etc., or any combination thereof.
In some embodiments, the motion monitoring system 100 may also include a database. The database may store data (e.g., initially set threshold conditions, etc.) and/or instructions (e.g., feedback instructions). In some embodiments, the database may store data obtained from the wearable device 130 and/or the mobile terminal device 140. In some embodiments, the database may store information and/or instructions for the processing device 110 to execute or use to perform the exemplary methods described in this application. In some embodiments, the database may include mass storage, removable storage, volatile read-write memory (e.g., random access memory (RAM)), read-only memory (ROM), etc., or any combination thereof. In some embodiments, the database may be implemented on a cloud platform, e.g., a private cloud, public cloud, hybrid cloud, community cloud, distributed cloud, internal cloud, etc., or any combination thereof.
In some embodiments, the database may be connected to the network 120 to communicate with one or more components of the motion monitoring system 100 (e.g., the processing device 110, the wearable device 130, the mobile terminal device 140, etc.). One or more components of the motion monitoring system 100 can access the data or instructions stored in the database through the network 120. In some embodiments, the database may be directly connected to or communicate with one or more components of the motion monitoring system 100 (e.g., the processing device 110, the wearable device 130, the mobile terminal device 140). In some embodiments, the database may be part of the processing device 110.
FIG. 2 is a schematic diagram of exemplary hardware and/or software of a wearable device according to some embodiments of the present application. As shown in FIG. 2, the wearable device 130 may include an acquisition module 210, a processing module 220 (also called a processor), a control module 230 (also called a main controller, MCU, controller), a communication module 240, a power supply module 250, and an input/output module 260.
The acquisition module 210 can be used to acquire the action signal of the user during exercise. In some embodiments, the acquisition module 210 may include a sensor unit, which can be used to acquire one or more action signals during the user's exercise. In some embodiments, the sensor unit may include, but is not limited to, one or more of EMG sensors, posture sensors, ECG sensors, respiration sensors, temperature sensors, humidity sensors, inertial sensors, blood oxygen saturation sensors, Hall sensors, galvanic skin sensors, rotation sensors, and the like. In some embodiments, the action signal may include one or more of an EMG signal, posture signal, ECG signal, respiratory rate, temperature signal, humidity signal, and the like. The sensor units can be placed at different positions of the wearable device 130 according to the type of action signal to be acquired. For example, in some embodiments, EMG sensors (also called electrode elements) can be arranged at muscle positions of the human body and configured to collect EMG signals during the user's exercise. The EMG signal and its corresponding feature information (e.g., frequency information, amplitude information, etc.) can reflect the state of the muscles during exercise. Posture sensors can be arranged at different positions of the human body (e.g., positions of the wearable device 130 corresponding to the trunk, limbs, and joints) and configured to collect posture signals during the user's exercise. The posture signal and its corresponding feature information (e.g., angular velocity direction, angular velocity value, angular velocity acceleration value, angle, displacement information, stress, etc.) can reflect the posture of the user's exercise. An ECG sensor can be arranged at a position around the chest of the human body and configured to collect ECG data during the user's exercise. A respiration sensor can be arranged at a position around the chest and configured to collect respiration data during exercise (e.g., respiratory rate, respiratory amplitude, etc.). A temperature sensor can be configured to collect temperature data during exercise (e.g., body surface temperature). A humidity sensor can be configured to collect humidity data of the external environment during the user's exercise.
The processing module 220 can process data from the acquisition module 210, the control module 230, the communication module 240, the power supply module 250, and/or the input/output module 260. For example, the processing module 220 can process the action signal from the acquisition module 210 during the user's exercise. In some embodiments, the processing module 220 can preprocess the action signal (e.g., EMG signal, posture signal) acquired by the acquisition module 210. For example, the processing module 220 performs segmentation processing on the EMG signal or posture signal during the user's exercise. For another example, the processing module 220 can preprocess the EMG signal during the user's exercise (e.g., filtering processing, signal correction processing) to improve the EMG signal quality. For still another example, the processing module 220 can determine feature information corresponding to the posture signal based on the posture signal during the user's exercise. In some embodiments, the processing module 220 can process instructions or operations from the input/output module 260. In some embodiments, the processed data can be stored in memory or a hard disk. In some embodiments, the processing module 220 can transmit its processed data to one or more components of the motion monitoring system 100 through the communication module 240 or the network 120. For example, the processing module 220 can send the monitoring result of the user's exercise to the control module 230, which can perform subsequent operations or instructions according to the action determination result.
The control module 230 can be connected with the other modules in the wearable device 130. In some embodiments, the control module 230 can control the operating state of the other modules in the wearable device 130 (e.g., the communication module 240, the power supply module 250, the input/output module 260). For example, the control module 230 can control the power supply state (e.g., normal mode, power-saving mode) and power supply time of the power supply module 250. When the remaining power of the power supply module 250 falls below a certain threshold (e.g., 10%), the control module 230 can control the power supply module 250 to enter the power-saving mode or issue a prompt about recharging. For another example, the control module 230 can control the input/output module 260 according to the user's action determination result, and then control the mobile terminal device 140 to send feedback about the exercise result to the user. When there is a problem with the user's action during exercise (e.g., the action does not meet the standard), the control module 230 can control the input/output module 260 and then the mobile terminal device 140 to give feedback to the user, so that the user can learn his/her own exercise state in real time and adjust the action. In some embodiments, the control module 230 can also control one or more sensors or other modules in the acquisition module 210 to give feedback to the human body. For example, when a certain muscle exerts too much force during exercise, the control module 230 can control the electrode module at that muscle position to electrically stimulate the user to prompt a timely adjustment of the action.
In some embodiments, the communication module 240 can be used for exchanging information or data. In some embodiments, the communication module 240 can be used for communication among internal components of the wearable device 130 (e.g., the acquisition module 210, the processing module 220, the control module 230, the power supply module 250, the input/output module 260). For example, the acquisition module 210 can send a user action signal (e.g., EMG signal, posture signal, etc.) to the communication module 240, and the communication module 240 can send the action signal to the processing module 220. In some embodiments, the communication module 240 can also be used for communication between the wearable device 130 and other components of the motion monitoring system 100 (e.g., the processing device 110, the mobile terminal device 140). For example, the communication module 240 can send the state information (e.g., on/off state) of the wearable device 130 to the processing device 110, and the processing device 110 can monitor the wearable device 130 based on the state information. The communication module 240 can adopt wired, wireless, and hybrid wired/wireless technologies. Wired technologies may be based on combinations of one or more cable types such as metal cables, hybrid cables, optical cables, and the like. Wireless technologies may include Bluetooth, Wi-Fi, ZigBee, near field communication (NFC), radio frequency identification (RFID), cellular networks (including GSM, CDMA, 3G, 4G, 5G, etc.), cellular-based narrowband Internet of Things (NB-IoT), and the like. In some embodiments, the communication module 240 can encode the transmitted information using one or more encoding methods, e.g., phase encoding, non-return-to-zero coding, differential Manchester coding, and the like. In some embodiments, the communication module 240 can select different transmission and encoding methods according to the data type or network type to be transmitted. In some embodiments, the communication module 240 may include one or more communication interfaces for different communication methods. In some embodiments, the other illustrated modules of the motion monitoring system 100 may be dispersed on multiple devices; in this case, each of the other modules may include one or more communication modules 240 for information transmission between modules. In some embodiments, the communication module 240 may include a receiver and a transmitter. In other embodiments, the communication module 240 may be a transceiver.
In some embodiments, the power supply module 250 can provide power to the other components of the motion monitoring system 100 (e.g., the acquisition module 210, the processing module 220, the control module 230, the communication module 240, the input/output module 260). The power supply module 250 can receive control signals from the processing module 220 to control the power output of the wearable device 130. For example, if the wearable device 130 receives no operation within a certain period (e.g., 1 s, 2 s, 3 s, or 4 s) (e.g., the acquisition module 210 detects no action signal), the power supply module 250 can supply power only to the memory, putting the wearable device 130 into standby mode. For another example, if the wearable device 130 receives no operation within a certain period (e.g., 1 s, 2 s, 3 s, or 4 s) (e.g., the acquisition module 210 detects no action signal), the power supply module 250 can cut off the power to other components, and the data in the motion monitoring system 100 can be transferred to a hard disk, putting the wearable device 130 into standby or sleep mode. In some embodiments, the power supply module 250 may include at least one battery, which may include one or a combination of a dry battery, lead storage battery, lithium battery, solar battery, wind energy battery, mechanical energy battery, thermal energy battery, and the like. The solar battery can convert light energy into electrical energy and store it in the power supply module 250; the wind energy battery can convert wind energy into electrical energy and store it in the power supply module 250; the mechanical energy battery can convert mechanical energy into electrical energy and store it in the power supply module 250. The solar battery may include a silicon solar cell, thin-film solar cell, nanocrystalline chemical solar cell, dye-sensitized solar cell, plastic solar cell, etc., and may be distributed on the wearable device 130 in the form of panels. The thermal energy battery can convert the user's body temperature into electrical energy and store it in the power supply module 250. In some embodiments, when the power of the power supply module 250 is less than a power threshold (e.g., 10% of the total power), the processing module 220 can send a control signal to the power supply module 250, which may include information that the power supply module 250 is low on power. In some embodiments, the power supply module 250 may include a backup power supply. In some embodiments, the power supply module 250 may also include a charging interface. For example, in an emergency (e.g., the power supply module 250 is at zero power and the external power system is down), the power supply module 250 can be temporarily charged using an electronic device carried by the user (e.g., a mobile phone, tablet) or a power bank.
The input/output module 260 can acquire, transmit, and send signals, and can connect or communicate with other components of the motion monitoring system 100; other components of the motion monitoring system 100 can be connected or communicate through the input/output module 260. The input/output module 260 may be a wired USB interface, serial communication interface, or parallel communication interface, or wireless Bluetooth, infrared, radio frequency identification (RFID), WLAN authentication and privacy infrastructure (WAPI), general packet radio service (GPRS), code division multiple access (CDMA), etc., or any combination thereof. In some embodiments, the input/output module 260 can be connected to the network 120 and acquire information through the network 120. For example, the input/output module 260 can acquire the user's action signal from the acquisition module 210 through the network 120 or the communication module 240 and output the user's exercise information. In some embodiments, the input/output module 260 may include VCC, GND, RS-232, RS-485 (e.g., RS485-A, RS485-B), a general network interface, etc., or any combination thereof. In some embodiments, the input/output module 260 can transmit the acquired user exercise information to the acquisition module 210 through the network 120. In some embodiments, the input/output module 260 can encode the transmitted signals using one or more encoding methods, which may include phase encoding, non-return-to-zero coding, differential Manchester coding, etc., or any combination thereof.
It should be understood that the system and its modules shown in FIG. 2 can be implemented in various ways. For example, in some embodiments, the system and its modules can be implemented by hardware, software, or a combination of software and hardware. The hardware part can be implemented with dedicated logic; the software part can be stored in memory and executed by an appropriate instruction execution system, e.g., a microprocessor or specially designed hardware. Those skilled in the art can understand that the above methods and systems can be implemented using computer-executable instructions and/or processor control code, for example provided on a carrier medium such as a magnetic disk, CD or DVD-ROM, a programmable memory such as read-only memory (firmware), or a data carrier such as an optical or electronic signal carrier. The system and its modules of one or more embodiments of this specification can be implemented not only by hardware circuits such as very-large-scale integrated circuits or gate arrays, semiconductors such as logic chips and transistors, or programmable hardware devices such as field-programmable gate arrays and programmable logic devices, but also by software executed by various types of processors, or by a combination of the above hardware circuits and software (e.g., firmware).
It should be noted that the above description of the motion monitoring system and its modules is only for convenience of description and does not limit one or more embodiments of this specification to the scope of the cited embodiments. It can be understood that for those skilled in the art, after understanding the principle of the system, it is possible to arbitrarily combine the modules, form a subsystem connected with other modules, or omit one or more of the modules, without departing from this principle. For example, the acquisition module 210 and the processing module 220 may be one module that has the functions of both acquiring and processing user action signals. For another example, the processing module 220 may not be arranged in the wearable device 130 but integrated in the processing device 110. Such variations are all within the protection scope of one or more embodiments of this specification.
FIG. 3 is a schematic diagram of exemplary hardware and/or software of a computing device according to some embodiments of the present application. In some embodiments, the processing device 110 and/or the mobile terminal device 140 may be implemented on the computing device 300. As shown in FIG. 3, the computing device 300 may include an internal communication bus 310, a processor 320, a read-only memory 330, a random access memory 340, a communication port 350, an input/output interface 360, a hard disk 370, and a user interface 380.
The internal communication bus 310 can implement data communication among the components of the computing device 300. For example, the processor 320 can send data through the internal communication bus 310 to the memory or other hardware such as the input/output interface 360. In some embodiments, the internal communication bus 310 may be an Industry Standard Architecture (ISA) bus, an Extended Industry Standard Architecture (EISA) bus, a Video Electronics Standards Association (VESA) bus, a Peripheral Component Interconnect (PCI) bus, etc. In some embodiments, the internal communication bus 310 can be used to connect the modules of the motion monitoring system 100 shown in FIG. 1 (e.g., the acquisition module 210, the processing module 220, the control module 230, the communication module 240, the input/output module 260).
The processor 320 can execute computing instructions (program code) and perform the functions of the motion monitoring system 100 described in this application. The computing instructions may include programs, objects, components, data structures, procedures, modules, and functions (the functions refer to the specific functions described in this application). For example, the processor 320 can process the action signals (e.g., EMG signals, posture signals) obtained from the wearable device 130 and/or the mobile terminal device 140 of the motion monitoring system 100 during the user's exercise, and monitor the user's exercise actions according to those action signals. In some embodiments, the processor 320 may include a microcontroller, microprocessor, reduced instruction set computer (RISC), application-specific integrated circuit (ASIC), application-specific instruction set processor (ASIP), central processing unit (CPU), graphics processing unit (GPU), physical processing unit (PPU), microcontroller unit, digital signal processor (DSP), field-programmable gate array (FPGA), advanced RISC machine (ARM), programmable logic device, any circuit or processor capable of performing one or more functions, etc., or any combination thereof. For illustration only, the computing device 300 in FIG. 3 describes only one processor, but it should be noted that the computing device 300 in this application may also include multiple processors.
The memory of the computing device 300 (e.g., read-only memory (ROM) 330, random access memory (RAM) 340, hard disk 370, etc.) can store data/information obtained from any other component of the motion monitoring system 100. In some embodiments, the memory of the computing device 300 may be located in the wearable device 130 or in the processing device 110. Exemplary ROM may include mask ROM (MROM), programmable ROM (PROM), erasable programmable ROM (EPROM), electrically erasable programmable ROM (EEPROM), compact disc ROM (CD-ROM), digital versatile disc ROM, etc. Exemplary RAM may include dynamic RAM (DRAM), double data rate synchronous dynamic RAM (DDR SDRAM), static RAM (SRAM), thyristor RAM (T-RAM), zero-capacitor RAM (Z-RAM), etc.
The input/output interface 360 can be used to input or output signals, data, or information. In some embodiments, the input/output interface 360 enables the user to interact with the motion monitoring system 100. For example, the input/output interface 360 may include the communication module 240 to implement the communication functions of the motion monitoring system 100. In some embodiments, the input/output interface 360 may include an input device and an output device. Exemplary input devices may include a keyboard, mouse, touch screen, microphone, etc., or any combination thereof. Exemplary output devices may include a display device, speaker, printer, projector, etc., or any combination thereof. Exemplary display devices may include a liquid crystal display (LCD), a light-emitting diode (LED)-based display, a flat panel display, a curved display, a television device, a cathode ray tube (CRT), etc., or any combination thereof. The communication port 350 can be connected to a network for data communication. The connection may be wired, wireless, or a combination of both. Wired connections may include cables, optical cables, telephone lines, etc., or any combination thereof. Wireless connections may include Bluetooth, Wi-Fi, WiMax, WLAN, ZigBee, mobile networks (e.g., 3G, 4G, or 5G), etc., or any combination thereof. In some embodiments, the communication port 350 may be a standardized port such as RS232 or RS485. In some embodiments, the communication port 350 may be a specially designed port.
The hard disk 370 can be used to store the information and data generated by or received from the processing device 110. For example, the hard disk 370 can store the user's confirmation information. In some embodiments, the hard disk 370 may include a mechanical hard disk (HDD), solid-state drive (SSD), or hybrid hard disk (HHD), etc. In some embodiments, the hard disk 370 may be arranged in the processing device 110 or in the wearable device 130. The user interface 380 can implement interaction and information exchange between the computing device 300 and the user. In some embodiments, the user interface 380 can be used to present the motion records generated by the motion monitoring system 100 to the user. In some embodiments, the user interface 380 may include a physical display, such as a display with speakers, an LCD display, an LED display, an OLED display, an electronic ink display (E-Ink), etc.
FIG. 4 is an exemplary structural diagram of a wearable device according to some embodiments of the present application. To further describe the wearable device, an upper garment is taken as an example. As shown in FIG. 4, the wearable device 400 may include an upper garment 410. The upper garment 410 may include an upper garment base 4110, at least one upper garment processing module 4120, at least one upper garment feedback module 4130, at least one upper garment acquisition module 4140, and the like. The upper garment base 4110 may refer to clothing worn on the upper body of the human body. In some embodiments, the upper garment base 4110 may include a short-sleeved T-shirt, long-sleeved T-shirt, shirt, jacket, etc. The at least one upper garment processing module 4120 and the at least one upper garment acquisition module 4140 may be located on areas of the upper garment base 4110 that fit different parts of the human body. The at least one upper garment feedback module 4130 may be located at any position on the upper garment base 4110 and may be configured to feed back the user's upper-body motion state information. Exemplary feedback methods may include, but are not limited to, voice prompts, text prompts, pressure prompts, electrical stimulation, and the like. In some embodiments, the at least one upper garment acquisition module 4140 may include, but is not limited to, one or more of posture sensors, ECG sensors, EMG sensors, temperature sensors, humidity sensors, inertial sensors, acid-base sensors, acoustic wave transducers, and the like. The sensors in the upper garment acquisition module 4140 may be placed at different positions on the user's body according to the signals to be measured. For example, when a posture sensor is used to acquire the posture signal during the user's exercise, the posture sensor may be placed at positions of the upper garment base 4110 corresponding to the human trunk, arms, and joints. For another example, when an EMG sensor is used to acquire the EMG signal during the user's exercise, the EMG sensor may be located near the muscles to be measured. In some embodiments, the posture sensors may include, but are not limited to, three-axis acceleration sensors, three-axis angular velocity sensors, magnetic sensors, etc., or any combination thereof. For example, one posture sensor may contain a three-axis acceleration sensor and a three-axis angular velocity sensor. In some embodiments, a posture sensor may also include a strain sensor, which may refer to a sensor based on the strain generated by force deformation of the object to be measured. In some embodiments, strain sensors may include, but are not limited to, one or more of strain force sensors, strain pressure sensors, strain torque sensors, strain displacement sensors, strain acceleration sensors, and the like. For example, a strain sensor may be arranged at a joint position of the user; by measuring the resistance of the strain sensor that changes with the stretched length, the bending angle and bending direction at the user's joint can be obtained. It should be noted that, in addition to the above upper garment base 4110, upper garment processing module 4120, upper garment feedback module 4130, and upper garment acquisition module 4140, the upper garment 410 may also include other modules, e.g., a power supply module, communication module, input/output module, etc. The upper garment processing module 4120 is similar to the processing module 220 in FIG. 2, and the upper garment acquisition module 4140 is similar to the acquisition module 210 in FIG. 2; for a specific description of each module in the upper garment 410, reference may be made to the relevant description of FIG. 2 in this application, which will not be repeated here.
FIG. 5 is an exemplary flowchart of a motion monitoring method according to some embodiments of the present application. As shown in FIG. 5, the process 500 may include:
In step 510, an action signal of the user during exercise is acquired.
In some embodiments, step 510 may be performed by the acquisition module 210. The action signal refers to human body parameter information during the user's exercise. In some embodiments, the human body parameter information may include, but is not limited to, one or more of EMG signals, posture signals, ECG signals, temperature signals, humidity signals, blood oxygen concentration, respiratory rate, and the like. In some embodiments, the EMG sensors in the acquisition module 210 can collect the user's EMG signals during exercise. For example, when the user performs seated chest clamping, the EMG sensors in the wearable device corresponding to the positions of the pectoral muscles, latissimus dorsi, etc. can collect the EMG signals at those muscle positions. For another example, when the user performs a squat, the EMG sensors in the wearable device corresponding to the gluteus maximus, quadriceps, etc. can collect the EMG signals at those positions. For still another example, when the user is running, the EMG sensors in the wearable device corresponding to the gastrocnemius, etc. can collect the EMG signals at those positions. In some embodiments, the posture sensors in the acquisition module 210 can collect the posture signals during the user's exercise. For example, when the user performs a barbell bench press, the posture sensors in the wearable device corresponding to the triceps brachii, etc. can collect the posture signals at those positions. For another example, when the user performs dumbbell flyes, the posture sensors arranged at the deltoid muscles, etc. can collect the posture signals at those positions. In some embodiments, there may be multiple posture sensors in the acquisition module 210, which can acquire the posture signals of multiple parts during exercise; the posture signals of multiple parts can reflect the relative motion between different parts of the human body. For example, the posture signal at the arm and the posture signal at the trunk can reflect the motion of the arm relative to the trunk. In some embodiments, the posture signal is associated with the type of posture sensor. For example, when the posture sensor is a three-axis angular velocity sensor, the acquired posture signal is angular velocity information. For another example, when the posture sensor is a three-axis angular velocity sensor plus a three-axis acceleration sensor, the acquired posture signal is angular velocity information and acceleration information. For still another example, when the posture sensor is a strain sensor, it can be arranged at a joint position of the user; by measuring the resistance of the strain sensor that changes with the stretched length, the acquired posture signal may be displacement information, stress, etc., which can characterize the bending angle and bending direction at the user's joint. It should be noted that any parameter information that can reflect the relative motion of the user's body can be feature information corresponding to the posture signal, and different types of posture sensors can be used for acquisition according to the type of feature information.
In some embodiments, the action signal may include an EMG signal of a specific part of the user's body and a posture signal of that specific part. The EMG signal and the posture signal can reflect the motion state of the specific body part from different angles. Simply put, the posture signal of a specific part can reflect the action type, action amplitude, action frequency, etc. of that part, while the EMG signal can reflect the muscle state of that part during exercise. In some embodiments, using the EMG signal and/or posture signal of the same body part makes it possible to better evaluate whether the action of that part is standard.
In step 520, the user's exercise action is monitored based at least on the feature information corresponding to the EMG signal or the feature information corresponding to the posture signal.
In some embodiments, this step may be performed by the processing module 220 and/or the processing device 110. In some embodiments, the feature information corresponding to the EMG signal may include, but is not limited to, one or more of frequency information, amplitude information, and the like. The feature information corresponding to the posture signal refers to parameter information used to characterize the relative motion of the user's body. In some embodiments, the feature information corresponding to the posture signal may include, but is not limited to, one or more of angular velocity direction, angular velocity value, acceleration value of angular velocity, and the like. In some embodiments, the feature information corresponding to the posture signal may also include angle, displacement information (e.g., the stretched length in a strain sensor), stress, etc. For example, when the posture sensor is a strain sensor arranged at a joint position of the user, the acquired posture signal may be displacement information, stress, etc., which can characterize the bending angle and bending direction at the user's joint. In some embodiments, the processing module 220 and/or the processing device 110 can extract the feature information corresponding to the EMG signal (e.g., frequency information, amplitude information) or the feature information corresponding to the posture signal (e.g., angular velocity direction, angular velocity value, acceleration value of angular velocity, angle, displacement information, stress, etc.) and monitor the user's exercise action based on this feature information. Here, monitoring the user's exercise action includes monitoring information related to the user's action. In some embodiments, the action-related information may include one or more of the user's action type, action quantity, action quality (e.g., whether the action meets the standard), action time, and the like. The action type refers to the fitness action the user takes during exercise. In some embodiments, the action type may include, but is not limited to, one or more of seated chest clamping, squats, deadlifts, planks, running, swimming, and the like. The action quantity refers to the number of times an action is performed during exercise; for example, if the user performs 10 seated chest clamps during exercise, the 10 repetitions are the action quantity. The action quality refers to how standard the fitness action performed by the user is relative to the standard fitness action. For example, when the user performs a squat, the processing device 110 can judge the action type based on the feature information corresponding to the action signals (EMG signals and posture signals) of specific muscle positions (gluteus maximus, quadriceps, etc.), and judge the action quality of the user's squat based on the action signal of a standard squat. The action time refers to the time corresponding to one or more of the user's action types, or the total time of the exercise process. For details on monitoring the user's exercise action based on the feature information corresponding to the EMG signal and/or the posture signal, reference may be made to FIG. 6 of this application and its related description.
In some embodiments, the processing device 110 can use one or more action recognition models to recognize and monitor the user's exercise actions. For example, the processing device 110 can input the feature information corresponding to the EMG signal and/or the posture signal into an action recognition model, which outputs information related to the user's action. In some embodiments, the action recognition models may include different types of models, e.g., a model for recognizing the user's action type, or a model for recognizing the quality of the user's action.
It should be noted that the above description of process 500 is only for example and illustration and does not limit the scope of application of this specification. Those skilled in the art can make various corrections and changes to process 500 under the guidance of this specification; these corrections and changes are still within the scope of this specification. For example, extracting the feature information corresponding to the EMG signal or the posture signal in step 520 may be completed by the processing device 110 or, in some embodiments, by the processing module 220. For another example, the user's action signal is not limited to the above EMG signal, posture signal, ECG signal, temperature signal, humidity signal, blood oxygen concentration, and respiratory rate; it may also be other human physiological parameter signals. Any physiological parameter signal involved in human exercise can be regarded as an action signal in the embodiments of this specification.
FIG. 6 is an exemplary flowchart of monitoring a user's exercise action according to some embodiments of the present application. As shown in FIG. 6, the process 600 may include:
In step 610, the action signal is segmented based on the feature information corresponding to the EMG signal or the feature information corresponding to the posture signal.
In some embodiments, this step may be performed by the processing module 220 and/or the processing device 110. The acquisition of the action signal (e.g., EMG signal, posture signal) during the user's exercise is continuous, and the user's exercise may be a combination of multiple groups of actions or a combination of different action types. To analyze each action during the user's exercise, the processing module 220 can segment the user's action signal based on the feature information corresponding to the EMG signal or the posture signal. Segmenting the action signal here refers to dividing the action signal into signal segments of the same or different durations, or extracting one or more signal segments of specific duration from the action signal. In some embodiments, each segment of the action signal may correspond to one or more complete actions of the user. For example, when the user performs a squat, going from standing to squatting down and rising back to standing can be regarded as completing one squat; the action signal collected by the acquisition module 210 during this process can be regarded as one segment (or one cycle) of the action signal, after which the action signal generated by the user's next squat is regarded as another segment. In some embodiments, each segment of the action signal may also correspond to a partial action of the user, where a partial action can be understood as part of a complete action. For example, when the user performs a squat, going from standing to squatting down can be regarded as one action segment, and rising back to standing as another. Each change of action step during exercise causes the EMG signals and posture signals of the corresponding parts to change. For example, when the user stands, the EMG and posture signals at the muscles of the corresponding body parts (e.g., arms, legs, buttocks, abdomen) fluctuate little; when the user squats down from standing, those signals fluctuate greatly, e.g., the amplitude information of signals at different frequencies in the EMG signal becomes larger, and the angular velocity value, angular velocity direction, angular velocity acceleration value, angle, displacement information, stress, etc. corresponding to the posture signal also change. When the user rises from squatting to standing, the amplitude information corresponding to the EMG signal and the angular velocity value, angular velocity direction, angular velocity acceleration value, angle, displacement information, and stress corresponding to the posture signal change again. Based on this, the processing module 220 can segment the user's action signal based on the feature information corresponding to the EMG signal or the posture signal. For details on segmenting the action signal, reference may be made to FIGS. 7 and 8 of this specification and their related descriptions.
In step 620, the user's exercise action is monitored based on at least one segment of the action signal.
This step may be performed by the processing module 220 and/or the processing device 110. In some embodiments, monitoring the user's exercise action based on at least one segment of the action signal may include matching at least one segment of the action signal with at least one segment of a preset action signal to determine the action type during the user's exercise. The at least one segment of the preset action signal refers to standard action signals corresponding to different actions preset in the database. In some embodiments, the action type during the user's exercise can be determined by judging the degree of matching between the at least one segment of the action signal and the at least one segment of the preset action signal. Further, it is judged whether the matching degree between the action signal and the preset action signal is within a first matching threshold range (e.g., greater than 80%); if so, the action type during the user's exercise is determined according to the action type corresponding to the preset action signal. In some embodiments, monitoring may also include matching the feature information corresponding to at least one segment of the EMG signal with the feature information corresponding to the EMG signal in at least one segment of the preset action signal to determine the action type. For example, the matching degree between one or more pieces of feature information (e.g., frequency information, amplitude information) in a segment of the EMG signal and the corresponding feature information in a segment of the preset action signal is calculated respectively, and it is judged whether the weighted or average matching degree of the one or more pieces of feature information is within the first matching threshold range; if so, the action type is determined according to the action type corresponding to the preset action signal. In some embodiments, monitoring may also include matching the feature information corresponding to at least one segment of the posture signal (e.g., angular velocity value, angular velocity direction and acceleration value of angular velocity, angle, displacement information, stress, etc.) with the feature information corresponding to the posture signal in at least one segment of the preset action signal in the same way. In some embodiments, monitoring may also include matching the feature information corresponding to both the EMG signal and the posture signal in at least one segment of the action signal with the corresponding feature information in at least one segment of the preset action signal to determine the action type during the user's exercise.
In some embodiments, monitoring the user's exercise action based on at least one segment of the action signal may include matching at least one segment of the action signal with at least one segment of the preset action signal to determine the action quality during exercise. Further, if the matching degree between the action signal and the preset action signal is within a second matching threshold range (e.g., greater than 90%), the action quality during the user's exercise meets the standard. In some embodiments, determining the user's action based on at least one segment of the action signal may include matching one or more pieces of feature information in at least one segment of the action signal with one or more pieces of feature information in at least one segment of the preset action signal to determine the action quality. It should be noted that a segment of the action signal may be the action signal of one complete action or of a partial action within a complete action. In some embodiments, for a complex complete action, there are different force-exertion modes at different stages of the action; that is, there are different action signals at different stages, and monitoring the action signals of the different stages of one complete action can improve the real-time performance of user action monitoring.
It should be noted that the above description of process 600 is only for example and illustration and does not limit the scope of application of this specification. Those skilled in the art can make various corrections and changes to process 600 under the guidance of this specification; these corrections and changes are still within the scope of this specification. For example, in some embodiments, the user's action may also be determined by an action recognition model or a manually preset model.
FIG. 7 is an exemplary flowchart of action signal segmentation according to some embodiments of the present application. As shown in FIG. 7, the process 700 may include:
In step 710, at least one target feature point is determined from a time-domain window of the EMG signal or the posture signal according to a preset condition.
In some embodiments, this step may be performed by the processing module 220 and/or the processing device 110. The time-domain window of the EMG signal contains the EMG signal within a time range, and the time-domain window of the posture signal contains the posture signal within the same time range. The target feature point refers to a signal with a target feature in the action signal, which can characterize the stage of the user's action. For example, when the user performs seated chest clamping, at the beginning the user's arms are extended left and right in the horizontal direction, then the arms start to rotate internally, then the arms close together, and finally the arms return to the extended state in the horizontal direction; this process is one complete seated chest clamping action. When the user performs seated chest clamping, the feature information corresponding to the EMG or posture signal at each stage is different; by analyzing the feature information corresponding to the EMG signal (e.g., amplitude information, frequency information) or the posture signal (e.g., angular velocity value, angular velocity direction, angular velocity acceleration value, angle, displacement information, stress, etc.), the target feature point corresponding to the stage of the user's action can be determined. In some embodiments, one or more target feature points can be determined from the time-domain window according to the preset condition. In some embodiments, the preset condition may include one or more of: the angular velocity direction corresponding to the posture signal changing, the angular velocity corresponding to the posture signal being greater than or equal to an angular velocity threshold, the angle corresponding to the posture signal reaching an angle threshold, the change of the angular velocity value corresponding to the posture signal being an extreme value, and the amplitude information corresponding to the EMG signal being greater than or equal to an EMG threshold. In some embodiments, target feature points of different stages of one action may correspond to different preset conditions. For example, in seated chest clamping, the preset condition for the target feature point when the arms are extended horizontally and start internal rotation is different from the preset condition for the target feature point when the arms close together. In some embodiments, target feature points of different actions may correspond to different preset conditions. For example, seated chest clamping and biceps curls are different actions, and the preset conditions corresponding to their respective target points also differ. For example content of the preset conditions, reference may be made to the descriptions of the action start point, action middle point, and action end point in this specification.
In other embodiments, at least one target feature point may also be determined from the time-domain windows of both the EMG signal and the posture signal according to the preset condition. The time-domain windows of the EMG signal and the posture signal correspond to a time range containing both signals. The time of the EMG signal corresponds to the time of the posture signal; for example, the time point of the EMG signal when the user starts exercising is the same as the time point of the posture signal when the user starts exercising. Here, the target feature point can be determined by combining the feature information corresponding to the EMG signal (e.g., amplitude information) and the feature information corresponding to the posture signal (e.g., angular velocity value, angular velocity direction, angular velocity acceleration value, angle, etc.).
In step 720, the action signal is segmented based on the at least one target feature point.
In some embodiments, step 720 may be performed by the processing module 220 and/or the processing device 110. In some embodiments, there may be one or more target feature points in the EMG or posture signal, and the action signal can be divided into multiple segments by the one or more target feature points. For example, when there is one target feature point in the EMG signal, it can divide the EMG signal into two segments: the EMG signal before the target feature point and the EMG signal after it. Alternatively, the processing module 220 and/or the processing device 110 can extract the EMG signal within a certain time range around the target feature point as one segment of the EMG signal. For another example, when the EMG signal has multiple target feature points (e.g., n, where the first target feature point is not the start of the time-domain window and the n-th is not its end), the EMG signal can be divided into n+1 segments according to the n target feature points. For still another example, when the first target feature point is the start of the time-domain window and the n-th is not its end, the EMG signal can be divided into n segments; and when the first target feature point is the start of the time-domain window and the n-th is its end, the EMG signal can be divided into n-1 segments. It should be noted that the action stage corresponding to the target feature point may include one or more types; when there are multiple types, multiple kinds of target feature points can serve as references for segmenting the action signal. For example, the action stages corresponding to the target feature points may include an action start point and an action end point, where the action start point precedes the action end point; here, the action signal between an action start point and the next action start point can be regarded as one segment of the action signal.
In some embodiments, the target feature point may include one or more of an action start point, an action middle point, or an action end point.
To describe the segmentation of the action signal, take the case where the target feature points include the action start point, action middle point, and action end point as an example. The action start point can be regarded as the start point of the user's action cycle. In some embodiments, different actions may correspond to different preset conditions. For example, in seated chest clamping, the preset condition may be that the angular velocity direction of the action after the action start point changes relative to that before the action start point, or that the angular velocity value of the action start point is approximately 0 and the acceleration value of the angular velocity at the action start point is greater than 0. That is, when the user performs seated chest clamping, the start point of the action can be set as the time point when the arms are extended left and right horizontally and begin internal rotation. For another example, in a biceps curl, the preset condition may be that the lifting angle of the arm is greater than or equal to an angle threshold. Specifically, when the user performs a biceps curl, the lifting angle is 0° when the arm is horizontal, negative when the arm hangs down, and positive when the arm is raised. When the user's arm is raised from the horizontal position, the lifting angle is greater than 0. The time point when the arm lifting angle reaches the angle threshold can be regarded as the action start point. The angle threshold may be -70° to -20°, or preferably -50° to -25°. In some embodiments, to further ensure the accuracy of the selected action start point, the preset condition may also include: the angular velocity of the arm within a specific time range after the action start point may be greater than or equal to an angular velocity threshold. The angular velocity threshold may range from 5°/s to 50°/s; preferably, from 10°/s to 30°/s. For example, during a biceps curl, after passing the angle threshold while the arm keeps lifting upward, the angular velocity of the arm continues to be greater than the angular velocity threshold within the following specific time range (e.g., 0.05 s, 0.1 s, 0.5 s). In some embodiments, if the angular velocity of the action start point selected according to the preset condition is less than the angular velocity threshold within the specific time range, the preset condition continues to be applied until an action start point is determined.
In some embodiments, the action middle point may be a point within one action cycle from the start point. For example, when the user performs seated chest clamping, the start point of the action can be set as the time point when the arms are extended left and right horizontally and begin internal rotation, and the time point when the arms close together can serve as the action middle point. In some embodiments, the preset condition may be that the angular velocity direction at time points after the action middle point changes relative to that before the action middle point and the angular velocity value at the action middle point is approximately 0, where the angular velocity direction at the action middle point is opposite to that at the action start point. In some embodiments, to improve the accuracy of selecting the action middle point, the change rate of the angular velocity (acceleration of angular velocity) within a first specific time range after the action middle point (e.g., 0.05 s, 0.1 s, 0.5 s) may be greater than an acceleration threshold of angular velocity (e.g., 0.05 rad/s). In some implementations, while the action middle point satisfies the above preset condition, the amplitude information corresponding to the action middle point in the EMG signal is greater than the EMG threshold. Since different actions correspond to different EMG signals, the EMG threshold is related to the user's action and the target EMG signal. In seated chest clamping, the EMG signal at the pectoral muscle is the target EMG signal. In some embodiments, the position corresponding to the action middle point (also called the "middle position") can be approximately regarded as the point of maximum muscle force, when the EMG signal has a relatively large value. It should be noted that when the user performs the corresponding exercise action, the EMG signal at the corresponding body part greatly increases relative to the EMG signal at the corresponding part when the user is not exercising (when the muscles of the specific part can be regarded as being in the resting state); for example, the amplitude of the EMG signal at the corresponding part when the user's action reaches the middle position may be 10 times that of the resting state. In addition, for different action types, the relationship between the EMG amplitude of the corresponding part at the middle position (action middle point) and the resting-state EMG amplitude also differs; the relationship between the two can be adaptively adjusted according to the actual action. In some embodiments, for the determination of the action middle point, in addition to satisfying the above preset conditions (e.g., the angular velocity and EMG amplitude conditions), the Euler angle (also called angle) between the action middle point and the start position may also be required to satisfy certain conditions. For example, in seated chest clamping, the Euler angle of the action middle point relative to the action start point may be greater than one or more Euler angle thresholds (also called angle thresholds). For example, taking the front-rear direction of the human body as the X axis, the left-right direction as the Y axis, and the height direction as the Z axis, the Euler angle change in the X and Y directions may be less than 25°, and the Euler angle change in the Z direction may be greater than 40° (seated chest clamping is mainly rotation about the Z axis; the above parameters are only reference examples). In some embodiments, the EMG threshold and/or Euler angle threshold may be pre-stored in the memory or hard disk of the wearable device 130, stored in the processing device 110, or calculated according to actual conditions and adjusted in real time.
In some embodiments, the processing module 220 can, based on the time-domain window of the EMG or posture signal, determine the action middle point from the time-domain window of time points after the action start point according to the preset condition. In some implementations, after the action middle point is determined, it can be re-verified whether there are other time points meeting the preset condition within the time range from the action start point to the action middle point; if so, the action start point closest to the action middle point is selected as the best action start point. In some embodiments, if the difference between the time of the action middle point and that of the action start point is greater than a specific time threshold (e.g., 1/2 or 2/3 of one action cycle), the action middle point is invalid, and the action start point and action middle point are re-determined according to the preset condition.
In some embodiments, the action end point may be a time point within one action cycle from the action start point and after the action middle point. For example, the action end point can be set as the point one action cycle away from the action start point, in which case the action end point can be regarded as the end point of one of the user's action cycles. For example, when the user performs seated chest clamping, the start point of the action can be set as the time point when the arms are extended left and right horizontally and begin internal rotation, the time point when the arms close together can serve as the action middle point, and the time point when the arms return to the extended state in the horizontal direction can correspond to the action end point. In some embodiments, the preset condition may be that the change of the angular velocity value corresponding to the posture signal is an extreme value. In some embodiments, to prevent misjudgment due to jitter, the change of the Euler angle within the time range from the action middle point to the action end point should exceed a certain Euler angle threshold, e.g., 20°. In some embodiments, the processing module 220 can, based on the time-domain windows of the EMG and posture signals, determine the action end point from the time-domain window after the action middle point according to the preset condition. In some embodiments, if the difference between the time of the action end point and that of the action middle point is greater than a specific time threshold (e.g., 1/2 of one action cycle), the action start point and action middle point are both invalid, and the action start point, action middle point, and action end point are re-determined according to the preset condition.
In some embodiments, at least one group of action start points, action middle points, and action end points in the action signal can be determined repeatedly, and the action signal can be segmented based on at least one group of action start points, action middle points, and action end points serving as target feature points. This step may be performed by the processing module 220 and/or the processing device 110. It should be noted that segmenting the action signal is not limited to the above action start point, action middle point, and action end point; other time points may also be included. For example, for seated chest clamping, 5 time points can be selected according to the above steps: the first may be the action start point; the second, the moment of maximum internal-rotation angular velocity; the third, the action middle point; the fourth, the moment of maximum external-rotation angular velocity; and the fifth, the moment when the arms return to left-right extension and the angular velocity is 0, i.e., the action end point. In this example, compared with the action start point, action middle point, and action end point in the above steps, the second time point is added as the 1/4 mark of the action cycle, the action end point described in the foregoing embodiment is used as the fourth time point to mark the 3/4 position of the action cycle, and the fifth time point is added as the end point of the complete action. For seated chest clamping, by using more time points, the recognition of action quality can be completed based on the signal of the first 3/4 of the action cycle (that is, recognizing the action quality of a single cycle does not depend on completely analyzing the signal of the whole cycle), so the monitoring of and feedback on the user's action can be completed before the action of the current cycle ends, while all signals of the whole action process can still be recorded completely, so that the signals can be uploaded to the cloud or the mobile terminal device and more methods can be used to monitor the user's actions. For more complex actions, the cycle of an action can be very long, and each stage has a different force-exertion mode. In some embodiments, the above method of determining each time point can be used to divide the action into multiple stages, and the signal of each stage can be recognized and fed back separately to improve the real-time performance of user action feedback.
It should be noted that the above segmentation and monitoring of the action signal using the action start point, action middle point, and action end point as a group of target feature points is only an exemplary illustration. In some embodiments, the user's action signal can also be segmented and monitored based on any one or more of the action start point, action middle point, and action end point as target feature points. For example, the action signal can be segmented and monitored with the action start point alone as the target feature point, or with the action start point and action end point as a group of target feature points. Other time points or time ranges that can serve as target feature points are all within the protection scope of this specification.
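To illustrate how such preset conditions might be checked on a recorded angular velocity trace, the sketch below detects candidate action start points (velocity near zero, positive angular acceleration, velocity then continuously above a threshold). The sampling rate, all thresholds, and the short ramp allowance are illustrative assumptions, not values fixed by this specification.

```python
import numpy as np

def find_action_start_points(angular_velocity, fs=100.0, vel_eps=1.0,
                             vel_hold=10.0, hold_s=0.1, ramp=3):
    """Locate candidate action start points in an angular-velocity trace (deg/s)."""
    w = np.asarray(angular_velocity, dtype=float)
    acc = np.gradient(w) * fs          # angular acceleration, deg/s^2
    hold = int(hold_s * fs)            # samples the velocity must stay high
    starts = []
    i = 0
    while i < len(w) - hold - ramp:
        # near-zero velocity with positive angular acceleration ...
        if abs(w[i]) < vel_eps and acc[i] > 0:
            # ... followed (after a short ramp-up) by velocity continuously
            # at or above the angular velocity threshold
            if np.all(w[i + ramp : i + ramp + hold] >= vel_hold):
                starts.append(i)
                i += hold              # at most one start per hold window
                continue
        i += 1
    return starts
```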
It should be noted that the above description of process 700 is only for example and illustration and does not limit the scope of application of this specification. Those skilled in the art can make various corrections and changes to process 700 under the guidance of this specification; these corrections and changes are still within the scope of this specification. For example, at least two of steps 710 and 720 may be performed simultaneously in the processing module 220. For another example, steps 710 and 720 may be performed simultaneously in the processing module 220 and the processing device 110, respectively.
FIG. 8 is a schematic diagram of action signal segmentation according to some embodiments of the present application. In FIG. 8, the abscissa may represent the user's exercise time, and the ordinate may represent the amplitude information of the EMG signal of the corresponding muscle part (e.g., pectoralis major) during the user's seated chest clamping training. FIG. 8 also includes the angular velocity curve and the Euler angle curve corresponding to the posture signal at the wrist position during the user's exercise, where the angular velocity curve is used to characterize the speed change during the user's exercise, and the Euler angle curve is used to characterize the position of the body part during the user's exercise. As shown in FIG. 8, point A1 is determined to be the action start point according to the preset condition. Specifically, the angular velocity direction at time points after the user's action start point A1 changes relative to that before A1. Further, the angular velocity value at the action start point A1 is approximately 0, and the acceleration value of the angular velocity at A1 is greater than 0.
Referring to FIG. 8, point B1 is determined to be the action middle point according to the preset condition. Specifically, the angular velocity direction at time points after the action middle point B1 changes relative to that before B1, and the angular velocity value at B1 is approximately 0, where the angular velocity direction at the action middle point B1 is opposite to that at the action start point A1. In addition, the amplitude corresponding to the action middle point B1 in the EMG signal (shown as "EMG signal" in FIG. 8) is greater than the EMG threshold.
Continuing to refer to FIG. 8, point C1 is determined to be the action end point according to the preset condition. Specifically, the change of the angular velocity value at the action end point C1 is the extreme value from the action start point A1 to the action end point C1. In some embodiments, process 700 can complete the action segmentation shown in FIG. 8; the action signal from the action start point A1 to the action end point C1 shown in FIG. 8 can be regarded as one segment of the user's exercise signal.
It should be noted that, in some embodiments, if the time interval between the action middle point and the action start point is greater than a specific time threshold (e.g., 1/2 of one action cycle), the processing module 220 can re-determine the action start point to ensure the accuracy of action segmentation. The specific time threshold here can be stored in the memory or hard disk of the wearable device 130, stored in the processing device 110, or calculated or adjusted according to the actual situation of the user's exercise. For example, if the time interval between the action start point A1 and the action middle point B1 in FIG. 8 is greater than the specific time threshold, the processing module 220 can re-determine the action start point, which can improve the accuracy of action segmentation. In addition, segmenting the action signal is not limited to the above action start point A1, action middle point B1, and action end point C1; other time points may also be included, and the selection of time points can be made according to the complexity of the action.
When acquiring the user's action signal, external conditions such as the user's other physiological parameter information (e.g., heart rate signal) and relative movement or squeezing between the acquisition module 210 and the human body during exercise can affect the quality of the action signal, for example, causing abrupt changes in the EMG signal, thereby affecting the monitoring of the user's action. For convenience of description, an abruptly changing EMG signal can be described by singular points; exemplary singular points may include glitch signals, discontinuous signals, etc. In some embodiments, monitoring the user's exercise action based at least on the feature information corresponding to the EMG signal or the posture signal may also include: preprocessing the EMG signal in the frequency domain or time domain, acquiring the feature information corresponding to the EMG signal based on the preprocessed EMG signal, and monitoring the user's exercise action according to the feature information corresponding to the EMG signal or the posture signal. In some embodiments, preprocessing the EMG signal in the frequency or time domain may include filtering the EMG signal in the frequency domain to select or retain components within a specific frequency range. In some embodiments, the frequency range of the EMG signal acquired by the acquisition module 210 is 1 Hz-1000 Hz; it can be filtered, and an EMG signal within a specific frequency range (e.g., 30 Hz-150 Hz) can be selected for subsequent processing. In some embodiments, the specific frequency range may be 10 Hz-500 Hz; preferably, 15 Hz-300 Hz; more preferably, 30 Hz-150 Hz. In some embodiments, the filtering processing may include low-pass filter processing. In some embodiments, the low-pass filter may include an LC passive filter, an RC passive filter, an RC active filter, or a passive filter composed of special elements. In some embodiments, the passive filter composed of special elements may include one or more of piezoelectric ceramic filters, crystal filters, and surface acoustic wave filters. It should be noted that the specific frequency range is not limited to the above ranges and may be other ranges, which can be selected according to actual conditions. For content on monitoring the user's exercise action according to the feature information corresponding to the EMG signal or the posture signal, reference may be made to FIG. 5, FIG. 6 of this specification and their related descriptions.
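A minimal filtering sketch under these assumptions: a Butterworth band-pass keeping the 30 Hz-150 Hz components, applied with zero-phase filtering. The filter family, order, and sampling rate are illustrative choices; the specification only fixes the frequency range.

```python
import numpy as np
from scipy.signal import butter, sosfiltfilt

def bandpass_emg(emg, fs=1000.0, low=30.0, high=150.0, order=4):
    """Keep the 30-150 Hz band of a raw EMG signal.

    sosfiltfilt is used for numerical stability and zero phase distortion;
    this is one possible choice, not the specific filter of this application.
    """
    sos = butter(order, [low / (fs / 2), high / (fs / 2)],
                 btype="band", output="sos")
    return sosfiltfilt(sos, np.asarray(emg, dtype=float))
```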
In some embodiments, preprocessing the EMG signal in the frequency or time domain may also include performing signal correction processing on the EMG signal in the time domain. Signal correction processing refers to correcting singular points (e.g., glitch signals, discontinuous signals, etc.) in the EMG signal. In some embodiments, performing signal correction processing on the EMG signal in the time domain may include determining singular points in the EMG signal, i.e., determining abrupt changes in the EMG signal. A singular point may be an abrupt change in the amplitude of the EMG signal at a certain moment, causing discontinuity of the signal. For another example, the EMG signal may be morphologically smooth with no abrupt amplitude change, but the first-order differential of the EMG signal changes abruptly and is discontinuous. In some embodiments, methods for determining the singular points in the EMG signal may include, but are not limited to, one or more of Fourier transform, wavelet transform, fractal dimension, etc. In some embodiments, performing signal correction processing on the EMG signal in the time domain may include removing the singular points in the EMG signal, e.g., deleting the singular points and the signal within a nearby time range. Alternatively, signal correction processing may include correcting the singular points of the EMG signal according to the feature information of the EMG signal within a specific time range, e.g., adjusting the amplitude of a singular point according to the surrounding signals. In some embodiments, the feature information of the EMG signal may include one or more of amplitude information and statistical information of the amplitude information. The statistical information of the amplitude information (also called amplitude entropy) refers to the distribution of the amplitude information of the EMG signal in the time domain. In some embodiments, after the position of a singular point (e.g., the corresponding time point) in the EMG signal is determined by a signal processing algorithm (e.g., Fourier transform, wavelet transform, fractal dimension), the singular point can be corrected according to the EMG signal within a specific time range before or after its position. For example, when the singular point is an abrupt trough, the EMG signal at the abrupt trough can be supplemented according to the feature information (e.g., amplitude information, statistical information of the amplitude information) of the EMG signal within a specific time range (e.g., 5 ms-60 ms) before or after the abrupt trough.
Taking a glitch signal as an exemplary singular point, FIG. 9 is an exemplary flowchart of EMG signal preprocessing according to some embodiments of the present application. As shown in FIG. 9, the process 900 may include:
In step 910, based on the time-domain window of the EMG signal, different time windows are selected from the time-domain window of the EMG signal, where the different time windows respectively cover different time ranges.
In some embodiments, this step may be performed by the processing module 220 and/or the processing device 110. In some embodiments, the different windows may include at least one specific window. The specific window refers to a window with a specific time length selected within the time-domain window. For example, when the time length of the time-domain window of the EMG signal is 3 s, the time length of the specific window may be 100 ms. In some embodiments, the specific window may include multiple different time windows. Merely as an exemplary illustration, the specific window may include a first time window and a second time window. The first time window may refer to a window corresponding to part of the time length within the specific window; for example, when the time length of the specific window is 100 ms, the time length of the first time window may be 80 ms. The second time window may refer to another window corresponding to part of the time length within the specific window; for example, when the specific window is 100 ms, the second time window may be 20 ms. In some embodiments, the first and second time windows may be continuous time windows within the same specific window. In some embodiments, they may also be two discontinuous or overlapping time windows within the same specific window. For example, when the window length within the specific time range is 100 ms, the first time window may be 80 ms and the second 25 ms; in this case, 5 ms of the second time window overlaps the first. In some embodiments, the processing module 220 can, based on the time-domain window of the EMG signal, slide and update the specific window sequentially from the time start of the time-domain window according to a specific time length, and can continue to divide the updated specific window into the first and second time windows. The specific time length here may be less than 1 s, 2 s, 3 s, etc. For example, the processing module 220 can select a specific window with a time length of 100 ms and divide it into an 80 ms first time window and a 20 ms second time window. Further, the specific window can be updated by sliding along the time direction. The sliding distance here may be the time length of the second time window (e.g., 20 ms) or another suitable time length, e.g., 30 ms, 40 ms.
In step 920, the glitch signal is determined based on the feature information corresponding to the EMG signal in the different time windows.
In some embodiments, this step may be performed by the processing module 220 and/or the processing device 110. In some embodiments, the feature information corresponding to the EMG signal may include at least one of amplitude information and statistical information of the amplitude information. In some embodiments, the processing module 220 can acquire the amplitude information or statistical information of the amplitude information corresponding to the EMG signal in the different time windows (e.g., the first and second time windows) to determine the position of the glitch signal. For a specific description of determining the position of the glitch signal based on the feature information corresponding to the EMG signal in different time windows, reference may be made to FIG. 10 and its related description.
It should be noted that the above description of process 900 is only for example and illustration and does not limit the scope of application of this specification. Those skilled in the art can make various corrections and changes to process 900 under the guidance of this specification. For example, the specific window is not limited to including the above first and second time windows; other time windows may also be included, e.g., a third time window, fourth time window, etc. In addition, the specific range of moments before or after the glitch position can be adaptively adjusted according to the length of the glitch signal, which is not further limited here. These corrections and changes are still within the scope of this specification.
FIG. 10 is an exemplary flowchart of glitch signal removal according to some embodiments of the present application. As shown in FIG. 10, the process 1000 may include:
In step 1010, first amplitude information corresponding to the EMG signal within the first time window and second amplitude information corresponding to the EMG signal within the second time window are determined.
In some embodiments, this step may be performed by the processing module 220 and/or the processing device 110. In some embodiments, the processing module 220 can select the time lengths of the first and second time windows, and extract the first amplitude information corresponding to the EMG signal within the first time window length and the second amplitude information corresponding to the EMG signal within the second time window length. In some embodiments, the first amplitude information may include the average amplitude of the EMG signal within the first time window, and the second amplitude information may include the average amplitude within the second time window. For example, the processing module 220 can select a first time window of 80 ms and extract the corresponding first amplitude information, and select a second time window of 20 ms and extract the corresponding second amplitude information.
In some embodiments, the selection of the first and second time window lengths is related to the shortest glitch signal length and the computational load of the system. In some embodiments, the lengths can be selected according to the characteristics of the glitch signal: the time length of an ECG glitch signal is 40 ms-100 ms; the time interval between two glitch signals in an ECG signal may be about 1 s; the two sides of the glitch peak point are basically symmetrical; the amplitude distribution on both sides of the glitch is relatively even; and so on. In some embodiments, when the glitch comes from an ECG signal, a length smaller than the glitch signal (e.g., half the glitch length) can be selected as the second time window length, and the first time window length may be greater than the second, e.g., 4 times the second time window length. In some embodiments, the first time window length only needs to be within the range of the glitch interval (about 1 s) minus the second time window length. It should also be noted that the selected first and second time window lengths are not limited to the above description, as long as the sum of the second and first time window lengths is less than the time interval between two adjacent glitch signals, or the second time window length is less than the length of a single glitch signal, or the EMG amplitude within the second time window and that within the first time window are well distinguishable.
In step 1020, it is judged whether the ratio of the second amplitude information to the first amplitude information is greater than a threshold.
In some embodiments, this step may be performed by the processing module 220 and/or the processing device 110. In some embodiments, the processing module 220 can judge whether the ratio of the second amplitude information corresponding to the EMG signal within the second time window to the first amplitude information corresponding to the EMG signal within the first time window is greater than the threshold. The threshold here can be stored in the memory or hard disk of the wearable device 130, stored in the processing device 110, or adjusted according to actual conditions. In some embodiments, if the processing module 220 judges that the ratio of the second amplitude information to the first amplitude information is greater than the threshold, step 1020 can proceed to step 1030. In other embodiments, if the processing module 220 judges that the ratio is not greater than the threshold, step 1020 can proceed to step 1040.
In step 1030, signal correction processing is performed on the EMG signal within the second time window.
In some embodiments, this step may be performed by the processing module 220 and/or the processing device 110. In some embodiments, the processing module 220 can perform signal correction processing on the EMG signal within the second time window according to the judgment result in step 1020 about whether the ratio of the second amplitude information to the first amplitude information exceeds the threshold. For example, in some embodiments, if the ratio of the second amplitude information to the first amplitude information is greater than the threshold, the EMG signal within the second time window corresponding to the second amplitude information is a glitch signal. In some embodiments, processing the EMG signal within the second time window may include performing signal correction processing on it based on the EMG signal within a specific time range before or after the second time window. In some embodiments, the methods of signal correction processing on the EMG signal within the second time window may include, but are not limited to, filling, interpolation, etc. In some embodiments, the specific time range may be 5 ms-60 ms; preferably 10 ms-50 ms; further preferably 20 ms-40 ms. It should be noted that the specific time range is not limited to the above ranges; for example, it may also be greater than 60 ms or less than 5 ms. In practical application scenarios it can be adaptively adjusted according to the time length of the glitch signal.
In step 1040, the EMG signal within the second time window is retained.
In some embodiments, this step may be performed by the processing module 220 and/or the processing device 110. In some embodiments, the processing module 220 can retain the EMG signal within the second time window according to the judgment result in step 1020. For example, in some embodiments, if the ratio of the second amplitude information to the first amplitude information is not greater than the threshold, the EMG signal within the second time window corresponding to the second amplitude information is a normal EMG signal, which can be retained, i.e., the EMG signal within the second time window is retained.
It should be noted that during muscle exertion, charge accumulates gradually and the amplitude of the EMG signal rises gradually; therefore, in the absence of a glitch signal, the EMG amplitude within two adjacent time windows (e.g., the first and second time windows) does not change abruptly. In some embodiments, judging and removing glitch signals in the EMG signal based on process 1000 can realize real-time processing of glitch signals, so that the wearable device 130 or the mobile terminal device 140 can feed back the user's exercise state in real time, helping the user exercise more scientifically.
In some embodiments, the time length corresponding to the first time window may be greater than that of the second time window. In some embodiments, the specific time length corresponding to the specific window may be less than 1 s. In some embodiments, the ratio of the first time window length to the second may be greater than 2. In some embodiments, the selection of the first time window length, the second time window length, and the specific window length can, on the one hand, ensure that the shortest glitch signal (e.g., 40 ms) can be removed with a high signal-to-noise ratio and, on the other hand, keep the computational load of the system relatively small, reducing repeated computation and time complexity, which can improve the computational efficiency and accuracy of the system.
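The two-window ratio test of process 1000 can be sketched as follows. Window lengths (80 ms / 20 ms at an assumed 1 kHz sampling rate), the ratio threshold, and linear interpolation as the correction method are all illustrative assumptions rather than values fixed by this application.

```python
import numpy as np

def remove_glitches(emg, win1=80, win2=20, ratio_threshold=3.0):
    """Slide a (win1 + win2)-sample specific window over the EMG signal
    and correct the second window whenever its mean amplitude jumps
    relative to the first window."""
    x = np.abs(np.asarray(emg, dtype=float))
    out = np.asarray(emg, dtype=float).copy()
    i = 0
    while i + win1 + win2 <= len(x):
        first = x[i : i + win1].mean()                   # first-window amplitude
        second = x[i + win1 : i + win1 + win2].mean()    # second-window amplitude
        if first > 0 and second / first > ratio_threshold:
            # Glitch detected: replace the second window by interpolating
            # between the neighbouring clean samples.
            lo, hi = i + win1, i + win1 + win2
            left, right = out[lo - 1], out[min(hi, len(out) - 1)]
            out[lo:hi] = np.linspace(left, right, hi - lo)
        i += win2   # slide the specific window by the second window's length
    return out
```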
It should be noted that the above description of process 1000 is only for example and illustration and does not limit the scope of application of this specification. Those skilled in the art can make various corrections and changes to process 1000 under the guidance of this specification. For example, process 1000 is only an example where the singular point is a glitch signal; when the singular point is a trough signal, the above steps (e.g., steps 1010, 1020, 1030) and their schemes can be adjusted, or other methods can be used for signal correction processing. These corrections and changes are still within the scope of this specification.
In some embodiments, other methods can also be used to perform signal correction processing on the singular points of the EMG signal, e.g., high-pass, low-pass, band-pass, or wavelet transform reconstruction methods. In some embodiments, for application scenarios insensitive to low-frequency signals, a 100 Hz high-pass filter can be used to remove glitch signals. In some embodiments, in addition to signal correction processing, other signal processing can be performed on the EMG signal, e.g., filtering processing, signal amplification, phase adjustment, etc. In some embodiments, the user's EMG signal collected by the EMG sensor can be converted into a digital EMG signal by an analog-to-digital converter (ADC), and the converted digital EMG signal can be filtered to remove power frequency signals and their harmonic signals, etc. In some embodiments, processing the EMG signal may also include removing the user's motion artifacts. Motion artifacts here refer to the signal noise generated, during the acquisition of the EMG signal, by the relative movement of the muscle at the position to be measured with respect to the EMG module while the user is moving.
In some embodiments, the posture signal can be acquired by the posture sensors on the wearable device 130. The posture sensors on the wearable device 130 can be distributed at the limbs of the human body (e.g., arms, legs), the trunk (e.g., chest, abdomen, back, waist), the head, etc. The posture sensors can collect posture signals of the limbs, trunk, and other parts of the human body. In some embodiments, the posture sensor may also be an attitude and heading reference system (AHRS) sensor with a posture fusion algorithm. The posture fusion algorithm can fuse the data of a nine-axis inertial measurement unit (IMU) with a three-axis acceleration sensor, three-axis angular velocity sensor, and three-axis geomagnetic sensor into Euler angles or quaternions to acquire the posture signal of the user's body part where the posture sensor is located. In some embodiments, the processing module 220 and/or the processing device 110 can determine the feature information corresponding to the posture based on the posture signal. In some embodiments, the feature information corresponding to the posture signal may include, but is not limited to, angular velocity value, angular velocity direction, angular velocity acceleration value, etc. In some embodiments, the posture sensor may be a strain sensor, which can acquire the bending direction and bending angle at the user's joint, thereby acquiring the posture signal during the user's exercise. For example, a strain sensor can be arranged at the user's knee joint; when the user moves, the user's body part acts on the strain sensor, and the bending direction and bending angle at the user's knee can be calculated based on the resistance or length change of the strain sensor, thereby acquiring the posture signal of the user's leg. In some embodiments, the posture sensor may also include an optical fiber sensor, and the posture signal may be characterized by the direction change of the light after bending in the fiber. In some embodiments, the posture sensor may also be a magnetic flux sensor, and the posture signal may be characterized by the change of magnetic flux. It should be noted that the type of posture sensor is not limited to the above sensors; any sensor that can acquire the user's posture signal is within the scope of the posture sensors of this specification.
图11是根据本申请一些实施例所示的确定姿态信号对应的特征信息的示例性流程图。如图11所示,流程1100可以包括:
在步骤1110中,获取目标坐标系以及该目标坐标系与至少一个原始坐标系之间的转换关系。
在一些实施例中,该步骤可以由处理模块220和/或处理设备110执行。在一些实施例中,原始坐标系是指设置在人体上的姿态传感器对应的坐标系。当用户使用可穿戴设备130时,可穿戴设备130上的各姿态传感器分布于人体的不同部位,使得各姿态传感器在人体上的安装角度不同,而不同部位的姿态传感器分别以各自本体的坐标系作为原始坐标系,因此不同部位的姿态传感器具有不同的原始坐标系。在一些实施例中,各个姿态传感器获取的姿态信号可以是在其对应的原始坐标系下的表达。通过将不同原始坐标系下的姿态信号转化到同一坐标系(例如,目标坐标系)中,便于确定人体不同部位之间的相对运动。在一些实施例中,目标坐标系是指基于人体建立的人体坐标系。例如,目标坐标系中可以将人体躯干的长度方向(即垂直于人体横切面的方向)作为Z轴,人体躯干的前后方向(即垂直于人体冠状面的方向)作为X轴,人体躯干的左右方向(即垂直于人体矢状面的方向)作为Y轴。在一些实施例中,目标坐标系与原始坐标系之间存在转换关系,通过该转换关系可以将原始坐标系中的坐标信息转换为目标坐标系中的坐标信息。在一些实施例中,该转换关系可以表示为一个或多个旋转矩阵。关于确定目标坐标系与原始坐标系之间的转换关系的详细内容可以参考参考本申请说明书图13及其相关描述。
在步骤1120中,基于转换关系,将至少一个原始坐标系中的坐标信息转换为目标坐标系中的坐标信息。
在一些实施例中,该步骤可以由处理模块220和/或处理设备110执行。原始坐标系中的坐标信息是指原始坐标系中三维坐标信息。目标坐标系中的坐标信息是指目标坐标系中三维坐标信息。仅作为示例性说明,原始坐标系中的坐标信息v 1,根据转换关系可以将原始坐标系中的坐标信息转换为目标坐标系中的坐标信息v 2。具体地,坐标信息v 1和坐标信息v 2之间可以用旋转矩阵进行转换,这里的旋转矩阵可以理解为原始坐标 系与目标坐标系之间的转换关系。具体地,原始坐标系中的坐标信息v 1可以通过第一旋转矩阵转换为坐标信息v 1-1,坐标信息v 1-1通过第二旋转矩阵可以变为坐标信息v 1-2,坐标信息v 1-2通过第三旋转矩阵可以变为坐标信息v 1-3,坐标信息v 1-3即为目标坐标系中的坐标信息v 2。需要注意的是,旋转矩阵不限于上述的第一旋转矩阵、第二旋转矩阵和第三旋转矩阵,还可以包括更少或更多的旋转矩阵。在一些替代性实施例中,旋转矩阵还可以为一个旋转矩阵或多个旋转矩阵的组合。
在步骤1130中,基于目标坐标系中的坐标信息,确定姿态信号对应的特征信息。
在一些实施例中,该步骤可以由处理模块220和/或处理设备110执行。在一些实施例中,基于目标坐标系中的坐标信息确定用户姿态信号对应的特征信息可以包括基于用户运动过程中的目标坐标系中的多个坐标信息确定用户姿态信号对应的特征信息。例如,用户进行坐姿夹胸运动时,用户手臂向前平举时可以对应目标坐标系中的第一坐标信息,用户手臂打开到与躯干在同一平面内时可以对应目标坐标系中的第二坐标信息,基于第一坐标信息和第二坐标信息可以计算用户姿态信号对应的特征信息,例如角速度、角速度方向、角速度的加速度值等。
应当注意的是,上述有关流程1100的描述仅仅是为了示例和说明,而不限定本说明书的适用范围。对于本领域技术人员来说,在本说明书的指导下可以对流程1100进行各种修正和改变。然而,这些修正和改变仍在本说明书的范围之内。
在一些实施例中,还可以通过位于用户身体不同位置的姿态传感器对应的特征信息判断用户身体不同运动部位之间的相对运动。例如,通过用户手臂处的姿态传感器对应的特征信息和用户躯干部位的姿态传感器对应的特征信息,可以判断用户运动过程中手臂与躯干之间的相对运动。图12是根据本申请一些实施例所示的确定用户的不同运动部位之间的相对运动的示例性流程图。如图12所示,流程1200可以包括:
在步骤1210中,基于不同的原始坐标系与目标坐标系的转换关系,确定至少两个传感器分别对应的特征信息。
在一些实施例中,该步骤可以由处理模块220和/或处理设备110执行。在一些实施例中,不同的传感器由于在人体处的安装位置不同,传感器对应的原始坐标系与目标坐标系之间具有不同的转换关系。在一些实施例中,处理设备110可以将用户不同部位(例如,小臂、大臂、躯干等)的传感器对应的原始坐标系中的坐标信息分别转换为目标坐标系中的坐标信息,从而可以分别确定至少两个传感器对应的特征信息。关于原始坐标系中的坐标信息转化为目标坐标系中的坐标信息的相关描述可以在本申请的其他地方找到,例如,图11,在此不做赘述。
在步骤1220中,基于至少两个传感器分别对应的特征信息,确定用户的不同运动部位之间的相对运动。
在一些实施例中,该步骤可以由处理模块220和/或处理设备110执行。在一些实施例中,运动部位可以是指人体上可以独立运动的肢体,例如,小臂、大臂、小腿、大腿等。仅作为示例性说明,用户进行手臂举哑铃运动时,设置在小臂部位的传感器对应的目标坐标系中的坐标信息和设置在大臂部位的传感器对应的目标坐标系中的坐标信息相结合,可以确定用户小臂和大臂之间的相对运动,从而可以确定用户的手臂举哑铃动作。
在一些实施例中,用户的同一运动部位还可以设置多个相同或不同类型的传感器,多个相同或不同类型的传感器对应的原始坐标系中的坐标信息可以分别转换为目标坐标系中的坐标信息。例如,用户的小臂部位的不同位置处可以设置多个相同或不同类型传感器,多个相同或不同类型的传感器对应的目标坐标系中的多个坐标信息可以同时表征用户小臂部位的运动动作。例如,可以对多个相同类型传感器对应的目标坐标系中的坐标信息求平均值,从而提高用户运动过程中运动部位的坐标信息的准确性。又例如,可以对多个不同类型传感器对应的坐标系中的坐标信息通过融合算法(例如,卡尔曼滤波等)获取目标坐标系中的坐标信息。
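下面的草图示意上述两种处理方式:对同一部位多个同类型传感器在目标坐标系中的坐标取平均,以及利用小臂和大臂两个传感器的肢体向量近似表征二者之间的相对运动(数据均为假设值,融合此处以简单平均为例,并未实现卡尔曼滤波):

```python
import numpy as np

# 同一小臂部位三个同类型传感器在目标坐标系中的肢体向量(假设值)
forearm_samples = np.array([[0.21, -0.90, -0.37],
                            [0.19, -0.91, -0.38],
                            [0.20, -0.89, -0.39]])
forearm = forearm_samples.mean(axis=0)      # 求平均以提高坐标信息的准确性

upper_arm = np.array([0.0, -0.7, -0.7])     # 大臂传感器的肢体向量(假设值)

# 小臂与大臂之间的相对运动可以用两向量的夹角近似表征
cos_a = np.dot(forearm, upper_arm) / (
    np.linalg.norm(forearm) * np.linalg.norm(upper_arm))
angle = np.degrees(np.arccos(np.clip(cos_a, -1.0, 1.0)))
print(f"小臂与大臂的夹角约为 {angle:.1f}°")
```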
应当注意的是,上述有关流程1200的描述仅仅是为了示例和说明,而不限定本说明书的适用范围。对于本领域技术人员来说,在本说明书的指导下可以对流程1200进行各种修正和改变。然而,这些修正和改变仍在本说明书的范围之内。
图13是根据本申请一些实施例所示的确定原始坐标系与特定坐标系的转换关系的示例性流程图。在一些实施例中,所述确定原始坐标系与特定坐标系的转换关系的过程也可以叫做标定过程。如图13所示,流程1300可以包括:
在步骤1310中,构建特定坐标系。
在一些实施例中,该步骤可以由处理模块220和/或处理设备110执行。在一些实施例中,至少一个原始坐标系与目标坐标系之间的转换关系可以通过标定过程获得。特定坐标系是指在标定过程中,用于确定原始坐标系与目标坐标系之间转换关系的参考坐标系。在一些实施例中,构建的特定坐标系可以以人体站立时躯干的长度方向为Z轴,以人体前后方向为X轴,以人体躯干的左右方向为Y轴。在一些实施例中,特定坐标系与标定过程中用户的朝向有关。例如,在标定过程中,用户身体正面朝向某个固定方向(例如,北方),则人体前方(北方)方向即为X轴,在标定过程中,X轴的方向是固定的。
在步骤1320中,获取用户处于第一姿势时至少一个原始坐标系中的第一坐标信息。
在一些实施例中,该步骤可以由获取模块210执行。第一姿势可以是用户保持近似站立的姿势。获取模块210(例如,传感器)可以基于用户的第一姿势获取原始坐标系中的第一坐标信息。
在步骤1330中,获取用户处于第二姿势时至少一个原始坐标系中的第二坐标信息。
在一些实施例中,该步骤可以由获取模块210执行。第二姿势可以是传感器所在的用户身体部位(例如,手臂)向前倾斜的姿势。在一些实施例中,获取模块210(例如,传感器)可以基于用户的第二姿势(例如,向前倾斜姿势)获取原始坐标系中的第二坐标信息。
在步骤1340中,根据第一坐标信息、第二坐标信息和特定坐标系确定至少一个原始坐标系与特定坐标系的转换关系。
在一些实施例中,该步骤可以由处理模块220和/或处理设备110执行。在一些实施例中,可以通过第一姿势对应的第一坐标信息确定第一旋转矩阵。在第一姿势时,由于特定坐标系在ZYX旋转顺序下的X和Y方向欧拉角为0,而原始坐标系的X和Y方向欧拉角不一定为0,那么第一旋转矩阵就是将原始坐标系绕着X轴逆向旋转,然后绕着Y轴逆向旋转得到的旋转矩阵。在一些实施例中,可以通过第二姿势(例如,传感器所在的身体部位前倾)的第二坐标信息确定第二旋转矩阵。具体地,第二姿势时,已知特定坐标系在ZYZ旋转顺序下,Y和Z₃方向欧拉角为0,原始坐标系在Y和Z₃方向的欧拉角不一定为0,那么第二旋转矩阵就是将原始坐标系绕着Y方向逆向旋转,然后绕着Z₃方向逆向旋转得到的旋转矩阵。通过上述第一旋转矩阵和第二旋转矩阵可以确定原始坐标系和特定坐标系之间的转换关系。在一些实施例中,当原始坐标系(传感器)为多个时,可以采用上述的方法确定每一个原始坐标系与特定坐标系之间的转换关系。
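仅作为对上述标定思路的一种可能理解,下面的草图用假设的欧拉角数值构造第一旋转矩阵和第二旋转矩阵,并将二者组合为原始坐标系到特定坐标系的转换关系(欧拉角数值、旋转顺序的具体约定均为示例性假设,并非本申请的限定实现):

```python
import numpy as np
from scipy.spatial.transform import Rotation as R

# 第一姿势(近似站立):原始坐标系按ZYX顺序测得的欧拉角,数值为假设值
z1, y1, x1 = 10.0, 5.0, -8.0
# 第一旋转矩阵:先绕X轴、再绕Y轴反向旋转,使X/Y方向欧拉角归零
R_first = (R.from_euler("y", -y1, degrees=True) *
           R.from_euler("x", -x1, degrees=True)).as_matrix()

# 第二姿势(身体部位前倾):原始坐标系按ZYZ顺序测得的欧拉角,数值为假设值
za, yb, z3 = 20.0, 12.0, 6.0
# 第二旋转矩阵:先绕Y方向、再绕Z₃方向反向旋转
R_second = (R.from_euler("z", -z3, degrees=True) *
            R.from_euler("y", -yb, degrees=True)).as_matrix()

# 组合两个旋转矩阵,得到原始坐标系与特定坐标系之间的转换关系
R_orig_to_specific = R_second @ R_first
v_specific = R_orig_to_specific @ np.array([1.0, 0.0, 0.0])  # 转换示例
```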
需要说明的是,上述的第一姿势不限于用户保持近似站立的姿势,第二姿势不局限于传感器所在的用户身体部位(例如,手臂)向前倾斜的姿势,这里的第一姿势和第二姿势可以近似视为在标定过程中静止的姿势。在一些实施例中,第一姿势和/或第二姿势也可以是标定过程中动态的姿势。例如,用户走路的姿势是一个相对固定的姿势,可以提取走路过程中双臂、双腿、双脚的角度和角速度,识别出向前迈步、向前摆臂等动作,用户向前走路的姿势可以作为标定过程中的第二姿势。在一些实施例中,第二姿势不限于一个动作,还可以提取多个动作作为第二姿势。例如,将多个动作的坐标信息进行融合,从而得到更加精确的旋转矩阵。
在一些实施例中,在标定过程中,可以使用一些信号处理算法(比如使用卡尔曼滤波算法)动态纠正旋转矩阵,以得到在整个标定过程中较优的转换矩阵。
在一些实施例中,可以使用机器学习算法或者其他算法对一些特定的动作进行自动识别,以对旋转矩阵进行实时更新。例如,通过机器学习算法识别出当前用户正在走路或者正在站立,则自动开始标定过程,在这种情况下,可穿戴设备不再需要显式的标定过程,旋转矩阵会在用户使用可穿戴设备的过程中进行动态更新。
在一些实施例中,姿态传感器的安装位置可以相对固定,相应的算法内部可以先预设一个旋转矩阵,使得特定动作的识别过程更加准确。进一步地,在用户使用可穿戴设备的过程中继续对旋转矩阵进行修正,使获得的旋转矩阵更加贴近真实状况。
应当注意的是,上述有关流程1300的描述仅仅是为了示例和说明,而不限定本说明书的适用范围。对于本领域技术人员来说,在本说明书的指导下可以对流程1300进行各种修正和改变。然而,这些修正和改变仍在本说明书的范围之内。
图14是根据本申请一些实施例所示的确定原始坐标系与目标坐标系之间的转换关系的示例性流程图。如图14所示,流程1400可以包括:
在步骤1410中,获取特定坐标系与目标坐标系的转换关系。
在一些实施例中,该步骤可以由处理模块220和/或处理设备110执行。特定坐标系与目标坐标系都是以人体躯干的长度方向为Z轴,因此通过特定坐标系的X轴与目标坐标系的X轴的转换关系以及特定坐标系的Y轴与目标坐标系的Y轴之间的转换关系,可以获取特定坐标系与目标坐标系之间的转换关系。关于获取特定坐标系与目标坐标系之间的转换关系的原理可以参考图13及其相关内容。
在一些实施例中,特定坐标系可以以人体躯干的长度方向为Z轴,人体前后方向为标定的X轴。由于用户在运动(例如,转体运动)的过程中用户身体的前后方向会发生变化而不能保持在标定的坐标系中,因此需要确定一个可以随着人体转动的坐标系,即目标坐标系。在一些实施例中,目标坐标系可以随着用户的朝向变化而变化,目标坐标系的X轴始终是人体躯干的正前方。
在步骤1420中,根据至少一个原始坐标系与特定坐标系的转换关系,以及特定坐标系与目标坐标系的转换关系,确定至少一个原始坐标系与目标坐标系之间的转换关系。
在一些实施例中,该步骤可以由处理模块220和/或处理设备110执行。在一些实施例中,处理设备110可以根据流程1300中确定的至少一个原始坐标系与特定坐标系之间的转换关系,以及步骤1410中确定的特定坐标系与目标坐标系之间的转换关系,确定至少一个原始坐标系与目标坐标系之间的转换关系,从而可以将原始坐标系中的坐标信息转化为目标坐标系中的坐标信息。
应当注意的是,上述有关流程1400的描述仅仅是为了示例和说明,而不限定本说明书的适用范围。对于本领域技术人员来说,在本说明书的指导下可以对流程1400进行各种修正和改变。然而,这些修正和改变仍在本说明书的范围之内。
在一些实施例中,可穿戴设备130上设置的姿态传感器的位置可能发生变化,和/或姿态传感器在人体上的安装角度不同,此时即使用户进行相同的运动,姿态传感器返回的姿态数据也可能有较大的差别。
图15A是根据本申请一些实施例所示的人体小臂位置处原始坐标系中的欧拉角数据的示例性向量坐标图。框线部分可以表示用户做同一动作时小臂位置处对应的原始坐标系中的欧拉角数据(坐标信息)。如图15A所示,框线部分内Z轴方向(图15A中以“Z”示出)的欧拉角向量结果近似在-180°-(-80°)的范围内,Y轴方向(图15A中以“Y”示出)的欧拉角向量结果近似在0°上下波动,X轴方向(图15A中以“X”示出)的欧拉角向量结果近似在-80°上下波动。这里的波动范围可以是20°。
图15B是根据本申请一些实施例所示的人体小臂位置另一处原始坐标系中的欧拉角数据的示例性向量坐标图。框线部分可以表示用户做同一动作(与图15A所示的动作相同的动作)时小臂位置另一处对应的原始坐标系中的欧拉角数据。如图15B所示,框线部分内Z轴方向(图15B中以“Z”示出)的欧拉角向量结果近似在-180°-180°的范围内,Y轴方向(图15B中以“Y”示出)的欧拉角向量结果近似在0°上下波动,X轴方向(图15B中以“X”示出)的欧拉角向量结果近似在-150°上下波动。这里的波动范围可以是20°。
图15A和图15B所示的欧拉角数据是基于人体小臂不同位置处(也可以理解为姿态传感器在人体小臂位置处的安装角度不同),用户做同一动作时分别得到的原始坐标系中的欧拉角数据(坐标信息)。对比图15A和图15B可以看出,姿态传感器在人体上的安装角度不同,用户做相同的动作时,姿态传感器返回的原始坐标系中的欧拉角数据的差别可能较大。例如,图15A中Z轴方向的欧拉角向量结果近似在-180°-(-80°)的范围内,图15B中Z轴方向的欧拉角向量结果近似在-180°-180°的范围内,二者差别较大。
在一些实施例中,可以将不同安装角度的传感器对应的原始坐标系中的欧拉角数据转换为目标坐标系中的欧拉角数据,从而便于对不同位置传感器的姿态信号进行分析。仅作为示例性说明,可以将左臂所在的直线抽象为一个从手肘指向手腕的单位向量,该单位向量是在目标坐标系内的坐标值。这里的目标坐标系定义为指向人体后方的轴为X轴,指向人体右侧的轴为Y轴,指向人体的上方的轴为Z轴,符合右手坐标系。例如,目标坐标系中的坐标值[-1,0,0]表示手臂向前平举;目标坐标系的坐标值[0,-1,0]表示手臂向左侧平举。图16A是根据本申请一些实施例所示的人体小臂位置处目标坐标系中的欧拉角数据的示例性向量坐标图。图16A是基于图15A中小臂在原始坐标系的欧拉角数据转换为目标坐标系中的向量坐标后获取的曲线图,其中,框线部分可以表示用户做该动作时小臂位置处的目标坐标系中的欧拉角数据。如图16A所示,框线部分内小臂向量[x,y,z]在第一位置和第二位置之间往复运动,其中第一位置是[0.2,-0.9,-0.38],第二位置是[0.1,-0.95,-0.3]。需要注意的是,小臂的每一次往复运动,第一位置和第二位置会有小幅度的偏差。
图16B是根据本申请一些实施例所示的人体小臂位置另一处的目标坐标系中的欧拉角数据的示例性向量坐标图。图16B是基于图15B中小臂在原始坐标系的欧拉角数据转换为目标坐标系中的向量坐标后获取的曲线图,其中,框线部分可以表示用户做同一动作(与图16A所示的动作相同的动作)时小臂位置另一处的目标坐标系中的欧拉角数据。如图16B所示,小臂向量[x,y,z]同样在第一位置和第二位置之间往复运动,其中第一位置是[0.2,-0.9,-0.38],第二位置是[0.1,-0.95,-0.3]。
结合图15A至图16B,从图15A和图15B中可以看出,由于两个姿态传感器的安装位置不同,原始坐标系下的欧拉角在取值范围和波动形式上有着很大的区别,将两个姿态传感器对应的原始坐标系的坐标信息分别转换为目标坐标系对应的向量坐标(例如,图16A和图16B中的向量坐标)后,可以得到两个近似相同的向量坐标,也就是说这种方法可以使得姿态信号对应的特征信息不受到传感器安装位置的影响。具体地,在图16A和图16B中可以看出两个姿态传感器在小臂上的安装位置不同,经过上述坐标转换后,得到了相同的向量坐标,即能够表征在坐姿夹胸过程中,手臂在状态一(手臂向右平举)和状态二(手臂向前平举)两个状态之间往复切换的过程。
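作为示意,下面的草图把目标坐标系下的欧拉角转换为“从手肘指向手腕”的小臂单位向量(这里沿用上文的目标坐标系定义,即X轴指向人体后方、Y轴指向人体右侧、Z轴指向人体上方;零姿态时小臂指向正前方[-1, 0, 0]以及ZYX旋转顺序均为示例性假设):

```python
import numpy as np
from scipy.spatial.transform import Rotation as R

def forearm_vector(z_deg, y_deg, x_deg):
    """由目标坐标系下的ZYX欧拉角得到小臂单位向量。
    假设零姿态时小臂指向人体正前方,即向量[-1, 0, 0]
    (按上文定义,X轴指向人体后方)。"""
    rot = R.from_euler("ZYX", [z_deg, y_deg, x_deg], degrees=True)
    v = rot.apply(np.array([-1.0, 0.0, 0.0]))
    return v / np.linalg.norm(v)

print(forearm_vector(0, 0, 0))    # [-1, 0, 0]:手臂向前平举
print(forearm_vector(90, 0, 0))   # 约为[0, -1, 0]:手臂向左侧平举
```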
图17是根据本申请一些实施例所示的肢体向量在目标坐标系中的向量坐标图。如图17所示,从上至下可以分别表示人体左手小臂(17-1)、右手小臂(17-2)、左手大臂(17-3)、右手大臂(17-4)、躯干(17-5)位置处姿态传感器在目标坐标系中的向量坐标。图17中示出了人体运动时各个位置(例如,17-1、17-2、17-3、17-4、17-5)在目标坐标系中的向量坐标。图17中前4200个点是对肢体进行标定所需要的标定动作,比如站立、躯干前行、手臂前伸、手臂侧平举等。使用前4200个点对应的标定动作进行标定,可以将姿态传感器采集到的原始数据转换为目标坐标系下的欧拉角。为了便于对数据进行分析,可以进一步转换为手臂向量在目标坐标系下的坐标向量。这里的目标坐标系是指向躯干前方是X轴,指向躯干左侧是Y轴,指向躯干上方是Z轴。图17中的往复性动作从左到右的动作1、动作2、动作3、动作4、动作5、动作6分别是坐姿夹胸、高位下拉、坐姿推胸、坐姿推肩、杠铃二头弯举、坐姿夹胸。从图17中可以看出,不同的动作有不同的动作模式,可以使用肢体向量很清晰地识别出来。同时,相同的动作也有很好的可重复性,比如动作1和动作6均表示坐姿夹胸动作,这两段动作的曲线有着较好的重复性。
在一些实施例中,姿态传感器模块直接输出的原始坐标系中的姿态数据(例如,欧拉角、角速度等)可以通过流程1300和流程1400转换为目标坐标系中的姿态数据,从而可以得到高一致性的姿态数据(例如,欧拉角、角速度、肢体向量坐标等)。
图18A是根据本申请一些实施例所示的原始角速度的示例性向量坐标图。这里的原始角速度可以理解为将不同安装角度的传感器在原始坐标系中采集的姿态数据转换到目标坐标系后得到的、未经滤波处理的角速度。在一些实施例中,用户运动过程中的抖动等因素会影响姿态数据中角速度的结果。如图18A所示,原始角速度在抖动等的影响下,其向量坐标曲线呈现出较为明显的不平滑。例如,原始角速度的向量坐标曲线中存在突变信号,使得原始角速度的向量坐标曲线不平滑。在一些实施例中,针对抖动等对角速度结果的影响,需要对抖动的角速度进行修正以得到平滑的向量坐标曲线。在一些实施例中,可以采用1Hz-3Hz低通滤波方法对原始角速度进行滤波处理。图18B是根据本申请一些实施例所示的滤波处理后的角速度的示例性结果图。如图18B所示,对原始角速度进行1Hz-3Hz的低通滤波处理后,可以消除抖动等对角速度的影响(例如,突变信号),使得角速度对应的向量坐标图可以呈现出较为平滑的曲线。在一些实施例中,对角速度进行1Hz-3Hz的低通滤波处理可以有效地规避抖动等对姿态数据(例如,欧拉角、角速度等)的影响,更加便于后续对信号分段的过程。在一些实施例中,滤波处理还可以滤除动作信号中的工频信号及其谐波信号、毛刺信号等。需要说明的是,1Hz-3Hz的低通滤波处理会引入系统时延,使得姿态信号获取的动作点与真实肌电信号的动作点有时间上的错位,因此需要在低通滤波处理后的向量坐标曲线的基础上减去低通滤波处理过程中产生的系统时延,以保证姿态信号和肌电信号在时间上的同步。在一些实施例中,系统时延与滤波器的中心频率相关联,当姿态信号和肌电信号采用不同的滤波器进行处理时,系统时延根据滤波器的中心频率做适应性调整。在一些实施例中,由于欧拉角的角度范围为[-180°,+180°],当实际欧拉角不在这个角度范围内时,获取的欧拉角可能会有-180°到+180°或+180°到-180°的跳变。例如,当角度为-181°时,欧拉角的角度会跳变为179°。在实际应用过程中跳变会影响角度差值的计算,需要先对跳变进行修正。
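下面的草图示意对角速度做1Hz-3Hz低通滤波、按滤波器群时延做粗略的时间对齐,以及利用相位展开修正欧拉角在±180°处的跳变(采样率、截止频率与时延补偿方式均为示例性假设):

```python
import numpy as np
from scipy.signal import butter, lfilter, group_delay

fs = 100.0                                     # 假设的姿态信号采样率(Hz)
b, a = butter(2, 2.0, btype="lowpass", fs=fs)  # 截止频率取1Hz-3Hz之间的2Hz

def smooth_angular_velocity(w):
    """对角速度做低通滤波,并按滤波器群时延粗略前移对齐,
    以保证姿态信号与肌电信号在时间上的同步(简化处理)。"""
    w_f = lfilter(b, a, w)
    _, gd = group_delay((b, a), fs=fs)
    delay = int(round(np.mean(gd)))            # 以平均群时延(样本数)近似
    return np.concatenate([w_f[delay:], np.full(delay, w_f[-1])])

def fix_euler_jump(angles_deg):
    """修正欧拉角在±180°处的跳变,例如-181°被记录为179°的情况。"""
    return np.degrees(np.unwrap(np.radians(angles_deg)))
```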
在一些实施例中,还可以利用动作识别模型对用户的动作信号或者动作信号对应的特征信息进行分析,从而识别出用户动作。在一些实施例中,动作识别模型包括经过训练的用来识别用户动作的机器学习模型。在一些实施例中,动作识别模型可以包括一个或多个机器学习模型。在一些实施例中,动作识别模型可以包括但不限于对用户动作信号进行分类的机器学习模型、识别用户动作质量的机器学习模型、识别用户动作次数的机器学习模型、识别用户执行动作的疲劳程度的机器学习模型中的一个或多个。在一些实施例中,机器学习模型可以包括线性分类模型(LR)、支持向量机模型(SVM)、朴素贝叶斯模型(NB)、K近邻模型(KNN)、决策树模型(DT)、集成模型(RF/GBDT等)等中的一种或多种。关于动作识别模型的内容可以参考本申请说明书其它地方,例如图20及其相关描述。
图19是根据本申请一些实施例所示的运动监控和反馈方法的示例性流程图。如图19所示,流程1900可以包括:
在步骤1910中,获取用户运动时的动作信号。
在一些实施例中,该步骤可以由获取模块210执行。在一些实施例中,动作信号至少包括肌电信号对应的特征信息和姿态信号对应的特征信息。动作信号是指用户运动时的人体参数信息。在一些实施例中,人体参数信息可以包括但不限于肌电信号、姿态信号、心率信号、温度信号、湿度信号、血氧浓度等中的一种或多种。在一些实施例中,动作信号可以至少包括肌电信号和姿态信号。在一些实施例中,获取模块210中的肌电传感器可以采集用户运动时的肌电信号,获取模块210中的姿态传感器可以采集用户运动时的姿态信号。
在步骤1920中,通过动作识别模型,基于所述动作信号对用户的运动动作进行监控,并基于动作识别模型的输出结果进行动作反馈。
在一些实施例中,该步骤可以由处理模块220和/或处理设备110执行。在一些实施例中,动作识别模型的输出结果可以包括但不限于动作类型、动作质量、动作数量、疲劳指数等中的一种或多种。例如,动作识别模型可以根据动作信号识别用户的动作类型为坐姿夹胸。又例如,动作识别模型中的一个机器学习模型可以根据动作信号先识别用户的动作类型为坐姿夹胸,动作识别模型中的另一个机器学习模型可以根据动作信号(例如,肌电信号的幅值信息、频率信息和/或姿态信号的角速度、角速度方向、角速度的加速度值)来输出用户动作的动作质量为标准动作或错误动作。在一些实施例中,动作反馈可以包括发出提示信息。在一些实施例中,提示信息可以包括但不限于语音提示、文字提示、图像提示、视频提示等。例如,动作识别模型的输出结果为错误动作,处理设备110可以控制可穿戴设备130或移动终端设备140向用户发出语音提示(例如,“动作不规范”等信息),用以提醒用户及时调整健身动作。又例如,动作识别模型的输出结果为标准动作,可穿戴设备130或移动终端设备140可以不发出提示信息,或者发出“动作标准”之类的提示信息。在一些实施例中,动作反馈也可以包括可穿戴设备130刺激用户运动的相应部位。例如,可穿戴设备130的元件通过振动反馈、电刺激反馈、压力反馈等方式刺激用户动作的对应部位。又例如,动作识别模型的输出结果为错误动作,处理设备110可以控制可穿戴设备130的元件刺激用户运动的相应部位。在一些实施例中,动作反馈还可以包括输出用户运动时的运动记录。这里的运动记录可以是指用户动作类型、运动时长、动作数量、动作质量、疲劳指数、运动时的生理参数信息等中的一个或多个。关于动作识别模型的内容可以参考本申请其他地方的描述,在此不做赘述。
应当注意的是,上述有关流程1900的描述仅仅是为了示例和说明,而不限定本说明书的适用范围。对于本领域技术人员来说,在本说明书的指导下可以对流程1900进行各种修正和改变。然而,这些修正和改变仍在本说明书的范围之内。
图20是根据本申请一些实施例所示的模型训练的应用的示例性流程图。如图20所示,流程2000可以包括:
在步骤2010中,获取样本信息。
在一些实施例中,该步骤可以由获取模块210执行。在一些实施例中,样本信息可以包括专业人员(例如,健身教练)和/或非专业人员运动时的动作信号。例如,样本信息可以包括专业人员和/或非专业人员在进行同种类型的动作(例如,坐姿夹胸)时产生的肌电信号和/或姿态信号。在一些实施例中,样本信息中的肌电信号和/或姿态信号可以经过流程700的分段处理、流程900的毛刺处理和流程1300的转换处理等,形成至少一段肌电信号和/或姿态信号。该至少一段肌电信号和/或姿态信号可以作为机器学习模型的输入来对机器学习模型进行训练。在一些实施例中,至少用一段肌电信号对应的特征信息和/或姿态信号对应的特征信息也可以作为机器学习模型的输入来对机器学习模型进行训练。例如,可以将肌电信号的频率信息和幅值信息作为机器学习模型的输入。又例如,可以将姿态信号的角速度、角速度方向、角速度的加速度值作为机器学习模型的输入。再例如,可以将动作信号的动作开始点、动作中间点和动作结束点作为机器学习模型的输入。在一些实施例中,样本信息可以是从处理设备110的存储设备中得到的。在一些实施例中,样本信息可以是从获取模块210中得到的。
在步骤2020中,训练动作识别模型。
该步骤可以由处理设备110执行。在一些实施例中,动作识别模型可以包括一个或多个机器学习模型。例如,动作识别模型可以包括但不限于对用户动作信号进行分类的机器学习模型、识别用户动作质量的机器学习模型、识别用户动作次数的机器学习模型、识别用户执行动作的疲劳程度的机器学习模型中的一个或多个。在一些实施例中,机器学习模型可以包括线性分类模型(LR)、支持向量机模型(SVM)、朴素贝叶斯模型(NB)、K近邻模型(KNN)、决策树模型(DT)、集成模型(RF/GBDT等)等中的一种或多种。
在一些实施例中,对机器学习模型的训练可以包括获取样本信息。在一些实施例中,样本信息可以包括专业人员(例如,健身教练)和/或非专业人员运动时的动作信号。例如,样本信息可以包括专业人员和/或非专业人员在进行同种类型的动作(例如,坐姿夹胸)时产生的肌电信号和/或姿态信号。在一些实施例中,样本信息中的肌电信号和/或姿态信号可以经过流程700的分段处理、流程900的毛刺处理和流程1300的转换处理等,形成至少一段肌电信号和/或姿态信号。该至少一段肌电信号和/或姿态信号可以作为机器学习模型的输入来对机器学习模型进行训练。在一些实施例中,至少用一段肌电信号对应的特征信息和/或姿态信号对应的特征信息也可以作为机器学习模型的输入来对机器学习模型进行训练。例如,可以将肌电信号的频率信息和幅值信息作为机器学习模型的输入。又例如,可以将姿态信号的角速度、角速度方向、角速度的加速度值作为机器学习模型的输入。再例如,可以将动作信号的动作开始点、动作中间点和/或动作结束点对应的信号(包括肌电信号和/或姿态信号)作为机器学习模型的输入。
在一些实施例中,训练识别用户动作类型的机器学习模型时,可以将来自不同动作类型的样本信息(每段肌电信号和/或姿态信号)进行打标签处理。例如,样本信息来自用户执行坐姿夹胸时产生的肌电信号和/或姿态信号可以标记为“1”,这里的“1”用于表征“坐姿夹胸”;样本信息来自用户执行二头弯举时产生的肌电信号和/或姿态信号可以标记为“2”,这里的“2”用于表征“二头弯举”。不同动作类型对应的肌电信号的特征信息(例如,频率信息、幅值信息)、姿态信号的特征信息(例如,角速度、角速度方向、角速度的加速度值)不同,将打标签的样本信息(例如,样本信息中肌电信号和/或姿态信号对应的特征信息)作为机器学习模型的输入来对机器学习模型进行训练,可以得到用于识别用户动作类型的动作识别模型,在该机器学习模型中输入动作信号可以输出对应的动作类型。
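仅作为示意,下面用scikit-learn训练一个根据动作信号特征识别动作类型的支持向量机模型(SVM),标签“1”、“2”的含义与上文一致(特征数据为随机生成的假设数据,实际应使用打标签后的肌电/姿态特征信息):

```python
import numpy as np
from sklearn.model_selection import train_test_split
from sklearn.svm import SVC

rng = np.random.default_rng(0)
# 每个样本假设由肌电特征(频率/幅值)与姿态特征(角速度等)拼接为8维向量
X = rng.normal(size=(200, 8))
y = rng.integers(1, 3, size=200)   # 标签“1”表征坐姿夹胸,“2”表征二头弯举

X_tr, X_te, y_tr, y_te = train_test_split(X, y, test_size=0.2, random_state=0)
model = SVC(kernel="rbf").fit(X_tr, y_tr)      # 支持向量机模型(SVM)
print("测试集识别准确率:", model.score(X_te, y_te))
```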
在一些实施例中,动作识别模型还可以包括用于判断用户动作质量的机器学习模型。这里的样本信息可以包括标准动作信号(也被称为正样本)和非标准动作信号(也被称为负样本)。标准动作信号可以包括专业人员执行标准动作时产生的动作信号。例如,专业人员在进行标准的坐姿夹胸运动时产生的动作信号为标准动作信号。非标准动作信号可以包括用户执行非标准动作(例如,错误动作)产生的动作信号。在一些实施例中,样本信息中的肌电信号和/或姿态信号可以经过流程700的分段处理、流程900的毛刺处理和流程1300的转换处理等,形成至少一段肌电信号和/或姿态信号。该至少一段肌电信号和/或姿态信号可以作为机器学习模型的输入来对机器学习模型进行训练。在一些实施例中,可以将样本信息(每段肌电信号和/或姿态信号)中的正样本和负样本进行打标签处理。例如,正样本标记为“1”,负样本标记为“0”。这里的“1”用于表征用户的动作为标准动作,这里的“0”用于表征用户的动作为错误动作。完成训练的机器学习模型可以根据输入的样本信息(例如,正样本,负样本)输出不同的标签。需要注意的是,动作识别模型可以包括一个或多个用于分析识别用户动作质量的机器学习模型,不同的机器学习模型可以分别分析识别来自不同动作类型的样本信息。
在一些实施例中,动作识别模型还可以包括识别用户健身动作的动作数量的模型。例如,将样本信息中的动作信号(例如,肌电信号和/或姿态信号)经过流程700的分段处理,得到至少一组动作开始点、动作中间点、动作结束点,对每组的动作开始点、动作中间点和动作结束点分别进行标记,比如,动作开始点标记为“1”,动作中间点标记为“2”,动作结束点标记为“3”,将标记作为机器学习模型的输入,在机器学习模型中输入一组连续的“1”、“2”、“3”可以输出1次动作。例如,在机器学习模型中输入3组连续的“1”、“2”、“3”可以输出3次动作。
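下面的草图按照上述标记方式统计动作次数:每出现一组连续的“1”(动作开始点)、“2”(动作中间点)、“3”(动作结束点)即计为一次动作(仅为示例性计数逻辑,并非本申请的限定实现):

```python
def count_actions(marks):
    """统计标记序列中连续出现的(1, 2, 3)组数,即动作次数。"""
    count, expect = 0, 1
    for m in marks:
        if m == expect:
            if m == 3:              # 一组“1、2、3”完整出现,计一次动作
                count += 1
                expect = 1
            else:
                expect += 1
        elif m == 1:                # 中途重新出现开始点则重新开始一组
            expect = 2
    return count

print(count_actions([1, 2, 3, 1, 2, 3, 1, 2, 3]))   # 输出3,对应3次动作
```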
在一些实施例中,动作识别模型还可以包括用于识别用户疲劳指数的机器学习模型。这里的样本信息还可以包括心电信号、呼吸频率、温度信号、湿度信号等其他生理参数信号。例如,心电信号的不同频率范围可以作为机器学习模型的输入数据,心电信号的频率在60次/min-100次/min范围内标记为“1”(正常),小于60次/min或大于100次/min标记为“2”(不正常)。在一些实施例中,还可以根据用户的心电信号频率进行进一步分段并标记不同的指数作为输入数据,完成训练的机器学习模型可以根据心电信号的频率输出对应的疲劳指数。在一些实施例中,还可以结合呼吸频率、温度信号等生理参数信号训练该机器学习模型。在一些实施例中,样本信息可以是从处理设备110的存储设备中得到的。在一些实施例中,样本信息可以是从获取模块210中得到的。需要注意的是,动作识别模型可以为上述任意一个机器学习模型,也可以为上述多个机器学习模型的组合,或者包括其它的机器学习模型,可以根据实际情况进行选择。另外,对机器学习模型的训练输入不限于一段(一个周期)的动作信号,还可以是一段信号中的部分动作信号,或者多段动作信号等。
在步骤2030中,提取动作识别模型。
在一些实施例中,该步骤可以由处理设备110执行。在一些实施例中,处理设备110和/或处理模块220可以提取动作识别模型。在一些实施例中,动作识别模型可以存储至处理设备110、处理模块220或移动终端中。
在步骤2040中,获取用户动作信号。
在一些实施例中,该步骤可以由获取模块210执行。例如,在一些实施例中,获取模块210中的肌电传感器可以获取用户的肌电信号,获取模块210中的姿态传感器可以采集用户的姿态信号。在一些实施例中,用户动作信号还可以包括用户运动时的心电信号、呼吸信号、温度信号、湿度信号等其他生理参数信号。在一些实施例中,获取用户动作信号之后可以对动作信号(例如,肌电信号和/或姿态信号)进行流程700的分段处理、流程900的毛刺处理和流程1300的转换处理等,形成至少一段肌电信号和/或姿态信号。
在步骤2050中,通过动作识别模型,基于用户动作信号判断用户动作。
该步骤可以由处理设备110和/或处理模块220执行。在一些实施例中,处理设备110和/或处理模块220可以基于动作识别模型判断用户动作。在一些实施例中,完成训练的动作识别模型可以包括一个或多个机器学习模型。在一些实施例中,动作识别模型可以包括但不限于对用户动作信号进行分类的机器学习模型、识别用户动作质量的机器学习模型、识别用户动作次数的机器学习模型、识别用户执行动作的疲劳指数的机器学习模型中的一个或多个。不同的机器学习模型可以具有不同的识别效果。例如,对用户动作信号进行分类的机器学习模型可以以用户的动作信号作为输入数据进而输出相应的动作类型。又例如,识别用户动作质量的机器学习模型可以以用户的动作信号作为输入数据进而输出动作的质量(例如,标准动作、错误动作)。再例如,识别用户执行动作的疲劳指数的机器学习模型可以以用户的动作信号(比如,心电信号频率)作为输入数据进而输出用户的疲劳指数。在一些实施例中,用户动作信号和机器学习模型的判断结果(输出)也可以作为训练动作识别模型的样本信息,对动作识别模型进行训练,以优化动作识别模型的相关参数。需要注意的是,动作识别模型不限于上述经过训练的机器学习模型,还可以为预先设定的模型,例如,人工预先设定的条件判断算法或在经过训练的机器学习模型的基础上人工增加参数(例如,置信度)等。
在步骤2060中,基于判断结果对用户动作进行反馈。
在一些实施例中,该步骤可以由可穿戴设备130和/或移动终端设备140执行。进一步地,处理设备110和/或处理模块220基于用户动作的判断结果向可穿戴设备130和/或移动终端设备140发出反馈指令,可穿戴设备130和/或移动终端设备140基于反馈指令对用户进行反馈。在一些实施例中,反馈可以包括发出提示信息(例如,文字信息、图片信息、视频信息、语音信息、指示灯信息等)和/或执行相应动作(电流刺激、振动、压力变化、热量变化等方式)刺激用户身体。例如,用户进行仰卧起坐动作时,通过对其动作信号进行监控,判断出其在运动过程中斜方肌用力过大(也就是说用户在运动过程中头部和颈部的动作不标准),在这种情况下可穿戴设备130中的输入/输出模块260(例如,震动提示器)和移动终端设备140(例如,智能手表、智能手机等)执行相应的反馈动作(例如,在用户身体部位施加振动,发出语音提示等)以提示用户及时调整发力部位。在一些实施例中,在用户运动过程中,通过对用户运动过程中的动作信号进行监控,判断出用户在运动过程的动作类型、动作质量、动作次数,移动终端设备140可以输出相应的运动记录,以便用户了解自己在运动过程中的运动情况。
在一些实施例中,对用户进行反馈时,反馈可以与用户感知相匹配。例如,用户动作不标准时对用户动作相应的区域进行振动刺激,用户基于振动刺激可以知晓动作不标准,而振动刺激在用户可接受的范围内。进一步地,可以基于用户动作信号与用户感知建立匹配模型,在用户感知和真实反馈之间寻找最佳平衡点。
在一些实施例中,还可以根据用户动作信号训练动作识别模型。在一些实施例中,根据用户动作信号训练动作识别模型可以包括对用户动作信号进行评估确定用户动作信号的置信度。置信度的大小可以表示用户动作信号的质量。例如,置信度越高,用户动作信号的质量越好。在一些实施例中,对用户动作信号进行评估可以是在采集动作信号、预处理、分段和/或识别等阶段进行。
在一些实施例中,根据用户动作信号训练动作识别模型还可以包括判断置信度是否大于置信度阈值(例如,80),若置信度大于或等于置信度阈值,则基于该置信度对应的用户动作信号作为样本数据训练动作识别模型;若置信度小于置信度阈值,则该置信度对应的用户动作信号不作为样本数据训练动作识别模型。在一些实施例中,置信度可以包括但不限于采集动作信号、信号预处理、信号分段或信号识别等任意一个阶段的置信度。例如,以获取模块210采集到的动作信号的置信度作为判断标准。在一些实施例中,置信度还可以是采集动作信号、信号预处理、信号分段或信号识别等任意几个阶段的联合置信度。联合置信度可以基于每个阶段的置信度并采用平均或加权等方式进行计算。在一些实施例中,根据用户动作信号训练动作识别模型可以是实时、定期(例如,一天、一周、一个月等)或满足一定数据量进行训练。
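下面给出基于置信度筛选训练样本的一个示意性片段(联合置信度此处以各阶段置信度的简单平均为例,阈值80为上文提到的示例值,加权方式等均为假设):

```python
def select_training_samples(samples, threshold=80.0):
    """samples: [(动作信号, 各阶段置信度列表), ...]
    以各阶段置信度的平均值作为联合置信度,
    仅保留联合置信度不低于阈值的动作信号作为训练样本。"""
    selected = []
    for signal, stage_confs in samples:
        joint = sum(stage_confs) / len(stage_confs)  # 也可以改为加权平均
        if joint >= threshold:
            selected.append(signal)
    return selected
```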
应当注意的是,上述有关流程2000的描述仅仅是为了示例和说明,而不限定本说明书的适用范围。对于本领域技术人员来说,在本说明书的指导下可以对流程2000进行各种修正和改变。然而,这些修正和改变仍在本说明书的范围之内。
上文已对基本概念做了描述,显然,对于本领域技术人员来说,上述详细披露仅仅作为示例,而并不构成对本申请的限定。虽然此处并没有明确说明,本领域技术人员可能会对本申请进行各种修改、改进和修正。该类修改、改进和修正在本申请中被建议,所以该类修改、改进、修正仍属于本申请示范实施例的精神和范围。
同时,本申请使用了特定词语来描述本申请的实施例。如“一个实施例”、“一实施例”、和/或“一些实施例”意指与本申请至少一个实施例相关的某一特征、结构或特点。因此,应强调并注意的是,本说明书中在不同位置两次或多次提及的“一实施例”或“一个实施例”或“一个替代性实施例”并不一定是指同一实施例。此外,本申请的一个或多个实施例中的某些特征、结构或特点可以进行适当的组合。
此外,本领域技术人员可以理解,本申请的各方面可以通过若干具有可专利性的种类或情况进行说明和描述,包括任何新的和有用的工序、机器、产品或物质的组合,或对他们的任何新的和有用的改进。相应地,本申请的各个方面可以完全由硬件执行、可以完全由软件(包括固件、常驻软件、微码等)执行、也可以由硬件和软件组合执行。以上硬件或软件均可被称为“数据块”、“模块”、“引擎”、“单元”、“组件”或“系统”。此外,本申请的各方面可能表现为位于一个或多个计算机可读介质中的计算机产品,该产品包括计算机可读程序编码。
计算机存储介质可能包含一个内含有计算机程序编码的传播数据信号,例如在基带上或作为载波的一部分。该传播信号可能有多种表现形式,包括电磁形式、光形式等,或合适的组合形式。计算机存储介质可以是除计算机可读存储介质之外的任何计算机可读介质,该介质可以通过连接至一个指令执行系统、装置或设备以实现通讯、传播或传输供使用的程序。位于计算机存储介质上的程序编码可以通过任何合适的介质进行传播,包括无线电、电缆、光纤电缆、RF、或类似介质,或任何上述介质的组合。
本申请各部分操作所需的计算机程序编码可以用任意一种或多种程序语言编写,包括面向对象编程语言如Java、Scala、Smalltalk、Eiffel、JADE、Emerald、C++、C#、VB.NET、Python等,常规程序化编程语言如C语言、Visual Basic、Fortran 2003、Perl、COBOL 2002、PHP、ABAP,动态编程语言如Python、Ruby和Groovy,或其他编程语言等。该程序编码可以完全在用户计算机上运行、或作为独立的软件包在用户计算机上运行、或部分在用户计算机上运行部分在远程计算机运行、或完全在远程计算机或处理设备上运行。在后种情况下,远程计算机可以通过任何网络形式与用户计算机连接,比如局域网(LAN)或广域网(WAN),或连接至外部计算机(例如通过因特网),或在云计算环境中,或作为服务使用如软件即服务(SaaS)。
此外,除非权利要求中明确说明,本申请所述处理元素和序列的顺序、数字字母的使用、或其他名称的使用,并非用于限定本申请流程和方法的顺序。尽管上述披露中通过各种示例讨论了一些目前认为有用的发明实施例,但应当理解的是,该类细节仅起到说明的目的,附加的权利要求并不仅限于披露的实施例,相反,权利要求旨在覆盖所有符合本申请实施例实质和范围的修正和等价组合。例如,虽然以上所描述的系统组件可以通过硬件设备实现,但是也可以只通过软件的解决方案得以实现,如在现有的处理设备或移动设备上安装所描述的系统。
同理,应当注意的是,为了简化本申请披露的表述,从而帮助对一个或多个发明实施例的理解,前文对本申请实施例的描述中,有时会将多种特征归并至一个实施例、附图或对其的描述中。但是,这种披露方法并不意味着本申请对象所需要的特征比权利要求中提及的特征多。实际上,实施例的特征要少于上述披露的单个实施例的全部特征。
一些实施例中使用了描述成分、属性数量的数字,应当理解的是,此类用于实施例描述的数字,在一些示例中使用了修饰词“大约”、“近似”或“大体上”来修饰。除非另外说明,“大约”、“近似”或“大体上”表明所述数字允许有±20%的变化。相应地,在一些实施例中,说明书和权利要求中使用的数值参数均为近似值,该近似值根据个别实施例所需特点可以发生改变。在一些实施例中,数值参数应考虑规定的有效数位并采用一般位数保留的方法。尽管本申请一些实施例中用于确认其范围广度的数值域和参数为近似值,在具体实施例中,此类数值的设定在可行范围内尽可能精确。
针对本申请引用的每个专利、专利申请、专利申请公开物和其他材料,如文章、书籍、说明书、出版物、文档等,特此将其全部内容并入本申请作为参考。与本申请内容不一致或产生冲突的申请历史文件除外,对本申请权利要求最广范围有限制的文件(当前或之后附加于本申请中的)也除外。需要说明的是,如果本申请附属材料中的描述、定义、和/或术语的使用与本申请所述内容有不一致或冲突的地方,以本申请的描述、定义和/或术语的使用为准。
最后,应当理解的是,本申请中所述实施例仅用以说明本申请实施例的原则。其他的变形也可能属于本申请的范围。因此,作为示例而非限制,本申请实施例的替代配置可视为与本申请的教导一致。相应地,本申请的实施例不仅限于本申请明确介绍和描述的实施例。

Claims (23)

  1. 一种运动监控方法,其特征在于,包括:
    获取用户运动时的动作信号,其中,所述动作信号至少包括肌电信号或姿态信号;以及
    至少基于所述肌电信号对应的特征信息或所述姿态信号对应的特征信息对所述用户运动的动作进行监控。
  2. 根据权利要求1所述的方法,其特征在于,至少基于所述肌电信号对应的特征信息或所述姿态信号对应的特征信息对所述用户运动的动作进行监控包括:
    基于与所述肌电信号对应的特征信息或与所述姿态信号对应的特征信息对所述动作信号进行分段;以及
    基于至少一段所述动作信号对所述用户运动的动作进行监控。
  3. 根据权利要求2所述的方法,其特征在于,所述肌电信号对应的特征信息至少包括频率信息或幅值信息,所述姿态信号对应的特征信息至少包括角速度方向、角速度值、角速度的加速度值、角度、位移信息、应力中的其中一个。
  4. 根据权利要求3所述的方法,其特征在于,所述基于与所述肌电信号对应的特征信息或所述姿态信号对应的特征信息对所述动作信号进行分段包括:
    基于所述肌电信号或所述姿态信号的时域窗口,根据预设条件从所述时域窗口内确定至少一个目标特征点;以及
    基于所述至少一个目标特征点对所述动作信号进行分段。
  5. 根据权利要求4所述的方法,其特征在于,所述至少一个目标特征点包括动作开始点、动作中间点、动作结束点中的一种。
  6. 根据权利要求5所述的方法,其特征在于,所述预设条件包括所述姿态信号对应的角速度方向发生变化、所述姿态信号对应的角速度大于或等于角速度阈值、所述姿态信号对应的角速度值的变化值为极值、所述姿态信号对应的角度达到角度阈值、所述肌电信号对应的幅值信息大于或等于肌电阈值中的一个或多个。
  7. 根据权利要求6所述的方法,其特征在于,所述预设条件还包括所述姿态信号对应的角速度的加速度在第一特定时间范围内持续大于或等于所述角速度的加速度阈值。
  8. 根据权利要求6所述的方法,其特征在于,所述预设条件还包括所述肌电信号对应的幅值在第二特定时间范围内持续大于所述肌电阈值。
  9. 根据权利要求1所述的方法,其特征在于,所述至少基于所述肌电信号对应的特征信息或所述姿态信号对应的特征信息对所述用户运动的动作进行监控包括:
    在频域或时域上对所述肌电信号进行预处理;以及
    基于预处理后的所述肌电信号获取所述肌电信号对应的特征信息,并根据所述肌电信号对应的特征信息或所述姿态信号对应的特征信息对所述用户运动的动作进行监控。
  10. 根据权利要求9所述的方法,其特征在于,所述在频域或时域上对所述肌电信号进行预处理包括:对所述肌电信号进行滤波以在频域上选取所述肌电信号中特定频率范围的成分。
  11. 根据权利要求9所述的方法,其特征在于,所述在频域或时域上对所述肌电信号进行预处理包括在时域上对所述肌电信号进行信号校正处理。
  12. 根据权利要求11所述的方法,其特征在于,所述在时域上对所述肌电信号进行信号校正处理包括:
    确定所述肌电信号中的奇异点,所述奇异点对应所述肌电信号中的突变信号;以及
    对所述肌电信号的奇异点进行信号校正处理。
  13. 根据权利要求12所述的方法,其特征在于,所述对所述肌电信号的奇异点进行信号校正处理包括去除所述奇异点或者根据所述奇异点周围的信号对所述奇异点进行修正。
  14. 根据权利要求12所述的方法,其特征在于,所述奇异点包括毛刺信号,所述确定所述肌电信号中的奇异点包括:
    基于所述肌电信号的时域窗口,从所述肌电信号的时域窗口内选取不同的时间窗口,其中,所述不同的时间窗口分别覆盖不同的时间范围;以及
    基于所述不同的时间窗口中肌电信号对应的特征信息确定所述毛刺信号。
  15. 根据权利要求1所述的方法,其特征在于,还包括基于所述姿态信号确定与所述姿态信号对应的特征信息,其中,所述姿态信号包括至少一个原始坐标系中的坐标信息;
    所述基于所述姿态信号确定与所述姿态信号对应的特征信息包括:
    获取目标坐标系以及所述目标坐标系与所述至少一个原始坐标系之间的转换关系;
    基于所述转换关系,将所述至少一个原始坐标系中的坐标信息转换为所述目标坐标系中的坐标信息;以及
    基于所述目标坐标系中的坐标信息,确定与所述姿态信号对应的特征信息。
  16. 根据权利要求15所述的方法,其特征在于,所述姿态信号包括由至少两个传感器产生的坐标信息,所述至少两个传感器分别位于用户的不同运动部位并且对应不同的原始坐标系,所述基于所述姿态信号确定与所述姿态信号对应的特征信息包括:
    基于所述不同的原始坐标系与所述目标坐标系的转换关系,确定与所述至少两个传感器分别对应的特征信息;以及
    基于与所述至少两个传感器分别对应的特征信息,确定用户的不同运动部位之间的相对运动。
  17. 根据权利要求15所述的方法,其特征在于,所述至少一个原始坐标系与所述目标坐标系之间的转换关系通过标定过程获得,所述标定过程包括:
    构建特定坐标系,所述特定坐标系与标定过程中用户的朝向有关;
    获取用户处于第一姿势时所述至少一个原始坐标系中的第一坐标信息;
    获取用户处于第二姿势时所述至少一个原始坐标系中的第二坐标信息;以及
    根据所述第一坐标信息、第二坐标信息和所述特定坐标系确定所述至少一个原始坐标系与所述特定坐标系的转换关系。
  18. 根据权利要求17所述的方法,其特征在于,所述标定过程还包括:
    获取所述特定坐标系与所述目标坐标系的转换关系;以及
    根据所述至少一个原始坐标系与所述特定坐标系的转换关系,以及所述特定坐标系与所述目标坐标系的转换关系,确定所述至少一个原始坐标系与所述目标坐标系之间的转换关系。
  19. 根据权利要求15所述的方法,其特征在于,所述目标坐标系随着用户的朝向变化而改变。
  20. 一种确定动作识别模型的训练方法,其特征在于,包括:
    获取样本信息,所述样本信息包括用户运动时的动作信号,所述动作信号至少包括肌电信号对应的特征信息和姿态信号对应的特征信息;以及
    基于所述样本信息训练所述动作识别模型。
  21. 一种运动监控和反馈方法,其特征在于,包括:
    获取用户运动时的动作信号,其中,所述动作信号至少包括肌电信号和姿态信号;以及
    通过动作识别模型,基于所述肌电信号对应的特征信息和所述姿态信号对应的特征信息对用户的动作进行监控,并基于动作识别模型的输出结果进行动作反馈。
  22. 根据权利要求21所述的方法,其特征在于,所述动作识别模型包括经过训练的机器学习模型或预先设定的模型。
  23. 根据权利要求21所述的方法,其特征在于,包括:所述动作反馈至少包括发出提示信息、刺激用户的运动部位、输出用户运动时的运动记录中的一种。
PCT/CN2021/081931 2021-03-19 2021-03-19 一种运动监控方法及其系统 WO2022193330A1 (zh)

Priority Applications (29)

Application Number Priority Date Filing Date Title
KR1020237016055A KR20230086750A (ko) 2021-03-19 2021-03-19 운동감시방법및 운동감시시스템
PCT/CN2021/081931 WO2022193330A1 (zh) 2021-03-19 2021-03-19 一种运动监控方法及其系统
JP2023528497A JP2023549242A (ja) 2021-03-19 2021-03-19 運動監視方法及びそのシステム
EP21930919.2A EP4201323A4 (en) 2021-03-19 2021-03-19 EXERCISE MONITORING METHOD AND SYSTEM
CN202180070833.3A CN116981401A (zh) 2021-03-19 2021-03-19 一种运动监控方法及其系统
CN202110516387.6A CN115115751A (zh) 2021-03-19 2021-05-12 运动数据展示方法和系统
CN202180064627.1A CN116963807A (zh) 2021-03-19 2021-05-12 运动数据展示方法和系统
PCT/CN2021/093302 WO2022193425A1 (zh) 2021-03-19 2021-05-12 运动数据展示方法和系统
CN202210103219.9A CN115105056A (zh) 2021-03-19 2022-01-27 识别用户动作的方法和系统
CN202210103211.2A CN115105100A (zh) 2021-03-19 2022-01-27 运动数据处理方法和运动监控系统
KR1020237007354A KR20230044297A (ko) 2021-03-19 2022-01-27 사용자 동작을 식별하기 위한 방법 및 시스템
PCT/CN2022/074379 WO2022193851A1 (zh) 2021-03-19 2022-01-27 识别用户动作的方法和系统
CN202280005956.3A CN116261749A (zh) 2021-03-19 2022-01-27 识别用户动作的方法和系统
EP22770210.7A EP4167129A4 (en) 2021-03-19 2022-01-27 METHOD AND SYSTEM FOR DETECTING USER ACTIONS
EP22743434.7A EP4085834A4 (en) 2021-03-19 2022-01-27 EXERCISE DATA PROCESSING METHOD AND EXERCISE MONITORING SYSTEM
JP2023514098A JP2023540286A (ja) 2021-03-19 2022-01-27 ユーザー動作を識別する方法及びシステム
PCT/CN2022/074377 WO2022193850A1 (zh) 2021-03-19 2022-01-27 运动数据处理方法和运动监控系统
KR1020227032041A KR20220142495A (ko) 2021-03-19 2022-01-27 운동 데이터 처리방법 및 운동감시시스템
JP2022560093A JP7455995B2 (ja) 2021-03-19 2022-01-27 運動データ処理方法及び運動監視システム
JP2023535549A JP2023553625A (ja) 2021-03-19 2022-03-18 運動監視方法及びデバイス
KR1020237016947A KR20230091961A (ko) 2021-03-19 2022-03-18 운동감시방법 및 운동감시장치
PCT/CN2022/081718 WO2022194281A1 (zh) 2021-03-19 2022-03-18 一种运动监控方法和设备
EP22770633.0A EP4202667A1 (en) 2021-03-19 2022-03-18 Motion monitoring method and device
TW111110179A TWI837620B (zh) 2021-03-19 2022-03-18 運動監控方法及系統
CN202280006986.6A CN117157622A (zh) 2021-03-19 2022-03-18 一种运动监控方法和设备
US17/815,567 US20220365600A1 (en) 2021-03-19 2022-07-27 Motion data processing method and motion monitoring system
US18/155,703 US20230154607A1 (en) 2021-03-19 2023-01-17 Methods and systems for identifying user action
US18/182,373 US20230210402A1 (en) 2021-03-19 2023-03-13 Methods and devices for motion monitoring
US18/183,923 US20230233103A1 (en) 2021-03-19 2023-03-14 Motion monitoring methods and systems

Applications Claiming Priority (1)

Application Number Priority Date Filing Date Title
PCT/CN2021/081931 WO2022193330A1 (zh) 2021-03-19 2021-03-19 一种运动监控方法及其系统

Related Child Applications (1)

Application Number Title Priority Date Filing Date
US18/183,923 Continuation US20230233103A1 (en) 2021-03-19 2023-03-14 Motion monitoring methods and systems

Publications (1)

Publication Number Publication Date
WO2022193330A1 true WO2022193330A1 (zh) 2022-09-22

Family

ID=83321834

Family Applications (4)

Application Number Title Priority Date Filing Date
PCT/CN2021/081931 WO2022193330A1 (zh) 2021-03-19 2021-03-19 一种运动监控方法及其系统
PCT/CN2021/093302 WO2022193425A1 (zh) 2021-03-19 2021-05-12 运动数据展示方法和系统
PCT/CN2022/074379 WO2022193851A1 (zh) 2021-03-19 2022-01-27 识别用户动作的方法和系统
PCT/CN2022/074377 WO2022193850A1 (zh) 2021-03-19 2022-01-27 运动数据处理方法和运动监控系统

Family Applications After (3)

Application Number Title Priority Date Filing Date
PCT/CN2021/093302 WO2022193425A1 (zh) 2021-03-19 2021-05-12 运动数据展示方法和系统
PCT/CN2022/074379 WO2022193851A1 (zh) 2021-03-19 2022-01-27 识别用户动作的方法和系统
PCT/CN2022/074377 WO2022193850A1 (zh) 2021-03-19 2022-01-27 运动数据处理方法和运动监控系统

Country Status (6)

Country Link
US (3) US20220365600A1 (zh)
EP (3) EP4201323A4 (zh)
JP (3) JP2023549242A (zh)
KR (3) KR20230086750A (zh)
CN (5) CN116981401A (zh)
WO (4) WO2022193330A1 (zh)

Families Citing this family (2)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN115826748A (zh) * 2022-11-26 2023-03-21 广东御腾网络科技发展有限公司 一种基于智能手环的动作识别方法及装置
CN116153510B (zh) * 2023-02-17 2024-04-16 河南翔宇医疗设备股份有限公司 矫正镜控制方法、装置、设备、存储介质及智能矫正镜

Citations (6)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US20140135960A1 (en) * 2012-11-15 2014-05-15 Samsung Electronics Co., Ltd. Wearable device, display device, and system to provide exercise service and methods thereof
CN104706359A (zh) * 2015-04-01 2015-06-17 深圳柔微传感科技有限公司 一种实现运动实时监测的方法和智能服装
CN207071088U (zh) * 2017-01-25 2018-03-06 杭州三目科技有限公司 一种基于服装的人体运动监测、分析和反馈装置
CN110327048A (zh) * 2019-03-11 2019-10-15 浙江工业大学 一种基于可穿戴式惯性传感器的人体上肢姿态重建系统
CN110609621A (zh) * 2019-09-17 2019-12-24 南京茂森电子技术有限公司 姿态标定方法及基于微传感器的人体运动捕获系统
CN112214109A (zh) * 2020-09-30 2021-01-12 深圳市润谊泰益科技有限责任公司 基于肌电和姿态数据的复合控制方法、装置及系统

Family Cites Families (29)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
JP4867364B2 (ja) 2006-01-27 2012-02-01 横浜ゴム株式会社 生体電気情報計測装置
TWI393579B (zh) * 2009-11-13 2013-04-21 Inst Information Industry The state of the muscle movement state analysis system, methods and computer program products
US20140207017A1 (en) * 2013-01-23 2014-07-24 Altec, Inc. Signal quality monitor for electromyographic sensors
US10485444B2 (en) 2014-10-17 2019-11-26 G-Tech Medical, Inc. Systems and methods for processing electromyographic signals of the gastrointestinal tract
JP2016150119A (ja) 2015-02-17 2016-08-22 日本電信電話株式会社 運動状態判定方法、装置、及びプログラム
CN105635669B (zh) * 2015-12-25 2019-03-01 北京迪生数字娱乐科技股份有限公司 基于三维运动捕捉数据与实拍视频的动作对比系统及方法
JP6527830B2 (ja) 2016-02-15 2019-06-05 日本電信電話株式会社 生体信号処理装置、方法、およびプログラム
CN105997064B (zh) * 2016-05-17 2018-10-23 成都奥特为科技有限公司 一种用于人体下肢表面肌电信号的辨识方法
CN106073793B (zh) * 2016-06-13 2019-03-15 中南大学 基于微惯性传感器的姿态跟踪与识别方法
US10990174B2 (en) * 2016-07-25 2021-04-27 Facebook Technologies, Llc Methods and apparatus for predicting musculo-skeletal position information using wearable autonomous sensors
CN107361773B (zh) * 2016-11-18 2019-10-22 深圳市臻络科技有限公司 用于检测、缓解帕金森异常步态的装置
JP6831219B2 (ja) 2016-11-25 2021-02-17 エヌ・ティ・ティ・コミュニケーションズ株式会社 バイタル信号取得装置、バイタル信号取得方法及びコンピュータプログラム
CN108143409B (zh) * 2016-12-06 2021-01-22 中国移动通信有限公司研究院 睡眠阶段分期方法及装置
CN108211309A (zh) * 2017-05-25 2018-06-29 深圳市未来健身衣科技有限公司 健身运动的指导方法及装置
CN108211308B (zh) * 2017-05-25 2019-08-16 深圳市前海未来无限投资管理有限公司 一种运动效果展示方法及装置
CN108209910A (zh) * 2017-05-25 2018-06-29 深圳市未来健身衣科技有限公司 健身运动数据的反馈方法及装置
CN108566520B (zh) * 2017-05-25 2020-10-20 深圳市前海未来无限投资管理有限公司 视频数据和运动效果动画的同步方法及装置
CN108211310B (zh) * 2017-05-25 2019-08-16 深圳市前海未来无限投资管理有限公司 运动效果的展示方法及装置
JP6857573B2 (ja) 2017-08-08 2021-04-14 日本電信電話株式会社 筋電計測装置、方法及びプログラム
CN107349594B (zh) * 2017-08-31 2019-03-19 华中师范大学 一种虚拟舞蹈系统的动作评价方法
US20200310541A1 (en) * 2019-03-29 2020-10-01 Facebook Technologies, Llc Systems and methods for control schemes based on neuromuscular data
US11246531B2 (en) * 2018-05-10 2022-02-15 MAD Apparel, Inc. Fatigue measurement in a sensor equipped garment
US11590402B2 (en) * 2018-05-31 2023-02-28 The Quick Board, Llc Automated physical training system
CN109068081A (zh) * 2018-08-10 2018-12-21 北京微播视界科技有限公司 视频生成方法、装置、电子设备及存储介质
CN109191588B (zh) * 2018-08-27 2020-04-07 百度在线网络技术(北京)有限公司 运动教学方法、装置、存储介质及电子设备
US10902289B2 (en) * 2019-03-22 2021-01-26 Salesforce.Com, Inc. Two-stage online detection of action start in untrimmed videos
CN110478883B (zh) * 2019-08-21 2021-04-13 南京信息工程大学 一种健身动作教学及矫正系统及方法
CN110569775A (zh) * 2019-08-30 2019-12-13 武汉纺织大学 一种识别人体姿势的方法、系统、存储介质及电子设备
CN111317446B (zh) * 2020-02-27 2020-09-08 中国人民解放军空军特色医学中心 基于人体肌肉表面电信号的睡眠结构自动分析方法

Patent Citations (6)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US20140135960A1 (en) * 2012-11-15 2014-05-15 Samsung Electronics Co., Ltd. Wearable device, display device, and system to provide exercise service and methods thereof
CN104706359A (zh) * 2015-04-01 2015-06-17 深圳柔微传感科技有限公司 一种实现运动实时监测的方法和智能服装
CN207071088U (zh) * 2017-01-25 2018-03-06 杭州三目科技有限公司 一种基于服装的人体运动监测、分析和反馈装置
CN110327048A (zh) * 2019-03-11 2019-10-15 浙江工业大学 一种基于可穿戴式惯性传感器的人体上肢姿态重建系统
CN110609621A (zh) * 2019-09-17 2019-12-24 南京茂森电子技术有限公司 姿态标定方法及基于微传感器的人体运动捕获系统
CN112214109A (zh) * 2020-09-30 2021-01-12 深圳市润谊泰益科技有限责任公司 基于肌电和姿态数据的复合控制方法、装置及系统

Non-Patent Citations (1)

* Cited by examiner, † Cited by third party
Title
See also references of EP4201323A4 *

Also Published As

Publication number Publication date
US20230154607A1 (en) 2023-05-18
EP4085834A1 (en) 2022-11-09
CN116981401A (zh) 2023-10-31
EP4201323A4 (en) 2024-01-24
JP2023521655A (ja) 2023-05-25
KR20230044297A (ko) 2023-04-03
EP4201323A1 (en) 2023-06-28
CN115105100A (zh) 2022-09-27
KR20220142495A (ko) 2022-10-21
JP2023540286A (ja) 2023-09-22
WO2022193851A1 (zh) 2022-09-22
CN115115751A (zh) 2022-09-27
WO2022193850A1 (zh) 2022-09-22
WO2022193425A1 (zh) 2022-09-22
US20230233103A1 (en) 2023-07-27
CN115105056A (zh) 2022-09-27
EP4167129A4 (en) 2024-01-24
JP2023549242A (ja) 2023-11-22
EP4085834A4 (en) 2023-08-16
JP7455995B2 (ja) 2024-03-26
KR20230086750A (ko) 2023-06-15
US20220365600A1 (en) 2022-11-17
CN116261749A (zh) 2023-06-13
EP4167129A1 (en) 2023-04-19

Similar Documents

Publication Publication Date Title
US20180055375A1 (en) Systems and methods for determining an intensity level of an exercise using photoplethysmogram (ppg)
Yoon et al. Improvement of dynamic respiration monitoring through sensor fusion of accelerometer and gyro-sensor
US20230233103A1 (en) Motion monitoring methods and systems
US11763696B2 (en) Systems and methods for facilitating mind-body-emotion state self-adjustment and functional skills development by way of biofeedback and environmental monitoring
Yang et al. A wearable activity recognition device using air-pressure and IMU sensors
CN107961523A (zh) 基于心率检测的人体训练系统和智能健身系统
CN109222909A (zh) 一种可穿戴式智能监测装置及监测运动、脊椎弯曲和关节磨损的方法
Wang et al. Motion analysis of deadlift for trainers with different levels based on body sensor network
US20230210402A1 (en) Methods and devices for motion monitoring
CN107050800A (zh) 太极教导系统及方法
CN206404266U (zh) 太极教导系统
CN115105819A (zh) 一种运动监控方法及其系统
TWI837620B (zh) 運動監控方法及系統
TW202239378A (zh) 運動監控方法及系統
Celić et al. WBAN for physical activity monitoring in health care and wellness
CN116785659A (zh) 一种运动监控方法和设备
RU2813471C1 (ru) Способы и системы идентификации действия пользователя
US20230337989A1 (en) Motion data display method and system
Ivanov et al. Recognition and Control of the Athlete's Movements Using a Wearable Electronics System
WO2024055186A1 (zh) 一种运动评估方法及系统
Liang et al. WMS: Wearables-Based Multi-Sensor System for In-home Fitness Guidance
CN206365887U (zh) 一种心电图仪
Guiry et al. The role of smartphones as an assistive aid in mental health

Legal Events

Date Code Title Description
121 Ep: the epo has been informed by wipo that ep was designated in this application

Ref document number: 21930919

Country of ref document: EP

Kind code of ref document: A1

ENP Entry into the national phase

Ref document number: 2021930919

Country of ref document: EP

Effective date: 20230322

REG Reference to national code

Ref country code: BR

Ref legal event code: B01A

Ref document number: 112023006763

Country of ref document: BR

WWE Wipo information: entry into national phase

Ref document number: 202180070833.3

Country of ref document: CN

ENP Entry into the national phase

Ref document number: 20237016055

Country of ref document: KR

Kind code of ref document: A

WWE Wipo information: entry into national phase

Ref document number: 2023528497

Country of ref document: JP

NENP Non-entry into the national phase

Ref country code: DE