CN115105819A - Motion monitoring method and system

Info

Publication number
CN115105819A
Authority
CN
China
Prior art keywords
signal
user
motion
electromyographic
coordinate system
Prior art date
Legal status
Pending
Application number
CN202110298643.9A
Other languages
Chinese (zh)
Inventor
苏雷
周鑫
黎美琪
廖风云
Current Assignee
Shenzhen Voxtech Co Ltd
Original Assignee
Shenzhen Voxtech Co Ltd
Priority date
Filing date
Publication date
Application filed by Shenzhen Voxtech Co Ltd
Priority to CN202110298643.9A
Priority to TW111110179A (TWI837620B)
Publication of CN115105819A
Legal status: Pending

Classifications

    • A HUMAN NECESSITIES
    • A63 SPORTS; GAMES; AMUSEMENTS
    • A63B APPARATUS FOR PHYSICAL TRAINING, GYMNASTICS, SWIMMING, CLIMBING, OR FENCING; BALL GAMES; TRAINING EQUIPMENT
    • A63B71/00 Games or sports accessories not covered in groups A63B1/00 - A63B69/00
    • A63B71/06 Indicating or scoring devices for games or players, or for other sports activities
    • A63B71/0605 Decision makers and devices using detection means facilitating arbitration
    • A63B71/0619 Displays, user interfaces and indicating devices, specially adapted for sport equipment, e.g. display mounted on treadmills
    • A63B71/0622 Visual, audio or audio-visual systems for entertaining, instructing or motivating the user

Abstract

The application discloses a motion monitoring method, comprising: acquiring an action signal of a user during exercise, the action signal comprising at least an electromyographic signal or a posture signal; and monitoring the action of the user's movement based at least on feature information corresponding to the electromyographic signal or feature information corresponding to the posture signal.

Description

Motion monitoring method and system
Technical Field
The application relates to the technical field of wearable equipment, in particular to a motion monitoring method and system.
Background
As people pay increasing attention to scientific exercise and physical health, exercise monitoring devices are developing rapidly. At present, exercise monitoring devices mainly monitor some physiological parameters (such as heart rate, body temperature, cadence, blood oxygen, and the like) during a user's exercise and cannot accurately monitor and give feedback on the user's actions. In practice, monitoring and correcting a user's actions often requires the participation of professionals; for example, in a fitness scenario, a user can usually only correct a fitness action under the guidance of a fitness trainer.
Accordingly, it is desirable to provide an exercise monitoring device that can guide a person's exercise and thereby help the user exercise scientifically.
Disclosure of Invention
One aspect of the present application provides a motion monitoring method, comprising: acquiring an action signal of a user during exercise, the action signal comprising at least an electromyographic signal or a posture signal; and monitoring the action of the user's movement based at least on feature information corresponding to the electromyographic signal or feature information corresponding to the posture signal.
In some embodiments, monitoring the action of the user's movement based at least on the feature information corresponding to the electromyographic signal or the feature information corresponding to the posture signal comprises: segmenting the action signal based on the feature information corresponding to the electromyographic signal or the feature information corresponding to the posture signal; and monitoring the action of the user's movement based on at least one segment of the action signal.
In some embodiments, the feature information corresponding to the electromyographic signal comprises at least frequency information or amplitude information, and the feature information corresponding to the posture signal comprises at least one of angular velocity direction, angular velocity value, angular acceleration value, angle, displacement information, and stress.
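For illustration, the following Python sketch computes two such features from a window of EMG samples; the specific choices (RMS amplitude for the amplitude information, mean power frequency for the frequency information) and the function name are assumptions, not features prescribed by the application.

```python
import numpy as np

def emg_features(emg, fs=1000):
    """Illustrative amplitude/frequency features for one EMG channel.

    `emg` is a 1-D array of samples and `fs` the sampling rate in Hz.
    """
    rms_amplitude = np.sqrt(np.mean(emg ** 2))              # amplitude information
    power = np.abs(np.fft.rfft(emg)) ** 2
    freqs = np.fft.rfftfreq(len(emg), d=1.0 / fs)
    mean_frequency = np.sum(freqs * power) / np.sum(power)  # frequency information
    return {"rms_amplitude": rms_amplitude, "mean_frequency_hz": mean_frequency}
```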
In some embodiments, the segmenting the motion signal based on the feature information corresponding to the electromyographic signal or the feature information corresponding to the posture signal includes: determining at least one target feature point from a time domain window based on the electromyographic signal or the posture signal according to a preset condition; and segmenting the motion signal based on the at least one target feature point.
In some embodiments, the at least one target feature point comprises one of an action start point, an action intermediate point, and an action end point.
In some embodiments, the preset condition includes one or more of the following: the direction of the angular velocity corresponding to the posture signal changes; the angular velocity corresponding to the posture signal is greater than or equal to an angular velocity threshold; the angular velocity value corresponding to the posture signal reaches an extreme value; the angle corresponding to the posture signal reaches an angle threshold; and the value of the amplitude information corresponding to the electromyographic signal is greater than or equal to an electromyographic threshold.
In some embodiments, the preset condition further includes that the angular acceleration corresponding to the posture signal remains greater than or equal to an angular acceleration threshold within a first specific time range.
In some embodiments, the preset condition further includes that the amplitude corresponding to the electromyographic signal remains greater than the electromyographic threshold within a second specific time range.
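To make the target-feature-point idea concrete, here is a minimal sketch of detecting a candidate action start point under preset conditions of the kind listed above; the threshold values, the hold time, and the assumption that an angular-velocity series and an EMG amplitude envelope are already aligned and available are all illustrative.

```python
import numpy as np

def find_action_start(angular_velocity, emg_envelope, fs,
                      angular_velocity_threshold=0.5,
                      emg_threshold=0.1, hold_seconds=0.05):
    """Return the sample index of a candidate action start point, or None.

    The condition sketched here: the angular velocity reaches its threshold
    and the EMG amplitude envelope stays above its threshold for a continuous
    time range. Both signals are sampled at `fs` Hz.
    """
    hold = int(hold_seconds * fs)
    for i in range(len(angular_velocity) - hold):
        if (angular_velocity[i] >= angular_velocity_threshold
                and np.all(emg_envelope[i:i + hold] >= emg_threshold)):
            return i  # candidate action start point
    return None
```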
In some embodiments, monitoring the action of the user's movement based at least on the feature information corresponding to the electromyographic signal or the feature information corresponding to the posture signal includes: preprocessing the electromyographic signal in the frequency domain or the time domain; acquiring the feature information corresponding to the electromyographic signal based on the preprocessed electromyographic signal; and monitoring the action of the user's movement according to the feature information corresponding to the electromyographic signal or the feature information corresponding to the posture signal.
In some embodiments, preprocessing the electromyographic signal in the frequency domain or the time domain comprises filtering the electromyographic signal in the frequency domain to retain components within a specific frequency range.
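A minimal sketch of the frequency-domain filtering step, assuming a standard Butterworth band-pass filter from SciPy; the 20-450 Hz band is a common surface-EMG choice and is not a range fixed by the application.

```python
from scipy.signal import butter, filtfilt

def bandpass_emg(emg, fs=1000, low_hz=20.0, high_hz=450.0, order=4):
    """Keep only components of the EMG signal within a specific frequency range."""
    nyquist = fs / 2.0
    b, a = butter(order, [low_hz / nyquist, high_hz / nyquist], btype="bandpass")
    return filtfilt(b, a, emg)  # zero-phase filtering over the whole time series
```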
In some embodiments, preprocessing the electromyographic signal in the frequency domain or the time domain includes performing signal correction processing on the electromyographic signal in the time domain.
In some embodiments, performing signal correction processing on the electromyographic signal in the time domain comprises: determining singular points in the electromyographic signal, where the singular points correspond to abrupt changes in the electromyographic signal; and performing signal correction processing on the singular points of the electromyographic signal.
In some embodiments, performing signal correction processing on a singular point of the electromyographic signal comprises removing the singular point or correcting it based on the signals around it.
In some embodiments, the singular point comprises a glitch (burr) signal, and determining the singular point in the electromyographic signal comprises: selecting, from the time-domain window of the electromyographic signal, different time windows that respectively cover different time ranges; and determining the glitch signal based on the feature information corresponding to the electromyographic signal within the different time windows.
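One plausible reading of the two-window idea is sketched below: the energy in a short window is compared with the energy of a longer window surrounding it, and samples where the ratio is abnormally high are flagged as glitch points. The window lengths and the ratio threshold are assumptions, not values from the application.

```python
import numpy as np

def find_glitches(emg, fs, long_seconds=0.2, short_seconds=0.01, ratio=4.0):
    """Return sample indices flagged as glitch (burr) points."""
    long_n = int(long_seconds * fs)
    short_n = int(short_seconds * fs)
    glitches = []
    for i in range(long_n, len(emg) - long_n):
        short_rms = np.sqrt(np.mean(emg[i:i + short_n] ** 2))
        long_rms = np.sqrt(np.mean(emg[i - long_n:i + long_n] ** 2)) + 1e-12
        if short_rms / long_rms > ratio:
            glitches.append(i)  # candidate singular point
    return glitches
```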
In some embodiments, the method further comprises determining feature information corresponding to the posture signal based on the posture signal, where the posture signal comprises coordinate information in at least one original coordinate system. Determining the feature information corresponding to the posture signal based on the posture signal comprises: acquiring a target coordinate system and a conversion relationship between the target coordinate system and the at least one original coordinate system; converting the coordinate information in the at least one original coordinate system into coordinate information in the target coordinate system based on the conversion relationship; and determining the feature information corresponding to the posture signal based on the coordinate information in the target coordinate system.
In some embodiments, the posture signal includes coordinate information generated by at least two sensors that are located at different moving parts of the user and correspond to different original coordinate systems, and determining the feature information corresponding to the posture signal based on the posture signal comprises: determining feature information corresponding to each of the at least two sensors based on the conversion relationships between the different original coordinate systems and the target coordinate system; and determining relative motion between the different moving parts of the user based on the feature information corresponding to each of the at least two sensors.
In some embodiments, the transformation relationship between the at least one original coordinate system and the target coordinate system is obtained by a calibration process comprising: constructing a specific coordinate system, wherein the specific coordinate system is related to the orientation of the user in the calibration process; acquiring first coordinate information in the at least one original coordinate system when the user is in a first posture; acquiring second coordinate information of the at least one original coordinate system when the user is in a second posture; and determining the conversion relation between the at least one original coordinate system and the specific coordinate system according to the first coordinate information, the second coordinate information and the specific coordinate system.
In some embodiments, the calibration process further comprises: acquiring a conversion relation between the specific coordinate system and the target coordinate system; and determining the conversion relation between the at least one original coordinate system and the target coordinate system according to the conversion relation between the at least one original coordinate system and the specific coordinate system and the conversion relation between the specific coordinate system and the target coordinate system.
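The coordinate handling described above can be illustrated with rotation objects, assuming each conversion relationship is a rotation obtained from the calibration process; the numeric values, the sensor names, and the use of SciPy's Rotation class are illustrative choices, not the application's prescribed implementation. The sketch chains the original-to-specific and specific-to-target conversions for two sensors and derives a relative-motion quantity between two moving parts.

```python
import numpy as np
from scipy.spatial.transform import Rotation as R

# Conversion relationships from calibration (placeholder values): each sensor's
# original frame -> the user-specific frame, and the specific frame -> the
# target frame tied to the user's orientation.
R_upperarm_to_specific = R.from_euler("zyx", [30, 5, 0], degrees=True)
R_forearm_to_specific = R.from_euler("zyx", [60, 10, 0], degrees=True)
R_specific_to_target = R.from_euler("zyx", [0, 0, 15], degrees=True)

# Chain the relationships: original -> specific -> target.
R_upperarm_to_target = R_specific_to_target * R_upperarm_to_specific
R_forearm_to_target = R_specific_to_target * R_forearm_to_specific

# Express each sensor's coordinate information in the shared target frame.
axis = np.array([1.0, 0.0, 0.0])
upperarm_dir = R_upperarm_to_target.apply(axis)
forearm_dir = R_forearm_to_target.apply(axis)

# Relative motion between the two moving parts, e.g. the included angle
# between upper-arm and forearm directions in the target frame.
included_angle = np.degrees(
    np.arccos(np.clip(upperarm_dir @ forearm_dir, -1.0, 1.0)))
print(f"included angle between moving parts: {included_angle:.1f} degrees")
```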
In some embodiments, the target coordinate system changes as the orientation of the user changes.
Another aspect of the present application provides a method for training an action recognition model, comprising: acquiring sample information, where the sample information includes an action signal of a user during exercise and the action signal includes at least feature information corresponding to an electromyographic signal and feature information corresponding to a posture signal; and training the action recognition model based on the sample information.
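As a sketch of what training such a model might look like with a generic classifier, the snippet below fits a random forest on placeholder sample information; the feature layout, the random data, and the choice of classifier are assumptions, since the application only requires a model trained on EMG and posture feature information.

```python
import numpy as np
from sklearn.ensemble import RandomForestClassifier

# Placeholder sample information: each row holds feature information derived
# from the EMG signal (e.g. RMS amplitude, mean frequency) and from the
# posture signal (e.g. joint angles, angular velocities); labels are action
# types. Real sample information would come from annotated recordings.
rng = np.random.default_rng(0)
X = rng.random((200, 8))
y = rng.integers(0, 3, 200)  # e.g. 0 = squat, 1 = biceps curl, 2 = bench press

model = RandomForestClassifier(n_estimators=100, random_state=0).fit(X, y)
print(model.predict(X[:1]))  # predicted action type for one sample
```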
Another aspect of the present application also provides a method of motion monitoring and feedback, comprising: acquiring action signals of a user during movement, wherein the action signals at least comprise electromyographic signals and posture signals; and monitoring the action of the user based on the characteristic information corresponding to the electromyographic signals and the characteristic information corresponding to the posture signals through an action recognition model, and performing action feedback based on an output result of the action recognition model.
In some embodiments, the action recognition model comprises a trained machine learning model or a preset model.
In some embodiments, the motion feedback includes at least one of issuing a prompt, stimulating a motion location of the user, and outputting a motion record of the user as the user moves.
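A minimal sketch of feedback driven by the model's output; the quality score, the 0.8 threshold, and the callback interfaces are placeholders rather than interfaces defined by the application.

```python
def motion_feedback(model_output, send_prompt, stimulate, record):
    """Sketch of action feedback driven by the recognition model's output.

    `model_output` is assumed to be an (action_type, quality) pair with
    quality in [0, 1]; `send_prompt`, `stimulate` and `record` stand in for
    the prompt, stimulation and motion-record channels of the wearable or
    mobile terminal.
    """
    action, quality = model_output
    if quality < 0.8:
        send_prompt(f"{action}: action below standard, please adjust")
        stimulate(action)  # e.g. vibrate or apply a current at the moving part
    record(action, quality)  # output a motion record for the user

# Example usage with print-based stand-ins for the feedback channels.
motion_feedback(("biceps_curl", 0.62), print, lambda a: None, lambda a, q: None)
```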
Drawings
The present application will be further explained by way of exemplary embodiments, which will be described in detail by way of the accompanying drawings. These embodiments are not intended to be limiting, and in these embodiments like numerals refer to like structures, wherein:
FIG. 1 is a schematic diagram of an application scenario of a motion monitoring system according to some embodiments of the present application;
fig. 2 is a schematic diagram of exemplary hardware and/or software of a wearable device shown in accordance with some embodiments of the present application;
FIG. 3 is a schematic diagram of exemplary hardware and/or software of a computing device shown in accordance with some embodiments of the present application;
fig. 4 is an exemplary block diagram of a wearable device shown in accordance with some embodiments of the present application;
FIG. 5 is an exemplary flow chart of a method of motion monitoring shown in accordance with some embodiments of the present application;
FIG. 6 is an exemplary flow diagram illustrating monitoring of user motion actions according to some embodiments of the present application;
FIG. 7 is an exemplary flow diagram of action signal segmentation, shown in accordance with some embodiments of the present application;
FIG. 8 is a graph of exemplary normalized results of action signal segmentation, shown in accordance with some embodiments of the present application;
FIG. 9 is an exemplary flow diagram of electromyographic signal preprocessing shown in accordance with some embodiments of the present application;
FIG. 10 is an exemplary flow chart of removing glitch signals, shown in accordance with some embodiments of the present application;
FIG. 11 is an exemplary flow chart illustrating the determination of feature information corresponding to a gesture signal according to some embodiments of the present application;
FIG. 12 is an exemplary flow diagram illustrating the determination of relative motion between different motion locations of a user according to some embodiments of the present application;
FIG. 13 is an exemplary flow chart illustrating the determination of a transformation relationship of an original coordinate system to a particular coordinate system according to some embodiments of the present application;
FIG. 14 is an exemplary flow chart illustrating the determination of a transformation relationship between an original coordinate system and a target coordinate system according to some embodiments of the present application;
FIG. 15A is an exemplary vector plot of Euler angle data in an original coordinate system at a human forearm position, according to some embodiments of the present application;
FIG. 15B is an exemplary vector plot of Euler angle data in an original coordinate system at another position on the human forearm, in accordance with some embodiments of the present application;
FIG. 16A is an exemplary vector plot of Euler angle data in a target coordinate system at a human forearm position, according to some embodiments of the present application;
FIG. 16B is an exemplary vector plot of Euler angle data in a target coordinate system at another location of a human forearm, in accordance with some embodiments of the present application;
FIG. 17 is an exemplary vector plot of Euler angle data in a target coordinate system of a multi-sensor shown according to some embodiments of the present application;
FIG. 18A is an exemplary resulting graph of raw angular velocities shown in accordance with some embodiments of the present application;
FIG. 18B is a graph of exemplary results of angular velocity after filtering processing according to some embodiments of the present application;
FIG. 19 is an exemplary flow chart of a motion monitoring and feedback method according to some embodiments of the present application;
FIG. 20 is an exemplary flow chart of an application of model training according to some embodiments shown herein.
Detailed Description
In order to more clearly illustrate the technical solutions of the embodiments of the present application, the drawings used in the description of the embodiments are briefly introduced below. The drawings in the following description are only examples or embodiments of the application; based on them, a person skilled in the art can apply the application to other similar scenarios without inventive effort. Unless otherwise apparent from the context or otherwise indicated, like reference numbers in the figures refer to the same structure or operation.
It should be understood that "system", "device", "unit" and/or "module" as used herein is a method for distinguishing different components, elements, parts, portions or assemblies at different levels. However, other words may be substituted by other expressions if they accomplish the same purpose.
As used in this application and in the claims, the singular forms "a," "an," and "the" are intended to include the plural forms as well, unless the context clearly dictates otherwise. In general, the terms "comprise" and "include" merely indicate that the explicitly identified steps and elements are included; the steps and elements do not form an exclusive list, and a method or apparatus may also include other steps or elements.
Flow charts are used herein to illustrate operations performed by systems according to embodiments of the present application. It should be understood that the operations are not necessarily performed exactly in the order shown; the steps may instead be processed in reverse order or simultaneously. Other operations may also be added to these processes, or one or more steps may be removed from them.
The present specification provides a motion monitoring system that can acquire an action signal of a user during exercise, where the action signal includes at least an electromyographic signal, a posture signal, an electrocardiographic signal, a respiratory frequency signal, and the like. The system can monitor the action of the user's movement based at least on feature information corresponding to the electromyographic signal or feature information corresponding to the posture signal. For example, the type, number, quality, and time of the user's actions, or the physiological parameter information of the user when performing the actions, may be determined from the frequency information and amplitude information corresponding to the electromyographic signal and from the angular velocity, angular velocity direction, angular acceleration, angle, displacement information, stress, and the like corresponding to the posture signal. In some embodiments, the exercise monitoring system may also generate feedback on the user's fitness actions based on the analysis of those actions, so as to guide the user's fitness. For example, when the user's fitness action is not standard, the motion monitoring system may send a prompt (e.g., a voice prompt, a vibration prompt, a current stimulus, etc.) to the user. The motion monitoring system can be applied to wearable devices (e.g., clothing, wristbands, helmets), medical testing devices (e.g., electromyography testers), fitness devices, and the like. By acquiring the action signal of the user during exercise, the system can accurately monitor and give feedback on the user's actions without the participation of professionals, improving the user's fitness efficiency while reducing the cost of fitness.
FIG. 1 is a schematic diagram of an application scenario of a motion monitoring system according to some embodiments of the present application. As shown in fig. 1, the athletic monitoring system 100 may include a processing device 110, a network 120, a wearable device 130, and a mobile terminal device 140. The motion monitoring system 100 can acquire motion signals (e.g., myoelectric signals, posture signals, electrocardio signals, respiratory frequency signals, etc.) for representing the motion of the user, and monitor and feed back the motion of the user when the user moves according to the motion signals of the user.
For example, the motion monitoring system 100 may monitor and feedback the user's actions while exercising. When the user wears the wearable device 130 for a fitness exercise, the wearable device 130 may acquire the motion signal of the user. The processing device 110 or the mobile terminal device may receive and analyze the motion signal of the user to determine whether the exercise motion of the user is normal, so as to monitor the motion of the user. In particular, monitoring the user's actions may include determining the action type of the action, the number of actions, the quality of the action, the time of the action, or physiological parameter information of the user when performing the action, etc. Further, the exercise monitoring system 100 may generate feedback on the user's exercise motions based on the analysis of the user's exercise motions to guide the user's exercise.
As another example, the athletic monitoring system 100 may monitor and give feedback on the user's actions while running. For example, when the user wears the wearable device 130 for a running exercise, the exercise monitoring system 100 may monitor whether the user's running action is standard, whether the running time meets health standards, and the like. When the user runs for too long or the running action is incorrect, the exercise device may feed back the user's exercise state to prompt the user to adjust the running action or the running time.
In some embodiments, processing device 110 may be used to process information and/or data related to user motion. For example, the processing device 110 may receive a motion signal (e.g., an electromyographic signal, a posture signal, an electrocardiographic signal, a respiratory frequency signal, etc.) of the user, and further extract feature information corresponding to the motion signal (e.g., feature information corresponding to the electromyographic signal and feature information corresponding to the posture signal in the motion signal). In some embodiments, the processing device 110 may perform specific signal processing, such as signal segmentation, signal pre-processing (e.g., signal correction processing, filtering processing, etc.), and the like, on the electromyographic signals or the posture signals acquired by the wearable device 130. In some embodiments, the processing device 110 may also determine whether the user action is correct based on the user's action signal. For example, the processing device 110 may determine whether the user action is correct based on feature information (e.g., amplitude information, frequency information, etc.) corresponding to the electromyographic signal. For another example, the processing device 110 may determine whether the user action is correct based on characteristic information (e.g., angular velocity direction, acceleration of angular velocity, angle, displacement information, stress, etc.) corresponding to the gesture signal. For another example, the processing device 110 may determine whether the user action is correct based on the feature information corresponding to the myoelectric signal and the feature information corresponding to the posture signal. In some embodiments, the processing device 110 may also determine whether the physiological parameter information of the user while exercising meets the health criteria. In some embodiments, the processing device 110 may also issue a corresponding instruction to feed back the motion of the user. For example, when the user performs a running exercise, the exercise monitoring system 100 monitors that the user has run for too long, and the processing device 110 may issue an instruction to the mobile terminal device 140 to prompt the user to adjust the running time. It should be noted that the characteristic information corresponding to the gesture signal is not limited to the angular velocity, the angular velocity direction, the acceleration of the angular velocity, the angle, the displacement information, the stress, and the like, and may also be other characteristic information, and all the parameter information that can be used for reflecting the relative motion of the body of the user may be the characteristic information corresponding to the gesture signal. For example, when the posture sensor is a strain gauge sensor, by measuring the magnitude of resistance that varies with the stretched length in the strain gauge sensor, the bending angle and the bending direction at the joint of the user can be acquired.
In some embodiments, the processing device 110 may be local or remote. For example, the processing device 110 may access information and/or profiles stored in the wearable device 130 and/or the mobile terminal device 140 via the network 120. In some embodiments, the processing device 110 may directly connect with the wearable device 130 and/or the mobile terminal device 140 to access information and/or material stored therein. For example, the processing device 110 may be located in the wearable device 130 and enable information interaction with the mobile terminal device 140 through the network 120. As another example, the processing device 110 may be located in the mobile terminal device 140 and enable information interaction with the wearable device 130 via a network. In some embodiments, the processing device 110 may execute on a cloud platform. For example, the cloud platform may include one or any combination of a private cloud, a public cloud, a hybrid cloud, a community cloud, a decentralized cloud, an internal cloud, and the like.
In some embodiments, processing device 110 may process data and/or information related to motion monitoring to perform one or more of the functions described herein. In some embodiments, the processing device may obtain motion signals captured by the wearable device 130 as the user moves. In some embodiments, the processing device may send control instructions to the wearable device 130 or the mobile terminal device 140. The control instructions may control the on-off state of the wearable device 130 and its various sensors, and may also control the mobile terminal device 140 to send out prompt information. In some embodiments, the processing device 110 may contain one or more sub-processing devices (e.g., single-core or multi-core processing devices). By way of example only, processing device 110 may include a central processing unit (CPU), an application-specific integrated circuit (ASIC), an application-specific instruction-set processor (ASIP), a graphics processing unit (GPU), a physics processing unit (PPU), a digital signal processor (DSP), a field-programmable gate array (FPGA), a programmable logic device (PLD), a controller, a microcontroller unit, a reduced instruction set computer (RISC), a microprocessor, or the like, or any combination thereof.
Network 120 may facilitate the exchange of data and/or information in motion monitoring system 100. In some embodiments, one or more components in the athletic monitoring system 100 (e.g., processing device 110, wearable device 130, mobile terminal device 140) may send data and/or information to other components in the athletic monitoring system 100 via the network 120. For example, the motion signals collected by the wearable device 130 may be transmitted to the processing device 110 through the network 120. As another example, the confirmation result in the processing device 110 regarding the action signal may be transmitted to the mobile terminal device 140 through the network 120. In some embodiments, the network 120 may be any type of wired or wireless network. For example, network 120 may include a cable network, a wired network, a fiber optic network, a telecommunications network, an intranet, the internet, a Local Area Network (LAN), a Wide Area Network (WAN), a Wireless Local Area Network (WLAN), a Metropolitan Area Network (MAN), a Public Switched Telephone Network (PSTN), a bluetooth network, a ZigBee network, a Near Field Communication (NFC) network, the like, or any combination thereof.
In some embodiments, network 120 may include one or more network access points. For example, network 120 may include wired or wireless network access points, such as base stations and/or Internet switching points 120-1, 120-2, …, through which one or more components of motion monitoring system 100 may connect to network 120 to exchange data and/or information.
The wearable device 130 refers to a garment or a device having a wearing function. In some embodiments, the wearable device 130 may include, but is not limited to, a coat device 130-1, a pants device 130-2, a wrist guard 130-3, a shoe 130-4, and the like. In some embodiments, the wearable device 130 may include multiple sensors. The sensor can acquire various motion signals (such as myoelectric signals, posture signals, temperature information, heartbeat frequency, electrocardio signals and the like) when the user moves. In some embodiments, the sensors may include, but are not limited to, one or more of an electromyography sensor, an attitude sensor, a temperature sensor, a humidity sensor, an electrocardio sensor, an oxygen saturation sensor, a hall sensor, a pico sensor, a rotation sensor, and the like. For example, a muscle position (e.g., biceps brachii, triceps brachii, latissimus dorsi, trapezius, etc.) of a human body in the upper garment apparatus 130-1 may be provided with an electromyographic sensor, which may be attached to the skin of the user and collect an electromyographic signal while the user is exercising. For another example, an electrocardiograph sensor may be disposed near the pectoral muscle on the left side of the human body in the upper garment apparatus 130-1, and the electrocardiograph sensor may acquire an electrocardiograph signal of the user. For another example, a posture sensor may be disposed at a muscle position (e.g., gluteus maximus, vastus lateralis, vastus medialis, gastrocnemius, etc.) of the human body in the pants device 130-2, and the posture sensor may collect a posture signal of the user. In some embodiments, the wearable device 130 may also feedback on the user's actions. For example, when the action of a certain part of the body is not in accordance with the standard while the user is moving, the myoelectric sensor corresponding to the part can generate a stimulation signal (for example, a current stimulation or an impact signal) to remind the user.
It should be noted that the wearable device 130 is not limited to the jacket device 130-1, the trousers device 130-2, the wrist protecting device 130-3 and the shoes device 130-4 shown in fig. 1, and may also be applied to other devices requiring exercise monitoring, such as a helmet device, a knee protecting device, etc., without limitation, and any device that can use the exercise monitoring method included in the present specification is within the protection scope of the present application.
In some embodiments, mobile terminal device 140 may obtain information or data in motion monitoring system 100. In some embodiments, the mobile terminal device 140 may receive the motion data processed by the processing device 110 and feed back a motion record or the like based on the processed motion data. Exemplary feedback means may include, but are not limited to, voice prompts, image prompts, video presentations, text prompts, and the like. In some embodiments, the user may obtain the action record of his or her own movement through the mobile terminal device 140. For example, the mobile terminal device 140 may be connected (e.g., wired or wireless) to the wearable device 130 via the network 120, the user may obtain the record of the user's motion during movement via the mobile terminal device 140, and that record may be transmitted to the processing device 110 via the mobile terminal device 140. In some embodiments, the mobile terminal device 140 may include one or any combination of a mobile device 140-1, a tablet computer 140-2, a notebook computer 140-3, and the like. In some embodiments, the mobile device 140-1 may include a cell phone, a smart home device, a smart mobile device, a virtual reality device, an augmented reality device, and the like, or any combination thereof. In some embodiments, the smart home device may include a control device of a smart appliance, a smart monitoring device, a smart television, a smart camera, or the like, or any combination thereof. In some embodiments, the smart mobile device may include a smart phone, a personal digital assistant (PDA), a gaming device, a navigation device, a point-of-sale (POS) device, or the like, or any combination thereof. In some embodiments, the virtual reality device and/or augmented reality device may include a virtual reality helmet, virtual reality glasses, virtual reality eyewear, an augmented reality helmet, augmented reality glasses, augmented reality eyewear, and the like, or any combination thereof.
In some embodiments, the athletic monitoring system 100 may also include a database. The database may store data (e.g., initially set threshold conditions, etc.) and/or instructions (e.g., feedback instructions). In some embodiments, the database may store profiles obtained from the wearable device 130 and/or the mobile terminal device 140. In some embodiments, the database may store information and/or instructions for execution or use by the processing device 110 to perform the exemplary methods described herein. In some embodiments, the database may include mass storage, removable storage, volatile read-write memory (e.g., random access memory RAM), read-only memory (ROM), etc., or any combination thereof. In some embodiments, the database may be implemented on a cloud platform. For example, the cloud platform may include a private cloud, a public cloud, a hybrid cloud, a community cloud, a decentralized cloud, an internal cloud, and the like, or any combination thereof.
In some embodiments, a database may be connected with network 120 to communicate with one or more components of motion monitoring system 100 (e.g., processing device 110, wearable device 130, mobile terminal device 140, etc.). One or more components of the athletic monitoring system 100 may access the data or instructions stored in the database via the network 120. In some embodiments, the database may be directly connected or in communication with one or more components (e.g., processing device 110, wearable device 130, mobile terminal device 140) in motion monitoring system 100. In some embodiments, the database may be part of the processing device 110.
Fig. 2 is a schematic diagram of exemplary hardware and/or software of a wearable device shown in accordance with some embodiments of the present application. As shown in fig. 2, wearable device 130 may include an acquisition module 210, a processing module 220 (also referred to as a processor), a control module 230 (also referred to as a master, MCU, controller), a communication module 240, a power module 250, and an input/output module 260.
The obtaining module 210 may be configured to obtain a motion signal when the user moves. In some embodiments, the acquisition module 210 may include a sensor unit, which may be used to acquire one or more motion signals while the user is moving. In some embodiments, the sensor unit may include, but is not limited to, one or more of a myoelectric sensor, an attitude sensor, an electrocardiographic sensor, a respiratory sensor, a temperature sensor, a humidity sensor, an inertial sensor, a blood oxygen saturation sensor, a hall sensor, a galvanic sensor, a rotation sensor, and the like. In some embodiments, the motion signal may include one or more of an electromyographic signal, a posture signal, an electrocardiographic signal, a respiratory rate, a temperature signal, a humidity signal, or the like. The sensor unit may be placed at different locations of the wearable device 130 depending on the type of motion signal to be acquired. For example, in some embodiments, electromyographic sensors (also referred to as electrode elements) may be provided at human muscle locations, which may be configured to collect electromyographic signals while the user is moving. The electromyographic signals and their corresponding characteristic information (e.g., frequency information, amplitude information, etc.) may reflect the state of the muscles while the user is exercising. The gesture sensors may be disposed at different locations of the human body (e.g., locations in the wearable device 130 corresponding to the torso, limbs, joints), and may be configured to acquire gesture signals as the user moves. The gesture signals and their corresponding characteristic information (e.g., angular velocity direction, angular velocity value, angular velocity acceleration value, angle, displacement information, stress, etc.) may reflect the gesture of the user's motion. The electrocardio sensor can be arranged at the position around the chest of the human body, and the electrocardio sensor can be configured to collect electrocardio data when the user moves. A respiration sensor may be disposed at a location on a peripheral side of a chest of a person, and the respiration sensor may be configured to acquire respiration data (e.g., respiration rate, respiration amplitude, etc.) of a user as the user moves. The temperature sensor may be configured to collect temperature data (e.g., body surface temperature) while the user is in motion. The humidity sensor may be configured to collect humidity data of an external environment while the user is in motion.
The processing module 220 may process data from the acquisition module 210, the control module 230, the communication module 240, the power module 250, and/or the input/output module 260. For example, the processing module 220 may process motion signals from the acquisition module 210 during user motion. In some embodiments, the processing module 220 may pre-process the motion signals (e.g., myoelectric signals, posture signals) acquired by the acquisition module 210. For example, the processing module 220 performs segmentation processing on the electromyographic signals or the posture signals when the user moves. For another example, the processing module 220 may perform pre-processing (e.g., filtering processing, signal correction processing) on the electromyographic signals while the user is moving, so as to improve the quality of the electromyographic signals. For another example, the processing module 220 may determine feature information corresponding to the gesture signal based on the gesture signal when the user is moving. In some embodiments, the processing module 220 may process instructions or operations from the input/output module 260. In some embodiments, the processed data may be stored in a memory or hard disk. In some embodiments, processing module 220 may transmit its processed data to one or more components in motion monitoring system 100 via communication module 240 or network 120. For example, the processing module 220 may transmit the result of monitoring the user's movement to the control module 230, and the control module 230 may perform a subsequent operation or instruction according to the action determination result.
The control module 230 may be connected to other modules in the wearable device 130. In some embodiments, control module 230 may control the operating state of other modules in wearable device 130 (e.g., communication module 240, power module 250, input/output module 260). For example, the control module 230 may control a power supply state (e.g., a normal mode, a power saving mode), a power supply time, and the like of the power supply module 250. When the remaining power of the power supply module 250 reaches a certain threshold (e.g., 10%) or less, the control module 230 may control the power supply module 250 to enter a power saving mode or issue a prompt message regarding the supplement power. For another example, the control module 230 may control the input/output module 260 according to the action determination result of the user, and may further control the mobile terminal device 140 to transmit a feedback result of its motion to the user. When the motion of the user is problematic (e.g., the motion does not meet the standard), the control module 230 may control the input/output module 260, and further may control the mobile terminal device 140 to feed back to the user, so that the user may know the motion state of the user in real time and adjust the motion. In some embodiments, the control module 230 may also control one or more sensors or other modules in the acquisition module 210 to feedback on the human body. For example, when the strength of a muscle is too large during the exercise of the user, the control module 230 may control the electrode module at the muscle position to electrically stimulate the user to prompt the user to adjust the motion in time.
In some embodiments, the communication module 240 may be used for the exchange of information or data. In some embodiments, the communication module 240 may be used for communication between internal components of the wearable device 130 (e.g., the acquisition module 210, the processing module 220, the control module 230, the power module 250, the input/output module 260). For example, the obtaining module 210 may send a user action signal (e.g., an electromyographic signal, a posture signal, etc.) to the communication module 240, and the communication module 240 may send the action signal to the processing module 220. In some embodiments, the communication module 240 may also be used for communication between the wearable device 130 and other components in the motion monitoring system 100 (e.g., the processing device 110, the mobile terminal device 140). For example, the communication module 240 may transmit status information (e.g., a switch status) of the wearable device 130 to the processing device 110, and the processing device 110 may monitor the wearable device 130 based on the status information. The communication module 240 may employ wired, wireless, and hybrid wired/wireless technologies. The cabling may be based on one or more fiber optic cable combinations, such as metallic cables, hybrid cables, fiber optic cables, and the like. The wireless technologies may include Bluetooth (Bluetooth), wireless network (Wi-Fi), ZigBee (ZigBee), Near Field Communication (NFC), Radio Frequency Identification (RFID), cellular networks (including GSM, CDMA, 3G, 4G, 5G, etc.), narrowband Internet of Things over cellular (NBIoT), and so on. In some embodiments, the communication module 240 may encode the transmitted information using one or more encoding schemes, for example, the encoding schemes may include phase encoding, non-return-to-zero encoding, differential manchester encoding, and the like. In some embodiments, the communication module 240 may select different transmission and encoding modes according to the type of data or the type of network to be transmitted. In some embodiments, the communication module 240 may include one or more communication interfaces for different communication modes. In some embodiments, the illustrated other modules of the motion monitoring system 100 may be distributed across multiple devices, in which case each of the other modules may include one or more communication modules 240 for transferring information between the modules. In some embodiments, the communication module 240 may include a receiver and a transmitter. In other embodiments, the communication module 240 may be a transceiver.
In some embodiments, power module 250 may provide power to other components in motion monitoring system 100 (e.g., acquisition module 210, processing module 220, control module 230, communication module 240, input/output module 260). The power module 250 may receive control signals from the processing module 220 to control the power output of the wearable device 130. For example, in the event that wearable device 130 does not receive any operation for a certain period of time (e.g., 1s, 2s, 3s, or 4s) (e.g., no motion signal is detected by acquisition module 210), power module 250 may only supply power to the memory, causing wearable device 130 to enter a standby mode. For another example, in the event that wearable device 130 does not receive any operation for a certain period of time (e.g., 1s, 2s, 3s, or 4s) (e.g., no motion signal is detected by acquisition module 210), power module 250 may disconnect power to other components, and data in motion monitoring system 100 may be dumped to a hard disk, causing wearable device 130 to enter a standby mode or a sleep mode. In some embodiments, the power module 250 may include at least one battery. The battery can comprise one or a combination of several of a dry battery, a lead storage battery, a lithium battery, a solar battery, a wind power generation battery, a mechanical power generation battery, a thermal power generation battery and the like. The solar cell may convert light energy into electric energy and store it in the power supply module 250. The wind power generation battery may convert wind energy into electric energy and store it in the power supply module 250. The mechanical energy generation cell may convert mechanical energy into electrical energy and store in the power module 250. The solar cell may include a silicon solar cell, a thin film solar cell, a nano-crystalline chemical solar cell, a fuel-sensitized solar cell, a plastic solar cell, and the like. The solar cells may be distributed on the wearable device 130 in the form of a panel. The thermal power generation cell may convert the body temperature of the user into electric energy and store it in the power supply module 250. In some embodiments, the processing module 220 may send a control signal to the power module 250 when the power of the power module 250 is less than a power threshold (e.g., 10% of the total power). The control signal may include information that the power module 250 is low. In some embodiments, power module 250 may contain a backup power source. In some embodiments, the power module 250 may further include a charging interface. For example, in an emergency (for example, when the power of the power supply module 250 is 0, and the external power system fails to supply power), the power supply module 250 may be temporarily charged by using an electronic device (for example, a mobile phone or a tablet computer) or a charger that is carried by the user.
Input/output module 260 may acquire, transmit, and send signals. Input/output module 260 may be connected to or in communication with other components in motion monitoring system 100. Other components in the motion monitoring system 100 may be connected or in communication via the input/output module 260. The input/output module 260 may be a wired USB interface, a serial communication interface, a parallel communication interface, or a wireless bluetooth, infrared, Radio-frequency identification (RFID), Wlan Authentication and Privacy Infrastructure (wap), General Packet Radio Service (GPRS), Code Division Multiple Access (CDMA), etc., or any combination thereof. In some embodiments, the input/output module 260 may be connected to the network 120 and obtain information via the network 120. For example, the input/output module 260 may obtain the motion signal of the user during the motion from the obtaining module 210 through the network 120 or the communication module 240 and output the user motion information. In some embodiments, the input/output module 260 may include VCC, GND, RS-232, RS-485 (e.g., RS485-A, RS485-B), a general network interface, and the like, or any combination thereof. In some embodiments, input/output module 260 may communicate the acquired user movement information to acquisition module 210 via network 120. In some embodiments, the input/output module 260 may encode the transmitted signal using one or more encoding schemes. The encoding may include phase encoding, non-return-to-zero code, differential manchester code, or the like, or any combination thereof.
It should be understood that the system and its modules shown in FIG. 2 may be implemented in a variety of ways. For example, in some embodiments, the system and its modules may be implemented in hardware, software, or a combination of software and hardware. Wherein the hardware portion may be implemented using dedicated logic; the software portions may be stored in a memory for execution by a suitable instruction execution system, such as a microprocessor or specially designed hardware. Those skilled in the art will appreciate that the methods and systems described above may be implemented using computer executable instructions and/or embodied in processor control code, such code being provided, for example, on a carrier medium such as a diskette, CD-or DVD-ROM, a programmable memory such as read-only memory (firmware), or a data carrier such as an optical or electronic signal carrier. The system and its modules of one or more embodiments of the present specification may be implemented not only by hardware circuits such as very large scale integrated circuits or gate arrays, semiconductors such as logic chips, transistors, or programmable hardware devices such as field programmable gate arrays, programmable logic devices, etc., but also by software executed by various types of processors, for example, or by a combination of hardware circuits and software (e.g., firmware).
It should be noted that the above description of the motion monitoring system and its modules is for convenience of description only and is not intended to limit the scope of the one or more embodiments of the present disclosure to the illustrated embodiments. It will be appreciated by those skilled in the art that, given the teachings of the present system, any combination of the various modules, or the connection of the constituent subsystems to other modules, or the omission of one or more of the modules, may be made without departing from such teachings. For example, the acquiring module 210 and the processing module 220 may be one module, and the module may have a function of acquiring and processing a user action signal. For another example, the processing module 220 may not be disposed in the wearable device 130, but integrated in the processing device 110. Such variations are within the scope of one or more embodiments of the present description.
FIG. 3 is a schematic diagram of exemplary hardware and/or software of a computing device shown in accordance with some embodiments of the present application. In some embodiments, processing device 110 and/or mobile terminal device 140 may be implemented on computing device 300. As shown in FIG. 3, computing device 300 may include internal communication bus 310, processor 320, read only memory 330, random access memory 340, communication port 350, input/output interface 360, hard disk 370, and user interface 380.
Internal communication bus 310 may enable data communication among the components of computing device 300. For example, the processor 320 may send data through the internal communication bus 310 to memory or other hardware such as the input/output interface 360. In some embodiments, internal communication bus 310 may be an Industry Standard Architecture (ISA) bus, an Extended ISA (EISA) bus, a Video Electronics Standards Association (VESA) local bus, a Peripheral Component Interconnect (PCI) bus, or the like. In some embodiments, the internal communication bus 310 may be used to connect various modules (e.g., the acquisition module 210, the processing module 220, the control module 230, the communication module 240, the input/output module 260) in the motion monitoring system 100 shown in FIG. 1.
Processor 320 may execute computing instructions (program code) and perform the functions of motion monitoring system 100 described herein. The computing instructions may include programs, objects, components, data structures, procedures, modules, and functions (which refer to specific functions described herein). For example, the processor 320 may process motion signals (e.g., myoelectric signals, posture signals) obtained from the wearable device 130 or/and the mobile terminal device 140 of the motion monitoring system 100 when the user moves, and monitor the motion of the user's motion according to the motion signals when the user moves. In some embodiments, processor 320 may include microcontrollers, microprocessors, Reduced Instruction Set Computers (RISC), Application Specific Integrated Circuits (ASIC), application specific instruction set processors (ASIP), Central Processing Units (CPU), Graphics Processing Units (GPU), Physical Processing Units (PPU), microcontroller units, Digital Signal Processors (DSP), Field Programmable Gate Arrays (FPGA), advanced reduced instruction set computers (ARM), programmable logic devices, any circuit or processor capable of performing one or more functions, and the like, or any combination thereof. For illustration only, the computing device 300 in fig. 3 depicts only one processor, but it should be noted that the computing device 300 in the present application may also include multiple processors.
The memory (e.g., Read Only Memory (ROM)330, Random Access Memory (RAM)340, hard disk 370, etc.) of computing device 300 may store data/information retrieved from any other component of motion monitoring system 100. In some embodiments, the memory of the computing device 300 may be located in the wearable device 130, as well as in the processing device 110. Exemplary ROMs may include Mask ROM (MROM), Programmable ROM (PROM), erasable programmable ROM (PEROM), Electrically Erasable Programmable ROM (EEPROM), optical disk ROM (CD-ROM), digital versatile disk ROM, and the like. Exemplary RAM may include Dynamic RAM (DRAM), double-data-rate synchronous dynamic RAM (DDR SDRAM), Static RAM (SRAM), thyristor RAM (T-RAM), zero-capacitance (Z-RAM), and the like.
The input/output interface 360 may be used to input or output signals, data, or information. In some embodiments, the input/output interface 360 may enable a user to interact with the motion monitoring system 100. For example, the input/output interface 360 may include the communication module 240 to implement the communication functions of the motion monitoring system 100. In some embodiments, input/output interface 360 may include an input device and an output device. Exemplary input devices may include a keyboard, a mouse, a touch screen, a microphone, etc., or any combination thereof. Exemplary output devices may include a display device, a speaker, a printer, a projector, etc., or any combination thereof. Exemplary display devices may include a Liquid Crystal Display (LCD), a Light Emitting Diode (LED) based display, a flat panel display, a curved display, a television set, a Cathode Ray Tube (CRT), and the like, or any combination thereof. The communication port 350 may be connected to a network for data communication. The connection may be a wired connection, a wireless connection, or a combination of both. The wired connection may include an electrical cable, an optical cable, or a telephone line, among others, or any combination thereof. The wireless connection may include Bluetooth, Wi-Fi, WiMax, WLAN, ZigBee, mobile networks (e.g., 3G, 4G, or 5G), and the like, or any combination thereof. In some embodiments, the communication port 350 may be a standardized port, such as RS232, RS485, and the like. In some embodiments, the communication port 350 may be a specially designed port.
Hard disk 370 may be used to store information and data generated by or received from processing device 110. For example, the hard disk 370 may store the user's confirmation information. In some embodiments, hard disk 370 may comprise a mechanical hard disk drive (HDD), a Solid State Drive (SSD), a Hybrid Hard Drive (HHD), or the like. In some embodiments, the hard disk 370 may be disposed in the processing device 110 or in the wearable device 130. User interface 380 may enable interaction and information exchange between computing device 300 and a user. In some embodiments, the user interface 380 may be used to present the motion records generated by the motion monitoring system 100 to the user. In some embodiments, user interface 380 may include a physical display, such as a display with speakers, an LCD display, an LED display, an OLED display, an electronic ink display (E-Ink), or the like.
Fig. 4 is an exemplary block diagram of a wearable device according to some embodiments of the present application. To further illustrate the wearable device, an upper-body garment is taken as an example. As shown in fig. 4, the wearable device 400 may include a jacket garment 410. The jacket garment 410 may include a jacket garment base 4110, at least one jacket processing module 4120, at least one jacket feedback module 4130, at least one jacket acquisition module 4140, and the like. The jacket garment base 4110 may refer to a garment worn on the upper torso of a human body. In some embodiments, the jacket garment base 4110 may include a short-sleeved T-shirt, a long-sleeved T-shirt, a coat, and the like. The at least one jacket processing module 4120 and the at least one jacket acquisition module 4140 may be located in areas of the jacket garment base 4110 that fit different portions of the human body. The at least one jacket feedback module 4130 may be located at any position on the jacket garment base 4110, and the at least one jacket feedback module 4130 may be configured to feed back information on the motion state of the user's upper body. Exemplary feedback means may include, but are not limited to, voice prompts, text prompts, pressure prompts, current stimulation, and the like. In some embodiments, the at least one jacket acquisition module 4140 may include, but is not limited to, one or more of a posture sensor, an electrocardiographic sensor, an electromyographic sensor, a temperature sensor, a humidity sensor, an inertial sensor, an acid-base sensor, an acoustic-wave transducer, and the like. The sensors in the jacket acquisition module 4140 may be placed at different locations on the user's body depending on the signals to be measured. For example, when a posture sensor is used to obtain a posture signal during the user's movement, the posture sensor may be placed at positions in the jacket garment base 4110 corresponding to the torso, arms, and joints of the human body. For another example, when an electromyographic sensor is used to obtain an electromyographic signal during the user's movement, the electromyographic sensor may be located near the muscle of the user to be measured. In some embodiments, the posture sensor may include, but is not limited to, a three-axis acceleration sensor, a three-axis angular velocity sensor, a magnetic sensor, and the like, or any combination thereof. For example, one posture sensor may include a three-axis acceleration sensor and a three-axis angular velocity sensor. In some embodiments, the posture sensor may also include a strain gauge sensor. A strain gauge sensor refers to a sensor based on the strain generated when an object deforms under force. In some embodiments, strain gauge sensors may include, but are not limited to, one or more of strain gauge load cells, strain gauge pressure sensors, strain gauge torque sensors, strain gauge displacement sensors, strain gauge acceleration sensors, and the like. For example, a strain gauge sensor may be provided at a joint position of the user, and the bending angle and bending direction at the user's joint may be acquired by measuring the resistance in the strain gauge sensor, which varies with the stretched length.
It is noted that the jacket garment 410 may include other modules, such as a power supply module, a communication module, an input/output module, etc., in addition to the jacket garment base 4110, the jacket processing module 4120, the jacket feedback module 4130, and the jacket acquisition module 4140 described above. The jacket processing module 4120 is similar to the processing module 220 in fig. 2, and the jacket acquisition module 4140 is similar to the acquisition module 210 in fig. 2; for a detailed description of each module in the jacket garment 410, reference may be made to the related description of fig. 2 in the present application, which is not repeated here.
FIG. 5 is an exemplary flow chart of a method of motion monitoring shown in accordance with some embodiments of the present application. As shown in fig. 5, the process 500 may include:
In step 510, a motion signal of the user during motion is obtained.
In some embodiments, this step 510 may be performed by the acquisition module 210. The motion signal refers to human body parameter information when the user moves. In some embodiments, the body parameter information may include, but is not limited to, one or more of an electromyographic signal, a posture signal, an electrocardiographic signal, a temperature signal, a humidity signal, a blood oxygen concentration, a respiratory rate, and the like. In some embodiments, the electromyographic sensor in the acquisition module 210 may collect an electromyographic signal of the user during exercise. For example, when the user performs sitting posture chest clamping, the electromyographic sensors in the wearable device corresponding to the positions of the pectoral muscles, the latissimus dorsi, and the like of the human body can acquire electromyographic signals of the corresponding muscle positions of the user. For another example, when the user performs a deep squat action, the electromyographic sensors in the wearable device corresponding to the positions of the gluteus maximus, the quadriceps femoris, and the like of the human body can acquire electromyographic signals of the corresponding muscle positions of the user. For another example, when the user performs a running exercise, the electromyographic sensor in the wearable device corresponding to the position of the human gastrocnemius muscle and the like may acquire an electromyographic signal of the position of the user's gastrocnemius muscle and the like. In some embodiments, the posture sensors in the acquisition module 210 may collect posture signals when the user is moving. For example, when the user performs a barbell bench press exercise, the posture sensor in the wearable device corresponding to the position of the human triceps muscle and the like may acquire a posture signal of the position of the user's triceps muscle and the like. For another example, when the user performs a dumbbell fly action, the posture sensor provided at a position such as the human deltoid muscle may collect a posture signal of the position of the user's deltoid muscle and the like. In some embodiments, there may be multiple posture sensors in the acquisition module 210; the multiple posture sensors may acquire posture signals of multiple body parts when the user moves, and the posture signals of the multiple parts may reflect the relative movement between different parts of the human body. For example, the posture signal at the arm and the posture signal at the torso may reflect the movement of the arm relative to the torso. In some embodiments, the posture signal is associated with the type of posture sensor. For example, when the posture sensor is a three-axis angular velocity sensor, the acquired posture signal is angular velocity information. For another example, when the posture sensors are a three-axis angular velocity sensor and a three-axis acceleration sensor, the acquired posture signals are angular velocity information and acceleration information. For another example, when the posture sensor is a strain gauge sensor, the strain gauge sensor may be disposed at a joint position of the user; by measuring the resistance in the strain gauge sensor, which varies with the stretched length, the obtained posture signal may be displacement information, stress, or the like, and the bending angle and bending direction at the user's joint may be represented by the posture signal.
It should be noted that parameter information that can be used to represent the relative movement of the user's body may serve as the feature information corresponding to the posture signal, and different types of posture sensors may be used to acquire it according to the type of the feature information.
In some embodiments, the motion signal may include an electromyographic signal of a specific part of the user's body and a posture signal of that specific part. The electromyographic signal and the posture signal can reflect the motion state of the specific part of the user's body from different angles. In short, the posture signal of a specific part of the user's body can reflect the action type, action amplitude, action frequency, and the like of that part, while the electromyographic signal can reflect the muscle state of that part during movement. Using the electromyographic signal and/or the posture signal of the same body part makes it possible to better evaluate whether the action of that part is normal.
In step 520, the movement of the user is monitored based on at least the characteristic information corresponding to the electromyographic signal or the characteristic information corresponding to the gesture signal.
In some embodiments, this step may be performed by the processing module 220 and/or the processing device 110. In some embodiments, the feature information corresponding to the electromyographic signal may include, but is not limited to, one or more of frequency information, amplitude information, and the like. The feature information corresponding to the posture signal refers to parameter information used to represent the relative motion of the user's body. In some embodiments, the feature information corresponding to the posture signal may include, but is not limited to, one or more of an angular velocity direction, an angular velocity value, an acceleration value of the angular velocity, and the like. In some embodiments, the feature information corresponding to the posture signal may further include an angle, displacement information (e.g., the stretched length in a strain gauge sensor), stress, and the like. For example, when the posture sensor is a strain gauge sensor, the strain gauge sensor may be disposed at a joint position of the user; by measuring the resistance in the strain gauge sensor, which varies with the stretched length, the obtained posture signal may be displacement information, stress, or the like, from which the bending angle and bending direction at the user's joint may be represented. In some embodiments, the processing module 220 and/or the processing device 110 may extract the feature information (e.g., frequency information, amplitude information) corresponding to the electromyographic signal or the feature information (e.g., angular velocity direction, angular velocity value, acceleration value of the angular velocity, angle, displacement information, stress, etc.) corresponding to the posture signal, and monitor the action of the user's movement based on the feature information corresponding to the electromyographic signal or the feature information corresponding to the posture signal. Monitoring the movement of the user includes monitoring information related to the user's action. In some embodiments, the action-related information may include one or more of the user's action type, number of actions, action quality (e.g., whether the user's action meets a standard), action time, and the like. The action type refers to the fitness action taken when the user exercises. In some embodiments, the action type may include, but is not limited to, one or more of sitting posture chest clamping, deep squat, deadlift, plank, running, swimming, and the like. The number of actions refers to the number of times an action is performed during the user's movement. For example, the user performs sitting posture chest clamping 10 times during exercise, and these 10 times are the number of actions. Action quality refers to how standard the fitness action performed by the user is relative to a standard fitness action. For example, when the user performs a deep squat action, the processing device 110 may determine the action type of the user's action based on the feature information corresponding to the action signals (electromyographic signals and posture signals) of specific muscle positions (gluteus maximus, quadriceps femoris, etc.), and determine the action quality of the user's deep squat action based on the action signal of a standard deep squat action. The action time refers to the time corresponding to one or more of the user's action types, or the total time of the motion process.
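For illustration only, the following is a minimal sketch of how feature information might be extracted from one segment of the electromyographic signal and one segment of the posture signal as described above. The function names, sampling rates, and the particular features chosen (root-mean-square amplitude, mean frequency, angular velocity statistics) are assumptions made for this example and are not mandated by the embodiments of the present application.

```python
import numpy as np

def emg_features(emg: np.ndarray, fs: float = 1000.0) -> dict:
    """Extract simple amplitude and frequency features from one EMG segment.

    `emg` is a 1-D array of samples; `fs` is an assumed sampling rate in Hz.
    """
    spectrum = np.abs(np.fft.rfft(emg)) ** 2
    freqs = np.fft.rfftfreq(len(emg), d=1.0 / fs)
    mean_freq = float(np.sum(freqs * spectrum) / (np.sum(spectrum) + 1e-12))
    return {
        "rms_amplitude": float(np.sqrt(np.mean(emg ** 2))),  # amplitude information
        "mean_frequency": mean_freq,                          # frequency information
    }

def posture_features(angular_velocity: np.ndarray, fs: float = 100.0) -> dict:
    """Extract angular-velocity-based features from one posture-signal segment."""
    angular_accel = np.gradient(angular_velocity, 1.0 / fs)   # acceleration of the angular velocity
    return {
        "angular_velocity_peak": float(np.max(np.abs(angular_velocity))),
        "angular_velocity_mean": float(np.mean(angular_velocity)),
        "angular_acceleration_peak": float(np.max(np.abs(angular_accel))),
    }
```

A processing module could compute such feature dictionaries for each segmented action signal and pass them to the monitoring step described below; the choice of features in practice depends on the action being monitored.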
For details of monitoring the movement of the user based on the feature information corresponding to the electromyographic signals and/or the feature information corresponding to the posture signals, reference may be made to fig. 6 of the present application and the related description thereof.
In some embodiments, the processing device 110 may utilize one or more motion recognition models to recognize and monitor the action of the user's movement. For example, the processing device 110 may input the feature information corresponding to the electromyographic signal and/or the feature information corresponding to the posture signal into the motion recognition model, and the motion recognition model outputs the information related to the user's action. In some embodiments, the motion recognition models may include different types of motion recognition models, e.g., a model for recognizing the type of the user's action, a model for recognizing the quality of the user's action, and the like.
It should be noted that the above description related to the flow 500 is only for illustration and description, and does not limit the applicable scope of the present specification. Various modifications and changes to the flow 500 may occur to those skilled in the art upon review of the present description. However, such modifications and variations are intended to be within the scope of the present description. For example, the extraction of the feature information corresponding to the electromyographic signal or the feature information corresponding to the posture signal in step 520 may be performed by the processing device 110, and in some embodiments may also be performed by the processing module 220. For another example, the motion signal of the user is not limited to the electromyographic signal, the posture signal, the electrocardiographic signal, the temperature signal, the humidity signal, the blood oxygen concentration, and the respiratory rate; other physiological parameter signals of the human body may also be used, and any physiological parameter signal related to human motion may be regarded as a motion signal in the embodiments of the present specification.
FIG. 6 is an exemplary flow diagram illustrating monitoring of user motion actions according to some embodiments of the present application. As shown in fig. 6, the process 600 may include:
In step 610, the motion signal is segmented based on the feature information corresponding to the electromyographic signal or the feature information corresponding to the posture signal.
In some embodiments, this step may be performed by the processing module 220 and/or the processing device 110. The acquisition of the motion signal (e.g., electromyographic signal, posture signal) while the user is moving is a continuous process, and the user's movement may be a combination of multiple sets of actions or a combination of actions of different action types. In order to analyze each action in the user's movement, the processing module 220 may segment the user's motion signal based on the feature information corresponding to the electromyographic signal or the feature information corresponding to the posture signal when the user moves. Segmenting the motion signal here means dividing the motion signal into signal segments of the same or different time lengths, or extracting one or more signal segments of a specific time length from the motion signal. In some embodiments, each segment of the motion signal may correspond to one or more complete actions of the user. For example, when the user performs a deep squat exercise, the user going from a standing posture to a squatting posture and then rising back to the standing posture may be regarded as the user completing one deep squat action; the action signal acquired by the acquisition module 210 during this process may be regarded as one segment (or one cycle) of the action signal, and the action signal acquired by the acquisition module 210 while the user completes the next deep squat action may be regarded as another segment of the action signal. In some embodiments, each segment of the motion signal may also correspond to a partial action of the user, where a partial action may be understood as part of a complete action. For example, when the user performs a deep squat exercise, going from the standing posture to the squatting posture may be regarded as one action, and rising back to the standing posture may be regarded as another action. As each action step changes during the user's movement, the electromyographic signal and the posture signal of the corresponding body part change accordingly. For example, when the user performs a squat exercise, the electromyographic signals and posture signals of the muscles at the corresponding body parts (for example, the arms, legs, hips, and abdomen) fluctuate little while the user is standing; when the user squats down from the standing posture, the electromyographic signals and posture signals of the muscles at the corresponding body parts fluctuate greatly, for example, the amplitude information corresponding to signals of different frequencies in the electromyographic signal increases, and the angular velocity value, angular velocity direction, acceleration value of the angular velocity, angle, displacement information, stress, and the like corresponding to the posture signal also change. When the user rises from the squatting posture back to the standing posture, the amplitude information corresponding to the electromyographic signal and the angular velocity value, angular velocity direction, acceleration value of the angular velocity, angle, displacement information, and stress corresponding to the posture signal change again. Based on this, the processing module 220 may segment the motion signal of the user based on the feature information corresponding to the electromyographic signal or the feature information corresponding to the posture signal.
For details of segmenting the motion signal based on the feature information corresponding to the electromyographic signal or the feature information corresponding to the posture signal, reference may be made to fig. 7 and 8 and the related description thereof in the specification of the present application.
In step 620, the action of the user during movement is monitored based on at least one segment of the motion signal.
This step may be performed by processing module 220 and/or processing device 110. In some embodiments, monitoring the motion of the user movement based on the at least one segment of motion signal may include determining a type of motion of the user while moving based on a match between the at least one segment of motion signal and the at least one segment of preset motion signal. The at least one section of preset action signal refers to standard action signals corresponding to different actions preset in a database. In some embodiments, the motion type of the user in motion may be determined by determining a matching degree between the at least one motion signal and the at least one preset motion signal. Further, whether the matching degree of the action signal and the preset action signal is within a first matching threshold range (for example, greater than 80%) is judged, and if yes, the action type of the user during the movement is determined according to the action type corresponding to the preset action signal. In some embodiments, monitoring the movement of the user based on the at least one section of movement signal may further include determining a type of the movement of the user based on matching feature information corresponding to the at least one section of electromyographic signal with feature information corresponding to an electromyographic signal in the at least one section of preset movement signal. For example, the matching degree of one or more pieces of feature information (e.g., frequency information and amplitude information) in a segment of electromyographic signals and one or more pieces of feature information in a segment of preset action signals is respectively calculated, whether the weighted matching degree or the average matching degree of the one or more pieces of feature information is within a first matching threshold range is judged, and if so, the action type of the user during the movement is determined according to the action type corresponding to the preset action signals. In some embodiments, monitoring the motion of the user when the user moves based on the at least one segment of motion signal may further include determining a type of motion of the user when the user moves based on matching feature information corresponding to the at least one segment of gesture signal with feature information corresponding to a gesture signal in the at least one segment of preset motion signal. For example, matching degrees of one or more pieces of feature information (for example, angular velocity values, acceleration values of angular velocity direction and angular velocity, angles, displacement information, stress, and the like) in a segment of attitude signal and one or more pieces of feature information in a segment of preset action signal are respectively calculated, whether the weighted matching degree or the average matching degree of the one or more pieces of feature information is within a first matching threshold range is judged, and if so, the action type of the user during movement is determined according to the action type corresponding to the preset action signal. 
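For illustration only, the sketch below shows one possible way to organize the matching between the feature information of a segmented action signal and that of preset action signals; the similarity measure, the weights, the function names, and the 80% first matching threshold are assumptions made for this example (the threshold value follows the example given above), and the matching algorithm actually used in a given embodiment may differ.

```python
def feature_match(value: float, preset: float) -> float:
    """Similarity of one feature to its preset counterpart, mapped into [0, 1]
    (an assumed measure; the embodiments do not prescribe a specific formula)."""
    denom = max(abs(value), abs(preset), 1e-9)
    return 1.0 - min(abs(value - preset) / denom, 1.0)

def recognize_action_type(features: dict, preset_actions: dict, weights: dict,
                          first_matching_threshold: float = 0.8):
    """Return the preset action type whose weighted matching degree is highest,
    provided the matching degree is within the first threshold range (> 80%)."""
    best_type, best_degree = None, 0.0
    for action_name, preset_features in preset_actions.items():
        total_weight = max(sum(weights.get(k, 1.0) for k in preset_features), 1e-9)
        degree = sum(weights.get(k, 1.0) * feature_match(features.get(k, 0.0), v)
                     for k, v in preset_features.items()) / total_weight
        if degree > best_degree:
            best_type, best_degree = action_name, degree
    return best_type if best_degree >= first_matching_threshold else None
```

Here `features` could be the feature dictionary produced by the earlier feature-extraction sketch, and `preset_actions` would map each action type (e.g., "deep squat") to the feature information of its preset action signal.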
In some embodiments, the monitoring of the movement of the user based on the at least one segment of movement signal may further include determining a type of the movement of the user when the user moves based on matching between feature information corresponding to an electromyographic signal in the at least one segment of movement signal and feature information corresponding to a posture signal and feature information corresponding to an electromyographic signal in the at least one segment of preset movement signal and feature information corresponding to a posture signal.
In some embodiments, monitoring the action of the user's movement based on the at least one segment of the motion signal may include determining the action quality of the user during movement based on the matching between the at least one segment of the motion signal and the at least one segment of the preset motion signal. Further, if the matching degree between the motion signal and the preset motion signal is within a second matching threshold range (e.g., greater than 90%), the action quality of the user during movement meets the standard. In some embodiments, determining the action of the user during movement based on the at least one segment of the motion signal may include determining the action quality of the user during movement based on matching one or more pieces of feature information in the at least one segment of the motion signal with one or more pieces of feature information in the at least one segment of the preset motion signal. It should be noted that a segment of the motion signal may correspond to a complete action or to a partial action within a complete action. In some embodiments, a complex complete action involves different force application patterns at different stages, that is, different action signals at different stages of the action; monitoring the action signals of the different stages of the complete action can improve the real-time performance of monitoring the user's action.
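As a hedged illustration only, the action-quality check above might be sketched as follows, reusing the feature_match helper from the previous sketch; the 90% second matching threshold follows the example value given above, and everything else (function name, weighting scheme) is an assumption for this example.

```python
def action_quality_meets_standard(features: dict, standard_features: dict, weights: dict,
                                  second_matching_threshold: float = 0.9) -> bool:
    """Check whether the user's action quality meets the standard action by
    computing a weighted matching degree against the standard action's features."""
    total_weight = max(sum(weights.get(k, 1.0) for k in standard_features), 1e-9)
    degree = sum(weights.get(k, 1.0) * feature_match(features.get(k, 0.0), v)
                 for k, v in standard_features.items()) / total_weight
    return degree >= second_matching_threshold
```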
It should be noted that the above description related to the flow 600 is only for illustration and description, and does not limit the application scope of the present specification. Various modifications and changes to flow 600 will be apparent to those skilled in the art in light of this description. However, such modifications and variations are intended to be within the scope of the present description. For example, in some embodiments, the user's actions may also be determined by an action recognition model or a manually preset model.
FIG. 7 is an exemplary flow diagram of action signal segmentation, shown in accordance with some embodiments of the present application. As shown in fig. 7, the process 700 may include:
In step 710, based on a time domain window of the electromyographic signal or the posture signal, at least one target feature point is determined from the time domain window according to a preset condition.
In some embodiments, this step may be performed by the processing module 220 and/or the processing device 110. The time domain window of the electromyographic signal contains the electromyographic signal within a period of time, and the time domain window of the posture signal contains the posture signal within the same period of time. A target feature point refers to a signal having a target feature in the action signal, which can characterize the stage of the user's action. For example, when the user performs sitting posture chest clamping, at the beginning the user's arms extend to the left and right in the horizontal direction, then the arms start to rotate inward, then the arms are brought together, and finally the arms return to the extended state in the horizontal direction; this process is one complete sitting posture chest clamping action. When the user performs the sitting posture chest clamping action, the feature information corresponding to the electromyographic signal or the posture signal differs at each stage, and the target feature point corresponding to the stage of the user's action can be determined by analyzing the feature information (for example, amplitude information and frequency information) corresponding to the electromyographic signal or the feature information (for example, angular velocity value, angular velocity direction, acceleration value of the angular velocity, angle, displacement information, stress, and the like) corresponding to the posture signal. In some embodiments, one or more target feature points may be determined from within the time domain window according to a preset condition. In some embodiments, the preset condition may include one or more of the following: the angular velocity direction corresponding to the posture signal changes; the angular velocity corresponding to the posture signal is greater than or equal to an angular velocity threshold; the angle corresponding to the posture signal reaches an angle threshold; the change of the angular velocity value corresponding to the posture signal is an extreme value; and the amplitude corresponding to the electromyographic signal is greater than or equal to an electromyographic threshold. In some embodiments, the target feature points of different stages of an action may correspond to different preset conditions. For example, in the sitting posture chest clamping action, the preset condition for the target feature point at which the user's arms, extended horizontally, start to rotate inward is different from the preset condition for the target feature point at which the user's arms are brought together. In some embodiments, the target feature points of different actions may correspond to different preset conditions. For example, the sitting posture chest clamping action and the biceps curl action are different, and the preset conditions corresponding to the target feature points in these two actions are also different. For example contents regarding the preset conditions, reference may be made to the descriptions of the action start point, the action intermediate point, and the action end point in this specification.
In other embodiments, at least one target feature point may be determined according to a preset condition from the time domain window of the electromyographic signal and the time domain window of the posture signal. The time domain windows of the electromyographic signal and the posture signal contain the electromyographic signal and the posture signal over the same time range, and the time points of the electromyographic signal correspond to the time points of the posture signal. For example, the time point of the electromyographic signal at which the user starts a movement is the same as the time point of the posture signal at which the user starts the movement. Here, the target feature point may be determined by combining the feature information (e.g., amplitude information) corresponding to the electromyographic signal and the feature information (e.g., angular velocity value, angular velocity direction, acceleration value of the angular velocity, angle, etc.) corresponding to the posture signal.
In step 720, the motion signal is segmented based on the at least one target feature point.
In some embodiments, this step 720 may be performed by processing module 220 and/or processing device 110. In some embodiments, the target feature points in the electromyographic signal or the posture signal may be one or more, and the motion signal may be divided into a plurality of segments by the one or more target feature points. For example, when there is a target feature point in the electromyographic signal, the target feature point may divide the electromyographic signal into two segments, where the two segments may include the electromyographic signal before the target feature point and the electromyographic signal after the target feature point. Alternatively, the processing module 220 and/or the processing device 110 may extract an electromyographic signal within a certain time range around the target feature point as a segment of the electromyographic signal. For another example, when there are a plurality of target feature points (for example, n, and the first target feature point is not the start point of the time domain window and the nth target feature point is not the end point of the time domain window) in the electromyographic signal, the electromyographic signal may be divided into n +1 segments according to the n target feature points. For another example, when the electromyographic signal has a plurality of target feature points (for example, n, where the first target feature point is a start point of a time domain window and the nth target feature point is not an end point of the time domain window), the electromyographic signal may be divided into n segments according to the n target feature points. For another example, when the electromyographic signal has a plurality of target feature points (for example, n, where the first target feature point is a start point of a time domain window and the nth target feature point is an end point of the time domain window), the electromyographic signal may be divided into n-1 segments according to the n target feature points. It should be noted that the action phase corresponding to the target feature point may include one or more types, and when there are multiple types of action phases corresponding to the target feature point, the motion signal may be segmented by using the multiple types of target feature points as references. For example, the action phase corresponding to the target feature point may include an action start point and an action end point, where the action start point is before the action end point, and here, the action signal from the action start point to the next action start point may be regarded as a segment of the action signal.
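For illustration only, the splitting rule described above (n target feature points yielding n+1, n, or n-1 segments depending on whether the first or last feature point coincides with the window boundaries) might be sketched as follows; the function name and the use of sample indices are assumptions made for this example.

```python
import numpy as np

def segment_by_target_feature_points(signal: np.ndarray, feature_indices) -> list:
    """Split one time-domain window of an action signal at the target feature points.

    `feature_indices` holds the sample indices of the target feature points in time
    order. A feature point that coincides with the start or the end of the window
    does not create an extra empty segment, which matches the n+1 / n / n-1 cases
    described above.
    """
    boundaries = sorted(set([0] + [int(i) for i in feature_indices] + [len(signal)]))
    return [signal[a:b] for a, b in zip(boundaries[:-1], boundaries[1:]) if b > a]
```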
In some embodiments, the target feature points may include one or more of an action start point, an action intermediate point, or an action end point.
For the purpose of describing the segmentation of the action signal, the case in which the target feature points include an action start point, an action intermediate point, and an action end point at the same time is taken as an exemplary illustration, where the action start point may be regarded as the start point of one action cycle of the user. In some embodiments, different actions may correspond to different preset conditions. For example, in the sitting posture chest clamping action, the preset condition may be that the angular velocity direction of the action after the action start point changes relative to the angular velocity direction of the action before the action start point, or that the angular velocity value at the action start point is approximately 0 and the acceleration value of the angular velocity at the action start point is greater than 0. That is, when the user performs sitting posture chest clamping, the action start point may be the time point at which the arms, extended horizontally to the left and right, start to rotate inward. For another example, in the biceps curl action, the preset condition may be that the angle to which the arm is raised is greater than or equal to an angle threshold. Specifically, when the user performs the biceps curl action, the raising angle of the user's arm is 0° when the arm is horizontal, the angle is negative when the arm hangs down, and the angle is positive when the arm is raised; when the user's arm is lifted above the horizontal position, the raising angle of the arm is greater than 0. The time point at which the angle to which the user's arm is raised reaches the angle threshold may be regarded as the action start point. The angle threshold may be in the range of -70° to -20°; preferably, the angle threshold may be in the range of -50° to -25°. In some embodiments, to further ensure the accuracy of the selected action start point, the preset condition may further include that the angular velocity of the arm within a certain time range after the action start point is greater than or equal to an angular velocity threshold. The angular velocity threshold may range from 5°/s to 50°/s; preferably, the angular velocity threshold may range from 10°/s to 30°/s. For example, when the user performs the biceps curl action and the arm passes the angle threshold while continuing to be lifted upward, the angular velocity of the arm remains greater than the angular velocity threshold within the subsequent specific time range (e.g., 0.05 s, 0.1 s, 0.5 s). In some embodiments, if the angular velocity within the specific time range after an action start point selected according to the preset condition is less than the angular velocity threshold, that point is not taken as the action start point, and judgment according to the preset condition continues until an action start point is determined.
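For illustration only, the following sketch applies the start-point rule above to a curl-like action; the threshold values (an angle threshold of -30°, within the preferred -50° to -25° range, an angular velocity threshold of 15°/s, and a 0.1 s hold time) and the sampling rate are example assumptions and not prescribed values.

```python
import numpy as np

def find_action_start_point(angle: np.ndarray, angular_velocity: np.ndarray,
                            fs: float = 100.0, angle_threshold: float = -30.0,
                            velocity_threshold: float = 15.0, hold_time: float = 0.1):
    """Find an action start point for a biceps-curl-like action.

    A sample is a candidate start point when the arm angle first reaches the angle
    threshold; it is accepted only if the angular velocity stays at or above the
    angular velocity threshold for `hold_time` seconds afterwards, as described above.
    """
    hold = max(int(hold_time * fs), 1)
    for i in range(1, len(angle) - hold):
        crossed = angle[i - 1] < angle_threshold <= angle[i]
        if crossed and np.all(angular_velocity[i:i + hold] >= velocity_threshold):
            return i              # sample index of the action start point
    return None                   # no start point found; keep checking new data
```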
In some embodiments, the action middle point may be a point within one action cycle from the action start point. For example, when the user performs sitting posture chest clamping, the action start point may be the time point at which the arms, extended horizontally to the left and right, start to rotate inward, and the time point at which the arms are brought together may be the user's action middle point. In some embodiments, the preset condition may be that the angular velocity direction at a time point after the action middle point changes relative to the angular velocity direction at a time point before the action middle point, and that the angular velocity value at the action middle point is approximately 0, where the angular velocity direction at the action middle point is opposite to the angular velocity direction at the action start point. In some embodiments, to improve the accuracy of the selection of the action middle point, the rate of change of the angular velocity (the acceleration of the angular velocity) within a first specific time range (e.g., 0.05 s, 0.1 s, 0.5 s) after the action middle point may be required to be greater than an acceleration threshold of the angular velocity (e.g., 0.05 rad/s). In some implementations, while the action middle point satisfies the preset condition, the amplitude information corresponding to the action middle point in the electromyographic signal is greater than an electromyographic threshold. Because the electromyographic signals corresponding to different actions differ, the electromyographic threshold is related to the user's action and to the target electromyographic signal; in sitting posture chest clamping, the electromyographic signal of the pectoral muscles is the target electromyographic signal. In some embodiments, the position corresponding to the action middle point (also called the "middle position") can be approximately regarded as the point of maximum muscle force, where the electromyographic signal has a larger value. When the user performs the corresponding exercise, the electromyographic signal of the corresponding portion of the user's body increases greatly relative to the electromyographic signal of that portion when the user is not exercising (in which case the muscle of that portion may be regarded as being in a resting state); for example, the amplitude of the electromyographic signal of the corresponding portion when the user's action reaches the middle position may be 10 times that of the same portion in the resting state. In addition, for different types of actions performed by the user, the relationship between the amplitude of the electromyographic signal of the corresponding portion at the middle position (action middle point) and its amplitude in the resting state also differs, and this relationship can be adaptively adjusted according to the actual action. In some embodiments, to improve the accuracy of the selection of the action middle point, the corresponding amplitude within a second specific time range (e.g., 0.05 s, 0.1 s, 0.5 s) after the action middle point may be required to remain greater than the electromyographic threshold.
In some embodiments, in addition to the preset conditions described above (for example, the angular velocity condition and the amplitude condition of the electromyographic signal), the determination of the action middle point may also require that the Euler angle (also referred to as the angle) between the action middle point and the start position satisfies a certain condition. For example, in sitting posture chest clamping, the Euler angle of the action middle point relative to the action start point may be required to satisfy one or more Euler angle thresholds (also referred to as angle thresholds); for example, with the anterior-posterior direction of the human body as the X-axis, the left-right direction of the human body as the Y-axis, and the height direction of the human body as the Z-axis, the Euler angle variation in the X and Y directions may be less than 25°, and the Euler angle variation in the Z direction may be greater than 40° (the sitting posture chest clamping action is mainly a rotation about the Z-axis, and the above parameters are merely reference examples). In some embodiments, the electromyographic threshold and/or the Euler angle threshold may be pre-stored in the memory or hard disk of the wearable device 130, may be stored in the processing device 110, or may be calculated according to actual conditions and adjusted in real time.
In some embodiments, the processing module 220 may determine the action intermediate point from a time domain window of a time point after the action start point according to a preset condition based on a time domain window of the electromyographic signal or the posture signal. In some implementations, after the motion intermediate point is determined, whether other time points meeting the preset condition exist in the time range from the motion starting point to the motion intermediate point may be re-verified, and if so, the motion starting point closest to the motion intermediate point may be selected as the optimal motion starting point. In some embodiments, if the difference between the time of the motion middle point and the time of the motion start point is greater than a certain time threshold (e.g., 1/2 or 2/3 of one motion cycle), the motion middle point is invalid, and the motion start point and the motion middle point are re-determined according to a preset condition.
In some embodiments, the action end point may be within one action cycle from the action start point and at a time point after the action intermediate point; for example, the action end point may be set to the point one action cycle away from the action start point, in which case the action end point may be regarded as the end point of one action cycle of the user. For example, when the user performs sitting posture chest clamping, the action start point may be the time point at which the arms, extended in the horizontal direction, start to rotate inward, the time point at which the arms are brought together may be the user's action intermediate point, and the time point at which the arms return to the extended state in the horizontal direction may correspond to the user's action end point. In some embodiments, the preset condition may be that the change of the angular velocity value corresponding to the posture signal is an extreme value. In some embodiments, to prevent misjudgment due to jitter, the change in the Euler angle should exceed a certain Euler angle threshold, for example 20°, within the time range from the action intermediate point to the action end point. In some embodiments, the processing module 220 may determine the action end point from the time domain window after the action intermediate point according to the preset condition, based on the time domain windows of the electromyographic signal and the posture signal. In some embodiments, if the difference between the time of the action end point and the time of the action intermediate point is greater than a specific time threshold (e.g., 1/2 of one action cycle), the action start point and the action intermediate point are invalid, and the action start point, the action intermediate point, and the action end point are re-determined according to the preset conditions.
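For illustration only, the two searches above might be sketched as follows, assuming per-sample arrays of angular velocity and electromyographic amplitude; the helper names, the zero-band tolerance, and the thresholds are assumptions for this example, and the additional validity checks described above (for example, discarding a middle point that lies more than half an action cycle after the start point) are omitted for brevity.

```python
import numpy as np

def find_action_middle_point(angular_velocity: np.ndarray, emg_amplitude: np.ndarray,
                             start: int, emg_threshold: float, zero_band: float = 1.0):
    """Search after the action start point for a middle point where the angular
    velocity direction reverses, its value is approximately zero, and the EMG
    amplitude exceeds the electromyographic threshold."""
    for i in range(start + 1, len(angular_velocity) - 1):
        direction_reversed = angular_velocity[i - 1] * angular_velocity[i + 1] < 0
        near_zero = abs(angular_velocity[i]) < zero_band
        if direction_reversed and near_zero and emg_amplitude[i] > emg_threshold:
            return i
    return None

def find_action_end_point(angular_velocity: np.ndarray, middle: int):
    """Search after the action middle point for the end point, taken here as the
    sample where the change of the angular velocity value is largest (an extreme
    value), following the preset condition described above."""
    change = np.abs(np.diff(angular_velocity[middle:]))
    if change.size == 0:
        return None
    return middle + 1 + int(np.argmax(change))
```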
In some embodiments, the determination of at least one set of an action start point, an action intermediate point, and an action end point in the action signal may be repeated, and the action signal may be segmented using the at least one set of action start point, action intermediate point, and action end point as target feature points. This step may be performed by the processing module 220 and/or the processing device 110. It should be noted that the segmentation of the action signal is not limited to the above-mentioned action start point, action intermediate point, and action end point, and may include other time points. For example, for the sitting posture chest clamping action, 5 time points may be selected according to the above steps: the first time point may be the action start point, the second time point may be the time point at which the inward-rotation angular velocity is maximum, the third time point may be the action intermediate point, the fourth time point may be the time point at which the outward-rotation angular velocity is maximum, and the fifth time point may be the time point at which the arms return to the left-right extended state and the angular velocity is 0, that is, the action end point. Compared with the action start point, action intermediate point, and action end point in the above steps, this example adds the second time point as a mark of the 1/4 position of the action cycle, uses the fourth time point (the action end point described in the foregoing embodiment) to mark the 3/4 position of the action cycle, and adds the fifth time point as the end point of the complete action. By using more time points for the sitting posture chest clamping action, the action quality can be identified based on the signal of the first 3/4 of the action cycle (that is, identifying the action quality within a single cycle does not depend on fully analyzing the signal of the whole cycle), so that the monitoring and feedback of the user's action can be completed before the action of the current cycle is finished; at the same time, all signals of the whole action process can be completely recorded and uploaded to a cloud or a mobile terminal device, so that more methods can be adopted to monitor the user's action. For more complex actions, the cycle of one action is very long and each stage has a different force application pattern; in some embodiments, such an action can be divided into multiple stages by the above method of determining each time point, and the signal of each stage can be individually identified and fed back, thereby improving the real-time performance of the feedback on the user's action.
It should be noted that segmenting and monitoring the action signal according to the action start point, the action intermediate point, and the action end point as a set of target feature points, as described above, is only an exemplary illustration; in some embodiments, the action signal of the user may also be segmented and monitored based on any one or more of the action start point, the action intermediate point, and the action end point as target feature points. For example, the action signal may be segmented and monitored with the action start point as the target feature point. For another example, the action signal may be segmented and monitored with the action start point and the action end point as a set of target feature points; other time points or time ranges that can serve as target feature points are all within the scope of the present specification.
It should be noted that the above description related to the flow 700 is only for illustration and description, and does not limit the applicable scope of the present specification. Various modifications and changes to the flow 700 will be apparent to those skilled in the art in light of this description. However, such modifications and variations are still within the scope of the present specification. For example, steps 710 and 720 may be performed simultaneously in the processing module 220. For another example, step 710 and step 720 may be performed in the processing module 220 and the processing device 110, respectively.
FIG. 8 is a schematic diagram of action signal segmentation according to some embodiments of the present application. In FIG. 8, the abscissa may represent the time of the user's movement, and the ordinate may represent the amplitude information of the electromyographic signal of the corresponding muscle portion (for example, the pectoralis major) during the user's sitting posture chest clamping exercise. FIG. 8 also includes an angular velocity variation curve and an Euler angle variation curve corresponding to the posture signal at the wrist position during the user's movement, where the angular velocity variation curve represents the change in velocity during the user's movement and the Euler angle curve represents the position of the user's body part during the movement. As shown in FIG. 8, point A1 is determined as the action start point according to the preset condition. Specifically, the angular velocity direction at a time point after the user's action start point A1 changes relative to the angular velocity direction at a time point before the action start point A1. Further, the angular velocity value at the action start point A1 is approximately 0, and the acceleration value of the angular velocity at the action start point A1 is greater than 0.
Referring to FIG. 8, point B1 is determined as the action intermediate point according to the preset condition. Specifically, the angular velocity direction at a time point after the user's action intermediate point B1 changes relative to the angular velocity direction at a time point before the action intermediate point B1, and the angular velocity value at the action intermediate point B1 is approximately 0, where the angular velocity direction at the action intermediate point B1 is opposite to the angular velocity direction at the action start point A1. In addition, the amplitude of the electromyographic signal (shown as "myoelectric signal" in FIG. 8) corresponding to the action intermediate point B1 is greater than the electromyographic threshold.
With continued reference to FIG. 8, point C1 is determined as the action end point according to the preset condition. Specifically, from the action start point A1 to the action end point C1, the change of the angular velocity value at the action end point C1 is an extreme value. In some embodiments, the process 700 may complete the action segmentation shown in FIG. 8, and the action signal from the action start point A1 to the action end point C1 shown in FIG. 8 may be regarded as one segment of the user's action.
It is noted that, in some embodiments, if the time interval between the action intermediate point and the action start point is greater than a specific time threshold (e.g., 1/2 of one action cycle), the processing module 220 may re-determine the action start point to ensure the accuracy of the action segmentation. The specific time threshold may be stored in the memory or hard disk of the wearable device 130, may be stored in the processing device 110, or may be calculated or adjusted according to the actual situation of the user's movement. For example, if the time interval between the action start point A1 and the action intermediate point B1 in FIG. 8 is greater than the specific time threshold, the processing module 220 may re-determine the action start point, thereby improving the accuracy of the action segmentation. The segmentation of the action signal is not limited to the above-mentioned action start point A1, action intermediate point B1, and action end point C1, and may include other time points; the time points may be selected according to the complexity of the action.
When the action signal of the user is acquired, the quality of the action signal may be affected by external conditions, such as relative movement or pressure between the user's body and the acquisition module 210 during exercise, which may, for example, cause an abrupt change in the electromyographic signal and thereby affect the monitoring of the user's action. For convenience of description, such abrupt changes in the electromyographic signal may be described as singular points; exemplary singular points may include a glitch signal, a discontinuity signal, and the like. In some embodiments, monitoring the action of the user's movement based on at least the feature information corresponding to the electromyographic signal or the feature information corresponding to the posture signal may further include: preprocessing the electromyographic signal in the frequency domain or the time domain, acquiring the feature information corresponding to the electromyographic signal based on the preprocessed electromyographic signal, and monitoring the user's movement according to the feature information corresponding to the electromyographic signal or the feature information corresponding to the posture signal. In some embodiments, preprocessing the electromyographic signal in the frequency domain or the time domain may include filtering the electromyographic signal in the frequency domain to select or retain the components of a specific frequency range in the electromyographic signal. In some embodiments, the frequency range of the electromyographic signal acquired by the acquisition module 210 is 1 Hz-1000 Hz, and the electromyographic signal may be filtered so that components of a specific frequency range (e.g., 30 Hz-150 Hz) are selected for subsequent processing. In some embodiments, the specific frequency range may be 10 Hz-500 Hz. Preferably, the specific frequency range may be 15 Hz-300 Hz. More preferably, the specific frequency range may be 30 Hz-150 Hz. In some embodiments, the filtering process may include low-pass filtering. In some embodiments, the low-pass filter may include an LC passive filter, an RC active filter, or a passive filter composed of special elements. In some embodiments, the passive filter composed of special elements may include one or more of a piezoelectric ceramic filter, a crystal filter, and a surface acoustic wave filter. It should be noted that the specific frequency range is not limited to the above ranges and may be other ranges, which may be selected according to actual situations. For the contents of monitoring the user's movement according to the feature information corresponding to the electromyographic signal or the feature information corresponding to the posture signal, reference may be made to FIG. 5 and FIG. 6 of this specification and the related descriptions.
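For illustration only, a minimal software sketch of the frequency-domain filtering step is given below. It assumes a sampling rate of 1000 Hz and uses a digital Butterworth band-pass filter to retain the 30 Hz-150 Hz components; this digital filter is only a stand-in for the hardware filters listed above (LC, RC, piezoelectric ceramic, crystal, surface acoustic wave), and the function name, filter order, and cutoff values are assumptions for this example.

```python
import numpy as np
from scipy.signal import butter, filtfilt

def retain_frequency_band(emg: np.ndarray, fs: float = 1000.0,
                          low: float = 30.0, high: float = 150.0,
                          order: int = 4) -> np.ndarray:
    """Retain only the 30 Hz-150 Hz components of the raw electromyographic signal."""
    nyquist = fs / 2.0
    b, a = butter(order, [low / nyquist, high / nyquist], btype="bandpass")
    # Zero-phase filtering avoids shifting the signal in time.
    return filtfilt(b, a, emg)
```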
In some embodiments, preprocessing the electromyographic signal in the frequency domain or the time domain may further include performing signal correction processing on the electromyographic signal in the time domain. Signal correction processing refers to correcting singular points (for example, a glitch signal, a discontinuity signal, and the like) in the electromyographic signal. In some embodiments, performing signal correction processing on the electromyographic signal in the time domain may include determining the singular points in the electromyographic signal, that is, determining the abrupt changes in the electromyographic signal. A singular point may be a discontinuity of the electromyographic signal caused by an abrupt change of its amplitude at a certain moment. For another example, the electromyographic signal may be smooth in morphology and its amplitude may not change abruptly, while the first-order derivative of the signal changes abruptly and is discontinuous. In some embodiments, the method of determining the singular points in the electromyographic signal may include, but is not limited to, one or more of a Fourier transform, a wavelet transform, a fractal dimension, and the like. In some embodiments, performing signal correction processing on the electromyographic signal in the time domain may include removing the singular points in the electromyographic signal, for example, deleting the signal within a certain time range at and around a singular point. Alternatively, performing signal correction processing on the electromyographic signal in the time domain may include modifying a singular point of the electromyographic signal according to the feature information of the electromyographic signal within a specific time range, for example, adjusting the amplitude of the singular point according to the signals around it. In some embodiments, the feature information of the electromyographic signal may include one or more of amplitude information and statistical information of the amplitude information. The statistical information of the amplitude information (also called amplitude entropy) refers to the distribution of the amplitude information of the electromyographic signal in the time domain. In some embodiments, after the location (e.g., the corresponding time point) of a singular point in the electromyographic signal is determined by a signal processing algorithm (e.g., Fourier transform, wavelet transform, fractal dimension), the singular point may be corrected according to the electromyographic signal within a certain time range before or after that location. For example, when the singular point is a sudden trough, the electromyographic signal at the trough may be supplemented based on the feature information (for example, amplitude information, statistical information of the amplitude information) of the electromyographic signal within a specific time range (for example, 5 ms-60 ms) before or after the trough.
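For illustration only, the sketch below detects abrupt jumps with a simple first-difference threshold, which is an assumed simplification standing in for the Fourier-transform, wavelet-transform, or fractal-dimension based detection mentioned above, and then patches each suspect sample using the amplitude of the signal in a nearby time range (about 30 ms before the jump, within the 5 ms-60 ms range given above). The function name, the jump criterion, and the patch length are assumptions for this example.

```python
import numpy as np

def correct_singular_points(emg: np.ndarray, fs: float = 1000.0,
                            jump_factor: float = 8.0, patch_ms: float = 30.0) -> np.ndarray:
    """Detect abrupt jumps in the electromyographic signal and patch them using the
    mean amplitude of the preceding window (an assumed correction rule)."""
    corrected = emg.astype(float).copy()
    diff = np.abs(np.diff(corrected))
    threshold = jump_factor * (np.median(diff) + 1e-12)   # assumed jump criterion
    patch = max(int(patch_ms * 1e-3 * fs), 1)
    for i in np.where(diff > threshold)[0]:
        left = max(i - patch + 1, 0)
        # Replace the suspect sample with the mean amplitude of the preceding window.
        corrected[i + 1] = np.mean(corrected[left:i + 1])
    return corrected
```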
Taking the case where the singular point is a glitch signal as an example, fig. 9 is an exemplary flowchart of electromyographic signal preprocessing according to some embodiments of the present application. As shown in fig. 9, the process 900 may include:
in step 910, based on the time domain window of the electromyographic signal, selecting different time windows from the time domain window of the electromyographic signal, wherein the different time windows respectively cover different time ranges.
In some embodiments, this step may be performed by processing module 220 and/or processing device 110. In some embodiments, the different time windows may include at least one specific window. The specific window refers to a window having a specific time length selected from the time domain window. For example, when the time length of the time domain window of the electromyographic signal is 3 s, the time length of the specific window may be 100 ms. In some embodiments, a specific window may include a plurality of different time windows. For example only, the specific window may include a first time window and a second time window. The first time window refers to a window corresponding to part of the time length within the specific window; for example, when the time length of the specific window is 100 ms, the time length of the first time window may be 80 ms. The second time window refers to another window corresponding to part of the time length within the specific window; for example, when the specific window is 100 ms, the second time window may be 20 ms. In some embodiments, the first time window and the second time window may be consecutive time windows within the same specific window. In some embodiments, the first time window and the second time window may also be two discontinuous or overlapping time windows within the same specific window. For example, when the time length of the specific window is 100 ms, the time length of the first time window may be 80 ms and the time length of the second time window may be 25 ms, in which case 5 ms of the second time window overlaps with the first time window. In some embodiments, the processing module 220 may, based on the time domain window of the electromyographic signal, slide and update the specific window in sequence by a specific time length from the time start point of the time domain window, and may continuously divide each updated specific window into the first time window and the second time window. The specific time length referred to here may be less than 1 s, 2 s, 3 s, or the like. For example, the processing module 220 may select a specific window with a specific time length of 100 ms and divide it into a first time window of 80 ms and a second time window of 20 ms. Further, the specific window may be updated by sliding along the time direction. The sliding distance may be the time length of the second time window (e.g., 20 ms), or another suitable time length, such as 30 ms, 40 ms, and the like.
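The sliding and splitting of the specific window can be sketched as follows; this is a minimal illustration assuming a 1 kHz sampling rate and contiguous, non-overlapping first and second time windows:

```python
def iter_windows(n_samples, fs=1000, win_ms=100, first_ms=80, second_ms=20):
    # Yield (first_window, second_window) index slices of the specific window,
    # sliding forward by the length of the second time window (20 ms here).
    win, first, second = (int(fs * t / 1000) for t in (win_ms, first_ms, second_ms))
    start = 0
    while start + win <= n_samples:
        yield slice(start, start + first), slice(start + first, start + win)
        start += second
```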
In step 920, the spur signal is determined based on the characteristic information corresponding to the electromyographic signal in the different time window.
In some embodiments, this step may be performed by processing module 220 and/or processing device 110. In some embodiments, the characteristic information corresponding to the electromyographic signals may include at least one of amplitude information and statistical information of the amplitude information. In some embodiments, the processing module 220 may obtain amplitude information or statistical information of the amplitude information corresponding to the electromyographic signals in different time windows (e.g., a first time window and a second time window) to determine the position of the glitch signal. For a specific description of determining the position of the glitch signal based on the corresponding characteristic information of the electromyographic signals in different time windows, reference may be made to fig. 10 and the related description thereof.
It should be noted that the above description of the flow 900 is for illustration and description only and does not limit the scope of the application of the present disclosure. Various modifications and changes to flow 900 may occur to those skilled in the art, given the benefit of this description. For example, the specific window is not limited to include the first time window and the second time window described above, and may include other time windows, e.g., a third time window, a fourth time window, and so on. In addition, the specific range of the time before or after the position of the glitch signal can be adaptively adjusted according to the length of the glitch signal, and is not further limited herein. However, such modifications and variations are still within the scope of the present specification.
FIG. 10 is an exemplary flowchart of removing a glitch signal according to some embodiments of the present application. As shown in fig. 10, the process 1000 may include:
in step 1010, first amplitude information corresponding to the myoelectric signal in the first time window and second amplitude information corresponding to the myoelectric signal in the second time window are determined.
In some embodiments, this step may be performed by processing module 220 and/or processing device 110. In some embodiments, the processing module 220 may select the time lengths of the first time window and the second time window, and extract first amplitude information corresponding to the myoelectric signal within the time length of the first time window and second amplitude information corresponding to the myoelectric signal within the time length of the second time window. In some embodiments, the first amplitude information may comprise an average amplitude of the myoelectric signal over a first time window and the second amplitude information may comprise an average amplitude of the myoelectric signal over a second time window. For example, the processing module 220 may select a first time window with a time length of 80ms and extract first amplitude information corresponding to the myoelectric signal in the first time window, and the processing module 220 may select a second time window with a time length of 20ms and extract second amplitude information corresponding to the myoelectric signal in the second time window.
In some embodiments, the time lengths of the first time window and the second time window are selected in relation to the shortest glitch signal length and the computational effort of the system. In some embodiments, the time lengths of the first time window and the second time window may be selected based on the characteristics of the glitch signal. For a glitch caused by the electrocardiographic (ECG) signal, the time length of the glitch is 40 ms-100 ms, the time interval between two such glitches may be about 1 s, the two sides of the peak point of the glitch are basically symmetrical, and the amplitude distribution on both sides of the glitch is relatively even. In some embodiments, when the glitch signal originates from the ECG signal, a time length smaller than the glitch length, for example, half of the glitch length, may be selected as the time length of the second time window, and the time length of the first time window may be larger than that of the second time window, for example, 4 times the time length of the second time window. In some embodiments, the time length of the first time window is within the range of the glitch interval (about 1 s) minus the time length of the second time window. It should be noted that the selected time lengths of the first time window and the second time window are not limited to the above description, as long as the sum of the time length of the second time window and the time length of the first time window is less than the time interval between two adjacent glitches, or the time length of the second time window is less than the length of a single glitch, or the amplitude of the electromyographic signal in the second time window and that in the first time window can be well distinguished.
In step 1020, it is determined whether the ratio of the second amplitude information to the first amplitude information is greater than a threshold.
In some embodiments, this step may be performed by processing module 220 and/or processing device 110. In some embodiments, the processing module 220 may determine whether a ratio of second amplitude information corresponding to the myoelectric signal in the second time window to first amplitude information corresponding to the myoelectric signal in the first time window is greater than a threshold. The threshold value may be stored in the memory or hard disk of the wearable device 130, may be stored in the processing device 110, or may be adjusted according to actual situations. In some embodiments, if the processing module 220 determines that the ratio of the second magnitude information to the first magnitude information is greater than the threshold, then step 1020 may proceed to step 1030. In other embodiments, if the processing module 220 determines that the ratio of the second amplitude information to the first amplitude information is not greater than the threshold, step 1020 may proceed to step 1040.
In step 1030, signal correction processing is performed on the electromyogram signal within the second time window.
In some embodiments, this step may be performed by processing module 220 and/or processing device 110. In some embodiments, the processing module 220 may perform signal correction processing on the electromyographic signals within the second time window according to the determination result of the magnitude relationship between the ratio of the second amplitude information and the first amplitude information and the threshold in step 1020. For example, in some embodiments, if the ratio of the second amplitude information to the first amplitude information is greater than the threshold, the electromyographic signal within the second time window corresponding to the second amplitude information is a glitch signal. In some embodiments, the processing of the electromyographic signals within the second time window may include signal-correction processing the electromyographic signals within the second time window based on the electromyographic signals within a specific time range before or after the second time window. In some embodiments, the manner of performing the signal correction processing on the electromyographic signals within the second time window may include, but is not limited to, padding, interpolation, and the like. In some embodiments, the particular time range may be 5ms to 60 ms. Preferably, the specific time range may be 10ms to 50 ms. Further preferably, the specific time range may be 20ms to 40 ms. It should be noted that the specific time range is not limited to the above range, for example, the specific time range may be larger than 60ms, or smaller than 5 ms. In an actual application scenario, adaptive adjustment can be performed according to the time length of the spur signal.
In step 1040, the electromyographic signals within the second time window are retained.
In some embodiments, this step may be performed by processing module 220 and/or processing device 110. In some embodiments, the processing module 220 may perform the retaining of the electromyographic signals within the second time window according to the determination result of the magnitude relationship between the ratio of the second amplitude information and the first amplitude information and the threshold in step 1020. For example, in some embodiments, if the ratio of the second amplitude information to the first amplitude information is not greater than the threshold, the electromyographic signal within the second time window corresponding to the second amplitude information is a normal electromyographic signal, and the normal electromyographic signal may be retained, that is, the electromyographic signal within the second time window is retained.
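Putting steps 1010 to 1040 together, the following is a minimal sketch of flow 1000; the threshold value, the linear-interpolation correction and the window lengths are illustrative assumptions rather than values fixed by this specification:

```python
import numpy as np

def deburr(emg, fs=1000, first_ms=80, second_ms=20, threshold=2.0):
    # Compare the mean absolute amplitude of the second time window against the
    # first time window; if the ratio exceeds the threshold, treat the second
    # window as a glitch and replace it by linear interpolation across it.
    emg = np.asarray(emg, dtype=float).copy()
    first, second = int(fs * first_ms / 1000), int(fs * second_ms / 1000)
    start = 0
    while start + first + second <= emg.size:
        a = slice(start, start + first)
        b = slice(start + first, start + first + second)
        amp1 = np.mean(np.abs(emg[a])) + 1e-12
        amp2 = np.mean(np.abs(emg[b]))
        if amp2 / amp1 > threshold:
            left, right = emg[b.start - 1], emg[min(b.stop, emg.size - 1)]
            emg[b] = np.linspace(left, right, second)
        start += second  # slide the specific window by the second-window length
    return emg
```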
It should be noted that, as electric charges are gradually accumulated during the muscle force application process of the user, the amplitude of the electromyographic signal is gradually increased, so that the amplitude of the electromyographic signal in two adjacent time windows (for example, the first time window and the second time window) is not abrupt in the absence of the glitch signal. In some embodiments, the process of determining and removing the glitch signal in the electromyogram signal based on the process 1000 may implement real-time processing of the glitch signal, so that the wearable device 130 or the mobile terminal device 140 may feed back the motion state of the user in real time to help the user perform more scientific motion.
In some embodiments, the first time window may correspond to a greater length of time than the second time window. In some embodiments, the particular length of time for a particular window may be less than 1 s. In some embodiments, the ratio of the length of time corresponding to the first time window to the length of time corresponding to the second time window may be greater than 2. In some embodiments, the selection of the time length corresponding to the first time window, the time length corresponding to the second time window, and the specific time length corresponding to the specific window may ensure that the shortest glitch signal length (e.g., 40ms) may be removed and have a high signal-to-noise ratio, and may make the calculation amount of the system relatively small, reduce the repeated calculation of the system, and reduce the time complexity, thereby improving the calculation efficiency and the calculation accuracy of the system.
It should be noted that the above description of the process 1000 is for illustration and description only, and does not limit the scope of the application of the present disclosure. Various modifications and changes to flow 1000 will be apparent to those skilled in the art in light of this description. For example, the above-mentioned flow 1000 is only an example in which the singular point is a spur signal, and when the singular point is a valley signal, the above-mentioned steps (for example, step 1010, step 1020, step 1030, and the like) and their schemes may be adjusted or signal correction processing may be performed by another method. However, such modifications and variations are intended to be within the scope of the present description.
In some embodiments, other methods may also be adopted for the signal correction processing of singular points of the electromyographic signal, for example, a high-pass method, a low-pass method, a band-pass method, a wavelet transform reconstruction method, and the like. In some embodiments, for application scenarios that are not sensitive to low-frequency components, a 100 Hz high-pass filter may be used for glitch removal. In some embodiments, in addition to the signal correction processing, other kinds of signal processing, such as filtering, signal amplification, and phase adjustment, may be performed on the electromyographic signal. In some embodiments, the electromyographic signal of the user collected by the electromyographic sensor may be converted into a digital electromyographic signal through an analog-to-digital converter (ADC), the converted digital electromyographic signal may be filtered, and the filtering may remove the power-frequency interference, its harmonics, and the like. In some embodiments, the processing of the electromyographic signal may further include removing motion artifacts of the user. A motion artifact refers to signal noise generated, during the acquisition of the electromyographic signal, by the relative movement of the muscle at the measured position with respect to the electromyographic module when the user moves.
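For the power-frequency interference mentioned above, a notch filter is one common option; the following is a minimal sketch assuming SciPy and a 50 Hz mains frequency (harmonics could be handled by repeating the notch at 100 Hz, 150 Hz, and so on):

```python
from scipy.signal import iirnotch, filtfilt

def remove_power_line(emg, fs=1000, mains_hz=50.0, q=30.0):
    # Notch out the mains frequency from the digitized EMG signal.
    b, a = iirnotch(mains_hz, q, fs=fs)
    return filtfilt(b, a, emg)
```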
In some embodiments, the posture signals may be acquired by posture sensors on the wearable device 130. The posture sensors on the wearable device 130 may be distributed over the limbs (e.g., arms, legs, etc.), the torso (e.g., chest, abdomen, back, waist, etc.), and the head of the human body, so that posture signals of the limbs, the torso, and other body parts can be collected. In some embodiments, the posture sensor may also be an attitude and heading reference system (AHRS) sensor with an attitude fusion algorithm. The attitude fusion algorithm may fuse the data of a nine-axis inertial measurement unit (IMU) having a three-axis acceleration sensor, a three-axis angular velocity sensor, and a three-axis geomagnetic sensor into Euler angles or quaternions to obtain the posture signal of the body part of the user where the posture sensor is located. In some embodiments, the processing module 220 and/or the processing device 110 may determine the characteristic information corresponding to the posture based on the posture signal. In some embodiments, the characteristic information corresponding to the posture signal may include, but is not limited to, an angular velocity value, an angular velocity direction, an acceleration value of the angular velocity, and the like. In some embodiments, the posture sensor may be a strain sensor, and the strain sensor may acquire the bending direction and bending angle at a joint of the user, thereby acquiring the posture signal when the user moves. For example, a strain sensor may be disposed at the knee joint of the user; when the user moves, the body part of the user acts on the strain sensor, and the bending direction and bending angle at the knee joint can be calculated based on the resistance or length change of the strain sensor, so as to obtain the posture signal of the user's leg. In some embodiments, the posture sensor may further include an optical fiber sensor, and the posture signal may be characterized by the change of the direction of light after bending in the optical fiber sensor. In some embodiments, the posture sensor may also be a magnetic flux sensor, and the posture signal may be characterized by the change of the magnetic flux. It should be noted that the type of the posture sensor is not limited to the above-described sensors; any sensor capable of acquiring a user posture signal is within the scope of the posture sensor of the present specification.
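As a small illustration of the attitude fusion output described above, a fused quaternion can be converted into Euler angles; the Z-Y-X order below is an assumption, and SciPy is used only for brevity:

```python
from scipy.spatial.transform import Rotation as R

def quaternion_to_euler(qx, qy, qz, qw):
    # Convert a fused attitude quaternion (e.g., from a nine-axis IMU/AHRS)
    # into Z-Y-X Euler angles in degrees.
    return R.from_quat([qx, qy, qz, qw]).as_euler("ZYX", degrees=True)

yaw, pitch, roll = quaternion_to_euler(0.0, 0.0, 0.0, 1.0)  # identity attitude
```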
FIG. 11 is an exemplary flow chart illustrating the determination of feature information corresponding to a gesture signal according to some embodiments of the present application. As shown in fig. 11, flow 1100 may include:
in step 1110, a target coordinate system and a transformation relationship between the target coordinate system and at least one original coordinate system are obtained.
In some embodiments, this step may be performed by processing module 220 and/or processing device 110. In some embodiments, the original coordinate system refers to the coordinate system corresponding to a posture sensor disposed on the human body. When the user uses the wearable device 130, the posture sensors on the wearable device 130 are distributed at different parts of the human body, so the installation angles of the posture sensors on the human body are different; each posture sensor takes its own body coordinate system as its original coordinate system, and therefore posture sensors at different parts have different original coordinate systems. In some embodiments, the posture signal acquired by each posture sensor may be a representation in its corresponding original coordinate system. Converting the posture signals in different original coordinate systems into the same coordinate system (e.g., the target coordinate system) facilitates determining the relative motion between different parts of the human body. In some embodiments, the target coordinate system refers to a human coordinate system established based on the human body. For example, the target coordinate system may take the length direction of the human torso (i.e., the direction perpendicular to the transverse plane of the human body) as the Z axis, the front-back direction of the human torso (i.e., the direction perpendicular to the coronal plane of the human body) as the X axis, and the left-right direction of the human torso (i.e., the direction perpendicular to the sagittal plane of the human body) as the Y axis. In some embodiments, there is a conversion relationship between the target coordinate system and the original coordinate system, by which coordinate information in the original coordinate system can be converted into coordinate information in the target coordinate system. In some embodiments, the conversion relationship may be represented as one or more rotation matrices. For details of determining the conversion relationship between the target coordinate system and the original coordinate system, reference may be made to fig. 13 of this specification and its related description.
In step 1120, the coordinate information in the at least one original coordinate system is converted into coordinate information in the target coordinate system based on the conversion relationship.
In some embodiments, this step may be performed by processing module 220 and/or processing device 110. The coordinate information in the original coordinate system refers to three-dimensional coordinate information in the original coordinate system, and the coordinate information in the target coordinate system refers to three-dimensional coordinate information in the target coordinate system. By way of exemplary illustration only, coordinate information v1 in the original coordinate system can be converted, according to the conversion relationship, into coordinate information v2 in the target coordinate system. Specifically, the conversion between coordinate information v1 and coordinate information v2 can be performed by means of a rotation matrix, where the rotation matrix can be understood as the conversion relationship between the original coordinate system and the target coordinate system. More specifically, coordinate information v1 in the original coordinate system can be converted into coordinate information v1-1 by a first rotation matrix, coordinate information v1-1 can be converted into coordinate information v1-2 by a second rotation matrix, coordinate information v1-2 can be converted into coordinate information v1-3 by a third rotation matrix, and coordinate information v1-3 is the coordinate information v2 in the target coordinate system. It should be noted that the rotation matrices are not limited to the first, second, and third rotation matrices described above; fewer or more rotation matrices may be included. In some alternative embodiments, the rotation matrix may also be a single rotation matrix or a combination of multiple rotation matrices.
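The chained conversion can be sketched as follows; the identity matrices below are placeholders for the actual first, second and third rotation matrices:

```python
import numpy as np

R1 = np.eye(3)  # first rotation matrix (placeholder)
R2 = np.eye(3)  # second rotation matrix (placeholder)
R3 = np.eye(3)  # third rotation matrix (placeholder)

v1 = np.array([0.0, 0.0, 1.0])   # coordinate information in the original coordinate system
v2 = R3 @ (R2 @ (R1 @ v1))       # coordinate information in the target coordinate system
# Equivalently, the three matrices can be combined into a single rotation matrix R = R3 @ R2 @ R1.
```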
In step 1130, feature information corresponding to the attitude signal is determined based on the coordinate information in the target coordinate system.
In some embodiments, this step may be performed by processing module 220 and/or processing device 110. In some embodiments, determining the characteristic information corresponding to the user posture signal based on the coordinate information in the target coordinate system may include determining the characteristic information corresponding to the user posture signal based on a plurality of pieces of coordinate information in the target coordinate system during the user's motion. For example, when the user performs the sitting posture chest clamping exercise, the user's arms raised forward may correspond to first coordinate information in the target coordinate system, and the arms opened into the same plane as the torso may correspond to second coordinate information in the target coordinate system; the characteristic information corresponding to the user posture signal, for example, the angular velocity, the angular velocity direction, and the acceleration value of the angular velocity, may be calculated based on the first coordinate information and the second coordinate information.
It should be noted that the above description of flow 1100 is for purposes of illustration and description only and is not intended to limit the scope of applicability of the present description. Various modifications and changes to flow 1100 may occur to those skilled in the art, given the benefit of this description. However, such modifications and variations are intended to be within the scope of the present description.
In some embodiments, the relative motion between different motion parts of the body of the user can also be judged through the characteristic information corresponding to the posture sensors positioned at different positions of the body of the user. For example, the relative movement between the arm and the trunk during the movement of the user can be judged through the characteristic information corresponding to the posture sensor at the arm of the user and the characteristic information corresponding to the posture sensor at the trunk part of the user. FIG. 12 is an exemplary flow diagram illustrating the determination of relative motion between different moving parts of a user according to some embodiments of the present application. As shown in fig. 12, flow 1200 may include:
in step 1210, feature information corresponding to at least two sensors is determined based on the transformation relationship between the different original coordinate systems and the target coordinate system.
In some embodiments, this step may be performed by processing module 220 and/or processing device 110. In some embodiments, different sensors have different conversion relationships between the original coordinate system and the target coordinate system corresponding to the sensors due to different installation positions at the human body. In some embodiments, the processing device 110 may convert the coordinate information in the original coordinate system corresponding to the sensors of different parts (e.g., lower arm, upper arm, torso, etc.) of the user into the coordinate information in the target coordinate system, respectively, so that the feature information corresponding to at least two sensors may be determined respectively. The related description about the transformation of the coordinate information in the original coordinate system into the coordinate information in the target coordinate system can be found elsewhere in this application, for example, fig. 11, which is not described herein again.
In step 1220, relative movement between different moving parts of the user is determined based on the feature information corresponding to the at least two sensors, respectively.
In some embodiments, this step may be performed by processing module 220 and/or processing device 110. In some embodiments, a moving part may refer to a limb of the body that can move independently, e.g., the lower arm, the upper arm, the lower leg, the upper leg, etc. By way of example only, when a user performs a dumbbell lifting exercise with the arm, the coordinate information in the target coordinate system corresponding to the sensor disposed at the lower arm and the coordinate information in the target coordinate system corresponding to the sensor disposed at the upper arm can be combined to determine the relative movement between the lower arm and the upper arm of the user, so that the user's dumbbell lifting action can be identified.
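One simple measure of the relative movement between two moving parts is the included angle between their limb vectors in the target coordinate system, as sketched below; this is an illustrative choice, not the only possible measure:

```python
import numpy as np

def included_angle(v_lower_arm, v_upper_arm):
    # Angle (in degrees) between the lower-arm and upper-arm vectors,
    # both already expressed in the target coordinate system.
    cos = np.dot(v_lower_arm, v_upper_arm) / (
        np.linalg.norm(v_lower_arm) * np.linalg.norm(v_upper_arm))
    return np.degrees(np.arccos(np.clip(cos, -1.0, 1.0)))
```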
In some embodiments, a plurality of sensors of the same or different types may be disposed at the same moving part of the user, and the coordinate information in the original coordinate system corresponding to the plurality of sensors of the same or different types may be respectively converted into the coordinate information in the target coordinate system. For example, a plurality of same or different types of sensors can be arranged at different positions of the forearm part of the user, and a plurality of pieces of coordinate information in the target coordinate system corresponding to the plurality of same or different types of sensors can simultaneously represent the motion action of the forearm part of the user. For example, the coordinate information in the target coordinate system corresponding to a plurality of sensors of the same type may be averaged, thereby improving the accuracy of the coordinate information of the moving part during the movement of the user. For another example, the coordinate information in the target coordinate system may be obtained by a fusion algorithm (e.g., kalman filter) on the coordinate information in the coordinate systems corresponding to the plurality of different types of sensors.
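For the fusion of several same-type sensors on one moving part, the simplest option mentioned above is averaging, sketched here with hypothetical readings:

```python
import numpy as np

# Hypothetical readings from three same-type sensors on the same forearm,
# already converted into the target coordinate system (one row per sensor).
readings = np.array([[0.20, -0.90, -0.38],
                     [0.22, -0.88, -0.40],
                     [0.19, -0.91, -0.37]])
fused = readings.mean(axis=0)  # simple average; a Kalman filter could be used instead
```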
It should be noted that the above description of flow 1200 is for purposes of illustration and description only and is not intended to limit the scope of applicability of the present specification. Various modifications and changes to flow 1200 may occur to those skilled in the art, given the benefit of this description. However, such modifications and variations are still within the scope of the present specification.
FIG. 13 is an exemplary flow chart illustrating determining a transformation relationship of an original coordinate system to a particular coordinate system according to some embodiments of the present application. In some embodiments, the process of determining the transformation relationship between the original coordinate system and the specific coordinate system may also be called a calibration process. As shown in fig. 13, flow 1300 may include:
in step 1310, a particular coordinate system is constructed.
In some embodiments, this step may be performed by processing module 220 and/or processing device 110. In some embodiments, the transformation relationship between the at least one original coordinate system and the target coordinate system may be obtained by a calibration process. The specific coordinate system is a reference coordinate system used for determining a conversion relation between an original coordinate system and a target coordinate system in a calibration process. In some embodiments, the specific coordinate system may be constructed by taking the length direction of the human body when standing as the Z axis, the front-back direction of the human body as the X axis, and the left-right direction of the human body as the Y axis. In some embodiments, the particular coordinate system is related to the orientation of the user during calibration. For example, in the calibration process, the front of the user body faces a certain fixed direction (e.g., north), and then the front (north) direction of the human body is the X-axis, and in the calibration process, the direction of the X-axis is fixed.
In step 1320, first coordinate information in at least one original coordinate system when the user is in the first pose is obtained.
In some embodiments, this step may be performed by the acquisition module 210. The first posture may be a posture in which the user keeps approximately standing. The acquisition module 210 (e.g., a sensor) may acquire first coordinate information in the original coordinate system based on a first gesture of the user.
In step 1330, second coordinate information in the at least one original coordinate system is obtained when the user is in the second pose.
In some embodiments, this step may be performed by the acquisition module 210. The second posture may be a posture in which a body part (e.g., an arm) of the user where the sensor is located is tilted forward. In some embodiments, the acquisition module 210 (e.g., a sensor) may acquire second coordinate information in the original coordinate system based on a second gesture of the user (e.g., a forward tilt gesture).
In step 1340, a transformation relationship between the at least one original coordinate system and the specific coordinate system is determined according to the first coordinate information, the second coordinate information and the specific coordinate system.
In some embodiments, this step may be performed by processing module 220 and/or processing device 110. In some embodiments, the first rotation matrix may be determined from the first coordinate information corresponding to the first posture. In the first posture, the Euler angles of the specific coordinate system in the X and Y directions (in the Z-Y-X rotation order) are 0, while the Euler angles of the original coordinate system in the X and Y directions are not necessarily 0; the first rotation matrix is therefore the rotation matrix obtained by rotating the original coordinate system reversely about the X axis and then reversely about the Y axis. In some embodiments, the second rotation matrix may be determined from the second coordinate information of the second posture (e.g., the body part where the sensor is located tilts forward). Specifically, in the second posture, the Euler angles of the specific coordinate system in the Y and Z3 directions (in the Z-Y-Z3 rotation order) are 0, while the Euler angles of the original coordinate system in the Y and Z3 directions are not necessarily 0; the second rotation matrix is therefore the rotation matrix obtained by rotating the original coordinate system reversely about the Y direction and then reversely about the Z3 direction. The conversion relationship between the original coordinate system and the specific coordinate system can be determined by the first rotation matrix and the second rotation matrix. In some embodiments, when there are a plurality of original coordinate systems (sensors), the conversion relationship between each original coordinate system and the specific coordinate system may be determined by the method described above.
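By way of illustration only, the first rotation matrix can be sketched as follows, assuming SciPy, a quaternion reading in the sensor's original coordinate system, and a particular composition order (all assumptions for illustration); the second rotation matrix can be obtained analogously from the second posture:

```python
from scipy.spatial.transform import Rotation as R

def first_rotation_matrix(q_standing):
    # From the standing (first) posture reading, undo the X- and Y-direction
    # Euler angles (Z-Y-X order) so that they become zero, as described above.
    z, y, x = R.from_quat(q_standing).as_euler("ZYX")
    return (R.from_euler("Y", -y) * R.from_euler("X", -x)).as_matrix()
```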
It should be noted that the first posture is not limited to a posture in which the user keeps an approximately standing posture, and the second posture is not limited to a posture in which a body part (for example, an arm) of the user where the sensor is located tilts forward, and the first posture and the second posture may be approximately regarded as a posture which is stationary during the calibration process. In some embodiments, the first pose and/or the second pose may also be a pose that is dynamic during calibration. For example, the walking posture of the user is a relatively fixed posture, the angles and angular velocities of the arms, legs and feet during walking can be extracted, the actions of stepping forward, swinging forward arms and the like can be recognized, and the walking posture of the user can be used as a second posture during calibration. In some embodiments, the second gesture is not limited to one action, and a plurality of actions may also be extracted as the second gesture. For example, coordinate information of a plurality of actions is fused, so that a more accurate rotation matrix is obtained.
In some embodiments, the rotation matrix may be dynamically corrected using some signal processing algorithm (e.g., using a kalman filter algorithm) during the calibration process to obtain a better transformation matrix throughout the calibration process.
In some embodiments, some specific actions may be automatically identified using machine learning algorithms, or other algorithms, to update the rotation matrix in real-time. For example, if the machine learning algorithm recognizes that the current user is walking or standing, the calibration process is automatically started, in which case the wearable device does not need an explicit calibration process, and the rotation matrix is dynamically updated during the user's use of the wearable device.
In some embodiments, the installation position of the attitude sensor may be relatively fixed, and a rotation matrix may be preset in the corresponding algorithm, so that the recognition process of a specific action may be more accurate. Furthermore, the rotation matrix is continuously corrected in the process that the user uses the wearable device, so that the obtained rotation matrix is closer to the real condition.
It should be noted that the above description of process 1300 is for illustration and description only and is not intended to limit the scope of applicability of the present description. Various modifications and changes to flow 1300 will be apparent to those skilled in the art in light of this description. However, such modifications and variations are still within the scope of the present specification.
FIG. 14 is an exemplary flow chart illustrating the determination of a transformation relationship between an original coordinate system and a target coordinate system according to some embodiments of the present application. As shown in fig. 14, the process 1400 may include:
in step 1410, a transformation relationship between the specific coordinate system and the target coordinate system is obtained.
In some embodiments, this step may be performed by processing module 220 and/or processing device 110. Both the specific coordinate system and the target coordinate system take the length direction of the human body as the Z axis, so the conversion relationship between the specific coordinate system and the target coordinate system can be obtained from the conversion relationship between the X axis of the specific coordinate system and the X axis of the target coordinate system and the conversion relationship between the Y axis of the specific coordinate system and the Y axis of the target coordinate system. For the principle of obtaining the conversion relationship between the specific coordinate system and the target coordinate system, reference may be made to fig. 13 and its related contents.
In some embodiments, the specific coordinate system may use the length direction of the human body as the Z axis and the front-back direction as the calibrated X axis. Since the front and back directions of the user's body may change during the movement (e.g., turning) and cannot be maintained in the calibrated coordinate system, a coordinate system that can rotate along with the body, i.e., a target coordinate system, needs to be determined. In some embodiments, the target coordinate system may change as the orientation of the user changes, the X-axis of the target coordinate system always being directly in front of the human torso.
In step 1420, a transformation relationship between the at least one original coordinate system and the target coordinate system is determined according to the transformation relationship between the at least one original coordinate system and the specific coordinate system and the transformation relationship between the specific coordinate system and the target coordinate system.
In some embodiments, this step may be performed by processing module 220 and/or processing device 110. In some embodiments, the processing device 110 may determine the transformation relationship between the at least one original coordinate system and the target coordinate system according to the transformation relationship between the at least one original coordinate system and the specific coordinate system determined in the process 1300 and the transformation relationship between the specific coordinate system and the target coordinate system determined in the step 1410, so that the coordinate information in the original coordinate system may be converted into the coordinate information in the target coordinate system.
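The chaining of the two conversion relationships can be written compactly as a product of rotation matrices; the sketch below assumes each relationship is represented as a 3x3 matrix:

```python
import numpy as np

def original_to_target(R_specific_from_original, R_target_from_specific, v_original):
    # original coordinate system -> specific coordinate system -> target coordinate system
    R_target_from_original = R_target_from_specific @ R_specific_from_original
    return R_target_from_original @ np.asarray(v_original)
```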
It should be noted that the above description of the process 1400 is for illustration and description only, and does not limit the applicable scope of the present disclosure. Various modifications and changes to flow 1400 are possible to those skilled in the art, given the benefit of this description. However, such modifications and variations are intended to be within the scope of the present description.
In some embodiments, the position of the gesture sensor disposed on the wearable device 130 may change and/or the installation angle of the gesture sensor on the human body may be different, so that the user performs the same movement, and the gesture data returned by the gesture sensor may be different.
Fig. 15A is an exemplary vector diagram of Euler angle data in the original coordinate system at a position of the human forearm, according to some embodiments of the present application. The frame line portion may represent the Euler angle data (coordinate information) in the original coordinate system corresponding to this position of the forearm when the user performs the same action. As shown in FIG. 15A, within the frame line portion, the Euler angle result in the Z-axis direction (shown as "Z" in FIG. 15A) is approximately in the range of -180°-(-80°), the Euler angle result in the Y-axis direction (shown as "Y" in FIG. 15A) fluctuates approximately around 0°, and the Euler angle result in the X-axis direction (shown as "X" in FIG. 15A) fluctuates approximately around -80°. The fluctuation range here may be 20°.
Fig. 15B is an exemplary vector diagram of Euler angle data in the original coordinate system at another location of the human forearm, according to some embodiments of the present application. The frame line portion may represent the Euler angle data in the original coordinate system corresponding to the other position of the forearm when the user performs the same action as that shown in fig. 15A. As shown in FIG. 15B, within the frame line portion, the Euler angle result in the Z-axis direction (shown as "Z" in FIG. 15B) is approximately in the range of -180°-180°, the Euler angle result in the Y-axis direction (shown as "Y" in FIG. 15B) fluctuates approximately around 0°, and the Euler angle result in the X-axis direction (shown as "X" in FIG. 15B) fluctuates approximately around -150°. The fluctuation range here may be 20°.
The Euler angle data shown in fig. 15A and fig. 15B are the Euler angle data (coordinate information) in the original coordinate system obtained at different positions of the human forearm (which can be understood as different installation angles of the posture sensors at the forearm) when the user performs the same action. Comparing fig. 15A with fig. 15B, it can be seen that when the installation angles of the posture sensors on the human body are different, the Euler angle data in the original coordinate systems returned by the posture sensors may differ greatly even though the user performs the same action. For example, the Euler angle result in the Z-axis direction in fig. 15A is approximately in the range of -180°-(-80°), while the Euler angle result in the Z-axis direction in fig. 15B is approximately in the range of -180°-180°, a large difference between the two.
In some embodiments, euler angle data in the original coordinate system corresponding to the sensors with different installation angles may be converted into euler angle data in the target coordinate system, so as to facilitate analysis of the attitude signals of the sensors with different positions. For illustrative purposes only, the line on which the left arm lies may be abstracted as a unit vector pointing from the elbow to the wrist, the unit vector being a coordinate value within the target coordinate system. The target coordinate system is defined as an axis pointing to the rear of the human body as an X-axis, an axis pointing to the right side of the human body as a Y-axis, and an axis pointing to the upper side of the human body as a Z-axis, and conforms to a right-hand coordinate system. For example, the coordinate value [ -1, 0, 0] in the target coordinate system represents the forward arm lift; the coordinate values [0, -1, 0] of the target coordinate system represent the left-hand supination of the arm. Fig. 16A is an exemplary vector plot of euler angle data in a target coordinate system at a human forearm location according to some embodiments of the present application. Fig. 16A is a graph based on the results obtained after the euler angle data of the forearm in fig. 15A is converted into vector coordinates in the target coordinate system, wherein the box line part may represent the euler angle data in the target coordinate system at the position of the forearm when the user makes an action. As shown in FIG. 16A, the forearm vector [ x, y, z ] in the wire portion reciprocates between a first position [0.2, -0.9, -0.38] and a second position [0.1, -0.95, -0.3 ]. It should be noted that the first and second positions will have a small magnitude of deviation with each reciprocation of the arm.
Fig. 16B is an exemplary vector plot of euler angle data in a target coordinate system at another location of a human forearm, according to some embodiments of the present application. Fig. 16B is a graph based on the results obtained after the euler angle data of the forearm in fig. 15B is converted into vector coordinates in the target coordinate system, wherein the box line part may represent the euler angle data in the target coordinate system at another position of the forearm position when the user makes the same action (the same action as that shown in fig. 16A). As shown in FIG. 16B, the forearm vector [ x, y, z ] likewise reciprocates between a first position [0.2, -0.9, -0.38] and a second position [0.1, -0.95, -0.3 ].
With reference to fig. 15A to fig. 16B, it can be seen from fig. 15A and fig. 15B that, since the installation positions of the two posture sensors are different, the Euler angles in the original coordinate systems differ greatly in value range and fluctuation form; after the coordinate information in the original coordinate systems corresponding to the two posture sensors is respectively converted into vector coordinates in the target coordinate system (for example, the vector coordinates in fig. 16A and fig. 16B), two approximately identical vector coordinates are obtained. That is, this method makes the characteristic information corresponding to the posture signal insensitive to the installation positions of the sensors. Specifically, it can be seen from fig. 16A and fig. 16B that, although the two posture sensors are installed at different positions on the forearm, the same vector coordinates are obtained after the coordinate conversion, which can represent the process of the arm switching back and forth between the first state (the arm lifted to the right) and the second state (the arm lifted forward) during the sitting posture chest clamping exercise.
FIG. 17 is a vector coordinate diagram of limb vectors in the target coordinate system according to some embodiments of the present application. As shown in FIG. 17, from top to bottom are the vector coordinates, in the target coordinate system, of the posture sensors at the left forearm (17-1), the right forearm (17-2), the left upper arm (17-3), the right upper arm (17-4), and the torso (17-5) of the human body. FIG. 17 shows the vector coordinates of each position (e.g., 17-1, 17-2, 17-3, 17-4, 17-5) in the target coordinate system as the human body moves. The first 4200 points in fig. 17 correspond to the calibration movements required to calibrate the limbs, such as standing, leaning the torso forward, raising the arms forward, raising the arms laterally, etc. Based on the calibration actions corresponding to the first 4200 points, the raw data collected by the posture sensors can be converted into Euler angles in the target coordinate system. To facilitate the analysis of the data, the data can be further converted into the coordinate vectors of the arm vectors in the target coordinate system. The target coordinate system here takes the X axis pointing to the front of the torso, the Y axis pointing to the left of the torso, and the Z axis pointing above the torso. The reciprocating motions in fig. 17, from left to right, are motion 1, motion 2, motion 3, motion 4, motion 5, and motion 6, which are sitting posture chest clamping, high-position pull-down, sitting posture chest pushing, sitting posture shoulder pushing, barbell biceps curling, and sitting posture chest clamping, respectively. As can be seen from fig. 17, different motions have different motion patterns and can be clearly recognized by using the limb vectors. Meanwhile, the same action has good repeatability; for example, motion 1 and motion 6 both represent the sitting posture chest clamping action, and the curves of the two motions have good repeatability.
In some embodiments, the pose data (e.g., euler angles, angular velocities, etc.) directly output by the modules of the original coordinate system may be converted into pose data in the target coordinate system through processes 1300 and 1400, such that high consistency pose data (e.g., euler angles, angular velocities, limb vector coordinates, etc.) may be obtained.
FIG. 18A is an exemplary vector diagram of raw angular velocity shown according to some embodiments of the present application. The raw angular velocity here refers to the angular velocity obtained after the posture data in the original coordinate systems of the sensors with different installation angles have been converted into the target coordinate system, before any smoothing. In some embodiments, factors such as jitter during the user's movement may affect the angular velocity result in the posture data. As shown in fig. 18A, under the influence of jitter and the like, the vector coordinate curve of the raw angular velocity is obviously not smooth; for example, abrupt signals exist in the curve. In some embodiments, to deal with the influence of jitter on the angular velocity result, the angular velocity affected by jitter needs to be corrected to obtain a smooth vector coordinate curve. In some embodiments, the raw angular velocity may be filtered using a 1 Hz-3 Hz low-pass filter. FIG. 18B is a graph of exemplary results of the angular velocity after filtering according to some embodiments of the present application. As shown in fig. 18B, after the raw angular velocity is low-pass filtered at 1 Hz-3 Hz, the influence of jitter and the like on the angular velocity (for example, abrupt signals) can be eliminated, so that the vector diagram corresponding to the angular velocity presents a smoother curve. In some embodiments, the 1 Hz-3 Hz low-pass filtering of the angular velocity can effectively avoid the influence of jitter and the like on the posture data (such as Euler angles and angular velocities), which also makes the subsequent signal segmentation more convenient. In some embodiments, the filtering may further remove the power-frequency interference and its harmonics, glitch signals, and the like from the motion signal. It should be noted that the 1 Hz-3 Hz low-pass filtering introduces a system delay, so that the action points obtained from the posture signal and the action points of the real electromyographic signal are misaligned in time; therefore, the system delay generated in the low-pass filtering is subtracted from the vector coordinate curve after filtering, so as to ensure that the posture signal and the electromyographic signal are synchronized in time. In some embodiments, the system delay is related to the center frequency of the filter; when different filters are used to process the posture signal and the electromyographic signal, the system delay is adaptively adjusted according to the center frequency of the filter. In some embodiments, since the Euler angle has an angular range of [-180°, +180°], when the actual Euler angle is outside this range, the obtained Euler angle may jump from -180° to +180° or from +180° to -180°. For example, when the actual angle is -181°, the obtained Euler angle jumps to 179°. In practical applications, such jumps affect the calculation of angle differences and need to be corrected first.
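A minimal sketch of the two corrections discussed above (jump correction and 1 Hz-3 Hz low-pass filtering) is given below; it assumes NumPy/SciPy and a 100 Hz posture sampling rate, and it uses zero-phase filtering, which is one possible way to avoid re-aligning the posture signal with the electromyographic signal afterwards:

```python
import numpy as np
from scipy.signal import butter, sosfiltfilt

def clean_pose_trace(euler_deg, fs=100, cutoff=2.0):
    # First remove the artificial +/-180 degree jumps, then low-pass the trace
    # at about 2 Hz (within the 1 Hz-3 Hz range mentioned above).
    unwrapped = np.degrees(np.unwrap(np.radians(np.asarray(euler_deg, dtype=float))))
    sos = butter(2, cutoff, btype="low", fs=fs, output="sos")
    return sosfiltfilt(sos, unwrapped)
```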
In some embodiments, the motion recognition model may be further utilized to analyze the motion signal of the user or feature information corresponding to the motion signal, so as to recognize the user motion. In some embodiments, the motion recognition model comprises a machine learning model trained to recognize user motion. In some embodiments, the motion recognition model may include one or more machine learning models. In some embodiments, the motion recognition model may include, but is not limited to, one or more of a machine learning model that classifies the user motion signal, a machine learning model that recognizes the quality of the user motion, a machine learning model that recognizes the number of times the user motion is made, a machine learning model that recognizes the degree of fatigue of the user performing the motion. In some embodiments, the machine learning model may include one or more of a linear classification model (LR), a support vector machine model (SVM), a naive bayes model (NB), a K-nearest neighbor model (KNN), a decision tree model (DT), an integrated model (RF/GDBT, etc.), and the like. Reference may be made to the motion recognition model elsewhere in this specification, such as fig. 20 and its associated description.
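As an illustration of such a machine learning model, the sketch below trains a support vector machine on hypothetical feature vectors; the features, labels and values are invented for illustration and are not part of this specification:

```python
from sklearn.svm import SVC

# Hypothetical per-segment features, e.g., EMG amplitude, EMG mean frequency,
# and an angular-velocity feature of the posture signal.
X_train = [[0.8, 55.0, 1.2], [0.2, 40.0, 0.3], [0.9, 60.0, 1.1], [0.1, 35.0, 0.2]]
y_train = ["sitting posture chest clamping", "rest",
           "sitting posture chest clamping", "rest"]

action_type_model = SVC(kernel="rbf").fit(X_train, y_train)
print(action_type_model.predict([[0.85, 58.0, 1.0]]))  # likely the chest clamping class
```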
FIG. 19 is an exemplary flow diagram of a motion monitoring and feedback method according to some embodiments of the present application. As shown in fig. 19, flow 1900 may include:
in step 1910, a motion signal is obtained while the user is moving.
In some embodiments, this step may be performed by the acquisition module 210. In some embodiments, the action signal at least comprises characteristic information corresponding to the electromyographic signal and characteristic information corresponding to the posture signal. The motion signal refers to human body parameter information when the user moves. In some embodiments, the body parameter information may include, but is not limited to, one or more of an electromyographic signal, a posture signal, a heart rate signal, a temperature signal, a humidity signal, a blood oxygen concentration, and the like. In some embodiments, the motion signal may include at least a myoelectric signal and a posture signal. In some embodiments, the electromyographic sensor in the obtaining module 210 may collect an electromyographic signal of the user when the user moves, and the gesture sensor in the obtaining module 210 may collect a gesture signal of the user when the user moves.
In step 1920, the motion recognition model monitors the motion of the user based on the motion signal, and performs motion feedback based on the output result of the motion recognition model.
In some embodiments, this step may be performed by processing module 220 and/or processing device 110. In some embodiments, the output results of the motion recognition model may include, but are not limited to, one or more of an action type, an action quality, an action quantity, a fatigue index, and the like. For example, the motion recognition model may recognize the user's action type as sitting posture chest clamping from the motion signal. For another example, one machine learning model in the motion recognition model may recognize the user's action type as sitting posture chest clamping from the motion signal, and another machine learning model may output the quality of the user's action as a standard action or a wrong action according to the motion signal (e.g., the amplitude information and frequency information of the electromyographic signal, and/or the angular velocity, angular velocity direction, and acceleration value of the angular velocity of the posture signal). In some embodiments, the action feedback may include issuing prompt information. In some embodiments, the prompt information may include, but is not limited to, voice prompts, text prompts, image prompts, video prompts, and the like. For example, if the output of the motion recognition model indicates a wrong action, the processing device 110 may control the wearable device 130 or the mobile terminal device 140 to issue a voice prompt (e.g., "the action is not standard") to remind the user to adjust the exercise action in time. For another example, if the output result of the motion recognition model is a standard action, the wearable device 130 or the mobile terminal device 140 may not issue prompt information, or may issue prompt information similar to "the action is standard". In some embodiments, the motion feedback may also include stimulating the corresponding moving part of the user through the wearable device 130. For example, elements of the wearable device 130 may stimulate the corresponding part of the user's motion by way of vibration feedback, electrical stimulation feedback, pressure feedback, and the like; for instance, when the output of the motion recognition model is a wrong action, the processing device 110 may control elements of the wearable device 130 to stimulate the corresponding moving part of the user. In some embodiments, the motion feedback may also include outputting a motion record of the user's movement. The motion record may refer to one or more of the user's action type, exercise duration, action quantity, action quality, fatigue index, physiological parameter information during exercise, and the like. For the content of the motion recognition model, reference may be made to the description elsewhere in this application, which is not repeated here.
It should be noted that the above description related to the flow 1900 is only for illustration and description, and does not limit the application scope of the present specification. Various modifications and changes to flow 1900 may occur to those skilled in the art, in light of the present description. However, such modifications and variations are intended to be within the scope of the present description.
FIG. 20 is an exemplary flow chart of an application of model training according to some embodiments shown herein. As shown in fig. 20, the process 2000 may include:
In step 2010, sample information is obtained.
In some embodiments, this step may be performed by the acquisition module 210. In some embodiments, the sample information may include action signals of a professional (e.g., a fitness trainer) and/or a non-professional during motion. For example, the sample information may include electromyographic signals and/or posture signals generated by professionals and/or non-professionals when performing the same type of action (e.g., sitting chest clamping). In some embodiments, the electromyographic signals and/or posture signals in the sample information may be subjected to the segmentation processing of process 700, the burr processing of process 900, the conversion processing of process 1300, and the like to form at least one segment of electromyographic signal and/or posture signal. The at least one segment of electromyographic signal and/or posture signal may be used as an input of a machine learning model to train the machine learning model. In some embodiments, feature information corresponding to the at least one segment of electromyographic signal and/or feature information corresponding to the posture signal may also be used as an input of the machine learning model to train the machine learning model. For example, frequency information and amplitude information of the electromyographic signal may be used as inputs of the machine learning model. For another example, the angular velocity, angular velocity direction, and angular velocity acceleration value of the posture signal may be used as inputs of the machine learning model. For another example, the action start point, action intermediate point, and action end point of the action signal may be used as inputs of the machine learning model. In some embodiments, the sample information may be obtained from a storage device of the processing device 110. In some embodiments, the sample information may be obtained from the acquisition module 210.
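Merely as an illustrative sketch, the following snippet shows how feature information (amplitude, frequency, and angular velocity statistics) could be assembled into a model input vector for one segmented action signal; the specific feature choices, the use of NumPy, and the function name segment_features are assumptions for illustration only.

```python
# Illustrative sketch only: building a model input vector from one segmented action signal.
# emg and angular_velocity are 1-D arrays for a single segment; fs is the sampling rate (Hz).
import numpy as np

def segment_features(emg: np.ndarray, angular_velocity: np.ndarray, fs: float) -> np.ndarray:
    # Amplitude information of the electromyographic signal: mean absolute value and RMS.
    mav = np.mean(np.abs(emg))
    rms = np.sqrt(np.mean(emg ** 2))
    # Frequency information: mean power frequency computed from the spectrum.
    spectrum = np.abs(np.fft.rfft(emg)) ** 2
    freqs = np.fft.rfftfreq(emg.size, d=1.0 / fs)
    mpf = np.sum(freqs * spectrum) / np.sum(spectrum)
    # Posture information: angular velocity statistics and its acceleration (derivative).
    ang_acc = np.diff(angular_velocity) * fs
    return np.array([mav, rms, mpf,
                     angular_velocity.mean(), angular_velocity.max(),
                     np.abs(ang_acc).max()])

rng = np.random.default_rng(0)  # placeholder data standing in for one segment of sample information
x = segment_features(rng.standard_normal(1000), rng.standard_normal(1000), fs=1000.0)
print(x.shape)  # (6,) feature vector usable as machine learning model input
```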
In step 2020, a motion recognition model is trained.
This step may be performed by the processing device 110. In some embodiments, the motion recognition model may include one or more machine learning models. For example, the motion recognition model may include, but is not limited to, one or more of a machine learning model for classifying the user action signal, a machine learning model for recognizing the action quality of the user, a machine learning model for recognizing the number of actions of the user, and a machine learning model for recognizing the fatigue index of the user performing the action. In some embodiments, the machine learning model may include one or more of a linear classification model (LR), a support vector machine model (SVM), a naive Bayes model (NB), a K-nearest neighbor model (KNN), a decision tree model (DT), an ensemble model (RF/GBDT, etc.), and the like.
In some embodiments, training of the machine learning model may include obtaining sample information. In some embodiments, the sample information may include action signals of a professional (e.g., a fitness trainer) and/or a non-professional during motion. For example, the sample information may include electromyographic signals and/or posture signals generated by a professional and/or non-professional when performing the same type of action (e.g., sitting chest clamping). In some embodiments, the electromyographic signals and/or posture signals in the sample information may be subjected to the segmentation processing of process 700, the burr processing of process 900, the conversion processing of process 1300, and the like to form at least one segment of electromyographic signal and/or posture signal. The at least one segment of electromyographic signal and/or posture signal may be used as an input of the machine learning model to train the machine learning model. In some embodiments, feature information corresponding to the at least one segment of electromyographic signal and/or feature information corresponding to the posture signal may also be used as an input of the machine learning model to train the machine learning model. For example, frequency information and amplitude information of the electromyographic signal may be used as inputs of the machine learning model. For another example, the angular velocity, angular velocity direction, and angular velocity acceleration value of the posture signal may be used as inputs of the machine learning model. For another example, signals (including electromyographic signals and/or posture signals) corresponding to the action start point, action intermediate point, and/or action end point of the action signal may be used as inputs of the machine learning model.
In some embodiments, when training the machine learning model for recognizing the action type of the user, sample information (each segment of electromyographic signal and/or posture signal) from different action types may be labeled. For example, the electromyographic signal and/or posture signal generated when the user performs sitting chest clamping may be labeled "1", where "1" is used to characterize "sitting chest clamping"; the electromyographic signal and/or posture signal generated when the user performs a biceps curl may be labeled "2", where "2" is used to characterize "biceps curl". The electromyographic signals corresponding to different action types have different feature information (e.g., frequency information and amplitude information), and the posture signals have different feature information (e.g., angular velocity, angular velocity direction, and angular velocity acceleration value). The labeled sample information (e.g., the feature information corresponding to the electromyographic signals and/or posture signals in the sample information) is used as the input of the machine learning model to train the machine learning model, so that a motion recognition model for recognizing the action type of the user can be obtained; inputting an action signal into this model can output the corresponding action type.
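Merely as an illustrative sketch of the labeling scheme described above, the following snippet trains a classifier on feature vectors labeled "1" (sitting chest clamping) and "2" (biceps curl); the use of scikit-learn, the SVM choice, and the synthetic placeholder data are assumptions for illustration, and any of the candidate model families listed earlier could be substituted.

```python
# Illustrative sketch only: training an action-type classifier on labeled sample information.
# Labels follow the scheme above: 1 = "sitting chest clamping", 2 = "biceps curl".
import numpy as np
from sklearn.model_selection import train_test_split
from sklearn.svm import SVC

rng = np.random.default_rng(1)
# Placeholder sample information: 100 feature vectors (6 features each) per action type.
X = np.vstack([rng.normal(0.0, 1.0, (100, 6)),   # segments labeled "1"
               rng.normal(2.0, 1.0, (100, 6))])  # segments labeled "2"
y = np.array([1] * 100 + [2] * 100)

X_train, X_test, y_train, y_test = train_test_split(X, y, test_size=0.25, random_state=0)
model = SVC(kernel="rbf")        # one of the candidate model families listed above
model.fit(X_train, y_train)
print("action-type accuracy:", model.score(X_test, y_test))
```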
In some embodiments, the motion recognition model may further include a machine learning model for determining the action quality of the user. The sample information here may include standard action signals (also referred to as positive samples) and non-standard action signals (also referred to as negative samples). The standard action signal may include an action signal generated by a professional when performing a standard action. For example, the action signal generated by a professional performing standard sitting chest clamping is a standard action signal. The non-standard action signal may include an action signal generated by a user performing a non-standard action (e.g., an erroneous action). In some embodiments, the electromyographic signals and/or posture signals in the sample information may be subjected to the segmentation processing of process 700, the burr processing of process 900, the conversion processing of process 1300, and the like to form at least one segment of electromyographic signal and/or posture signal. The at least one segment of electromyographic signal and/or posture signal may be used as an input of the machine learning model to train the machine learning model. In some embodiments, the positive samples and negative samples in the sample information (each segment of electromyographic signal and/or posture signal) may be labeled. For example, the positive samples are labeled "1" and the negative samples are labeled "0", where "1" is used to characterize that the user's action is a standard action, and "0" is used to characterize that the user's action is an erroneous action. The trained machine learning model may output different labels based on the input sample information (e.g., positive samples, negative samples). It should be noted that the motion recognition model may include one or more machine learning models for analyzing and recognizing the action quality of the user, and different machine learning models may respectively analyze and recognize sample information from different action types.
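Merely as an illustrative sketch of the positive/negative labeling described above, the snippet below fits a binary classifier that outputs "1" for a standard action and "0" for an erroneous action; the logistic regression choice and the synthetic data are assumptions for illustration.

```python
# Illustrative sketch only: quality model with positive ("1" = standard) and
# negative ("0" = erroneous) samples.
import numpy as np
from sklearn.linear_model import LogisticRegression

rng = np.random.default_rng(2)
X_pos = rng.normal(0.0, 1.0, (80, 6))  # feature vectors from professionals (standard actions)
X_neg = rng.normal(1.5, 1.0, (80, 6))  # feature vectors from non-standard (erroneous) actions
X = np.vstack([X_pos, X_neg])
y = np.array([1] * 80 + [0] * 80)

quality_model = LogisticRegression(max_iter=1000).fit(X, y)
# The trained model outputs the quality label for new user action signals.
print(quality_model.predict(X[:3]))
```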
In some embodiments, the motion recognition model may also include a model for recognizing the number of actions in the user's fitness activity. For example, the action signal (e.g., the electromyographic signal and/or posture signal) in the sample information is segmented through process 700 to obtain at least one set of an action start point, an action intermediate point, and an action end point, and each set of points is labeled: for example, the action start point is labeled 1, the action intermediate point is labeled 2, and the action end point is labeled 3. These labels are used as the input of the machine learning model; inputting one consecutive set of "1", "2", "3" into the machine learning model can output 1 action. For example, inputting 3 consecutive sets of "1", "2", "3" into the machine learning model can output 3 actions.
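The counting rule described above (one action per consecutive start/intermediate/end triple) can also be expressed directly as a small rule-based routine; the sketch below is illustrative only, and the function name count_actions is a hypothetical name.

```python
# Illustrative sketch only: counting actions from labeled feature points, where
# 1 = action start point, 2 = action intermediate point, 3 = action end point.
def count_actions(labels):
    count, expected = 0, 1
    for label in labels:
        if label == expected:
            if label == 3:        # a full 1 -> 2 -> 3 sequence completes one action
                count += 1
                expected = 1
            else:
                expected += 1
        elif label == 1:          # sequence broken: restart from a new action start point
            expected = 2
        else:
            expected = 1
    return count

print(count_actions([1, 2, 3, 1, 2, 3, 1, 2, 3]))  # -> 3 actions
```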
In some embodiments, the motion recognition model may also include a machine learning model for recognizing the fatigue index of the user. The sample information may further include other physiological parameter signals such as an electrocardiographic signal, a respiratory rate signal, a temperature signal, a humidity signal, and the like. For example, different frequency ranges of the electrocardiographic signal may be used as input data of the machine learning model; a frequency of the electrocardiographic signal between 60 beats/min and 100 beats/min may be labeled "1" (normal), and a frequency below 60 beats/min or above 100 beats/min may be labeled "2" (abnormal). In some embodiments, a finer segmentation may be made according to the frequency of the user's electrocardiographic signal, with different indexes used as labels of the input data, and the trained machine learning model may output the corresponding fatigue index according to the frequency of the electrocardiographic signal. In some embodiments, the machine learning model may also be trained in combination with physiological parameter signals such as the respiratory rate, the temperature signal, and the like. In some embodiments, the sample information may be obtained from a storage device of the processing device 110. In some embodiments, the sample information may be obtained from the acquisition module 210. It should be noted that the motion recognition model may be any one of the above machine learning models, a combination of several of the above machine learning models, or may include other machine learning models, selected according to the actual situation. The training input of the machine learning model is not limited to an action signal of a single stage (one cycle); it may also be a partial action signal within a single-stage signal, a multi-stage action signal, or the like.
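Merely as an illustrative sketch of the labeling described for the fatigue model, the snippet below maps the frequency of the electrocardiographic signal (beats per minute) to a coarse label and a finer index; the bands above 100 beats/min and the index values are assumptions for illustration and are not taken from the disclosure.

```python
# Illustrative sketch only: labeling fatigue from the frequency of the ECG signal (beats/min).
def fatigue_label(heart_rate_bpm: float) -> int:
    if 60 <= heart_rate_bpm <= 100:
        return 1   # "1": normal range
    return 2       # "2": abnormal (below 60 or above 100 beats/min)

# A finer segmentation marking several indexes; the bands above 100 beats/min are assumed.
def fatigue_index(heart_rate_bpm: float) -> int:
    bands = [(100, 0), (120, 1), (140, 2), (160, 3)]
    for upper, index in bands:
        if heart_rate_bpm <= upper:
            return index
    return 4

print(fatigue_label(85), fatigue_index(130))  # -> 1 2
```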
In step 2030, a motion recognition model is extracted.
In some embodiments, this step may be performed by the processing device 110. In some embodiments, the processing device 110 and/or the processing module 220 may extract the motion recognition model. In some embodiments, the motion recognition model may be stored in the processing device 110, the processing module 220, or the mobile terminal.
In step 2040, a user action signal is acquired.
In some embodiments, this step may be performed by the acquisition module 210. For example, in some embodiments, the electromyographic sensor in the acquisition module 210 may acquire the electromyographic signal of the user, and the posture sensor in the acquisition module 210 may acquire the posture signal of the user. In some embodiments, the user action signal may further include other physiological parameter signals during motion, such as an electrocardiographic signal, a respiration signal, a temperature signal, a humidity signal, and the like. In some embodiments, after being acquired, the action signal (e.g., electromyographic signal and/or posture signal) may be subjected to the segmentation processing of process 700, the burr processing of process 900, and the conversion processing of process 1300 to form at least one segment of electromyographic signal and/or posture signal.
In step 2050, a user action is determined based on the user action signal by the action recognition model.
This step may be performed by the processing device 110 and/or the processing module 220. In some embodiments, the processing device 110 and/or the processing module 220 may determine the user action based on the motion recognition model. In some embodiments, the trained motion recognition model may include one or more machine learning models. In some embodiments, the motion recognition model may include, but is not limited to, one or more of a machine learning model for classifying the user action signal, a machine learning model for recognizing the action quality of the user, a machine learning model for recognizing the number of actions of the user, and a machine learning model for recognizing the fatigue index of the user performing the action. Different machine learning models may produce different recognition results. For example, the machine learning model that classifies the user action signal may take the user action signal as input data and output the corresponding action type. For another example, the machine learning model that recognizes the action quality of the user may take the user action signal as input data and output the action quality (e.g., standard action, erroneous action). For another example, the machine learning model that recognizes the fatigue index of the user may take the user action signal (e.g., the frequency of the electrocardiographic signal) as input data and output the fatigue index of the user. In some embodiments, the user action signal and the determination result (output) of the machine learning model may also be used as sample information for training the motion recognition model, so as to optimize relevant parameters of the motion recognition model. It should be noted that the motion recognition model is not limited to the trained machine learning model described above; it may also be a preset model, for example, a manually preset condition judgment algorithm, or a manually added parameter (e.g., a confidence) on the basis of the trained machine learning model.
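Merely as an illustrative sketch of adding a manually set parameter (e.g., a confidence) on top of a trained machine learning model, the snippet below only accepts a predicted action type when the model's probability exceeds a preset threshold; the scikit-learn-style predict_proba interface and the 0.6 threshold are assumptions for illustration.

```python
# Illustrative sketch only: a manually preset confidence requirement layered on top of a
# trained classifier; assumes a scikit-learn style predict_proba/classes_ interface.
import numpy as np

def recognize_action_type(features: np.ndarray, type_model, min_confidence: float = 0.6):
    """Return the predicted action type, or None when the model is not confident enough."""
    x = features.reshape(1, -1)
    probabilities = type_model.predict_proba(x)[0]
    best = int(np.argmax(probabilities))
    if probabilities[best] < min_confidence:  # preset parameter added on top of the model
        return None                           # treat as unrecognized rather than guessing
    return type_model.classes_[best]
```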
In step 2060, the user action is fed back based on the determination result.
In some embodiments, this step may be performed by the wearable device 130 and/or the mobile terminal device 140. Further, the processing device 110 and/or the processing module 220 issues a feedback instruction to the wearable device 130 and/or the mobile terminal device 140 based on the determination result of the user action, and the wearable device 130 and/or the mobile terminal device 140 provides feedback to the user based on the feedback instruction. In some embodiments, the feedback may include issuing prompt information (e.g., text information, picture information, video information, voice information, indicator light information, etc.) and/or stimulating the user's body by performing a corresponding action (e.g., current stimulation, vibration, pressure change, heat change, etc.). For example, when the user performs a sit-up, the action signal of the user is monitored and it is determined that the trapezius muscle exerts excessive force during the exercise (that is, the head and neck actions of the user are not standard); in this case, the input/output module 260 (e.g., a vibration prompter) in the wearable device 130 and the mobile terminal device 140 (e.g., a smart watch, a smart phone, etc.) perform a corresponding feedback action (e.g., applying vibration to a body part of the user, issuing a voice prompt, etc.) to prompt the user to adjust the force-exerting part in time. In some embodiments, during the user's motion, the action signal of the user is monitored to determine the action type, action quality, and number of actions of the user, and the mobile terminal device 140 may output a corresponding motion record, so that the user can learn about his or her motion condition during the exercise.
In some embodiments, when feedback is provided to the user, the feedback may be matched to the user's perception. For example, when the user's action is not standard, vibration stimulation may be applied to the body area corresponding to the action, so that the user knows from the vibration stimulation that the action is not standard, while the vibration stimulation remains within a range acceptable to the user. Further, a matching model may be built based on the user action signal and the user perception to find the best balance point between user perception and true feedback.
In some embodiments, the motion recognition model may also be trained from the user motion signals. In some embodiments, training the motion recognition model from the user motion signal may include evaluating the user motion signal to determine a confidence level of the user motion signal. The magnitude of the confidence level may be indicative of the quality of the user action signal. For example, the higher the confidence, the better the quality of the user action signal. In some embodiments, the evaluation of the user motion signal may be performed during the stages of acquiring the motion signal, preprocessing, segmenting and/or identifying.
In some embodiments, training the motion recognition model according to the user action signal may further include determining whether the confidence level is greater than a confidence threshold (e.g., 80). If the confidence level is greater than or equal to the confidence threshold, the user action signal corresponding to that confidence level is used as sample data to train the motion recognition model; if the confidence level is less than the confidence threshold, the user action signal corresponding to that confidence level is not used as sample data to train the motion recognition model. In some embodiments, the confidence level may include, but is not limited to, the confidence level at any one stage of acquiring the action signal, signal preprocessing, signal segmentation, or signal recognition. For example, the confidence level of the action signal collected by the acquisition module 210 may be used as the judgment criterion. In some embodiments, the confidence level may also be a joint confidence level of any several of the stages of acquiring the action signal, signal preprocessing, signal segmentation, signal recognition, and the like. The joint confidence level may be calculated based on the confidence level of each stage, for example, by averaging or weighting. In some embodiments, training the motion recognition model according to the user action signal may be performed in real time, periodically (e.g., every day, week, or month), or once a certain amount of data has been accumulated.
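Merely as an illustrative sketch of the confidence-gated retraining described above, the snippet below combines per-stage confidences into a joint confidence by a weighted average and keeps only samples meeting the threshold (e.g., 80); the helper names and the example values are assumptions for illustration.

```python
# Illustrative sketch only: confidence-gated selection of user action signals for retraining.
def joint_confidence(stage_confidences, weights=None):
    # Combine per-stage confidences (acquisition, preprocessing, segmentation, recognition)
    # into a joint confidence by a weighted average; equal weights by default.
    if weights is None:
        weights = {stage: 1.0 for stage in stage_confidences}
    total = sum(weights[s] for s in stage_confidences)
    return sum(stage_confidences[s] * weights[s] for s in stage_confidences) / total

def select_training_samples(samples, confidence_threshold=80.0):
    # Keep only signals whose joint confidence reaches the threshold (e.g. 80).
    selected = []
    for signal, stage_confidences in samples:
        if joint_confidence(stage_confidences) >= confidence_threshold:
            selected.append(signal)   # used as sample data for retraining the model
    return selected

samples = [("segment_a", {"acquisition": 90, "segmentation": 85, "recognition": 88}),
           ("segment_b", {"acquisition": 60, "segmentation": 70, "recognition": 65})]
print(select_training_samples(samples))  # -> ['segment_a']
```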
It should be noted that the above description of process 2000 is only for illustration and description, and does not limit the application scope of the present specification. Various modifications and changes to process 2000 may occur to those skilled in the art in light of the present description. However, such modifications and variations are intended to be within the scope of the present description.
Having thus described the basic concept, it will be apparent to those skilled in the art that the foregoing detailed disclosure is to be considered merely illustrative and not restrictive of the broad application. Various modifications, improvements and adaptations to the present application may occur to those skilled in the art, although not explicitly described herein. Such alterations, modifications, and improvements are intended to be suggested herein and are intended to be within the spirit and scope of the exemplary embodiments of this application.
Also, this application uses specific language to describe embodiments of the application. Reference throughout this specification to "one embodiment," "an embodiment," and/or "some embodiments" means that a particular feature, structure, or characteristic described in connection with at least one embodiment of the present application is included in at least one embodiment of the present application. Therefore, it is emphasized and should be appreciated that two or more references to "an embodiment" or "one embodiment" or "an alternative embodiment" in various places throughout this specification are not necessarily all referring to the same embodiment. Furthermore, some features, structures, or characteristics of one or more embodiments of the present application may be combined as appropriate.
Moreover, those skilled in the art will appreciate that aspects of the present application may be illustrated and described in terms of several patentable classes or contexts, including any new and useful combination of processes, machines, manufactures, or materials, or any new and useful improvement thereof. Accordingly, aspects of the present application may be implemented entirely in hardware, entirely in software (including firmware, resident software, micro-code, etc.), or in a combination of hardware and software. The above hardware or software may be referred to as a "data block," "module," "engine," "unit," "component," or "system." Furthermore, aspects of the present application may be embodied as a computer product, including computer readable program code, embodied in one or more computer readable media.
The computer storage medium may comprise a propagated data signal with the computer program code embodied therewith, for example, on baseband or as part of a carrier wave. The propagated signal may take any of a variety of forms, including electromagnetic, optical, etc., or any suitable combination. A computer storage medium may be any computer-readable medium that can communicate, propagate, or transport a program for use by or in connection with an instruction execution system, apparatus, or device. Program code located on a computer storage medium may be propagated over any suitable medium, including radio, cable, fiber optic cable, RF, or the like, or any combination of the preceding.
Computer program code required for the operation of various portions of the present application may be written in any one or more programming languages, including an object-oriented programming language such as Java, Scala, Smalltalk, Eiffel, JADE, Emerald, C++, C#, VB.NET, or Python, a conventional procedural programming language such as C, Visual Basic, Fortran 2003, Perl, COBOL 2002, PHP, or ABAP, a dynamic programming language such as Python, Ruby, or Groovy, or other programming languages. The program code may execute entirely on the user's computer, partly on the user's computer, as a stand-alone software package, partly on the user's computer and partly on a remote computer, or entirely on the remote computer or processing device. In the latter scenario, the remote computer may be connected to the user's computer through any form of network, such as a local area network (LAN) or a wide area network (WAN), or the connection may be made to an external computer (for example, through the Internet), or in a cloud computing environment, or as a service, such as software as a service (SaaS).
Additionally, the order in which elements and sequences of the processes described herein are processed, the use of alphanumeric characters, or the use of other designations, is not intended to limit the order of the processes and methods described herein, unless explicitly claimed. While various presently contemplated embodiments of the invention have been discussed in the foregoing disclosure by way of example, it is to be understood that such detail is solely for that purpose and that the appended claims are not limited to the disclosed embodiments, but, on the contrary, are intended to cover all modifications and equivalent arrangements that are within the spirit and scope of the embodiments herein. For example, although the system components described above may be implemented by hardware devices, they may also be implemented by software-only solutions, such as installing the described system on an existing processing device or mobile device.
Similarly, it should be noted that in the preceding description of embodiments of the application, various features are sometimes grouped together in a single embodiment, figure, or description thereof for the purpose of streamlining the disclosure and aiding in the understanding of one or more of the embodiments. This method of disclosure, however, is not to be interpreted as reflecting an intention that the claimed subject matter requires more features than are expressly recited in each claim. Indeed, the claimed subject matter may be characterized as having less than all of the features of a single embodiment disclosed above.
Numerals describing the number of components, attributes, and the like are used in some embodiments; it should be understood that such numerals used in the description of the embodiments are modified in some instances by the modifier "about," "approximately," or "substantially." Unless otherwise indicated, "about," "approximately," or "substantially" indicates that the number allows a variation of ±20%. Accordingly, in some embodiments, the numerical parameters set forth in the specification and claims are approximations that may vary depending upon the desired properties sought to be obtained by a particular embodiment. In some embodiments, the numerical parameters should take into account the specified significant digits and employ a general digit-retention approach. Notwithstanding that the numerical ranges and parameters setting forth the broad scope of the ranges are approximations, in the specific examples such numerical values are set forth as precisely as practicable.
The entire contents of each patent, patent application publication, and other material cited in this application, such as articles, books, specifications, publications, documents, and the like, are hereby incorporated by reference into this application, except for any such material that is inconsistent with or in conflict with the content of this application, and except for any such material that would limit the broadest scope of the claims now or later associated with this application. It is noted that if there is any inconsistency or conflict between the descriptions, definitions, and/or use of terms in this application and the incorporated material, the descriptions, definitions, and/or use of terms in this application shall control.
Finally, it should be understood that the embodiments described herein are merely illustrative of the principles of the embodiments of the present application. Other variations are also possible within the scope of the present application. Thus, by way of example, and not limitation, alternative configurations of the embodiments of the present application can be viewed as being consistent with the teachings of the present application. Accordingly, the embodiments of the present application are not limited to only those embodiments explicitly described and depicted herein.

Claims (23)

1. A method of motion monitoring, comprising:
acquiring a motion signal of a user during motion, wherein the motion signal at least comprises an electromyographic signal or a posture signal; and
monitoring the motion of the user based on at least characteristic information corresponding to the electromyographic signal or characteristic information corresponding to the posture signal.
2. The method of claim 1, wherein monitoring the motion of the user based on at least the characteristic information corresponding to the electromyographic signal or the characteristic information corresponding to the posture signal comprises:
segmenting the motion signal based on the characteristic information corresponding to the electromyographic signal or the characteristic information corresponding to the posture signal; and
monitoring the motion of the user based on at least one segment of the motion signal.
3. The method according to claim 2, wherein the characteristic information corresponding to the electromyographic signal at least comprises frequency information or amplitude information, and the characteristic information corresponding to the posture signal at least comprises one of an angular velocity direction, an angular velocity value, an acceleration value of the angular velocity, an angle, displacement information, and stress.
4. The method according to claim 3, wherein segmenting the motion signal based on the characteristic information corresponding to the electromyographic signal or the characteristic information corresponding to the posture signal comprises:
determining at least one target feature point from a time domain window based on the electromyographic signal or the posture signal according to a preset condition; and
segmenting the motion signal based on the at least one target feature point.
5. The method of claim 4, wherein the at least one target feature point comprises one of an action start point, an action intermediate point, and an action end point.
6. The method according to claim 5, wherein the preset condition comprises one or more of: the angular velocity direction corresponding to the posture signal changing, the angular velocity corresponding to the posture signal being greater than or equal to an angular velocity threshold, the change of the angular velocity value corresponding to the posture signal reaching an extreme value, the angle corresponding to the posture signal reaching an angle threshold, and the amplitude information corresponding to the electromyographic signal being greater than or equal to an electromyographic threshold.
7. The method according to claim 6, wherein the preset condition further comprises that the acceleration of the angular velocity corresponding to the posture signal is continuously greater than or equal to an angular velocity acceleration threshold within a first specific time range.
8. The method according to claim 6, wherein the preset condition further comprises that the amplitude corresponding to the electromyographic signal is continuously greater than the electromyographic threshold within a second specific time range.
9. The method according to claim 1, wherein the monitoring the motion of the user based on at least the characteristic information corresponding to the electromyographic signal or the characteristic information corresponding to the posture signal comprises:
preprocessing the electromyographic signals in a frequency domain or a time domain; and
acquiring characteristic information corresponding to the electromyographic signals based on the preprocessed electromyographic signals, and monitoring the movement of the user according to the characteristic information corresponding to the electromyographic signals or the characteristic information corresponding to the posture signals.
10. The method according to claim 9, wherein the preprocessing the electromyographic signal in the frequency domain or the time domain comprises: filtering the electromyographic signal in the frequency domain to select components of the electromyographic signal within a specific frequency range.
11. The method according to claim 9, characterized in that said pre-processing of the electromyographic signals in the frequency or time domain comprises a signal correction processing of the electromyographic signals in the time domain.
12. The method according to claim 11, wherein the signal correction processing on the electromyogram signal in the time domain comprises:
determining a singular point in the electromyographic signal, wherein the singular point corresponds to a mutation signal in the electromyographic signal; and
performing signal correction processing on the singular point of the electromyographic signal.
13. The method according to claim 12, wherein the signal correction processing of the singular point of the electromyographic signal comprises removing the singular point or modifying the singular point according to a signal around the singular point.
14. The method of claim 12, wherein the singular point comprises a burr signal, and the determining the singular point in the electromyographic signal comprises:
selecting different time windows from the time domain window of the electromyographic signal based on the time domain window of the electromyographic signal, wherein the different time windows respectively cover different time ranges; and
determining the burr signal based on the characteristic information corresponding to the electromyographic signal within the different time windows.
15. The method of claim 1, further comprising determining characteristic information corresponding to the posture signal based on the posture signal, wherein the posture signal comprises coordinate information in at least one original coordinate system;
the determining the characteristic information corresponding to the posture signal based on the posture signal comprises:
acquiring a target coordinate system and a conversion relation between the target coordinate system and the at least one original coordinate system;
converting the coordinate information in the at least one original coordinate system into coordinate information in the target coordinate system based on the conversion relation; and
determining the characteristic information corresponding to the posture signal based on the coordinate information in the target coordinate system.
16. The method of claim 15, wherein the posture signal comprises coordinate information generated by at least two sensors that are respectively located at different moving parts of the user and correspond to different original coordinate systems, and wherein determining the characteristic information corresponding to the posture signal based on the posture signal comprises:
determining characteristic information respectively corresponding to the at least two sensors based on the conversion relation between the different original coordinate systems and the target coordinate system; and
determining relative motion between different moving parts of the user based on the characteristic information corresponding to the at least two sensors, respectively.
17. The method according to claim 15, wherein the transformation relationship between the at least one original coordinate system and the target coordinate system is obtained by a calibration process comprising:
constructing a specific coordinate system, wherein the specific coordinate system is related to the orientation of a user in the calibration process;
acquiring first coordinate information in the at least one original coordinate system when the user is in a first posture;
acquiring second coordinate information of the at least one original coordinate system when the user is in a second posture; and
determining the conversion relation between the at least one original coordinate system and the specific coordinate system according to the first coordinate information, the second coordinate information, and the specific coordinate system.
18. The method of claim 17, wherein the calibration process further comprises:
acquiring a conversion relation between the specific coordinate system and the target coordinate system; and
determining the conversion relation between the at least one original coordinate system and the target coordinate system according to the conversion relation between the at least one original coordinate system and the specific coordinate system and the conversion relation between the specific coordinate system and the target coordinate system.
19. The method of claim 15, wherein the target coordinate system changes as the orientation of the user changes.
20. A training method for determining a motion recognition model, comprising:
acquiring sample information, wherein the sample information comprises a motion signal of a user during motion, and the motion signal at least comprises characteristic information corresponding to an electromyographic signal and characteristic information corresponding to a posture signal; and
training the motion recognition model based on the sample information.
21. A method of motion monitoring and feedback, comprising:
acquiring action signals of a user during movement, wherein the action signals at least comprise electromyographic signals and posture signals; and
monitoring the action of the user based on the characteristic information corresponding to the electromyographic signal and the characteristic information corresponding to the posture signal through an action recognition model, and performing action feedback based on an output result of the action recognition model.
22. The method of claim 21, wherein the motion recognition model comprises a trained machine learning model or a pre-set model.
23. The method of claim 21, wherein the action feedback at least comprises one of sending out prompt information, stimulating a moving part of the user, and outputting a motion record of the user during motion.
CN202110298643.9A 2021-03-19 2021-03-19 Motion monitoring method and system Pending CN115105819A (en)

Priority Applications (2)

Application Number Priority Date Filing Date Title
CN202110298643.9A CN115105819A (en) 2021-03-19 2021-03-19 Motion monitoring method and system
TW111110179A TWI837620B (en) 2021-03-19 2022-03-18 Method and system for motion monitoring

Applications Claiming Priority (1)

Application Number Priority Date Filing Date Title
CN202110298643.9A CN115105819A (en) 2021-03-19 2021-03-19 Motion monitoring method and system

Publications (1)

Publication Number Publication Date
CN115105819A true CN115105819A (en) 2022-09-27

Family

ID=83323387

Family Applications (1)

Application Number Title Priority Date Filing Date
CN202110298643.9A Pending CN115105819A (en) 2021-03-19 2021-03-19 Motion monitoring method and system

Country Status (1)

Country Link
CN (1) CN115105819A (en)

Patent Citations (3)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN105705092A (en) * 2013-06-03 2016-06-22 Mc10股份有限公司 Motion sensor and analysis
US20150272482A1 (en) * 2014-03-26 2015-10-01 GestureLogic Inc. Systems, methods and devices for activity recognition
CN107456743A (en) * 2017-08-14 2017-12-12 京东方科技集团股份有限公司 Exercise guidance method and system

Cited By (1)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
WO2024065720A1 (en) * 2022-09-30 2024-04-04 深圳市韶音科技有限公司 Method for monitoring running of user, and signal acquisition apparatus

Similar Documents

Publication Publication Date Title
Yoon et al. Improvement of dynamic respiration monitoring through sensor fusion of accelerometer and gyro-sensor
US20180055375A1 (en) Systems and methods for determining an intensity level of an exercise using photoplethysmogram (ppg)
CN108113663A (en) Cardiac event detection system and method
US20230233103A1 (en) Motion monitoring methods and systems
CN107961523A (en) Human body training system and intelligent body-building system based on heart rate detection
Wang et al. Motion analysis of deadlift for trainers with different levels based on body sensor network
CN111685769A (en) Exoskeleton function detection system
US20230210402A1 (en) Methods and devices for motion monitoring
CN115105819A (en) Motion monitoring method and system
EP3515301A1 (en) Systems, devices, and methods for biometric assessment
CN206404266U (en) System is instructed in Tai Ji
Němcová et al. Recommendations for ECG acquisition using BITalino
TWI837620B (en) Method and system for motion monitoring
CN116304544A (en) Motion data calibration method and system
TW202239378A (en) Method and system for motion monitoring
CN116785659A (en) Motion monitoring method and device
CN106037640A (en) Injury remote analysis system and method
RU2813471C1 (en) Methods and systems for identifying user action
Li et al. Dynamic monitoring method of physical training intensity based on wearable sensors
US20230337989A1 (en) Motion data display method and system
Ivanov et al. Recognition and Control of the Athlete's Movements Using a Wearable Electronics System
Curone et al. An activity classifier based on heart rate and accelerometer data fusion
CN117651847A (en) Motion data calibration method and system
CN117653996A (en) Motion monitoring system, device and method
CN116965800A (en) Respiratory state evaluation method based on electrocardiographic data

Legal Events

Date Code Title Description
PB01 Publication
PB01 Publication
SE01 Entry into force of request for substantive examination
SE01 Entry into force of request for substantive examination
REG Reference to a national code

Ref country code: HK

Ref legal event code: DE

Ref document number: 40075807

Country of ref document: HK