CN116785659A - Motion monitoring method and device - Google Patents

Motion monitoring method and device

Info

Publication number
CN116785659A
Authority
CN
China
Prior art keywords
user, motion, signal, action, muscle
Legal status
Pending
Application number
CN202210270372.0A
Other languages
Chinese (zh)
Inventor
苏雷
黎美琪
周鑫
廖风云
Current Assignee
Shenzhen Voxtech Co Ltd
Original Assignee
Shenzhen Voxtech Co Ltd
Application filed by Shenzhen Voxtech Co Ltd
Priority to CN202210270372.0A
Publication of CN116785659A

Classifications

    • A: HUMAN NECESSITIES
    • A63: SPORTS; GAMES; AMUSEMENTS
    • A63B: APPARATUS FOR PHYSICAL TRAINING, GYMNASTICS, SWIMMING, CLIMBING, OR FENCING; BALL GAMES; TRAINING EQUIPMENT
    • A63B 24/00: Electric or electronic controls for exercising apparatus of preceding groups; Controlling or monitoring of exercises, sportive games, training or athletic performances
    • A63B 24/0062: Monitoring athletic performances, e.g. for determining the work of a user on an exercise apparatus, the completed jogging or cycling distance
    • A63B 2024/0065: Evaluating the fitness, e.g. fitness level or fitness index
    • A63B 2024/0071: Distinction between different activities, movements, or kind of sports performed
    • A63B 2208/02: Characteristics or parameters related to the user or player posture
    • A63B 2230/085: Measuring physiological parameters of the user; other bio-electrical signals used as a control parameter for the apparatus
    • A63B 2230/105: Measuring physiological parameters of the user; other bio-electrical signals; electroencephalographic signals used as a control parameter for the apparatus
    • G: PHYSICS
    • G01D 21/02: Measuring two or more variables by means not covered by a single other subclass
    • G09F 27/00: Combined visual and audible advertising or displaying, e.g. for public address

Landscapes

  • Physics & Mathematics (AREA)
  • General Physics & Mathematics (AREA)
  • Health & Medical Sciences (AREA)
  • General Health & Medical Sciences (AREA)
  • Physical Education & Sports Medicine (AREA)
  • Engineering & Computer Science (AREA)
  • Theoretical Computer Science (AREA)
  • Measuring And Recording Apparatus For Diagnosis (AREA)

Abstract

The specification discloses a method for displaying a motion monitoring interface, comprising the following steps: acquiring action signals of a user during exercise from at least one sensor, wherein the action signals comprise at least an electromyographic signal or a gesture signal; determining information related to the user's movement by processing the action signals; and displaying the information related to the user's movement. With the method for displaying a motion monitoring interface provided by the embodiments of this specification, information related to the user's movement can be presented on a display device, so that the user can intuitively observe from the displayed content the problems in his or her movements and adjust them in time in order to exercise scientifically.

Description

Motion monitoring method and device
Technical Field
The present disclosure relates to the field of wearable devices, and in particular, to a method and a device for monitoring motion.
Background
With growing attention to scientific exercise and physical health, exercise monitoring devices are developing rapidly. At present, exercise monitoring devices mainly monitor some physiological parameter information (such as heart rate, body temperature, cadence, blood oxygen, and the like) during the user's exercise, display the physiological data to the user, and give exercise advice based on the physiological data. In actual scenarios, however, such devices often cannot display the motion monitoring results to the user comprehensively and accurately, so that the user cannot learn about his or her exercise condition in time, or the physiological data given by the system differ noticeably from the user's own perception of the exercise, which reduces the user's trust in the motion monitoring device.
Therefore, it is desirable to provide a motion monitoring method and device that can comprehensively and accurately monitor and display the user's exercise data during exercise.
Disclosure of Invention
One aspect of the present specification provides a method for displaying a motion monitoring interface, the method comprising: acquiring action signals of a user during exercise from at least one sensor, wherein the action signals comprise at least an electromyographic signal or a gesture signal; determining information related to the user's movement by processing the action signals; and displaying the information related to the user's movement.
In some embodiments, determining the information related to the user's movement by processing the action signals comprises: determining, based on the electromyographic signal, a force exertion strength of at least one muscle of the user.
In some embodiments, displaying the information related to the user's movement comprises: acquiring a user input about a target muscle; and displaying a status bar whose color is related to the force exertion strength of the target muscle, or playing a sound whose volume is related to the force exertion strength of the target muscle.
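For illustration only, the sketch below shows one way such a display rule could work. The patent does not specify how the force exertion strength is computed or how colors are chosen, so the RMS-based strength estimate, the color palette, and the thresholds are assumptions.

```python
# Illustrative sketch (not the patent's method): approximate the force exertion
# strength of a target muscle by the RMS amplitude of its EMG window, then map
# the normalized strength to an assumed status-bar color and sound volume.
import numpy as np

def force_strength(emg_window: np.ndarray) -> float:
    """RMS amplitude of one EMG window, used here as a force-strength proxy."""
    return float(np.sqrt(np.mean(np.square(emg_window))))

def status_bar_color(strength: float, max_strength: float) -> str:
    """Map normalized strength to a color; palette and thresholds are assumed."""
    ratio = 0.0 if max_strength <= 0 else min(strength / max_strength, 1.0)
    if ratio < 0.3:
        return "green"   # light effort
    if ratio < 0.7:
        return "yellow"  # moderate effort
    return "red"         # near-maximal effort

def sound_volume(strength: float, max_strength: float) -> float:
    """Map normalized strength to a 0-1 playback volume."""
    return 0.0 if max_strength <= 0 else min(strength / max_strength, 1.0)
```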
In some embodiments, determining the information related to the user's movement by processing the action signals comprises: generating, based on the gesture signal, a user action model that represents the action of the user's movement.
In some embodiments, the displaying the information related to the movement of the user comprises: obtaining a standard action model; and displaying the user action model and the standard action model.
In some embodiments, displaying the information related to the user's movement comprises: determining a force exertion strength of at least one muscle of the user based on the electromyographic signal; and displaying, on the user action model, the force exertion strength of the at least one muscle.
In some embodiments, determining the information related to the user's movement by processing the action signals comprises: segmenting the action signal based on the electromyographic signal or the gesture signal; and monitoring the action of the user's movement based on at least one segment of the action signal to determine a monitoring result.
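As a minimal sketch of what such segmentation might look like, the code below splits an EMG recording into active segments by thresholding a moving-RMS envelope. This is only one common approach, not the patent's own procedure (which is described later with reference to FIG. 7); the sampling rate, window length, and threshold are assumptions.

```python
# Illustrative sketch (assumed parameters, not the patent's algorithm):
# segment an action signal by thresholding the moving-RMS envelope of the EMG.
import numpy as np

def segment_by_envelope(emg: np.ndarray, fs: float = 1000.0,
                        win_s: float = 0.2, threshold: float = 0.1):
    """Return (start, end) sample indices of segments where the envelope
    exceeds the threshold; each segment is a candidate action."""
    win = max(int(win_s * fs), 1)
    envelope = np.sqrt(np.convolve(emg ** 2, np.ones(win) / win, mode="same"))
    active = envelope > threshold
    # Pad with False so every active run has both a rising and a falling edge
    padded = np.concatenate(([False], active, [False]))
    edges = np.flatnonzero(np.diff(padded.astype(int)))
    starts, ends = edges[0::2], edges[1::2]
    return list(zip(starts.tolist(), ends.tolist()))
```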
In some embodiments, the method further comprises: determining a mode of action feedback based on the monitoring result; and performing action feedback on the user according to the action feedback mode.
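A minimal sketch of such a feedback decision is given below. The patent does not state a concrete rule; the 0-1 action-quality score and the thresholds are assumptions, and the feedback modes are simply the ones mentioned elsewhere in this description (voice prompt, vibration prompt, current stimulation).

```python
# Illustrative sketch (assumed scoring and thresholds): choose a feedback mode
# from a hypothetical monitoring result expressed as a 0-1 action-quality score.
def choose_feedback_mode(quality_score: float) -> str:
    if quality_score >= 0.8:
        return "none"                  # action close to standard, no prompt
    if quality_score >= 0.5:
        return "voice_prompt"          # minor deviation: gentle reminder
    if quality_score >= 0.3:
        return "vibration_prompt"      # clear deviation: stronger reminder
    return "current_stimulation"       # severe deviation at the muscle position
```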
In some embodiments, the monitoring result includes muscle information of the user corresponding to at least one point in time, the muscle information including at least one of energy expenditure of at least one muscle, a fatigue degree of the at least one muscle, a training balance of at least two muscles, and a capability of the at least one muscle. Displaying the information related to the user's movement includes: displaying at least one of the energy expenditure of at least one muscle of the user, the fatigue degree of the at least one muscle, the training balance of the at least two muscles, and the capability of the at least one muscle at at least one location in a user model, wherein the at least one location in the user model corresponds to the location of the at least one muscle on the user's body.
In some embodiments, the displaying the information related to the movement of the user comprises: acquiring user input about a target muscle; and displaying information of the target muscle.
In some embodiments, displaying the information related to the user's movement comprises: displaying the monitoring result in at least one of the forms of text, charts, sound, images, and video.
In some embodiments, the method further comprises: and calibrating the action signal.
In some embodiments, the method further comprises: determining, based on the action signal, whether the working state of the sensor is normal; and displaying prompt information if the working state of the sensor is abnormal.
In some embodiments, the action signal comprises a signal related to a characteristic of the user, and the method further comprises: determining body type information and/or body composition information of the user based on the signal related to the user characteristic; and displaying the body type information and/or the body composition information of the user.
An embodiment of the present specification also provides an electronic device, the electronic device comprising: a display device configured to display content; an input device configured to receive user input; at least one sensor configured to detect action signals when a user moves, wherein the action signals comprise at least an electromyographic signal or a gesture signal; and a processor connected to the display device, the input device, and the at least one sensor, the processor being configured to: acquire the action signals of the user during exercise from the at least one sensor; determine information related to the user's movement by processing the action signals; and control the display device to display the information related to the user's movement.
Drawings
The application will be further described by way of exemplary embodiments, which will be described in detail with reference to the accompanying drawings. The embodiments are not limiting, in which like numerals represent like structures, wherein:
FIG. 1 is a schematic illustration of an application scenario of a motion monitoring system according to some embodiments of the application;
FIG. 2 is a schematic diagram of exemplary hardware and/or software of a wearable device shown according to some embodiments of the application;
FIG. 3 is a schematic diagram of exemplary hardware and/or software of a computing device shown according to some embodiments of the application;
FIG. 4 is an exemplary block diagram of a wearable device shown according to some embodiments of the application;
FIG. 5 is an exemplary flow chart of a method of motion monitoring according to some embodiments of the application;
FIG. 6 is an exemplary flow chart for monitoring user athletic activity in accordance with some embodiments of the application;
FIG. 7 is an exemplary flow chart of action signal segmentation shown in accordance with some embodiments of the present application;
FIG. 8 is an exemplary normalized result graph of motion signal segmentation shown in accordance with some embodiments of the present application;
FIG. 9 is an exemplary flow chart of electromyographic signal preprocessing shown according to some embodiments of the application;
FIG. 10 is an exemplary flow chart of a deburring signal shown in accordance with some embodiments of the present application;
FIG. 11 is an exemplary flow chart for determining feature information corresponding to a gesture signal according to some embodiments of the application;
FIG. 12 is an exemplary flow chart for determining relative movement between different movement portions of a user, according to some embodiments of the application;
FIG. 13 is an exemplary flow chart for determining the conversion relationship of an original coordinate system to a particular coordinate system, according to some embodiments of the application;
FIG. 14 is an exemplary flowchart illustrating determining a conversion relationship between an original coordinate system and a target coordinate system according to some embodiments of the application;
FIG. 15A is an exemplary vector graph of Euler angle data in a raw coordinate system at a human forearm position, according to some embodiments of the application;
FIG. 15B is an exemplary vector graph of Euler angle data in an original coordinate system at another location of a human forearm, according to some embodiments of the application;
FIG. 16A is an exemplary vector graph of Euler angle data in a target coordinate system at a human forearm position, according to some embodiments of the application;
FIG. 16B is an exemplary vector graph of Euler angle data in a target coordinate system at another location of a human forearm, according to some embodiments of the application;
FIG. 17 is an exemplary vector coordinate graph of Euler angle data in a target coordinate system of a multi-sensor shown in accordance with some embodiments of the application;
FIG. 18A is an exemplary result graph of raw angular velocity shown in accordance with some embodiments of the present application;
FIG. 18B is an exemplary result graph of angular velocity after a filtering process according to some embodiments of the application;
FIG. 19 is an exemplary flow chart of a motion monitoring and feedback method according to some embodiments of the application;
FIG. 20 is an exemplary flow chart of an application of model training shown in accordance with some embodiments of the present application;
FIG. 21A is an exemplary flow chart of a method of displaying a motion monitoring interface according to some embodiments of the present description;
FIG. 21B is an example diagram of a motion monitoring interface shown in accordance with some embodiments of the present description;
FIG. 22 is an exemplary flow chart of a method of displaying a motion monitoring interface according to some embodiments of the present description;
FIG. 23A is a schematic illustration of a motion monitoring interface shown in accordance with some embodiments of the present description;
FIG. 23B is a schematic illustration of a motion monitoring interface shown according to some embodiments of the present description;
FIG. 23C is a schematic illustration of a motion monitoring interface shown according to some embodiments of the present description;
FIG. 24 is an exemplary flowchart of a method of displaying a motion monitoring interface according to some embodiments of the present description;
FIG. 25 is a schematic illustration of a motion monitoring interface shown in accordance with some embodiments of the present description;
FIG. 26 is a schematic illustration of a motion monitoring interface shown in accordance with some embodiments of the present description;
FIG. 27 is a schematic illustration of a motion monitoring interface shown in accordance with some embodiments of the present description;
FIG. 28 is a schematic illustration of a motion monitoring interface shown in accordance with some embodiments of the present description;
FIG. 29 is a schematic diagram of a motion monitoring interface shown in accordance with some embodiments of the present description;
FIG. 30 is a schematic illustration of a motion monitoring interface shown in accordance with some embodiments of the present description;
FIG. 31 is a schematic illustration of a motion monitoring interface shown in accordance with some embodiments of the present description;
FIG. 32 is a schematic diagram of a motion monitoring interface shown in accordance with some embodiments of the present description;
FIG. 33 is a schematic diagram of a motion monitoring interface shown in accordance with some embodiments of the present description;
FIG. 34 is a schematic diagram of a motion monitoring interface shown in accordance with some embodiments of the present description;
FIG. 35 is a schematic view of a motion monitoring interface shown in accordance with some embodiments of the present description;
FIG. 36 is a schematic illustration of a motion monitoring interface shown in accordance with some embodiments of the present description;
FIG. 37 is a schematic view of a motion monitoring interface shown in accordance with some embodiments of the present description;
FIG. 38 is a schematic diagram of a motion monitoring interface shown according to some embodiments of the present description.
Detailed Description
In order to more clearly illustrate the technical solution of the embodiments of the present application, the drawings that are required to be used in the description of the embodiments will be briefly described below. It is apparent that the drawings in the following description are only some examples or embodiments of the present application, and it is apparent to those of ordinary skill in the art that the present application may be applied to other similar situations according to the drawings without inventive effort. Unless otherwise apparent from the context of the language or otherwise specified, like reference numerals in the figures refer to like structures or operations.
It will be appreciated that "system," "apparatus," "unit" and/or "module" as used herein is one method for distinguishing between different components, elements, parts, portions or assemblies of different levels. However, if other words can achieve the same purpose, the words can be replaced by other expressions.
As used in the specification and in the claims, the terms "a," "an," and/or "the" are not limited to the singular and may include the plural, unless the context clearly dictates otherwise. In general, the terms "comprises" and "comprising" merely indicate that the explicitly identified steps and elements are included; they do not constitute an exclusive list, and a method or apparatus may also include other steps or elements.
A flowchart is used in the present application to describe the operations performed by a system according to embodiments of the present application. It should be appreciated that the preceding or following operations are not necessarily performed in order precisely. Rather, the steps may be processed in reverse order or simultaneously. Also, other operations may be added to or removed from these processes.
The present disclosure provides a motion monitoring system that may acquire action signals when a user moves, where the action signals include at least electromyographic signals, posture signals, electrocardiographic signals, respiratory rate signals, and the like. The system can monitor the user's movement actions based at least on the feature information corresponding to the electromyographic signals or the feature information corresponding to the posture signals. For example, the type of action, the number of actions, the action quality, the action time, or the physiological parameter information of the user when performing the action may be determined from the frequency information and amplitude information of the electromyographic signals, and from the angular velocity, angular velocity direction, angular velocity value, acceleration of angular velocity, angle, displacement information, stress, and the like corresponding to the posture signals. In some embodiments, the motion monitoring system may also generate feedback on the user's fitness actions based on the analysis of those actions, so as to guide the user's exercise. For example, when the user's fitness action is not up to standard, the motion monitoring system may send a prompt (e.g., a voice prompt, a vibration prompt, a current stimulus, etc.) to the user. The motion monitoring system can be applied to wearable devices (such as garments, wristbands, and helmets), medical testing devices (such as electromyography testers), fitness equipment, and the like. By acquiring the action signals during the user's movement, the system can accurately monitor and give feedback on the user's actions without the involvement of professional staff, improving the user's fitness efficiency while reducing the cost of fitness.
Fig. 1 is a schematic view of an application scenario of a motion monitoring system according to some embodiments of the present application. As shown in Fig. 1, the motion monitoring system 100 may include a processing device 110, a network 120, a wearable device 130, and a mobile terminal device 140. The motion monitoring system 100 may acquire action signals (e.g., electromyographic signals, posture signals, electrocardiographic signals, respiratory rate signals, etc.) that characterize the user's movement actions, and may monitor and give feedback on the user's actions according to those action signals.
For example, the athletic monitoring system 100 may monitor and feed back the user's actions while exercising. As the user wears wearable device 130 for fitness exercises, wearable device 130 may obtain the user's motion signal. The processing device 110 or mobile terminal device may receive and analyze the user's motion signal to determine whether the user's exercise motion is normal, thereby monitoring the user's motion. In particular, monitoring the user's actions may include determining an action type, number of actions, quality of action, time of action, or physiological parameter information when the user performs the action, etc. Further, the athletic monitoring system 100 may generate feedback on the user's exercise based on the analysis of the user's exercise to guide the user's exercise.
As another example, the athletic monitoring system 100 may monitor and feed back the user's movements while running. For example, when a user wears wearable device 130 to perform a running exercise, exercise monitoring system 100 may monitor whether the user's running motion is normative, whether the running time meets health standards, and so on. When the user runs for too long or the running motion is incorrect, the exercise device may feed back its exercise state to the user to prompt the user to adjust the running motion or running time.
In some embodiments, the processing device 110 may be used to process information and/or data related to the user's movement. For example, the processing device 110 may receive the user's action signals (e.g., electromyographic signals, posture signals, electrocardiographic signals, respiratory rate signals, etc.) and further extract the feature information corresponding to the action signals (e.g., the feature information corresponding to the electromyographic signals and the feature information corresponding to the posture signals in the action signals). In some embodiments, the processing device 110 may perform specific signal processing on the electromyographic signals or posture signals acquired by the wearable device 130, such as signal segmentation and signal preprocessing (e.g., signal correction processing, filtering processing, etc.). In some embodiments, the processing device 110 may also determine whether the user's action is correct based on the user's action signals. For example, the processing device 110 may determine whether the user's action is correct based on the feature information corresponding to the electromyographic signals (e.g., amplitude information, frequency information, etc.). For another example, the processing device 110 may determine whether the user's action is correct based on the feature information corresponding to the posture signals (e.g., angular velocity, angular velocity direction, acceleration of angular velocity, angle, displacement information, stress, etc.). For another example, the processing device 110 may determine whether the user's action is correct based on both the feature information corresponding to the electromyographic signals and the feature information corresponding to the posture signals. In some embodiments, the processing device 110 may also determine whether the physiological parameter information of the user during exercise meets health criteria. In some embodiments, the processing device 110 may also issue corresponding instructions to give feedback on the user's movement. For example, when the user performs a running exercise, the motion monitoring system 100 may detect that the user has been running for too long, and the processing device 110 may then instruct the mobile terminal device 140 to prompt the user to adjust the running time. It should be noted that the feature information corresponding to the posture signals is not limited to the above-mentioned angular velocity, angular velocity direction, acceleration of angular velocity, angle, displacement information, stress, and the like; it may be other feature information, and any parameter information that can reflect the relative movement of the user's body may serve as feature information corresponding to the posture signals. For example, when the posture sensor is a strain sensor, the bending angle and bending direction at the user's joint can be obtained by measuring the resistance of the strain sensor, which varies with its stretched length.
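As a rough illustration of the kind of feature extraction described above, the sketch below computes amplitude and frequency features for an EMG window and simple angular-velocity features for a posture (gyroscope) window. The specific features, window handling, and sampling rates are assumptions for illustration, not taken from the patent.

```python
# Illustrative sketch (assumed features and sampling rates): extract feature
# information from one EMG window and one posture (angular-velocity) window.
import numpy as np

def emg_features(emg: np.ndarray, fs: float = 1000.0) -> dict:
    """Amplitude and frequency features of an EMG window."""
    spectrum = np.abs(np.fft.rfft(emg)) ** 2
    freqs = np.fft.rfftfreq(len(emg), d=1.0 / fs)
    mean_freq = float(np.sum(freqs * spectrum) / max(np.sum(spectrum), 1e-12))
    return {"mean_amplitude": float(np.mean(np.abs(emg))),
            "mean_frequency": mean_freq}

def posture_features(angular_velocity: np.ndarray, fs: float = 100.0) -> dict:
    """Peak angular velocity and range of motion from one gyroscope axis."""
    angle = np.cumsum(angular_velocity) / fs  # simple numerical integration
    return {"peak_angular_velocity": float(np.max(np.abs(angular_velocity))),
            "range_of_motion": float(angle.max() - angle.min())}
```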
In some embodiments, the processing device 110 may be local or remote. For example, the processing device 110 may access information and/or material stored in the wearable device 130 and/or the mobile terminal device 140 via the network 120. In some embodiments, the processing device 110 may be directly connected with the wearable device 130 and/or the mobile terminal device 140 to access information and/or material stored therein. For example, the processing device 110 may be located in the wearable device 130 and enable information interaction with the mobile terminal device 140 through the network 120. As another example, the processing device 110 may be located in the mobile terminal device 140 and enable information interaction with the wearable device 130 through a network. In some embodiments, the processing device 110 may execute on a cloud platform. For example, the cloud platform may include one of a private cloud, a public cloud, a hybrid cloud, a community cloud, a decentralized cloud, an internal cloud, or the like, or any combination thereof.
In some embodiments, processing device 110 may process data and/or information related to motion monitoring to perform one or more of the functions described in this disclosure. In some embodiments, the processing device may acquire the action signals collected by the wearable device 130 as the user moves. In some embodiments, the processing device may send control instructions to the wearable device 130 or the mobile terminal device 140. The control instructions may control the switch states of the wearable device 130 and its sensors, and may also control the mobile terminal device 140 to send out prompt information. In some embodiments, processing device 110 may include one or more sub-processing devices (e.g., single-core or multi-core processing devices). By way of example only, the processing device 110 may include a central processing unit (CPU), an application-specific integrated circuit (ASIC), an application-specific instruction-set processor (ASIP), a graphics processing unit (GPU), a physics processing unit (PPU), a digital signal processor (DSP), a field-programmable gate array (FPGA), a programmable logic device (PLD), a controller, a microcontroller unit, a reduced instruction set computer (RISC), a microprocessor, or the like, or any combination thereof.
Network 120 may facilitate the exchange of data and/or information within motion monitoring system 100. In some embodiments, one or more components in the athletic monitoring system 100 (e.g., the processing device 110, the wearable device 130, the mobile terminal device 140) may send data and/or information to other components in the athletic monitoring system 100 over the network 120. For example, the action signals collected by wearable device 130 may be transmitted to processing device 110 over network 120. As another example, the result of the acknowledgement regarding the action signal in the processing device 110 may be transmitted to the mobile terminal device 140 through the network 120. In some embodiments, network 120 may be any type of wired or wireless network. For example, the network 120 may include a cable network, a wired network, a fiber optic network, a telecommunications network, an intranet, the Internet, a Local Area Network (LAN), a Wide Area Network (WAN), a Wireless Local Area Network (WLAN), a Metropolitan Area Network (MAN), a Public Switched Telephone Network (PSTN), a Bluetooth network, a ZigBee network, a Near Field Communication (NFC) network, and the like, or any combination thereof. In some embodiments, network 120 may include one or more network access points. For example, network 120 may include wired or wireless network access points, such as base station and/or Internet switching points 120-1, 120-2, …, through which one or more components of motion monitoring system 100 may connect to network 120 to exchange data and/or information.
Wearable device 130 refers to a garment or device having a wearable function. In some embodiments, wearable device 130 may include, but is not limited to, a coat device 130-1, a pants device 130-2, a wrist guard device 130-3, a shoe device 130-4, and the like. In some embodiments, wearable device 130 may include a plurality of sensors. The sensors may acquire various action signals (e.g., electromyographic signals, posture signals, temperature information, heart rate, electrocardiographic signals, etc.) as the user moves. In some embodiments, the sensors may include, but are not limited to, one or more of an electromyographic sensor, a posture sensor, a temperature sensor, a humidity sensor, an electrocardiographic sensor, a blood oxygen saturation sensor, a Hall sensor, a galvanic skin sensor, a rotation sensor, and the like. For example, an electromyographic sensor may be provided in the coat device 130-1 at a location of a human muscle (e.g., biceps brachii, triceps brachii, latissimus dorsi, trapezius, etc.); the electromyographic sensor may fit against the user's skin and collect electromyographic signals as the user moves. For another example, an electrocardiographic sensor may be disposed in the coat device 130-1 near the left pectoral muscle of the human body, and the electrocardiographic sensor may collect the user's electrocardiographic signals. As another example, a posture sensor may be provided in the pants device 130-2 at a location of a human muscle (e.g., gluteus maximus, vastus lateralis, vastus medialis, gastrocnemius, etc.), and the posture sensor may acquire the user's posture signals. In some embodiments, wearable device 130 may also give feedback on the user's actions. For example, when the action of a certain body part does not meet the standard during exercise, the electromyographic sensor corresponding to that part may generate a stimulation signal (e.g., a current stimulus or a tapping signal) to remind the user.
It should be noted that the wearable device 130 is not limited to the coat device 130-1, the trousers device 130-2, the wrist support device 130-3 and the shoe device 130-4 shown in fig. 1, but may also be applied to other devices requiring exercise monitoring, such as a helmet device, a knee support device, etc., and is not limited herein, and any device that can use the exercise monitoring method included in the present specification is within the scope of the present application.
In some embodiments, mobile terminal device 140 may obtain information or data in motion monitoring system 100. In some embodiments, the mobile terminal device 140 may receive the processed motion data from the processing device 110 and feedback a motion record based on the processed motion data, or the like. Exemplary feedback means may include, but are not limited to, voice prompts, image prompts, video presentations, text prompts, and the like. In some embodiments, the user may obtain a record of the motion during his own motion through the mobile terminal device 140. For example, the mobile terminal device 140 may be connected (e.g., wired, wireless) to the wearable device 130 via the network 120, and the user may obtain, via the mobile terminal device 140, a record of the motion of the user, which may be transmitted to the processing device 110 via the mobile terminal device 140. In some embodiments, the mobile terminal device 140 may include one or any combination of a mobile device 140-1, a tablet 140-2, a notebook 140-3, and the like. In some embodiments, the mobile device 140-1 may include a cell phone, a smart home device, a smart mobile device, a virtual reality device, an augmented reality device, or the like, or any combination thereof.
In some embodiments, the athletic monitoring system 100 may also include a database. The database may store material (e.g., initially set threshold conditions, etc.) and/or instructions (e.g., feedback instructions). In some embodiments, the database may store material obtained from the wearable device 130 and/or the mobile terminal device 140. In some embodiments, the database may store information and/or instructions for execution or use by the processing device 110 to perform the exemplary methods described herein. In some embodiments, the database may include mass storage, removable storage, volatile read-write memory (e.g., random access memory RAM), read-only memory (ROM), and the like, or any combination thereof. In some embodiments, the database may be implemented on a cloud platform. For example, the cloud platform may include a private cloud, a public cloud, a hybrid cloud, a community cloud, a decentralized cloud, an internal cloud, or the like, or any combination thereof.
In some embodiments, the database may be connected with the network 120 to communicate with one or more components of the athletic monitoring system 100 (e.g., the processing device 110, the wearable device 130, the mobile terminal device 140, etc.). One or more components of the athletic monitoring system 100 may access materials or instructions stored in a database via the network 120. In some embodiments, the database may be directly connected to or in communication with one or more components in the athletic monitoring system 100 (e.g., the processing device 110, the wearable device 130, the mobile terminal device 140). In some embodiments, the database may be part of the processing device 110.
Fig. 2 is a schematic diagram of exemplary hardware and/or software of a wearable device shown according to some embodiments of the application. As shown in fig. 2, the wearable device 130 may include an acquisition module 210, a processing module 220 (also referred to as a processor), a control module 230 (also referred to as a master, MCU, controller), a communication module 240, a power supply module 250, and an input/output module 260.
The acquisition module 210 may be used to acquire motion signals as the user moves. In some embodiments, the acquisition module 210 may include a sensor unit that may be used to acquire one or more motion signals as the user moves. In some embodiments, the sensor unit may include, but is not limited to, one or more of a myoelectric sensor, a posture sensor, a cardiac electric sensor, a respiratory sensor, a temperature sensor, a humidity sensor, an inertial sensor, a blood oxygen saturation sensor, a hall sensor, a piezoelectric sensor, a rotation sensor, and the like. In some embodiments, the motion signal may include one or more of an electromyographic signal, a posture signal, an electrocardiographic signal, a respiratory rate, a temperature signal, a humidity signal, and the like. The sensor unit may be placed in different locations of the wearable device 130 depending on the type of motion signal to be acquired. For example, in some embodiments, a myoelectric sensor (also referred to as an electrode element) may be provided at a human muscle location, and the myoelectric sensor may be configured to collect myoelectric signals as the user moves. The electromyographic signals and their corresponding characteristic information (e.g., frequency information, amplitude information, etc.) may reflect the state of the muscle as the user moves. The posture sensors may be disposed at different locations of the human body (e.g., locations corresponding to the torso, limbs, joints in the wearable device 130), and the posture sensors may be configured to collect posture signals as the user moves. The gesture signal and its corresponding characteristic information (e.g., angular velocity direction, angular velocity value, angular velocity acceleration value, angle, displacement information, stress, etc.) may reflect the gesture of the user's motion. The electrocardiograph may be disposed at a position on a peripheral side of a chest of the human body, and the electrocardiograph may be configured to collect electrocardiographic data when the user moves. The respiration sensor may be disposed at a location on the circumference of the chest of the person, and the respiration sensor may be configured to collect respiration data (e.g., respiration rate, respiration amplitude, etc.) as the user moves. The temperature sensor may be configured to collect temperature data (e.g., body surface temperature) as the user moves. The humidity sensor may be configured to collect humidity data of the external environment while the user is in motion.
The processing module 220 may process data from the acquisition module 210, the control module 230, the communication module 240, the power module 250, and/or the input/output module 260. For example, the processing module 220 may process motion signals from the acquisition module 210 during user motion. In some embodiments, the processing module 220 may pre-process the motion signals (e.g., electromyographic signals, gesture signals) acquired by the acquisition module 210. For example, the processing module 220 performs a segmentation process on the electromyographic signals or gesture signals while the user is moving. For another example, the processing module 220 may perform preprocessing (e.g., filtering processing, signal correction processing) on the electromyographic signals while the user is in motion to improve the quality of the electromyographic signals. For another example, the processing module 220 may determine feature information corresponding to the gesture signal based on the gesture signal when the user is moving. In some embodiments, the processing module 220 may process instructions or operations from the input/output module 260. In some embodiments, the processed data may be stored in memory or a hard disk. In some embodiments, processing module 220 may transmit its processed data to one or more components in athletic monitoring system 100 via communication module 240 or network 120. For example, the processing module 220 may send the monitoring result of the user motion to the control module 230, and the control module 230 may perform subsequent operations or instructions according to the action determination result.
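For illustration, a conventional EMG preprocessing chain of the kind mentioned above might look like the sketch below. The 20-450 Hz band-pass and 50 Hz notch are common EMG filtering choices assumed here for the example; the patent's own preprocessing is described later with reference to FIGS. 9 and 10.

```python
# Illustrative sketch (assumed cut-off frequencies, not the patent's values):
# band-pass filter an EMG signal and notch out power-line interference.
import numpy as np
from scipy.signal import butter, filtfilt, iirnotch

def preprocess_emg(emg: np.ndarray, fs: float = 1000.0) -> np.ndarray:
    """Band-pass 20-450 Hz, then remove 50 Hz mains interference."""
    b, a = butter(4, [20.0, 450.0], btype="bandpass", fs=fs)
    filtered = filtfilt(b, a, emg)
    b_notch, a_notch = iirnotch(w0=50.0, Q=30.0, fs=fs)
    return filtfilt(b_notch, a_notch, filtered)
```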
The control module 230 may be connected with other modules in the wearable device 130. In some embodiments, the control module 230 may control the operational status of other modules (e.g., the communication module 240, the power module 250, the input/output module 260) in the wearable device 130. For example, the control module 230 may control a power supply state (e.g., normal mode, power saving mode), a power supply time, etc. of the power supply module 250. When the remaining power of the power supply module 250 is below a certain threshold (e.g., 10%), the control module 230 may control the power supply module 250 to enter a power saving mode or send out a prompt message about the supplementary power. For another example, the control module 230 may control the input/output module 260 according to the result of the motion determination of the user, and may further control the mobile terminal device 140 to send the feedback result of the motion thereof to the user. When a problem occurs in the motion of the user (for example, the motion does not meet the standard), the control module 230 may control the input/output module 260, and may further control the mobile terminal device 140 to feed back to the user, so that the user may know the motion state of the user and adjust the motion in real time. In some embodiments, the control module 230 may also control one or more sensors or other modules in the acquisition module 210 to feedback the human body. For example, when the strength of the force exerted by a certain muscle during the exercise of the user is too great, the control module 230 may control the electrode module at the muscle position to electrically stimulate the user to prompt the user to adjust the action in time.
In some embodiments, the communication module 240 may be used for the exchange of information or data. In some embodiments, the communication module 240 may be used for communication between internal components of the wearable device 130 (e.g., the acquisition module 210, the processing module 220, the control module 230, the power supply module 250, the input/output module 260). For example, the acquisition module 210 may send user action signals (e.g., electromyographic signals, gesture signals, etc.) to the communication module 240, which communication module 240 may send to the processing module 220. In some embodiments, the communication module 240 may also be used for communication between the wearable device 130 and other components in the athletic monitoring system 100 (e.g., the processing device 110, the mobile terminal device 140). For example, the communication module 240 may send status information (e.g., a switch status) of the wearable device 130 to the processing device 110, which the processing device 110 may monitor the wearable device 130 based on. The communication module 240 may employ wired, wireless, and hybrid wired/wireless technologies. The wireline technology may be based on one or more fiber optic cable combinations, such as metallic cables, hybrid cables, fiber optic cables, and the like. Wireless technologies may include Bluetooth (Bluetooth), wireless network (Wi-Fi), zigBee (ZigBee), near field communication (Near Field Communication, NFC), radio frequency identification technology (Radio Frequency Identification, RFID), cellular networks (including GSM, CDMA, 3G, 4G, 5G, etc.), cellular-based narrowband internet of things (Narrow Band Internet of Things, NBIoT), etc. In some embodiments, the communication module 240 may encode the transmitted information using one or more encoding schemes, which may include, for example, phase encoding, non-return-to-zero encoding, differential Manchester encoding, and the like. In some embodiments, the communication module 240 may select different transmission and coding modes according to the type of data or network type to be transmitted. In some embodiments, the communication module 240 may include one or more communication interfaces for different communication modes. In some embodiments, other illustrated modules of the athletic performance monitoring system 100 may be distributed across multiple devices, in which case each of the other modules may each include one or more communication modules 240 for inter-module information transfer. In some embodiments, the communication module 240 may include one receiver and one transmitter. In other embodiments, the communication module 240 may be a transceiver.
In some embodiments, the power module 250 may provide power to other components in the athletic monitoring system 100 (e.g., the acquisition module 210, the processing module 220, the control module 230, the communication module 240, the input/output module 260). The power module 250 may receive control signals from the processing module 220 to control the power output of the wearable device 130. For example, in the event that wearable device 130 does not receive any operation for a certain period of time (e.g., 1s, 2s, 3s, or 4 s) (e.g., no action signal is detected by acquisition module 210), power module 250 may only power the memory, causing wearable device 130 to enter a standby mode. For another example, in the event that wearable device 130 does not receive any operation for a certain period of time (e.g., 1s, 2s, 3s, or 4 s) (e.g., no motion signal is detected by acquisition module 210), power module 250 may disconnect power to other components, and data in motion monitoring system 100 may be transferred to a hard disk, causing wearable device 130 to enter a standby mode or sleep mode. In some embodiments, the power module 250 may include at least one battery. The battery may include one or a combination of several of a dry battery, a lead storage battery, a lithium battery, a solar battery, a wind power generation battery, a mechanical power generation battery, a thermal power generation battery, and the like. The solar cell may convert light energy into electrical energy and store in the power module 250. The wind power generation cell may convert wind energy into electrical energy and store it in the power module 250. The mechanical energy generation cell may convert mechanical energy into electrical energy and store it in the power module 250. The solar cell may include a silicon solar cell, a thin film solar cell, a nanocrystalline solar cell, a fuel sensitized solar cell, a plastic solar cell, and the like. The solar cells may be distributed on the wearable device 130 in the form of a battery panel. The thermal energy generation battery may convert the user's body temperature into electrical energy and store in the power module 250. In some embodiments, the processing module 220 may send a control signal to the power module 250 when the power of the power module 250 is less than a power threshold (e.g., 10% of the total power). The control signal may include information of the power shortage of the power supply module 250. In some embodiments, the power module 250 may contain a backup power source. In some embodiments, the power module 250 may also include a charging interface. For example, in an emergency situation (e.g., the power of the power supply module 250 is 0, and the external power system fails to supply power), the power supply module 250 may be temporarily charged by using an electronic device (e.g., a mobile phone, a tablet computer) or a charger carried by the user.
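The standby behaviour described above can be expressed as a simple idle-timeout check, sketched below. The 1-4 s idle window comes from the text; the controller structure itself is an assumption made for illustration.

```python
# Illustrative sketch: enter standby when no action signal has been detected
# for an idle window (e.g., 1-4 s, per the description above).
import time

class PowerController:
    def __init__(self, idle_timeout_s: float = 3.0):
        self.idle_timeout_s = idle_timeout_s
        self.last_signal_time = time.monotonic()

    def on_action_signal(self) -> None:
        """Call whenever the acquisition module detects an action signal."""
        self.last_signal_time = time.monotonic()

    def should_enter_standby(self) -> bool:
        """True when no action signal has been seen within the idle window."""
        return time.monotonic() - self.last_signal_time > self.idle_timeout_s
```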
The input/output module 260 may acquire, transmit, and send signals. The input/output module 260 may be connected to or in communication with other components in the athletic monitoring system 100. Other components in the athletic monitoring system 100 may be connected or communicate via the input/output module 260. The input/output module 260 may be a wired USB interface, a serial communication interface, a parallel communication interface, or a wireless Bluetooth, infrared, radio-frequency identification (Radio-frequency identification, RFID), WLAN authentication and privacy infrastructure (Wlan Authentication and Privacy Infrastructure, WAPI), general packet Radio service (General Packet Radio Service, GPRS), code division multiple access (Code Division Multiple Access, CDMA), etc., or any combination thereof. In some embodiments, input/output module 260 may be coupled to network 120 and obtain information via network 120. For example, the input/output module 260 may obtain the motion signal during the user's motion from the obtaining module 210 through the network 120 or the communication module 240 and output the user's motion information. In some embodiments, the input/output module 260 may include VCC, GND, RS-232, RS-485 (e.g., RS485-A, RS 485-B), a general network interface, etc., or any combination thereof. In some embodiments, the input/output module 260 may communicate the acquired user motion information to the acquisition module 210 over the network 120. In some embodiments, the input/output module 260 may encode the transmitted signal using one or more encoding schemes. The encoding means may include phase encoding, non-return to zero code, differential Manchester code, etc., or any combination thereof.
It should be understood that the system shown in fig. 2 and its modules may be implemented in a variety of ways. For example, in some embodiments, the system and its modules may be implemented in hardware, software, or a combination of software and hardware. Wherein the hardware portion may be implemented using dedicated logic; the software portions may then be stored in a memory and executed by a suitable instruction execution system, such as a microprocessor or special purpose design hardware. Those skilled in the art will appreciate that the methods and systems described above may be implemented using computer executable instructions and/or embodied in processor control code, such as provided on a carrier medium such as a magnetic disk, CD or DVD-ROM, a programmable memory such as read only memory (firmware), or a data carrier such as an optical or electronic signal carrier. The system and its modules of one or more embodiments of the present description may be implemented not only with hardware circuitry, such as very large scale integrated circuits or gate arrays, semiconductors such as logic chips, transistors, etc., or programmable hardware devices such as field programmable gate arrays, programmable logic devices, etc., but also with software, such as executed by various types of processors, and with a combination of the above hardware circuitry and software (e.g., firmware).
It should be noted that the above description of the motion monitoring system and its modules is for descriptive convenience only and is not intended to limit one or more embodiments of the present disclosure to the scope of the illustrated embodiments. It will be appreciated by those skilled in the art that having the benefit of this disclosure, it is possible to combine the various modules arbitrarily, or to construct a subsystem in connection with other modules, or to omit one or more of the modules, without departing from the principles. For example, the acquisition module 210 and the processing module 220 may be one module, which may have a function of acquiring and processing a user action signal. As another example, the processing module 220 may also be integrated into the processing device 110 without being disposed in the wearable device 130. Such variations are within the scope of one or more embodiments of the present description.
FIG. 3 is a schematic diagram of exemplary hardware and/or software of a computing device according to some embodiments of the application. In some embodiments, the processing device 110 and/or the mobile terminal device 140 may be implemented on the computing device 300. As shown in fig. 3, computing device 300 may include an internal communication bus 310, a processor 320, a read only memory 330, a random access memory 340, a communication port 350, an input/output interface 360, a hard disk 370, and a user interface 380.
Internal communication bus 310 may enable data communication among the components in computing device 300. For example, the processor 320 may send data over the internal communication bus 310 to memory or other hardware such as the input/output interface 360. In some embodiments, internal communication bus 310 may be an Industry Standard (ISA) bus, an Extended ISA (EISA) bus, a Video Electronics Standards (VESA) bus, a peripheral component interconnect standard (PCI) bus, and so forth. In some embodiments, internal communication bus 310 may be used to connect the various modules (e.g., acquisition module 210, processing module 220, control module 230, communication module 240, input-output module 260) in motion monitoring system 100 shown in fig. 1.
Processor 320 may execute computing instructions (program code) and perform the functions of motion monitoring system 100 described herein. The computing instructions may include programs, objects, components, data structures, procedures, modules, and functions (which refer to particular functions described in this disclosure). For example, the processor 320 may process motion signals (e.g., electromyographic signals, gesture signals) acquired from the wearable device 130 or/and the mobile terminal device 140 of the motion monitoring system 100 when the user moves, and monitor the motion of the user's motion according to the motion signals when the user moves. In some embodiments, processor 320 may include a microcontroller, a microprocessor, a Reduced Instruction Set Computer (RISC), an Application Specific Integrated Circuit (ASIC), an application specific instruction set processor (ASIP), a Central Processing Unit (CPU), a Graphics Processing Unit (GPU), a Physical Processing Unit (PPU), a microcontroller unit, a Digital Signal Processor (DSP), a Field Programmable Gate Array (FPGA), an advanced reduced instruction set computer (ARM), a programmable logic device, any circuit and processor capable of performing one or more functions, and the like, or any combination thereof. For illustration only, computing device 300 in FIG. 3 depicts one processor, but it should be noted that computing device 300 in the present application may also include multiple processors.
The memory of computing device 300 (e.g., read Only Memory (ROM) 330, random Access Memory (RAM) 340, hard disk 370, etc.) may store data/information retrieved from any other component of motion monitoring system 100. In some embodiments, the memory of computing device 300 may be located in wearable device 130 or in processing device 110. Exemplary ROMs may include Mask ROM (MROM), programmable ROM (PROM), erasable programmable ROM (PEROM), electrically Erasable Programmable ROM (EEPROM), compact disk ROM (CD-ROM), and digital versatile disk ROM, among others. Exemplary RAM may include Dynamic RAM (DRAM), double rate synchronous dynamic RAM (DDR SDRAM), static RAM (SRAM), thyristor RAM (T-RAM), zero capacitance (Z-RAM), and the like.
The input/output interface 360 may be used to input or output signals, data, or information. In some embodiments, input/output interface 360 may enable a user to interact with motion monitoring system 100. For example, the input/output interface 360 may include the communication module 240 to implement the communication functions of the motion monitoring system 100. In some embodiments, the input/output interface 360 may include an input device and an output device. Exemplary input devices may include a keyboard, mouse, touch screen, microphone, and the like, or any combination thereof. Exemplary output devices may include a display device, speakers, printer, projector, etc., or any combination thereof. Exemplary display devices may include Liquid Crystal Displays (LCDs), Light Emitting Diode (LED) based displays, flat panel displays, curved displays, television devices, Cathode Ray Tubes (CRTs), and the like, or any combination thereof. The communication port 350 may be connected to a network for data communication. The connection may be a wired connection, a wireless connection, or a combination of both. The wired connection may include an electrical cable, an optical cable, a telephone line, or the like, or any combination thereof. The wireless connection may include Bluetooth, Wi-Fi, WiMax, WLAN, ZigBee, a mobile network (e.g., 3G, 4G, 5G, etc.), or the like, or any combination thereof. In some embodiments, the communication port 350 may be a standardized port, such as RS232, RS485, and the like. In some embodiments, communication port 350 may be a specially designed port.
Hard disk 370 may be used to store information and data generated by or received from processing device 110. For example, the hard disk 370 may store user identification information of the user. In some embodiments, hard disk 370 may include a mechanical hard disk drive (HDD), a solid state drive (SSD), a hybrid hard drive (HHD), or the like. In some embodiments, hard disk 370 may be disposed in processing device 110 or in wearable device 130. User interface 380 may enable interaction and exchange of information between computing device 300 and a user. In some embodiments, the user interface 380 may be used to present motion records generated by the motion monitoring system 100 to a user. In some embodiments, user interface 380 may include a physical display, such as a display with a speaker, an LCD display, an LED display, an OLED display, an electronic ink (E-Ink) display, or the like.
Fig. 4 is an exemplary block diagram of a wearable device according to some embodiments of the application. To further describe the wearable device, an upper garment is taken as an example. As shown in fig. 4, the wearable device 400 may include an upper garment 410. The upper garment 410 may include an upper garment base 4110, at least one upper garment processing module 4120, at least one upper garment feedback module 4130, at least one upper garment acquisition module 4140, and the like. The upper garment base 4110 may refer to a garment worn on the upper body of a human body. In some embodiments, the upper garment base 4110 may include a T-shirt, a long-sleeved T-shirt, a coat, and the like. The at least one upper garment processing module 4120 and the at least one upper garment acquisition module 4140 may be located on areas of the upper garment base 4110 that are in contact with different parts of the human body. The at least one upper garment feedback module 4130 may be located at any position on the upper garment base 4110, and may be configured to feed back information on the movement state of the user's upper body. Exemplary feedback means may include, but are not limited to, voice prompts, text prompts, pressure prompts, electric current stimulation, and the like. In some embodiments, the at least one upper garment acquisition module 4140 may include, but is not limited to, one or more of an attitude sensor, an electrocardiographic sensor, an electromyographic sensor, a temperature sensor, a humidity sensor, an inertial sensor, an acid-base sensor, an acoustic wave transducer, and the like. The sensors in the upper garment acquisition module 4140 may be placed at different positions on the user's body according to the signals to be measured. For example, when an attitude sensor is used to acquire an attitude signal during the user's movement, the attitude sensor may be placed at positions of the upper garment base 4110 corresponding to the trunk, arms, and joints of the human body. For another example, when an electromyographic sensor is used to acquire electromyographic signals during the user's movement, the electromyographic sensor may be located near the muscle of the user to be measured. In some embodiments, the attitude sensor may include, but is not limited to, a three-axis acceleration sensor, a three-axis angular velocity sensor, a magnetic force sensor, and the like, or any combination thereof. For example, one attitude sensor may include a three-axis acceleration sensor and a three-axis angular velocity sensor. In some embodiments, the attitude sensor may also include a strain gauge sensor. A strain gauge sensor refers to a sensor that measures the strain generated when an object under test deforms. In some embodiments, the strain gauge sensor may include, but is not limited to, one or more of a strain gauge load cell, a strain gauge pressure sensor, a strain gauge torque sensor, a strain gauge displacement sensor, a strain gauge acceleration sensor, and the like. For example, a strain gauge sensor may be positioned at a joint of the user, and the bending angle and bending direction at that joint may be obtained by measuring how the resistance of the strain gauge sensor varies with its stretched length.
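The last example above maps a change in strain gauge resistance to a joint bending angle. Below is a minimal Python sketch of that idea; the gauge factor, rest resistance, and linear angle calibration are illustrative assumptions, not values from the embodiments.

```python
# Minimal sketch: estimating a joint bending angle from a strain gauge
# resistance reading. All constants below are illustrative assumptions.

REST_RESISTANCE_OHM = 350.0      # assumed resistance with the joint straight
GAUGE_FACTOR = 2.0               # assumed gauge factor (dR/R = GF * strain)
DEGREES_PER_UNIT_STRAIN = 900.0  # assumed linear calibration slope

def bending_angle_deg(resistance_ohm: float) -> float:
    """Map a measured resistance to an approximate bending angle.

    A positive angle is returned when the gauge is stretched (resistance
    rises) and a negative angle when it is compressed, giving both the
    bending magnitude and the bending direction at the joint.
    """
    delta_r = resistance_ohm - REST_RESISTANCE_OHM
    strain = delta_r / (GAUGE_FACTOR * REST_RESISTANCE_OHM)
    return strain * DEGREES_PER_UNIT_STRAIN

print(bending_angle_deg(352.8))  # ~3.6 degrees of flexion in this example
```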
It should be noted that the upper garment 410 may further include other modules, such as a power supply module, a communication module, an input/output module, etc., in addition to the upper garment base 4110, the upper garment processing module 4120, the upper garment feedback module 4130, the upper garment acquisition module 4140, etc., described above. The upper garment processing module 4120 is similar to the processing module 220 of fig. 2, and the upper garment acquisition module 4140 is similar to the acquisition module 210 of fig. 2; for a detailed description of the various modules in the upper garment 410, reference may be made to the related description of fig. 2 of the present application, which will not be repeated here.
Fig. 5 is an exemplary flow chart of a method of motion monitoring according to some embodiments of the application. As shown in fig. 5, the process 500 may include:
in step 510, an action signal is acquired as the user moves.
In some embodiments, this step 510 may be performed by the acquisition module 210. The action signal refers to human body parameter information when the user moves. In some embodiments, the body parameter information may include, but is not limited to, one or more of an electromyographic signal, a posture signal, an electrocardiographic signal, a temperature signal, a humidity signal, a blood oxygen concentration, a respiratory rate, and the like. In some embodiments, the electromyographic sensors in the acquisition module 210 may acquire the user's electromyographic signals during exercise. For example, when the user performs sitting posture chest clipping, the electromyographic sensors in the wearable device corresponding to the positions of the pectoral muscles, latissimus dorsi, etc. can collect the electromyographic signals at the corresponding muscle positions of the user. For another example, when the user performs a squat action, the electromyographic sensors in the wearable device corresponding to the positions of the gluteus maximus, quadriceps femoris, etc. can collect the electromyographic signals at the corresponding muscle positions of the user. For another example, when the user performs running exercise, the electromyographic sensor in the wearable device corresponding to the position of the gastrocnemius can collect the electromyographic signal at that position. In some embodiments, a gesture sensor in the acquisition module 210 may acquire gesture signals as the user moves. For example, when the user performs a barbell press, the attitude sensor in the wearable device corresponding to the position of the triceps brachii can collect the attitude signal at the position of the user's triceps brachii. For another example, when the user performs a dumbbell fly, an attitude sensor provided at a position such as the deltoid muscle can collect the attitude signal at that position. In some embodiments, the number of gesture sensors in the acquisition module 210 may be multiple; the multiple gesture sensors may acquire gesture signals of multiple body parts during the user's movement, and these gesture signals may reflect the relative movement between different parts of the human body. For example, the gesture signal at the arm and the gesture signal at the torso may reflect the motion of the arm relative to the torso. In some embodiments, the gesture signal is associated with the type of gesture sensor. For example, when the attitude sensor is a three-axis angular velocity sensor, the acquired attitude signal is angular velocity information. For another example, when the attitude sensor is a three-axis angular velocity sensor and a three-axis acceleration sensor, the acquired attitude signals are angular velocity information and acceleration information. For another example, when the attitude sensor is a strain gauge sensor, the strain gauge sensor may be disposed at a joint of the user; by measuring how the resistance of the strain gauge sensor varies with its stretched length, the acquired attitude signals may be displacement information, stress, and the like, and these attitude signals can represent the bending angle and bending direction at the user's joint.
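As a hedged illustration of how one time-aligned sample of such an action signal might be organized in software (not a structure prescribed by the embodiments), the following Python sketch bundles per-muscle EMG values with per-site attitude readings; all field and site names are assumptions.

```python
# Illustrative data structure for one multi-sensor action signal sample:
# per-muscle EMG amplitudes plus per-site attitude readings, time-aligned
# by a shared timestamp. Field names are illustrative.
from dataclasses import dataclass, field
from typing import Dict, Tuple

@dataclass
class ActionSample:
    timestamp_s: float
    # EMG amplitude per muscle site, e.g. {"pectoralis": 0.42, "latissimus": 0.18}
    emg: Dict[str, float] = field(default_factory=dict)
    # Angular velocity (x, y, z) per attitude-sensor site
    angular_velocity: Dict[str, Tuple[float, float, float]] = field(default_factory=dict)
    # Acceleration (x, y, z) per attitude-sensor site
    acceleration: Dict[str, Tuple[float, float, float]] = field(default_factory=dict)

sample = ActionSample(
    timestamp_s=12.34,
    emg={"pectoralis": 0.42},
    angular_velocity={"left_arm": (0.0, 1.2, -0.3)},
)
print(sample.emg["pectoralis"])
```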
It should be noted that any parameter information that can represent the relative motion of the user's body may serve as characteristic information corresponding to the gesture signal, and different types of gesture sensors may be used to acquire it according to the type of characteristic information required.
In some embodiments, the motion signal may include an electromyographic signal of a particular part of the user's body and a gesture signal of that particular part. The electromyographic signal and the gesture signal can reflect the motion state of the particular body part from different angles. Briefly, the gesture signal of a particular part of the user's body may reflect the action type, action amplitude, action frequency, etc. of that part, while the electromyographic signal may reflect the muscle state of that part during the action. In some embodiments, by using the electromyographic signal and/or the gesture signal of the same body part, whether the action of that part meets the standard can be assessed more accurately.
In step 520, the action of the user during movement is monitored based at least on the characteristic information corresponding to the electromyographic signal or the characteristic information corresponding to the gesture signal.
In some embodiments, this step may be performed by processing module 220 and/or processing device 110. In some embodiments, the characteristic information corresponding to the electromyographic signal may include, but is not limited to, one or more of frequency information, amplitude information, and the like. The characteristic information corresponding to the gesture signal refers to parameter information used to represent the relative motion of the user's body. In some embodiments, the characteristic information corresponding to the gesture signal may include, but is not limited to, one or more of an angular velocity direction, an angular velocity value, an acceleration value of the angular velocity, and the like. In some embodiments, the characteristic information corresponding to the gesture signal may also include angle, displacement information (e.g., the stretched length of a strain gauge sensor), stress, and the like. For example, when the attitude sensor is a strain gauge sensor, it may be disposed at a joint of the user; by measuring how its resistance varies with its stretched length, the acquired attitude signals may be displacement information, stress, and the like, through which the bending angle and bending direction at the user's joint can be represented. In some embodiments, the processing module 220 and/or the processing device 110 may extract the characteristic information corresponding to the electromyographic signal (e.g., frequency information, amplitude information) or the characteristic information corresponding to the gesture signal (e.g., angular velocity direction, angular velocity value, acceleration value of the angular velocity, angle, displacement information, stress, etc.), and monitor the user's action based on this characteristic information. Monitoring the action of the user during movement here includes monitoring information related to the user's action. In some embodiments, the action-related information may include one or more of the user's action type, number of actions, action quality (e.g., whether the user's action meets the standard), action time, and the like. The action type refers to the fitness action taken by the user while exercising. In some embodiments, the action type may include, but is not limited to, one or more of sitting posture chest clipping, squat, deadlift, plank, running, swimming, and the like. The number of actions refers to the number of times an action is performed during the user's movement. For example, if the user performs the sitting posture chest clipping action 10 times during exercise, 10 is the number of actions. Action quality refers to how close the exercise action performed by the user is to a standard exercise action. For example, when the user performs a squat, the processing device 110 may determine the action type of the user's action based on the characteristic information corresponding to the action signals (electromyographic signals and attitude signals) of specific muscle positions (gluteus maximus, quadriceps femoris, etc.), and determine the action quality of the user's squat based on the action signal of a standard squat. The action time refers to the time corresponding to one or more action types of the user, or the total time of the movement process.
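The following is a minimal, illustrative sketch of extracting the amplitude and frequency characteristic information mentioned above from one window of EMG samples; the 1000 Hz sampling rate and the choice of RMS amplitude and mean power frequency are assumptions made for illustration, not the specific features prescribed by the embodiments.

```python
# Illustrative sketch: amplitude and frequency features of one EMG window.
import numpy as np

def emg_features(window: np.ndarray, fs: float = 1000.0) -> dict:
    """Return simple amplitude and frequency features for an EMG window."""
    rms_amplitude = float(np.sqrt(np.mean(window ** 2)))
    spectrum = np.abs(np.fft.rfft(window - window.mean())) ** 2
    freqs = np.fft.rfftfreq(window.size, d=1.0 / fs)
    mean_power_freq = float(np.sum(freqs * spectrum) / np.sum(spectrum))
    return {"rms_amplitude": rms_amplitude, "mean_power_frequency": mean_power_freq}

# Example: a synthetic 0.5 s window dominated by an 80 Hz component.
t = np.arange(0, 0.5, 1.0 / 1000.0)
window = 0.3 * np.sin(2 * np.pi * 80 * t) + 0.01 * np.random.randn(t.size)
print(emg_features(window))
```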
For details on monitoring the user's actions based on the characteristic information corresponding to the electromyographic signal and/or the characteristic information corresponding to the gesture signal, reference may be made to fig. 6 of the present application and the related description thereof.
In some embodiments, the processing device 110 may utilize one or more motion recognition models to recognize and monitor the user's actions. For example, the processing device 110 may input the characteristic information corresponding to the electromyographic signals and/or the characteristic information corresponding to the gesture signals into a motion recognition model, and the motion recognition model outputs the information related to the user's actions. In some embodiments, the motion recognition models may include different types of models, such as a model for recognizing the type of the user's action, a model for recognizing the quality of the user's action, and the like.
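As a hedged sketch of the "features in, action information out" flow described above, the snippet below uses a trivial nearest-centroid classifier purely as a stand-in for whatever motion recognition model the embodiments actually employ; the labels and feature layout are illustrative assumptions.

```python
# Placeholder "motion recognition model": a nearest-centroid classifier over
# labeled feature vectors. Not the model used by the embodiments.
import numpy as np

class NearestCentroidActionModel:
    def __init__(self):
        self.centroids = {}

    def fit(self, features_by_action):
        # features_by_action: action type -> (n_samples, n_features) array
        for action, feats in features_by_action.items():
            self.centroids[action] = feats.mean(axis=0)

    def predict(self, feature_vector: np.ndarray) -> str:
        # Return the action whose centroid is closest to the feature vector.
        return min(self.centroids,
                   key=lambda a: np.linalg.norm(feature_vector - self.centroids[a]))

model = NearestCentroidActionModel()
model.fit({
    "sitting posture chest clipping": np.array([[0.40, 85.0], [0.45, 90.0]]),
    "squat":                          np.array([[0.70, 60.0], [0.65, 55.0]]),
})
print(model.predict(np.array([0.43, 88.0])))  # -> "sitting posture chest clipping"
```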
It should be noted that the above description of the process 500 is for purposes of illustration and description only, and is not intended to limit the scope of applicability of the present disclosure. Various modifications and changes to flow 500 will be apparent to those skilled in the art in light of the present description; however, such modifications and variations are still within the scope of the present description. For example, the extraction of the characteristic information corresponding to the electromyographic signal or the characteristic information corresponding to the gesture signal in step 520 may be performed by the processing device 110, and in some embodiments may also be performed by the processing module 220. As another example, the user's motion signal is not limited to the electromyographic signal, the posture signal, the electrocardiographic signal, the temperature signal, the humidity signal, the blood oxygen concentration, and the respiratory rate; other physiological parameter signals of the human body that are related to the user's movement may also be regarded as motion signals in the embodiments of the present specification.
FIG. 6 is an exemplary flow chart of monitoring a user's movement actions according to some embodiments of the application. As shown in fig. 6, the flow 600 may include:
in step 610, the motion signal is segmented based on the characteristic information corresponding to the electromyographic signal or the characteristic information corresponding to the gesture signal.
In some embodiments, this step may be performed by processing module 220 and/or processing device 110. The acquisition of motion signals (e.g., electromyographic signals, gesture signals) while the user is moving is a continuous process, and the user's movement may be a combination of multiple sets of actions or a combination of actions of different action types. To analyze each action in the user's movement, the processing module 220 may segment the user's motion signal based on the characteristic information corresponding to the electromyographic signal or the characteristic information corresponding to the gesture signal. Segmenting the motion signal refers to dividing the motion signal into signal segments of the same or different durations, or extracting one or more signal segments of specific durations from the motion signal. In some embodiments, each segment of the motion signal may correspond to one or more complete actions of the user. For example, when the user performs a squat, going from a standing posture down to a squatting posture and then returning to the standing posture may be regarded as one complete action; the motion signal collected by the acquisition module 210 during this process may be regarded as one segment (or one cycle) of the motion signal, and the motion signal collected by the acquisition module 210 when the user completes the next squat is regarded as another segment of the motion signal. In some embodiments, each segment of the motion signal may also correspond to a partial action of the user, where a partial action may be understood as part of one complete action. For example, when the user performs a squat, going from the standing posture down to the squatting posture may be regarded as one partial action, and rising from the squatting posture back to the standing posture may be regarded as another partial action. The electromyographic signals and posture signals of the corresponding body parts change with each stage of the action as the user moves. For example, when the user performs a squat action, the fluctuation of the electromyographic signals and posture signals at the corresponding muscles of the relevant body parts (for example, the arms, legs, buttocks, and abdomen) is small while the user is standing; when the user squats down from the standing posture, the electromyographic signals and posture signals at the corresponding muscles fluctuate more strongly, for example, the amplitude information corresponding to signals of different frequencies in the electromyographic signal becomes larger, and the angular velocity value, angular velocity direction, acceleration value of the angular velocity, angle, displacement information, stress, etc. corresponding to the posture signal also change. When the user rises from the squatting state to the standing state, the amplitude information corresponding to the electromyographic signal and the angular velocity value, angular velocity direction, acceleration value, angle, displacement information, and stress corresponding to the gesture signal change again. Based on this, the processing module 220 may segment the user's motion signal based on the characteristic information corresponding to the electromyographic signal or the characteristic information corresponding to the gesture signal.
For details on the segmentation of the motion signal based on the characteristic information corresponding to the electromyographic signal or the characteristic information corresponding to the gesture signal, reference may be made to fig. 7 and 8 of the present specification and the description thereof.
In step 620, the motion of the user motion is monitored based on at least one segment of the motion signal.
This step may be performed by processing module 220 and/or processing device 110. In some embodiments, monitoring the motion of the user movement based on the at least one segment of motion signal may include determining a type of motion of the user movement based on matching the at least one segment of motion signal with the at least one segment of preset motion signal. At least one section of preset action signal refers to standard action signals corresponding to different actions preset in a database. In some embodiments, the action type of the user during movement can be determined by judging the matching degree of at least one section of action signal and at least one section of preset action signal. Further, whether the matching degree of the action signal and the preset action signal is within a first matching threshold range (for example, greater than 80%) is judged, and if so, the action type of the user during movement is determined according to the action type corresponding to the preset action signal. In some embodiments, monitoring the motion of the user movement based on the at least one segment of motion signal may further include determining a type of motion of the user movement based on matching the characteristic information corresponding to the at least one segment of electromyographic signal with the characteristic information corresponding to the electromyographic signal in the at least one segment of preset motion signal. For example, the matching degree of one or more characteristic information (such as frequency information and amplitude information) in a section of electromyographic signal and one or more characteristic information in a section of preset action signal is calculated respectively, whether the weighted matching degree or average matching degree of the one or more characteristic information is within a first matching threshold value range is judged, and if yes, the action type of the user during movement is determined according to the action type corresponding to the preset action signal. In some embodiments, monitoring the motion of the user motion based on the at least one segment of motion signal may further include determining a type of motion of the user motion based on matching the feature information corresponding to the at least one segment of gesture signal with the feature information corresponding to the gesture signal in the at least one segment of preset motion signal. For example, matching degrees of one or more characteristic information (for example, an angular velocity value, an angular velocity direction, an acceleration value of an angular velocity, an angle, displacement information, stress and the like) in a section of gesture signal and one or more characteristic information in a section of preset action signal are respectively calculated, whether the weighted matching degree or the average matching degree of the one or more characteristic information is within a first matching threshold range is judged, and if so, the action type when the user moves is determined according to the action type corresponding to the preset action signal. 
In some embodiments, monitoring the motion of the user movement based on the at least one motion signal may further include determining a type of the motion of the user when the user moves based on matching the characteristic information corresponding to the electromyographic signal and the characteristic information corresponding to the gesture signal in the at least one motion signal with the characteristic information corresponding to the electromyographic signal and the characteristic information corresponding to the gesture signal in the at least one preset motion signal.
In some embodiments, monitoring the motion of the user movement based on the at least one motion signal may include determining a quality of the motion of the user movement based on matching the at least one motion signal with at least one preset motion signal. Further, if the matching degree of the motion signal and the preset motion signal is within the second matching threshold range (for example, greater than 90%), the motion quality of the user during the motion meets the standard. In some embodiments, determining the motion of the user movement based on the at least one segment of the motion signal may include determining a quality of the motion of the user movement based on matching one or more characteristic information in the at least one segment of the motion signal with one or more characteristic information in the at least one segment of the preset motion signal. It should be noted that a segment of motion signal may be a motion signal of a complete motion or a motion signal of a partial motion in a complete motion. In some embodiments, for a complex complete motion, there are different ways of applying force at different phases of the complete motion, that is, there are different motion signals at different phases of the motion, and the real-time performance of the motion monitoring of the user can be improved by monitoring the motion signals at different phases of the complete motion.
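The two matching thresholds described above (a first threshold of about 80% for recognizing the action type and a second threshold of about 90% for judging the action quality) can be sketched as follows. The embodiments do not fix a particular similarity measure, so a normalized correlation of resampled segments is used here purely for illustration.

```python
# Hedged sketch of the two-threshold matching logic between a user action
# segment and a preset (standard) action signal.
import numpy as np

FIRST_MATCH_THRESHOLD = 0.80   # action type recognized above this
SECOND_MATCH_THRESHOLD = 0.90  # action quality considered up to standard above this

def matching_degree(segment: np.ndarray, preset: np.ndarray) -> float:
    """Similarity in [0, 1] between a user segment and a preset action signal."""
    # Resample the user segment to the preset length so the two can be compared.
    x = np.interp(np.linspace(0, 1, preset.size),
                  np.linspace(0, 1, segment.size), segment)
    x = (x - x.mean()) / (x.std() + 1e-12)
    y = (preset - preset.mean()) / (preset.std() + 1e-12)
    corr = float(np.dot(x, y) / x.size)   # Pearson correlation, in [-1, 1]
    return max(0.0, corr)                 # clip anti-correlation to 0

def assess(segment: np.ndarray, preset: np.ndarray, action_type: str) -> None:
    degree = matching_degree(segment, preset)
    if degree >= SECOND_MATCH_THRESHOLD:
        print(f"{action_type}: recognized, quality meets the standard ({degree:.2f})")
    elif degree >= FIRST_MATCH_THRESHOLD:
        print(f"{action_type}: recognized, quality below standard ({degree:.2f})")
    else:
        print(f"no match for {action_type} ({degree:.2f})")

preset = np.sin(np.linspace(0, np.pi, 200))
assess(preset + 0.05 * np.random.randn(200), preset, "squat")
```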
It should be noted that the above description of the process 600 is for purposes of example and illustration only and is not intended to limit the scope of applicability of the present disclosure. Various modifications and changes to flow 600 will be apparent to those skilled in the art in light of the present description. However, such modifications and variations are still within the scope of the present description. For example, in some embodiments, the user's actions may also be determined by an action recognition model or a manually preset model.
Fig. 7 is an exemplary flow chart of action signal segmentation according to some embodiments of the present application. As shown in fig. 7, the flow 700 may include:
in step 710, at least one target feature point is determined from a time domain window of the electromyographic signal or the gesture signal according to a preset condition.
In some embodiments, this step may be performed by processing module 220 and/or processing device 110. The time domain window of the electromyographic signal comprises the electromyographic signal within a time range, and the time domain window of the gesture signal comprises the gesture signal within the same time range. A target feature point refers to a signal point with a target feature in the action signal, which can characterize the stage that the user's action is in. For example, when the user performs sitting posture chest clipping, initially the user's arms extend left and right in the horizontal direction, then the arms start to rotate inwards, then the arms are brought together, and finally the arms return to the extended state in the horizontal direction; this process is one complete sitting posture chest clipping action. When the user performs the sitting posture chest clipping action, the characteristic information corresponding to the electromyographic signal or the posture signal is different at each stage, and the target feature point corresponding to the stage that the user's action is in can be determined by analyzing the characteristic information corresponding to the electromyographic signal (for example, amplitude information, frequency information) or the characteristic information corresponding to the posture signal (for example, angular velocity value, angular velocity direction, acceleration value of the angular velocity, angle, displacement information, stress, etc.). In some embodiments, one or more target feature points may be determined from within the time domain window according to preset conditions. In some embodiments, the preset conditions may include one or more of: the angular velocity direction corresponding to the gesture signal changes, the angular velocity corresponding to the gesture signal is greater than or equal to an angular velocity threshold, the angle corresponding to the gesture signal reaches an angle threshold, the change value of the angular velocity value corresponding to the gesture signal is an extremum, and the amplitude information corresponding to the electromyographic signal is greater than or equal to an electromyographic threshold. In some embodiments, the target feature points of different stages of an action may correspond to different preset conditions. For example, in the sitting posture chest clipping action, the preset condition for the target feature point at which the user's arms are extended left and right in the horizontal direction and start to rotate inwards is different from the preset condition for the target feature point at which the arms are brought together. In some embodiments, the target feature points of different actions may correspond to different preset conditions. For example, the sitting posture chest clipping and the biceps curl are different actions, and the preset conditions corresponding to the target feature points in the two actions are also different. For the preset conditions, reference may be made, for example, to the descriptions of the action start point, the action intermediate point, and the action end point in this specification.
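The preset conditions listed above can be expressed as simple boolean checks on adjacent samples of the gesture and EMG streams, as in the hedged sketch below; the threshold values are placeholders rather than values from the embodiments, which choose them per action and per stage.

```python
# Illustrative checks for the preset conditions used to pick target feature points.
import numpy as np

# Placeholder thresholds; the embodiments choose these per action and per stage.
ANGULAR_VELOCITY_THRESHOLD = 10.0   # deg/s
ANGLE_THRESHOLD = 20.0              # deg
EMG_THRESHOLD = 0.5                 # normalized EMG amplitude

def satisfies(condition: str, i: int, angle, ang_vel, emg) -> bool:
    """Check one of the preset conditions at sample index i."""
    if condition == "direction_change":
        return ang_vel[i - 1] * ang_vel[i + 1] < 0
    if condition == "angular_velocity_reached":
        return abs(ang_vel[i]) >= ANGULAR_VELOCITY_THRESHOLD
    if condition == "angle_reached":
        return angle[i] >= ANGLE_THRESHOLD
    if condition == "emg_amplitude_reached":
        return emg[i] >= EMG_THRESHOLD
    raise ValueError(condition)

# A target feature point for a given action stage is a sample that meets the
# preset condition(s) configured for that stage, e.g.:
angle   = np.array([0.0, 5.0, 12.0, 22.0, 30.0])
ang_vel = np.array([0.0, 8.0, 15.0, 9.0, -2.0])
emg     = np.array([0.1, 0.2, 0.6, 0.7, 0.3])
print([i for i in range(1, 4) if satisfies("angle_reached", i, angle, ang_vel, emg)])
```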
In other embodiments, at least one target feature point may also be determined, according to preset conditions, from the time domain window of the electromyographic signal and the gesture signal. The time domain window of the electromyographic signal and the gesture signal corresponds to a time range containing both signals, and the time of the electromyographic signal corresponds to the time of the gesture signal. For example, the point in time at which the user starts to move is the same in the electromyographic signal and in the gesture signal. Here, the target feature point may be determined by combining the characteristic information corresponding to the electromyographic signal (e.g., amplitude information) and the characteristic information corresponding to the attitude signal (e.g., angular velocity value, angular velocity direction, acceleration value of the angular velocity, angle, etc.).
In step 720, the action signal is segmented based on the at least one target feature point.
In some embodiments, this step 720 may be performed by the processing module 220 and/or the processing device 110. In some embodiments, the target feature points in the electromyographic or gesture signals may be one or more, through which the motion signal may be divided into multiple segments. For example, when there is one target feature point in the electromyographic signals, the target feature point may divide the electromyographic signals into two segments, where the two segments may include the electromyographic signals before the target feature point and the electromyographic signals after the target feature point. Alternatively, the processing module 220 and/or the processing device 110 may extract the electromyographic signals within a time range around the target feature point as a segment of the electromyographic signals. For another example, when the electromyographic signal has a plurality of target feature points (e.g., n, and the first target feature point is not the start point of the time domain window and the nth target feature point is not the end point of the time domain window), the electromyographic signal may be divided into n+1 segments according to the n target feature points. For another example, when the electromyographic signal has a plurality of target feature points (e.g., n, and the first target feature point is the start point of the time domain window and the nth target feature point is not the end point of the time domain window), the electromyographic signal may be divided into n segments according to the n target feature points. For another example, when the electromyographic signal has a plurality of target feature points (e.g., n, and the first target feature point is the start point of the time domain window and the nth target feature point is the end point of the time domain window), the electromyographic signal may be divided into n-1 segments according to the n target feature points. It should be noted that the motion phases corresponding to the target feature points may include one or more types, and when the motion phases corresponding to the target feature points are multiple, the motion signals may be segmented by using the multiple types of target feature points as references. For example, the motion phase corresponding to the target feature point may include a motion start point and a motion end point, the motion start point being before the motion end point, and the motion signal between the motion start point and the next motion start point may be regarded as a segment of the motion signal.
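A minimal sketch of splitting a signal at target feature points, reproducing the segment counts described above (n+1 segments in general, fewer when a feature point coincides with the start or end of the time domain window), might look as follows; it assumes the feature points are given as sample indices.

```python
# Minimal sketch: split an action signal at target feature point indices.
import numpy as np

def segment_signal(signal: np.ndarray, feature_points) -> list:
    """Split `signal` at the given sample indices."""
    # np.split yields empty leading/trailing pieces when an index sits at the
    # window start or end; discarding them reproduces the n+1 / n / n-1
    # segment counts discussed above.
    pieces = np.split(signal, sorted(feature_points))
    return [p for p in pieces if p.size > 0]

signal = np.arange(10)
print([p.tolist() for p in segment_signal(signal, [3, 7])])   # 3 segments (n + 1)
print([p.tolist() for p in segment_signal(signal, [0, 7])])   # 2 segments (n)
```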
In some embodiments, the target feature points may include one or more of an action start point, an action intermediate point, or an action end point.
For describing the segmentation of the motion signal, the case where the target feature points include an action start point, an action intermediate point, and an action end point at the same time is taken as an exemplary description, where the action start point may be regarded as the start of one cycle of the user's action. In some embodiments, different actions may correspond to different preset conditions. For example, in the sitting posture chest clipping action, the preset condition may be that the angular velocity direction after the action start point changes with respect to the angular velocity direction before the action start point, the angular velocity value at the action start point is approximately 0, and the acceleration value of the angular velocity at the action start point is greater than 0. That is, when the user performs sitting posture chest clipping, the action start point may be set as the point in time at which the arms are extended horizontally and start to rotate inwards. For another example, in the biceps curl, the preset condition may be that the lift angle of the arm is greater than or equal to an angle threshold. Specifically, when the user performs a biceps curl, the lift angle is 0° when the user's arm is horizontal, negative when the arm hangs down, and positive when the arm is raised. When the user's arm is raised from the horizontal position, the lift angle is greater than 0. The point in time at which the lift angle of the user's arm reaches the angle threshold may be regarded as the action start point. The angle threshold may be in the range of -70° to 20°. In some embodiments, to further ensure the accuracy of the selected action start point, the preset condition may further include: the angular velocity of the arm within a certain time range after the action start point is greater than or equal to an angular velocity threshold. The angular velocity threshold may range from 5°/s to 50°/s. For example, when the user performs a biceps curl, the arm passes the angle threshold and continues to be lifted upward, and the angular velocity of the arm within a specific time range (for example, 0.05 s, 0.1 s, 0.5 s) remains greater than the angular velocity threshold. In some embodiments, if the angular velocity of an action start point selected according to the preset condition is less than the angular velocity threshold within the specific time range, the search according to the preset condition continues until an action start point is determined.
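A hedged sketch of the start-point rule described above for the biceps curl is shown below: the lift angle crosses an angle threshold and the angular velocity then stays at or above an angular velocity threshold for a short hold time. The concrete values are picked from inside the stated ranges (angle threshold -70° to 20°, angular velocity threshold 5°/s to 50°/s) purely as examples, as is the assumed attitude sampling rate.

```python
# Illustrative action start point detection for a biceps curl.
import numpy as np
from typing import Optional

ANGLE_THRESHOLD_DEG = 0.0          # example value from the -70°..20° range
ANGULAR_VELOCITY_THRESHOLD = 20.0  # °/s, example value from the 5..50 °/s range
HOLD_SAMPLES = 5                   # e.g. 0.1 s at an assumed 50 Hz attitude rate

def find_action_start(angle_deg: np.ndarray, ang_vel_deg_s: np.ndarray) -> Optional[int]:
    """Return the index of the first sample that qualifies as an action start point."""
    for i in range(len(angle_deg) - HOLD_SAMPLES):
        crossed = angle_deg[i] >= ANGLE_THRESHOLD_DEG
        sustained = np.all(ang_vel_deg_s[i:i + HOLD_SAMPLES] >= ANGULAR_VELOCITY_THRESHOLD)
        if crossed and sustained:
            return i
    return None  # no start point in this window; keep searching the next one

angle = np.array([-30, -10, 2, 6, 12, 20, 28, 35, 40, 44, 47], dtype=float)
vel = np.array([5, 15, 25, 30, 32, 35, 30, 28, 22, 15, 8], dtype=float)
print(find_action_start(angle, vel))  # -> 2
```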
In some embodiments, the action intermediate point may be a point within one action cycle starting from the action start point. For example, when the user performs sitting posture chest clipping, the action start point may be set as the point in time at which the arms are extended horizontally and start to rotate inwards, and the point in time at which the arms are brought together may be set as the user's action intermediate point. In some embodiments, the preset condition may be that the angular velocity direction at a point in time after the action intermediate point changes with respect to the angular velocity direction at a point in time before the action intermediate point, and the angular velocity value at the action intermediate point is approximately 0, where the angular velocity direction at the action intermediate point is opposite to the angular velocity direction at the action start point. In some embodiments, to improve the accuracy of selecting the action intermediate point, the rate of change of the angular velocity (the acceleration of the angular velocity) within a first specific time range (e.g., 0.05 s, 0.1 s, 0.5 s) after the action intermediate point may be required to be greater than an acceleration threshold of the angular velocity (e.g., 0.05 rad/s). In some embodiments, while the action intermediate point satisfies the preset condition, the amplitude information corresponding to the action intermediate point in the electromyographic signal is greater than an electromyographic threshold. Because the electromyographic signals corresponding to different actions differ, the electromyographic threshold is related to the user's action and the target electromyographic signal. In the sitting posture chest clipping, the electromyographic signal at the pectoral muscle is the target electromyographic signal. In some embodiments, the position corresponding to the action intermediate point (which may also be called the "intermediate position") may be approximately regarded as the point of maximum muscle force, where the electromyographic signal may have a relatively large value. When the user performs the corresponding exercise, the electromyographic signal at the corresponding part of the user's body increases greatly relative to the electromyographic signal at that part when the user is not exercising (the muscle at that part may be regarded as being in a resting state); for example, the amplitude of the electromyographic signal at the corresponding part when the user's action reaches the intermediate position may be 10 times that in the resting state. In addition, for different types of actions performed by the user, the amplitude relationship between the electromyographic signal of the corresponding part at the intermediate position (action intermediate point) and the electromyographic signal in the resting state also differs, and this relationship can be adaptively adjusted according to the actual action. In some embodiments, to improve the accuracy of selecting the action intermediate point, the corresponding amplitude within a second specific time range (e.g., 0.05 s, 0.1 s, 0.5 s) after the action intermediate point may be required to remain greater than the electromyographic threshold.
In some embodiments, in addition to the above-mentioned preset conditions (e.g., the angular velocity condition and the amplitude condition of the electromyographic signal), the determination of the action intermediate point may also require that the Euler angle (also referred to as the angle) between the action intermediate point and the start position satisfies a certain condition. For example, in the sitting posture chest clipping, the Euler angle of the action intermediate point relative to the action start point may be required to exceed one or more Euler angle thresholds (also referred to as angle thresholds); for example, with the front-back direction of the human body as the X-axis, the left-right direction as the Y-axis, and the height direction as the Z-axis, the Euler angle change in the X and Y directions may be less than 25°, and the Euler angle change in the Z direction may be greater than 40° (the sitting posture chest clipping action is mainly a rotation about the Z-axis; the above parameters are merely reference examples). In some embodiments, the electromyographic threshold and/or the Euler angle threshold may be pre-stored in a memory or hard disk of the wearable device 130, may be stored in the processing device 110, or may be calculated according to the actual situation and adjusted in real time.
In some embodiments, the processing module 220 may determine the action intermediate point, according to the preset conditions, from the part of the time domain window of the electromyographic signal or the gesture signal after the action start point. In some embodiments, after the action intermediate point is determined, it may be re-verified whether there are other time points meeting the preset condition within the time range from the action start point to the action intermediate point; if so, the action start point closest to the action intermediate point is selected as the optimal action start point. In some embodiments, if the difference between the time of the action intermediate point and the time of the action start point is greater than a specific time threshold (e.g., 1/2 or 2/3 of one action cycle), the action intermediate point is invalid, and the action start point and the action intermediate point are re-determined according to the preset conditions.
In some embodiments, the action end point may be within one action cycle from the action start point and at some point in time after the action intermediate point, for example, the action end point may be set to a point one action cycle from the action start point, at which point the action end point may be considered as the end point of one action cycle for the user. For example, when the user performs sitting posture chest clipping, the start point of the motion may be set to be the time point when the arms are extended horizontally and start pronation, the time point when the arms are closed may be the middle point of the motion of the user, and the time point when the arms are restored to the extended state again in the horizontal direction may correspond to the end point of the motion of the user. In some embodiments, the preset condition may be that a change value of the angular velocity value corresponding to the gesture signal is an extremum. In some embodiments, to prevent jitter misinterpretation, the variation of the euler angle should exceed a certain euler angle threshold, e.g. 20 °, in the time range from the middle point of action to the end point of action. In some embodiments, the processing module 220 may determine the action end point from a time domain window after the action intermediate point according to a preset condition based on the time domain window of the electromyographic signal and the gesture signal. In some embodiments, if the difference between the time of the action end point and the time of the action intermediate point is greater than a specific time threshold (e.g., 1/2 of one action cycle), then both the action start point and the action intermediate point are invalid, and then the action start point, the action intermediate point, and the action end point are determined again according to preset conditions.
In some embodiments, the determination of at least one set of action start point, action intermediate point, and action end point in the motion signal may be repeated, and the motion signal may be segmented using the at least one set of action start point, action intermediate point, and action end point as target feature points. This step may be performed by processing module 220 and/or processing device 110. It should be noted that the segmentation of the motion signal is not limited to the action start point, action intermediate point, and action end point described above, and may also include other time points. For example, for the sitting posture chest clipping action, 5 time points may be selected according to the above steps: the first time point may be the action start point, the second time point may be the point at which the inward-rotation angular velocity is maximum, the third time point may be the action intermediate point, the fourth time point may be the point at which the outward-rotation angular velocity is maximum, and the fifth time point may be the point at which the arms return to the left-right extended state and the angular velocity is 0, that is, the action end point. Compared with the action start point, action intermediate point, and action end point in the above steps, this example adds the second time point as a 1/4 mark of the action cycle, uses the action end point described in the foregoing embodiment as the fourth time point marking the 3/4 position of the action cycle, and adds the fifth time point as the end point of the complete action. For the sitting posture chest clipping action, using more time points allows the identification of the action quality to be completed based on the signals of the first 3/4 of the action cycle (that is, identifying the action quality of a single cycle does not depend on analyzing the signals of the whole cycle), so monitoring and feedback of the user's action can be completed before the action of the current cycle has finished; meanwhile, all signals of the whole action process can still be completely recorded for uploading to a cloud or mobile terminal device, so that more methods can be adopted to monitor the user's actions. For more complex actions with a long action cycle and different force-exertion modes at each stage, in some embodiments, the above method for determining each time point can be used to divide the action into multiple stages, and the signals of each stage can be recognized and fed back independently, thereby improving the real-time performance of user action feedback.
It should be noted that, the above-mentioned steps of segmenting and monitoring the motion signal according to the motion start point, the motion intermediate point, and the motion end point as a set of target feature points are merely illustrative, and in some embodiments, the motion signal of the user may be segmented and monitored based on any one or more of the motion start point, the motion intermediate point, and the motion end point as the target feature points. For example, the motion signal may be segmented and monitored with the motion start point as the target feature point. For another example, the motion signal may be segmented and monitored by using the motion start point and the motion end point as a set of target feature points, and other time points or time ranges that can function as the target feature points are all within the protection scope of the present specification.
It should be noted that the above description of the process 700 is for purposes of illustration and description only, and is not intended to limit the scope of applicability of the present disclosure. Various modifications and changes to flow 700 will be apparent to those skilled in the art in light of the present description; however, such modifications and variations are still within the scope of the present description. For example, steps 710 and 720 may be performed simultaneously in the processing module 220. As another example, steps 710 and 720 may be performed simultaneously, in the processing module 220 and the processing device 110, respectively.
Fig. 8 is a schematic diagram of action signal segmentation according to some embodiments of the application. The abscissa in fig. 8 may represent the time of the user's exercise, and the ordinate may represent the amplitude information of the electromyographic signal at the corresponding muscle part (e.g., pectoralis major) during the user's sitting posture chest clipping exercise. Fig. 8 also includes an angular velocity curve and an Euler angle curve corresponding to the posture signal at the wrist during the user's movement; the angular velocity curve represents the change in the user's speed during movement, and the Euler angle curve represents the position of the body part during movement. As shown in fig. 8, point A1 is determined as the action start point according to the preset conditions. Specifically, the angular velocity direction at a point in time after the user's action start point A1 changes with respect to the angular velocity direction at a point in time before the action start point A1. Further, the angular velocity value at the action start point A1 is approximately 0, and the acceleration value of the angular velocity at the action start point A1 is greater than 0.
Referring to fig. 8, the point B1 is determined as an action intermediate point according to a preset condition. Specifically, the angular velocity direction of the user at a point in time after the action intermediate point B1 is changed with respect to the angular velocity direction of the user at a point in time before the action intermediate point B1, the angular velocity value of the action intermediate point B1 being approximately 0, wherein the angular velocity direction of the action intermediate point B1 is opposite to the angular velocity direction of the action start point A1. In addition, the amplitude corresponding to the action intermediate point B1 in the myoelectric signal (shown as "myoelectric signal" in fig. 8) is larger than the myoelectric threshold.
With continued reference to fig. 8, point C1 is determined as the action end point according to the preset conditions. Specifically, the change value of the angular velocity value at the action end point C1 is the extremum over the range from the action start point A1 to the action end point C1. In some embodiments, flow 700 may complete the action segmentation illustrated in fig. 8, and the motion signal from the action start point A1 to the action end point C1 shown in fig. 8 may be regarded as one segment of the user's action.
It should be noted that, in some embodiments, if the time interval between the action intermediate point and the action start point is greater than a specific time threshold (e.g., 1/2 of one action cycle), the processing module 220 may re-determine the action start point to ensure the accuracy of the action segmentation. The specific time threshold may be stored in a memory or hard disk of the wearable device 130, may be stored in the processing device 110, or may be calculated or adjusted according to the actual situation of the user's movement. For example, if the time interval between the action start point A1 and the action intermediate point B1 in fig. 8 is greater than the specific time threshold, the processing module 220 may re-determine the action start point, so that the accuracy of the action segmentation can be improved. The segmentation of the motion signal is not limited to the action start point A1, the action intermediate point B1, and the action end point C1, and may include other time points; the time points may be selected according to the complexity of the action.
When the user's motion signal is acquired, other physiological parameter information of the user (for example, a heart rate signal) and external conditions such as relative movement or pressing between the acquisition module 210 and the human body during the movement may affect the quality of the motion signal, for example, by causing abrupt changes in the electromyographic signal, thereby affecting the monitoring of the user's action. For ease of description, such abrupt electromyographic signals may be described in terms of singular points; exemplary singular points may include spike signals, discontinuous signals, and the like. In some embodiments, monitoring the user's action based at least on the characteristic information corresponding to the electromyographic signal or the characteristic information corresponding to the gesture signal may further include: preprocessing the electromyographic signal in the frequency domain or the time domain, acquiring the characteristic information corresponding to the electromyographic signal based on the preprocessed electromyographic signal, and monitoring the user's action according to the characteristic information corresponding to the electromyographic signal or the characteristic information corresponding to the gesture signal. In some embodiments, preprocessing the electromyographic signal in the frequency domain or the time domain may include filtering the electromyographic signal in the frequency domain to select or preserve components of a particular frequency range. In some embodiments, the frequency range of the electromyographic signals acquired by the acquisition module 210 is 1 Hz-1000 Hz; these signals may be filtered, and electromyographic signals of a particular frequency range (e.g., 30 Hz-150 Hz) may be selected from them for subsequent processing. In some embodiments, the particular frequency range may be 10 Hz-500 Hz. In some embodiments, the filtering process may include a low-pass filtering process. In some embodiments, the low-pass filter may include an LC passive filter, an RC active filter, or a passive filter composed of special elements. In some embodiments, the passive filter composed of special elements may include one or more of a piezoelectric ceramic filter, a crystal filter, and a surface acoustic wave filter. It should be noted that the particular frequency range is not limited to the above ranges and may be other ranges selected according to the actual situation. For details on monitoring the user's action according to the characteristic information corresponding to the electromyographic signal or the characteristic information corresponding to the gesture signal, reference may be made to fig. 5 and 6 of the present application and the related description.
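As an illustrative sketch of the frequency-domain preprocessing step (keeping, say, the 30 Hz-150 Hz components of an EMG signal sampled at an assumed 1000 Hz), a Butterworth filter from SciPy can be used as a stand-in for whichever filter implementation the embodiments actually employ:

```python
# Illustrative frequency-domain preprocessing: keep a specific EMG frequency band.
import numpy as np
from scipy.signal import butter, sosfiltfilt

def keep_band(emg: np.ndarray, fs: float = 1000.0,
              low_hz: float = 30.0, high_hz: float = 150.0) -> np.ndarray:
    """Retain the low_hz..high_hz components of the EMG signal."""
    sos = butter(4, [low_hz, high_hz], btype="bandpass", fs=fs, output="sos")
    return sosfiltfilt(sos, emg)

# Example: a 50 Hz component inside the band survives, a 300 Hz one is attenuated.
t = np.arange(0, 1.0, 1.0 / 1000.0)
raw = np.sin(2 * np.pi * 50 * t) + np.sin(2 * np.pi * 300 * t)
filtered = keep_band(raw)
```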
In some embodiments, preprocessing the electromyographic signal in the frequency domain or the time domain may further include performing signal correction processing on the electromyographic signal in the time domain. Signal correction processing refers to correcting singular points (e.g., spike signals, discontinuous signals, etc.) in the electromyographic signal. In some embodiments, performing signal correction processing on the electromyographic signal in the time domain may comprise determining the singular points in the electromyographic signal, i.e., determining the abrupt changes in the electromyographic signal. A singular point may occur when the amplitude of the electromyographic signal changes abruptly at a certain moment, resulting in a discontinuity in the signal. As another example, the electromyographic signal may be morphologically smooth with no abrupt change in amplitude, but its first-order derivative changes abruptly and is discontinuous. In some embodiments, the method of determining the singular points in the electromyographic signal may include, but is not limited to, one or more of Fourier transform, wavelet transform, fractal dimension, and the like. In some embodiments, performing signal correction processing on the electromyographic signal in the time domain may include removing the singular points in the electromyographic signal, for example, deleting the signal within the singular point and a range around it. Alternatively, performing signal correction processing on the electromyographic signal in the time domain may include correcting the singular points of the electromyographic signal according to the characteristic information of the electromyographic signal within a specific time range, for example, adjusting the amplitude of a singular point according to the signals around it. In some embodiments, the characteristic information of the electromyographic signal may include one or more of amplitude information and statistical information of the amplitude information. The statistical information of the amplitude information (also referred to as amplitude entropy) refers to the distribution of the amplitude information of the electromyographic signal in the time domain. In some embodiments, after the location (e.g., the corresponding point in time) of a singular point in the electromyographic signal is determined by a signal processing algorithm (e.g., Fourier transform, wavelet transform, fractal dimension), the singular point may be corrected based on the electromyographic signal within a specific time range before or after its location. For example, when the singular point is an abrupt trough, the electromyographic signal at the trough may be filled in according to the characteristic information (e.g., amplitude information, statistical information of the amplitude information) of the electromyographic signal within a specific time range (e.g., 5 ms-60 ms) before or after the trough.
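A hedged sketch of the time-domain signal correction step is shown below: samples whose amplitude jumps abruptly relative to their neighbourhood are treated as singular points and replaced by interpolation from the surrounding signal. The jump criterion and neighbourhood handling are placeholders; the embodiments may instead locate singular points with Fourier, wavelet, or fractal-dimension methods.

```python
# Illustrative time-domain correction of spike-type singular points in an EMG signal.
import numpy as np

def correct_spikes(emg: np.ndarray, k: float = 6.0) -> np.ndarray:
    """Replace spike samples with values interpolated from neighbouring samples."""
    out = emg.copy()
    diffs = np.abs(np.diff(emg, prepend=emg[0]))   # sample-to-sample jumps
    threshold = k * np.median(diffs) + 1e-12       # "abrupt" relative to the signal
    spikes = np.where(diffs > threshold)[0]
    good = np.setdiff1d(np.arange(emg.size), spikes)
    if spikes.size and good.size:
        out[spikes] = np.interp(spikes, good, emg[good])
    return out

emg = np.sin(np.linspace(0, 2 * np.pi, 200))
emg[100] += 5.0                       # inject a spike into the signal
corrected = correct_spikes(emg)
print(abs(corrected[100] - np.sin(2 * np.pi * 100 / 199)) < 0.1)  # True
```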
Taking the singular point being a glitch signal as an example, fig. 9 is an exemplary flow chart of electromyographic signal preprocessing according to some embodiments of the application. As shown in fig. 9, the process 900 may include:
in step 910, based on the time domain window of the electromyographic signal, different time windows are selected from the time domain window of the electromyographic signal, wherein the different time windows respectively cover different time ranges.
In some embodiments, this step may be performed by processing module 220 and/or processing device 110. In some embodiments, the different time windows may include at least one specific window. The specific window refers to a window having a specific time length selected from the time domain window. For example, when the time length of the time domain window of the electromyographic signal is 3s, the time length of the specific window may be 100ms. In some embodiments, a specific window may include a plurality of different time windows. By way of example only, the specific window may include a first time window and a second time window. The first time window refers to a window corresponding to part of the time length within the specific window; for example, the time length of the first time window may be 80ms when the time length of the specific window is 100ms. The second time window refers to another window corresponding to part of the time length within the specific window; for example, the time length of the second time window may be 20ms when the time length of the specific window is 100ms. In some embodiments, the first time window and the second time window may be consecutive time windows within the same specific window. In some embodiments, the first time window and the second time window may also be two time windows that are discontinuous or overlapping within the same specific window. For example, when the time length of the specific window is 100ms, the time length of the first time window may be 80ms and the time length of the second time window may be 25ms, in which case 5ms of the second time window overlaps with the first time window. In some embodiments, based on the time domain window of the electromyographic signal, the processing module 220 may slide and update the specific window sequentially by a specific time length starting from a time point of the time domain window, and may divide the updated specific window into the first time window and the second time window. The specific time length here may be less than 1s, 2s, 3s, etc. For example, the processing module 220 may select a specific window having a specific time length of 100ms and divide the specific window into a first time window of 80ms and a second time window of 20ms. Further, the specific window may be updated by sliding along the time direction. The sliding distance may be the time length of the second time window (e.g., 20ms), or may be another suitable time length, such as 30ms, 40ms, etc.
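A minimal sketch of the sliding specific window described above, assuming a 100ms specific window split into an 80ms first time window and a 20ms second time window, with a sliding step equal to the second time window; the window lengths and sampling rate are illustrative assumptions.

```python
def slide_specific_windows(emg, fs=1000, first_ms=80, second_ms=20):
    """Yield (first_window, second_window) pairs over the EMG time-domain window."""
    first_len = int(first_ms * fs / 1000)    # e.g. 80 samples at 1000 Hz
    second_len = int(second_ms * fs / 1000)  # e.g. 20 samples at 1000 Hz
    step = second_len                        # slide by the second time window length
    start = 0
    while start + first_len + second_len <= len(emg):
        first = emg[start:start + first_len]
        second = emg[start + first_len:start + first_len + second_len]
        yield first, second
        start += step
```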
In step 920, the glitch signal is determined based on the characteristic information corresponding to the electromyographic signal in the time window.
In some embodiments, this step may be performed by processing module 220 and/or processing device 110. In some embodiments, the characteristic information corresponding to the electromyographic signal may include at least one of amplitude information and statistical information of the amplitude information. In some embodiments, the processing module 220 may obtain amplitude information corresponding to the electromyographic signals in different time windows (e.g., a first time window, a second time window) or statistical information of the amplitude information to determine the location of the glitch signal. For a specific description of determining the position of the glitch signal based on the characteristic information corresponding to the electromyographic signals in different time windows, reference may be made to fig. 10 and its related description.
It should be noted that the above description of the process 900 is for illustration and description only, and is not intended to limit the scope of application of the present disclosure. Various modifications and changes to the process 900 will be apparent to those skilled in the art in light of the present description. For example, the specific window is not limited to including the first time window and the second time window described above, and may include other time windows, such as a third time window, a fourth time window, and the like. In addition, the specific range of time before or after the position of the glitch signal can be adaptively adjusted according to the length of the glitch signal, which is not further limited herein. However, such modifications and variations are still within the scope of the present description.
Fig. 10 is an exemplary flow chart of removing a glitch signal according to some embodiments of the present application. As shown in fig. 10, the process 1000 may include:
in step 1010, first amplitude information corresponding to the electromyographic signals in a first time window and second amplitude information corresponding to the electromyographic signals in a second time window are determined.
In some embodiments, this step may be performed by processing module 220 and/or processing device 110. In some embodiments, the processing module 220 may select a time length of the first time window and the second time window, and extract first amplitude information corresponding to the electromyographic signal in the time length of the first time window and second amplitude information corresponding to the electromyographic signal in the time length of the second time window. In some embodiments, the first amplitude information may include an average amplitude of the electromyographic signal over a first time window and the second amplitude information may include an average amplitude of the electromyographic signal over a second time window. For example, the processing module 220 may select a first time window with a time length of 80ms and extract first amplitude information corresponding to the electromyographic signals in the first time window, and the processing module 220 may select a second time window with a time length of 20ms and extract second amplitude information corresponding to the electromyographic signals in the second time window.
In some embodiments, the selection of the time length of the first time window and the time length of the second time window is related to the shortest glitch length and the computational load of the system. In some embodiments, the time length of the first time window and the time length of the second time window may be selected based on characteristics of the glitch signal. For example, for glitches caused by electrocardiographic interference, the time length of a glitch signal is typically 40ms-100ms, the time interval between two glitch signals in the electrocardiographic signal may be about 1s, the two sides of the peak point of the glitch signal are substantially symmetrical, the amplitude on both sides of the glitch signal is distributed relatively uniformly, and so on. In some embodiments, when the glitch signal originates from the electrocardiographic signal, a time length of less than half the glitch signal length may be selected as the time length of the second time window, and the time length of the first time window may be greater than that of the second time window, for example, 4 times the time length of the second time window. In some embodiments, the time length of the first time window is within the range of the glitch interval (about 1s) minus the length of the second time window. It should be further noted that the time length of the first time window and the time length of the second time window are not limited to the above description, as long as the sum of the two is smaller than the time interval between two adjacent glitch signals, or the time length of the second time window is smaller than a single glitch signal length, or the amplitude of the electromyographic signal in the second time window and the amplitude of the electromyographic signal in the first time window can be well distinguished.
In step 1020, it is determined whether the ratio of the second amplitude information to the first amplitude information is greater than a threshold.
In some embodiments, this step may be performed by processing module 220 and/or processing device 110. In some embodiments, the processing module 220 may determine whether a ratio of the second amplitude information corresponding to the electromyographic signal in the second time window to the first amplitude information corresponding to the electromyographic signal in the first time window is greater than a threshold. The threshold value here may be stored in a memory or a hard disk of the wearable device 130, or may be stored in the processing device 110, or may be adjusted according to the actual situation. In some embodiments, if the processing module 220 determines that the ratio of the second amplitude information to the first amplitude information is greater than the threshold value, step 1020 may proceed to step 1030. In other embodiments, if the processing module 220 determines that the ratio of the second amplitude information to the first amplitude information is not greater than the threshold value, step 1020 may proceed to step 1040.
In step 1030, a signal correction process is performed on the electromyographic signals within the second time window.
In some embodiments, this step may be performed by processing module 220 and/or processing device 110. In some embodiments, the processing module 220 may perform signal correction processing on the electromyographic signal within the second time window according to the determination, in step 1020, of the magnitude relation between the ratio of the second amplitude information to the first amplitude information and the threshold value. For example, in some embodiments, if the ratio of the second amplitude information to the first amplitude information is greater than the threshold, the electromyographic signal within the second time window corresponding to the second amplitude information is a glitch signal. In some embodiments, processing the electromyographic signal within the second time window may include performing signal correction processing on it based on the electromyographic signal within a specific time range before or after the second time window. In some embodiments, the manner of performing signal correction processing on the electromyographic signal within the second time window may include, but is not limited to, padding, interpolation, and the like. In some embodiments, the specific time range may be 5ms-60ms. It should be noted that the specific time range is not limited to the above range; for example, it may be more than 60ms or less than 5ms. In an actual application scenario, the specific time range can be adaptively adjusted according to the time length of the glitch signal.
In step 1040, the electromyographic signals within the second time window are retained.
In some embodiments, this step may be performed by processing module 220 and/or processing device 110. In some embodiments, the processing module 220 may retain the electromyographic signal within the second time window according to the determination, in step 1020, of the magnitude relation between the ratio of the second amplitude information to the first amplitude information and the threshold value. For example, in some embodiments, if the ratio of the second amplitude information to the first amplitude information is not greater than the threshold, the electromyographic signal within the second time window corresponding to the second amplitude information is a normal electromyographic signal and may be preserved, i.e., the electromyographic signal within the second time window is retained.
It should be noted that, during the muscular effort of the user, the electric charges gradually accumulate, and the amplitude of the myoelectric signal gradually increases, so that in the absence of the glitch signal, the amplitude of the myoelectric signal in two adjacent time windows (for example, the first time window and the second time window) does not suddenly change. In some embodiments, determining and removing the glitch signal in the electromyographic signal based on the process 1000 may implement real-time processing of the glitch signal, so that the wearable device 130 or the mobile terminal device 140 may feed back the motion state thereof to the user in real time, and help the user perform the motion more scientifically.
In some embodiments, the length of time corresponding to the first time window may be greater than the length of time corresponding to the second time window. In some embodiments, the particular window may correspond to a particular length of time that is less than 1s. In some embodiments, the ratio of the length of time corresponding to the first time window to the length of time corresponding to the second time window may be greater than 2. In some embodiments, the time length corresponding to the first time window, the time length corresponding to the second time window, and the specific time length corresponding to the specific window are selected, so that on one hand, the shortest glitch length (for example, 40 ms) can be ensured to be removed and have a high signal-to-noise ratio, and on the other hand, the calculated amount of the system is relatively smaller, the repeated calculation of the system is reduced, the time complexity is reduced, and therefore the calculation efficiency and the calculation accuracy of the system can be improved.
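The amplitude-ratio judgment of process 1000 can be sketched as follows. This is only a simplified illustration under assumed parameters (average absolute amplitude as the amplitude information, a hypothetical threshold value, and linear interpolation as the correction manner); the actual threshold and correction approach may be chosen as described above.

```python
import numpy as np

def remove_glitches(emg, fs=1000, first_ms=80, second_ms=20, threshold=3.0):
    """Detect and correct glitch signals using the first/second time window amplitude ratio."""
    emg = np.asarray(emg, dtype=float).copy()
    first_len = int(first_ms * fs / 1000)
    second_len = int(second_ms * fs / 1000)
    start = 0
    while start + first_len + second_len <= len(emg):
        lo = start + first_len                      # start of the second time window
        hi = lo + second_len                        # end of the second time window
        first_amp = np.mean(np.abs(emg[start:lo]))  # first amplitude information
        second_amp = np.mean(np.abs(emg[lo:hi]))    # second amplitude information
        if first_amp > 0 and second_amp / first_amp > threshold:
            # Glitch detected: correct the second window from the signal on both sides
            right = emg[hi] if hi < len(emg) else emg[lo - 1]
            emg[lo:hi] = np.linspace(emg[lo - 1], right, second_len)
        start += second_len                         # slide by the second time window
    return emg
```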
It should be noted that the above description of the process 1000 is for illustration and description only, and is not intended to limit the scope of applicability of the present disclosure. Various modifications and changes to flow 1000 may be made by those skilled in the art under the guidance of this specification. For example, the above-described flow 1000 is merely an example of a singular point being a glitch signal, and when the singular point is a valley signal, the above-described steps (e.g., step 1010, step 1020, step 1030, etc.) and their schemes may be adjusted or signal correction processing may be performed by other methods. However, such modifications and variations are still within the scope of the present description.
In some embodiments, the signal correction processing for singular points of the electromyographic signal may also use other methods, such as a high-pass method, a low-pass method, a band-pass method, a wavelet transform reconstruction method, and the like. In some embodiments, for application scenarios that are not sensitive to the low-frequency signal, a 100Hz high-pass filter may be used for glitch removal. In some embodiments, in addition to the signal correction processing, other signal processing may be performed on the electromyographic signal, such as filtering processing, signal amplification, phase adjustment, and the like. In some embodiments, the user's electromyographic signal collected by the electromyographic sensor may be converted into a digital electromyographic signal by an analog-to-digital converter (ADC), and the converted digital electromyographic signal may be subjected to filtering processing; the filtering processing may filter out the power frequency signal and its harmonic signals, and so on. In some embodiments, the processing of the electromyographic signal may further comprise removing motion artifacts of the user. A motion artifact refers to signal noise generated, during acquisition of the electromyographic signal, by the relative movement of the muscle at the position to be measured with respect to the electromyographic module when the user moves.
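As a hedged illustration of the other signal processing mentioned above (e.g., filtering out the power frequency signal and its harmonics), the sketch below assumes a 50 Hz mains frequency and a 1000 Hz sampling rate; it is not the exact filter chain used in the wearable device.

```python
from scipy import signal

def remove_power_frequency(emg, fs=1000.0, mains=50.0, n_harmonics=3, q=30.0):
    """Suppress the power frequency signal and its first few harmonics with IIR notch filters."""
    out = emg
    for k in range(1, n_harmonics + 1):
        # Notch at 50 Hz, 100 Hz, 150 Hz, ... (all below the Nyquist frequency)
        b, a = signal.iirnotch(mains * k, q, fs=fs)
        out = signal.filtfilt(b, a, out)
    return out
```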
In some embodiments, the gesture signal may be acquired by a gesture sensor on the wearable device 130. The gesture sensors on the wearable device 130 may be distributed on the four limbs of the human body (e.g., arms, legs, etc.), the trunk of the human body (e.g., chest, abdomen, back, waist, etc.), the head of the human body, and the like. The gesture sensors can collect gesture signals of the limbs, the trunk, and other parts of the human body. In some embodiments, the gesture sensor may also be an Attitude and Heading Reference System (AHRS) sensor with an attitude fusion algorithm. The attitude fusion algorithm can fuse the data of a nine-axis inertial measurement unit (IMU) comprising a three-axis acceleration sensor, a three-axis angular velocity sensor and a three-axis geomagnetic sensor into Euler angles or quaternions, so as to acquire the gesture signal of the body part of the user where the gesture sensor is located. In some embodiments, processing module 220 and/or processing device 110 may determine the characteristic information corresponding to the gesture based on the gesture signal. In some embodiments, the characteristic information corresponding to the gesture signal may include, but is not limited to, an angular velocity value, an angular velocity direction, an acceleration value of the angular velocity, and the like. In some embodiments, the gesture sensor may be a strain sensor that can acquire the bending direction and bending angle at a joint of the user, thereby acquiring the gesture signal when the user moves. For example, the strain sensor may be disposed at the knee joint of the user; when the user moves, the body part of the user acts on the strain sensor, and the bending direction and bending angle at the knee joint can be calculated based on the change in resistance or length of the strain sensor, thereby acquiring the gesture signal of the user's leg. In some embodiments, the gesture sensor may further comprise a fiber optic sensor, and the gesture signal may be characterized by the change in direction of the light after bending in the fiber optic sensor. In some embodiments, the gesture sensor may also be a magnetic flux sensor, and the gesture signal may be characterized by the change of the magnetic flux. It should be noted that the type of the gesture sensor is not limited to the sensors described above; any sensor capable of acquiring the user gesture signal falls within the scope of the gesture sensor of the present specification.
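The fusion of nine-axis IMU data into Euler angles or quaternions is typically performed by the AHRS itself; the sketch below only shows how a quaternion output might be converted to Euler angles for later processing, using scipy as an assumed dependency rather than the device's own firmware.

```python
from scipy.spatial.transform import Rotation

def quaternion_to_euler(qx, qy, qz, qw):
    """Convert an AHRS quaternion to intrinsic Z-Y-X Euler angles in degrees."""
    rot = Rotation.from_quat([qx, qy, qz, qw])   # scipy expects (x, y, z, w) order
    yaw, pitch, roll = rot.as_euler("ZYX", degrees=True)
    return yaw, pitch, roll
```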
FIG. 11 is an exemplary flow chart for determining feature information corresponding to a gesture signal according to some embodiments of the application. As shown in fig. 11, the process 1100 may include:
in step 1110, a target coordinate system and a conversion relationship between the target coordinate system and at least one original coordinate system are obtained.
In some embodiments, this step may be performed by processing module 220 and/or processing device 110. In some embodiments, the original coordinate system refers to a coordinate system corresponding to an attitude sensor provided on a human body. When the user uses the wearable device 130, the posture sensors on the wearable device 130 are distributed on different parts of the human body, so that the installation angles of the posture sensors on the human body are different, and the posture sensors on the different parts respectively take the coordinate systems of the respective bodies as original coordinate systems, so that the posture sensors on the different parts have different original coordinate systems. In some embodiments, the gesture signals acquired by the respective gesture sensors may be representations under their corresponding raw coordinate systems. By translating the pose signals in different original coordinate systems into the same coordinate system (e.g., the target coordinate system), it is convenient to determine the relative motion between different parts of the human body. In some embodiments, the target coordinate system refers to a human body coordinate system established based on a human body. For example, the length direction of the human body trunk (i.e., the direction perpendicular to the human body cross-section) may be taken as the Z-axis, the front-back direction of the human body trunk (i.e., the direction perpendicular to the human body coronal plane) may be taken as the X-axis, and the left-right direction of the human body trunk (i.e., the direction perpendicular to the human body sagittal plane) may be taken as the Y-axis in the target coordinate system. In some embodiments, there is a conversion relationship between the target coordinate system and the original coordinate system, by which the coordinate information in the original coordinate system can be converted into the coordinate information in the target coordinate system. In some embodiments, the transformation relationship may be represented as one or more rotation matrices. For details on determining the conversion relation between the target coordinate system and the original coordinate system, reference is made to fig. 13 and the related description of the present application.
In step 1120, the coordinate information in the at least one original coordinate system is converted into coordinate information in the target coordinate system based on the conversion relationship.
In some embodiments, this step may be performed by processing module 220 and/or processing device 110. The coordinate information in the original coordinate system refers to three-dimensional coordinate information in the original coordinate system. The coordinate information in the target coordinate system refers to three-dimensional coordinate information in the target coordinate system. By way of example only, coordinate information v1 in the original coordinate system can be converted into coordinate information v2 in the target coordinate system according to the conversion relationship. Specifically, the transformation between the coordinate information v1 and the coordinate information v2 can be performed using rotation matrices, and a rotation matrix can be understood as a conversion relationship between the original coordinate system and the target coordinate system. Specifically, the coordinate information v1 in the original coordinate system can be converted into coordinate information v1-1 by a first rotation matrix, the coordinate information v1-1 can be converted into coordinate information v1-2 by a second rotation matrix, the coordinate information v1-2 can be converted into coordinate information v1-3 by a third rotation matrix, and the coordinate information v1-3 is the coordinate information v2 in the target coordinate system. It should be noted that the rotation matrices are not limited to the first rotation matrix, the second rotation matrix, and the third rotation matrix described above, and fewer or more rotation matrices may be included. In some alternative embodiments, the rotation matrix may also be a single rotation matrix or a combination of multiple rotation matrices.
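A minimal numpy sketch of the conversion described above, with the first, second and third rotation matrices assumed to be known 3x3 matrices; in practice they come from the calibration processes of fig. 13 and fig. 14.

```python
import numpy as np

def to_target_coordinates(v1, rotations):
    """Convert coordinate information v1 in an original coordinate system into v2 in the
    target coordinate system by applying the rotation matrices in order."""
    v = np.asarray(v1, dtype=float)
    for r in rotations:        # each r is a 3x3 rotation matrix
        v = r @ v              # v1 -> v1-1 -> v1-2 -> v1-3 (= v2)
    return v
```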
In step 1130, feature information corresponding to the gesture signal is determined based on the coordinate information in the target coordinate system.
In some embodiments, this step may be performed by processing module 220 and/or processing device 110. In some embodiments, determining the characteristic information corresponding to the user gesture signal based on the coordinate information in the target coordinate system may include determining the characteristic information corresponding to the user gesture signal based on a plurality of pieces of coordinate information in the target coordinate system during the user's movement. For example, when the user performs the sitting posture chest clipping exercise, the user's arm may correspond to first coordinate information in the target coordinate system when lifted forward, and to second coordinate information in the target coordinate system when opened to the same plane as the trunk; the characteristic information corresponding to the user gesture signal, such as the angular velocity direction and the acceleration value of the angular velocity, can then be calculated based on the first coordinate information and the second coordinate information.
It should be noted that the above description of the process 1100 is for purposes of illustration and description only, and is not intended to limit the scope of applicability of the present disclosure. Various modifications and changes to the process 1100 may be made by those skilled in the art under the guidance of this specification. However, such modifications and variations are still within the scope of the present description.
In some embodiments, the relative motion between different motion parts of the user body can also be determined through the characteristic information corresponding to the gesture sensors positioned at different positions of the user body. For example, the relative movement between the arm and the trunk during the movement of the user can be determined by the characteristic information corresponding to the posture sensor at the arm of the user and the characteristic information corresponding to the posture sensor at the trunk part of the user. FIG. 12 is an exemplary flow chart for determining relative movement between different movement portions of a user, according to some embodiments of the application. As shown in fig. 12, the process 1200 may include:
in step 1210, characteristic information corresponding to each of the at least two sensors is determined based on the conversion relationships between the different original coordinate systems and the target coordinate system.
In some embodiments, this step may be performed by processing module 220 and/or processing device 110. In some embodiments, different sensors have different conversion relationships between the original coordinate system and the target coordinate system corresponding to the sensors due to different mounting positions at the human body. In some embodiments, the processing device 110 may convert coordinate information in the original coordinate system corresponding to the sensors of different parts (e.g., forearm, torso, etc.) of the user into coordinate information in the target coordinate system, respectively, so that feature information corresponding to at least two sensors may be determined, respectively. The relevant description about the transformation of the coordinate information in the original coordinate system into the coordinate information in the target coordinate system can be found elsewhere in the present application, for example, in fig. 11, which is not repeated here.
In step 1220, relative motion between different motion portions of the user is determined based on the characteristic information corresponding to the at least two sensors, respectively.
In some embodiments, this step may be performed by processing module 220 and/or processing device 110. In some embodiments, a movement part may refer to a limb segment of the human body that can move independently, e.g., an upper arm, a forearm, a thigh, a calf, etc. By way of example only, when the user performs a dumbbell-lifting motion with the arm, the relative motion between the upper arm and the forearm may be determined by combining the coordinate information in the target coordinate system corresponding to the sensor disposed at the upper arm with the coordinate information in the target coordinate system corresponding to the sensor disposed at the forearm, thereby identifying the dumbbell-lifting motion of the user's arm.
In some embodiments, the same movement part of the user may further be provided with a plurality of sensors of the same or different types, and coordinate information in the original coordinate system corresponding to the plurality of sensors of the same or different types may be converted into coordinate information in the target coordinate system respectively. For example, a plurality of sensors of the same or different types may be disposed at different positions of the forearm portion of the user, and a plurality of coordinate information in a target coordinate system corresponding to the plurality of sensors of the same or different types may simultaneously characterize the movement motion of the forearm portion of the user. For example, the coordinate information in the target coordinate system corresponding to the plurality of sensors of the same type can be averaged, so that the accuracy of the coordinate information of the moving part in the moving process of the user is improved. For another example, the coordinate information in the target coordinate system may be obtained by a fusion algorithm (e.g., kalman filter, etc.) on the coordinate information in the coordinate systems corresponding to the plurality of different types of sensors.
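A simple sketch of the averaging mentioned above for several sensors of the same type placed on the same movement part; the Kalman-filter fusion for sensors of different types is not shown here, and the re-normalization step is an illustrative assumption.

```python
import numpy as np

def average_same_type_sensors(target_coords):
    """Average the target-coordinate-system vectors reported by several sensors of the same type.

    target_coords: iterable of length-3 vectors, one per sensor on the same movement part.
    """
    mean_vec = np.mean(np.asarray(target_coords, dtype=float), axis=0)
    norm = np.linalg.norm(mean_vec)
    # Re-normalize so the result can still be used as a limb unit vector
    return mean_vec / norm if norm > 0 else mean_vec
```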
It should be noted that the above description of the process 1200 is for purposes of illustration and description only, and is not intended to limit the scope of applicability of the present disclosure. Various modifications and changes to the process 1200 may be made by those skilled in the art under the guidance of this specification. However, such modifications and variations are still within the scope of the present description.
FIG. 13 is an exemplary flow chart for determining the transformation relationship of an original coordinate system to a particular coordinate system, according to some embodiments of the application. In some embodiments, the process of determining the conversion relationship between the original coordinate system and the specific coordinate system may also be called a calibration process. As shown in fig. 13, the process 1300 may include:
in step 1310, a specific coordinate system is constructed.
In some embodiments, this step may be performed by processing module 220 and/or processing device 110. In some embodiments, the conversion relationship between the at least one original coordinate system and the target coordinate system may be obtained through a calibration process. The specific coordinate system refers to a reference coordinate system used for determining a conversion relation between the original coordinate system and the target coordinate system in the calibration process. In some embodiments, the specific coordinate system may be constructed with the length direction of the trunk of the human body standing as the Z axis, the front-back direction of the human body as the X axis, and the left-right direction of the trunk of the human body as the Y axis. In some embodiments, the particular coordinate system is related to the orientation of the user during the calibration process. For example, in the calibration process, the front of the user body faces a certain fixed direction (for example, north), and the direction in front of the human body (north) is the X axis, and in the calibration process, the direction of the X axis is fixed.
In step 1320, first coordinate information in at least one original coordinate system is acquired when the user is in a first pose.
In some embodiments, this step may be performed by the acquisition module 210. The first posture may be a posture in which the user remains approximately standing. The acquisition module 210 (e.g., a sensor) may acquire first coordinate information in the original coordinate system based on a first gesture of the user.
In step 1330, second coordinate information in at least one original coordinate system is acquired when the user is in a second pose.
In some embodiments, this step may be performed by the acquisition module 210. The second posture may be a posture in which a body part (e.g., an arm) of the user where the sensor is located is tilted forward. In some embodiments, the acquisition module 210 (e.g., a sensor) may acquire second coordinate information in the original coordinate system based on a second pose (e.g., a forward tilt pose) of the user.
In step 1340, a conversion relationship between at least one original coordinate system and a specific coordinate system is determined according to the first coordinate information, the second coordinate information, and the specific coordinate system.
In some embodiments, this step may be performed by processing module 220 and/or processing device 110. In some embodiments, the first rotation matrix may be determined from the first coordinate information corresponding to the first posture. In the first posture, since the Euler angles of the specific coordinate system in the X and Y directions in the ZYX rotation order are 0, while the Euler angles of the original coordinate system in the X and Y directions are not necessarily 0, the first rotation matrix is the rotation matrix obtained by reversely rotating the original coordinate system around the X axis and then reversely rotating it around the Y axis. In some embodiments, the second rotation matrix may be determined from the second coordinate information of the second posture (e.g., a forward tilt of the body part where the sensor is located). Specifically, in the second posture, the Euler angles of the specific coordinate system in the Y and Z3 directions in the ZYZ rotation order are known to be 0, while the Euler angles of the original coordinate system in the Y and Z3 directions are not necessarily 0; the second rotation matrix is then the rotation matrix obtained by reversely rotating the original coordinate system around the Y direction and then reversely rotating it around the Z3 direction. The conversion relationship between the original coordinate system and the specific coordinate system can be determined from the above first rotation matrix and second rotation matrix. In some embodiments, when there are a plurality of original coordinate systems (sensors), the conversion relationship between each original coordinate system and the specific coordinate system may be determined using the above-described method.
It should be noted that the first posture described above is not limited to a posture in which the user remains approximately standing, and the second posture is not limited to a posture in which the body part (for example, an arm) of the user where the sensor is located is tilted forward; the first posture and the second posture can be regarded approximately as postures that are stationary during calibration. In some embodiments, the first posture and/or the second posture may also be a dynamic posture during calibration. For example, the walking posture of the user is a relatively fixed posture: the angles and angular velocities of the arms, legs, and feet during walking can be extracted, actions such as stepping forward and swinging the arms forward can be recognized, and the user's forward walking posture can be used as the second posture in the calibration process. In some embodiments, the second posture is not limited to one action; multiple actions may be extracted as the second posture. For example, the coordinate information of multiple actions can be fused to obtain a more accurate rotation matrix.
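The two-pose calibration can also be sketched as an orientation alignment problem: a direction measured by the sensor in the first (approximately standing) pose is matched against the Z axis of the specific coordinate system, and a direction measured in the second (forward-tilt) pose against its X axis. This is a simplified alternative formulation using scipy's align_vectors under stated assumptions, not the explicit first/second rotation matrix construction above.

```python
import numpy as np
from scipy.spatial.transform import Rotation

def calibration_rotation(standing_dir, forward_dir):
    """Estimate the rotation from the original (sensor) coordinate system to the specific
    coordinate system from two calibration measurements.

    standing_dir: direction measured in the sensor frame during the first (standing) pose,
                  assumed to correspond to the Z axis of the specific coordinate system.
    forward_dir:  direction measured during the second (forward-tilt) pose, assumed to
                  correspond to the X axis of the specific coordinate system.
    """
    targets = np.array([[0.0, 0.0, 1.0],    # Z axis of the specific coordinate system
                        [1.0, 0.0, 0.0]])   # X axis of the specific coordinate system
    measured = np.array([standing_dir, forward_dir], dtype=float)
    rot, _rssd = Rotation.align_vectors(targets, measured)
    return rot.as_matrix()   # 3x3 rotation matrix: original -> specific coordinate system
```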
In some embodiments, the rotation matrix may be dynamically corrected during calibration using some signal processing algorithm (e.g., using a Kalman filtering algorithm) to obtain a conversion matrix that is optimal throughout the calibration.
In some embodiments, machine learning algorithms or other algorithms may be used to automatically identify certain actions so as to update the rotation matrix in real time. For example, if the machine learning algorithm recognizes that the user is walking or standing, the calibration process is started automatically; in this case the wearable device does not need an explicit calibration process, and the rotation matrix is dynamically updated while the user uses the wearable device.
In some embodiments, the installation position of the gesture sensor may be relatively fixed, and a rotation matrix may be preset in the corresponding algorithm, so that the recognition process of the specific action may be more accurate. Further, the rotation matrix is continuously corrected in the process that the user uses the wearable device, so that the obtained rotation matrix is closer to the real situation.
It should be noted that the above description of the process 1300 is for illustration and description only, and is not intended to limit the scope of the present disclosure. Various modifications and changes to the process 1300 will be apparent to those skilled in the art in light of the present description. However, such modifications and variations are still within the scope of the present description.
FIG. 14 is an exemplary flow chart for determining a conversion relationship between an original coordinate system and a target coordinate system, according to some embodiments of the application. As shown in fig. 14, the process 1400 may include:
in step 1410, a transformation relationship between a specific coordinate system and a target coordinate system is obtained.
In some embodiments, this step may be performed by processing module 220 and/or processing device 110. Since both the specific coordinate system and the target coordinate system take the length direction of the human trunk as the Z axis, the conversion relationship between the specific coordinate system and the target coordinate system can be obtained through the conversion relationship between the X axis of the specific coordinate system and the X axis of the target coordinate system, and the conversion relationship between the Y axis of the specific coordinate system and the Y axis of the target coordinate system. For the principle of acquiring the conversion relationship between the specific coordinate system and the target coordinate system, reference may be made to fig. 13 and its related contents.
In some embodiments, the specific coordinate system may take the length direction of the human body trunk as the Z axis, and the front-back direction of the human body as the calibrated X axis. Since the user's body changes its front-rear direction during movement (e.g., swivel movement) and cannot be maintained in a calibrated coordinate system, it is necessary to determine a coordinate system that can be rotated with the human body, i.e., a target coordinate system. In some embodiments, the target coordinate system may change as the user's orientation changes, with the X-axis of the target coordinate system always being directly in front of the human torso.
In step 1420, a transformation relationship between the at least one original coordinate system and the target coordinate system is determined according to the transformation relationship between the at least one original coordinate system and the specific coordinate system and the transformation relationship between the specific coordinate system and the target coordinate system.
In some embodiments, this step may be performed by processing module 220 and/or processing device 110. In some embodiments, the processing device 110 may determine a conversion relationship between the at least one original coordinate system and the target coordinate system according to the conversion relationship between the at least one original coordinate system and the specific coordinate system determined in the process 1300 and the conversion relationship between the specific coordinate system and the target coordinate system determined in the step 1410, so that coordinate information in the original coordinate system may be converted into coordinate information in the target coordinate system.
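A one-line sketch of step 1420, assuming both conversion relationships are expressed as 3x3 rotation matrices.

```python
import numpy as np

def original_to_target(r_original_to_specific, r_specific_to_target):
    """Compose the two conversion relationships into original -> target."""
    return np.asarray(r_specific_to_target) @ np.asarray(r_original_to_specific)
```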
It should be noted that the above description of the process 1400 is for purposes of example and illustration only and is not intended to limit the scope of applicability of the present disclosure. Various modifications and changes to the process 1400 may be made by those skilled in the art under the guidance of this specification. However, such modifications and variations are still within the scope of the present description.
In some embodiments, the position of the gesture sensor disposed on the wearable device 130 may change, and/or the installation angle of the gesture sensor on the human body may differ, so that even when the user performs the same movement, the gesture data returned by the gesture sensor may differ greatly.
Fig. 15A is an exemplary vector graph of Euler angle data in the original coordinate system at a position of the human forearm, according to some embodiments of the application. The frame line portion may represent the Euler angle data (coordinate information) in the original coordinate system corresponding to that forearm position when the user performs a certain motion. As shown in fig. 15A, the Euler angle vector result in the Z-axis direction (shown as "Z" in fig. 15A) within the frame line portion is approximately in the range of -180° to -80°, the Euler angle vector result in the Y-axis direction (shown as "Y" in fig. 15A) fluctuates approximately around 0°, and the Euler angle vector result in the X-axis direction (shown as "X" in fig. 15A) fluctuates approximately around -80°. The fluctuation range here may be 20°.
Fig. 15B is an exemplary vector graph of Euler angle data in the original coordinate system at another position of the human forearm, according to some embodiments of the application. The frame line portion may represent the Euler angle data in the original coordinate system corresponding to the other forearm position when the user performs the same action (the same action as that shown in fig. 15A). As shown in fig. 15B, the Euler angle vector result in the Z-axis direction (shown as "Z'" in fig. 15B) within the frame line portion is approximately in the range of -180° to 180°, the Euler angle vector result in the Y-axis direction (shown as "Y'" in fig. 15B) fluctuates approximately around 0°, and the Euler angle vector result in the X-axis direction (shown as "X'" in fig. 15B) fluctuates approximately around -150°. The fluctuation range here may be 20°.
The Euler angle data shown in fig. 15A and fig. 15B are the Euler angle data (coordinate information) in the original coordinate systems obtained when the user performs the same action, with the gesture sensor placed at different positions of the human forearm (which can also be understood as different installation angles of the gesture sensor at the forearm). Comparing fig. 15A and fig. 15B, it can be seen that when gesture sensors are installed on the human body at different angles and the user performs the same action, the Euler angle data in the original coordinate systems returned by the gesture sensors can differ greatly. For example, the Euler angle vector result in the Z-axis direction in fig. 15A is approximately in the range of -180° to -80°, while that in fig. 15B is approximately in the range of -180° to 180°, which differ greatly.
In some embodiments, the Euler angle data in the original coordinate systems corresponding to sensors with different installation angles can be converted into Euler angle data in the target coordinate system, so that the gesture signals of sensors at different positions can be analyzed conveniently. By way of example only, the line on which the left arm lies may be abstracted as a unit vector directed from the elbow to the wrist, the unit vector being a coordinate value in the target coordinate system. The target coordinate system is defined such that the axis pointing to the rear of the human body is the X axis, the axis pointing to the right side of the human body is the Y axis, and the axis pointing above the human body is the Z axis, conforming to a right-handed coordinate system. For example, the coordinate value [-1, 0, 0] in the target coordinate system indicates that the arm is lifted forward; the coordinate value [0, -1, 0] in the target coordinate system indicates that the arm is lifted leftwards. Fig. 16A is an exemplary vector graph of Euler angle data in the target coordinate system at a position of the human forearm, according to some embodiments of the application. Fig. 16A is obtained from the Euler angle data of the forearm in fig. 15A after conversion into vector coordinates in the target coordinate system, wherein the frame line portion may represent the Euler angle data in the target coordinate system at that forearm position when the user performs an action. As shown in fig. 16A, the forearm vector [x, y, z] within the frame line portion reciprocates between a first position [0.2, -0.9, -0.38] and a second position [0.1, -0.95, -0.3]. It should be noted that for each reciprocation of the forearm, the first and second positions deviate by a small amount.
Fig. 16B is an exemplary vector graph of euler angle data in a target coordinate system at another location of a human forearm, according to some embodiments of the application. Fig. 16B is a graph obtained based on the euler angle data of the forearm in fig. 15B after conversion into vector coordinates in the target coordinate system, wherein the frame line portion may represent the euler angle data in the target coordinate system at another position of the forearm when the user performs the same action (the same action as that shown in fig. 16A). As shown in fig. 16B, the forearm vector [ x, y, z ] likewise reciprocates between a first position [0.2, -0.9, -0.38], and a second position [0.1, -0.95, -0.3].
Combining fig. 15A to fig. 16B, it can be seen that although the Euler angles in the original coordinate systems differ greatly in value range and fluctuation form because the two gesture sensors are mounted at different positions, after the coordinate information of the original coordinate systems corresponding to the two gesture sensors is converted into vector coordinates in the target coordinate system (for example, the vector coordinates in fig. 16A and fig. 16B), two approximately identical vector coordinates are obtained; that is, by this method the characteristic information corresponding to the gesture signals is not affected by the mounting positions of the sensors. Specifically, it can be seen in fig. 16A and fig. 16B that, although the two gesture sensors are mounted at different positions on the forearm, the same vector coordinates are obtained after the coordinate conversion, which can represent the process in which the arm switches back and forth between state one (the arm is lifted rightwards) and state two (the arm is lifted forward) during the sitting posture chest clipping exercise.
Fig. 17 is a vector coordinate diagram of limb vectors in the target coordinate system, shown in accordance with some embodiments of the present application. As shown in fig. 17, from top to bottom are the vector coordinates, in the target coordinate system, of the gesture sensors at the left forearm (17-1), the right forearm (17-2), the left upper arm (17-3), the right upper arm (17-4), and the torso (17-5). Fig. 17 shows the vector coordinates of the various positions (e.g., 17-1, 17-2, 17-3, 17-4, 17-5) in the target coordinate system as the human body moves. The first 4200 points in fig. 17 correspond to the calibration actions required to calibrate the limbs, such as standing, leaning the torso forward, extending the arms forward, lifting the arms sideways, etc. By calibrating with the calibration actions corresponding to the first 4200 points, the raw data acquired by the gesture sensors can be converted into Euler angles in the target coordinate system. To facilitate analysis of the data, these can be further converted into coordinate vectors of the arm vectors in the target coordinate system. The target coordinate system here takes the direction pointing to the front of the torso as the X axis, the direction pointing to the left of the torso as the Y axis, and the direction pointing above the torso as the Z axis. The reciprocating motions in fig. 17 are, from left to right, action 1 to action 6: sitting posture chest clipping, high-position pull-down, sitting posture chest pushing, sitting posture shoulder pushing, barbell biceps curl, and sitting posture chest clipping. As can be seen from fig. 17, different actions have different action patterns, which can be clearly identified using the limb vectors. Meanwhile, the same action has good repeatability; for example, both action 1 and action 6 represent the sitting posture chest clipping action, and the curves of the two actions show good repeatability.
In some embodiments, the pose data (e.g., euler angles, angular velocities, etc.) directly output by the modules of the original coordinate system may be converted to pose data in the target coordinate system by processes 1300 and 1400, such that highly consistent pose data (e.g., euler angles, angular velocities, limb vector coordinates, etc.) may be obtained.
Fig. 18A is an exemplary vector graph of the raw angular velocity, shown according to some embodiments of the application. The raw angular velocity here can be understood as the angular velocity obtained after converting the Euler angle data in the original coordinate systems corresponding to sensors with different installation angles into Euler angle data in the target coordinate system. In some embodiments, factors such as jitter during the user's motion may affect the angular velocity results in the gesture data. As shown in fig. 18A, the vector coordinate curve of the raw angular velocity exhibits a clearly uneven shape under the influence of jitter and the like. For example, there are abrupt changes in the vector coordinate curve of the raw angular velocity, so that the curve is not smooth. In some embodiments, to deal with the influence of jitter and the like on the angular velocity results, the jittery angular velocity needs to be corrected to obtain a smooth vector coordinate curve. In some embodiments, the raw angular velocity may be filtered using a 1Hz-3Hz low-pass filtering method. Fig. 18B is an exemplary result graph of the angular velocity after the filtering processing, shown in accordance with some embodiments of the present application. As shown in fig. 18B, after the 1Hz-3Hz low-pass filtering processing is performed on the raw angular velocity, the influence of jitter and the like on the angular velocity (for example, abrupt signals) can be eliminated, so that the vector graph corresponding to the angular velocity presents a smoother curve. In some embodiments, the 1Hz-3Hz low-pass filtering processing of the angular velocity can effectively avoid the influence of jitter and the like on the gesture data (such as Euler angles and angular velocities), and makes the subsequent signal segmentation process more convenient. In some embodiments, the filtering processing may also filter out the power frequency signal and its harmonic signals, glitch signals, etc. in the motion signal. It should be noted that the 1Hz-3Hz low-pass filtering processing introduces a system time delay, so that the action points obtained from the gesture signal and the action points of the real electromyographic signal are staggered in time; therefore, the system time delay generated by the low-pass filtering processing is subtracted from the vector coordinate curve after the low-pass filtering processing, so as to ensure that the gesture signal and the electromyographic signal are synchronized in time. In some embodiments, the system time delay is associated with the center frequency of the filter; when the gesture signal and the electromyographic signal are processed with different filters, the delay is adaptively adjusted according to the center frequency of the filter. In some embodiments, since the angle range of Euler angles is [-180°, +180°], when the actual Euler angle is not within this range, the acquired Euler angle may jump from -180° to +180° or from +180° to -180°. For example, when the angle is -181°, the acquired Euler angle jumps to 179°. In the practical application process, such jumps affect the calculation of angle differences and need to be corrected first.
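A minimal sketch of the 1Hz-3Hz low-pass filtering and the Euler angle jump correction described above, assuming a known sampling rate. The zero-phase filtfilt variant shown here avoids the system time delay discussed in the text, whereas a causal real-time filter would need the delay compensation described above; the cutoff frequency and filter order are illustrative assumptions.

```python
import numpy as np
from scipy import signal

def smooth_angular_velocity(angular_velocity, fs, cutoff_hz=2.0):
    """Low-pass filter the angular velocity to remove jitter-induced abrupt changes."""
    b, a = signal.butter(2, cutoff_hz, btype="lowpass", fs=fs)
    return signal.filtfilt(b, a, angular_velocity, axis=0)

def unwrap_euler_degrees(euler_deg):
    """Correct -180/+180 degree jumps so that angle differences can be computed directly."""
    return np.degrees(np.unwrap(np.radians(euler_deg), axis=0))
```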
In some embodiments, the motion recognition model may also be used to analyze the motion signal of the user or the feature information corresponding to the motion signal, so as to recognize the motion of the user. In some embodiments, the action recognition model includes a machine learning model trained to recognize user actions. In some embodiments, the action recognition model may include one or more machine learning models. In some embodiments, the motion recognition model may include, but is not limited to, one or more of a machine learning model that classifies a user motion signal, a machine learning model that identifies a quality of a user motion, a machine learning model that identifies a number of user motions, a machine learning model that identifies a degree of fatigue of a user performing a motion. In some embodiments, the machine learning model may include one or more of a linear classification model (LR), a support vector machine model (SVM), a naive bayes model (NB), a K-nearest neighbor model (KNN), a decision tree model (DT), an integrated model (RF/GDBT, etc.), and the like. For the content of the motion recognition model reference may be made elsewhere in the description of the application, for example to fig. 20 and its associated description.
FIG. 19 is an exemplary flow chart of a motion monitoring and feedback method according to some embodiments of the application. As shown in fig. 19, the process 1900 may include:
In step 1910, a motion signal is acquired as the user moves.
In some embodiments, this step may be performed by the acquisition module 210. In some embodiments, the motion signal includes at least characteristic information corresponding to the electromyographic signal and characteristic information corresponding to the gesture signal. The motion signal refers to human body parameter information when the user moves. In some embodiments, the human parameter information may include, but is not limited to, one or more of an electromyographic signal, a posture signal, a heart rate signal, a temperature signal, a humidity signal, an blood oxygen concentration, and the like. In some embodiments, the motion signal may include at least an electromyographic signal and a gesture signal. In some embodiments, the electromyographic sensors in the acquisition module 210 may acquire electromyographic signals while the user is in motion, and the gesture sensors in the acquisition module 210 may acquire gesture signals while the user is in motion.
In step 1920, the motion of the user is monitored based on the motion signal by the motion recognition model, and the motion feedback is performed based on the output result of the motion recognition model.
In some embodiments, this step may be performed by processing module 220 and/or processing device 110. In some embodiments, the output result of the motion recognition model may include, but is not limited to, one or more of an action type, an action quality, a number of actions, a fatigue index, and the like. For example, the motion recognition model may recognize the action type of the user as sitting posture chest clipping based on the motion signal. For another example, one machine learning model in the motion recognition model may recognize the action type of the user as sitting posture chest clipping based on the motion signal, and another machine learning model may output the action quality of the user's action as a standard action or an incorrect action based on the motion signal (e.g., amplitude information and frequency information of the electromyographic signal, and/or the angular velocity, angular velocity direction, and acceleration value of the angular velocity of the gesture signal). In some embodiments, the action feedback may include issuing a prompt message. In some embodiments, the prompt message may include, but is not limited to, a voice prompt, a text prompt, an image prompt, a video prompt, and the like. For example, when the output result of the motion recognition model is an incorrect action, the processing device 110 may control the wearable device 130 or the mobile terminal device 140 to send a voice prompt (e.g., information such as "the action is not standard") to the user, so as to remind the user to adjust the exercise action in time. For another example, when the output result of the motion recognition model is a standard action, the wearable device 130 or the mobile terminal device 140 may not issue a prompt message, or may issue a prompt message similar to "standard action". In some embodiments, the action feedback may also include the wearable device 130 stimulating the corresponding part of the user's action. For example, elements of the wearable device 130 stimulate the corresponding part of the user's action by way of vibration feedback, electrical stimulation feedback, pressure feedback, and the like. For example, when the output result of the motion recognition model is an incorrect action, the processing device 110 may control elements of the wearable device 130 to stimulate the corresponding part of the user's action. In some embodiments, the action feedback may also include outputting a motion record of the user's movement. The motion record here may refer to one or more of the user's action type, motion duration, number of actions, action quality, fatigue index, physiological parameter information during the motion, and the like. For the content of the motion recognition model, reference may be made to the description elsewhere in the present application, which is not repeated here.
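A hypothetical sketch of the feedback step; the output labels and the device methods (play_voice, vibrate, show_text, log_motion) are assumptions made for illustration only and are not the actual interface of the wearable device 130 or the mobile terminal device 140.

```python
def give_feedback(model_output, device):
    """Map the output of the motion recognition model to a user-facing feedback action."""
    if model_output.get("quality") == "incorrect":
        device.play_voice("Motion is not standard, please adjust.")  # hypothetical voice prompt
        device.vibrate(part=model_output.get("part"))                # hypothetical vibration feedback
    else:
        device.show_text("Standard motion")                          # hypothetical text prompt
    # Always record the motion so the user can review it later
    device.log_motion(model_output)
```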
It should be noted that the above description of the process 1900 is for purposes of illustration and description only, and is not intended to limit the scope of applicability of the present disclosure. Various modifications and changes to the process 1900 may be made by those skilled in the art under the guidance of this specification. However, such modifications and variations are still within the scope of the present description.
FIG. 20 is an exemplary flow chart of an application of model training according to some embodiments of the present application. As shown in fig. 20, the process 2000 may include:
In step 2010, sample information is acquired.
In some embodiments, this step may be performed by the acquisition module 210. In some embodiments, the sample information may include motion signals of a professional (e.g., a fitness trainer) and/or a non-professional when exercising. For example, the sample information may include electromyographic signals and/or posture signals generated by professionals and/or non-professionals when performing the same type of motion (e.g., sitting chest clamping). In some embodiments, the electromyographic signals and/or gesture signals in the sample information may be subjected to the segmentation processing of flow 700, the glitch processing of flow 900, the conversion processing of flow 1300, and the like, to form at least one segment of electromyographic signals and/or gesture signals. The at least one segment of electromyographic signals and/or gesture signals may be used as an input to a machine learning model to train the machine learning model. In some embodiments, the characteristic information corresponding to at least one segment of the electromyographic signal and/or the characteristic information corresponding to the gesture signal may also be used as an input to the machine learning model to train the machine learning model. For example, frequency information and amplitude information of the electromyographic signals may be used as inputs to the machine learning model. For another example, the angular velocity, the angular velocity direction, and the acceleration value of the angular velocity of the gesture signal may be used as inputs to the machine learning model. For another example, the action start point, the action intermediate point, and the action end point of the action signal may be used as inputs to the machine learning model. In some embodiments, the sample information may be obtained from a memory device of the processing device 110. In some embodiments, the sample information may be obtained from the acquisition module 210.
In step 2020, an action recognition model is trained.
This step may be performed by the processing device 110. In some embodiments, the action recognition model may include one or more machine learning models. For example, the motion recognition model may include, but is not limited to, one or more of a machine learning model that classifies the user's motion signal, a machine learning model that identifies the quality of the user's motion, a machine learning model that identifies the number of user motions, and a machine learning model that identifies the fatigue index of the user performing the motion. In some embodiments, the machine learning model may include one or more of a linear classification model (LR), a support vector machine model (SVM), a naive Bayes model (NB), a K-nearest neighbor model (KNN), a decision tree model (DT), an ensemble model (e.g., RF/GBDT), and the like.
In some embodiments, training of the machine learning model may include obtaining sample information. In some embodiments, the sample information may include motion signals of a professional (e.g., a fitness trainer) and/or a non-professional when exercising. For example, the sample information may include electromyographic signals and/or posture signals generated by professionals and/or non-professionals when performing the same type of motion (e.g., sitting chest clamping). In some embodiments, the electromyographic signals and/or gesture signals in the sample information may be subjected to the segmentation processing of flow 700, the glitch processing of flow 900, the conversion processing of flow 1300, and the like, to form at least one segment of electromyographic signals and/or gesture signals. The at least one segment of electromyographic signals and/or gesture signals may be used as an input to a machine learning model to train the machine learning model. In some embodiments, the characteristic information corresponding to at least one segment of the electromyographic signal and/or the characteristic information corresponding to the gesture signal may also be used as an input to the machine learning model to train the machine learning model. For example, frequency information and amplitude information of the electromyographic signals may be used as inputs to the machine learning model. For another example, the angular velocity, the angular velocity direction, and the acceleration value of the angular velocity of the gesture signal may be used as inputs to the machine learning model. For another example, signals (including electromyographic signals and/or posture signals) corresponding to the action start point, the action intermediate point, and/or the action end point of the action signal may be used as inputs to the machine learning model.
In some embodiments, when training a machine learning model that identifies user action types, sample information (each segment of electromyographic and/or gesture signals) from different action types may be labeled. For example, sample information from electromyographic and/or posture signals generated when a user performs sitting chest clamping may be labeled "1", where "1" is used to characterize "sitting chest clamping"; sample information from electromyographic and/or gesture signals generated when the user performs a bicep curl may be labeled "2", where "2" is used to characterize "bicep curl". The characteristic information of the electromyographic signals (e.g., frequency information, amplitude information) and of the gesture signals (e.g., angular velocity direction, angular velocity value) differ between action types. By using the labeled sample information (e.g., the characteristic information corresponding to the electromyographic signals and/or the gesture signals in the sample information) as the input of the machine learning model to train the machine learning model, an action recognition model for recognizing the action type of the user can be obtained, and inputting an action signal into this model outputs the corresponding action type.
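As a purely illustrative sketch of this training step, the snippet below fits a scikit-learn SVM (one of the candidate models listed above) on synthetic, labeled feature vectors; the four-feature layout (EMG amplitude, EMG frequency, angular velocity value, angular acceleration), the data, and the label values are assumptions, not the actual training pipeline of this application.

```python
# Minimal sketch: train an action-type classifier on labeled feature vectors.
import numpy as np
from sklearn.svm import SVC
from sklearn.model_selection import train_test_split

rng = np.random.default_rng(0)

# Each row: [EMG mean amplitude, EMG dominant frequency (Hz),
#            angular velocity magnitude, angular acceleration] per action segment.
X_chest = rng.normal(loc=[0.8, 60.0, 2.0, 5.0], scale=0.1, size=(50, 4))
X_curl = rng.normal(loc=[0.5, 80.0, 3.5, 8.0], scale=0.1, size=(50, 4))
X = np.vstack([X_chest, X_curl])
y = np.array([1] * 50 + [2] * 50)  # 1 = sitting chest clamping, 2 = bicep curl

X_train, X_test, y_train, y_test = train_test_split(X, y, random_state=0)
model = SVC(kernel="rbf")  # SVM is one of the candidate models listed above
model.fit(X_train, y_train)
print("held-out accuracy:", model.score(X_test, y_test))
```

A quality-recognition model could follow the same pattern, with the labels replaced by "1" (standard action) and "0" (false action) as described below.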
In some embodiments, the motion recognition model may also include a machine learning model for determining the quality of the user's motion. The sample information here may include standard motion signals (also referred to as positive samples) and non-standard motion signals (also referred to as negative samples). The standard motion signal may include a motion signal generated by a professional performing a standard motion. For example, the motion signal generated by a professional when performing standard sitting chest clamping is a standard motion signal. The non-standard motion signal may include a motion signal generated by a user performing a non-standard motion (e.g., a false motion). In some embodiments, the electromyographic signals and/or gesture signals in the sample information may be subjected to the segmentation processing of flow 700, the glitch processing of flow 900, the conversion processing of flow 1300, and the like, to form at least one segment of electromyographic signals and/or gesture signals. The at least one segment of electromyographic signals and/or gesture signals may be used as an input to a machine learning model to train the machine learning model. In some embodiments, labeling may be performed on the positive and negative samples in the sample information (each segment of the electromyographic and/or gesture signal). For example, a positive sample is labeled "1" and a negative sample is labeled "0". Here, "1" is used to characterize the user's action as a standard action, and "0" is used to characterize the user's action as a false action. The trained machine learning model may output different labels based on the input sample information (e.g., positive samples, negative samples). It should be noted that the motion recognition model may include one or more machine learning models for analyzing and recognizing the quality of the user's motion, and different machine learning models may analyze and recognize sample information from different motion types, respectively.
In some embodiments, the motion recognition model may also include a model that recognizes the number of motions in the user's fitness activity. For example, the motion signal (e.g., the electromyographic signal and/or the gesture signal) in the sample information is subjected to the segmentation processing of flow 700 to obtain at least one set of an action start point, an action intermediate point, and an action end point, and each set is labeled, for example, with the action start point labeled "1", the action intermediate point labeled "2", and the action end point labeled "3". The labels are used as inputs of a machine learning model; a set of consecutive "1", "2", "3" inputs to the machine learning model can be output as 1 action. For example, inputting 3 consecutive sets of "1", "2", "3" into the machine learning model may output 3 actions.
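The counting rule described above can also be illustrated without a learned model: the sketch below simply counts consecutive (1, 2, 3) label triples, which is an assumed simplification of how a trained counting model would behave.

```python
# Count complete actions from labeled segmentation points
# (1 = action start, 2 = action intermediate point, 3 = action end).
from typing import List

def count_actions(point_labels: List[int]) -> int:
    """Count complete actions as consecutive (1, 2, 3) label triples."""
    count = 0
    i = 0
    while i + 2 < len(point_labels):
        if point_labels[i:i + 3] == [1, 2, 3]:
            count += 1
            i += 3  # move past the completed action
        else:
            i += 1  # skip stray or out-of-order labels
    return count

# Three consecutive sets of (1, 2, 3) yield 3 actions, as in the example above.
print(count_actions([1, 2, 3, 1, 2, 3, 1, 2, 3]))  # -> 3
```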
In some embodiments, the action recognition model may also include a machine learning model for recognizing a user fatigue index. The sample information here may also include other physiological parameter signals, such as electrocardiographic signals, respiratory rate, temperature signals, humidity signals, and the like. For example, different frequency ranges of the electrocardiographic signal can be used as input data of a machine learning model, with a frequency between 60 beats/min and 100 beats/min labeled "1" (normal) and a frequency below 60 beats/min or above 100 beats/min labeled "2" (abnormal). In some embodiments, the frequency of the user's electrocardiographic signal may be further divided into finer ranges tagged with different indices as input data, so that the trained machine learning model can output a corresponding fatigue index according to the frequency of the electrocardiographic signal. In some embodiments, the machine learning model may also be trained in conjunction with physiological parameter signals such as respiratory rate, temperature signals, and the like. In some embodiments, the sample information may be obtained from a memory device of the processing device 110. In some embodiments, the sample information may be obtained from the acquisition module 210. It should be noted that the motion recognition model may be any one of the above machine learning models, a combination of several machine learning models, or may include other machine learning models, and may be selected according to the actual situation. The training input to the machine learning model is not limited to a one-segment (one-cycle) motion signal, and may also be a partial motion signal within one segment, a multi-segment motion signal, or the like.
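The heart-rate labeling rule above can be sketched directly in code; the finer fatigue-index bands in the second function are hypothetical and only illustrate how additional ranges might be tagged with different indices.

```python
# Label an electrocardiographic frequency as normal/abnormal, and derive a
# hypothetical coarse fatigue index from heart-rate bands.
def heart_rate_label(bpm: float) -> int:
    """1 = normal (60-100 beats/min), 2 = abnormal, per the rule above."""
    return 1 if 60.0 <= bpm <= 100.0 else 2

def fatigue_index(bpm: float) -> int:
    """Illustrative finer-grained index; the bands are assumptions."""
    if bpm < 60.0:
        return 0   # below resting range
    if bpm <= 100.0:
        return 1   # normal range
    if bpm <= 140.0:
        return 2   # elevated
    return 3       # high exertion / possible fatigue

for bpm in (55, 72, 110, 150):
    print(bpm, heart_rate_label(bpm), fatigue_index(bpm))
```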
In step 2030, an action recognition model is extracted.
In some embodiments, this step may be performed by processing device 110. In some embodiments, the processing device 110 and/or the processing module 220 may extract the action recognition model. In some embodiments, the action recognition model may be stored in the processing device 110, the processing module 220, or the mobile terminal.
In step 2040, a user action signal is acquired.
In some embodiments, this step may be performed by the acquisition module 210. For example, in some embodiments, the electromyographic sensor in the acquisition module 210 may acquire the electromyographic signal of the user, and the gesture sensor in the acquisition module 210 may acquire the gesture signal of the user. In some embodiments, the user action signal may also include other physiological parameter signals, such as an electrocardiographic signal, a respiration signal, a temperature signal, a humidity signal, etc., while the user is in motion. In some embodiments, after the user motion signal (e.g., the electromyographic signal and/or the gesture signal) is acquired, it may be subjected to the segmentation processing of flow 700, the glitch processing of flow 900, the conversion processing of flow 1300, and the like, to form at least one segment of electromyographic signals and/or gesture signals.
In step 2050, a user action is determined based on the user action signal by the action recognition model.
This step may be performed by processing device 110 and/or processing module 220. In some embodiments, processing device 110 and/or processing module 220 may determine the user action based on the action recognition model. In some embodiments, the trained action recognition model may include one or more machine learning models. In some embodiments, the motion recognition model may include, but is not limited to, one or more of a machine learning model that classifies the user's motion signal, a machine learning model that identifies the quality of the user's motion, a machine learning model that identifies the number of user motions, and a machine learning model that identifies the fatigue index of the user performing the motion. Different machine learning models may produce different recognition results. For example, a machine learning model that classifies the user's motion signal may take the user's motion signal as input data and output the corresponding motion type. For another example, a machine learning model that identifies the quality of the user's motion may take the user's motion signal as input data and output the quality of the motion (e.g., standard motion, false motion). For another example, a machine learning model that identifies the fatigue index of the user performing an action may take the user's action signal (e.g., the frequency of the electrocardiographic signal) as input data and output the user's fatigue index. In some embodiments, the user action signal and the determination result (output) of the machine learning model may also be used as sample information for training the action recognition model, so that the action recognition model can be retrained to optimize its relevant parameters. It should be noted that the motion recognition model is not limited to the trained machine learning model described above, and may also include preset components, for example, a manually preset condition judgment algorithm, or manually added parameters (e.g., a confidence level) on top of the trained machine learning model.
In step 2060, the user action is fed back based on the determination result.
In some embodiments, this step may be performed by wearable device 130 and/or mobile terminal device 140. Further, the processing device 110 and/or the processing module 220 send a feedback instruction to the wearable device 130 and/or the mobile terminal device 140 based on the determination result of the user action, and the wearable device 130 and/or the mobile terminal device 140 provide feedback to the user based on the feedback instruction. In some embodiments, the feedback may include issuing a prompt (e.g., text information, picture information, video information, voice information, indicator light information, etc.) and/or performing a corresponding action (current stimulation, vibration, pressure change, thermal change, etc.) to stimulate the user's body. For example, when the user performs sit-ups, monitoring the motion signal may show that the trapezius muscle exerts excessive force during the exercise (that is, the motion of the user's head and neck is not standard), in which case the input/output module 260 (e.g., a vibration prompter) in the wearable device 130 and the mobile terminal device 140 (e.g., a smart watch, a smart phone, etc.) perform corresponding feedback actions (e.g., applying vibration to the corresponding body part of the user, issuing a voice prompt, etc.) to prompt the user to adjust the force-exerting part in time. In some embodiments, by monitoring the motion signal during the movement of the user, the motion type, the motion quality, and the number of motions of the user during the movement are determined, and the mobile terminal device 140 can output a corresponding movement record so that the user can understand the movement situation during the movement.
In some embodiments, when feedback is provided to the user, the feedback may be matched to the user's perception. For example, when the user's motion is not standard, a vibration stimulus is applied to the body part corresponding to the user's motion; the user can know from the vibration stimulus that the motion is not standard, and the vibration stimulus remains within a range acceptable to the user. Further, a matching model may be built based on the user action signal and the user's perception to find the best balance between user perception and real feedback.
In some embodiments, the motion recognition model may also be trained from the user motion signals. In some embodiments, training the motion recognition model based on the user motion signal may include evaluating the user motion signal to determine a confidence level of the user motion signal. The magnitude of the confidence level may be indicative of the quality of the user action signal. For example, the higher the confidence, the better the quality of the user action signal. In some embodiments, the evaluation of the user action signal may be performed at the stage of acquisition of the action signal, preprocessing, segmentation, and/or recognition.
In some embodiments, training the motion recognition model from the user motion signal may further include determining whether the confidence level is greater than a confidence threshold (e.g., 80). If the confidence level is greater than or equal to the confidence threshold, the user action signal corresponding to the confidence level is used as sample data to train the action recognition model; if the confidence level is smaller than the confidence threshold, the user action signal corresponding to the confidence level is not used as sample data to train the action recognition model. In some embodiments, the confidence level may include, but is not limited to, the confidence level at any one stage of acquiring the action signal, signal preprocessing, signal segmentation, or signal recognition. For example, the confidence level of the action signal collected by the acquisition module 210 may be used as the judgment criterion. In some embodiments, the confidence level may also be a joint confidence level over any several of the stages of acquiring the action signal, signal preprocessing, signal segmentation, or signal recognition. The joint confidence level may be calculated based on the confidence level of each stage, for example by averaging or weighting. In some embodiments, training the motion recognition model based on the user motion signal may be performed in real time, periodically (e.g., every day, every week, every month, etc.), or whenever a certain amount of data has been accumulated.
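A minimal sketch of the confidence-gated sample selection described above is given below, assuming a joint confidence computed as a weighted average over the acquisition, preprocessing, segmentation, and recognition stages; the stage weights are illustrative, while the threshold of 80 follows the example above.

```python
# Keep only user action signals whose joint confidence reaches the threshold,
# so that they can be used as sample data for retraining.
from typing import Dict, List, Tuple

CONFIDENCE_THRESHOLD = 80.0

def joint_confidence(stage_conf: Dict[str, float],
                     weights: Dict[str, float]) -> float:
    """Weighted average of per-stage confidence values."""
    total_w = sum(weights[s] for s in stage_conf)
    return sum(stage_conf[s] * weights[s] for s in stage_conf) / total_w

def select_training_samples(samples: List[Tuple[dict, Dict[str, float]]],
                            weights: Dict[str, float]) -> List[dict]:
    kept = []
    for signal, stage_conf in samples:
        if joint_confidence(stage_conf, weights) >= CONFIDENCE_THRESHOLD:
            kept.append(signal)
    return kept

weights = {"acquisition": 1.0, "preprocessing": 1.0,
           "segmentation": 1.0, "recognition": 2.0}
samples = [
    ({"id": "a"}, {"acquisition": 90, "preprocessing": 85,
                   "segmentation": 88, "recognition": 92}),
    ({"id": "b"}, {"acquisition": 70, "preprocessing": 60,
                   "segmentation": 65, "recognition": 75}),
]
print(select_training_samples(samples, weights))  # only sample "a" is kept
```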
It should be noted that the above description of the process 2000 is only for illustration and description, and is not intended to limit the application scope of the present disclosure. Various modifications and changes to the process 2000 will be apparent to those skilled in the art in light of the present description. However, such modifications and variations are still within the scope of the present description.
In some embodiments, when the user action is not standard, the processing device 110 and/or the processing module 220 may issue a feedback instruction to the wearable device 130 and/or the mobile terminal device 140 based on the determination result of the user action, and the wearable device 130 and/or the mobile terminal device 140 provide feedback to the user based on the feedback instruction. For example, the input/output module 260 (e.g., a vibration prompter) in the wearable device 130 and the mobile terminal device 140 (e.g., a smart watch, a smart phone, etc.) may perform corresponding feedback actions (e.g., applying vibration to the user's body part, issuing a voice prompt, etc.) to alert the user that the athletic motion is not standard or is wrong. In this case, although the user receives a message indicating that there is a non-standard motion during movement, the user cannot determine from the feedback alone the cause of the non-standard motion, for example, a non-standard movement posture, an incorrect muscle exertion location, or an incorrect muscle exertion magnitude. On the other hand, if the user feels that the movement was performed well after receiving feedback from the movement monitoring system 100 that the movement was not standard, the user's trust in the movement monitoring system 100 may decrease. For example, when a user performs a bicep curl, the standard posture requires relaxed shoulders; the user may subjectively believe that the shoulders are relaxed, but in reality the shoulders are not relaxed, resulting in excessive trapezius exertion. At this time, the user's subjective impression contradicts the analysis result of the wearable device 130 and/or the mobile terminal device 140, and the user may consider the feedback result of the wearable device 130 and/or the mobile terminal device 140 to be inaccurate. Therefore, the embodiments of the present disclosure also provide a method for displaying a motion monitoring interface, which uses a display device to display information related to the user's movement (for example, the exertion location of a muscle, the exertion intensity of a muscle, and a motion model of the user), so that the user can intuitively observe the problems in his or her own motion from the displayed content and adjust the motion in time to exercise scientifically.
FIG. 21A is an exemplary flow chart of a method of displaying a motion monitoring interface according to some embodiments of the present description. As shown in fig. 21A, the process 2100 may include:
Step 2110, obtaining motion signals from at least one sensor while the user is in motion.
In some embodiments, step 2110 may be performed by the acquisition module 210. In some embodiments, the motion signal of the user when moving may refer to the human body parameter information of the user when moving. In some embodiments, the body parameter information may include, but is not limited to, one or more of an electromyographic signal, a posture signal, an electrocardiographic signal, a temperature signal, a humidity signal, a blood oxygen concentration, a respiratory rate, and the like. In some embodiments, a sensor in the acquisition module 210 may acquire motion signals as the user moves. In some embodiments, the electromyographic sensors in the acquisition module 210 may acquire the electromyographic signals of the user during exercise. For example, when a user performs sitting chest clamping, the electromyographic sensors in the wearable device corresponding to positions such as the pectoral muscles and the latissimus dorsi can collect the electromyographic signals of the corresponding muscle positions of the user. In some embodiments, the gesture sensor in the acquisition module 210 may acquire gesture signals of the user during motion. For example, when the user performs a barbell pressing motion, a gesture sensor in the wearable device corresponding to the position of the triceps brachii can collect a gesture signal of the position of the user's triceps brachii. In some embodiments, the at least one sensor may include, but is not limited to, one or more of a gesture sensor, an electrocardiographic sensor, an electromyographic sensor, a temperature sensor, a humidity sensor, an inertial sensor, an acid-base sensor, an acoustic wave transducer, and the like. Different types of sensors may be placed at different locations of the user's body depending on the signal to be measured, so that sensors of different types and/or at different locations may collect different motion signals.
In some embodiments, the motion signal may be a signal obtained by processing the motion signals of the user acquired by the plurality of sensors in the acquisition module 210 with filtering, rectification, and/or wavelet transformation, the segmentation processing of flow 700, the glitch processing of flow 900, or a combination of any one or more of the above processing flows. As previously described, signal processing such as filtering, rectification, and/or wavelet transformation, the segmentation processing of flow 700, and the glitch processing of flow 900 may be performed by the processing module 220 and/or the processing device 110. The acquisition module 210 may acquire the processed action signal from the processing module 220 and/or the processing device 110.
In step 2120, information about the user's motion is determined by processing the motion signal.
In some embodiments, step 2120 may be performed by processing module 220. In some embodiments, the information related to the user's motion may include one or more of a user's motion type, motion frequency, motion intensity, motion model, and the like. In some embodiments, the processing module 220 may analyze the motion signal of the user, determine characteristic information of the motion signal (e.g., amplitude information of the electromyographic signal, frequency information, and/or angular velocity of the gesture signal, angular velocity direction, acceleration value of the angular velocity), and determine information related to the motion of the user based on the characteristic information of the motion signal.
In some embodiments, the information related to the user's movement may include the strength of the force of at least one muscle while the user is in motion. In some embodiments, the processing module 220 may determine the strength of the force exerted by at least one muscle of the user from the electromyographic signals acquired by the electromyographic sensor. For example, when the user performs a squatting motion, the myoelectric sensors disposed at the positions of the gluteus maximus and quadriceps of the human body can collect myoelectric signals of the positions of the corresponding muscles of the user, and the processing module 220 can determine the strength of the force of the gluteus maximus and quadriceps of the user based on the signal strength of the obtained myoelectric signals.
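One common way to turn an EMG channel into an exertion intensity is a moving RMS envelope normalized against a reference contraction; the sketch below assumes that approach, and the window length and reference value are illustrative rather than taken from this application.

```python
# Estimate muscle exertion intensity from an EMG channel via a moving RMS
# envelope, normalized against a reference (e.g., maximum-contraction) RMS.
import numpy as np

def emg_exertion(emg: np.ndarray, fs: float, reference_rms: float,
                 window_s: float = 0.25) -> np.ndarray:
    """Return exertion as a 0-1 fraction of the reference contraction."""
    win = max(1, int(window_s * fs))
    squared = emg.astype(float) ** 2
    rms = np.sqrt(np.convolve(squared, np.ones(win) / win, mode="same"))
    return np.clip(rms / reference_rms, 0.0, 1.0)

fs = 1000.0                                                    # sampling rate, Hz
emg = 0.3 * np.random.default_rng(1).standard_normal(2000)     # synthetic EMG
print(emg_exertion(emg, fs, reference_rms=0.5).max())
```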
In some embodiments, the processing module 220 may determine the type of action of the user based on the action signal. For example, the processing module 220 may determine the type of action of the user based on the action signal and an action recognition model (e.g., the action recognition model described in fig. 20). For another example, the user may manually input the action type. Further, the processing module 220 may determine the muscles located at the user's training site (also referred to as muscles of the exercise position) and the muscles located at the user's non-training site (also referred to as muscles of the non-exercise position) based on the type of motion of the user. The muscles at the non-training site may be those where the user tends to exert force incorrectly when performing a certain action, or those at locations prone to injury. Different action types may correspond to different muscles of the exercise position and muscles of the non-exercise position. In some embodiments, the user may preset the muscles of the exercise position and the muscles of the non-exercise position corresponding to each action type. In some embodiments, the processing module 220 may determine whether the user's force application location is correct and whether the motion posture is standard based on the exertion intensity of the muscles of the user's exercise position and/or the muscles of the non-exercise position. For example, if the exertion intensity of the muscles of the exercise position is too small (e.g., less than a threshold) and/or the exertion intensity of the muscles of the non-exercise position is too large (e.g., greater than a threshold), the user's exertion may be deemed incorrect, in which case the input/output module 260 may send a feedback signal to the user to prompt the user to adjust the athletic motion in time.
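The threshold check on exercise-position and non-exercise-position muscles can be sketched as follows; the muscle names, the 0-1 exertion scale, and the two thresholds are assumptions for illustration.

```python
# Check whether exercise-position muscles exert enough force while
# non-exercise-position muscles stay relaxed, and build prompt messages.
from typing import Dict, List

def check_exertion(exertion: Dict[str, float],
                   exercise_muscles: List[str],
                   non_exercise_muscles: List[str],
                   low: float = 0.4, high: float = 0.3) -> List[str]:
    """Return human-readable prompts for wrongly exerted muscles."""
    prompts = []
    for m in exercise_muscles:
        if exertion.get(m, 0.0) < low:
            prompts.append(f"{m}: exertion too small, engage this muscle")
    for m in non_exercise_muscles:
        if exertion.get(m, 0.0) > high:
            prompts.append(f"{m}: exertion too large, relax this muscle")
    return prompts

# Example for a squat: gluteus maximus / quadriceps are exercise-position
# muscles, while the trapezius should stay relaxed.
exertion = {"gluteus_maximus": 0.7, "quadriceps": 0.35, "trapezius": 0.5}
print(check_exertion(exertion, ["gluteus_maximus", "quadriceps"], ["trapezius"]))
```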
In some embodiments, the information related to the user's motion may include a user action model representing the actions of the user's motion. For example, when a user performs a dumbbell action, gesture sensors disposed at the deltoid muscle, the upper limb joints (e.g., the elbow joint), and the like may collect gesture signals of the user's deltoid muscle, upper limb joints, and the like. The processing module 220 may process each gesture signal to obtain its corresponding characteristic information (e.g., angular velocity information, acceleration information, stress information, displacement information) and generate an action model of the user's dumbbell action from the characteristic information. For more on generating the user action model from gesture signals, see fig. 22 and its related description.
Step 2130, displaying information related to the user's motion.
In some embodiments, step 2130 may be performed by input/output module 260. In some embodiments, information related to the movement of the user may be displayed on a display device (e.g., screen) of the wearable device 130 or the mobile terminal device 140 to enable the user to intuitively observe the movement conditions during the movement itself.
In some embodiments, as shown in fig. 21B, the interface of the display device may display a human front muscle profile 2101 and a human back muscle profile 2102. When the user starts to exert force, the color of the muscle corresponding to the user's force-exerting part in the human muscle profiles (e.g., the human front muscle profile 2101, the human back muscle profile 2102) may change, so that the user may intuitively perceive the exertion intensity of the muscles of his or her body from the color change of the corresponding muscle part in the human muscle profile. For example, when the user performs sit-ups, the exertion intensity of muscles such as the rectus abdominis, the external oblique, the internal oblique, and the transversus abdominis of the user's abdomen, and the trapezius of the user's shoulder, may be displayed in the human muscle profile. In some embodiments, the greater the exertion intensity of a muscle of the user, the darker the color (e.g., closer to red) in the human muscle profile corresponding to that muscle.
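The color change described above can be illustrated with a simple intensity-to-color mapping; the linear white-to-red ramp below is an assumption, not the actual rendering used by the interface.

```python
# Map a 0-1 muscle exertion intensity to an RGB display color, where stronger
# exertion gives a darker / redder shade (0.0 -> white, 1.0 -> pure red).
from typing import Tuple

def exertion_to_rgb(intensity: float) -> Tuple[int, int, int]:
    intensity = min(max(intensity, 0.0), 1.0)
    green_blue = int(round(255 * (1.0 - intensity)))
    return (255, green_blue, green_blue)

for level in (0.0, 0.5, 1.0):
    print(level, exertion_to_rgb(level))
```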
In some embodiments, processing module 220 and/or the user may determine whether the sit-up motion is standard based on the exertion intensity of the muscles at different locations. For example, when the user performs sit-ups, if the exertion intensities of the rectus abdominis, the external oblique, the internal oblique, and the transversus abdominis of the user's abdomen are higher than a first intensity threshold (the first intensity threshold may be set according to the exertion intensity of the corresponding muscles when a professional performs standard sit-ups), and the exertion intensity of the trapezius of the user's shoulder is less than a second intensity threshold (the second intensity threshold may be set according to the exertion intensity of the corresponding muscles when a professional performs standard sit-ups), the processing module 220 may determine that the user's sit-up motion is standard; otherwise, it may determine that the user's sit-up motion is not standard.
It should be noted that the human front muscle distribution map 2101 and the human back muscle distribution map 2102 shown in fig. 21B are only examples, and the human front muscle distribution map 2101 and the human back muscle distribution map 2102 may be arranged in an up-down arrangement, a left-right arrangement, or other arrangement that facilitates observation in the interface.
In some embodiments, the input/output module 260 may obtain user input regarding the target muscle. The target muscle may refer to a muscle that the user pays more attention to exercising. For example, the target muscle may be a muscle that the user exercises with emphasis during a certain training session. In some embodiments, the location and/or number of target muscles may be related to the type of action of the user. For example, when the user performs a squat, the target muscles may include one or more of the gluteus maximus, quadriceps femoris, tibialis anterior, and the like. For another example, when the user performs sit-ups, the target muscles may include one or more of the rectus abdominis, external oblique, internal oblique, transversus abdominis, trapezius, and the like. In some embodiments, the processing module 220 may determine the type of motion of the user based on the motion signal and automatically determine the target muscle based on the type of motion of the user. In some embodiments, the user may manually determine the action type, and the processing module 220 may determine the target muscle from the action type entered by the user based on the correspondence between action types and target muscles. In some embodiments, the user may manually determine the target muscle. For example, the user may set a particular muscle in the human muscle profile as the target muscle by clicking on that muscle. For another example, the user may set a particular muscle as the target muscle by entering the name of that muscle in the interface of the display device.
In some embodiments, the interface of the display device may include status bars (e.g., status bar 2103 and status bar 2104 shown in fig. 21B). The status bar may be used to present information about the target muscle (e.g., the exertion intensity of the target muscle). For example, when the target muscle input by the user is a pectoral muscle, the exertion intensity of the pectoral muscle may be displayed by a status bar. In some embodiments, the color of the status bar is related to the exertion intensity of the target muscle. For example, the darker the color of the status bar, the greater the exertion intensity of the target muscle. By displaying the status bar on the interface, the user can more intuitively perceive the exertion intensity of the target muscle, and the exertion intensity can be represented more quantitatively. In some embodiments, the status bar may show the proportional relationship between the exertion intensity of the target muscle and the standard exertion intensity (or the maximum exertion intensity). The standard exertion intensity can be set according to the exertion intensity of the corresponding muscle when a professional performs the standard action. The maximum exertion intensity can be set according to the exertion limit of the human muscle. For example, if the status bar is full, the exertion intensity of the user's target muscle is consistent with the standard exertion intensity. Through the status bar displayed on the interface, the user can more intuitively perceive the difference between his or her muscle exertion intensity and the standard exertion intensity, so as to adjust the muscle exertion in time.
In some embodiments, the number of status bars may be related to the number of target muscles. For example, when the user sets the triceps brachii as the target muscle, two status bars may be displayed on the left and right sides of the interface, respectively, the left status bar (e.g., status bar 2103 in fig. 21B) may be used to display the strength of the force exerted by the user's left arm triceps brachii, and the right status bar (e.g., status bar 2104 in fig. 21B) may be used to display the strength of the force exerted by the user's right arm triceps brachii. The two state bars are used for respectively displaying the force strength of the target muscles on the left side and the right side of the user, so that the user can be helped to judge whether the force of the muscles on the left side and the right side of the body is balanced during exercise, and the damage to the body caused by uneven force of the left side and the right side of the body is avoided. It should be noted that the status bar shown in fig. 21B is by way of example only, and the status bar may be any number and may be disposed at any location of the interface.
In some embodiments, the input/output module 260 may include a sound output device (e.g., a speaker). The sound output device may emit sound (e.g., flame combustion sound, bell sound, water flow sound), and the volume of the emitted sound may be related to the strength of the force of the target muscle. For example, the volume of the emitted sound is positively correlated with the strength of the force of the target muscle, i.e., the greater the strength of the force of the target muscle, the greater the volume of the emitted sound; the weaker the strength of the force of the target muscle, the lower the volume of the sound emitted. In some embodiments, the sound output device may include a left channel and a right channel, and the different channels may correspond to the strength of the force of different target muscles. For example, the sound emitted by the left channel may correspond to the strength of the force of a target muscle on the left side of the user's body (e.g., the left brachial triceps), and the sound emitted by the right channel may correspond to the strength of the force of a target muscle on the right side of the user's body (e.g., the right brachial triceps). The user can feel the strength of the force of the muscles at different positions by using the multichannel sounding mode of the sound output device, and the user can judge whether the forces of the muscles at the left side and the right side of the body are balanced or not by hearing only, so that the experience of the user can be further improved.
It should be noted that the above description of the process 2100 is for purposes of example and illustration only and is not intended to limit the scope of applicability of the present disclosure. Various modifications and changes to the process 2100 may be made by those skilled in the art under the guidance of this specification. For example, step 2120 may be split into multiple steps, performing processing and determination of the action signal, respectively. However, such modifications and variations are still within the scope of the present description.
FIG. 22 is an exemplary flow chart of a method of displaying a motion monitoring interface according to some embodiments of the present description. As shown in fig. 22, the process 2200 may include:
Step 2210, generating, based on the gesture signal, a user action model representing the user's motion action.
In some embodiments, step 2210 may be performed by processing module 220. In some embodiments, the user action model may include a user 3D action model, a user 2D action model, and the like. The user 3D motion model and/or the user 2D motion model may reproduce the motion of the user motion. It will be appreciated that the motion reproduction of the user's motion may reflect the pose of the user's motion to some extent without requiring that the motion reproduction be exactly identical to the user's actual motion.
In some embodiments, the processing module 220 may generate a user action model representing the user's athletic actions from the gesture signals acquired by the gesture sensor. In some embodiments, the plurality of gesture sensors may be placed at different positions of the wearable device 130 (e.g., positions corresponding to the trunk, limbs, joints in the wearable device 130) according to the gesture signals that need to be acquired, so as to measure gesture signals corresponding to different parts of the human body, where the plurality of position gesture signals may reflect the relative motion situation between the different parts of the human body. In some embodiments, the gesture signal is associated with a type of gesture sensor. For example, when the attitude sensor is an angular velocity triaxial sensor, the acquired attitude signal is angular velocity information. For another example, when the posture sensor is an angular velocity triaxial sensor and an acceleration triaxial sensor, the acquired posture signal is angular velocity information and acceleration information. For another example, when the posture sensor is a strain sensor, the strain sensor may be disposed at a joint position of the user, and by measuring a resistance of the strain sensor that varies with a tensile length, the acquired posture signals may be displacement information, stress, and the like, and by these posture signals, a bending angle and a bending direction at the joint of the user may be represented. For another example, the gesture sensor is an ultrasonic sensor, placed at a fixed location on a joint or limb of the user, and the position of the sensor is determined by measuring the time of flight (TOF) of the sound waves, thereby determining the gesture of the user. The gesture signal acquired by the gesture sensor and the corresponding characteristic information (such as angular velocity direction, angular velocity value, angular velocity acceleration value, angle, displacement information, stress, etc.) thereof can reflect the gesture of the user's motion. The processing module 220 may generate a user action model representing the user motion action based on the gesture of the user motion. For example, the processing module 220 may generate a virtual character (e.g., a three-dimensional or two-dimensional animated model) to exhibit the pose of the user's motion.
In some embodiments, the processing module 220 may determine other types of information related to the user's motion (e.g., muscle information) based on other types of motion signals (e.g., electromyographic signals) and present the other types of information related to the user's motion on the user motion model. In some embodiments, the processing module 220 may determine the strength of the force exerted by the at least one muscle of the user based on the electromyographic signals, and the processing module 220 may display the strength of the force exerted by the at least one muscle of the user on a corresponding location of the user action model. For example, when the user performs a squat exercise, the processing module 220 may obtain electromyographic signals from electromyographic sensors disposed at the gluteus maximus, quadriceps and tibialis anterior, and the processing module 220 may determine muscle strength of the gluteus maximus, quadriceps and tibialis anterior from the electromyographic signals, respectively, and display the muscle strength of the gluteus maximus, quadriceps and tibialis anterior at positions corresponding to the gluteus maximus, quadriceps and tibialis anterior in the user exercise model. In some embodiments, different muscle strength magnitudes may correspond to different display colors. By simultaneously displaying other types of information related to the movement of the user in the user action model, the user can more intuitively and comprehensively know the movement state of the user.
Step 2220, obtaining the standard action model.
In some embodiments, step 2220 may be performed by acquisition module 210. In some embodiments, the standard motion model may be a motion model generated based on standard motion information (e.g., standard posture information, standard myoelectricity information) of a professional (e.g., fitness trainer) while exercising. In some embodiments, the standard motion model may include a standard 3D motion model, a standard 2D motion model, and the like. The standard 3D motion model and/or the standard 2D motion model may reproduce the motion of the professional's movements. It will be appreciated that the motion reproduction of standard sport may reflect to some extent the posture of the professional's sport without requiring that the motion reproduction be exactly identical to the professional's actual motion. In some embodiments, the standard motion model may present multiple types of motion-related information (e.g., muscle information) as the professional exercises.
In some embodiments, different types of actions correspond to different standard action models. For example, sit-up exercises correspond to sit-up standard motion models, dumbbell fly motions correspond to dumbbell fly standard motion models. In some embodiments, a plurality of standard motion models corresponding to a plurality of motion types may be pre-stored in a storage device of the motion monitoring system 100, and the obtaining module 210 may obtain the standard motion model corresponding to the user motion type from the storage device according to the motion type of the user.
Step 2230, displaying the user action model and the standard action model.
In some embodiments, step 2230 may be performed by the input/output module 260. In some embodiments, the display device may display the user action model and the standard action model simultaneously. For example, the user action model and the standard action model can be displayed in a superimposed or parallel manner, and the user can more intuitively and rapidly judge whether the own motion action is standard or not by observing and comparing the user action model and the standard action model, so that the motion action can be adjusted in time.
In some embodiments, it may be determined whether the user's actions require adjustment by comparing the degree of overlap between the contour of the user action model and the contour of the standard action model. For example, if it is determined that the degree of overlap between the contour of the user action model and the contour of the standard action model is greater than a threshold (e.g., 90%, 95%, 98%), the user's action may be determined to be standard, without adjustment. If it is determined that the degree of overlap between the contour of the user action model and the contour of the standard action model is less than the threshold (e.g., 90%, 95%, 98%), it may be determined that the user's action is not standard. The input/output module 260 may then issue a prompt to remind the user to adjust the athletic motion.
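One possible way to quantify the "degree of overlap" between the two model contours is an intersection-over-union on binary silhouette masks, as sketched below; treating 90% as the decision threshold follows the example above, while the mask representation is an assumption.

```python
# Compare the user action model with the standard action model by the
# intersection-over-union (IoU) of two boolean silhouette masks.
import numpy as np

def contour_overlap(user_mask: np.ndarray, standard_mask: np.ndarray) -> float:
    inter = np.logical_and(user_mask, standard_mask).sum()
    union = np.logical_or(user_mask, standard_mask).sum()
    return float(inter) / float(union) if union else 1.0

user = np.zeros((100, 100), dtype=bool)
standard = np.zeros((100, 100), dtype=bool)
user[20:80, 30:70] = True        # synthetic user silhouette
standard[22:82, 30:70] = True    # synthetic standard silhouette

overlap = contour_overlap(user, standard)
print(f"overlap = {overlap:.2%}", "standard" if overlap >= 0.90 else "adjust")
```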
In some embodiments, it may be determined whether the user's actions require adjustment by comparing the muscle information presented on the user action model with the muscle information presented by the standard action model. For convenience of illustration, the following description takes a bicep curl performed with the left arm as an example. In a bicep curl, the muscles primarily involved in exercise include the biceps brachii, deltoid, trapezius, and pectoral muscles. FIGS. 23A-23C are schematic illustrations of motion monitoring interfaces according to some embodiments of the present description. FIGS. 23A to 23C respectively show a user action model 010 (also referred to as a myoelectric animation 010 of a virtual user character) and a standard action model 020 (also referred to as a reference myoelectric animation 020 of a virtual reference character) displayed on a display device. In FIGS. 23A to 23C, the myoelectric animation 010 of the virtual user character may be displayed in the left half of the motion monitoring interface, and the reference myoelectric animation 020 of the virtual reference character may be displayed in the right half of the motion monitoring interface. The motion monitoring interface shown in fig. 23A corresponds to the myoelectric animation at a moment before the action starts. As shown in fig. 23A, before the action starts, the user and the professional are in a relaxed state, so no muscles are exerting force. At this time, in the myoelectric animation 010 of the virtual user character, no color is displayed in the user display area 011 corresponding to the biceps brachii, the user display area 012 corresponding to the deltoid, the user display area 013 corresponding to the trapezius, and the user display area 014 corresponding to the pectoral muscle. The reference display area 021 corresponding to the biceps brachii, the reference display area 022 corresponding to the deltoid, the reference display area 023 corresponding to the trapezius, and the reference display area 024 corresponding to the pectoral muscle in the reference myoelectric animation 020 of the virtual reference character are likewise not displayed in color.
The motion monitoring interface shown in fig. 23B may correspond to the myoelectric animation at a certain moment during the bicep curl. During a bicep curl, the principal point of force should in theory be the biceps brachii, and in some cases the pectoral muscle may be slightly engaged, for example when the user does not keep the head up and the chest out. In a standard bicep curl, the trapezius should exert no or little force. As shown in fig. 23B, the display color of the user display area 013 corresponding to the trapezius in the myoelectric animation 010 of the virtual user character is darker than the color of the reference display area 023 corresponding to the trapezius in the reference myoelectric animation 020 of the virtual reference character, which indicates that the user's trapezius exerts a relatively large force during the curl, exceeding the degree of trapezius exertion in the standard bicep curl.
The motion monitoring interface shown in fig. 23C corresponds to the myoelectric animation at a certain moment between the end of one bicep curl and the start of the next movement cycle. In a set of consecutive bicep curls, the muscles should not relax completely between the end of one complete movement cycle and the start of the next complete movement cycle. That is, when the barbell reaches the lowest position, the biceps brachii should not be completely relaxed but should maintain a certain force, so as to achieve the best exercise effect. As shown in fig. 23C, in the myoelectric animation 010 of the virtual user character, no color is displayed in the user display area 011 corresponding to the biceps brachii, indicating that the user is in a completely relaxed state, whereas the reference display area 021 corresponding to the biceps brachii in the reference myoelectric animation 020 of the virtual reference character is darker.
In summary, by observing the myoelectric animation 010 of the virtual user character and the reference myoelectric animation 020 of the virtual reference character, the user can clearly and intuitively see the difference between his or her own muscle exertion shown in the myoelectric animation 010 and the standard muscle exertion shown in the reference myoelectric animation 020, so as to find the problems in the current movement and adjust it in time. For more information on displaying the user action model and the standard action model, reference may be made to International Application No. PCT/CN2021/093302, filed on May 12, 2021, the entire contents of which are incorporated herein by reference.
It should be noted that the above description of the process 2200 is for illustration and description only, and is not intended to limit the scope of applicability of the present disclosure. Various modifications and changes to the process 2200 will be apparent to those skilled in the art in light of the present description. However, such modifications and variations are still within the scope of the present description.
FIG. 24 is an exemplary flow chart of a method of displaying a motion monitoring interface according to some embodiments of the present description. As shown in fig. 24, the process 2400 may include:
Step 2410, segmenting the motion signal based on the electromyographic or gesture signal.
In some embodiments, step 2410 may be performed by processing module 220. In some embodiments, the acquisition of motion signals (e.g., electromyographic signals, gesture signals) while the user is moving is continuous, and the motion of the user may be a combination of multiple sets of actions or a combination of actions of different action types. To analyze the individual actions in the user's motion, the processing module 220 may segment the user action signal based on the electromyographic or gesture signals. In some embodiments, segmenting the motion signal may refer to dividing the motion signal into signal segments of the same or different durations, or extracting one or more signal segments of a particular duration from the motion signal. In some embodiments, each segment of the action signal may correspond to one or more complete actions of the user. For example, when the user performs a squat, the user may move from a standing posture to a squatting posture and then return to the standing posture; the action signal collected by the acquisition module 210 during this process may be regarded as one segment (or one cycle) of the action signal, and the action signal collected by the acquisition module 210 when the user completes the next squat is regarded as another segment of the action signal. Each step of an action during movement changes the electromyographic and posture signals of the corresponding body parts. Based on this, the processing module 220 may segment the user's motion signal based on the electromyographic or gesture signal. For example, the processing module 220 may segment the motion signal of the user based on the characteristic information corresponding to the electromyographic signal or the characteristic information corresponding to the gesture signal. For details on segmenting the motion signal based on the electromyographic or gesture signal, reference may be made to fig. 6-8 of the present description and the related description.
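As a simplified illustration of segmentation based on the gesture signal, the sketch below marks an action start point when the angular-velocity magnitude rises above a threshold and an action end point when it falls back below it; the threshold and the synthetic signal are assumptions, not the actual segmentation of flows 700-800.

```python
# Segment a continuous gesture signal into per-action segments using a
# threshold on the angular-velocity magnitude.
import numpy as np

def segment_actions(ang_vel: np.ndarray, threshold: float) -> list:
    """Return (start_index, end_index) pairs for each detected action."""
    active = ang_vel > threshold
    segments, start = [], None
    for i, flag in enumerate(active):
        if flag and start is None:
            start = i                    # candidate action start point
        elif not flag and start is not None:
            segments.append((start, i))  # action end point reached
            start = None
    if start is not None:
        segments.append((start, len(ang_vel)))
    return segments

t = np.linspace(0, 6, 600)
ang_vel = np.abs(np.sin(np.pi * t / 2))  # three synthetic action cycles
print(segment_actions(ang_vel, threshold=0.3))
```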
Step 2420, monitoring the motion of the user motion based on the at least one segment of motion signal, and determining the monitoring result.
In some embodiments, step 2420 may be performed by processing module 220. In some embodiments, the at least one segment of motion signal may be a motion signal of at least one training process by the user. In some embodiments, a training process may refer to a process by which a user completes one training action. For example, the user completing a squat maneuver may be a training process. In some embodiments, a training process may also refer to a process in which a user performs multiple identical or different training actions. For example, the user may complete a number of squat actions in succession as a training process. For another example, the user sequentially completing the squat and jump-in-place actions in succession may be a training process. In some embodiments, a training process may refer to a process of training actions performed by a user over a period of time. For example, a training process may be a process of training actions completed within a day, a week, a month, or a year.
It should be noted that a motion signal may be a motion signal of a complete training process or a motion signal of a partial training process in a complete training process. In some embodiments, for a complex complete training process, there are different ways of applying force and different strength of applying force to muscles at different stages of the complete training process, that is, there are different motion signals at different stages of the training process, and the real-time performance of motion monitoring of the user can be improved by monitoring the motion signals at different stages of the complete training process.
In some embodiments, the monitoring results may include one or more of a type of action, a number of actions, a quality of action, a time of action, physiological parameter information of the user, a core stability of the user, an intermittent time, an expected recovery time, etc. of the user during at least one training session. The physiological parameter information of the user may include, but is not limited to, one or more of heart rate (e.g., average heart rate, maximum heart rate), blood pressure, body temperature, energy expenditure during exercise, and the like. In most training, the muscle groups of the abdomen and the waist are kept in a tension state to keep the trunk stable, so that the training efficiency is improved, and the injury risk is reduced. The ability of the lumbar and abdominal muscle group to maintain force is referred to as core stability. Intermittent time may refer to the interval between two consecutive actions. For example, when the user performs a squat maneuver, the intermittent time may refer to the time interval between the user starting the first squat maneuver and starting the second squat maneuver. The expected recovery time may refer to the time that the user returns each body part (e.g., muscle) from a state of motion to a normal state after the training is completed. For example, the expected recovery time may be a time for which the muscles of the user are recovered from a fatigue state to a relaxed state after the exercise is completed.
In some embodiments, the user's motion may be monitored based on at least one segment of the action signal, and the monitoring result determined. In some embodiments, the monitoring result (e.g., action type, action quality) may be determined based on at least one segment of the action signal (e.g., electromyographic signal, gesture signal) and at least one segment of a preset action signal (e.g., preset electromyographic signal, preset gesture signal). The at least one preset action signal may be a standard action signal acquired by a sensor while a professional performs the standard action, and may be stored in a database in advance. In some embodiments, the action type or the action quality of the user during movement can be determined from the matching degree between the feature information corresponding to the at least one segment of the action signal and the feature information corresponding to the at least one segment of the preset action signal. For example, if the matching degree between the feature information corresponding to a segment of the user's action signal and the feature information corresponding to a segment of the preset action signal is higher than a certain threshold (for example, 95%), it may be determined that the action type of the user's movement is consistent with the action type of the preset action signal. For another example, if the matching degree between a segment of the user's action signal and a segment of the preset action signal of the same type is higher than a certain threshold (for example, 95%), it may be determined that the quality of the user's action meets the requirement and no adjustment is needed. In some embodiments, the monitoring result (e.g., heart rate, energy expenditure) of the user's motion may be determined based on feature information corresponding to physiological signals (e.g., electrocardiographic signals, respiratory signals) of the user acquired by different types of sensors. For more description regarding determining the user's action type, number of actions, action quality, action time, physiological parameter information, etc., reference may be made to fig. 19-20 of the present specification and the related description.
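By way of a non-limiting illustration only, the matching-degree comparison described above might be sketched as follows; the feature representation, the helper names, and the 95% threshold as a default are assumptions made for illustration and do not reproduce the actual implementation of the disclosure.

```python
import numpy as np

def matching_degree(user_features: np.ndarray, preset_features: np.ndarray) -> float:
    """Similarity in [0, 1] between the feature information of a user action
    segment and that of a preset (standard) action segment."""
    diff = np.abs(user_features - preset_features)
    scale = np.maximum(np.abs(preset_features), 1e-9)
    return float(np.clip(1.0 - np.mean(diff / scale), 0.0, 1.0))

def classify_segment(user_features, preset_library, threshold=0.95):
    """Compare one segment against a library of preset action signals and return
    the best-matching action type when the match exceeds the threshold."""
    best_type, best_score = None, 0.0
    for action_type, preset_features in preset_library.items():
        score = matching_degree(user_features, preset_features)
        if score > best_score:
            best_type, best_score = action_type, score
    return (best_type, best_score) if best_score >= threshold else (None, best_score)
```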
In some embodiments, the method of monitoring the user based on at least one segment of the action signal and determining the monitoring result may be an algorithm that does not rely on another segment of the action signal (e.g., a preset action signal used for comparison). In some embodiments, the algorithm is a machine-learning-based model: the action signal is input into a neural network model or a conventional machine learning model, which outputs the action type, the number of actions, the action quality, or the errors present in the action. In some embodiments, the algorithm is based on state machine transitions and outputs the action type, the number of actions, the action quality, or the errors present in the action as the action signal passes through a series of states. In some embodiments, the algorithm is a combination of threshold determinations that gives the action type, the number of actions, the action quality, or the errors present in the action by determining whether the action signal satisfies a series of conditions.
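As one non-limiting sketch of the state-machine idea above, a squat repetition counter could look roughly like the following; the knee-angle input and the two angle thresholds are hypothetical values chosen only for illustration.

```python
def count_squats(knee_angles, down_thresh=100.0, up_thresh=160.0):
    """Minimal state machine: one repetition is counted when the knee angle
    drops below `down_thresh` and later rises back above `up_thresh`."""
    state, reps = "up", 0
    for angle in knee_angles:
        if state == "up" and angle < down_thresh:
            state = "down"          # the user has descended into the squat
        elif state == "down" and angle > up_thresh:
            state = "up"            # the user has stood back up: one full rep
            reps += 1
    return reps
```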
In some embodiments, the core stability of the user may be determined based on the electromyographic signals acquired by the electromyographic sensors. For example, the core stability may be determined based on the proportion of time the user's abdominal muscles exert force during one training process: the higher this proportion, the higher the user's core stability. In some embodiments, the core stability of the user may be determined based on the gesture signal acquired by the gesture sensor. For example, the core stability may be determined based on the amplitude of the motion of the user's torso during one training process. In some embodiments, the core stability of the user may be determined based on both the electromyographic signals and the gesture signals. For example, the core stability may be determined based on the proportion of time the user's abdominal muscles exert force and the amplitude of the user's torso motion during one training process.
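A minimal sketch of combining the two cues above (abdominal activation time and torso sway) into a single score might look like this; the equal weighting, the activation threshold, and the sway normalization value are assumptions for illustration only.

```python
import numpy as np

def core_stability(abdominal_emg_envelope, torso_angles,
                   emg_active_thresh=0.2, sway_norm=30.0):
    """Combine the fraction of time the abdominal muscles are active with the
    torso sway amplitude into a 0-1 core-stability score."""
    active_ratio = float(np.mean(np.asarray(abdominal_emg_envelope) > emg_active_thresh))
    angles = np.asarray(torso_angles)
    sway = float(np.max(angles) - np.min(angles))   # torso motion amplitude (degrees)
    sway_penalty = min(sway / sway_norm, 1.0)       # 1.0 means very large sway
    return 0.5 * active_ratio + 0.5 * (1.0 - sway_penalty)
```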
In some embodiments, the monitoring results may include muscle information of the user. In some embodiments, the muscle information of the user may include, but is not limited to, at least one of the degree of participation of at least one muscle, the energy expenditure of at least one muscle, the degree of fatigue of at least one muscle, the balance of at least two muscles, the capacity of at least one muscle, and the like.
The degree of muscle participation (also referred to as contribution) and the degree of fatigue may indicate whether the user has effectively exercised the target training muscle (e.g., the key training muscle) during exercise, and whether other, non-target muscles have compensated by exerting force, so that the quality of the user's exercise can be assessed. In some embodiments, the energy expenditure of a muscle may be determined based on the electromyographic signals and the training time of that muscle. In some embodiments, the degree of participation of each muscle may be determined as the proportion of that muscle's energy expenditure to the energy expenditure of all muscles during the user's exercise. For example, assuming the user expends 500 kcal of energy across all muscles in a certain training and the pectoral muscles expend 250 kcal, the degree of participation (contribution) of the pectoral muscles can be determined to be 50%. In some embodiments, the degree of muscle participation may be determined based on feature information of the electromyographic signals. The feature information of the electromyographic signals may include amplitude information (e.g., root mean square amplitude, integrated electromyography, amplitude envelope) and/or frequency information (e.g., average power frequency, median frequency, short-time zero-crossing rate) of the electromyographic signals. For example, the degree of participation of a muscle may be determined based on the percentage of that muscle's integrated electromyography during one training process (or one action).
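The energy-share definition of participation degree in the example above (250 kcal out of 500 kcal giving 50%) can be expressed directly; the helper below is illustrative only, not the disclosed implementation.

```python
def muscle_participation(energy_by_muscle):
    """Participation (contribution) of each muscle as its share of total energy.
    E.g., {'pectoral': 250, 'other': 250} -> {'pectoral': 0.5, 'other': 0.5}."""
    total = sum(energy_by_muscle.values())
    if total == 0:
        return {muscle: 0.0 for muscle in energy_by_muscle}
    return {muscle: energy / total for muscle, energy in energy_by_muscle.items()}
```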
In some embodiments, the electromyographic signals may be pre-processed, and the degree of muscle participation determined based on the amplitude information and/or frequency information of the pre-processed electromyographic signals. In some embodiments, because different muscles have different muscle fiber types and different amounts of muscle tissue, the amplitude of the electromyographic signals that different muscles can produce also differs. For example, at the same subjective effort, muscle groups such as the biceps brachii tend to produce larger electromyographic signals, while muscle groups such as the pectoral muscles produce smaller ones. Therefore, the electromyographic signals can be normalized to eliminate or attenuate these differences in amplitude between muscle groups. In some embodiments, there is a non-linear relationship between the electromyographic signal and the force exerted by the user; for example, as the user's force grows larger, the amplitude of the electromyographic signal increases more and more slowly. Therefore, the amplitude of the electromyographic signal can be processed non-linearly, and the processed signal used to determine the degree of muscle participation.
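As an illustrative sketch only, the normalization and non-linear processing described above might be implemented as follows; the power-law exponent is a hypothetical choice, since the disclosure does not specify the non-linear mapping.

```python
import numpy as np

def normalize_emg(envelope, reference_amplitude):
    """Express the EMG amplitude envelope relative to a per-muscle reference
    amplitude (e.g., one obtained from a calibration action)."""
    return np.asarray(envelope) / reference_amplitude

def linearize_effort(normalized_envelope, gamma=1.5):
    """Apply a power-law correction so the processed value keeps growing with
    effort even when the raw amplitude increases ever more slowly."""
    return np.power(np.asarray(normalized_envelope), gamma)
```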
The degree of muscle fatigue may be used to evaluate the maximum capacity of the user's muscles and their potential for growth, thereby reflecting whether the user's muscles have been adequately exercised. When the user exercises (especially in strength training), the exercise drives the muscles into a fatigued state, and the body's natural recovery then produces supercompensation, which increases muscle strength, volume, endurance, and explosive power; it is therefore very useful to assess the user's muscle fatigue after exercise. In some embodiments, the degree of muscle fatigue may be determined based on feature information of the electromyographic signals. For example, the degree of fatigue of a muscle may be determined based on how much a feature value (e.g., average power frequency, median frequency, short-time zero-crossing rate) of the electromyographic signal changes (e.g., declines) during at least one training process (e.g., across multiple actions). For another example, if the amplitude of the electromyographic signal shows a decreasing trend over multiple movements, this indicates that the muscle is gradually entering a fatigued state; the faster the amplitude decreases (i.e., the steeper its downward slope), the higher the degree of fatigue. For another example, if strong jitter is detected in the amplitude of the electromyographic signal, this indicates that the muscle is gradually entering a fatigued state. For another example, the fatigue state of the muscle may be determined from the stability of the electromyographic amplitude envelope: the less stable the envelope, the more fatigued the muscle. In some embodiments, the degree of muscle fatigue may be determined based on feature information of the gesture signal (e.g., angular velocity direction, angular acceleration, angle, displacement information, stress). For example, if strong jitter is detected in the gesture signal, the user's movement is shaking or severely deformed, which indicates that the muscle is in a fatigued state.
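By way of a non-limiting sketch, the median-frequency decline mentioned above could be turned into a fatigue indicator roughly as follows; the sampling rate and the use of a linear fit across repetitions are assumptions for illustration.

```python
import numpy as np
from scipy.signal import welch

def median_frequency(emg_segment, fs=1000):
    """Median frequency of one EMG segment (the frequency splitting the power in half)."""
    freqs, psd = welch(emg_segment, fs=fs)
    cum = np.cumsum(psd)
    return freqs[np.searchsorted(cum, cum[-1] / 2.0)]

def fatigue_from_mf_decline(emg_segments, fs=1000):
    """Fit a line to the median frequency across successive repetitions; a steeper
    downward slope is read as faster fatigue accumulation."""
    if len(emg_segments) < 2:
        return 0.0
    mf = [median_frequency(seg, fs) for seg in emg_segments]
    slope = np.polyfit(np.arange(len(mf)), mf, 1)[0]
    return max(-float(slope), 0.0)   # larger value -> higher fatigue
```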
In some embodiments, the degree of muscle fatigue may be determined using a trained machine learning model. For example, a trained machine learning model may be generated by training an initial model based on sample information. In some embodiments, the sample information may include sample motion signals of a plurality of users and the corresponding sample muscle fatigue levels, where the sample fatigue levels may be determined based on the sample motion signals. In some embodiments, a training algorithm may be used to train the initial model based on the sample information to generate the trained machine learning model. Exemplary training algorithms may include gradient descent algorithms, Newton's methods, quasi-Newton methods, conjugate gradient algorithms, generative adversarial learning algorithms, and the like. The trained machine learning model is used to determine the degree of fatigue of the user's muscles based on the user's motion signals. For example, the user's motion signal may be input into the trained machine learning model, which outputs the degree of fatigue of the user's muscles.
In some embodiments, it may be determined whether the current amount of exercise exceeds the user's load based on the degree of fatigue of the user's muscles. For example, when the fatigue level of a certain muscle exceeds a first fatigue threshold, it may be determined that the current amount of exercise has exceeded the user's load, and a prompt may be sent to remind the user to reduce the amount of exercise or stop, preventing injury. For another example, when the fatigue level of a certain muscle is lower than a second fatigue threshold, it may be determined that the user's current amount of exercise is insufficient to achieve the expected training effect, or that the user still has strength in reserve; a prompt may then be sent to remind the user to increase the amount of exercise to ensure the training effect. In some embodiments, the recovery time may be predicted from the user's fatigue level and fed back to the user to help the user plan the next workout in advance.
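A minimal sketch of this two-threshold prompting logic is shown below; the threshold values and prompt texts are hypothetical and merely illustrate the behavior described above.

```python
def load_feedback(fatigue_level, first_fatigue_threshold=0.8, second_fatigue_threshold=0.3):
    """Map an estimated muscle fatigue level (0-1) to a load prompt."""
    if fatigue_level > first_fatigue_threshold:
        return "The current amount of exercise exceeds your load; reduce it or stop to prevent injury."
    if fatigue_level < second_fatigue_threshold:
        return "The current amount of exercise is insufficient; consider increasing it."
    return "The current amount of exercise is appropriate."
```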
In some embodiments, the balance of at least two muscles may be the degree of balance between the movements of the left-side and right-side muscles of the same muscle group of the user's body. For example, the balance of at least two muscles may refer to the balance between the user's left and right pectoralis major. If the muscles on the left and right sides of the body are in an unbalanced state during exercise, both the appearance and the standardness of the movement are affected, and when the imbalance between the two sides is severe, the user faces a risk of injury. Therefore, it is necessary to monitor the balance of the muscles on the left and right sides of the user's body. In some embodiments, the balance of the muscles may include the balance of the strength of the muscles' force, the balance of the muscles' fatigue levels, the balance of the muscles' energy expenditure, and the like.
In some embodiments, the balance of at least two muscles may be determined based on feature information of the action signals (e.g., electromyographic signals, gesture signals). In some embodiments, whether the strength of the force of two muscles is balanced may be determined by comparing the amplitude information (e.g., root mean square amplitude, integrated electromyography, amplitude envelope) of their electromyographic signals. For example, if the difference in the amplitude information of the electromyographic signals of the two muscles is within a threshold range, the strength of their force can be considered approximately the same. In some embodiments, whether the fatigue levels of two muscles are the same may be determined by comparing the frequency information (e.g., average power frequency, median frequency, short-time zero-crossing rate) of their electromyographic signals. For example, if the difference in the frequency information of the electromyographic signals of the two muscles is within a threshold range, their fatigue levels can be considered substantially the same. In some embodiments, the balance of the user's action posture may be determined by comparing feature information (e.g., acceleration, angular velocity) of the gesture signals of the two sides, so as to judge whether the movement speeds and movement angles of the limbs on the left and right sides of the user's body are consistent. In some embodiments, the overall balance of the muscles on the left and right sides of the user's body may be determined from the balance of the force strength of at least two muscles, the fatigue levels of at least two muscles, the balance of the user's action posture, and the like. In some embodiments, when it is determined that the balance between the user's left and right sides is poor, a prompt may be sent to remind the user to strengthen the exercise of certain muscle groups or to improve the posture of the current movement to ensure the training effect.
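By way of illustration only, comparing the left and right instances of one muscle might be sketched as follows; the RMS-amplitude feature and the tolerance value are assumptions, not the disclosed thresholds.

```python
import numpy as np

def balance_report(left_emg, right_emg, amp_tol=0.15):
    """Compare RMS amplitudes of the left- and right-side EMG of the same muscle;
    a relative difference within `amp_tol` is treated as balanced."""
    rms_l = float(np.sqrt(np.mean(np.square(left_emg))))
    rms_r = float(np.sqrt(np.mean(np.square(right_emg))))
    rel_diff = abs(rms_l - rms_r) / max(rms_l, rms_r, 1e-9)
    return {
        "left_rms": rms_l,
        "right_rms": rms_r,
        "balanced": rel_diff <= amp_tol,
        "stronger_side": "left" if rms_l > rms_r else "right",
    }
```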
The capacity of a muscle can be measured by the amount of training the user has completed when that muscle is exhausted. In some embodiments, the capacity of a muscle may be represented by a feature quantity determined from one or more of energy expenditure, number of sets, number of repetitions, weight, time, etc. For example, the capacity of a muscle may be expressed as the total work obtained by multiplying the total number of movements by the total weight, or as the power obtained by dividing that total work by time. In some embodiments, the user's muscle fatigue level may be determined based on the electromyographic signals and/or the gesture signals, and when the fatigue level is high (e.g., above a fatigue threshold), the amount of training the user has completed (e.g., energy expenditure) at that point may be taken as the capacity of the user's muscle.
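The total-work and power expressions above can be written directly; the function and argument names below are illustrative only.

```python
def muscle_capacity(total_reps, total_weight, duration_s):
    """Characterize muscle capacity by total work (reps x weight) and by that
    work divided by time, as described above."""
    total_work = total_reps * total_weight
    return {"total_work": total_work, "power": total_work / duration_s}
```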
Step 2430, determining a mode of action feedback based on the monitoring result.
In some embodiments, step 2430 may be performed by processing module 220.
In some embodiments, the mode of action feedback may include one or more of the feedback manner, the feedback priority, the feedback content, and the like. In some embodiments, the feedback manner may include, but is not limited to, one or more of a text prompt, a voice prompt, an image prompt, a video prompt, a vibration prompt, a pressure prompt, and the like. For example, a text prompt may be displayed via a display of the input/output module 260. A voice prompt may be implemented by the input/output module 260 and/or a speaker in the wearable device 130 playing a sound. Image prompts and video prompts may be implemented by the input/output module 260 and/or a display in the wearable device 130. A vibration prompt may be produced by the input/output module 260 and/or a vibration module in the wearable device 130. A pressure prompt may be implemented by electrodes in the wearable device 130. In some embodiments, the feedback manner may be determined according to the action type of the user's motion. For example, when the user is running, a text prompt is hard for the user to notice, so the monitoring result may instead be fed back through a voice prompt, a vibration prompt, or a pressure prompt.
In some embodiments, the feedback priority may include immediate feedback, feedback after one action is completed, feedback after one training is completed, and so on. Immediate feedback may mean that the input/output module 260 feeds back to the user through the corresponding feedback manner as soon as a problem occurs during exercise (e.g., the strength of a muscle's force is too high). Feedback after one action/training is completed may mean that, after a certain action or training is finished, the input/output module 260 provides the user with feedback in the form of training advice. In some embodiments, the feedback priority of an action may be determined based on the action type of the user's motion. For example, if the action type is one that easily injures the user, such as a squat in which the knee easily buckles inward and damages the knee joint, the feedback priority may be higher and a more conspicuous feedback manner (for example, a text prompt with a marker) may be selected so that the user receives the feedback in time and adjusts the action posture. For another example, if the action type is a biceps curl, the user's arm tends to relax at the lowest point instead of maintaining force, which lowers training efficiency but does not harm the user's body; in this case the feedback priority may be lower, for example, feedback through a text prompt after the user's training is completed.
In some embodiments, it may be determined from the monitoring result whether the user's action is incorrect, and the feedback priority may be determined based on the type of action error. The action error type may reflect how much harm the error can do to the user's body. In some embodiments, action error types may be classified into primary, secondary, and tertiary action error types. A primary action error type may be one that easily causes injury to the user (such as knee buckling in a squat); a secondary action error type may be one that prevents the target training muscle from being effectively exercised (such as exerting force by bending the arms in a seated chest fly, so that the biceps are exercised while the pectoral muscles are not); and a tertiary action error type may be one that merely lowers training efficiency (such as running too slowly). In some embodiments, when the action error type is a primary action error type, the feedback priority may be immediate feedback; when the action error type is a secondary action error type, the feedback priority may be feedback after one action is completed; and when the action error type is a tertiary action error type, the feedback priority may be feedback after one training is completed.
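A minimal sketch of the error-type-to-priority mapping described above follows; the string labels are hypothetical identifiers used only for illustration.

```python
def feedback_priority(error_type):
    """Map an action error type to a feedback priority, following the
    three-level scheme described above."""
    mapping = {
        "primary":   "immediate",              # risk of injury, e.g., knee buckling in a squat
        "secondary": "after_current_action",   # target muscle not effectively trained
        "tertiary":  "after_training",         # training efficiency merely reduced
    }
    return mapping.get(error_type, "after_training")
```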
In some embodiments, the feedback content may include one or more of the monitoring results (e.g., action type, number of actions, action quality, action time), the action error type, the degree of completion of the action, training advice, and the like. In some embodiments, the processing module 220 may determine the feedback content according to monitoring results such as the action type and the action error type of the user's motion. For example, after the user finishes one training, the input/output module 260 may feed back training information from the training process (e.g., action type, number of actions, action quality, action time) to help the user fully understand the training process. For another example, when the user makes an action error during exercise (e.g., knee buckling in a squat), the input/output module 260 may prompt the user about the current error to help the user adjust the action in time. In some embodiments, when the user makes an action error (e.g., a muscle force error) during movement, the error may be displayed at the corresponding muscle location in the user's action model. For example, edge blinking, icons, text, or symbols (e.g., exclamation marks) may be used at the corresponding muscle locations in the user action model to prompt the user about muscle force errors.
Step 2440, performing motion feedback on the user according to the mode of motion feedback.
In some embodiments, step 2440 may be performed by input/output module 260.
In some embodiments, the input/output module 260 may present the monitoring results to the user as text, charts (e.g., line charts, bar charts, pie charts), sound, images, video, etc.
FIG. 25 is a schematic illustration of a motion monitoring interface, according to some embodiments of the present description. As shown in fig. 25, basic training information and exercise count of the user after one training is displayed in a text manner in an interface 2500. In some embodiments, the user may pre-formulate a training program before training begins, and after training ends, compare the basic training information after training with the training program, thereby helping the user determine the completion of the training program.
FIG. 26 is a schematic diagram of a motion monitoring interface according to some embodiments of the present disclosure. As shown in fig. 26, the energy expenditure of each of the user's muscles after one training is displayed in a pie chart and in text in interface 2600. As can be seen from fig. 26, in this training, the user's muscle energy consumption, from highest to lowest, is the pectoral muscles, the biceps brachii, the latissimus dorsi, and other muscles. The pie chart lets the user intuitively see each muscle's share of the energy consumption.
FIG. 27 is a schematic illustration of a motion monitoring interface, according to some embodiments of the present description. As shown in fig. 27, the fatigue level of the muscles, the evaluation of the fatigue level, and the evaluation of the maximum capacity of the muscles after one training are displayed graphically and in text in an interface 2700. As shown in fig. 27, different degrees of muscle fatigue may be represented by circular patterns of different colors, and each muscle may be given a fatigue evaluation according to its degree of fatigue and its maximum capacity (e.g., fully exerted, capacity remaining, handled with ease).
FIG. 28 is a schematic view of a motion monitoring interface shown according to some embodiments of the present description. As shown in fig. 28, the balance of the muscles on the left and right sides of the user's body after one training is shown as a bar chart in the interface 2800. Each muscle has a corresponding bar, and the position, length and/or color of the bar may indicate the balance of that muscle. For example, the longer and/or darker a muscle's bar, the worse that muscle's balance. As shown in fig. 28, the bars corresponding to the pectoral muscles and the biceps brachii lie to the right, which can indicate that the right-side pectoral muscle and the right-side biceps brachii expend more energy; the bar corresponding to the latissimus dorsi lies to the left, which can indicate that the left-side latissimus dorsi expends more energy. In addition, the bar corresponding to the pectoral muscles is longer (or darker) than the bar corresponding to the biceps brachii, indicating that the balance of the pectoral muscles is lower than that of the biceps brachii.
FIG. 29 is a schematic diagram of a motion monitoring interface shown according to some embodiments of the present description. As shown in fig. 29, the proportion of time the abdominal muscles exert force during one training is displayed in the interface 2900 in the form of a status bar, which can reflect the user's core stability. For example, as can be seen from fig. 29, the user's abdominal muscles exert force 70% of the time during one training process (e.g., sit-ups), indicating good core stability.
In some embodiments, the monitoring results may be displayed in a user model (e.g., the human front muscle distribution model 2101 and human back muscle distribution model 2102 shown in fig. 21B, or the user action model 010 shown in figs. 23A-23C). For example, one or more of the energy expenditure of at least one muscle of the user, the degree of fatigue of at least one muscle, the training balance of at least two muscles, the capacity of at least one muscle, etc. may be displayed at one or more specific locations in the user model, wherein the at least one specific location in the user model corresponds to the location of the at least one muscle on the user. In some embodiments, different muscle energy consumption, different degrees of muscle fatigue, different muscle training balance, and/or different muscle capacities correspond to different display colors, so that the user can perceive the training results more intuitively. In some embodiments, the input/output module 260 may obtain user input regarding a target muscle and display the information of the target muscle in a display interface.
FIG. 30 is a schematic diagram of a motion monitoring interface shown according to some embodiments of the present description. As shown in fig. 30, the user's muscle contribution (e.g., percentage of muscle energy expenditure) during a training process is displayed in the form of a human muscle profile in interface 3000. As can be seen from fig. 30, the contribution of the user's left pectoralis major is 20%, the contribution of the right pectoralis major is 30%, and the contribution of each of the left and right biceps brachii is 20%. In some embodiments, the higher a muscle's contribution, the darker the color of that muscle at the corresponding location in the muscle profile.
FIG. 31 is a schematic illustration of a motion monitoring interface according to some embodiments of the present description. As shown in fig. 31, the fatigue level of the muscle of the user during one training is displayed in the interface 3100 in the form of a human muscle profile. For example, the higher the degree of fatigue of a muscle, the darker the color of the muscle at the corresponding location in the muscle profile.
It should be noted that the interface presentation shown in fig. 25-31 is by way of example only, and in some embodiments, the balance and/or muscle capacity of at least two muscles may be displayed in the interface in the form of a human muscle profile. In some embodiments, multiple monitoring results may be presented simultaneously in multiple ways in one interface. For example, the user's muscle contribution and muscle fatigue during a training session may be displayed simultaneously in the body muscle profile. For another example, the energy consumption of each part of the muscle of the user after one training may be displayed in a pie chart manner in the interface, and simultaneously the energy consumption of each part of the muscle of the user during one training may be displayed in the human muscle profile.
In some embodiments, the athletic monitoring system 100 may count athletic data during multiple user training sessions to generate athletic records, thereby helping users to learn about their own physical performance and fitness changes during long-term athletic activities and helping users maintain good athletic habits.
FIG. 32 is a schematic diagram of a motion monitoring interface shown according to some embodiments of the present description. As shown in fig. 32, the contribution (or energy expenditure) of each of the user's muscles over different training periods (e.g., training periods in units of days, weeks, months, years) is shown in interface 3200 by bar chart 3210. For example, the contributions of different muscles may be displayed in different colors within one bar. In some embodiments, the user may select a target muscle in the muscle profile 3220 in the interface 3200. For example, the user may click a certain muscle in the muscle profile 3220 as the target muscle. As shown in fig. 33, when the user selects the pectoral muscle 3330 in the muscle profile 3320 as the target muscle, the contribution of the pectoral muscle in different training periods is shown by bar chart 3310 in the interface 3300. Long-term statistics of each muscle group's contribution help the user understand their training preferences and training history, for example, which muscles are exercised frequently and which have not been exercised for a long time, and thus formulate a better training plan.
FIG. 34 is a schematic diagram of a motion monitoring interface shown according to some embodiments of the present description. As shown in fig. 34, the maximum energy expenditure of each of the user's muscles during a training process is shown in interface 3400 by bar chart 3410, reflecting the capacity of each muscle. In some embodiments, the user may select a target muscle in the muscle profile 3420 in the interface 3400. For example, the user may click a certain muscle in the muscle profile 3420 as the target muscle. As shown in fig. 35, when the user selects the pectoral muscle 3530 in muscle profile 3520 as the target muscle, the maximum energy expenditure of the pectoral muscle over different training periods is shown in interface 3500 by line chart 3510. Long-term statistics of the capacity of each muscle group let the user see how their capacity grows, which helps them formulate a better training plan.
FIG. 36 is a schematic illustration of a motion monitoring interface shown according to some embodiments of the present description. As shown in fig. 36, the user's muscle balance is illustrated in interface 3600 by bar chart 3610. In some embodiments, the user may select a target muscle in the muscle profile 3620 in the interface 3600. For example, the user may click a certain muscle in the muscle profile 3620 as the target muscle. The interface may then show the balance of the target muscle over different training periods. Recording muscle balance (or core stability) over a long period helps the user recognize deficiencies in their exercise and adjust the training plan in time.
It should be noted that the above description of the process 2400 is merely for purposes of example and illustration, and is not intended to limit the scope of the present disclosure. Various modifications and changes to the flow 2400 may be made by those skilled in the art under the guidance of this specification. However, such modifications and variations are still within the scope of the present description.
In some embodiments, the motion monitoring system 100 may calibrate the user's action signal acquired by the sensors. In some embodiments, the electromyographic signals collected by the electromyographic sensors are affected by a variety of factors (e.g., individual differences between users, the user's skin condition, the placement of the electromyographic sensors, muscle strength, muscle fatigue); factors such as individual differences, skin condition, and sensor placement can make the electromyographic signals collected from different users impossible to compare directly. Therefore, the electromyographic signals need to be calibrated to eliminate or weaken the influence of such factors. In some embodiments, before the exercise starts (e.g., during the warm-up phase), the motion monitoring system 100 may guide the user through a series of calibration actions (e.g., push-ups and other actions that mobilize many muscle groups, so as to activate most of the muscle groups to be detected). For example, a display device (e.g., a screen) of the wearable device 130 or the mobile terminal device 140 may show the calibration action, and the user may perform the corresponding calibration action following the instructions. The processing module 220 may use the electromyographic signals collected by the electromyographic sensors while the user performs the calibration action as reference values, and calibrate all electromyographic signals collected during the user's current exercise. Taking the push-up as an example, before the exercise starts, the motion monitoring system 100 may guide the user to perform several sets of push-ups (e.g., 3-5 push-ups), collect the electromyographic signals of the activated muscles such as the pectoral muscles, biceps brachii, triceps brachii, and rectus abdominis through the electromyographic sensors, and take a specific multiple of the electromyographic amplitude of the muscles activated by the push-up as the reference value. In some embodiments, the multiple may range between 1.2 and 5 times; for example, it may be between 1.2 and 3 times. In some embodiments, each muscle may correspond to a different multiple. The multiple may be a value preset by the user or the motion monitoring system 100, or a value determined by analyzing characteristics of the electromyographic signal. In some embodiments, the reference value of the target user's electromyographic signal for the current exercise may be determined based on multiple historical electromyographic signals acquired while the target user performed the calibration action during multiple historical exercises. In some embodiments, the reference value of the target user's electromyographic signal for the current exercise may be determined based on electromyographic signals acquired from multiple users while they performed the calibration action. Adjusting the electromyographic signal acquired from the target user during the calibration action using the target user's historical calibration signals and/or other users' calibration signals can improve the accuracy and reasonableness of the reference value.
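As a non-limiting sketch of the calibration described above, a per-muscle reference value could be derived from the calibration action and applied to the workout signals roughly as follows; the use of the maximum envelope value and the default multiple of 2 are illustrative assumptions within the 1.2-5x range mentioned above.

```python
import numpy as np

def calibration_reference(calibration_envelope, multiple=2.0):
    """Per-muscle reference value: a chosen multiple of the EMG amplitude
    recorded while the user performs the calibration action."""
    return multiple * float(np.max(calibration_envelope))

def calibrate_signal(raw_envelope, reference_value):
    """Scale the EMG envelope collected during the current workout by the
    per-muscle reference so signals from different users/sessions are comparable."""
    return np.asarray(raw_envelope) / reference_value
```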
In some embodiments, the motion monitoring system 100 may guide the user through a warm-up exercise and display the result of the warm-up. Warming up before exercise can improve the user's athletic performance, prevent muscle cramps or strains during exercise, and reduce the risk of injury. In some embodiments, a display device (e.g., a screen) of the wearable device 130 or the mobile terminal device 140 may display a series of warm-up actions to guide the user through the warm-up. In some embodiments, the processing module 220 may determine the user's warm-up result from the user's physiological information. For example, since warming up raises the user's heart rate, body temperature, and perspiration, a sensor (e.g., an electrode) or other hardware on the wearable device 130 may detect the contact resistance between the electrode and the human body, determine the body's perspiration state, and judge from it whether the user's warm-up is sufficient. For another example, whether the user's warm-up is adequate may be judged from the user's muscle fatigue level. For another example, whether the warm-up is sufficient may be judged from information such as the user's amount of exercise, heart rate, and body temperature. In some embodiments, a warm-up suggestion may be presented to the user based on the warm-up result, for example, indicating that the warm-up is sufficient and formal exercise can begin, or indicating that the warm-up needs to continue.
In some embodiments, the processing module 220 may determine whether the working state of a sensor is normal based on the action signal collected by that sensor. The working state of the sensor may include the contact state between the sensor and the skin, such as the degree of fit between the sensor and the skin and the contact impedance between them. The quality of the action signal collected by a sensor placed on the user's skin is related to this contact state; for example, when the sensor fits the skin poorly, the collected action signal contains more noise and cannot reflect the user's real motion state. In some embodiments, the degree of fit between the sensor and the skin may be determined based on the quality of the action signal (e.g., the amount of noise in it) and/or the contact impedance between the sensor and the skin. If the degree of fit is below a certain threshold, the sensor's working state can be determined to be abnormal, and a prompt may be sent to remind the user to check the sensor. FIG. 37 is a schematic view of a motion monitoring interface shown according to some embodiments of the present description. As shown in fig. 37, interface 3700 may show a human muscle profile 3710 in which dashed line 3720 marks that the sensor at the right pectoral muscle fits the user's skin poorly. In some embodiments, sensor locations that fit the user's skin poorly may be marked by other means (e.g., with different colors).
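A minimal sketch of the working-state check described above is given below; the impedance and noise limits are hypothetical values, since the disclosure does not specify the thresholds.

```python
def sensor_state_ok(contact_impedance_kohm, noise_level,
                    impedance_limit=100.0, noise_limit=0.05):
    """Judge whether a sensor's contact with the skin is acceptable from its
    contact impedance and the noise level of its collected signal."""
    return contact_impedance_kohm < impedance_limit and noise_level < noise_limit
```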
In some embodiments, the user's action signal may include a signal related to a user characteristic. The processing module 220 may determine the user's characteristic information based on this signal. The characteristic information may include body shape information, body composition information, and the like. Body shape information may include waist circumference, chest circumference, hip circumference, arm length, leg length, shoulder width, etc. Body composition information may include body weight, body fat rate, fat distribution, fat thickness, muscle distribution, bone density, and the like. For example, strain sensors may be placed at multiple locations on the user's body; by measuring how each sensor's resistance varies with its stretched length, displacement information, stress, and the like can be acquired, from which the user's body shape information can be characterized. For another example, electrical signals may be applied to electrodes placed at multiple parts of the user's body, and information about the internal electrical conductivity of the human body can be extracted by measuring the body surface potential, allowing localized measurement of the user's body composition.
In some embodiments, the motion monitoring system 100 may monitor the user's characteristic information over a long period and display the statistical analysis results to the user, helping the user better understand their physical condition and formulate a more reasonable exercise plan. For example, the motion monitoring system 100 may recommend suitable exercise to the user, such as muscle-building exercise, fat-reducing exercise, or stretching, based on changes in the user's characteristic information over a period of time (e.g., changes in the user's fat distribution or muscle distribution).
In some embodiments, a wearable device of an appropriate size may be recommended to the user based on the user's body shape information. For example, if the user becomes slimmer after long-term exercise, a prompt may be sent to remind the user to replace the wearable device with a new one of a suitable size. For another example, when the user is choosing other types of wearable devices, an appropriate size may be recommended based on the user's body shape information.
In some embodiments, the user may choose the perception training mode when wearing the wearable device 130 for exercise. In the perception training mode, when a muscle of the user (e.g., a target muscle) exerts force, the display device (e.g., a screen) of the wearable device 130 or the mobile terminal device 140 may display the strength of that force. For example, the strength of the target muscle's force may be displayed by a status bar (e.g., status bars 2103 and 2104 shown in fig. 21B). For another example, the strength of the target muscle's force may be conveyed by the volume of a sound emitted by a sound output device (e.g., a speaker). For another example, changes in the strength of the target muscle's force may be shown by changing the brightness and color of the corresponding muscle portion in the user model. In some embodiments, if the strength of the user's target muscle force is consistent with the standard strength, the user may be prompted (e.g., by voice prompt, text prompt, etc.) to help the user strengthen the sense of controlling that muscle. The perception training mode can help the user learn to control their limbs and muscles, increase the brain's and nervous system's control over the muscles, effectively improve athletic performance, improve movement patterns, and even correct posture.
In some embodiments, the motion monitoring system 100 may formulate the user's exercise plan based on the user's relevant information. The relevant information may include the user's characteristic information (e.g., gender, body shape information, body composition information), exercise history, injury history, health status, desired training goals (e.g., muscle gain, fat loss, cardiopulmonary improvement, posture correction), desired training intensity (e.g., high-intensity, medium-intensity, low-intensity training), training category preferences (e.g., equipment training, bodyweight training, anaerobic training, aerobic training), etc. In some embodiments, an exercise plan may be formulated by a professional (e.g., a fitness trainer) based on the user's relevant information and uploaded to the motion monitoring system 100. The user may modify and adjust the plan according to the actual situation. FIG. 38 is a schematic diagram of a motion monitoring interface shown according to some embodiments of the present description. As shown in FIG. 38, by inputting or selecting a training goal (e.g., the muscles to be strengthened, the target to be achieved), the training intensity (e.g., high-intensity, medium-intensity, or low-intensity training), training preferences (e.g., equipment training, bodyweight training, anaerobic training, aerobic training), the training time and the plan period, etc., in interface 3800, an appropriate exercise plan can be formulated for the user based on these inputs and selections.
In some embodiments, the motion monitoring system 100 may estimate the service life (e.g., remaining usable time, remaining number of washes, remaining number of uses) of the wearable device. For example, the wearable device may include a garment life analysis module. The garment life analysis module may determine the wear level of the wearable device based on the contact impedance between the sensors and the user, the quality of the action signals acquired by the sensors (e.g., electromyographic sensor signals, inertial sensor signals, stress sensor signals), and the status of the wearable device (e.g., number of washes, time used, number of uses), and predict the remaining service life from that wear level. In some embodiments, when the remaining service life of the wearable device is less than a certain usage time (e.g., one week) or a certain number of uses (e.g., 5 times), a prompt may be sent to remind the user to replace the wearable device in time.
While the basic concepts have been described above, it will be apparent to those skilled in the art that the foregoing detailed disclosure is by way of example only and is not intended to be limiting. Although not explicitly described herein, various modifications, improvements and adaptations of the application may occur to those skilled in the art. Such modifications, improvements, and adaptations are suggested within the present disclosure and are therefore intended to be within the spirit and scope of the exemplary embodiments of the present disclosure.

Claims (15)

1. A method for displaying a motion monitoring interface, the method comprising:
acquiring action signals of a user during movement from at least one sensor, wherein the action signals at least comprise electromyographic signals or gesture signals;
determining information related to the movement of the user by processing the motion signal; and
displaying the information related to the movement of the user.
2. The method of claim 1, wherein the determining information related to the user's motion by processing the motion signal comprises:
based on the electromyographic signals, a strength of the force of at least one muscle of the user is determined.
3. The method of claim 2, wherein the displaying the information related to the user's motion comprises:
acquiring user input about a target muscle; and
displaying a status bar, the color of the status bar being related to the strength of the force of the target muscle, or
A sound is emitted, and the volume of the sound is related to the strength of the force of the target muscle.
4. The method of claim 1, wherein the determining information related to the user's motion by processing the motion signal comprises:
based on the gesture signal, a user action model is generated that represents an action of the user motion.
5. The method of claim 4, wherein the displaying the information related to the user's motion comprises:
obtaining a standard action model; and
and displaying the user action model and the standard action model.
6. The method of claim 4, wherein the displaying the information related to the user's motion comprises:
determining a strength of force of at least one muscle of the user based on the electromyographic signal; and
the strength of the force exerted by the at least one muscle is displayed on the user action model.
7. The method of claim 1, wherein the determining information related to the user's motion by processing the motion signal comprises:
segmenting the motion signal based on the electromyographic signal or the gesture signal; and
and monitoring the motion of the user motion based on at least one section of the motion signal, and determining a monitoring result.
8. The method of claim 7, wherein the method further comprises:
determining a mode of action feedback based on the monitoring result; and
and according to the action feedback mode, carrying out action feedback on the user.
9. The method of claim 7, wherein the monitoring result includes information of a muscle of the user corresponding to at least one point in time, the muscle information of the user including at least one of energy consumption of at least one muscle, a degree of fatigue of the at least one muscle, an equilibrium of the at least two muscles, a capacity of the at least one muscle, the displaying the information related to the movement of the user, comprising:
at least one of energy expenditure of at least one muscle of the user, a degree of fatigue of the at least one muscle, a training balance of the at least two muscles, a capacity of the at least one muscle is displayed at least one location in a user model, wherein the at least one location in the user model corresponds to a location of the at least one muscle in the user.
10. The method of claim 9, wherein the displaying the information related to the user's motion comprises:
acquiring user input about a target muscle; and
displaying information of the target muscle.
11. The method of claim 7, wherein the displaying the information related to the user's motion comprises:
and displaying the monitoring result in at least one mode of characters, charts, sounds, images and videos.
12. The method according to claim 1, wherein the method further comprises:
and calibrating the action signal.
13. The method according to claim 1, wherein the method further comprises:
judging whether the working state of the sensor is normal or not based on the action signal; and
and if the working state of the sensor is abnormal, displaying prompt information.
14. The method of claim 1, wherein the action signal comprises a signal related to the user characteristic, the method further comprising:
determining body type information and/or body composition information of the user based on the signals related to the user characteristics; and
and displaying the body type information and/or the body composition information of the user.
15. An electronic device, the electronic device comprising:
a display device configured to display content;
an input device configured to receive user input;
at least one sensor configured to detect a motion signal when a user moves, wherein the motion signal comprises at least an electromyographic signal or a gesture signal; and
a processor connected to the display device, the input device, and the at least one sensor, the processor configured to:
acquiring motion signals of the user during movement from the at least one sensor;
determining information related to the movement of the user by processing the motion signal; and
the display device is controlled to display information related to the movement of the user.
CN202210270372.0A 2022-03-18 2022-03-18 Motion monitoring method and device Pending CN116785659A (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
CN202210270372.0A CN116785659A (en) 2022-03-18 2022-03-18 Motion monitoring method and device

Applications Claiming Priority (1)

Application Number Priority Date Filing Date Title
CN202210270372.0A CN116785659A (en) 2022-03-18 2022-03-18 Motion monitoring method and device

Publications (1)

Publication Number Publication Date
CN116785659A true CN116785659A (en) 2023-09-22

Family

ID=88038465

Family Applications (1)

Application Number Title Priority Date Filing Date
CN202210270372.0A Pending CN116785659A (en) 2022-03-18 2022-03-18 Motion monitoring method and device

Country Status (1)

Country Link
CN (1) CN116785659A (en)

Similar Documents

Publication Publication Date Title
US9750454B2 (en) Method and device for mobile training data acquisition and analysis of strength training
US9226706B2 (en) System, apparatus, and method for promoting usage of core muscles and other applications
US20180055375A1 (en) Systems and methods for determining an intensity level of an exercise using photoplethysmogram (ppg)
CN107961523A (en) Human body training system and intelligent body-building system based on heart rate detection
US20230233103A1 (en) Motion monitoring methods and systems
US20230210402A1 (en) Methods and devices for motion monitoring
Wang et al. Motion analysis of deadlift for trainers with different levels based on body sensor network
CN116785659A (en) Motion monitoring method and device
CN115105819B (en) Motion monitoring method and system
CN210575125U (en) Intelligent sports equipment
US10779748B2 (en) Biometric electromyography sensor device for fatigue monitoring and injury prevention and methods for using same
TWI837620B (en) Method and system for motion monitoring
TW202239378A (en) Method and system for motion monitoring
RU2813471C1 (en) Methods and systems for identifying user action
CN115105819A (en) Motion monitoring method and system
US20230337989A1 (en) Motion data display method and system
Liang et al. WMS: Wearables-Based Multi-Sensor System for In-home Fitness Guidance

Legal Events

Date Code Title Description
PB01 Publication
PB01 Publication
SE01 Entry into force of request for substantive examination
SE01 Entry into force of request for substantive examination