CN113384863A - Damage risk assessment method, medium, chip, terminal device and system - Google Patents

Damage risk assessment method, medium, chip, terminal device and system Download PDF

Info

Publication number
CN113384863A
CN113384863A CN202010175108.XA CN202010175108A CN113384863A CN 113384863 A CN113384863 A CN 113384863A CN 202010175108 A CN202010175108 A CN 202010175108A CN 113384863 A CN113384863 A CN 113384863A
Authority
CN
China
Prior art keywords
user
terminal device
injury
motion
gravity
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Pending
Application number
CN202010175108.XA
Other languages
Chinese (zh)
Inventor
董晓杰
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
Huawei Technologies Co Ltd
Original Assignee
Huawei Technologies Co Ltd
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by Huawei Technologies Co Ltd filed Critical Huawei Technologies Co Ltd
Priority to CN202010175108.XA priority Critical patent/CN113384863A/en
Publication of CN113384863A publication Critical patent/CN113384863A/en
Pending legal-status Critical Current

Links

Images

Classifications

    • AHUMAN NECESSITIES
    • A63SPORTS; GAMES; AMUSEMENTS
    • A63BAPPARATUS FOR PHYSICAL TRAINING, GYMNASTICS, SWIMMING, CLIMBING, OR FENCING; BALL GAMES; TRAINING EQUIPMENT
    • A63B71/00Games or sports accessories not covered in groups A63B1/00 - A63B69/00
    • A63B71/06Indicating or scoring devices for games or players, or for other sports activities
    • A63B71/0619Displays, user interfaces and indicating devices, specially adapted for sport equipment, e.g. display mounted on treadmills
    • AHUMAN NECESSITIES
    • A61MEDICAL OR VETERINARY SCIENCE; HYGIENE
    • A61BDIAGNOSIS; SURGERY; IDENTIFICATION
    • A61B5/00Measuring for diagnostic purposes; Identification of persons
    • A61B5/103Detecting, measuring or recording devices for testing the shape, pattern, colour, size or movement of the body or parts thereof, for diagnostic purposes
    • A61B5/11Measuring movement of the entire body or parts thereof, e.g. head or hand tremor, mobility of a limb
    • A61B5/1116Determining posture transitions
    • AHUMAN NECESSITIES
    • A61MEDICAL OR VETERINARY SCIENCE; HYGIENE
    • A61BDIAGNOSIS; SURGERY; IDENTIFICATION
    • A61B5/00Measuring for diagnostic purposes; Identification of persons
    • A61B5/103Detecting, measuring or recording devices for testing the shape, pattern, colour, size or movement of the body or parts thereof, for diagnostic purposes
    • A61B5/11Measuring movement of the entire body or parts thereof, e.g. head or hand tremor, mobility of a limb
    • A61B5/1121Determining geometric values, e.g. centre of rotation or angular range of movement
    • AHUMAN NECESSITIES
    • A63SPORTS; GAMES; AMUSEMENTS
    • A63BAPPARATUS FOR PHYSICAL TRAINING, GYMNASTICS, SWIMMING, CLIMBING, OR FENCING; BALL GAMES; TRAINING EQUIPMENT
    • A63B71/00Games or sports accessories not covered in groups A63B1/00 - A63B69/00
    • A63B71/06Indicating or scoring devices for games or players, or for other sports activities
    • A63B71/0619Displays, user interfaces and indicating devices, specially adapted for sport equipment, e.g. display mounted on treadmills
    • A63B71/0622Visual, audio or audio-visual systems for entertaining, instructing or motivating the user
    • AHUMAN NECESSITIES
    • A63SPORTS; GAMES; AMUSEMENTS
    • A63BAPPARATUS FOR PHYSICAL TRAINING, GYMNASTICS, SWIMMING, CLIMBING, OR FENCING; BALL GAMES; TRAINING EQUIPMENT
    • A63B71/00Games or sports accessories not covered in groups A63B1/00 - A63B69/00
    • A63B71/06Indicating or scoring devices for games or players, or for other sports activities
    • A63B71/0619Displays, user interfaces and indicating devices, specially adapted for sport equipment, e.g. display mounted on treadmills
    • A63B71/0622Visual, audio or audio-visual systems for entertaining, instructing or motivating the user
    • A63B2071/0625Emitting sound, noise or music
    • AHUMAN NECESSITIES
    • A63SPORTS; GAMES; AMUSEMENTS
    • A63BAPPARATUS FOR PHYSICAL TRAINING, GYMNASTICS, SWIMMING, CLIMBING, OR FENCING; BALL GAMES; TRAINING EQUIPMENT
    • A63B71/00Games or sports accessories not covered in groups A63B1/00 - A63B69/00
    • A63B71/06Indicating or scoring devices for games or players, or for other sports activities
    • A63B71/0619Displays, user interfaces and indicating devices, specially adapted for sport equipment, e.g. display mounted on treadmills
    • A63B2071/0655Tactile feedback
    • AHUMAN NECESSITIES
    • A63SPORTS; GAMES; AMUSEMENTS
    • A63BAPPARATUS FOR PHYSICAL TRAINING, GYMNASTICS, SWIMMING, CLIMBING, OR FENCING; BALL GAMES; TRAINING EQUIPMENT
    • A63B2220/00Measuring of physical parameters relating to sporting activity
    • A63B2220/40Acceleration
    • AHUMAN NECESSITIES
    • A63SPORTS; GAMES; AMUSEMENTS
    • A63BAPPARATUS FOR PHYSICAL TRAINING, GYMNASTICS, SWIMMING, CLIMBING, OR FENCING; BALL GAMES; TRAINING EQUIPMENT
    • A63B2220/00Measuring of physical parameters relating to sporting activity
    • A63B2220/80Special sensors, transducers or devices therefor
    • A63B2220/803Motion sensors
    • AHUMAN NECESSITIES
    • A63SPORTS; GAMES; AMUSEMENTS
    • A63BAPPARATUS FOR PHYSICAL TRAINING, GYMNASTICS, SWIMMING, CLIMBING, OR FENCING; BALL GAMES; TRAINING EQUIPMENT
    • A63B2230/00Measuring physiological parameters of the user
    • A63B2230/01User's weight
    • A63B2230/015User's weight used as a control parameter for the apparatus
    • AHUMAN NECESSITIES
    • A63SPORTS; GAMES; AMUSEMENTS
    • A63BAPPARATUS FOR PHYSICAL TRAINING, GYMNASTICS, SWIMMING, CLIMBING, OR FENCING; BALL GAMES; TRAINING EQUIPMENT
    • A63B2230/00Measuring physiological parameters of the user
    • A63B2230/62Measuring physiological parameters of the user posture
    • A63B2230/625Measuring physiological parameters of the user posture used as a control parameter for the apparatus

Landscapes

  • Health & Medical Sciences (AREA)
  • Life Sciences & Earth Sciences (AREA)
  • Engineering & Computer Science (AREA)
  • General Health & Medical Sciences (AREA)
  • Physics & Mathematics (AREA)
  • Physical Education & Sports Medicine (AREA)
  • Animal Behavior & Ethology (AREA)
  • Dentistry (AREA)
  • Oral & Maxillofacial Surgery (AREA)
  • Human Computer Interaction (AREA)
  • Veterinary Medicine (AREA)
  • Biophysics (AREA)
  • Pathology (AREA)
  • Biomedical Technology (AREA)
  • Heart & Thoracic Surgery (AREA)
  • Medical Informatics (AREA)
  • Molecular Biology (AREA)
  • Surgery (AREA)
  • Physiology (AREA)
  • Public Health (AREA)
  • Multimedia (AREA)
  • Geometry (AREA)
  • Measurement Of The Respiration, Hearing Ability, Form, And Blood Characteristics Of Living Organisms (AREA)

Abstract

The embodiment of the application relates to an injury risk assessment method, which is used for a first terminal device and comprises the following steps: acquiring personal data of a user, wherein the personal data comprises the height of the user; in response to the first operation, acquiring motion data through a sensor of the first terminal device, the motion data being related to movement of the first terminal device; determining vertical amplitude and a landing angle according to personal data and motion data, wherein the vertical amplitude is related to the change of the height of the gravity center of the user in motion, and the landing angle is related to the landing condition of the user in motion; determining the damage risk brought to the user by the movement according to the vertical amplitude and the landing included angle; in the event that the risk of injury is greater than or equal to the threshold, the user is alerted to the possible injury while in motion. The embodiment of the application can more accurately find the abnormal movement posture, reduce the injury risk brought to the limbs of the user in the movement process and enable the user to move more scientifically. The application also relates to a medium, a chip, a terminal device and a system.

Description

Damage risk assessment method, medium, chip, terminal device and system
Technical Field
One or more embodiments of the present application generally relate to the field of artificial intelligence, and in particular, to a method, medium, chip, terminal device, and system for damage risk assessment.
Background
Running is a simplest exercise mode and is deeply favored by people. Indoor treadmill scenarios, outdoor free running scenarios, and marathon competitions are common.
If a person who keeps running for a long time does not have a correct running posture, the knee pain is easy to occur. If the wrong running posture is not corrected in a timely manner, damage to the knee can build up and eventually run off because of intolerance of the pain associated with the knee. Therefore, the impact force to the knee part in the running process can be reduced by correcting the wrong running posture in time, and the risk of knee injury is reduced.
Fig. 1 shows an interface schematic diagram of a running posture evaluating device (e.g., a handheld terminal, etc.) in the prior art, as shown in the figure, the running posture evaluating device generally uses wearable devices of waist, wrist and foot to acquire various indexes of a runner during running, such as average touchdown time, average touchdown impact, average eversion angle, average swing angle and touchdown mode, and then analyzes each index of the runner based on the range of the various indexes statistically obtained from running data of a plurality of excellent runners. For example, as shown in fig. 1, the average touchdown time of a runner is 366 ms, which is at a longer level relative to the statistical range of excellent runners, while the average touchdown impact, average eversion angle, and average swing angle are at a normal level relative to the statistical range of excellent runners.
However, the physical conditions of each person are different, and the evaluation of the running posture according to the index range obtained by the big data statistics of excellent runners is not accurate enough, and may not be effective.
Disclosure of Invention
The present application is described below in terms of several aspects, embodiments and advantages of which are mutually referenced.
A first aspect of the application relates to a method for injury risk assessment of a limb of a user, the method comprising: acquiring the weight of a user and exercise data related to the exercise of the user; determining, from the movement data, a vertical amplitude associated with a change in the height of the centre of gravity of the limb during movement and/or a footprint angle formed by at least a portion of the limb and the ground in the event that at least a portion of the limb lands during movement; assessing the risk of injury to the limb from exercise based on at least one of body weight, vertical amplitude, and angle to ground; and in the event that the risk of injury is greater than or equal to the threshold, alerting the user that the limb may be injured by the movement. Wherein the motion data comprises at least one of acceleration, angular velocity and direction of motion.
According to the embodiment of the application, the abnormal movement posture can be more accurately found on the basis of not increasing the cost by carrying out the evaluation of the limb injury risk based on at least one of the weight, the vertical amplitude and the landing angle of the user.
Further, when the injury risk is larger than or equal to the threshold value, a prompt is sent to the user, so that the user can adjust the wrong exercise posture in time, the injury risk brought to the limbs of the user in the exercise process is reduced, and the user can exercise more scientifically.
In some embodiments, determining, from the motion data, a vertical amplitude associated with a change in the height of the center of gravity of the limb in motion comprises: obtaining a center of gravity of a limb of the user based at least in part on the height of the limb of the user and the gender of the user; obtaining a change in elevation of the center of gravity during the movement based at least in part on the center of gravity, wherein the elevation of the center of gravity comprises a vertical distance of the center of gravity from the ground; and determining a vertical amplitude based at least in part on the change in the height of the center of gravity, wherein the vertical amplitude comprises a difference between a maximum value and a minimum value of the height of the center of gravity.
In some embodiments, determining, from the motion data, a footprint angle that the at least a portion of the limb makes with the ground while the at least a portion of the limb is grounded in the motion comprises: determining the included angle of contact based on the height of the center of gravity and the location of at least a portion of the limb during movement by:
Figure BDA0002410549520000021
wherein, alpha represents a landing angle; h isGRepresenting the height of the center of gravity; l represents the landing distance of the user's limb, including the horizontal distance of the landing point from the center of gravity.
In some embodiments, determining, from the motion data, a footprint angle that the at least a portion of the limb makes with the ground while the at least a portion of the limb is grounded in the motion comprises: determining a footprint angle based at least in part on the height of the center of gravity and the footprint of at least a portion of the limb during the movement by:
Figure BDA0002410549520000022
wherein, alpha represents a landing angle; h isGRepresenting the height of the center of gravity; l represents the landing distance of the user's limb, including the horizontal distance of the landing point from the center of gravity.
In some embodiments, the landing distance is a positive value if the landing point is located forward of the center of gravity, and the landing distance is zero if the landing point is located rearward of the center of gravity.
In some embodiments, assessing the risk of injury to the limb from the movement based on at least one of body weight, vertical amplitude, and angle to footprint, comprises: determining an injury risk factor indicative of a degree of impact of at least one of the body weight, the vertical amplitude, and the strike angle on the injury risk based on the at least one of the body weight, the vertical amplitude, and the strike angle.
In some embodiments, the injury risk factor is greater where the user's weight is heavier, or where the vertical amplitude is greater, or where the included angle of touchdown is smaller.
In some embodiments, determining an injury risk factor indicative of a degree of impact of at least one of body weight, vertical amplitude, and strike angle on injury risk based on at least one of body weight, vertical amplitude, and strike angle comprises: determining a damage risk factor based on the body weight, the vertical amplitude and the landing angle by the following formula:
Figure BDA0002410549520000023
wherein IR represents a damage risk factor; m represents the weight of the user; h represents a vertical amplitude; g represents the gravitational acceleration of the limb; and alpha represents a ground contact angle.
In some embodiments, assessing the risk of injury to the limb from the exercise based on at least one of body weight, vertical amplitude, and angle to footprint, further comprises: selecting an injury probability corresponding to the injury risk factor from a plurality of injury probabilities associated with a plurality of known samples; wherein the plurality of damage probabilities are obtained based on statistics of damage risk factors and damage outcomes for each of the plurality of known samples.
In some embodiments, alerting the user that the motion may damage the limb in the event the risk of injury is greater than or equal to a threshold value, comprises: and in the case that the damage probability is determined to be larger than or equal to the threshold value, reminding the user of possible damage to the limb through at least one of a language mode, a vibration mode and a display mode.
In some embodiments, in the event the risk of injury is greater than or equal to the threshold, alerting the user that the limb may be injured with respect to the movement, further comprising: in the event that it is determined that the impairment probability is greater than or equal to the threshold, the user is advised to adjust the posture of the motion.
According to some embodiments of the application, the user is advised to adjust the movement posture, so that the user can adjust the wrong movement posture in time, the injury risk of the limbs of the user caused by the movement process is reduced, and the user can move scientifically.
In some embodiments, obtaining motion data related to the motion of the user comprises: motion data relating to the motion of the user is acquired in response to receiving the operation of the user.
A second aspect of the present application provides a terminal device for performing injury risk assessment on a limb of a user, the terminal device comprising: an input/output unit for acquiring a weight of a user; a sensor for acquiring motion data relating to a motion of a user; a risk factor determination module for determining, based on the movement data, a vertical amplitude associated with a change in the height of the center of gravity of the limb during the movement, and/or a landing angle formed by at least a portion of the limb and the ground when the at least a portion of the limb lands in the movement; the injury risk assessment module is used for assessing injury risks brought to limbs by movement based on at least one of the weight, the vertical amplitude and the landing included angle; the input/output unit is further used for reminding the user that the limb is possibly damaged by the movement if the damage risk is larger than or equal to the threshold value. Wherein the motion data comprises at least one of acceleration, angular velocity and direction of motion.
According to the embodiment of the application, the abnormal movement posture can be more accurately found on the basis of not increasing the cost by carrying out the evaluation of the limb injury risk based on at least one of the weight, the vertical amplitude and the landing angle of the user.
Further, when the injury risk is larger than or equal to the threshold value, a prompt is sent to the user, so that the user can adjust the wrong exercise posture in time, the injury risk brought to the limbs of the user in the exercise process is reduced, and the user can exercise more scientifically.
In some embodiments, the risk factor determination module is configured to determine a vertical amplitude associated with a change in a height of a center of gravity of the limb in motion based on the motion data, including: obtaining a center of gravity of a limb of the user based at least in part on the height of the limb of the user and the gender of the user; obtaining a change in elevation of the center of gravity during the movement based at least in part on the center of gravity, wherein the elevation of the center of gravity comprises a vertical distance of the center of gravity from the ground; and determining a vertical amplitude based at least in part on the change in the height of the center of gravity, wherein the vertical amplitude comprises a difference between a maximum value and a minimum value of the height of the center of gravity.
In some embodiments, the risk factor determining module is configured to determine, from the motion data, a landing angle that the at least one portion of the limb makes with the ground while the at least one portion of the limb lands in the motion, and includes: determining a footprint angle based at least in part on the height of the center of gravity and the footprint of at least a portion of the limb during the movement by:
Figure BDA0002410549520000031
wherein, alpha represents a landing angle; h isGRepresenting the height of the center of gravity; l represents the landing distance of the user's limb, including the horizontal distance of the landing point from the center of gravity.
In some embodiments, the risk factor determining module is configured to determine, from the motion data, a landing angle that the at least one portion of the limb makes with the ground while the at least one portion of the limb lands in the motion, and includes: determining a footprint angle based at least in part on the height of the center of gravity and the footprint of at least a portion of the limb during the movement by:
Figure BDA0002410549520000041
wherein, alpha represents a landing angle; h isGRepresenting the height of the center of gravity; l represents the landing distance of the user's limb, including the horizontal distance of the landing point from the center of gravity.
In some embodiments, the landing distance is a positive value if the landing point is located forward of the center of gravity, and the landing distance is zero if the landing point is located rearward of the center of gravity.
In some embodiments, the injury risk assessment module is configured to assess risk of injury to the limb from the movement based on at least one of weight, vertical amplitude, and angle to ground, including being configured to: based on at least one of the body weight, the vertical amplitude, and the footprint angle, a damage risk factor is determined that indicates a degree of influence of the at least one of the body weight, the vertical amplitude, and the footprint angle on the risk of damage.
In some embodiments, the injury risk factor is greater where the user's weight is heavier, or where the vertical amplitude is greater, or where the included angle of touchdown is smaller.
In some embodiments, the injury risk assessment module is to determine an injury risk factor indicative of a degree of impact of at least one of weight, vertical amplitude, and strike angle on the injury risk based on at least one of weight, vertical amplitude, and strike angle, including to: determining a damage risk factor based on the body weight, the vertical amplitude and the landing angle by the following formula:
Figure BDA0002410549520000042
wherein IR represents a damage risk factor; m represents the weight of the user; h represents a vertical amplitude; g represents the gravitational acceleration of the limb; and alpha represents a ground contact angle.
In some embodiments, the injury risk assessment module is configured to assess risk of injury to the limb from the exercise based on at least one of body weight, vertical amplitude, and angle to ground, further comprising: selecting an injury probability corresponding to the injury risk factor from a plurality of injury probabilities associated with a plurality of known samples; wherein the plurality of damage probabilities are obtained based on statistics of damage risk factors and damage outcomes for each of the plurality of known samples.
In some embodiments, the input/output unit is configured to alert the user that the limb may be injured by the movement if the injury risk is greater than or equal to a threshold, and includes: and under the condition that the damage risk evaluation module determines that the damage probability is greater than or equal to the threshold value, reminding the user of possible damage to limbs through at least one of a language mode, a vibration mode and a display mode.
In some embodiments, the input/output unit is configured to alert the user that the limb may be injured by the movement if the injury risk is greater than or equal to a threshold value, and further includes: in the event that the injury risk assessment module determines that the injury probability is greater than or equal to a threshold, the user is advised to adjust the posture of the motion.
According to the embodiment of the application, the abnormal movement posture can be more accurately found on the basis of not increasing the cost by carrying out the evaluation of the limb injury risk based on at least one of the weight, the vertical amplitude and the landing angle of the user.
In some embodiments, the sensor is for acquiring motion data related to the motion of the user, including: motion data relating to a motion of a user is acquired in response to an input/output unit receiving an operation of the user.
A third aspect of the present application provides an injury risk assessment method, which may be applied to a first terminal device and may include: acquiring personal data of a user, wherein the personal data comprises the height of the user; in response to the first operation, acquiring motion data through a sensor of the first terminal device, the motion data being related to movement of the first terminal device; determining vertical amplitude and a landing angle according to personal data and motion data, wherein the vertical amplitude is related to the change of the height of the gravity center of the user in motion, and the landing angle is related to the landing condition of the user in motion; determining the damage risk brought to the user by the movement according to the vertical amplitude and the landing included angle; and alerting the user to possible injuries while in motion if the risk of injury is greater than or equal to the threshold.
According to the embodiment of the application, the damage risk is evaluated based on the vertical amplitude and the landing angle, so that the abnormal movement posture of the user can be more accurately found on the basis of not increasing the cost.
Further, when the injury risk is larger than or equal to the threshold value, a prompt is sent to the user, so that the user can adjust the wrong exercise posture in time, the injury risk brought to the limbs of the user in the exercise process is reduced, and the user can exercise more scientifically.
In some embodiments, the profile further includes a gender of the user, and determining the vertical amplitude from the profile and the motion data includes: determining the gravity center of the user according to the height of the user and the gender of the user; determining the change of the height of the gravity center of the user in motion according to the gravity center, wherein the height of the gravity center comprises the vertical distance of the gravity center from the ground; and determining a vertical amplitude based on the change in the height of the center of gravity, wherein the vertical amplitude comprises a difference between a maximum value and a minimum value of the height of the center of gravity.
In some embodiments, determining the toe angle based on the profile and the athletic data includes: determining a landing angle according to the height of the center of gravity and the landing position of the user in motion by the following formula:
Figure BDA0002410549520000051
wherein, alpha represents a landing angle; h isGRepresenting the height of the center of gravity; l represents a landing distance of the user, including a horizontal distance of the landing point from the center of gravity.
In some embodiments, determining the toe angle based on the profile and the athletic data includes: determining a landing angle according to the height of the center of gravity and the landing position of the user in motion by the following formula:
Figure BDA0002410549520000052
wherein, alpha represents a landing angle; h isGRepresenting the height of the center of gravity; l represents a landing distance of the user, including a horizontal distance of the landing point from the center of gravity.
In some embodiments, the profile further includes a weight of the user, and determining a risk of injury to the user from the movement based on the vertical amplitude and the angle to land includes: and determining an injury risk factor indicating the influence degree of the body weight, the vertical amplitude and the landing angle on the injury risk according to the body weight, the vertical amplitude and the landing angle.
In some embodiments, the injury risk factor is greater where the user's weight is heavier, or where the vertical amplitude is greater, or where the included angle of touchdown is smaller.
In some embodiments, determining an injury risk factor indicative of a degree of impact of the weight, the vertical amplitude, and the strike angle on the injury risk based on the weight, the vertical amplitude, and the strike angle comprises: determining a damage risk factor according to the body weight, the vertical amplitude and the landing angle through the following formula:
Figure BDA0002410549520000061
wherein IR represents a damage risk factor; m represents the weight of the user; h represents a vertical amplitude; g represents the gravitational acceleration of the limb; and alpha represents a ground contact angle.
In some embodiments, determining the risk of injury to the limb from the movement based on the vertical amplitude and the included angle of touchdown further comprises: selecting an injury probability corresponding to the injury risk factor from a plurality of injury probabilities associated with a plurality of known samples; wherein the plurality of damage probabilities are obtained based on statistics of damage risk factors and damage outcomes for each of the plurality of known samples.
In some embodiments, alerting the user to possible injuries while in motion if the injury risk is greater than or equal to a threshold value comprises: and in the case that the damage probability is determined to be larger than or equal to the threshold value, reminding the user of possible damage in motion through at least one of a language mode, a vibration mode and a display mode.
In some embodiments, in the event that the risk of injury is greater than or equal to a threshold, alerting the user to possible injury while in motion, further comprising: in the event that it is determined that the impairment probability is greater than or equal to the threshold, the user is advised to adjust the posture of the motion.
According to some embodiments of the application, the user is advised to adjust the movement posture, so that the user can adjust the wrong movement posture in time, the injury risk of the limbs of the user caused by the movement process is reduced, and the user can move scientifically.
In some embodiments, in response to the first operation, comprising: and receiving first information from the second terminal equipment, wherein the first information is used for instructing the first terminal equipment to acquire the motion data.
In some embodiments, alerting the user to possible injuries while in motion if the injury risk is greater than or equal to a threshold value comprises: and sending second information to the second terminal equipment, wherein the second information is used for enabling the second terminal equipment to remind the user of possible damage in the movement.
In some embodiments, the motion data includes at least one of acceleration, angular velocity, and direction of motion.
A fourth aspect of the present application provides a system, comprising a first terminal device and a second terminal device, the second terminal device being configured to: acquiring personal data of a user and sending the personal data to first terminal equipment; receiving a first operation from a user, and responding to the first operation, sending first information to the first terminal equipment, wherein the first information is used for instructing the first terminal equipment to acquire motion data, and the motion data is related to the movement of the first terminal equipment; the first terminal device is configured to: acquiring motion data through a sensor of a first terminal device in response to receiving first information from a second terminal device; determining vertical amplitude and a landing angle according to personal data and motion data, wherein the vertical amplitude is related to the change of the height of the gravity center of the user in motion, and the landing angle is related to the landing condition of the user in motion; determining the damage risk brought to the user by the movement according to the vertical amplitude and the landing included angle; and sending second information to the second terminal equipment under the condition that the damage risk is larger than or equal to the threshold value, wherein the second information is used for enabling the second terminal equipment to remind the user of possible damage in motion.
A fifth aspect of the application provides a machine-readable medium having stored thereon instructions which, when executed on a machine, cause the machine to perform any of the above methods.
A sixth aspect of the present application provides a terminal device, including: a processor; a memory having instructions stored thereon that, when executed by the processor, cause the terminal device to perform any of the above methods.
A seventh aspect of the application provides a chip on which a computer program is stored, which when executed by a processor performs any of the above methods.
Drawings
FIG. 1 is a schematic view of an interface of a running posture assessment apparatus of the prior art;
FIG. 2 is a schematic illustration of a user in motion according to an embodiment of the present application;
fig. 3 is a flowchart illustrating a method for evaluating a motion gesture of a user by the wearable device 201 and the terminal device 202 in fig. 2 according to an embodiment of the present application;
FIG. 4 is a schematic diagram of a user interface 400 of the terminal device 202 of FIG. 2 according to an embodiment of the present application;
fig. 5 is a schematic diagram of the terminal device 202 in fig. 2 establishing a bluetooth communication connection with the wearable device 201 according to an embodiment of the present application;
FIG. 6 is a schematic diagram of a user interface 600 of the terminal device 202 of FIG. 2 according to an embodiment of the present application;
FIG. 7 is a schematic diagram of a user interface 700 of the terminal device 202 of FIG. 2 according to an embodiment of the present application;
FIG. 8 is a schematic illustration of vertical amplitude according to an embodiment of the present application;
FIG. 9 is a schematic view of a contact angle according to an embodiment of the present application;
FIG. 10A is a schematic diagram of a user interface 1000A of the terminal device 202 of FIG. 2 according to an embodiment of the present application;
FIG. 10B is a schematic diagram of a user interface 1000B of the terminal device 202 of FIG. 2 according to an embodiment of the present application;
fig. 11 is a flowchart illustrating a method for the wearable device 201 in fig. 2 to evaluate the risk of injury to a limb of a user during movement according to an embodiment of the present application;
fig. 12 is a flowchart illustrating an intention of a method of the wearable device 201 in fig. 2 evaluating a motion posture of a user according to an embodiment of the present application;
fig. 13 is a schematic structural diagram of a terminal device according to an embodiment of the present application;
fig. 14 is another schematic structural diagram of a terminal device according to an embodiment of the present application.
Detailed Description
In the embodiments of the present application, words such as "exemplary" or "for example" are used to mean serving as an example, instance, or illustration. Any embodiment or design described herein as "exemplary" or "e.g.," is not necessarily to be construed as preferred or advantageous over other embodiments or designs. Rather, use of the word "exemplary" or "such as" is intended to present concepts related in a concrete fashion.
In the prior art, the technical scheme of running posture evaluation based on big data statistics of excellent runners is not accurate enough; although wearable equipment such as intelligent shoe-pad, intelligent running shoes can utilize the data that pressure sensor gathered, more comprehensive, more accurate running appearance evaluation, it is purpose-built for professional sportsman usually, and is too high to the common user cost, and intelligent shoe-pad, intelligent running shoes function singleness moreover need be changed shoe-pad before running, the experience that running shoes brought for the user is also not good enough.
In the embodiment of the application, the damage risk of the limbs of the user in the exercise process can be evaluated by utilizing the sensor capable of acquiring the exercise data of the user and the personal data of the user, and the damage risk is used as the evaluation of the exercise posture without professional intelligent insoles, intelligent running shoes and the like, so that the accuracy of the evaluation of the exercise posture can be improved on the basis of not improving the evaluation cost of the exercise posture.
FIG. 2 shows a schematic view of a user in motion, and the motion being performed by the user may include, but is not limited to, a fast walk, a jog, or other types of motion. As shown in fig. 2, in an embodiment, a user may wear a wearable device 201 (or referred to as a wearable terminal device 201) at a wrist part and/or an ankle part during a movement, and a sensor (for example, but not limited to, an acceleration sensor, a gyroscope, a magnetometer, and the like) in the wearable device 201 may acquire movement data of the user during the movement, such as, but not limited to, acceleration, angular velocity, a movement direction, and the like, in real time. Examples of wearable device 201 may include, but are not limited to, a smart wristwatch, a smart bracelet, and the like.
As shown in fig. 2, in another embodiment, the user may wear the terminal device 202 at an arm part during exercise, and a sensor (for example, but not limited to, an acceleration sensor, a gyroscope, a magnetometer, and the like) in the terminal device 202 may acquire motion data of the user during exercise, such as, but not limited to, acceleration, angular velocity, direction of motion, and the like, in real time. Examples of terminal device 202 may include, but are not limited to, a smart wristwatch, a smart bracelet, a smart phone, a portable or mobile device, a personal digital assistant, a cellular phone, a handheld PC, and so forth.
It should be noted that the wearing positions of the wearable device 201 and the terminal device 202 are not limited to those shown in fig. 2, for example, the user may also hold the wearable device 201 and/or the terminal device 202 during the exercise. In addition, in the embodiment of the present application, the wearable device 201 and the terminal device 202 may be collectively referred to as a terminal device.
The method of evaluating the movement posture of a user is described below with reference to fig. 3-12 by way of several embodiments.
Method example 1
In this embodiment, the wearable device 201 in fig. 2 is worn on a limb of a user, and the terminal device 202 may control the wearable device 201 to obtain motion data of the user during the motion process in real time, and evaluate the injury risk of the limb of the user during the motion process according to the profile and the motion data of the user.
Fig. 3 shows a flow diagram of a method for evaluating a motion posture of a user by the wearable device 201 and the terminal device 202 in fig. 2, and it should be noted that although in the embodiment of the present application, the steps of the method are presented in a specific order, the order of the steps may be changed in different embodiments, and in some embodiments, one or more steps shown in the order in the present specification may be performed simultaneously. As shown in fig. 3, the method includes:
s301, the terminal device 202 receives the personal data of the user.
Fig. 4 shows a user interface 400 of the terminal device 202, through which the user can input the profile, for example, as shown in fig. 4, the user can input gender "male" through a menu item 401, "6/1/1989/402/185 cm in birth date", 403/83 kg in weight, through a menu item 404. Note that the personal profile received by the terminal device 202 is not limited to that shown in fig. 4.
S302, Bluetooth communication connection is established between the terminal device 202 and the wearable device 201;
fig. 5 shows a schematic diagram of the terminal device 202 establishing a bluetooth communication connection with the wearable device 201, as shown in fig. 5, the wearable device 201 is worn around the wrist of the user, the terminal device 202 includes a user interface 500, and the user can select a device name of the wearable device 201, for example, "BAND 1", from an "available devices" menu item 501 in the user interface 500, so that the terminal device 202 establishes a bluetooth communication connection with the wearable device 201.
The Communication between the terminal device 202 and the wearable device 201 is not limited to bluetooth Communication, and may be Near Field Communication (NFC), Wireless Fidelity (WiFi), or the like.
S303, the terminal device 202 transmits the profile of the user to the wearable device 201 based on the bluetooth communication.
Wherein, the personal data of the user at least comprises the height, the weight and the sex of the user. S304, the terminal device 202 receives an instruction from the user to turn on the motion posture evaluation function.
Fig. 6 and 7 show a user interface 600 and a user interface 700 of the terminal device 202, respectively, and the user may click a "start motion" key 601 in the user interface 600 when preparing to start a motion, and in response to the click operation by the user, the terminal device 202 may then display the user interface 700 for inquiring of the user whether to turn on the motion gesture evaluation function, and in response to the user clicking a "yes" key 701 in the user interface 700, the terminal device 202 receives an instruction to turn on the motion gesture evaluation function.
Note that, in the case where the wearable device 201 automatically turns on the motion posture evaluation function, the terminal device 202 may receive an instruction to start the motion from the user only through the user interface 500, and need not receive an instruction to turn on the motion posture evaluation function through the user interface 700.
S305, the terminal device 202 sends, to the wearable device 201, an instruction to start acquiring the motion data of the user and an instruction to evaluate the risk of injury of the limb of the user during the motion process based on the bluetooth communication.
Note that the profile of the user may be transmitted together with the instruction in S305, instead of being transmitted in S303. In addition, in a case where the wearable device 201 automatically turns on the motion posture evaluation function, the terminal device 202 may transmit an instruction to start acquiring the motion data of the user to the wearable device 201 without transmitting an instruction to evaluate a risk of injury of the limb of the user during the motion.
S306, the wearable device 201 obtains the motion data of the user in real time through the sensor during the motion process of the user. S307, the wearable device 201 evaluates the injury risk of the limb of the user during the exercise process based on the personal data and the exercise data of the user.
In one example, wearable device 201 may utilize the injury probability of a user's limb to characterize the risk of injury to the user's limb (e.g., without limitation, the knee) during motion.
To determine the damage probability of the user's limb, the wearable device 201 may first determine the vertical amplitude and the angle of touchdown of the user during the movement based on the user's profile and the movement data. Fig. 8 shows a schematic diagram of the vertical amplitude, which, as shown in fig. 8, refers to the difference between the maximum and minimum of the height of the centre of gravity of the user's limb during movement, i.e. the vertical distance of the centre of gravity from the ground. The landing angle refers to the angle that at least a portion of the user's limb makes with the ground in the case that at least a portion of the user's limb lands on the ground during exercise, and fig. 9 shows a schematic view of the landing angle, wherein the landing angle is shown as the angle α that the user's leg makes with the ground in the case that the user's foot lands on the ground. In the embodiment of the application, the weight, the vertical amplitude and the included angle of the ground in the personal data of the user are collectively referred to as risk factors causing the exercise to bring injury risk to the limbs of the user.
The wearable device 201 may also determine an injury risk factor indicating a degree of influence of the risk factor on the injury risk according to the risk factor, and select an injury probability corresponding to the injury risk factor from a plurality of injury probabilities associated with a plurality of known samples, wherein the plurality of injury probabilities are obtained based on statistics of the injury risk factor and the injury result for each of the plurality of known samples.
S308, the wearable device 201 determines whether the damage probability of the limb of the user is greater than or equal to a threshold, if so, continues to execute S309, and if not, returns to execute S306;
s309, the wearable device 201 generates a reminding message for reminding the user to adjust the exercise posture.
In one example, the reminding message may include two parts, namely, a reason for causing a high probability of injury to the user's limb and an adjustment method of the movement posture. For example, wearable device 201 may generate the following alert message: "detect you are touchdown the contained angle at present too big, have great limbs and injure the risk, please reduce the stride in time, in order to avoid causing limbs and injure").
In one example, the alert message may be a voice message or a text message.
In another example, the wearable device 201 may also send the reason causing the high probability of injury to the limb of the user and the adjustment method of the motion posture to the terminal device 202, and the terminal device 202 generates the alert message.
S310, the wearable device 201 sends the reminding message to the terminal device 202 based on bluetooth communication.
And S311, the terminal device 202 reminds the user to adjust the movement posture according to the reminding message.
In one example, terminal device 202 may prompt the user to adjust the motion gesture by displaying a text-form reminder message on the display screen. For example, fig. 10A shows a user interface 1000A of the terminal device 202, where the user interface 1000A includes a prompt box 1001A for displaying a reminder message.
In another example, the terminal device 202 may play a voice-form reminding message through a speaker to remind the user to adjust the motion posture.
In another example, the terminal device 202 may also vibrate the body while playing and/or displaying the reminder message.
It should be noted that, in this embodiment, the terminal device 202 may also be used to evaluate the risk of injury of the limb of the user during the movement. In this case, the terminal device 202 may not perform S303, and at S305, send an instruction to start acquiring motion data, instead of an instruction to turn on the motion posture evaluation function, to the wearable device 201 based on the bluetooth communication; the wearable device 201 may not perform S307-S309, and at S310, send the acquired motion data, instead of the reminder message, to the terminal device 202; the terminal device 202 may further determine the damage probability of the user's limb based on the user's profile and the motion data, and generate a prompt message if it is determined that the damage probability of the user's limb is greater than or equal to a threshold value.
It should be noted that, in this embodiment, the risk of injury to the limb of the user during the exercise may also be evaluated by a cloud server not shown in fig. 2. In this case, the terminal device 202 may not perform S303, and at S305, send an instruction to start acquiring motion data, instead of an instruction to turn on the motion posture evaluation function, to the wearable device 201 based on the bluetooth communication; the wearable device 201 may not perform S307-S309, and at S310, send the acquired motion data, instead of the reminder message, to the terminal device 202; the terminal device 202 may send the personal data and the motion data of the user to the cloud server, the cloud server determines the injury probability of the limb of the user, and generates a reminding message to send to the terminal device 202 when it is determined that the injury probability of the limb of the user is greater than or equal to the threshold value.
It should be noted that, in the present embodiment, in order to save power consumption of the wearable device 201, between steps S306 and S307, the following steps may be added:
the wearable device 201 determines whether the acquired motion data changes, and if not, stops acquiring the motion data, and sends an indication that no motion is detected to the terminal device 102 based on bluetooth communication, and if so, executes step S307;
the terminal device 202, in response to receiving an indication from the wearable device 201 that no motion is detected, may display a user interface 1000B as shown in fig. 10B for asking the user whether the user still needs to continue the motion, and in response to the user clicking on the "yes" button 1001B, send to the wearable device 201 an indication to start acquiring motion data of the user and, optionally, an indication to evaluate risk of injury to the user's limb during the motion, and in response to the user clicking on the "no" button 1002B, not send to the wearable device 201 an indication to start acquiring motion data of the user and an indication to evaluate risk of injury to the user's limb during the motion.
In response to receiving the indication from the terminal device 202 to start acquiring the motion data of the user and (optionally) the indication that the risk of injury to the limb of the user during the motion needs to be assessed, the wearable device 201 will continue to acquire the motion data and execute step S307.
S307 in fig. 3 is further explained below with reference to fig. 11.
Fig. 11 shows a flowchart of a method for the wearable device 201 in fig. 2 to assess the risk of injury to a limb of a user during movement, and it should be noted that although in the embodiments of the present application, the individual steps of the method are presented in a specific order, the order of the steps may be changed in different embodiments, and in some embodiments, one or more of the steps shown in order in the present specification may be performed simultaneously. As shown in fig. 11, the method includes:
s1101, the wearable device 201 determines the weight, height, and gender of the user according to the profile of the user, wherein the weight of the user may be used as a risk factor for causing the exercise to bring a risk of injury to the limbs of the user.
S1102, the wearable device 201 determines a vertical amplitude as a risk factor according to the profile and the motion data of the user.
In one example, the wearable device 201 may determine the center of gravity of the user and a ratio of the center of gravity to the height corresponding to the height of the user based on the height and gender of the user and big data statistics, and determine the height of the center of gravity of the user in an upright state, i.e., the vertical distance of the center of gravity from the ground, according to the ratio. The height of the center of gravity of the user in the upright state may be determined by other means, for example, by an existing gravity center measuring instrument, or may be determined by capturing an image of the user by an imaging device and using an image analysis method.
Further, the change of the height of the center of gravity of the user in the movement process can be determined according to the center of gravity of the user and the movement data of the user, and the difference value between the maximum value and the minimum value of the height of the center of gravity is determined, so that the vertical amplitude of the limb of the user is obtained. It should be noted that, the manner of determining the vertical amplitude based on the motion data acquired by the sensor of the wearable device 201 is not limited herein, and the vertical amplitude may be acquired based on any one of the existing technologies, for example, but not limited to, the one-dimensional neural network may be pre-trained by using the motion data of the user and/or the motion data of other users as training data and using the vertical amplitude as a training label, and the vertical amplitude corresponding to the motion data of the user may be obtained through the trained neural network.
S1103, the wearable device 201 determines a landing angle as a risk factor according to the motion data of the user.
In one example, wearable device 201 may determine a touchdown point (e.g., of a foot) of at least a portion of a user's limb and a touchdown distance from the user's motion data, where the touchdown distance refers to a horizontal distance between the touchdown point and the center of gravity. It should be noted that, the manner of determining the landing point and the landing distance based on the motion data acquired by the sensor of the wearable device 201 is not limited herein, and the landing point and the landing distance may be acquired based on any one of the existing technologies, for example, but not limited to, a neural network (e.g., a multi-layer sensor) may be pre-trained by using the motion data of the user and/or the motion data of other users as training data and using the landing distance as a training tag, and a landing distance corresponding to the motion data of the user is obtained through the trained neural network.
Further, it is possible to determine the positional relationship between the landing point and the center of gravity, and set the landing distance to a positive value in the case where it is determined that the landing point is located forward of the center of gravity, and set the landing distance to a zero value in the case where it is determined that the landing point is located rearward of the center of gravity; it should be noted that, in the embodiment of the present application, the case where the landing point is located behind the center of gravity is not considered, that is, the risk of injury of the user's limb in the case where the landing point is located behind the center of gravity is not evaluated.
Further, the landing angle may be determined based on a height of a center of gravity of the user's limb when at least a portion of the user's limb lands on the ground, and the landing distance; as shown in fig. 9, the ground contact angle can be determined according to the following formula:
Figure BDA0002410549520000121
wherein alpha represents a landing included angle and the unit is rad; h isG1Means the height of the center of gravity of the user's limb when at least a part of the user's limb lands on the ground, and the height h of the center of gravity of the user's limb in the standing state can be usedGTo approximately represent hG1(ii) a l represents the landing distance of the user's limb in m.
In another example, the touchdown angle may be determined based on a distance between a center of gravity of the user's limb and a touchdown point when at least a portion of the user's limb is touchdown, and a touchdown distance; as shown in fig. 9, the ground contact angle can be determined according to the following formula:
Figure BDA0002410549520000122
wherein h isG2Means a distance between the center of gravity of the user's limb and the landing point when at least a part of the user's limb lands on the ground, and the unit is m, and the height h of the center of gravity of the user's limb in the standing state can be usedGTo approximately represent hG2(ii) a l represents the landing distance of the user's limb in m.
S1104, the wearable device 201 determines, according to the risk factor, an injury risk factor indicating an extent of influence of the risk factor on the injury risk.
In one example, the injury risk factor may satisfy the following condition: the heavier the weight of the user, the larger the vertical amplitude, the smaller the landing angle and the larger the injury risk factor.
In one example, wearable device 201 may determine the injury risk factor according to the following formula:
Figure BDA0002410549520000123
wherein IR represents a damage risk factor; m represents the weight of the user in Kg; h represents the vertical amplitude of the user limb in the movement process, and the unit is m; g represents the acceleration of gravity in m/s2(ii) a Alpha represents the landing angle of the user limb during movement and is given by rad.
S1105, the wearable device 201 selects an injury probability corresponding to the injury risk factor from a plurality of injury probabilities associated with a plurality of known samples.
In one example, the plurality of injury probabilities may be obtained in advance based on a plurality of known samples obtained from a plurality of users, each of the plurality of known samples including an injury risk factor for each of the known samples and an injury result corresponding to the injury risk factor, and the injury result may include that a limb (e.g., a knee) is injured or that the limb is not injured; for example, the injury risk factor for user a is 1.2, the injury result corresponding to the injury risk factor is undamaged, the injury risk factor for user B is 10.5, and the injury result corresponding to the injury risk factor is injured.
In an example, the injury risk factor of each known sample may be an average injury risk factor obtained by calculating an average of a plurality of injury risk factors of the user corresponding to the known sample during exercise, or may be an injury risk factor of the user corresponding to the known sample during exercise.
In one example, the damage probability corresponding to a certain damage risk factor may be determined based on statistics of a plurality of known samples. For example, among a plurality of known samples, the damage risk factor of 100 samples is 2.7, and among the 100 samples, the damage result of 15 samples is damaged, and then the damage probability corresponding to the damage risk factor is 15/100 ═ 0.15; if the damage risk factor determined by the wearable device 201 at S1104 is 2.7, then at S1105, the wearable device 201 may determine that the damage risk factor corresponds to a damage probability of 0.15.
In another example, the damage risk factors of a plurality of known samples can be clustered through a common clustering algorithm (K-means clustering algorithm, spectral clustering algorithm, hierarchical clustering algorithm, etc.); and for each category, determining the range of the damage risk factors corresponding to the category according to the damage risk factors in the category, and determining the damage probability corresponding to the category according to the damage results corresponding to the damage risk factors in the category. For example, the category C includes 80 injury risk factors, and the corresponding injury risk factors range from 7.8 to 10.4, where 40 injury risk factors correspond to injury results that are already injured, and then the injury probability corresponding to the category C is 0.5; if the damage risk factor determined by the wearable device 201 in S1104 falls within the range of damage risk factors corresponding to category C, then in S1105, the wearable device 201 may determine that the damage probability 25 corresponding to the damage risk factor is 0.5.
It should be noted that, although it is shown in the above embodiments that the injury risk factor is determined by the weight, the vertical amplitude, and the landing angle of the user, so as to evaluate the injury risk of the limb, the injury risk factor may be determined based on any one or more of the weight, the vertical amplitude, and the landing angle of the user, wherein it is required to satisfy that the larger the weight, the larger the vertical amplitude, and/or the smaller the landing angle, the larger the injury risk factor.
It should be noted that the method illustrated in fig. 11 is not limited to be performed by the wearable device 201, and may also be performed by the terminal device 202, the cloud server, or other electronic devices in fig. 2, for example.
Method example 2
In this embodiment, the wearable device 201 in fig. 2 is worn on a limb of a user, acquires motion data of the user during the motion in real time in response to an instruction directly received from the user, and evaluates the risk of injury to the limb of the user during the motion according to the profile and the motion data of the user.
Fig. 12 shows a flow diagram of a method for evaluating a motion posture of a user by the wearable device 201 in fig. 2, and it should be noted that although in the embodiment of the present application, the steps of the method are presented in a specific order, the order of the steps may be changed in different embodiments, and in some embodiments, one or more steps shown in order in the present specification may be performed simultaneously. As shown in fig. 12, the method includes:
s1201, the wearable device 201 receives the profile of the user, and the step may refer to the related description of S301 of fig. 3, which is not described herein again.
S1202, the wearable device 201 receives an indication from the user to turn on the motion posture evaluation function.
In one example, wearable device 201 may have user interface 600 and user interface 700 shown in fig. 6 and 7, the user may click on "start motion" button 601 in user interface 600 when preparing to start motion, in response to the user's clicking operation, wearable device 201 receives an indication to turn on motion that indicates wearable device 201 starts acquiring motion data of the user, and wearable device 201 may then display user interface 700 for asking the user whether to turn on the motion gesture evaluation function, in response to the user clicking on "yes" button 701 in user interface 700, wearable device 201 receives an indication to turn on the motion gesture evaluation function that indicates wearable device 201 needs to evaluate the risk of injury to the limb during motion.
In the case where the wearable device 201 automatically turns on the motion posture evaluation function, the wearable device 201 may receive an instruction to start the motion from the user only through the user interface 600, and may not receive an instruction to turn on the motion posture evaluation function through the user interface 700.
S1203, the wearable device 201 obtains the motion data of the user in real time during the motion process of the user.
S1204, the wearable device 201 evaluates the risk of injury of the limb of the user during the exercise process based on the personal data and the exercise data of the user, which may refer to S307 of fig. 3 and the related description of fig. 11, and will not be described herein again.
S1205, the wearable device 201 determines whether the damage probability of the user' S limb is greater than or equal to a threshold, if so, continues to execute S1206, and if not, returns to execute S1203;
s1206, the wearable device 201 generates a reminding message for reminding the user to adjust the exercise posture, which may refer to the related description of S309 of fig. 3 and is not described herein again.
S1207, the wearable device 201 reminds the user to adjust the exercise posture according to the reminding message, which may refer to the description of S311 of fig. 3 and is not described herein again.
Method example 3
In this embodiment, the terminal device 202 in fig. 2 is worn on the limb of the user, and in response to the indication received from the user, obtains the motion data of the user during the motion in real time, and evaluates the injury risk of the limb of the user during the motion according to the profile and the motion data of the user. The method for evaluating the motion posture of the user by the terminal device 202 in fig. 2 is the same as the method for evaluating the motion posture of the user by the wearable device 201 in fig. 2, and reference may be specifically made to fig. 12 and the related description of fig. 12, and details are not repeated here.
In the embodiment of the application, risk factors such as vertical amplitude, landing included angle and weight are determined based on personal data and motion data of a user, and the risk factors are used for evaluating limb injury risks, so that abnormal motion postures can be found more accurately on the basis of not increasing the cost.
Further, when the injury probability is larger than or equal to the threshold value, the user can adjust wrong movement posture in time by sending out a prompt to the user, so that the injury risk brought to the limbs of the user in the movement process is reduced, and the user can move scientifically.
Fig. 13 shows a schematic structural diagram of a terminal device according to an embodiment of the present application, where the terminal device has a function of performing injury risk assessment on a limb of a user, and the terminal device may be the wearable device 201, the terminal device 202, a cloud server in the above embodiments, or may be a portable media player, a navigation device, a server, a network device, a graphics device, a video game device, a set-top box, a laptop device, a virtual reality and/or augmented reality device, an internet of things device, an industrial control device, an in-vehicle infotainment device, a streaming media client device, an electronic book, a reading device, a POS machine, and other types of electronic devices.
As shown in fig. 13, the terminal device may include a sensor 1310, an input/output unit 1320, a communication unit 1330, and a limb damage evaluation unit 1340, wherein the limb damage evaluation unit 1340 further includes a risk factor determination module 1341, a damage risk factor determination module 1342, and a damage probability determination module 1343. Where one or more components of the terminal device may be implemented in one or more of hardware, software, firmware, for example, may consist of an Application Specific Integrated Circuit (ASIC), an electronic circuit, a processor (shared, dedicated, or group) and/or memory that executes one or more software or firmware programs, a combinational logic circuit, or any combination of other suitable components that provide the described functionality. Note that the structure of the terminal device is not limited to that shown in fig. 13, and for example, in the case where the terminal device is a cloud server, the terminal device may not have the sensor 1310.
Examples of sensors 1310 may include, but are not limited to, acceleration sensors, gyroscopes, magnetometers, and the like, according to some embodiments of the present application. The sensor 1310 may acquire motion data of the user during the motion in real time.
Examples of the input/output unit 1320 may include, but are not limited to, a speaker, a microphone, a display (e.g., a liquid crystal display, a touch screen display, etc.), and the like, according to some embodiments of the present application. The input/output unit 1320 may receive personal data input by the user, and may output a reminder to the user according to the reminder message when the injury probability of the limb of the user is greater than or equal to a threshold, for example, but not limited to, a reminder in at least one of a language mode, a vibration mode, and a display mode.
According to some further embodiments of the application, the input/output unit 1320 may also receive an indication from the user, which may cause the terminal device to turn on the function of performing the injury risk assessment on the limb of the user.
According to some further embodiments of the present application, the user's profile and the indication from the user may also be received from other apparatuses through the communication unit 1330, for example, in case the terminal device is the wearable device 201 of fig. 1, the user's profile and the indication from the user may be received from the terminal device 202 of fig. 1 through the communication unit 1330. The communication unit 1330 may communicate via wireless means (e.g., WiFi communication,
Figure BDA0002410549520000151
Communications, cellular mobile network communications, etc.) and/or wired (e.g., ethernet, serial communications, etc.) to communicate with other devices.
According to some embodiments of the present application, the risk factor determining module 1341 may determine the weight, height, sex, and the like of the user from the personal data of the user obtained by the input/output unit 1320 or the communication unit 1330, and determine the vertical amplitude and the landing angle of the limb of the user during the exercise process in real time based on the exercise data of the user obtained by the sensor 1310 in real time, which may specifically refer to the description of S1102 and S1103 in fig. 10, and will not be described herein again.
According to some embodiments of the present application, the injury risk factor determination module 1342 may determine, in real time, an injury risk factor indicating an influence degree of the risk factor on the injury risk according to the risk factor, which may specifically refer to the description of S1104 in fig. 10 and is not described herein again.
According to some embodiments of the present application, the damage probability determination module 1343 may select a damage probability corresponding to the damage risk factor from a plurality of damage probabilities associated with a plurality of known samples, which may specifically refer to the description of S1105 in fig. 10 and will not be described herein again.
According to some embodiments of the present application, the damage probability determination module 1343 may further determine whether the damage probability is greater than or equal to a threshold, and generate a prompt message when determining that the damage probability is greater than or equal to the threshold, which may specifically refer to the description of S309 in fig. 2 and is not described herein again.
According to some embodiments of the present application, the damage probability determination module 1343 may further send a prompt message to the input/output unit 1320, prompt the user to move through the input/output unit 1320 and possibly damage the limb, or send a prompt message to the communication unit 1330, and the communication unit 1330 sends the prompt message to another device to prompt the user to move through another device and possibly damage the limb.
In some examples, examples of reminders may include, but are not limited to, reminders in a verbal manner, a vibratory manner, and a display manner.
In the above embodiments, the injury risk factor determination module 1342 and the injury probability determination module 1343 may be collectively referred to as an injury risk assessment module.
In the embodiment of the application, risk factors such as vertical amplitude, landing included angle and weight are determined based on personal data and motion data of a user, and the risk factors are used for evaluating limb injury risks, so that abnormal motion postures can be more accurately found on the basis of not increasing the cost.
Further, when the injury probability is larger than or equal to the threshold value, the user can adjust wrong movement posture in time by sending out a prompt to the user, so that the injury risk brought to the limbs of the user in the movement process is reduced, and the user can move scientifically.
Fig. 14 shows another schematic structural diagram of a terminal device according to an embodiment of the present application, the terminal device having a function of injury risk assessment for a limb of a user, and examples of the terminal device may include, but are not limited to, an artificial intelligence terminal, a wearable device (e.g., a smart watch, a smart bracelet, etc.), a portable or mobile device, a mobile phone, a personal digital assistant, a cellular phone, a handheld PC, a portable media player, a handheld device, a navigation device, a server, a network device, a graphics device, a video game device, a set-top box, a laptop device, a virtual reality and/or augmented reality device, an internet of things device, an industrial control device, an in-vehicle infotainment device, a streaming media client device, an electronic book, a reading device, a POS machine, and other devices.
As shown in fig. 14, the terminal device may include a processor 1410, an external memory interface 1420, an internal memory 1421, a Universal Serial Bus (USB) interface 1430, a charging management module 1440, a power management module 1441, a battery 1442, an antenna 1, an antenna 2, a mobile communication module 1450, a wireless communication module 1460, an audio module 1470, a speaker 1470A, a receiver 1470B, a microphone 1470C, an earphone interface 1470D, a sensor module 1480, keys 1490, a motor 1491, an indicator 1492, a camera 1493, a display 1494, and a Subscriber Identification Module (SIM) card interface 1495, and the like. Wherein the sensor module 1480 may include a pressure sensor 1480A, a gyroscope sensor 1480B, an air pressure sensor 1480C, a magnetic sensor 1480D, an acceleration sensor 1480E, a distance sensor 1480F, a proximity light sensor 1480G, a fingerprint sensor 1480H, a temperature sensor 1480J, a touch sensor 1480K, an ambient light sensor 1480L, a bone conduction sensor 1480M, and the like.
In other embodiments of the present application, a terminal device may include more or fewer components than shown, or some components may be combined, some components may be split, or a different arrangement of components may be used. The illustrated components may be implemented in hardware, software, or a combination of software and hardware.
Processor 1410 may include one or more processing units, such as: the processor 1410 may include an Application Processor (AP), a modem processor, a Graphics Processing Unit (GPU), an Image Signal Processor (ISP), a controller, a video codec, a Digital Signal Processor (DSP), a baseband processor, and/or a neural-Network Processing Unit (NPU), etc. The different processing units may be separate devices or may be integrated into one or more processors. The processor 1410 may be configured to perform various operations in the method embodiments described above in connection with fig. 3, 11, 12.
A memory may also be provided in processor 1410 for storing instructions and data. In some embodiments, the memory in the processor 1410 is a cache memory. The memory may hold instructions or data that have just been used or recycled by the processor 1410. If the processor 1410 needs to use the instruction or data again, it can be called directly from the memory. Avoiding repeated accesses reduces the latency of the processor 1410, thereby increasing the efficiency of the system.
In some embodiments, processor 1410 may include one or more interfaces. The interface may include an integrated circuit (I2C) interface, an integrated circuit built-in audio (I2S) interface, a Pulse Code Modulation (PCM) interface, a universal asynchronous receiver/transmitter (UART) interface, a Mobile Industry Processor Interface (MIPI), a general-purpose input/output (GPIO) interface, a Subscriber Identity Module (SIM) interface, and/or a Universal Serial Bus (USB) interface, etc. It should be understood that the interface connection relationship between the modules illustrated in the embodiments of the present application is only an exemplary illustration, and does not form a limitation on the structure of the terminal device. In other embodiments of the present application, the terminal device may also adopt different interface connection manners or a combination of multiple interface connection manners in the foregoing embodiments.
The charging management module 1440 is configured to receive charging input from a charger. The charger may be a wireless charger or a wired charger.
The power management module 1441 is used to connect the battery 1442, the charging management module 1440 and the processor 1410. The power management module 1441 receives input from the battery 1442 and/or the charging management module 1440, and provides power to the processor 1410, the internal memory 1421, the display 1494, the camera 1493, and the wireless communication module 1460. The power management module 1441 may also be used to monitor parameters such as battery capacity, battery cycle number, battery state of health (leakage, impedance), etc. In other embodiments, a power management module 1441 may also be disposed in the processor 1410. In other embodiments, the power management module 1441 and the charging management module 1440 may be disposed in the same device.
The wireless communication function of the terminal device can be implemented by the antenna 1, the antenna 2, the mobile communication module 1450, the wireless communication module 1460, the modem processor, the baseband processor, and the like.
The antennas 1 and 2 are used for transmitting and receiving electromagnetic wave signals. Each antenna in a terminal device may be used to cover a single or multiple communication bands. Different antennas can also be multiplexed to improve the utilization of the antennas. For example: the antenna 1 may be multiplexed as a diversity antenna of a wireless local area network. In other embodiments, the antenna may be used in conjunction with a tuning switch.
The mobile communication module 1450 may provide a solution including 2G/3G/4G/5G wireless communication applied on the terminal device. The mobile communication module 1450 may include at least one filter, a switch, a power amplifier, a Low Noise Amplifier (LNA), and the like. The mobile communication module 1450 may receive electromagnetic waves from the antenna 1, filter, amplify, etc. the received electromagnetic waves, and transmit the electromagnetic waves to the modem processor for demodulation. The mobile communication module 1450 may also amplify the signal modulated by the modem processor, and convert the signal into electromagnetic wave through the antenna 1 to radiate the electromagnetic wave. In some embodiments, at least some of the functional modules of the mobile communication module 1450 may be disposed in the processor 1410. In some embodiments, at least some of the functional blocks of the mobile communication module 1450 may be provided in the same device as at least some of the blocks of the processor 1410.
The modem processor may include a modulator and a demodulator. The modulator is used for modulating a low-frequency baseband signal to be transmitted into a medium-high frequency signal. The demodulator is used for demodulating the received electromagnetic wave signal into a low-frequency baseband signal. The demodulator then passes the demodulated low frequency baseband signal to a baseband processor for processing. The low frequency baseband signal is processed by the baseband processor and then transferred to the application processor. The application processor outputs a sound signal through an audio device (not limited to the speaker 1470A, the receiver 1470B, etc.) or displays an image or video through the display 1494. In some embodiments, the modem processor may be a stand-alone device. In other embodiments, the modem processor may be separate from the processor 1410, and may be located in the same device as the mobile communication module 1450 or other functional modules.
The wireless communication module 1460 may provide solutions for wireless communication applied to a terminal device, including Wireless Local Area Networks (WLANs) (e.g., wireless fidelity (Wi-Fi) networks), Bluetooth (BT), Global Navigation Satellite System (GNSS), Frequency Modulation (FM), Near Field Communication (NFC), Infrared (IR), and the like. The wireless communication module 1460 may be one or more devices integrating at least one communication processing module. The wireless communication module 1460 receives electromagnetic waves via the antenna 2, performs frequency modulation and filtering on electromagnetic wave signals, and transmits the processed signals to the processor 1410. The wireless communication module 1460 may also receive a signal to be transmitted from the processor 1410, frequency modulate it, amplify it, and convert it into electromagnetic waves via the antenna 2 to radiate it out.
The terminal device realizes the display function through the GPU, the display screen 1494, the application processor and the like. The GPU is a microprocessor for image processing, connected to the display 1494 and the application processor. The GPU is used to perform mathematical and geometric calculations for graphics rendering. Processor 1410 may include one or more GPUs that execute program instructions to generate or change display information.
The internal memory 1421 may be used to store computer-executable program code, which includes instructions. The internal memory 1421 may include a program storage area and a data storage area. The storage program area may store an operating system, an application program (such as a sound playing function, an image playing function, etc.) required by at least one function, and the like. The storage data area may store data (such as audio data, a phonebook, etc.) created during use of the terminal device, and the like. In addition, the internal memory 1421 may include a high-speed random access memory, and may further include a nonvolatile memory, such as at least one magnetic disk storage device, a flash memory device, a universal flash memory (UFS), and the like. The processor 1410 performs various operations in the method embodiments described above in connection with fig. 3, 11, and 12 by executing instructions stored in the internal memory 1421 and/or instructions stored in a memory provided in the processor.
The terminal device may implement audio functions through the audio module 1470, the speaker 1470A, the receiver 1470B, the microphone 1470C, the earphone interface 1470D, the application processor, and the like. Such as music playing, recording, etc.
The audio module 1470 is used to convert digital audio information into an analog audio signal output and also used to convert an analog audio input into a digital audio signal. The audio module 1470 may also be used to encode and decode audio signals. In some embodiments, the audio module 1470 may be disposed in the processor 1410, or some functional modules of the audio module 1470 may be disposed in the processor 1410.
The speaker 1470A, also referred to as a "horn," is used to convert electrical audio signals into acoustic signals. The terminal device can listen to music through the speaker 1470A or listen to a handsfree call.
A receiver 1470B, also called "earpiece", is used to convert the electrical audio signal into an acoustic signal. When the electronic device 201 answers a call or voice information, the voice can be answered by placing the receiver 1470B close to the ear of the person.
The microphone 1470C, also referred to as a "microphone", is used to convert sound signals into electrical signals. When making a call or sending voice information, the user can input a voice signal into the microphone 1470C by speaking the user's mouth near the microphone 1470C. The electronic device 201 may be provided with at least one microphone 1470C. In other embodiments, the electronic device 201 may be provided with two microphones 1470C to implement a noise reduction function in addition to collecting sound signals. In other embodiments, the electronic device 201 may further include three, four, or more microphones 1470C to collect sound signals, reduce noise, identify sound sources, perform directional recording, and so on.
The headset interface 1470D is used to connect wired headsets. The headset interface 1470D may be the USB interface 1430, or may be a 3.5mm open mobile electronic device platform (OMTP) standard interface, a cellular telecommunications industry association (cellular telecommunications industry association of the USA, CTIA) standard interface.
It should be noted that although it is shown in fig. 14 that the processor 1410 may be directly connected to the internal memory 1421, the audio module 1470, the sensor module 1480, the indicator 1492, the camera 1493, the display 1494, the keys 1490, and the like, the terminal device may also include system control logic (not shown), and the processor 1410 may be connected to one or more of the internal memory 1421, the audio module 1470, the sensor module 1480, the indicator 1492, the camera 1493, the display 1494, and the keys 1490 through the system control logic. The system control logic may include, among other things, any suitable interface controllers to provide any suitable interface to the processor 1410 and/or any suitable device or component in communication with the system control logic.
In some embodiments of the present application, there is also provided a system comprising the wearable device 201 and the terminal device 202 described above.
While the description of the present application will be described in conjunction with the preferred embodiments, it is not intended to limit the features of the present invention to that embodiment. Rather, the invention has been described in connection with embodiments for the purpose of covering alternatives and modifications as may be extended based on the claims of the present application. In the following description, numerous specific details are included to provide a thorough understanding of the present application. The present application may be practiced without these particulars. Moreover, some of the specific details have been omitted from the description in order to avoid obscuring or obscuring the focus of the present application. It should be noted that the embodiments and features of the embodiments in the present application may be combined with each other without conflict.
Further, various operations will be described as multiple discrete operations, in a manner that is most helpful in understanding the illustrative embodiments; however, the order of description should not be construed as to imply that these operations are necessarily order dependent. In particular, these operations need not be performed in the order of presentation.
The terms "comprising," "having," and "including" are synonymous, unless the context dictates otherwise. The phrase "A/B" means "A or B". The phrase "A and/or B" means "(A and B) or (A or B)".
As used herein, the term "module" or "unit" may refer to, be, or include: an Application Specific Integrated Circuit (ASIC), an electronic circuit, a processor (shared, dedicated, or group) and/or memory that executes one or more software or firmware programs, a combinational logic circuit, and/or other suitable components that provide the described functionality.
In the drawings, some features of the structures or methods are shown in a particular arrangement and/or order. However, it is to be understood that such specific arrangement and/or ordering may not be required. In some embodiments, these features may be arranged in a manner and/or order different from that shown in the illustrative figures. Additionally, the inclusion of structural or methodical features in a particular figure is not meant to imply that such features are required in all embodiments, and in some embodiments, these features may not be included or may be combined with other features.
Embodiments of the mechanisms disclosed herein may be implemented in hardware, software, firmware, or a combination of these implementations. Embodiments of the application may be implemented as computer programs or program code executing on programmable systems comprising multiple processors, a storage system (including volatile and non-volatile memory and/or storage elements), multiple input devices, and multiple output devices.
Program code may be applied to input instructions to perform the functions described herein and generate output information. The output information may be applied to one or more output devices in a known manner. For purposes of this application, a processing system includes any system having a processor such as, for example, a Digital Signal Processor (DSP), a microcontroller, an Application Specific Integrated Circuit (ASIC), or a microprocessor.
The program code may be implemented in a high level procedural or object oriented programming language to communicate with a processing system. The program code can also be implemented in assembly or machine language, if desired. Indeed, the mechanisms described in this application are not limited in scope to any particular programming language. In any case, the language may be a compiled or interpreted language.
In some cases, the disclosed embodiments may be implemented in hardware, firmware, software, or any combination thereof. In some cases, one or more aspects of at least some embodiments may be implemented by representative instructions stored on a computer-readable storage medium, which represent various logic in a processor, which when read by a machine causes the machine to fabricate logic to perform the techniques described herein. These representations, known as "IP cores" may be stored on a tangible computer-readable storage medium and provided to a number of customers or manufacturing facilities to load into the manufacturing machines that actually make the logic or processor.
Such computer-readable storage media may include, but are not limited to, non-transitory tangible arrangements of articles of manufacture or formation by machines or devices that include storage media such as: hard disk any other type of disk including floppy disks, optical disks, compact disk read-only memories (CD-ROMs), compact disk rewritables (CD-RWs), and magneto-optical disks; semiconductor devices such as Read Only Memory (ROM), Random Access Memory (RAM) such as Dynamic Random Access Memory (DRAM) and Static Random Access Memory (SRAM), Erasable Programmable Read Only Memory (EPROM), flash memory, Electrically Erasable Programmable Read Only Memory (EEPROM); phase Change Memory (PCM); magnetic or optical cards; or any other type of media suitable for storing electronic instructions.
Thus, embodiments of the present application also include non-transitory computer-readable storage media that contain instructions or that contain design data, such as Hardware Description Language (HDL), that define the structures, circuits, devices, processors, and/or system features described herein.

Claims (17)

1. A damage risk assessment method is applied to a first terminal device, and is characterized by comprising the following steps:
acquiring personal data of a user, wherein the personal data comprises the height of the user;
in response to a first operation, acquiring motion data through a sensor of the first terminal device, the motion data being related to movement of the first terminal device;
determining a vertical amplitude and a touchdown angle according to the personal data and the exercise data, wherein the vertical amplitude is related to the change of the height of the gravity center of the user in the exercise, and the touchdown angle is related to the touchdown condition of the user in the exercise; determining the damage risk brought to the user by the movement according to the vertical amplitude and the landing included angle; and
alerting the user to possible injuries in the sport if the injury risk is greater than or equal to a threshold.
2. The method of claim 1, wherein the profile further includes a gender of the user, and wherein determining a vertical amplitude from the profile and the motion data comprises:
determining a center of gravity of the user based on the height of the user and the gender of the user;
determining, from the center of gravity, a change in the height of the center of gravity of the user in the motion, wherein the height of the center of gravity comprises a vertical distance of the center of gravity from the ground; and
determining the vertical amplitude from the change in the height of the center of gravity, wherein the vertical amplitude comprises a difference between a maximum and a minimum of the height of the center of gravity.
3. The method of claim 1 or 2, wherein said determining a landing angle based on said profile and said athletic data comprises:
determining the included landing angle according to the height of the center of gravity and the landing point of the user in the movement by the following formula:
Figure FDA0002410549510000011
wherein α represents the ground contact angle; h isGRepresents the height of the center of gravity; l represents a touchdown distance of the user, the touchdown distance including a horizontal distance of the touchdown point from the center of gravity.
4. The method of claim 1 or 2, wherein said determining a landing angle based on said profile and said athletic data comprises:
determining the included landing angle according to the height of the center of gravity and the landing point of the user in the movement by the following formula:
Figure FDA0002410549510000012
wherein α represents the ground contact angle; h isGRepresents the height of the center of gravity; l represents a touchdown distance of the user, the touchdown distance including a horizontal distance of the touchdown point from the center of gravity.
5. The method of any one of claims 1 to 4, wherein the profile further comprises a weight of the user, and wherein said determining the risk of injury to the user from the movement based on the vertical amplitude and the included angle of strike comprises:
determining an injury risk factor indicating the influence degree of the body weight, the vertical amplitude and the landing angle on the injury risk according to the body weight, the vertical amplitude and the landing angle.
6. The method of claim 5, wherein the injury risk factor is greater if the weight of the user is heavier, or if the vertical amplitude is greater, or if the ground contact angle is smaller.
7. The method of claim 5, wherein determining a damage risk factor indicative of a degree of influence of the body weight, the vertical amplitude, and the ground contact angle on the risk of damage based on the body weight, the vertical amplitude, and the ground contact angle comprises:
determining the injury risk factor according to the body weight, the vertical amplitude and the landing angle through the following formula:
Figure FDA0002410549510000021
wherein IR represents the injury risk factor; m represents the weight of the user; h represents the vertical amplitude; g represents the gravitational acceleration of the limb; and alpha represents the included angle of the ground.
8. The method of any of claims 5 to 7, wherein said determining said risk of injury to said limb from said motion based on said vertical amplitude and said included stance angle, further comprises:
selecting an injury probability corresponding to the injury risk factor from a plurality of injury probabilities associated with a plurality of known samples;
wherein the plurality of damage probabilities are obtained based on statistics of damage risk factors and damage outcomes for each of the plurality of known samples.
9. The method of claim 8, wherein said alerting the user that the user may be injured in the sport if the injury risk is greater than or equal to a threshold value comprises:
in the event that it is determined that the damage probability is greater than or equal to the threshold, alerting the user to possible damage in the sport by at least one of verbal means, vibratory means, and display means.
10. The method of claim 8 or 9, wherein said alerting the user that the user may be injured in the sport if the injury risk is greater than or equal to a threshold value, further comprises:
in an instance in which it is determined that the impairment probability is greater than or equal to the threshold, suggesting that the user adjust the posture of the motion.
11. The method of any of claims 1 to 10, wherein the responding to the first operation comprises:
and receiving first information from a second terminal device, wherein the first information is used for instructing the first terminal device to acquire the motion data.
12. The method of claim 11, wherein said alerting the user that the user may be injured in the sport if the injury risk is greater than or equal to a threshold value comprises:
and sending second information to the second terminal equipment, wherein the second information is used for enabling the second terminal equipment to remind the user of possible damage in the movement.
13. The method of any of claims 1 to 11, wherein the motion data comprises at least one of acceleration, angular velocity, and direction of motion.
14. A machine-readable medium having stored thereon instructions which, when executed on the machine, cause the machine to perform the method of any one of claims 1 to 13.
15. A terminal device, comprising:
a processor;
a memory having instructions stored thereon that, when executed by the processor, cause the terminal device to perform the method of any of claims 1 to 13.
16. A chip, characterized in that,
stored on the chip is a computer program which, when executed by a processor, performs the method of any of the preceding claims 1 to 13.
17. A system comprising a first terminal device and a second terminal device, characterized by:
the second terminal device is configured to:
acquiring personal data of a user and sending the personal data to the first terminal equipment; and
receiving a first operation from the user, and responding to the first operation, sending first information to the first terminal equipment, wherein the first information is used for instructing the first terminal equipment to acquire motion data, and the motion data is related to the movement of the first terminal equipment;
the first terminal device is configured to:
in response to receiving the first information from the second terminal device, acquiring the motion data through a sensor of the first terminal device;
determining a vertical amplitude and a touchdown angle according to the personal data and the exercise data, wherein the vertical amplitude is related to the change of the height of the gravity center of the user in the exercise, and the touchdown angle is related to the touchdown condition of the user in the exercise;
determining the damage risk brought to the user by the movement according to the vertical amplitude and the landing included angle; and
and sending second information to the second terminal equipment when the damage risk is greater than or equal to a threshold value, wherein the second information is used for enabling the second terminal equipment to remind the user of possible damage in the motion.
CN202010175108.XA 2020-03-13 2020-03-13 Damage risk assessment method, medium, chip, terminal device and system Pending CN113384863A (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
CN202010175108.XA CN113384863A (en) 2020-03-13 2020-03-13 Damage risk assessment method, medium, chip, terminal device and system

Applications Claiming Priority (1)

Application Number Priority Date Filing Date Title
CN202010175108.XA CN113384863A (en) 2020-03-13 2020-03-13 Damage risk assessment method, medium, chip, terminal device and system

Publications (1)

Publication Number Publication Date
CN113384863A true CN113384863A (en) 2021-09-14

Family

ID=77615903

Family Applications (1)

Application Number Title Priority Date Filing Date
CN202010175108.XA Pending CN113384863A (en) 2020-03-13 2020-03-13 Damage risk assessment method, medium, chip, terminal device and system

Country Status (1)

Country Link
CN (1) CN113384863A (en)

Citations (5)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN105311816A (en) * 2014-07-31 2016-02-10 精工爱普生株式会社 Notification device, exercise analysis system, notification method, and exercise support device
CN106037640A (en) * 2016-05-18 2016-10-26 深圳多跑体育科技有限公司 Injury remote analysis system and method
CN108211309A (en) * 2017-05-25 2018-06-29 深圳市未来健身衣科技有限公司 The guidance method and device of body building
CN109640817A (en) * 2016-04-13 2019-04-16 强健手臂技术公司 For motion tracking, assessment and the system of monitoring and device and its application method
US10555689B1 (en) * 2019-02-08 2020-02-11 The Florida International University Board Of Trustees CPS pressure based sensing system for symmetry measurements

Patent Citations (5)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN105311816A (en) * 2014-07-31 2016-02-10 精工爱普生株式会社 Notification device, exercise analysis system, notification method, and exercise support device
CN109640817A (en) * 2016-04-13 2019-04-16 强健手臂技术公司 For motion tracking, assessment and the system of monitoring and device and its application method
CN106037640A (en) * 2016-05-18 2016-10-26 深圳多跑体育科技有限公司 Injury remote analysis system and method
CN108211309A (en) * 2017-05-25 2018-06-29 深圳市未来健身衣科技有限公司 The guidance method and device of body building
US10555689B1 (en) * 2019-02-08 2020-02-11 The Florida International University Board Of Trustees CPS pressure based sensing system for symmetry measurements

Non-Patent Citations (2)

* Cited by examiner, † Cited by third party
Title
(瑞典)拉尔斯•彼得松(LARS PETERSON)等: "《运动损伤学 预防、治疗与康复》", 31 March 2019 *
刘宇,傅维杰著: "《生物力学研究前沿系列 人体运动生物力学》", 31 July 2018 *

Similar Documents

Publication Publication Date Title
US10001386B2 (en) Automatic track selection for calibration of pedometer devices
CN110070863A (en) A kind of sound control method and device
CN112447273A (en) Method and electronic device for assisting fitness
CN112783330A (en) Electronic equipment operation method and device and electronic equipment
CN110263617A (en) Three-dimensional face model acquisition methods and device
CN114067776A (en) Electronic device and audio noise reduction method and medium thereof
CN113892920A (en) Wearable device wearing detection method and device and electronic device
CN110956971A (en) Audio processing method, device, terminal and storage medium
WO2022213834A1 (en) Method for determining exercise guidance information, electronic device, and exercise guidance system
KR20200120105A (en) Electronic device and method for providing information to relieve stress thereof
CN111191018B (en) Response method and device of dialogue system, electronic equipment and intelligent equipment
CN113384863A (en) Damage risk assessment method, medium, chip, terminal device and system
CN108922224B (en) Position prompting method and related product
CN109285563A (en) Voice data processing method and device during translation on line
CN114100101B (en) Running posture detection method and equipment
CN114073496A (en) Pulse wave measuring device and pulse wave measuring method, system and medium thereof
CN113359120B (en) Method and device for measuring user activity distance and electronic device
CN115249364A (en) Target user determination method, electronic device and computer-readable storage medium
CN112447272A (en) Prompting method for fitness training and electronic equipment
US20230285808A1 (en) Electronic apparatus for providing personalized exercise coaching and operating method thereof
US20230412759A1 (en) Electronic device and control method for controlling speed of workout video
CN108389107B (en) Protective tool recommendation method and related product
CN110286718B (en) Wearable device, method for utilizing wearing physiological information thereof, and computer-readable storage medium
EP4372521A1 (en) Wearable electronic device and method by which wearable electronic device provides brushing teeth information
CN116414339A (en) Wearable device and control method thereof

Legal Events

Date Code Title Description
PB01 Publication
PB01 Publication
SE01 Entry into force of request for substantive examination
SE01 Entry into force of request for substantive examination
RJ01 Rejection of invention patent application after publication
RJ01 Rejection of invention patent application after publication

Application publication date: 20210914