CN116072291A - Motion analysis method, motion analysis device, electronic equipment and computer storage medium - Google Patents

Motion analysis method, motion analysis device, electronic equipment and computer storage medium

Info

Publication number
CN116072291A
CN116072291A (application CN202111276279.2A)
Authority
CN
China
Prior art keywords
data
user
value
joint
electronic device
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Pending
Application number
CN202111276279.2A
Other languages
Chinese (zh)
Inventor
孙宇
刘航
徐腾
陈霄汉
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
Huawei Device Co Ltd
Original Assignee
Huawei Device Co Ltd
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by Huawei Device Co Ltd filed Critical Huawei Device Co Ltd
Priority: CN202111276279.2A
Priority: PCT/CN2022/127953 (published as WO2023072195A1)
Publication: CN116072291A
Legal status: Pending

Classifications

    • A61B 5/00 Measuring for diagnostic purposes; Identification of persons
    • A61B 5/11 Measuring movement of the entire body or parts thereof, e.g. head or hand tremor, mobility of a limb
    • A61B 5/1118 Determining activity level
    • A61B 5/1121 Determining geometric values, e.g. centre of rotation or angular range of movement
    • A61B 5/1122 Determining geometric values of movement trajectories
    • A61B 5/4528 For evaluating or diagnosing the musculoskeletal system: Joints
    • A61B 5/4538 Evaluating a particular part of the musculoskeletal system or a particular medical condition
    • A61B 5/4585 Evaluating the knee
    • A61B 5/4595 Evaluating the ankle
    • A61B 5/72 Signal processing specially adapted for physiological signals or for diagnostic purposes
    • G06F 17/16 Matrix or vector computation, e.g. matrix-matrix or matrix-vector multiplication, matrix factorization
    • G06T 7/00 Image analysis
    • G06T 7/0012 Biomedical image inspection
    • G16H 20/30 ICT specially adapted for therapies or health-improving plans relating to physical therapies or activities, e.g. physiotherapy, acupressure or exercising
    • G16H 50/30 ICT specially adapted for medical diagnosis, medical simulation or medical data mining, for calculating health indices; for individual health risk assessment
    • G06T 2207/30008 Biomedical image processing: Bone
    • G06T 2207/30196 Subject of image: Human being; Person
    • Y02T 90/00 Enabling technologies or technologies with a potential or indirect contribution to GHG emissions mitigation

Landscapes

  • Health & Medical Sciences (AREA)
  • Engineering & Computer Science (AREA)
  • Life Sciences & Earth Sciences (AREA)
  • Physics & Mathematics (AREA)
  • Medical Informatics (AREA)
  • Public Health (AREA)
  • General Health & Medical Sciences (AREA)
  • Biomedical Technology (AREA)
  • Biophysics (AREA)
  • Pathology (AREA)
  • Molecular Biology (AREA)
  • Heart & Thoracic Surgery (AREA)
  • Surgery (AREA)
  • Animal Behavior & Ethology (AREA)
  • Veterinary Medicine (AREA)
  • General Physics & Mathematics (AREA)
  • Dentistry (AREA)
  • Oral & Maxillofacial Surgery (AREA)
  • Mathematical Physics (AREA)
  • Theoretical Computer Science (AREA)
  • Data Mining & Analysis (AREA)
  • Physical Education & Sports Medicine (AREA)
  • Physiology (AREA)
  • Computational Mathematics (AREA)
  • Orthopedic Medicine & Surgery (AREA)
  • Mathematical Analysis (AREA)
  • Mathematical Optimization (AREA)
  • Primary Health Care (AREA)
  • Pure & Applied Mathematics (AREA)
  • Rheumatology (AREA)
  • Databases & Information Systems (AREA)
  • Epidemiology (AREA)
  • Computer Vision & Pattern Recognition (AREA)
  • Geometry (AREA)
  • Software Systems (AREA)
  • General Engineering & Computer Science (AREA)
  • Algebra (AREA)
  • Computing Systems (AREA)
  • Nuclear Medicine, Radiotherapy & Molecular Imaging (AREA)
  • Quality & Reliability (AREA)

Abstract

The embodiments of this application disclose a motion analysis method, a motion analysis device, an electronic device and a computer storage medium. The method includes: acquiring first data of a target object according to body parameters of the target object; acquiring second data of the target object and the ground-off state of the foot; calculating a first value and a second value of the ankle joint based on the ground-off state; constructing a first coordinate system based on the motion posture of the target object, where the first coordinate system is used to construct a homogeneous transformation matrix and to obtain first coordinates of the human joints in the first coordinate system; calculating the angular velocity of the lower leg from the first coordinates and the first data based on the homogeneous transformation matrix; and calculating a third value and a fourth value of the knee joint based on the first data, the second data, the first and second values of the ankle joint, and the angular velocity of the lower leg. With the embodiments of this application, joint loading can be calculated by a simpler method so as to assess the risk of sports injury.

Description

Motion analysis method, motion analysis device, electronic equipment and computer storage medium
Technical Field
The present disclosure relates to the field of motion analysis technologies, and in particular, to a motion analysis method, a motion analysis device, an electronic device, and a computer storage medium.
Background
Exercise is an important part of daily human life and requires the coordinated action of different joints of the human body. During exercise, nonstandard postures and excessive joint loading can cause sports injuries of varying degrees.
At present, joint loading can be simulated and solved with a professional motion capture system and inverse dynamics analysis software, so as to provide motion guidance. However, such a scheme involves complex algorithms and expensive equipment, and is inconvenient for daily use.
Therefore, how to obtain a simpler and lower-cost method for calculating joint loading, so as to effectively provide injury risk warnings, is a technical problem that needs to be solved by those skilled in the art.
Disclosure of Invention
The embodiments of this application disclose a motion analysis method, a motion analysis device, an electronic device and a computer storage medium, in which the joint loading is calculated by a simpler method to provide injury risk warnings.
In a first aspect, embodiments of the present application provide a motion analysis method, the method including:
acquiring first data of a target object according to body parameters of the target object, where the first data comprise the mass, center of mass and moment of inertia of the human body links;
acquiring second data of the target object and a ground-off state of the foot, where the second data comprise the movement velocity and angular velocity of the centroid of the human body link and the position information of the human joints;
calculating a first value and a second value of the ankle joint based on the ground-off state;
constructing a first coordinate system based on the motion posture of the target object, wherein the first coordinate system is used for constructing a homogeneous transformation matrix and acquiring a first coordinate of the human joint in the first coordinate system;
calculating the angular velocity of the lower leg according to the first coordinate and the first data based on the homogeneous transformation matrix;
calculating a third value and a fourth value of the knee joint based on the first data, the second data, the first value and the second value of the ankle joint, and the angular velocity of the lower leg.
In this method, by acquiring the human body inertial parameters and motion posture data of the target object and combining them with the ground-off state, the joint force and moment of the ankle joint of the target object are calculated according to the momentum theorem and the theorem of moment of momentum, and the joint force and moment of the knee joint are then calculated by combining these with the joint force and moment of the ankle joint. With this method, the joint force and moment of the ankle joint can be calculated based on the ground-off state of the target object, and the joint force and moment of the knee joint can then be obtained; the calculation method is simple, and no professional motion analysis equipment is needed, which reduces cost.
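For readers who prefer a concrete picture of the two data sets, the following sketch models the "first data" (per-link inertial parameters) and "second data" (per-link kinematic data) as plain containers. The class and field names are illustrative only and are not taken from the patent.

```python
from dataclasses import dataclass
import numpy as np

@dataclass
class SegmentInertia:
    """'First data' for one human body link: inertial parameters estimated from body parameters."""
    mass: float          # kg
    com: np.ndarray      # 3D position of the link's center of mass
    inertia: np.ndarray  # 3x3 moment-of-inertia tensor about the center of mass

@dataclass
class SegmentKinematics:
    """'Second data' for one human body link: kinematic data from the detected motion posture."""
    com_velocity: np.ndarray      # velocity of the link's center of mass
    angular_velocity: np.ndarray  # angular velocity of the link
    joint_positions: dict         # joint name -> 3D position, e.g. {"ankle": ..., "knee": ...}
```

In the sketches that follow, one such pair of records per body link (foot, shank, thigh, and so on) is assumed to be available for every frame of the detected motion.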
With reference to the first aspect, in one possible implementation manner, the method further includes: calculating the angular velocity of the thigh according to the first coordinate and the first data based on the homogeneous transformation matrix; and calculating a fifth value and a sixth value of the hip joint based on the first data, the second data, the first value and the second value of the ankle joint, the third value and the fourth value of the knee joint, and the angular velocity of the thigh.
The method can calculate the joint force and moment of the hip joint by combining the angular velocity of the thigh, the inertial parameters of the human body and the movement posture data on the basis of calculating the joint force and moment of the ankle joint and the knee joint.
With reference to the first aspect, in one possible implementation manner, calculating the first value and the second value of the ankle joint based on the ground-off state includes: when the ground-off state is a first state, the first value and the second value of the ankle joint are both 0; when the ground-off state is a second state, calculating the first value and the second value of the ankle joint according to the first data and the second data; and when the ground-off state is a third state, calculating the first value and the second value of the ankle joint according to the first data and the second data, where the second data are the position information of the human joints, the second coordinate is the ground projection coordinate of the center of gravity of the target object, the third coordinate and the fourth coordinate are the ankle joint coordinates, and the second, third and fourth coordinates are obtained from the second data.
The ground-off state is divided into three cases: single-foot support, double-foot support, and both feet off the ground. Based on the detected ground-off state, the joint force and moment of the ankle joint are calculated under the corresponding condition. Because different data are used for different ground-off states, the joint forces and moments of the ankle joints can be calculated quickly and simply in each state.
With reference to the first aspect, in one implementation manner, calculating the first value and the second value of the ankle joint according to the first data and the second data includes calculating the first value and the second value by the following formulas:
F_1 + F_2 = \sum_i m_i \Delta v_{ci} + G
M_1 + M_2 = \sum_i ( J_i \Delta\omega_i - r_i \times m_i g )
where F1 and F2 are the first values of the ankle joints, M1 and M2 are the second values of the ankle joints, m_i is the mass of the i-th human body link, v_{ci} is the velocity of the centroid of the i-th human body link, G is the weight of the user calculated from the body parameters, J_i is the moment of inertia of the human body link, \omega_i is the angular velocity of the centroid of the human body link, r_i is the vector from the centroid of the human body link to the reference point, and g is the gravitational acceleration.
In this method, for the single-foot support state, the joint force and moment of the ankle joint of the airborne foot are 0, and the joint force and moment of the ankle joint of the grounded foot can be calculated from the human inertial parameters and the human joint motion data according to the above formulas.
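As a minimal, hedged illustration of the two formulas above, the sketch below sums the per-link momentum and angular-momentum changes to obtain the total ankle force and moment in the single-support case. The sign conventions, the axis layout, and the division of Δv and Δω by the frame interval dt are assumptions of this example, not statements of the patent.

```python
import numpy as np

G_VEC = np.array([0.0, 0.0, -9.81])  # gravitational acceleration g (downward; axis layout assumed)

def ankle_load_single_support(masses, inertias, dv_com, domega, r_to_ref, dt):
    """Total ankle load for single-foot support (the airborne ankle is assigned zero).

    Evaluates, per body link i:
        F = sum_i m_i * dv_ci / dt + G
        M = sum_i (J_i * dw_i / dt - r_i x m_i * g)
    where dv_ci / dw_i are per-frame changes of the link centroid velocity and
    angular velocity, r_i is the vector from the link centroid to the reference
    point, and G is the user's weight. Division by dt and the sign conventions
    are assumptions of this sketch.
    """
    force = np.zeros(3)
    moment = np.zeros(3)
    for m, J, dv, dw, r in zip(masses, inertias, dv_com, domega, r_to_ref):
        force += m * np.asarray(dv) / dt
        moment += np.asarray(J) @ (np.asarray(dw) / dt) - np.cross(np.asarray(r), m * G_VEC)
    force += sum(masses) * (-G_VEC)  # weight term G (sign convention assumed)
    return force, moment
```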
With reference to the first aspect, in one possible implementation manner, calculating the first value and the second value of the ankle joint according to the first data and the second data, where the second data are the position information of the human joints, the second coordinate is the ground projection coordinate of the center of gravity of the target object, the third coordinate and the fourth coordinate are the ankle joint coordinates, and the second, third and fourth coordinates are obtained from the second data, includes calculating the first value and the second value by the following formulas:
F_1 + F_2 = \sum_i m_i \Delta v_{ci} + G
M_1 + M_2 = \sum_i ( J_i \Delta\omega_i - r_i \times m_i g )
[Four further equations, reproduced only as images in the original publication (BDA0003329381120000021 to BDA0003329381120000024), relate the individual ankle values to the coordinates P_proj, P_1 and P_2.]
where F1 and F2 are the first values of the ankle joints, M1 and M2 are the second values of the ankle joints, m_i is the mass of the human body link, v_{ci} is the velocity of the centroid of the human body link, G is the weight of the user calculated from the body parameters, J_i is the moment of inertia of the human body link, \omega_i is the angular velocity of the centroid of the human body link, r_i is the vector from the centroid of the human body link to the reference point, g is the gravitational acceleration, P_proj is the second coordinate, P_1 is the third coordinate, and P_2 is the fourth coordinate.
The method can calculate the joint forces and moments according to the above formulas for the double-foot support state by combining the human inertial parameters, the human joint motion data and the ankle joint coordinates.
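The four image-only equations that split the load between the two ankles are not recoverable from this text. A common split, suggested by the variables P_proj, P_1 and P_2, weights each ankle by the relative distance of the center-of-gravity ground projection to the other ankle; the sketch below implements that assumed split and should not be read as the patent's exact formula.

```python
import numpy as np

def split_double_support(total_force, total_moment, p_proj, p_ankle_1, p_ankle_2, eps=1e-9):
    """Distribute the total ankle load between two grounded feet.

    ASSUMPTION: each ankle carries a share inversely proportional to the
    distance between the center-of-gravity ground projection p_proj and that
    ankle, i.e. w1 = d2 / (d1 + d2), w2 = 1 - w1. The patent's own split
    (given only as images) may differ.
    """
    d1 = np.linalg.norm(np.asarray(p_proj) - np.asarray(p_ankle_1))
    d2 = np.linalg.norm(np.asarray(p_proj) - np.asarray(p_ankle_2))
    w1 = d2 / (d1 + d2 + eps)  # the foot closer to the projection carries the larger share
    w2 = 1.0 - w1
    return (w1 * np.asarray(total_force), w1 * np.asarray(total_moment)), \
           (w2 * np.asarray(total_force), w2 * np.asarray(total_moment))
```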
With reference to the first aspect, in one possible implementation manner, calculating the angular velocity of the lower leg from the first coordinates and the first data based on the homogeneous transformation matrix includes: substituting the first coordinates into the following formulas to calculate the rotation angles of the human joints, and calculating the angular velocity of the lower leg based on the rotation angles of the human joints:
[Two equations, reproduced only as images in the original publication (BDA0003329381120000031 and BDA0003329381120000032), relate the first coordinate p to the parameters a, d, \alpha and \theta of the homogeneous transformation, for 1 \le i \le 6 with i a positive integer.]
where m is a coefficient, p is the first coordinate, a, d and \alpha are known distances or angles in the first coordinate system, and \theta is the initial included angle plus the rotation angle of the human joint.
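The image-only formulas use the quantities a, d, α and θ that parameterize a chain of homogeneous transformations. Assuming the standard Denavit-Hartenberg form of such a transformation (an assumption, since the patent's matrix is not reproduced in the text), a sketch looks as follows; composing the six per-joint transforms (1 ≤ i ≤ 6) maps the first coordinates between the sub-coordinate systems, and differencing the recovered joint angles over time gives the angular velocity of the lower leg.

```python
import numpy as np

def dh_transform(a: float, d: float, alpha: float, theta: float) -> np.ndarray:
    """Homogeneous transformation between adjacent sub-coordinate systems.

    Standard Denavit-Hartenberg form (assumed here; the patent's image-only
    matrix is built from the same a, d, alpha and theta but is not reproduced).
    theta is the initial included angle plus the joint rotation angle.
    """
    ct, st = np.cos(theta), np.sin(theta)
    ca, sa = np.cos(alpha), np.sin(alpha)
    return np.array([
        [ct, -st * ca,  st * sa, a * ct],
        [st,  ct * ca, -ct * sa, a * st],
        [0.0,      sa,       ca,      d],
        [0.0,     0.0,      0.0,    1.0],
    ])

def chain_transform(dh_params):
    """Compose the per-joint transforms (here up to six, 1 <= i <= 6)."""
    T = np.eye(4)
    for a, d, alpha, theta in dh_params:
        T = T @ dh_transform(a, d, alpha, theta)
    return T

def joint_angular_velocity(theta_prev: float, theta_curr: float, dt: float) -> float:
    """Angular velocity of a segment from the change of its joint rotation angle."""
    return (theta_curr - theta_prev) / dt
```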
With reference to the first aspect, in one possible implementation manner, calculating the third value and the fourth value of the knee joint based on the first data, the second data, the first value and the second value of the ankle joint, and the angular velocity of the lower leg includes calculating the third value and the fourth value by the following formulas:
[Two equations, reproduced only as images in the original publication (BDA0003329381120000033 and BDA0003329381120000034), give the third value F3 and the fourth value M4.]
where F3 is the third value, M4 is the fourth value, m_shank is the mass of the lower leg (calf) in the first data, v_{c,shank} is the velocity of the centroid of the lower leg in the second data, r_shank is the vector between the centroid of the lower leg and the reference point obtained from the first data and the second data, r_foot is the vector between the centroid of the foot and the reference point obtained from the first data and the second data, J_shank is the moment of inertia of the lower leg in the first data, and \omega_shank is the angular velocity of the lower leg.
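The F3/M4 equations appear only as images in the original publication. A standard Newton-Euler inverse-dynamics step over the lower leg uses exactly the quantities listed above, so the following is a hedged sketch under assumed sign conventions, not the patent's own formula.

```python
import numpy as np

G_VEC = np.array([0.0, 0.0, -9.81])  # gravitational acceleration (assumed axes)

def knee_load(m_shank, J_shank, a_com_shank, alpha_shank,
              r_shank, r_foot, f_ankle, m_ankle):
    """Newton-Euler step over the lower leg (assumed form):

        F_knee = m_shank * a_com_shank - m_shank * g - F_ankle
        M_knee = J_shank @ alpha_shank - r_shank x F_knee - r_foot x F_ankle - M_ankle

    a_com_shank / alpha_shank are the linear / angular accelerations of the
    lower leg obtained by differencing the velocities described in the text above.
    """
    f_knee = m_shank * np.asarray(a_com_shank) - m_shank * G_VEC - np.asarray(f_ankle)
    m_knee = (np.asarray(J_shank) @ np.asarray(alpha_shank)
              - np.cross(r_shank, f_knee)
              - np.cross(r_foot, np.asarray(f_ankle))
              - np.asarray(m_ankle))
    return f_knee, m_knee
```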
With reference to the first aspect, in one possible implementation manner, calculating the angular velocity of the thigh from the first coordinates and the first data based on the homogeneous transformation matrix includes: calculating the rotation angles of the human joints corresponding to the first coordinates, and calculating the angular velocity of the thigh based on the rotation angles of the human joints:
[Two equations, reproduced only as images in the original publication (BDA0003329381120000037 and BDA0003329381120000038), relate the first coordinate p to the parameters a, d, \alpha and \theta of the homogeneous transformation, for 1 \le i \le 6 with i a positive integer.]
where m is a coefficient, p is the first coordinate, a, d and \alpha are known distances or angles in the first coordinate system, and \theta is the initial included angle plus the rotation angle of the human joint.
With reference to the first aspect, in one possible implementation manner, calculating the fifth value and the sixth value of the hip joint based on the first data, the second data, the first value and the second value of the ankle joint, the third value and the fourth value of the knee joint, and the angular velocity of the thigh includes calculating the fifth value and the sixth value by the following formulas:
[Two equations, reproduced only as images in the original publication (BDA0003329381120000041 and BDA0003329381120000042), give the fifth value F5 and the sixth value M6.]
where F5 is the fifth value, M6 is the sixth value, m_thigh is the mass of the thigh in the first data, v_{c,thigh} is the velocity of the centroid of the thigh in the second data, r_thigh is the vector between the centroid of the thigh and the reference point obtained from the first data and the second data, r_shank is the vector between the centroid of the lower leg and the reference point obtained from the first data and the second data, J_thigh is the moment of inertia of the thigh in the first data, and \omega_thigh is the angular velocity of the thigh.
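Likewise, the F5/M6 equations are image-only. The hip step has the same structure as the knee step one segment higher, so the sketch below is an assumed Newton-Euler analogue with thigh quantities and the knee load as inputs.

```python
import numpy as np

G_VEC = np.array([0.0, 0.0, -9.81])  # gravitational acceleration (assumed axes)

def hip_load(m_thigh, J_thigh, a_com_thigh, alpha_thigh,
             r_thigh, r_shank, f_knee, m_knee):
    """Same recursion as the knee step, one segment higher (assumed form)."""
    f_hip = m_thigh * np.asarray(a_com_thigh) - m_thigh * G_VEC - np.asarray(f_knee)
    m_hip = (np.asarray(J_thigh) @ np.asarray(alpha_thigh)
             - np.cross(r_thigh, f_hip)
             - np.cross(r_shank, np.asarray(f_knee))
             - np.asarray(m_knee))
    return f_hip, m_hip
```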
With reference to the first aspect, in one possible implementation manner, the first coordinate system includes: a reference sub-coordinate system, a first sub-coordinate system, and a second sub-coordinate system;
the homogeneous transformation matrix is constructed based on the relation among the reference sub-coordinate system, the first sub-coordinate system and the second sub-coordinate system; the relation among the reference sub-coordinate system, the first sub-coordinate system and the second sub-coordinate system comprises the distance and the angle among coordinate axes;
the first coordinates are coordinates of the human joint in the reference sub-coordinate system.
With reference to the first aspect, in one possible implementation manner, acquiring the ground-off state of the foot includes: displaying a first user interface, where the first user interface is used to display the setting of a ground-off reference value, the ground-off reference value being used to judge the ground-off state of the foot; and receiving a setting operation for the ground-off reference value. The ground-off state of the foot can then be judged from the configured ground-off reference value.
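A minimal sketch of how such a user-configurable reference value could be used to classify the ground-off state from the detected foot heights; the comparison semantics and the three state labels are assumptions of this example.

```python
def ground_off_state(left_foot_height, right_foot_height, ground_reference):
    """Classify the ground-off state against a user-set ground-off reference value.

    A foot is treated as off the ground once its detected height exceeds the
    reference value (an assumed reading of the setting described above).
    Returns one of 'double_support', 'single_support', 'airborne'.
    """
    left_off = left_foot_height > ground_reference
    right_off = right_foot_height > ground_reference
    if left_off and right_off:
        return "airborne"         # both feet off the ground
    if left_off or right_off:
        return "single_support"   # the airborne ankle is assigned zero load
    return "double_support"
```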
With reference to the first aspect, in one possible implementation manner, the method further includes: displaying a second user interface, where the second user interface displays a first image of the target object, and a first area and a first identifier are superimposed on the first image; the first area is the area of the human joint on the first image, and the first identifier is at least one of the first value and the second value of the ankle joint, the third value and the fourth value of the knee joint, and the fifth value and the sixth value of the hip joint. In this method, the joint forces and moments of the human joints can be displayed on the moving image, so that the loading of the target object is shown more intuitively.
With reference to the first aspect, in one possible implementation manner, after calculating the fifth value and the sixth value of the hip joint, the method further includes: judging whether a risk of sports injury arises based on at least one of the first value and the second value of the ankle joint, the third value and the fourth value of the knee joint, and the fifth value and the sixth value of the hip joint.
In one possible implementation manner, judging whether a risk of sports injury arises includes: comparing, with a first threshold, the ratio of at least one of the first value and the second value of the ankle joint, the third value and the fourth value of the knee joint, and the fifth value and the sixth value of the hip joint to a first reference value; and if the ratio is greater than the first threshold, outputting risk prompt information.
In one possible implementation manner, outputting risk prompt information includes: outputting a first prompt; or outputting a first prompt that includes a first option, receiving a second operation on the first option, and outputting a second prompt. In this way, the method can output the degree of injury risk of the human joints, and can also output a guidance scheme for actions or exercise courses that carry an injury risk, which helps the user adjust the exercise posture and reduces the possibility of injury.
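A minimal sketch of the ratio-against-threshold check described above; the joint names, data layout and prompt wording are illustrative assumptions.

```python
def assess_injury_risk(joint_values, reference_values, first_threshold=1.0):
    """Return prompts for joints whose load ratio exceeds the first threshold.

    joint_values / reference_values are dicts keyed by joint name (illustrative),
    e.g. {"knee_force": 900.0}. The ratio value / reference is compared with the
    first threshold as described above; prompt wording is illustrative.
    """
    prompts = []
    for joint, value in joint_values.items():
        reference = reference_values.get(joint)
        if reference and value / reference > first_threshold:
            prompts.append(f"{joint}: load ratio {value / reference:.2f} exceeds the "
                           "threshold, consider adjusting this movement")
    return prompts
```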
In one possible implementation, the first reference value is the human joint stress threshold.
With reference to the first aspect, in one possible implementation manner, before acquiring the first data of the target object according to the body parameters of the target object, the method further includes: performing a body measurement evaluation on the user; the body measurement evaluation includes evaluating a body state, and the body state includes an injured part and the degree of damage of the injured part. By detecting the physical state of the user, it can be learned whether the user has an injured part and how severe the injury is, so that the corresponding exercise actions can be adjusted, or the user can be prompted to reduce the load on the injured part accordingly, thereby reducing the user's risk of sports injury.
In one possible implementation manner, evaluating the physical state includes: detecting a first part placed at an injured part of the user, and detecting the time for which the first part is placed at the injured part, where the first part is a body part of the user; and determining the degree of damage according to that time. By detecting how long the body part is placed at the injured part, the injury condition can be reported conveniently without a complicated procedure.
In one possible implementation manner, the first reference value is a load-bearing reference value, and the load-bearing reference value is adjusted according to the body measurement evaluation. The load-bearing reference value can be dynamically adjusted according to the information of the body measurement evaluation, and re-adjusted when that information changes, so that it matches the user's own condition and the injury risk can be reduced accurately.
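A hedged sketch of dynamically lowering the load-bearing reference value for a joint identified as injured in the body measurement evaluation; the damage-degree scale and the scaling factor are purely illustrative assumptions.

```python
def adjust_load_reference(base_reference, joint, injured_joint, damage_degree):
    """Lower the load-bearing reference value for a joint assessed as injured.

    damage_degree: 0.0 (no damage) to 1.0 (severe), an assumed scale derived
    from the body measurement evaluation (e.g. from how long the user held the
    injured part). The 0.5 scaling factor is purely illustrative.
    """
    if joint != injured_joint:
        return base_reference
    return base_reference * (1.0 - 0.5 * damage_degree)
```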
With reference to the first aspect, in one possible implementation manner, the target object is the user or a moving image in a selected exercise course. By detecting the actual movement of the user, the user's motion can be analyzed in real time. By detecting the moving images in a selected exercise course, the motions in that course can be analyzed. Further, a risk prompt may be output for the selected exercise course to determine whether the motions in the course are suitable for the user.
In a second aspect, the present application provides a motion analysis apparatus comprising means for performing the method of the first aspect described above.
In a third aspect, the present application provides an electronic device comprising a touch screen, a memory, one or more processors, a plurality of applications, and one or more programs; wherein the one or more programs are stored in the memory; wherein the one or more processors, when executing the one or more programs, cause the electronic device to implement the method of the first aspect.
In a fourth aspect, the present application provides a computer storage medium comprising computer instructions which, when run on an electronic device, cause the electronic device to perform the method of the first aspect.
In summary, in the embodiments of this application, the ankle joint load is calculated by distinguishing different ground-off states, and the loads of the knee joint and the hip joint are then obtained by establishing a coordinate system on the lower limb, so that joint loading can be calculated by a simpler method to judge the risk of sports injury.
Drawings
In order to more clearly illustrate the technical solutions of the embodiments of the present application, the drawings that are needed in the description of the embodiments will be briefly described below.
Fig. 1 is a schematic structural diagram of an electronic device according to an embodiment of the present application;
fig. 2 is a software structural block diagram of an electronic device according to an embodiment of the present application;
FIG. 3A is a schematic diagram of a user interface for an application menu on an electronic device provided in an embodiment of the present application;
FIGS. 3B-3E are schematic diagrams of one scenario involved in the present application;
FIG. 3F is a schematic diagram of one scenario involved in the present application;
FIGS. 3G-3H are schematic diagrams of another scenario of the present application;
FIGS. 4A-4D are a set of interface schematic diagrams provided in embodiments of the present application;
FIGS. 4E-4I are a schematic diagram of another set of interfaces provided in an embodiment of the present application;
FIGS. 4J-4N are schematic diagrams illustrating another set of interfaces provided in an embodiment of the present application;
FIGS. 5A-5C are a schematic diagram of another set of interfaces provided in an embodiment of the present application;
FIG. 6 is a schematic diagram of another set of interfaces provided in an embodiment of the present application;
FIG. 7 is a schematic diagram of another set of interfaces provided in an embodiment of the present application;
FIGS. 8A-8C are a set of schematic diagrams illustrating another set of interfaces provided in an embodiment of the present application;
FIGS. 9A-9E are another set of interface schematics provided by embodiments of the present application;
FIG. 10 shows a flow chart of a method of motion analysis;
FIG. 11 is a schematic illustration of a bone node according to an embodiment of the present application;
FIG. 12 is a schematic view of a reference frame provided in an embodiment of the present application;
FIG. 13 is a flow chart of a motion analysis method according to an embodiment of the present application;
FIG. 14 is a schematic view of a center of gravity ground projection according to an embodiment of the present disclosure;
fig. 15 is a schematic diagram of a lower limb coordinate system according to an embodiment of the present disclosure;
fig. 16 is a schematic diagram of an initial state quantity of a lower limb coordinate system according to an embodiment of the present application;
fig. 17 is a flowchart of another motion analysis method according to an embodiment of the present application.
Detailed Description
The technical solutions in the embodiments of the present application will be clearly and completely described below with reference to the drawings in the embodiments of the present application.
It should be understood that the terms "comprises" and "comprising," when used in this specification and the appended claims, specify the presence of stated features, integers, steps, operations, elements, and/or components, but do not preclude the presence or addition of one or more other features, integers, steps, operations, elements, components, and/or groups thereof.
It is also to be understood that the terminology used in the description of the present application is for the purpose of describing particular embodiments only and is not intended to be limiting of the application. As used in this specification and the appended claims, the singular forms "a," "an," and "the" are intended to include the plural forms as well, unless the context clearly indicates otherwise.
It should be further understood that the term "and/or" as used in this specification and the appended claims refers to any and all possible combinations of one or more of the associated listed items, and includes such combinations.
As used in this specification and the appended claims, the term "if" may be interpreted as "when", "once", "in response to determining" or "in response to detecting", depending on the context. Similarly, the phrase "if it is determined" or "if [a described condition or event] is detected" may be interpreted, depending on the context, as meaning "upon determining", "in response to determining", "upon detecting [the described condition or event]" or "in response to detecting [the described condition or event]".
Embodiments of an electronic device, user interfaces for such an electronic device, and embodiments for using such an electronic device are described below. In some embodiments, the electronic device may be a portable electronic device that also includes other functionality such as personal digital assistant and/or music player functionality, such as a cell phone, a tablet computer, or a wearable electronic device with wireless communication functionality (e.g., a smart watch). Exemplary embodiments of portable electronic devices include, but are not limited to, portable electronic devices running iOS, Android, Microsoft or other operating systems.
The term "User Interface (UI)" in the description and claims of the present application and in the drawings is a media interface for interaction and information exchange between an application program or an operating system and a user, which enables conversion between an internal form of information and a form that the user can receive. The user interface of the application program is a source code written in a specific computer language such as java, extensible markup language (extensible markup language, XML) and the like, the interface source code is analyzed and rendered on the terminal equipment, and finally the interface source code is presented as content which can be identified by a user, such as a control of pictures, words, buttons and the like. Controls (controls), also known as parts (widgets), are basic elements of a user interface, typical controls being toolbars (toolbars), menu bars (menu bars), text boxes (text boxes), buttons (buttons), scroll bars (scrollbars), pictures and text. The properties and content of the controls in the interface are defined by labels or nodes. One node corresponds to a control or attribute in the interface, and the node is rendered into visual content for a user after being analyzed and rendered. In addition, many applications, such as the interface of a hybrid application (hybrid application), typically include web pages. A web page, also referred to as a page, is understood to be a special control embedded in an application interface, which is source code written in a specific computer language, such as hypertext markup language (hyper text markup language, HTML), cascading style sheets (cascading style sheets, CSS), java script (javascript v, JS), etc., and which can be loaded and displayed as user-recognizable content by a browser or web page display component similar to the browser function. The specific content contained in a web page is also defined by tags or nodes in the web page source code, such as HTML defines the elements and attributes of the web page by < p >, < img >, < video >, < canvas >.
A commonly used presentation form of the user interface is a graphical user interface (graphical user interface, GUI), which refers to a user interface related to computer operations that is displayed in a graphical manner. It may be an interface element such as an icon, a window, a control, etc. displayed in a display screen of the electronic device, where the control may visually include visual interface elements such as icons, buttons, menus, tabs, text boxes, dialog boxes, status boxes, navigation bars, widgets, etc.
The following embodiments of the present application provide a motion analysis method, a graphical user interface, and an electronic device, which can calculate joint force/moment, determine risk of injury of related motions, provide risk assessment and early warning of injury, and reduce possibility of generating motion injury.
Exemplary electronic devices provided in the following embodiments of the present application are described below.
Fig. 1 shows a schematic structural diagram of an electronic device.
The electronic device may include a processor 110, an external memory interface 120, an internal memory 121, a universal serial bus (universal serial bus, USB) interface 130, a charge management module 140, a power management module 141, a battery 142, an antenna 1, an antenna 2, a mobile communication module 150, a wireless communication module 160, an audio module 170, a speaker 170A, a receiver 170B, a microphone 170C, an earphone interface 170D, a sensor module 180, keys 190, a motor 191, an indicator 192, a camera 193, a display 194, and a subscriber identity module (subscriber identification module, SIM) card interface 195, etc. The sensor module 180 may include a pressure sensor 180A, a gyro sensor 180B, an air pressure sensor 180C, a magnetic sensor 180D, an acceleration sensor 180E, a distance sensor 180F, a proximity sensor 180G, a fingerprint sensor 180H, a temperature sensor 180J, a touch sensor 180K, an environment sensor 180L, a bone conduction sensor 180M, and the like.
It should be understood that the structures illustrated in the embodiments of the present application do not constitute a specific limitation on the electronic device. In other embodiments of the present application, the electronic device may include more or fewer components than shown, or certain components may be combined, or certain components may be separated, or different arrangements of components. The illustrated components may be implemented in hardware, software, or a combination of software and hardware.
The processor 110 may include one or more processing units, such as: the processor 110 may include an application processor (Application Processor, AP), a modem processor, a graphics processor (Graphics Processing unit, GPU), an image signal processor (Image Signal Processor, ISP), a controller, a memory, a video codec, a digital signal processor (Digital Signal Processor, DSP), a baseband processor, and/or a Neural network processor (Neural-network Processing Unit, NPU), etc. Wherein the different processing units may be separate devices or may be integrated in one or more processors. In some embodiments, the electronic device may also include one or more processors 110.
The controller can be a neural center and a command center of the electronic device. The controller can generate operation control signals according to the instruction operation codes and the time sequence signals to finish the control of instruction fetching and instruction execution.
A memory may also be provided in the processor 110 for storing instructions and data. In some embodiments, the memory in the processor 110 is a cache memory. The memory may hold instructions or data that the processor 110 has just used or recycled. If the processor 110 needs to reuse the instruction or data, it can be called directly from the memory. Repeated accesses are avoided and the latency of the processor 110 is reduced, thereby improving the efficiency of the electronic device.
In some embodiments, the processor 110 may include one or more interfaces. The interfaces may include an integrated circuit (inter-integrated circuit, I2C) interface, an integrated circuit built-in audio (inter-integrated circuit sound, I2S) interface, a pulse code modulation (pulse code modulation, PCM) interface, a universal asynchronous receiver transmitter (universal asynchronous receiver/transmitter, UART) interface, a mobile industry processor (mobile industry processor interface, MIPI) interface, a general-purpose input/output (GPIO) interface, a SIM interface, and/or a USB interface, among others.
The I2C interface is a bi-directional synchronous serial bus comprising a serial data line (SDA) and a serial clock line (derail clock line, SCL). In some embodiments, the processor 110 may contain multiple sets of I2C buses. The processor 110 may be coupled to the touch sensor 180K through different I2C bus interfaces, so that the processor 110 and the touch sensor 180K communicate through the I2C bus interfaces to implement a touch function of the electronic device.
I2S may be used for audio communication. In some embodiments, the processor 110 may contain multiple sets of I2S buses. The processor 110 may be coupled to the audio module 170 via an I2S bus to enable communication between the processor and the audio module 170. In some embodiments, the audio module 170 may transmit an audio signal to the wireless communication module 160 through the I2S interface, to implement a function of answering a call through the bluetooth headset.
The PCM interface may also be used for audio communication to quantize and encode analog signal samples. In some embodiments, the audio module 170 and the wireless communication module 160 may be coupled through a PCM bus interface. In some embodiments, the audio module 170 may also transmit audio signals to the wireless communication module 160 through the PCM interface to implement a function of answering a call through the bluetooth headset.
The UART interface is a universal serial data bus for asynchronous communications. The bus may be a bi-directional communication bus. It converts the data to be transmitted between serial communication and parallel communication. In some embodiments, UART interfaces are typically used to connect the processor 110 with the wireless communication module 160: the processor 110 communicates with a bluetooth module in the wireless communication module 160 through a UART interface to implement a bluetooth function. In some embodiments, the audio module 170 may transmit an audio signal to the wireless communication module 160 through a UART interface, to implement a function of playing music through a bluetooth headset.
The MIPI interface may be used to connect the processor 110 to peripheral devices such as a display 194, a camera 193, and the like. The MIPI interfaces include camera serial interfaces (camera serial interface, CSI), display serial interfaces (display serial interface, DSI), and the like. In some embodiments, processor 110 and camera 193 communicate through a CSI interface to implement the photographing function of the electronic device. The processor 110 and the display screen 194 communicate via a DSI interface to implement the display functionality of the electronic device.
The GPIO interface may be configured by software. The GPIO interface may be configured as a control signal or as a data signal. In some embodiments, the GPIO interface may be used to connect the processor 110 to the camera 193, the display 194, the wireless communication module 160, the audio module 170, the sensor module 180, and the like. The GPIO interface may also be configured as an I2C interface, an I2S interface, a UART interface, an MIPI interface, etc.
The USB interface 130 is an interface conforming to the USB standard specification, and may specifically be a Mini USB interface, a Micro USB interface, a USB Type C interface, or the like. The USB interface 130 may be used to connect a charger to charge an electronic device, or may be used to transfer data between the electronic device and a peripheral device. And can also be used for connecting with a headset, and playing audio through the headset. The interface may also be used to connect other electronic devices, such as AR devices, etc.
It should be understood that the connection relationship between the modules illustrated in the embodiments of the present application is only illustrative, and does not limit the structure of the electronic device. In other embodiments, the electronic device may also use different interfacing manners in the foregoing embodiments, or a combination of multiple interfacing manners.
The charge management module 140 is configured to receive a charging input from a charger. The charger may be a wireless charger or a wired charger. In some wired charging embodiments, the charge management module 140 may receive a charging input from the wired charger through the USB interface 130. In some wireless charging embodiments, the charge management module 140 may receive a wireless charging input through a wireless charging coil of the electronic device. The charge management module 140 may also supply power to the electronic device through the power management module 141 while charging the battery 142.
The power management module 141 is used to connect the battery 142, the charge management module 140 and the processor 110. The power management module 141 receives input from the battery 142 and/or the charge management module 140 and supplies power to the processor 110, the internal memory 121, the external memory, the display 194, the camera 193, the wireless communication module 160, and the like. The power management module 141 may also be configured to monitor parameters such as battery capacity, battery cycle count and battery health. In other embodiments, the power management module 141 may also be disposed in the processor 110. In other embodiments, the power management module 141 and the charge management module 140 may also be disposed in the same device.
The wireless communication function of the electronic device may be implemented by the antenna 1, the antenna 2, the mobile communication module 150, the wireless communication module 160, a modem processor, a baseband processor, and the like.
The antennas 1 and 2 are used for transmitting and receiving electromagnetic wave signals. Each antenna in the electronic device may be used to cover a single or multiple communication bands. Different antennas may also be multiplexed to improve the utilization of the antennas. For example: the antenna 1 may be multiplexed into a diversity antenna of a wireless local area network. In other embodiments, the antenna may be used in conjunction with a tuning switch.
The mobile communication module 150 may provide a solution for wireless communication including 2G/3G/4G/5G, etc. applied on an electronic device. The mobile communication module 150 may include at least one filter, switch, power amplifier, low noise amplifier (Low Noise Amplifier, LNA), etc. The mobile communication module 150 may receive electromagnetic waves from the antenna 1, perform processes such as filtering, amplifying, and the like on the received electromagnetic waves, and transmit the processed electromagnetic waves to the modem processor for demodulation. The mobile communication module 150 can amplify the signal modulated by the modem processor, and convert the signal into electromagnetic waves through the antenna 1 to radiate. In some embodiments, at least some of the functional modules of the mobile communication module 150 may be disposed in the processor 110. In some embodiments, at least some of the functional modules of the mobile communication module 150 may be provided in the same device as at least some of the modules in the processor 110.
The modem processor may include a modulator and a demodulator. The modulator is used for modulating the low-frequency baseband signal to be transmitted into a medium-high frequency signal. The demodulator is used for demodulating the received electromagnetic wave signal into a low-frequency baseband signal. The demodulator then transmits the demodulated low frequency baseband signal to the baseband processor for processing. The low frequency baseband signal is processed by the baseband processor and then transferred to the application processor. The application processor outputs sound signals through an audio device (not limited to the speaker 170A, the receiver 170B, etc.), or displays images or video through the display screen 194. In some embodiments, the modem processor may be a stand-alone device. In other embodiments, the modem processor may be provided in the same device as the mobile communication module 150 or other functional module, independent of the processor 110.
The wireless communication module 160 may provide solutions for wireless communication including wireless local area network (wireless local area networks, WLAN) (e.g., wireless fidelity (wireless fidelity, wi-Fi) network), bluetooth (BT), global navigation satellite system (global navigation satellite system, GNSS), frequency modulation (frequency modulation, FM), near field wireless communication technology (near field communication, NFC), infrared technology (IR), etc. for application on an electronic device. The wireless communication module 160 may be one or more devices that integrate at least one communication processing module. The wireless communication module 160 receives electromagnetic waves via the antenna 2, modulates the electromagnetic wave signals, filters the electromagnetic wave signals, and transmits the processed signals to the processor 110. The wireless communication module 160 may also receive a signal to be transmitted from the processor 110, frequency modulate it, amplify it, and convert it to electromagnetic waves for radiation via the antenna 2.
The electronic device implements display functions via a GPU, a display screen 194, an application processor, and the like. The GPU is a microprocessor for image processing, and is connected to the display 194 and the application processor. The GPU is used to perform mathematical and geometric calculations for graphics rendering. Processor 110 may include one or more GPUs that execute program instructions to generate or change display information.
The display screen 194 is used to display images, videos, and the like. The display 194 includes a display panel. The display panel may employ a liquid crystal display (liquid crystal display, LCD), an organic light-emitting diode (OLED), an active-matrix organic light emitting diode (AMOLED), a flexible light-emitting diode (flex), a Mini LED, a Micro-OLED, a quantum dot light-emitting diode (quantum dot light emitting diodes, QLED), or the like. In some embodiments, the electronic device may include 1 or N display screens 194, N being a positive integer greater than 1.
In some embodiments of the present application, the display screen 194 displays moving images of the user.
The electronic device may implement photographing functions through the ISP, the camera 193, the video codec, the GPU, the display screen 194, the application processor, and the like.
The ISP is used to process data fed back by the camera 193. For example, when photographing, the shutter is opened, light is transmitted to the camera photosensitive element through the lens, the optical signal is converted into an electric signal, and the camera photosensitive element transmits the electric signal to the ISP for processing and is converted into an image or video visible to naked eyes. ISP can also optimize the noise, brightness and skin color of the image. The ISP can also optimize parameters such as exposure, color temperature and the like of a shooting scene. In some embodiments, the ISP may be provided in the camera 193.
The camera 193 is used to capture still images or video. The object generates an optical image through the lens and projects the optical image onto the photosensitive element. The photosensitive element may be a charge coupled device (charge coupled device, CCD) or a Complementary Metal Oxide Semiconductor (CMOS) phototransistor. The photosensitive element converts the optical signal into an electrical signal, which is then transferred to an ISP to be converted into a digital image or video signal. The ISP outputs digital image or video signals to the DSP for processing. The DSP converts digital image or video signals into standard RGB, YUV, etc. format image or video signals. In some embodiments, the electronic device may include 1 or N cameras 193, N being a positive integer greater than 1.
In some embodiments of the present application, two-dimensional position information of the user's foot, knee, hip, wrist, elbow, head and neck, etc. is detected in real time by the camera 193.
The digital signal processor is used to process digital signals, and may process other digital signals in addition to digital image or video signals. For example, when the electronic device selects a frequency bin, the digital signal processor is used to perform a Fourier transform on the frequency bin energy, and so on.
Video codecs are used to compress or decompress digital video. The electronic device may support one or more video codecs. In this way, the electronic device may play or record video in a variety of encoding formats, such as Moving Picture Experts Group (MPEG)-1, MPEG-2, MPEG-3, MPEG-4, etc.
The NPU is a Neural-Network (NN) computing processor, and can rapidly process input information by referencing a biological Neural Network structure, for example, referencing a transmission mode between human brain neurons, and can also continuously perform self-learning. Applications such as intelligent cognition of electronic devices can be realized through the NPU, for example: image recognition, face recognition, speech recognition, text understanding, etc.
The external memory interface 120 may be used to connect an external memory card, such as a Micro SD card, to enable expansion of the memory capabilities of the electronic device. The external memory card communicates with the processor 110 through an external memory interface 120 to implement data storage functions. For example, files such as music, video, etc. are stored in an external memory card.
The internal memory 121 may be used to store computer-executable program code, which includes instructions. The processor 110 executes the various functional applications and data processing of the electronic device by running the instructions stored in the internal memory 121. The internal memory 121 may include a program storage area and a data storage area. The program storage area may store the operating system and application programs required by at least one function (such as a sound playing function and an image/video playing function). The data storage area may store data created during use of the electronic device (such as audio data and a phonebook), and so on. In addition, the internal memory 121 may include a high-speed random access memory, and may also include a nonvolatile memory, such as at least one magnetic disk storage device, a flash memory device, or a universal flash storage (universal flash storage, UFS).
The electronic device may implement audio functions through an audio module 170, a speaker 170A, a receiver 170B, a microphone 170C, an earphone interface 170D, an application processor, and the like. Such as music playing, recording, etc.
The audio module 170 is used to convert digital audio information into an analog audio signal output and also to convert an analog audio input into a digital audio signal.
The speaker 170A, also referred to as a "horn", is used to convert audio electrical signals into sound signals.
The receiver 170B, also referred to as an "earpiece", is used to convert audio electrical signals into sound signals. When answering a call or listening to a voice message on the electronic device, the voice can be heard by placing the receiver 170B close to the human ear.
The microphone 170C, also referred to as a "mic", is used to convert sound signals into electrical signals. When making a call or sending voice information, the user can speak with the mouth close to the microphone 170C to input a sound signal into the microphone 170C. The electronic device may be provided with at least one microphone 170C.
The earphone interface 170D is used to connect a wired earphone.
The sensor module 180 may include 1 or more sensors, which may be of the same type or different types. It is to be understood that the sensor module 180 shown in fig. 1 is only an exemplary division, and other divisions are possible, which the present application is not limited to.
The pressure sensor 180A is used to sense a pressure signal and may convert the pressure signal into an electrical signal. In some embodiments, the pressure sensor 180A may be disposed on the display screen 194. When a touch operation acts on the display screen 194, the electronic device detects the intensity of the touch operation by means of the pressure sensor 180A. The electronic device may also calculate the touch position based on the detection signal of the pressure sensor 180A. In some embodiments, touch operations that act on the same touch position but with different touch operation intensities may correspond to different operation instructions. For example, when a touch operation whose intensity is less than a first pressure threshold acts on the short message application icon, an instruction to view the short message is executed; when a touch operation whose intensity is greater than or equal to the first pressure threshold acts on the short message application icon, an instruction to create a new short message is executed.
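As a minimal sketch of the pressure-threshold branching described above (the threshold value and instruction names are placeholders, not values used by this application):

```python
# Illustrative sketch: mapping touch pressure on the short message icon to an
# operation instruction. The threshold value is a placeholder.
FIRST_PRESSURE_THRESHOLD = 0.5  # assumed normalized pressure value

def instruction_for_touch(pressure: float) -> str:
    """Return the operation instruction for a touch on the short message icon."""
    if pressure < FIRST_PRESSURE_THRESHOLD:
        return "view_message"   # light press: view the short message
    return "new_message"        # firm press: create a new short message
```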
The gyro sensor 180B may be used to determine a motion gesture of the electronic device. In some embodiments, the angular velocity of the electronic device about three axes (i.e., x, y, and z axes) may be determined by the gyro sensor 180B. The gyro sensor 180B may be used for photographing anti-shake.
The air pressure sensor 180C is used to measure air pressure. In some embodiments, the electronic device calculates the altitude from the barometric pressure value measured by the air pressure sensor 180C, to assist in positioning and navigation.
The magnetic sensor 180D includes a hall sensor. The electronic device may detect the opening and closing of the flip holster using the magnetic sensor 180D.
The acceleration sensor 180E may detect the magnitude of acceleration of the electronic device in various directions (typically along three axes). The magnitude and direction of gravity can be detected when the electronic device is stationary. It may also be used to recognize the attitude of the electronic device, and is applied to landscape/portrait switching, pedometers, and other applications.
The distance sensor 180F is used to measure distance. The electronic device may measure distance by infrared or laser. In some embodiments, in a shooting scenario, the electronic device may use the distance sensor 180F to measure distance to achieve quick focusing.
The proximity light sensor 180G may include, for example, a light-emitting diode (LED) and a light detector, such as a photodiode. The light-emitting diode may be an infrared light-emitting diode. The electronic device emits infrared light outwards through the light-emitting diode and uses the photodiode to detect infrared light reflected from nearby objects. When sufficient reflected light is detected, the electronic device may determine that there is an object near it; when insufficient reflected light is detected, the electronic device may determine that there is no object near it. The electronic device may use the proximity light sensor 180G to detect that the user is holding the electronic device close to the ear to talk, so as to automatically turn off the screen for power saving. The proximity light sensor 180G may also be used in holster mode and pocket mode to automatically unlock and lock the screen.
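A minimal sketch of the proximity decision described above, assuming a single reflected-light threshold (the threshold and function names are placeholders):

```python
# Illustrative sketch: proximity decision from reflected infrared intensity,
# used to blank the screen during a call. The threshold is a placeholder.
REFLECTION_THRESHOLD = 100.0  # assumed sensor units

def object_nearby(reflected_intensity: float) -> bool:
    """True when enough reflected IR light is detected, i.e. an object is near."""
    return reflected_intensity >= REFLECTION_THRESHOLD

def should_blank_screen(in_call: bool, reflected_intensity: float) -> bool:
    """Blank the screen when the device is held to the ear during a call."""
    return in_call and object_nearby(reflected_intensity)
```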
The ambient light sensor 180L is used to sense ambient light level. The electronic device can adaptively adjust the brightness of the display 194 based on the perceived ambient light level.
The fingerprint sensor 180H is used to acquire a fingerprint. The electronic equipment can utilize the acquired fingerprint characteristics to realize fingerprint unlocking, access to an application lock, fingerprint photographing, fingerprint incoming call answering and the like.
The temperature sensor 180J is for detecting temperature. In some embodiments, the electronic device performs a temperature processing strategy using the temperature detected by temperature sensor 180J.
The touch sensor 180K is also referred to as a "touch panel" or "touch-sensitive surface". The touch sensor 180K may be disposed on the display screen 194, and the touch sensor 180K and the display screen 194 form a touchscreen, also called a "touch screen". The touch sensor 180K is used to detect a touch operation acting on or near it. The touch sensor may pass the detected touch operation to the application processor to determine the type of touch event. Visual output related to the touch operation may be provided through the display screen 194. In other embodiments, the touch sensor 180K may also be disposed on the surface of the electronic device at a position different from that of the display screen 194.
The bone conduction sensor 180M may acquire a vibration signal. In some embodiments, the bone conduction sensor 180M may acquire a vibration signal of the vibrating bone mass of the human vocal part. The bone conduction sensor 180M may also be placed against the human pulse to receive a blood pressure pulsation signal. In some embodiments, the bone conduction sensor 180M may also be provided in a headset to form a bone conduction headset. The audio module 170 may parse out a voice signal based on the vibration signal of the vibrating bone mass of the vocal part acquired by the bone conduction sensor 180M, so as to implement a voice function. The application processor may parse out heart rate information based on the blood pressure pulsation signal acquired by the bone conduction sensor 180M, so as to implement a heart rate detection function.
The keys 190 include a power-on key, a volume key, etc. The keys 190 may be mechanical keys. Or may be a touch key. The electronic device may receive key inputs, generating key signal inputs related to user settings and function controls of the electronic device.
The motor 191 may generate a vibration alert. The motor 191 may be used for incoming call vibration alerts as well as for touch vibration feedback. For example, touch operations acting on different applications (such as photographing and audio playing) may correspond to different vibration feedback effects. Touch operations acting on different areas of the display screen 194 may also correspond to different vibration feedback effects. Different application scenarios (such as time reminders, receiving information, alarm clocks, and games) may also correspond to different vibration feedback effects. The touch vibration feedback effect may also support customization.
The indicator 192 may be an indicator light and may be used to indicate the charging state and changes in battery level, and may also be used to indicate messages, missed calls, notifications, and the like.
The SIM card interface 195 is used to connect a SIM card. The SIM card may be inserted into the SIM card interface 195 or removed from the SIM card interface 195 to achieve contact with or separation from the electronic device. The electronic device may support 1 or N SIM card interfaces, N being a positive integer greater than 1. The electronic device interacts with the network through the SIM card to implement functions such as calls and data communication. In some embodiments, the electronic device employs an eSIM, that is, an embedded SIM card. The eSIM card may be embedded in the electronic device and cannot be separated from the electronic device.
The electronic device illustrated in the example of fig. 1 may display various user interfaces described in various embodiments via the display 194. The electronic device may detect a touch operation in each user interface through the touch sensor 180K, such as a click operation (e.g., a touch operation on an icon, a double click operation) in each user interface, a slide operation up or down in each user interface, or an operation to perform a circled gesture, and so on. In some embodiments, the electronic device may detect a motion gesture performed by the user holding the electronic device, such as shaking the electronic device, through the gyroscope sensor 180B, the acceleration sensor 180E, and the like. In some embodiments, the electronic device may detect a non-touch gesture operation through the camera 193.
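Purely as an illustration of how a held-device motion gesture such as shaking might be recognized from accelerometer readings (the sampling window, threshold, and peak count are assumptions, not parameters of this application):

```python
# Illustrative sketch: detecting a "shake" motion gesture from accelerometer
# magnitudes within one sampling window.
import math

SHAKE_THRESHOLD = 25.0   # assumed acceleration magnitude, m/s^2
MIN_PEAKS = 3            # assumed number of strong peaks within the window

def is_shake(samples) -> bool:
    """samples: iterable of (ax, ay, az) accelerometer readings in one window."""
    peaks = sum(
        1 for ax, ay, az in samples
        if math.sqrt(ax * ax + ay * ay + az * az) > SHAKE_THRESHOLD
    )
    return peaks >= MIN_PEAKS
```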
In some embodiments of the present application, the electronic device may capture moving images of the user through the camera 193.
The software system of the electronic device may employ a layered architecture, an event driven architecture, a microkernel architecture, a microservice architecture, or a cloud architecture. In the embodiment of the application, taking an Android system with a layered architecture as an example, a software structure of an electronic device is illustrated.
Fig. 2 is a software configuration block diagram of an electronic device according to an embodiment of the present application.
The layered architecture divides the software into several layers, and each layer has a clear role and division of labor. The layers communicate with each other through software interfaces. In some embodiments, the system is divided into four layers, which are, from top to bottom, the application layer, the application framework layer, the Android runtime and system libraries, and the kernel layer.
The application layer may include a series of application packages.
As shown in fig. 2, the application packages may include applications (also referred to as apps) such as camera, gallery, calendar, phone, maps, navigation, WLAN, Bluetooth, music, video, and short messages.
The application framework layer provides an application programming interface (application programming interface, API) and programming framework for application programs of the application layer. The application framework layer includes a number of predefined functions.
As shown in FIG. 2, the application framework layer may include a window manager, a content provider, a view system, a telephony manager, a resource manager, a notification manager, and the like.
The window manager is used for managing window programs. The window manager can acquire the size of the display screen, judge whether a status bar exists, lock the screen, intercept the screen and the like.
The content provider is used to store and retrieve data and make such data accessible to applications. The data may include video, images, audio, calls made and received, browsing history and bookmarks, phonebooks, etc.
The view system includes visual controls, such as controls to display text, controls to display pictures, and the like. The view system may be used to build applications. The display interface may be composed of one or more views. For example, a display interface including a text message notification icon may include a view displaying text and a view displaying a picture.
The telephony manager is used to provide the communication functions of the electronic device, for example, management of the call state (connected, hung up, and the like).
The resource manager provides various resources for the application program, such as localization strings, icons, pictures, layout files, video files, and the like.
The notification manager enables an application to display notification information in the status bar. It can be used to convey notification-type messages, which may automatically disappear after a short stay without user interaction. For example, the notification manager is used to notify that a download is complete, to give message reminders, and so on. The notification manager may also present notifications in the top system status bar in the form of a chart or scroll-bar text, such as notifications of applications running in the background, or present notifications on the screen in the form of a dialog interface. For example, text information is prompted in the status bar, a prompt tone is emitted, the electronic device vibrates, or an indicator light blinks.
The Android runtime includes a core library and a virtual machine, and is responsible for the scheduling and management of the Android system.
The core library consists of two parts: one part is the functions that the programming language (for example, the Java language) needs to call, and the other part is the core library of Android.
The application layer and the application framework layer run in a virtual machine. The virtual machine executes the programming files (e.g., java files) of the application layer and the application framework layer as binary files. The virtual machine is used for executing the functions of object life cycle management, stack management, thread management, security and exception management, garbage collection and the like.
The system library may include a plurality of functional modules. For example: surface manager (surface manager), media library (media library), three-dimensional graphics processing library (e.g., openGL ES), two-dimensional graphics engine (e.g., SGL), etc.
The surface manager is used to manage the display subsystem and provides a fusion of two-Dimensional (2D) and three-Dimensional (3D) layers for multiple applications.
The media library supports playback and recording of a variety of commonly used audio and video formats, as well as still image files and the like. The media library may support a variety of audio and video encoding formats, such as MPEG4, H.264, MP3, AAC, AMR, JPG, and PNG.
The three-dimensional graphic processing library is used for realizing 3D graphic drawing, image rendering, synthesis, layer processing and the like.
The 2D graphics engine is a drawing engine for 2D drawing.
The kernel layer is a layer between hardware and software. The kernel layer at least comprises a display driver, a camera driver, an audio driver, a sensor driver and a virtual card driver.
The workflow of the electronic device software and hardware is illustrated below in connection with capturing a photo scene.
When the touch sensor 180K receives a touch operation, a corresponding hardware interrupt is sent to the kernel layer. The kernel layer processes the touch operation into a raw input event (including information such as the touch coordinates and the timestamp of the touch operation). The raw input event is stored at the kernel layer. The application framework layer obtains the raw input event from the kernel layer and identifies the control corresponding to the input event. Taking the touch operation being a click operation and the control corresponding to the click operation being the camera application icon as an example, the camera application calls an interface of the application framework layer to start the camera application, then starts the camera driver by calling the kernel layer, and captures a still image or video through the camera 193.
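The event flow described above can be summarized, for illustration only, by the following Python sketch; the event fields, the hit-test placeholder, and the returned action names are assumptions introduced for this example:

```python
# Illustrative sketch of the event flow: a raw input event produced by the
# kernel layer is mapped to a control, and launching the camera is triggered
# when the camera application icon is hit. All names are placeholders.
from dataclasses import dataclass

@dataclass
class RawInputEvent:
    x: float
    y: float
    timestamp_ms: int

def control_at(event: RawInputEvent) -> str:
    """Placeholder for the framework-layer hit test mapping coordinates to a control."""
    return "camera_app_icon"

def handle_touch(event: RawInputEvent) -> str:
    control = control_at(event)
    if control == "camera_app_icon":
        # The camera application would then call the framework interface,
        # which in turn starts the camera driver in the kernel layer.
        return "start_camera_application"
    return "ignore"
```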
The term "user interface" in the present specification and claims and in the drawings is a media interface for interaction and exchange of information between an application program or operating system and a user, which enables conversion between an internal form of information and a form that the user can receive. A commonly used presentation form of the user interface is a graphical user interface (graphic user interface, GUI), which refers to a user interface related to computer operations that is displayed in a graphical manner. It may be an interface element such as an icon, a window, a control, etc. displayed in a display screen of the electronic device, where the control may include a visual interface element such as an icon, a button, a menu, a tab, a text box, a dialog box, a status bar, a navigation bar, a Widget, etc.
An exemplary user interface for an application menu on electronic device 100 is described below.
Fig. 3A illustrates an exemplary user interface 31 on an electronic device for exposing an application installed by the electronic device.
The user interface 31 displays an interface on which application icons are placed. The interface may include a plurality of application icons, such as a clock application icon 309, a calendar application icon 311, a gallery application icon 313, a memo application icon 315, a file management application icon 317, an email application icon 319, a music application icon 321, a wallet application icon 323, a Huawei Video application icon 325, a sports health application icon 327, a weather application icon 329, a browser application icon 331, a smart life application icon 333, a settings application icon 335, a recorder application icon 337, an app market application icon 339, and the like. Above the application icons may be displayed one or more signal strength indicators 303 for the mobile communication signal, a time indicator 305, a battery status indicator 307, and the like. An interface indicator 349 may also be displayed below the application icons to indicate the positional relationship between the currently displayed interface and other interfaces. Below the interface indicator are a plurality of tray icons, such as a phone application icon 345, an information application icon 347, an address book application icon 343, and a camera application icon 341. The tray icons remain displayed when the interface is switched. The content displayed on the user interface 31 is not limited in this embodiment.
It will be appreciated that fig. 3A illustrates a user interface on an electronic device by way of example only and should not be construed as limiting embodiments of the present application.
In the following embodiments of the present application, the "sports health" application and the "camera" application of the electronic device may provide a "motion detection" function. The "motion detection" function may be used to detect the motion posture of the user during exercise, calculate the joint forces/moments of the relevant joints of the user during exercise, and learn the injury risk of the user during exercise.
"Sports health" and "camera" are applications installed on the electronic device; their names are not limited in this application.
The method for motion analysis provided by the embodiment of the application can be applied to various scenes, including but not limited to:
(1) Scenes for motion detection in motion-like applications
Scene one:
as shown in fig. 3B, the electronic device may detect a user operation 200 (e.g., a click operation on icon 327) on the "sports health" icon 327, in response to which the user interface 32 shown in fig. 3C by way of example may be displayed. The user interface 32 may be a main user interface of the "sports health" application, which may include a sports pattern list 351, a navigation bar 352, a search bar 353, a control 354, a control 355, a control 356, a control 357, a control 358.
The sports mode list 351 may display one or more sports mode options. The one or more sports mode options may include: an indoor running option, a fitness option, a yoga option, a walking option, a riding option, and a rope skipping option. The one or more sports mode options may appear as text on the interface. Without limitation, the one or more sports mode options may also appear as icons or other forms of interactive elements (interactive element, IE) on the interface.
Among other things, controls 354, 356 may be used to monitor user operations that trigger the opening of an "athletic lesson". The electronic device 100 can detect a user operation on the control 354 (e.g., a click operation on the control 354) in response to which the electronic device 100 can display the user interface 33 shown in fig. 3D. The user interface 33 may include controls 360, 361. Control 360 may be used to monitor user operations that trigger the "start training" to be turned on. The electronic device can detect a user operation on the control 360 (e.g., a click operation 202 on the control 360), in response to which the electronic device 100 can display the user interface 34 as shown in fig. 3E.
The user interface 34 may include controls 362, 363. Control 362 may be used to monitor user operations (e.g., click operation 203 on control 362) that the user triggers selection of the motion gesture detection functionality control. Control 363 may be used to monitor user operations (e.g., clicking operations on control 363) that the user triggers a non-selection motion gesture detection functionality control.
Scene II:
as shown in fig. 3B, the electronic device may detect a user operation 200 (e.g., a click operation on icon 327) on an icon 327 of "sports health," in response to which the user interface 32 shown in fig. 3C by way of example may be displayed. The user interface 32 may be a user interface of a "sports health" application that may include a sports pattern list 351, a navigation bar 352, a search bar 353, a control 354, a control 355, a control 356, a control 357, a control 358.
Among other things, control 358 may be used to monitor user operations that trigger the "analog detect" to be turned on. As shown in fig. 3F, the electronic device 100 may detect a user operation on the control 358 (e.g., a click operation 204 on the control 358), in response to which the electronic device 100 triggers initiation of the simulated motion detection function.
In another case, as shown in fig. 3D, a control 361 may be included on the user interface 33. The control 361 may be used to monitor the user operation that triggers turning on "simulated detection". The electronic device can detect a user operation on the control 361 (e.g., a click operation on the control 361), in response to which the user interface 32 shown in the example of FIG. 3C can be displayed. The user interface 32 may be a user interface of the "sports health" application, and the electronic device 100 triggers the initiation of the simulated motion detection function.
Scene III:
as shown in fig. 3G, the electronic device may detect a user operation 205 (e.g., a click operation on the icon 341) on the icon 341 of "camera" in response to which the user interface 35 exemplarily shown in fig. 3H may be displayed. The user interface 35 may be a user interface of a "camera" application that may include a camera settings list 364, a shooting mode list 365, a motion detection option 366, a control 367, a control 368, and an area 370. Wherein:
the camera settings list 364 may be used to display one or more camera settings options for the user to adjust camera setting parameters. The shooting mode list 365 may display one or more shooting mode options, which may include: an aperture option, a night scene option, a portrait option, a photo option, a video option, a professional option, and a motion detection option. The control 367 is used to monitor the user operation that triggers opening the "gallery". The control 368 is used to monitor the user operation that triggers taking a photo. The control 369 is used to monitor the user operation of switching the camera. The area 370 may be used to display the image captured by the camera.
The electronic device 100 may detect a user operation on the motion detection option (e.g., a click operation 206 at the motion detection option 366), in response to which the electronic device 100 may trigger a motion detection function.
It can be appreciated that the above scenario is merely an example, and the method for motion analysis provided in the embodiments of the present application may also be applied to other scenarios, which are not limited herein.
Based on the above scenario, some embodiments of a User Interface (UI) implemented on the electronic device 100 are described below.
Fig. 4A-4D illustrate user interfaces of the face recognition module in the "sports health" application. The electronic device 100 may detect a user operation acting on the control 362 of the user interface 34, in response to which the electronic device may display the user interface 40 exemplarily shown in fig. 4A. The user interface 40 may be used to prompt the user that face recognition is about to start, for example by displaying the text "face recognition about to start" 401. The prompt time may be 5 s; when the prompt time ends, the electronic device 100 may display the user interface 41 and begin face recognition. As shown in fig. 4B, the user interface 41 exemplarily shows a face recognition interface for collecting face information. After the electronic device 100 collects the face information through the camera, it may perform some necessary processing and match the processed face information with a stored face information template, so as to invoke the body assessment information of the user based on the face information template. The face information template may be entered by the user before the electronic device 100 performs face recognition.
If the collected face information fails to match the stored face information template, the electronic device 100 may display the user interface 42 as shown in fig. 4C. The user interface 42 may display a prompt 403 for prompting the user that he or she is a new user, that is, the stored face information templates do not contain the face information of the user, the body assessment information cannot be invoked, and the user needs to perform a body measurement evaluation to obtain the body assessment information of the user. The prompt time of the prompt 403 may be 5 s; after the prompt ends, the electronic device may display the user interface provided by the body measurement evaluation function exemplarily shown in fig. 4E to 4I.
If the acquired face information is successfully matched with the stored face information template, the electronic device 100 may detect the last login time of the user, and if the last login time of the user does not exceed the preset time period, the electronic device 100 may acquire the identity evaluation information of the user.
If the collected face information is successfully matched with the stored face information template, the electronic device 100 may detect the last login time of the user; if the last login time of the user exceeds a preset period of time, the electronic device may display the user interface 43. The user interface 43 may include a prompt 404, a control 405, and a control 406. The prompt 404 is used to recommend that a user whose last login time exceeds the preset period re-perform the body measurement assessment. The control 405 is used to monitor the user operation that triggers a re-measurement evaluation. The electronic device can detect a user operation on the control 405 (e.g., a click operation on the control 405), in response to which the electronic device can display the user interface provided by the body measurement evaluation function as exemplarily shown in fig. 4E-4I. The control 406 may be used to monitor the user operation that triggers the use of legacy data. The legacy data indicates the body assessment information that the user previously stored in the electronic device 100.
Fig. 4E-4I illustrate user interfaces provided by the body measurement evaluation function. The body measurement assessment may include both a body parameter assessment and a body state assessment.
Fig. 4E-4I illustrate related user interfaces for body parameter assessment.
As shown in fig. 4E, electronic device 100 can display control 407 and control 408 on user interface 44.
The control 407 may monitor the user operation that triggers device detection. The electronic device 100 may detect a user operation on the control 407 (e.g., a click operation on the control 407), in response to which the electronic device 100 may turn on the camera and display the user interface 45. As shown in fig. 4F, the user interface 45 may include a user image 409 detected by the camera and a prompt box 410. The prompt box 410 may display the height information of the user. In a specific implementation, the electronic device 100 may turn on the camera, detect the height of the user, and display the height information of the user in the prompt box 410. The electronic device 100 may detect the height of the user as follows: the electronic device 100 is aimed at the user to be measured, the position aimed at the feet is tapped to create a measuring point, and the electronic device is then moved upwards to the position of the user's head to measure the height information. After the electronic device 100 has detected the height information, the user interface 46 may be displayed. As shown in fig. 4G, the user interface 46 may include a prompt box 411, a control 412, and a control 413. The prompt box 411 may be used to ask whether the user wants to connect a body fat scale to obtain the user's weight and body mass index (BMI) information. The control 412 may monitor the user operation that triggers connecting the body fat scale. The electronic device 100 may detect a user operation on the control 412 (e.g., a click operation on the control 412), in response to which the electronic device 100 may connect to the body fat scale and display the user interface 47. As shown in fig. 4H, the user interface 47 may display the weight and body fat rate information obtained by the body fat scale. After the electronic device 100 obtains the height, weight, and body fat rate information of the user, a user interface for physical state assessment as shown in fig. 4J-4N may be displayed.
Control 408 may monitor for user operations that trigger the user to enter relevant data. The electronic device 100 can detect a user operation on the control 408 (e.g., a click operation on the control 408) in response to which the electronic device 100 can display the user interface 48. The user interface 48 may include an input box 416 that may be used to receive information of height, weight, and body fat rate entered by a user. The electronic device 100 may detect a user operation acting on the input box 416 (e.g., an input operation on the input box 416), and in response thereto may display a user interface for the physical state assessment as shown in fig. 4J-4N.
Fig. 4J-4N illustrate user interfaces for physical state assessment.
As shown in fig. 4J, the user interface 49 may include: prompt box 417, control 418, control 419. Prompt box 417 may be used to prompt an impending physical state assessment.
The control 418 may monitor the user operation that triggers the device to detect the physical state. The electronic device 100 can detect a user operation on the control 418 (e.g., a click operation on the control 418), in response to which the electronic device 100 can display the user interface 50. As shown in fig. 4K, the user interface 50 may include a prompt box 420. The prompt box 420 may be used to prompt the user about what needs attention when assessing the physical state; the information may be the prompt "the camera is about to be turned on; please place your hand at the injured part, and the severity will be reflected according to the hand placement time". The prompt time of the prompt box 420 may be 5 s; when the prompt ends, the electronic device may turn on the camera and display the user interface 51. As shown in fig. 4L, the user interface 51 may include an area 421 and a display frame 422. The area 421 may display the image of the user acquired by the electronic device 100 through the camera, and may display either the overall image of the user or an image of the user's lower limbs. The display frame 422 may be used to display the degree of injury reflected by the time for which the user places the hand at the injury site. The display frame 422 may include damage degree boxes corresponding to placement times of <2 s, 2-4 s, 4-6 s, 6-8 s, and >8 s; as the placement time increases from short to long, the corresponding boxes may be displayed in blue, green, yellow, orange, and red, respectively. For example, when the electronic device 100 detects that the user's hand has been placed for 5 s, the damage degree box corresponding to 4-6 s will turn red. After the electronic device 100 detects that the color display of the damage degree box is completed, the user interface 52 may be displayed. As shown in fig. 4M, the user interface 52 may include a prompt 423. The prompt box 423 may be used to display the force exerted by the user's injured part as a percentage of the standard force. After the electronic device 100 obtains the physical state evaluation information of the user, a user interface for detecting skeletal nodes as shown in fig. 5A-5C may be displayed.

The control 419 may monitor the user operation that triggers manual entry of the physical state. The electronic device 100 can detect a user operation on the control 419 (e.g., a click operation on the control 419), in response to which the electronic device 100 can display the user interface 53. The user interface 53 may include input boxes 424 and 425. The input box 424 may be used to receive the name of the injury site entered by the user. The input box 425 may be used to receive the degree of injury of the user's injury site, which may be categorized into grades 1-5, where a larger number indicates a more severe injury. Grades 1-5 correspond to damage degree boxes that may be displayed in blue, green, yellow, orange, and red, respectively. The electronic device 100 may detect a user operation acting on a grade 1-5 damage degree box on the input box 425 (e.g., a click operation on any of grades 1-5 on the input box 425), and the corresponding damage degree box displays the corresponding color. The electronic device 100 may display the user interface 52 upon detecting that the user operation on the user interface 53 is completed. The user interface 52 may include a prompt 423. The prompt box 423 may be used to display the force exerted by the user's injured part as a percentage of the standard force.
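For illustration only, the mapping from hand placement time to damage degree grade and display color described above could be expressed as follows (the grade numbering mirrors the grades 1-5 used for manual entry; this is a sketch, not the claimed implementation):

```python
# Illustrative sketch: mapping the hand-placement time at the injury site to a
# damage degree grade and a display color, following the time ranges described above.
def damage_grade(placement_seconds: float):
    """Return (grade, color) for the given placement time in seconds."""
    if placement_seconds < 2:
        return 1, "blue"
    if placement_seconds < 4:
        return 2, "green"
    if placement_seconds < 6:
        return 3, "yellow"
    if placement_seconds < 8:
        return 4, "orange"
    return 5, "red"

# Example: a placement time of 5 s falls in the 4-6 s range (grade 3).
assert damage_grade(5.0)[0] == 3
```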
After the electronic device 100 obtains the physical state evaluation information of the user, a user interface for detecting skeletal nodes as shown in fig. 5A-5C may be displayed.
Fig. 5A-5C illustrate user interfaces for detecting skeletal nodes.
As shown in fig. 5A, the electronic device turns on the camera and displays the user interface 54, which may include an area 501 and a prompt box 502. The area 501 may be used to display the user image captured by the camera in real time, and the electronic device 100 may refresh the displayed content in real time, so that the electronic device 100 detects the positions of the user's skeletal nodes from the captured user image. The prompt box 502 may display the status of skeletal node detection, which may be the text "detecting skeletal nodes…". After the electronic device 100 completes detecting the skeletal nodes, a user interface for spatial pose calibration, as shown in fig. 6, may be displayed.
In some embodiments, when the electronic device 100 acquires the user image in real time, as shown in fig. 5B and 5C, if the electronic device 100 detects that the user is not in an upright state and there is a posture abnormality, such as a bent left leg, a prompt message 504 may be output in the prompt box 502. The prompt message 504 may be the text "posture abnormal, please keep upright", and may be used to prompt the user to adjust the posture and keep an upright state. After the electronic device 100 detects that the user has adjusted to an upright position, the user interface 54 may be displayed.
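A minimal sketch of one possible uprightness check on the detected skeletal nodes, for example flagging a bent leg by the hip-knee-ankle angle; the keypoint names and the angle threshold are assumptions introduced for illustration:

```python
# Illustrative sketch: a simple uprightness check on detected 2D skeletal nodes.
import math

def joint_angle(a, b, c) -> float:
    """Angle at point b (degrees) formed by segments b->a and b->c."""
    v1 = (a[0] - b[0], a[1] - b[1])
    v2 = (c[0] - b[0], c[1] - b[1])
    dot = v1[0] * v2[0] + v1[1] * v2[1]
    n1 = math.hypot(*v1)
    n2 = math.hypot(*v2)
    if n1 == 0 or n2 == 0:
        return 180.0
    cos_a = max(-1.0, min(1.0, dot / (n1 * n2)))
    return math.degrees(math.acos(cos_a))

def is_upright(keypoints, straight_threshold_deg: float = 165.0) -> bool:
    """keypoints: dict of joint name -> (x, y). Checks that both legs are nearly straight."""
    for side in ("left", "right"):
        angle = joint_angle(
            keypoints[f"{side}_hip"],
            keypoints[f"{side}_knee"],
            keypoints[f"{side}_ankle"],
        )
        if angle < straight_threshold_deg:
            return False  # e.g. a bent left leg triggers the "posture abnormal" prompt
    return True
```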
Fig. 6 illustrates a user interface for spatial pose calibration.
As shown in fig. 6, the user interface 60 may include a prompt 601 and a prompt 602.
The prompt 601 may be used to indicate that the user interface 60 is an interface for spatial pose calibration; the prompt 601 may be the text "spatial pose calibration in progress…".
The prompt box 602 may be used to output a time countdown, which may be the numbers 3, 2, 1 changing in sequence, to prompt the user about the time required for the spatial pose calibration.
When the electronic device 100 detects that the spatial pose calibration is completed, a user interface for setting the ground reference value as shown in fig. 7 may be displayed.
Fig. 7 illustrates a user interface for setting the ground-off/touchdown reference value.
As shown in fig. 7, the user interface 70 may include a prompt box 701, a control 702, and an input box 703.
The prompt box 701 may be used to display the user interface 70 as an interface for setting the ground reference value, and may be the word "set ground reference value".
Control 702 may monitor user operations that trigger the device to set the user's ground reference value. The electronic device 100 can detect a user operation on the control 702 (e.g., a click operation on the control 702), in response to which the electronic device 100 can set a user ground reference value. After the electronic device 100 detects that the setting of the ground separation/touchdown reference value of the user is completed, a user interface for force detection as shown in fig. 8A or 8B may be displayed.
The input block 703 may be used to receive a ground reference value entered by a user. The electronic device 100 may detect a user operation acting on the input box 703 (e.g., an input operation on the input box 703), and in response to the operation, may display a user interface for force detection as shown in fig. 8A or 8B.
Fig. 8A-8C illustrate user interfaces for force detection.
As shown in fig. 8A, user interface 80 may include a region 801, a region 802, and a prompt 803, a prompt 804.
The region 801 may be used to display a user moving image captured in real time by the camera 193. Region 802 may be used to display exemplary motion images of a athletic lesson. Prompt 803 may be used to display the current athletic action name. The prompt box 804 may be used to display the amount of user motion, which may be a combination of numbers and text, such as "5 kcal".
In some embodiments, as shown in fig. 8B, the user interface 81 may include an area 801, an area 802, a prompt box 803, a prompt box 804, an icon 805, and a prompt box 806. For the area 801, the area 802, the prompt box 803, and the prompt box 804, reference may be made to the related descriptions of the user interface 80, which are not repeated here. The icon 805 may be used to highlight the position of the user's joint force and to indicate the force magnitude by color marking. For example, when the joint is under small stress, it may be displayed as green on the icon 805; when the joint is under large stress, it may be displayed as yellow on the icon 805; when the joint stress exceeds the joint stress threshold and there is a risk of injury, it may be displayed as red on the icon 805. The prompt box 806 may be used to display the force values of the corresponding joint positions.
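The green/yellow/red marking described above can be sketched as follows; the two thresholds are placeholders, and in practice the injury-risk threshold would depend on the joint and on the user's body assessment information:

```python
# Illustrative sketch: choosing the highlight color for a joint from its
# computed force value, following the green/yellow/red scheme described above.
def joint_color(force_value: float, moderate_threshold: float, injury_threshold: float) -> str:
    if force_value >= injury_threshold:
        return "red"     # risk of injury: at or above the joint stress threshold
    if force_value >= moderate_threshold:
        return "yellow"  # larger stress, not yet at the risk threshold
    return "green"       # small stress
```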
When the electronic device 100 detects that the user has a certain risk of injury, it may display a user interface for outputting risk prompt information as shown in fig. 9A-9D.
Fig. 8C illustrates one user interface for simulated detection of athletic lessons. User interface 82 includes a region 807, which region 807 may be used to display images of exemplary actions in the athletic lesson.
Fig. 9A-9E illustrate user interfaces that output risk prompt information.
As shown in fig. 9A, the user interface 90 may include prompt boxes 901 and 902 and a control 903. The prompt box 901 may be used to display high-risk information about the athletic action, which may be the text "a higher risk of injury is detected for the current athletic action". The prompt box 902 may be used to display why the current action presents a higher risk, which may be the text "the left leg is overstressed". The control 903 may monitor the user operation that triggers returning to force detection. The electronic device 100 can detect a user operation on the control 903 (e.g., a click operation on the control 903), in response to which the electronic device 100 can display a force detection user interface as shown in fig. 8A or 8B.
The electronic device 100 may display the user interface 91 when no user operation is detected on the control 903 within a preset period of time, or when the user interface 90 has been displayed for more than a preset period of time. As shown in fig. 9B, the user interface 91 may include the control 903, a prompt box 904, and controls 905 and 906. For the control 903, reference may be made to the related description of the user interface 90, which is not repeated here. The prompt box 904 may display the text "continue this movement?".
Control 905 may monitor for user operations that trigger continued movement and receive movement guidance. The electronic device 100 can detect a user operation on the control 905 (e.g., a click operation on the control 905), in response to which the electronic device 100 can display the user interface 92. As shown in fig. 9C, the user interface 92 may include a control 903 and a prompt 907. Prompt box 907 may display an adjustment scheme for athletic activity that presents a higher risk. The electronic device 100 may detect that the user interface 92 is displayed for more than a preset period of time and may display a force detection user interface as shown in fig. 8A or 8B.
Control 906 may monitor for user operations that trigger switching athletic lessons. The electronic device 100 can detect a user operation on the control 906 (e.g., a click operation on the control 906) in response to which the electronic device 100 can display the user interface 93. The user interface 93 may include a control 908 and a control 909. Control 908 may monitor for user operations that trigger a return to the original athletic lesson. The electronic device 100 can detect a user operation on the control 908 (e.g., a click operation on the control 908), in response to which the electronic device 100 can display a force detection user interface or user interface 92 as shown in fig. 8A or 8B. Control 909 may monitor user operations that trigger selection of recommended athletic courses. The electronic device 100 can detect a user operation on the control 909 (e.g., a click operation on the control 909), in response to which the electronic device 100 can switch to a recommended athletic lesson selected by the user.
In some embodiments, after the electronic device 100 displays the user interface 90, the user interface 92 may be displayed. The user interface 92 may include prompt boxes 901, 902 and controls 903. The electronic device 100 may display the user interface 92 when detecting that the display time of the user interface 90 exceeds a preset period of time. The user interface 92 may include a control 903 and a prompt 907. The electronic device 100 can detect a user operation on the control 906 (e.g., a click operation on the control 906), and can display a force detection user interface such as that shown in fig. 8A or 8B in response to the operation or detection of the user interface 92 being displayed for more than a preset period of time.
Fig. 9E illustrates one user interface for simulated detection of athletic lessons. The user interface 94 may include: prompt box 910. Prompts 911, 912, 913 may be included in prompt box 910. Prompt 911 may be used to prompt the completion of the simulated test of the athletic lesson, prompt 912 may be used to prompt the damage risk level of the athletic lesson, and prompt 913 may be used to display the action with higher damage risk in the athletic lesson and the adjustment scheme for the action.
In the following, a method for motion analysis provided in the present application will be described in detail by taking motion analysis using an electronic device as an example.
Fig. 10 shows a detailed flow of a method of motion analysis. As shown in fig. 10, the method may include:
s101: the electronic device receives user operation of a user for the first application, wherein the user operation is used for indicating the electronic device to acquire the identity evaluation information.
The first application may be a sports application in an electronic device, such as a sports application in a smart phone or a television, a professional motion detection system, etc., or may be a camera.
Illustratively, the first application may be the "sports health" application of the electronic device 100 in fig. 3A, or the "camera" of the electronic device in fig. 3A. "Sports health" is an exercise and fitness application on an electronic device such as a smartphone or tablet computer; its name is not limited in the embodiments of the present application. "Camera" is a photographing application on an electronic device such as a smartphone or tablet computer; its name is not limited in the embodiments of the present application.
The user operation for the first application may be a touch operation of a user, or may be a voice operation, a gesture operation, or the like of the user, which is not limited herein.
The identity assessment information may include body parameter assessment information and may also include physical state assessment information. The body parameter assessment information may include the user's height, weight, and body fat rate, and the physical state assessment information may include the user's injury site and injury degree. An injury refers to damage to the skin and flesh, tendons and bones, internal organs, and other tissue structures caused by various external wound factors, and the local or systemic reactions caused by the damage. The injury site is the part of the human body that is injured, such as the ankle or the knee.
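For illustration only, one possible in-memory representation of the identity assessment information is sketched below; the field names and units are assumptions, not part of this application:

```python
# Illustrative sketch: a simple container for the identity assessment information
# (body parameters plus physical state).
from dataclasses import dataclass, field
from typing import Dict

@dataclass
class IdentityAssessment:
    height_m: float
    weight_kg: float
    body_fat_rate: float
    # injury site name -> injury degree (grade 1-5, larger is more severe)
    injuries: Dict[str, int] = field(default_factory=dict)
```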
The user may trigger the electronic device to obtain the body assessment information by triggering, in the main interface of the first application, a control with the function of starting motion detection; alternatively, the user may trigger a certain exercise course control in the main interface of the first application so that the main interface of the exercise course is displayed, and then trigger, in the main interface of the exercise course, a control with the function of starting motion detection, so as to acquire the identity assessment information.
In one possible implementation, the user interface may be a user interface as shown in FIGS. 3B-3E. The electronic device may detect a user operation 200 (e.g., a click operation on the icon 327) on the "sports health" icon 327, in response to which the user interface 32 illustrated by way of example in fig. 3C may be displayed. The electronic device 100 may detect a user operation (e.g., a click operation on control 354) on control 354 in the user interface 32, in response to which the electronic device 100 may display the user interface 33 shown in fig. 3D. User interface 33 is an introduction main interface for the athletic lesson. The electronic device 100 can detect a user operation (e.g., a click operation on the control 360) on the control 360 in the user interface 33, in response to which the electronic device 100 can display the user interface 34. The electronic device can detect a user operation (e.g., a click operation on control 362) on control 362 in user interface 34, triggering an operation to obtain identity assessment information.
In one possible implementation, the user interface may be a user interface as shown in figs. 3B-3E. The electronic device 100 may detect a user operation (e.g., a click operation on the control 361) on the control 361 in the user interface 33, in response to which the electronic device 100 triggers the initiation of the simulated motion detection function to obtain the identity assessment information.
In one possible implementation, the user interface may be a user interface as shown in fig. 3B and 3C. The electronic device may detect a user operation 200 (e.g., a click operation on the icon 327) on the "sports health" icon 327, in response to which the user interface 32 illustrated by way of example in fig. 3C may be displayed. The electronic device 100 may detect a user operation (e.g., a click operation on control 358) on control 358 in the user interface 32, in response to which the electronic device 100 triggers the initiation of a simulated motion detection function to obtain identity assessment information.
In another possible implementation, the user interface may be a user interface as shown in fig. 3G and 3H. As shown in fig. 3G, the electronic device may detect a user operation 205 on the "camera" icon 341 (e.g., a click operation on the icon 341), in response to which the user interface 35 exemplarily shown in fig. 3H may be displayed. The electronic device 100 may detect a user operation acting on the motion detection option in the user interface 35 (e.g., a click operation on the motion detection option 366 in the shooting mode list 365), in response to which the electronic device 100 may trigger the motion detection function to obtain the identity assessment information.
S102: the electronic device obtains identity assessment information of the user, wherein the identity assessment information comprises body parameter assessment information.
The body parameter assessment information may include weight and height.
The identity assessment information may also include physical state assessment information. The physical state evaluation information may include a lesion site and a lesion degree of the user.
The identity assessment information may also include an athletic ability index, which refers to the intensity of movement that a user can withstand.
In some embodiments, the electronic device may detect, or receive from user input, the identity assessment information through a detection device such as a camera. For example, the electronic device may turn on the camera, detect an image of the user, and obtain the user's height, weight, injury site, and injury degree from the detected image; the electronic device may detect the user's weight by jumping to an installed weight measurement application; and the BMI of the user can be obtained by processing the obtained height data and weight data of the user. For another example, the electronic device may obtain information such as the height, weight or BMI value, injury site, and injury degree entered by the user. From the obtained height and weight, the electronic device can calculate the BMI value as BMI = weight / height^2, where the weight is in kilograms and the height is in metres.
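A minimal sketch of the BMI calculation described above (weight in kilograms, height in metres):

```python
# Illustrative sketch: computing the BMI value from height and weight,
# BMI = weight / height^2.
def bmi(weight_kg: float, height_m: float) -> float:
    if height_m <= 0:
        raise ValueError("height must be positive")
    return weight_kg / (height_m ** 2)

# Example usage (values are arbitrary): bmi(70.0, 1.75) is approximately 22.9.
```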
In one implementation, the electronic device may obtain the identity assessment information of the user through a detection device such as a camera. For example, the electronic device may turn on the camera, detect an image of the user, and obtain the user's height from the detected image; the electronic device may acquire the user's weight and BMI value through a body fat scale connected via Bluetooth; and the electronic device may obtain the physical state assessment information of the user by turning on the camera to detect that the user places a certain part of the body at the injured part and the placement time, for example, by detecting that the user places a hand at the injured part and the hand placement time. Specific examples are shown in fig. 4E-4H and fig. 4J-4M, and are not repeated here.
In another implementation, the electronic device receives identity assessment information entered by the user. For example, the electronic device may receive the height, weight, BMI value, injury site, and injury degree entered by the user. As shown in fig. 4E and 4I, the electronic device 100 can detect a user operation on the control 408 (e.g., a click operation on the control 408), in response to which the electronic device 100 can display the user interface 48. The user interface 48 may include an input box 416 that may be used to receive the height, weight, and body fat rate information entered by the user. The electronic device 100 may detect a user operation on the input box 416 (e.g., an input operation on the input box 416), and in response may display a user interface for the physical state assessment as shown in fig. 4J. The electronic device 100 can detect a user operation on the control 419 in the user interface 49 (e.g., a click operation on the control 419), in response to which the electronic device 100 can display the user interface 53. The user interface 53 may include input boxes 424 and 425. The input box 424 may be used to receive the name of the injury site entered by the user. The input box 425 may be used to receive the degree of injury of the user's injury site, which may be categorized into grades 1-5, where a larger number indicates a more severe injury. Grades 1-5 correspond to damage degree boxes that may be displayed in blue, green, yellow, orange, and red, respectively. The electronic device 100 may detect a user operation acting on a grade 1-5 damage degree box on the input box 425 (e.g., a click operation on any of grades 1-5 on the input box 425), and the corresponding damage degree box displays the corresponding color. The electronic device 100 may display the user interface 52 upon detecting that the user operation on the user interface 53 is completed. The user interface 52 may include a prompt 423. The prompt box 423 may be used to display the force exerted by the user's injured part as a percentage of the standard force.
In other embodiments, the electronic device may obtain an identity feature of the user and obtain the identity assessment information based on the identity feature. The identity feature may be face information, fingerprint information, or the like. For example, the electronic device may start the camera after receiving the user operation, obtain a face image, and obtain the face information after processing; the processed face information is then matched with a face information template stored in the electronic device, and the body assessment information is invoked. Further examples are given below:
in one implementation, the electronic device obtains an identity feature of a user, and when the identity feature is not included in the identity features stored in the electronic device, the identity evaluation information input by the user can be detected or received through a detection device such as a camera. For example, the electronic device may acquire a face image through a camera, and obtain face information after processing. And matching the processed face information with the stored face information template, wherein the electronic equipment detects that the matching fails, and the electronic equipment can detect or receive identity evaluation information input by a user through detection equipment such as a camera.
In another implementation, the electronic device obtains the identity feature of the user, the identity feature is among the identity features stored in the electronic device, and the electronic device detects that the user's last login time exceeds a preset time period; the electronic device can then detect or receive identity evaluation information input by the user through a detection device such as a camera. For example, the electronic device may acquire a face image through the camera and obtain face information after processing. The processed face information is matched against the stored face information templates; the electronic device detects a matching face information template, detects that the user's last login time exceeds the preset time period, and then detects or receives identity evaluation information input by the user through a detection device such as the camera.
The user interface may be, for example, a user interface as shown in fig. 4A-4D. After the electronic device 100 collects the face information through the camera, some necessary processing may be performed, and the processed face information may be matched with a stored face information template so as to invoke the body evaluation information of the user based on the face information template. The face information template may be entered by the user before the electronic device 100 performs face recognition. The embodiment of the application does not limit the device or the specific algorithm used for face recognition, as long as face recognition can be realized.
If the collected face information fails to match the stored face information template, the electronic device 100 may display the user interface 42 as shown in fig. 4C. The user interface 42 may display a prompt 403 for prompting the user that the user is a new user, i.e., the stored face information template does not store the face information of the user, the identity evaluation information cannot be invoked, and the user needs to perform a body measurement evaluation to obtain the body evaluation information of the user. The prompting time of the prompt 403 may be 5s, and after the prompt is finished, the electronic device may display the user interface provided by the body measurement evaluation function exemplarily shown in fig. 4E to 4I.
If the acquired face information is successfully matched with the stored face information template, the electronic device 100 may detect the last login time of the user, and if the last login time of the user does not exceed the preset time period, the electronic device 100 may acquire the identity evaluation information of the user.
If the collected face information is successfully matched with the stored face information template, the electronic device 100 may detect the last login time of the user, and if the last login time of the user exceeds a preset period of time, the electronic device may display the user interface 43. User interface 43 may include prompt 404, control 405, and control 406. The prompt 404 is used to recommend that a user whose last login time exceeds the preset period re-perform the body measurement assessment. Control 405 is used to monitor user operations that trigger a re-measurement assessment. The electronic device can detect a user operation on control 405 (e.g., a click operation on control 405), and in response the electronic device can display a user interface provided by the body measurement evaluation function as exemplarily shown in fig. 4E-4I. Control 406 may be used to monitor user operations that trigger the use of legacy data. The legacy data refers to body assessment information that the user previously stored in the electronic device 100.
In a specific implementation, the electronic device may distribute the body mass among the parts of the user's body by acquiring the body parameter evaluation information of the user, may calculate a movement capacity index from the body parameter evaluation information and the user's amount of movement over a preset time period, and may correspondingly calculate a load-bearing reference value for the user from the BMI and the movement capacity index. The electronic device can take the load-bearing limit of a standard human joint as the standard force; users of different body types have different joint load-bearing limits. In an implementation, the load-bearing reference value may be equal to the standard force multiplied by a parameter. A specific implementation may be as follows: the electronic device may divide the acquired BMI values into a plurality of BMI value intervals and set the BMI force of each interval to a different percentage of the standard force, where the percentage is the initial reference ratio. The electronic device may set a standard amount of motion based on the user's BMI, and thereafter may dynamically adjust the standard amount of motion based on a percentage of the maximum amount of motion in the user's daily amount of motion. As the user's daily amount of motion increases from 0 to the standard amount of motion, the initial reference ratio is dynamically reduced by 10%. For example, the BMI values may be divided into n BMI value intervals, and the BMI forces for the n intervals may be: 100% × standard force, 90% × standard force, 80% × standard force, ..., (100 - 10 × (n - 1))% × standard force.
In some embodiments, the electronic device may divide the acquired BMI values into five intervals of <24, 24-27, 27-30, 30-35, >35, and the corresponding BMI forces may be: 100% of standard stress, 90% of standard stress, 80% of standard stress, 70% of standard stress and 60% of standard stress, and the BMI stress is a stress reference value.
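A minimal sketch of this lookup is given below (illustrative only): the five BMI intervals and percentages above are hard-coded, and the standard force value used in the example is a hypothetical number, not one taken from this application.

def bmi_force_ratio(bmi: float) -> float:
    """Return the fraction of the standard force allotted to a BMI value."""
    brackets = [
        (24.0, 1.00),   # BMI < 24       -> 100% of the standard force
        (27.0, 0.90),   # 24 <= BMI < 27 -> 90%
        (30.0, 0.80),   # 27 <= BMI < 30 -> 80%
        (35.0, 0.70),   # 30 <= BMI < 35 -> 70%
    ]
    for upper, ratio in brackets:
        if bmi < upper:
            return ratio
    return 0.60          # BMI >= 35      -> 60%

def load_reference(bmi: float, standard_force: float) -> float:
    """Load-bearing reference value = BMI force = percentage x standard force."""
    return bmi_force_ratio(bmi) * standard_force

if __name__ == "__main__":
    STANDARD_FORCE = 2000.0  # hypothetical joint force limit, in newtons
    print(load_reference(26.2, STANDARD_FORCE))  # 1800.0 (90% bracket)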
Further, in a specific implementation, the electronic device can also determine, by acquiring the body state evaluation information of the user, whether the user's body has an injured site and the degree of injury of that site, correspondingly calculate a force evaluation for the injured site, and, combining the BMI force and the user's standard amount of motion, correspondingly calculate a force reference value for the injured site. The electronic device can divide the degree of injury of the user's injured site into several levels, evaluate the relation between the force on the injured site and the standard force according to the different levels of injury, and then calculate the BMI force from the BMI intervals and the injury-weighted standard force.
In some embodiments, the placement time of the user's hand at the injured site can be divided into less than 2s, 2-4s, 4-6s, 6-8s, and more than 8s; each placement-time interval corresponds one-to-one to a force evaluation of the injured site, which, from the shortest to the longest placement time, is 90%, 80%, 70%, 60%, and 50% of the standard force respectively. For example, if the electronic device detects that the user's hand is placed on the knee joint for 4-6s, the force evaluation of the injured site corresponds to 70% of the standard force. The BMI force is then correspondingly calculated from the BMI intervals and the force evaluation of the injured site.
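Continuing the sketch above, the placement-time table can be implemented as a second lookup. How the BMI percentage and the injury-site evaluation are combined is not spelled out in the text, so multiplying the two ratios below is only one possible reading; the bmi_ratio argument would come from a BMI-interval lookup such as the one sketched earlier.

def injury_force_ratio(placement_seconds: float) -> float:
    """Map the hand-placement time at the injured site to a force percentage."""
    if placement_seconds < 2:
        return 0.90
    if placement_seconds < 4:
        return 0.80
    if placement_seconds < 6:
        return 0.70
    if placement_seconds < 8:
        return 0.60
    return 0.50

def injured_site_reference(bmi_ratio: float, placement_seconds: float,
                           standard_force: float) -> float:
    # One possible reading: apply the placement-time percentage to the standard
    # force first (the injury-weighted standard force), then the BMI ratio.
    return bmi_ratio * injury_force_ratio(placement_seconds) * standard_force

# Example: 90% BMI bracket, hand held on the knee for 5 s:
# injured_site_reference(0.90, 5.0, 2000.0) -> 0.9 * 0.7 * 2000 = 1260.0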
It is understood that step S102 may also be performed after step S103 or 104.
S103: the electronic device detects a position of a skeletal node of the target object to obtain a spatial positional relationship of the skeletal node.
The target object may be a user or a moving image in a selected movement course, such as an image of a standard demonstration action in a movement course.
The electronic device may analyze skeletal points of the user, such as ankle joints, knee joints, hip joints, etc., by skeletal point recognition techniques for images of standard demonstration actions in the user or video, such as in athletic courses.
The electronic device can detect skeletal nodes of the user through cameras, sensors and the like, and the positions of the skeletal nodes are used for indicating the connection relation between joints of the user and each joint and the spatial position relation of the skeletal nodes.
Further, the electronic device can analyze the body proportion, fat and thin and other body type conditions of the user by combining the depth camera module or according to the body parameter evaluation information.
Illustratively, the electronic device may detect the skeletal nodes of the user through sensors such as a camera, an infrared sensor, optical markers, or a 3D scanner. The electronic device may also construct a skeletal model through a deep learning network such as the skinned multi-person linear (SMPL) model or VIBE (video inference for body pose and shape estimation). For example, when the skeletal model of the human body is constructed with the SMPL model, the height, weight, and the like in the collected body parameter evaluation information of the user can be used for the construction in combination with the national standard GB/T 17245-2004.
Specifically, the electronic device may detect a skeletal node of the user by using the acquired user image and a human skeletal point positioning algorithm, where the skeletal node refers to a coordinate of a determined skeletal point. Further, the body type of the user may be determined in combination with the coordinates of the skeletal nodes and the above-described body parameter assessment information. The input of the human skeleton point positioning algorithm can be an image of a user, and the output can be coordinates of skeleton nodes. The electronic device may detect basic human skeletal nodes, such as left hip joint, right hip joint, left knee joint, right knee joint, etc., as shown in fig. 11. It will be appreciated that the electronic device may detect more or fewer skeletal nodes, not limited to those shown in fig. 11.
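The application does not name a particular pose-estimation algorithm, so the sketch below only illustrates a possible output format for the detected skeletal nodes (named joints with coordinates) and a check that the basic lower-limb joints needed later are present; the joint names are assumptions.

from dataclasses import dataclass
from typing import Dict, Tuple

@dataclass
class SkeletalNode:
    name: str                              # e.g. "left_knee"
    position: Tuple[float, float, float]   # coordinates returned by the detector

# Basic lower-limb joints needed for the force analysis described later.
REQUIRED_LOWER_LIMB = {
    "left_hip", "right_hip",
    "left_knee", "right_knee",
    "left_ankle", "right_ankle",
}

def lower_limb_complete(nodes: Dict[str, SkeletalNode]) -> bool:
    """True if every basic lower-limb joint was detected in the image."""
    return REQUIRED_LOWER_LIMB.issubset(nodes.keys())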
In some embodiments, the electronic device may turn on the camera to acquire a user image, identify skeletal nodes by analyzing the user image, and the acquired skeletal node map may be as shown in fig. 11.
In one implementation, the electronic device turns on the camera to obtain an image. If the electronic device fails to detect a skeletal node in the image, this indicates that no user has been detected, and the electronic device may output prompt information such as the text "no user detected".
In another implementation, as shown in fig. 5A, the electronic device turns on a camera to acquire an image, and identifies skeletal nodes of the user based on the image.
In a specific implementation, the electronic device may detect the skeletal nodes of the user, and if during detection the electronic device finds that the user's posture is abnormal, it may output a corresponding abnormality prompt. A posture abnormality means that the user does not remain upright while skeletal node detection is performed before the motion state is used. The upright state refers to a natural standing posture with the upper limbs hanging naturally, the toes pointing forward, and the eyes looking straight ahead; an abnormal posture of the lower limbs may be, for example, bent legs or feet off the ground. If the user does not remain upright, the skeletal node map detected by the electronic device may be inaccurate, resulting in errors in the subsequent force detection. As shown in fig. 5B-5C, when the electronic device 100 detects that the posture of the user's lower limbs is abnormal, it outputs a posture-abnormality prompt 504 to prompt the user to adjust the posture, keep the upright state, and continue skeletal node detection. It can be understood that the posture-abnormality prompt can be an on-screen text display on the electronic device, a voice prompt, or the like.
S104: and the electronic equipment performs space attitude calibration and constructs a reference coordinate system.
The spatial posture calibration refers to setting a reference coordinate system for the movement of the target object according to the skeletal nodes when the target object is in an upright state, so as to calibrate the movement state of the target object. For example, taking detection of a user as an example, a reference coordinate system can be constructed from the ground and the user's upright orientation; a jumping motion of the user relative to this reference coordinate system is then a motion state in which both feet leave the ground and move upward. Without spatial posture calibration, the motion state of the user cannot be specifically determined.
In a specific embodiment, the reference coordinate system may take the user's waist, a foot, the midpoint of the line connecting the two feet, or the like as the origin, and take the forward direction, the head-to-neck direction, the direction of the line connecting the two feet, the vertical direction, or the like as coordinate axes to establish a spatial coordinate system. The reference coordinate system is the coordinate system in which the user's subsequent movement is described.
For convenience of spatial posture calibration, for example, as shown in fig. 12, a reference coordinate system uvw is established with the midpoint of the line connecting the user's two feet as the origin, the direction of the line connecting the two feet as the u-axis, the head-to-neck direction as the v-axis, and the direction perpendicular to the uv plane as the w-axis.
It will be appreciated that the coordinates of the bone nodes detected by the bone nodes described above can be known from the reference coordinate system after the reference coordinate system has been established.
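A sketch of constructing this uvw reference frame from the detected foot positions and an upright direction is shown below; it assumes numpy and that the upright direction is taken from the detected skeletal nodes (for example, the head-to-neck direction).

import numpy as np

def build_reference_frame(left_foot: np.ndarray, right_foot: np.ndarray,
                          upright_dir: np.ndarray):
    """Return (origin, axes) where the rows of axes are the u, v, w unit vectors."""
    origin = (left_foot + right_foot) / 2.0          # midpoint of the feet line
    u = right_foot - left_foot
    u = u / np.linalg.norm(u)                        # along the line between the feet
    v = upright_dir - np.dot(upright_dir, u) * u     # upright direction, made
    v = v / np.linalg.norm(v)                        # perpendicular to u
    w = np.cross(u, v)                               # perpendicular to the uv plane
    return origin, np.vstack([u, v, w])

def in_reference_frame(point: np.ndarray, origin: np.ndarray,
                       axes: np.ndarray) -> np.ndarray:
    """Express a skeletal-node coordinate in the uvw reference frame."""
    return axes @ (point - origin)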
Optionally, when the electronic device performs the spatial gesture calibration, a calibration countdown may be displayed in the user interface. For example, the electronic device may display a user interface 60 as described in fig. 6, the user interface 60 being for displaying countdown 3, 2, 1 of the spatial pose calibration. It will be appreciated that the calibration countdown may be a screen text display on the electronic device, a voice alert, or the like.
S105: the electronic equipment acquires a ground clearance reference value setting, and the ground clearance reference value is used for judging the ground clearance state of the user.
The ground-clearance reference value is the minimum distance, relative to the reference coordinate system, at which the left and right ankle joints are detected as being off the ground when the user starts from a state with both feet on the ground.
In some embodiments, the electronic device may obtain the ground reference value by self-setting, so as to determine the ground state of the user.
In some embodiments, the electronic device may obtain the ground reference value by setting it itself or by receiving a user input, so as to determine the off-ground state of the user.
In one implementation, the electronic device may set the user's ground reference value by itself. As shown in fig. 7, electronic device 100 can detect a user operation on control 702 (e.g., a click operation on control 702), and in response the electronic device 100 can set the user's ground reference value by itself.
In another implementation, the electronic device may detect a user-entered ground reference value. As shown in fig. 7, the electronic apparatus 100 may detect a user operation (e.g., an input operation on the input box 703) acting on the input box 703, and in response to the operation, the electronic apparatus 100 may receive a ground reference value set by the user.
In other embodiments, the electronic device may also set the user's ground reference value based on the user's body parameter evaluation information. The electronic device can dynamically adjust the user's ground reference value according to the BMI value obtained from the body parameter evaluation information. For example, if the user's BMI value is divided into a plurality of BMI sections, the ground reference value may be set to x cm when the BMI value lies in the normal section, and the ground reference value decreases successively for sections with larger BMI values.
For example, the reference coordinate system is the reference coordinate system uvw, and the electronic device may receive that the ground reference value of the user is 15cm, that is, means that when the distance between the ankle joint of the user and the uv plane of the reference coordinate system is greater than or equal to 15cm, the electronic device may detect that the foot of the user is in a ground-leaving state, and the user may perform a jumping motion.
S106: the electronic equipment obtains the stress condition of the target object based on the skeleton node.
The target object may be a user or a moving image in a video, such as an image of a standard demonstration action in a sports lesson.
The electronic equipment can acquire the stress value of the user aiming at the user, and analyze the stress condition of the user in the motion process. The electronic device may also analyze the stress in the images based on moving images in the video, such as images of standard demonstration actions in a movement session, in combination with the physical parameter assessment information and/or physical state assessment information of the user.
The specific steps of obtaining the stress condition of the target object are shown in fig. 13.
S201: acquiring first data of a target object according to body parameters of the target object; the first data includes mass, centroid, moment of inertia of the human body link.
The human body inertia parameters comprise the mass, mass center position and rotational inertia of human body links, and are basic parameters for human body movement and movement injury and prevention research. The human body links include: thigh, calf, foot, upper arm, forearm, etc. Such as the mass, mass center, moment of inertia of the human body link, specifically the mass, mass center, moment of inertia of the thigh, the shank, the foot, etc.
It will be appreciated that the mass, centroid location, and moment of inertia of the thigh, calf, and foot can be obtained from body parameter information of the target subject, such as height and weight, by combining the body parameter information with the national standard for adult human inertia parameters (GB/T 17245-2004). For example, the mass or centroid position of the male thigh can be calculated through the regression equation Y = B0 + B1·X1 + B2·X2, where B0 is the constant term of the regression equation, B1 is the regression coefficient for body weight, B2 is the regression coefficient for height, X1 is the body weight, and X2 is the height.
After the mass center of the human body link in the human body inertia parameters is obtained, the coordinates of the mass center of the human body link can be obtained through the reference coordinate system.
It is understood that the first data obtained in S201 may also be obtained in step S102.
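A sketch of this regression is shown below; the coefficient values in the usage line are placeholders, not the actual GB/T 17245-2004 coefficients, which must be looked up per segment and sex.

def segment_parameter(weight_kg: float, height_cm: float,
                      b0: float, b1: float, b2: float) -> float:
    """Generic inertial-parameter regression Y = B0 + B1*X1 + B2*X2."""
    return b0 + b1 * weight_kg + b2 * height_cm

# Hypothetical usage: thigh mass for a 70 kg, 175 cm person with made-up
# coefficients (real per-segment, per-sex coefficients come from the standard).
thigh_mass = segment_parameter(70.0, 175.0, b0=1.0, b1=0.1, b2=0.01)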
S202: acquiring second data of the target object and a ground-leaving state of the foot; the second data includes movement speed, angular speed, and position information of the human joint.
The movement speed and the angular speed of the centroid of the human body link can refer to the movement speed and the angular speed of the human body links such as thighs, calves and the like, and the position information of the human body joints can be obtained by detecting the coordinate values of the human body joints in the reference coordinate system.
The ground-leaving state may include a first state, a second state, and a third state. The first state indicates that both feet are off the ground, the second state indicates that one foot touches the ground, and the third state indicates that both feet touch the ground. Specifically, the ground-leaving state may be determined according to whether the distance of the foot of the target object from the reference plane (such as the uv plane of the reference coordinate system described above) exceeds the ground reference value, as described in step S105.
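A sketch of this three-way classification is shown below, assuming the ankle distances are measured from the uv plane of the reference coordinate system and compared with the ground reference value of step S105.

FIRST_STATE = "both_feet_off_ground"
SECOND_STATE = "one_foot_on_ground"
THIRD_STATE = "both_feet_on_ground"

def ground_leaving_state(left_ankle_dist: float, right_ankle_dist: float,
                         ground_reference: float) -> str:
    """Classify the ground-leaving state from the ankle distances to the uv plane."""
    left_off = left_ankle_dist >= ground_reference
    right_off = right_ankle_dist >= ground_reference
    if left_off and right_off:
        return FIRST_STATE
    if left_off or right_off:
        return SECOND_STATE
    return THIRD_STATE

# With the 15 cm reference value used as an example earlier:
# ground_leaving_state(0.20, 0.02, 0.15) -> "one_foot_on_ground"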
S203: the first and second values of the ankle joint are calculated based on the ground clearance state.
By determining the ground clearance of the foot, the first and second values of the ankle joint can be calculated. The first value is the joint force of the ankle joint and the second value is the ankle joint moment. It will be appreciated that the human body has left and right feet, i.e., left and right ankle joints, the first value may include joint forces of the left and right ankle joints, and the second value may include left and right ankle joint moments. In the calculation of forces such as left/right knee joint and hip joint, the joint force or moment of the left/right ankle joint is used correspondingly.
It is understood that the first and second values of the ankle joint may be calculated from three states of the foot.
When the ground-leaving state is the first state, both feet are off the ground, and the first value and the second value of both the left and right ankle joints are 0.
When the ground-leaving state is the second state, one foot touches the ground; the force on one of the left and right ankle joints is 0, and only the force on the other ankle joint is calculated. The joint force of the other ankle joint can be calculated by summing the products of the mass of each human body link and the change in velocity of that link's centroid and adding the user's weight; the moment of the other ankle joint can be calculated by summing, over the human body links, the product of the link's moment of inertia and the change in its angular velocity minus the cross product of the vector from the link's centroid to the reference point and the link's gravity. The reference point may be the origin of the reference coordinate system, and the vector from the centroid of a human body link to the reference point may be obtained from the coordinates of the link's centroid in the reference coordinate system and the coordinates of the reference point.
Specifically, the force on the other ankle joint can be calculated by the following formulas: with F_1, M_1 (or F_2, M_2) equal to 0, solve for F_2, M_2 (or F_1, M_1):

F_1 + F_2 = Σ m_i Δv_ci + G

M_1 + M_2 = Σ (J_i Δω_i - r_i × m_i g)

where F_1 and F_2 are the first values of the ankle joints, M_1 and M_2 are the second values of the ankle joints, m_i is the mass of a human body link, v_ci is the movement speed of the centroid of the human body link, G is the user's weight calculated from the body parameters, J_i is the moment of inertia of the human body link, ω_i is the angular velocity of the centroid of the human body link, r_i is the vector from the centroid of the human body link to the reference point, and g is the gravitational acceleration. These symbols are not described again below.

Taking the left foot as an example, the force and moment of the right foot are 0; the force and moment of the left foot are F_left and M_left, and the force and moment of the right foot are F_right and M_right. According to the theorem of momentum and the theorem of moment of momentum:

F_left + F_right = Σ m_foot Δv_cfoot + G

M_left + M_right = Σ (J_foot Δω_foot - r_foot × m_foot g)

Since F_right and M_right are 0, F_left and M_left are:

F_left = Σ m_i Δv_ci + G

M_left = Σ (J_i Δω_i - r_i × m_i g)
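A sketch of the single-support case is shown below: the airborne ankle is assigned zero force and moment, and the supporting ankle receives the sums F = Σ m_i Δv_ci + G and M = Σ (J_i Δω_i - r_i × m_i g). The link data structure, the axis convention for gravity, and the interpretation of Δv and Δω as rates of change are assumptions.

import numpy as np

GRAVITY = np.array([0.0, 0.0, -9.81])   # assumed direction of g in the reference frame

def supporting_ankle_load(links, weight_force: np.ndarray):
    """Force and moment on the supporting ankle in the single-support case.

    links: iterable of dicts with keys
      "m"  - link mass,
      "dv" - rate of change of the link centroid velocity (3-vector),
      "J"  - link moment of inertia (diagonal, given as a 3-vector here),
      "dw" - rate of change of the link angular velocity (3-vector),
      "r"  - vector from the link centroid to the reference point.
    weight_force: the user's weight G as a 3-vector.
    """
    force = weight_force.astype(float).copy()
    moment = np.zeros(3)
    for link in links:
        force += link["m"] * link["dv"]
        moment += link["J"] * link["dw"] - np.cross(link["r"], link["m"] * GRAVITY)
    return force, moment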
When the ground-leaving state is the third state, the first value and the second value of the ankle joints can be calculated according to the first data and the second data. The first data are the mass, centroid, and moment of inertia of the human body links; the second data are the position information of the human body joints; the second coordinate is the center-of-gravity projection coordinate of the target object; the third coordinate and the fourth coordinate are the ankle joint coordinates; and the second, third, and fourth coordinates are obtained from the second data. Specifically, the sum of the joint forces of the left and right ankle joints can be calculated by summing the products of the mass of each human body link and the change in velocity of that link's centroid and adding the user's weight; the sum of the moments of the left and right ankle joints can be calculated by summing, over the human body links, the product of the link's moment of inertia and the change in its angular velocity minus the cross product of the vector from the link's centroid to the reference point and the link's gravity. The reference point may be the origin of the reference coordinate system, and the vector from the centroid of a human body link to the reference point may be obtained from the coordinates of the link's centroid in the reference coordinate system and the coordinates of the reference point. The joint forces and moments of the left and right ankle joints are then obtained separately from the center-of-gravity projection coordinate, the coordinates of the left and right ankle joints, and the sums of the ankle joint forces and moments. The center-of-gravity projection coordinate may be determined from the vertical projection of the center of gravity onto a reference plane of the reference coordinate system (such as the uv plane). The third and fourth coordinates are the coordinates of the left and right ankle joints in the reference coordinate system and can be obtained from it, as shown in fig. 14. Let P_proj be the center-of-gravity projection coordinate, P_1 the left ankle joint coordinate, and P_2 the right ankle joint coordinate.
In this state, the first and second values may be calculated by the following formulas:

F_1 + F_2 = Σ m_i Δv_ci + G

M_1 + M_2 = Σ (J_i Δω_i - r_i × m_i g)

[The four further equations that distribute F_1 + F_2 and M_1 + M_2 between the left and right ankle joints according to P_proj, P_1, and P_2 are given as images in the original publication and are not reproduced here.]
s204: based on the motion gesture of the target object, a first coordinate system is constructed, and the first coordinate system is used for constructing a homogeneous transformation matrix and acquiring the first coordinates of the joints of the human body in the first coordinate system.
The first coordinate system may include a reference sub-coordinate system, a first sub-coordinate system, and a second sub-coordinate system. The homogeneous transformation matrix is constructed based on the relations among the reference sub-coordinate system, the first sub-coordinate system, and the second sub-coordinate system; these relations include the distances and angles between their coordinate axes. The first coordinates are the coordinates of the human body joints in the reference sub-coordinate system. It can be appreciated that a lower-limb coordinate system can be constructed according to the skeletal node positions of the target object, with sub-coordinate systems established at the hip joint, the knee joint, and the ankle joint. The reference sub-coordinate system is established based on one skeletal node, and the first and second sub-coordinate systems are established based on the other two skeletal nodes.
A reference sub-coordinate system can be established at the center of the ball of the ball-and-socket hip joint; the hip joint has three rotational degrees of freedom, the knee joint has one rotational degree of freedom, and the ankle joint has two rotational degrees of freedom. The electronic device may also establish an auxiliary coordinate system at the foot to express the orientation of the foot.
For example, a lower-limb coordinate system as shown in fig. 15 may be established, as follows. Assume the reference coordinate system is established at the hip joint and the first rotation is about the Z_0 axis, where X_0 Y_0 Z_0 is the reference coordinate system, X_0 points toward the front of the foot, Z_0 points toward the side of the human body, and Y_0 is determined by the right-hand rule. Assume the second rotation of the hip joint is about Z_1 (lifting the leg sideways); X_1 is determined by the right-hand rule, and Z_0 is converted to Z_1. Assume the third rotation of the hip joint is about Z_2 (rotating the leg); X_2 is determined by the right-hand rule, and Z_1 is converted to Z_2. These three rotations complete the hip joint. Assume the fourth rotation is about the knee joint, with rotation axis Z_3; X_3 is determined by the right-hand rule, and Z_2 is converted to Z_3; and so on, to establish the lower-limb coordinate system shown in fig. 15. Here a is the z-axis distance between adjacent sub-coordinate systems, d is the x-axis distance between adjacent sub-coordinate systems, α is the angle between the z-axes of adjacent sub-coordinate systems, and θ is the initial angle between the X-axes plus the joint rotation angle to be calculated; an initial state table as shown in fig. 16 can thus be obtained. For example, X_0 and X_1 intersect, so d is 0; Z_0 and Z_1 intersect, so a is 0; Z_0 and Z_1 are perpendicular, so α is 90°.
It will be appreciated that the establishment of the reference coordinate system is not limited.
S205: based on the homogeneous transformation matrix, the angular velocity of the lower leg is calculated from the first coordinates and the first data.
According to the lower limb coordinate system, the distance and the included angle between the coordinate axes of the adjacent joint coordinate systems can be obtained. For example, according to the lower limb coordinate system shown in fig. 15, the distance between the x and z axes of the adjacent coordinate systems can be obtained, and the included angle between the z axes of the adjacent coordinate systems. The homogeneous transformation matrix can be constructed through the distance and the included angle between the coordinate axes of the adjacent joint coordinate systems, and the homogeneous transformation matrix has the following formula:
[The homogeneous transformation matrix A_i (1 ≤ i ≤ 6, i a positive integer) is given as an image in the original publication and is not reproduced here.]

where p is the coordinate value of the ankle joint, the knee joint, and the hip joint in the reference coordinate system, a is the z-axis distance between adjacent sub-coordinate systems, d is the x-axis distance between adjacent sub-coordinate systems, α is the angle between the z-axes of adjacent sub-coordinate systems, and θ is the initial angle between the X-axes plus the joint rotation angle to be calculated. For example, for the lower-limb coordinate system shown in fig. 15, the initial values of a, d, and α can be as shown in fig. 16.
The following matrix T is constructed from the coordinates of the human body joints in the reference coordinate system (such as the hip joint coordinate system); it has the general homogeneous-transform form:

T = | m_11 m_12 m_13 p_x |
    | m_21 m_22 m_23 p_y |
    | m_31 m_32 m_33 p_z |
    |  0    0    0    1  |

where each m is a coefficient obtained by multiplying the A matrices, i.e., a product of trigonometric functions; p is the first coordinate; a is the z-axis distance between adjacent sub-coordinate systems; d is the x-axis distance between adjacent sub-coordinate systems; α is the angle between the z-axes of adjacent sub-coordinate systems; and θ is the initial angle plus the rotation angle of the human body joint.
The rotation angle θ can be solved by matching the entries of the T matrix with the corresponding entries of the homogeneous transformation matrix, and the angular velocity of the link is then calculated by differentiation. For example, the angular velocity of the link can be derived from the joint rotation angle by first-order differentiation.
By substituting the detected knee joint data into the T matrix and matching it with the homogeneous transformation matrix, the angular velocity of the lower leg can be calculated.
Further, by substituting the detected hip joint data into the T matrix and matching it with the homogeneous transformation matrix, the angular velocity of the thigh can also be calculated.
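Since the A and T matrices are published only as images, the sketch below uses the textbook Denavit-Hartenberg transform as a stand-in (the application's own a/d conventions may differ), recovers the joint rotation angle from a measured single-joint transform, and differentiates it to obtain the link angular velocity.

import numpy as np

def dh_transform(theta: float, d: float, a: float, alpha: float) -> np.ndarray:
    """Textbook Denavit-Hartenberg transform between adjacent sub-coordinate systems."""
    ct, st = np.cos(theta), np.sin(theta)
    ca, sa = np.cos(alpha), np.sin(alpha)
    return np.array([
        [ct, -st * ca,  st * sa, a * ct],
        [st,  ct * ca, -ct * sa, a * st],
        [0.0,      sa,       ca,      d],
        [0.0,     0.0,      0.0,    1.0],
    ])

def joint_angle(T: np.ndarray) -> float:
    """Recover theta from a single-joint transform of the form above."""
    return float(np.arctan2(T[1, 0], T[0, 0]))

def angular_velocity(theta_prev: float, theta_curr: float, dt: float) -> float:
    """First-order differentiation of the joint rotation angle."""
    return (theta_curr - theta_prev) / dt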
S206: third and fourth values of the knee joint are calculated based on the first data, the second data, the first and second values of the ankle joint, and the angular velocity of the knee joint.
The third and fourth values of the knee joint can be calculated based on the above-obtained calf mass, centroid and moment of inertia, calf centroid speed, knee joint and ankle joint position information, ankle joint force and moment, and the above-calculated calf angular speed. Wherein the third value refers to the joint force of the knee joint, and the fourth data refers to the moment of the knee joint. The third and fourth values of the knee joint can be calculated by the following formula:
[The two equations for the third and fourth values are given as images in the original publication and are not reproduced here.]

where F_3 is the third value, M_4 is the fourth value, m_shank is the calf mass in the first data, v_c,shank is the velocity of the calf centroid in the second data, r_shank is the vector from the calf centroid to the reference point, r_foot is the vector from the foot centroid to the reference point, J_shank is the moment of inertia of the calf, and ω_shank is the angular velocity of the lower leg.
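Because the knee equations are published only as images, the sketch below falls back on a generic Newton-Euler balance of the shank segment; the variable meanings, sign conventions, and the choice of taking moments about the shank centroid are assumptions and may differ from the equations actually used in this application.

import numpy as np

GRAVITY = np.array([0.0, 0.0, -9.81])   # gravitational acceleration (assumed axes)

def knee_load(m_shank: float, a_shank: np.ndarray,
              J_shank: np.ndarray, dw_shank: np.ndarray,
              r_knee: np.ndarray, r_ankle: np.ndarray,
              f_ankle: np.ndarray, m_ankle: np.ndarray):
    """Newton-Euler balance of the shank segment.

    f_ankle, m_ankle: force/moment exerted by the foot on the shank at the ankle.
    r_knee, r_ankle:  vectors from the shank centroid to the knee and ankle joints.
    Returns the force and moment exerted by the thigh on the shank at the knee.
    """
    # Newton: sum of forces on the shank equals mass times centroid acceleration.
    f_knee = m_shank * a_shank - f_ankle - m_shank * GRAVITY
    # Euler about the shank centroid (diagonal inertia given as a 3-vector).
    m_knee = (J_shank * dw_shank - m_ankle
              - np.cross(r_knee, f_knee) - np.cross(r_ankle, f_ankle))
    return f_knee, m_knee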
It will be appreciated that after the third and fourth values of the knee joint are calculated, the fifth and sixth values of the hip joint can be calculated by combining the thigh mass, centroid, and moment of inertia, the thigh centroid velocity, the position information of the knee joint and the hip joint, the joint force and moment of the knee joint, and the thigh angular velocity calculated above. The fifth value refers to the joint force of the hip joint, and the sixth value refers to the moment of the hip joint. The fifth and sixth values of the hip joint can be calculated by the following formulas:
[The two equations for the fifth and sixth values are given as images in the original publication and are not reproduced here.]

where F_5 is the fifth value, M_6 is the sixth value, m_thigh is the thigh mass in the first data, v_c,thigh is the velocity of the thigh centroid in the second data, r_thigh is the vector from the thigh centroid to the reference point, r_shank is the vector from the calf centroid to the reference point, J_thigh is the moment of inertia of the thigh in the first data, and ω_thigh is the angular velocity of the thigh.
It is understood that when the electronic device detects a motion of a user, the detected motion image of the user may be displayed on the screen. Furthermore, the electronic device can also display the joint stress condition of the user, such as the joint stress position and the joint force on the displayed moving image.
In one possible implementation, the electronic device may display a moving image of the user. For example, the electronic device may display a moving image of the user while displaying an action demonstration in the exercise course, or the electronic device may display only a moving image of the user. As shown in fig. 8A, the electronic device displays a user interface 80, and the user interface 80 may display an image 802 of an exemplary motion of a athletic lesson and a detected motion image 801 of the user.
In another possible implementation, when the moving image of the user is displayed, the calculated joint force or moment value can be displayed at the corresponding position of the user in the moving image by overlaying color markers. Specifically, as shown in fig. 8B, a colored circle may be superimposed on the corresponding force-bearing part of the user, and the value of the joint force/moment at that part may be displayed beside the circle. The color may be yellow, green, or red: when the force value is small, yellow is displayed; as the force value increases, the color changes from yellow to green and then to red. It will be appreciated that the shape and color of the superimposed markers are not limited to this. It will also be appreciated that the user image shown in fig. 8B marks the force-bearing parts for clearer illustration; the user image in fig. 8B may be a diagram showing the detected skeletal nodes described above, or may be an actual moving image of the user with the markers applied to it.
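A minimal sketch of this color coding is shown below; the two cut-off values separating yellow, green, and red are not given in the text and are left as parameters.

def force_color(force_value: float, low: float, high: float) -> str:
    """Map a joint force value to the overlay color described above."""
    if force_value < low:
        return "yellow"
    if force_value < high:
        return "green"
    return "red"

# Hypothetical cut-offs: force_color(1800.0, 500.0, 1500.0) -> "red"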
In one possible implementation, the electronic device may display an image of a standard demonstration action in the athletic lesson as shown in fig. 8C when performing the simulated detection of the athletic lesson.
Optionally, the electronic device may detect the ground projection of the center of gravity in real time before or during force detection, so as to determine whether the user's center of gravity deviates from the stable interval. If the user's center of gravity deviates from the stable interval, the user may have an unstable movement posture or may fall, and the electronic device may prompt the user to adjust the center of gravity of the body. The prompt can be a text prompt through a user interface, or a voice prompt to adjust the center of gravity when a deviation from the stable interval is detected during movement.
S107: the electronic equipment judges the situation of the movement risk based on the stress situation of each joint when the target object moves, and outputs risk prompt information.
In a specific embodiment, when the electronic device performs stress detection on the target object, corresponding joint stress data of each joint when the target object moves, for example, at least one of a first numerical value and a second numerical value of an ankle joint, a third numerical value and a fourth numerical value of the knee joint, and a fifth numerical value and a sixth numerical value of the hip joint, may be obtained, the data is compared with reference data, and when a numerical comparison result exceeds a preset threshold, the electronic device may display risk prompt information.
It will be appreciated that the preset threshold may be set according to actual needs, which is not limited in this application.
The joint stress condition may be a joint force value. The reference data for the joint force may be a human joint stress threshold (the BMI force described above) or the force reference value. The joint stress threshold may be estimated by measuring the maximum joint forces of a certain number of users in experiments. The electronic device can judge the risk of sports injury by checking whether the ratio of the joint force to the human joint stress threshold or the load-bearing reference value reaches a preset threshold (for example, 1).
It can be understood that the electronic device can also judge whether a movement risk arises based on the joint moment and output a risk prompt. The electronic device can calculate the accumulated work from the joint moment of the target object, compare the accumulated work with an accumulated work threshold, and judge that there is a risk of sports injury if the ratio of the accumulated work to the accumulated work threshold reaches a preset threshold (for example, 1).
In some embodiments, the electronic device may display the first prompt. As shown in fig. 9A, the electronic device 100 may display the user interface 90 based on the force detection described above. The user interface 90 may display a prompt 901, and the electronic device 100 detects a user operation on the control 903 (e.g., a click operation on the control 903), in response to which the electronic device 100 may display a force detection user interface as shown in fig. 8A or 8B.
In some embodiments, after the first prompt is displayed by the electronic device, a second prompt may be displayed. For example, the electronic device obtains the corresponding joint force value of each part when the user moves based on the force detection, compares the value with the corresponding joint force threshold, and repeatedly executes the force detection if the joint force value is smaller than the joint force threshold; if the joint force value is greater than or equal to the joint stress threshold, the electronic device outputs a first prompt. For example, as shown in fig. 9A and 9C, when the electronic device 100 detects that the joint force value is greater than the joint stress threshold, the user interface 90 may be displayed, and the user interface 90 may display a prompt box 901 to prompt the user that the current exercise action has a higher risk of injury, and may also display the reason why the user has a risk of injury, such as that the left leg of the user is stressed too much. After the electronic device displays the prompt box 901 for more than a preset period of time, the user interface 92 may be displayed, and the user interface 92 may display the prompt box 907 to instruct the user to adjust the motion gesture. The electronic device 100 may detect that the user has selected to return to the original athletic lesson or that the user interface 92 has been displayed for more than a preset period of time, and may display a force detection user interface as shown in fig. 8A or 8B.
In other embodiments, after the electronic device displays the first prompt, a first option and a second option may be displayed. When a user operation on the first option is detected, the electronic device may display a second prompt in response to the operation; when the electronic device detects a user operation on the second option, the electronic device may display a third prompt in response to the operation.
Specifically, as shown in fig. 9A, 9B, and 9C, the electronic device 100 may display the user interface 90 based on the force detection to prompt that the current athletic movement has a higher risk. The electronic device 100 may display the user interface 91 when no user operation on the control 903 is detected within a preset period of time, or when the user interface 90 has been displayed for more than a preset period of time. The electronic device can detect a user operation on the control 905 (e.g., a click operation on control 905), and in response the electronic device 100 can display the user interface 92. The user interface 92 may include a prompt 907, and the prompt 907 may display an adjustment scheme for the athletic movement that presents a higher risk. The electronic device 100 may detect that the user interface 92 has been displayed for more than a preset period of time and may then display a force detection user interface as shown in fig. 8A or 8B.
In a specific embodiment, as shown in fig. 9A, 9B, and 9D, the electronic device 100 may display the user interface 90 based on the force detection, prompting that the current athletic movement has a higher risk. The electronic device 100 may display the user interface 91 when no user operation on the control 903 is detected within a preset period of time, or when the user interface 90 has been displayed for more than a preset period of time. The electronic device can detect a user operation on the control 906 (e.g., a click operation on control 906), and in response the electronic device 100 can display the user interface 93. The electronic device 100 can detect a user operation on control 908 in the user interface 93 (e.g., a click operation on control 908), and in response the electronic device 100 can display a force detection user interface as shown in fig. 8A or 8B, or the user interface 92. The electronic device 100 may also detect a user operation on the control 909 (e.g., a click operation on control 909), and in response the electronic device 100 may switch to a recommended athletic lesson selected by the user.
In the case of performing simulation detection on an exercise course, the electronic device may output a risk prompt when it detects that an exercise action poses a certain risk of injury to the user; the electronic device can also output a risk prompt for the exercise course after the simulation detection of the exercise course is completed. For example, as shown in fig. 9A-9D, the electronic device 100 may detect that an exercise action in the exercise course has a higher risk of injury to the user and output a risk prompt that the exercise action has a higher risk; the specific steps are not described again. For another example, as shown in fig. 9E, after the detection of the exercise course is completed, the electronic device 100 displays the user interface 94, which is used to prompt that the simulation detection is completed and may also display the injury risk level of the exercise course, the exercise actions with higher risk, and so on.
In some embodiments, the electronic device obtains the joint force value of each part while the target object moves based on the force detection and can compare the value with the force reference value. If the ratio of the joint force value to the joint force threshold is less than 0.6, the motion is a low-risk motion, and the electronic device continues the force detection; if the ratio of the joint force value to the joint force threshold is between 0.6 and 0.9, the motion is a medium-risk motion, and the electronic device can continue the force detection while also outputting guidance for adjusting the motion; if the ratio of the joint force value to the joint force threshold is greater than 0.9, the motion is a high-risk motion, and the electronic device can output risk prompt information.
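A sketch of these three risk tiers is shown below, using the 0.6 and 0.9 ratio cut-offs from the text.

def risk_level(joint_force: float, joint_force_threshold: float) -> str:
    """Classify the motion by the ratio of joint force to the joint-force threshold."""
    ratio = joint_force / joint_force_threshold
    if ratio < 0.6:
        return "low"      # continue force detection
    if ratio <= 0.9:
        return "medium"   # continue detection and output guidance to adjust the motion
    return "high"         # output risk prompt information

# e.g. risk_level(1900.0, 2000.0) -> "high"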
Fig. 17 shows a flow of another method of motion analysis, which is a simulated detection of joint forces/moments for a motion session. As shown in fig. 17, the method may include:
s301: the electronic equipment receives user operation of a user aiming at a first application, wherein the user operation is used for indicating the electronic equipment to acquire identity evaluation information of the user.
The step S101 may be referred to by the electronic device receiving a user operation of the user for the first application and the electronic device obtaining the identity evaluation information of the user.
S302: the electronic equipment acquires identity evaluation information of the user.
S302 may refer to the related description in step S102 described above.
S303: the electronic device detects the position of a skeletal node in the image of the standard demonstration action in the athletic lesson to obtain a spatial positional relationship of the skeletal node.
S303 may refer to the related description of step S103 described above.
S304: the electronic equipment performs space posture calibration on the images of standard demonstration actions in the exercise course, and a reference coordinate system is constructed.
S304 may refer to the related description of step S104 described above.
S305: the electronic equipment performs stress analysis based on the skeleton node and combined with the images of standard demonstration actions in the exercise course to obtain stress conditions.
In this step, the electronic device needs to set the ground reference value by itself; the ground reference value is used to determine whether the image of the standard demonstration action in the exercise course is in an off-ground state.
For this state, refer to the related description in step S105 above.
It can be appreciated that the electronic device can obtain corresponding height and weight information for the images of the standard demonstration actions in the exercise course, and can obtain the mass, centroid position, moment of inertia, and the like of the body segments between the skeletal nodes by combining the height and weight information with the skeletal node positions. Refer specifically to the description related to step S106.
S306: the electronic equipment judges the situation of the exercise risk based on the stress situation of each joint in the image of the standard demonstration action in the exercise course, and outputs risk prompt information.
S306 may refer to the related description in step S107 described above.
The above embodiments are merely for illustrating the technical solution of the present application, and not for limiting it. Although the present application has been described in detail with reference to the foregoing embodiments, it should be understood by those of ordinary skill in the art that the technical solutions described in the foregoing embodiments can still be modified, or some of their technical features can be replaced by equivalents; such modifications and substitutions do not cause the essence of the corresponding technical solutions to depart from the scope of the technical solutions of the embodiments of the present application.

Claims (23)

1. A method of motion analysis, comprising:
acquiring first data of a target object according to body parameters of the target object; the first data comprise the mass, mass center and rotational inertia of a human body link;
acquiring second data of the target object and a ground-leaving state of the foot; the second data comprise the movement speed, the angular speed and the human joint position information of the centroid of the human link;
Calculating a first value and a second value of the ankle joint based on the ground clearance state;
constructing a first coordinate system based on the motion gesture of the target object, wherein the first coordinate system is used for constructing a homogeneous transformation matrix and acquiring a first coordinate of the human joint in the first coordinate system;
calculating the angular velocity of the lower leg according to the first coordinate and the first data based on the homogeneous transformation matrix;
third and fourth values of the knee joint are calculated based on the first data, the second data, the first and second values of the ankle joint, and the angular velocity of the lower leg.
2. The method according to claim 1, wherein the method further comprises:
calculating the angular velocity of the thigh according to the first coordinate and the first data based on the homogeneous transformation matrix;
and calculating a fifth value and a sixth value of the hip joint based on the first data, the second data, the first value and the second value of the ankle joint, the third value and the fourth value of the knee joint, and the angular velocity of the thigh.
3. The method of claim 1 or 2, wherein calculating the first and second values of the ankle joint based on the ground clearance state comprises:
When the ground-leaving state is a first state, the first value and the second value of the ankle joint are both 0;
when the ground leaving state is a second state, calculating a first value and a second value of the ankle joint according to the first data and the second data;
when the ground leaving state is a third state, calculating a first value and a second value of the ankle joint according to the first data and the second data; the second data are position information of the human body joints, the second coordinates are barycentric projection coordinates of the target object, the third coordinates and the fourth coordinates are ankle joint coordinates, and the second coordinates, the third coordinates and the fourth coordinates are obtained according to the second data.
4. The method of claim 3, wherein the calculating the first and second values of the ankle joint from the first and second data comprises: the first and second values are calculated by the following formula,
F_1 + F_2 = Σ m_i Δv_ci + G

M_1 + M_2 = Σ (J_i Δω_i - r_i × m_i g)

wherein: F_1 and F_2 are the first values of the ankle joint, M_1 and M_2 are the second values of the ankle joint, m_i is the mass of the human body link, v_ci is the movement speed of the centroid of the human body link, G is the weight of the user calculated according to the body parameters, J_i is the moment of inertia of the human body link, ω_i is the angular velocity of the centroid of the human body link, r_i is the vector from the centroid of the human body link to the reference point, and g is the gravitational acceleration.
5. The method of claim 3, wherein the calculating the first and second values of the ankle joint based on the first data, the second data; the second data is position information of the human body joint, the second coordinate is a gravity center projection coordinate of the target object, the third coordinate and the fourth coordinate are ankle joint coordinates, and the second coordinate, the third coordinate and the fourth coordinate are obtained according to the second data, and the method comprises the following steps: the first and second values are calculated by the following formula,
F_1 + F_2 = Σ m_i Δv_ci + G

M_1 + M_2 = Σ (J_i Δω_i - r_i × m_i g)

[The four further equations that distribute F_1 + F_2 and M_1 + M_2 between the left and right ankle joints according to P_proj, P_1, and P_2 are given as images in the original publication and are not reproduced here.]

wherein: F_1 and F_2 are the first values of the ankle joint, M_1 and M_2 are the second values of the ankle joint, m_i is the mass of the human body link, v_ci is the movement speed of the centroid of the human body link, G is the weight of the user calculated according to the body parameters, J_i is the moment of inertia of the human body link, ω_i is the angular velocity of the centroid of the human body link, r_i is the vector from the centroid of the human body link to the reference point, P_proj is the second coordinate, P_1 is the third coordinate, and P_2 is the fourth coordinate.
6. The method of any of claims 1-5, wherein calculating the angular velocity of the lower leg from the first coordinates and the first data based on the homogeneous transformation matrix comprises: calculating the rotation angle of the human body joint by corresponding the first coordinate to the following formula, calculating the angular velocity of the lower leg based on the rotation angle of the human body joint,
[The homogeneous transformation matrix and the T matrix referred to in this claim are given as images in the original publication and are not reproduced here.]

wherein m is a coefficient, p is the first coordinate, a, d, and α are known distances or angles in the first coordinate system, and θ is the initial angle plus the rotation angle of the human body joint.
7. The method of claim 6, wherein the calculating third and fourth values of the knee joint based on the first data, the second data, the first and second values of the ankle joint, and the angular velocity of the lower leg comprises: the third and fourth values are calculated by the following formula,
[The two equations for the third and fourth values are given as images in the original publication and are not reproduced here.]

wherein F_3 is the third value, M_4 is the fourth value, m_shank is the calf mass in the first data, v_c,shank is the velocity of the calf centroid in the second data, r_shank is the vector from the calf centroid to the reference point obtained from the first data and the second data, r_foot is the vector from the foot centroid to the reference point obtained from the first data and the second data, J_shank is the moment of inertia of the calf in the first data, and ω_shank is the angular velocity of the lower leg.
8. The method of claim 7, wherein the calculating the angular velocity of the thigh from the first coordinate and the first data based on the homogeneous transformation matrix comprises: calculating the rotation angle of the human body joint by using the first coordinate corresponding to the following formula, calculating the angular velocity of the thigh based on the rotation angle of the human body joint,
(The homogeneous transformation formula and the rotation-angle formula are reproduced only as equation images FDA0003329381110000031 and FDA0003329381110000032 in the original document.)
wherein m is a coefficient, p is the first coordinate, a, d and α are known distances or angles in the first coordinate system, and θ is the sum of an initial included angle and the rotation angle of the human body joint.
9. The method of claim 8, wherein the calculating of the fifth value and the sixth value of the hip joint based on the first data, the second data, the first and second values of the ankle joint, the third and fourth values of the knee joint, and the angular velocity of the thigh comprises: calculating the fifth value and the sixth value by the following formulas,
(The formulas for the fifth value F5 and the sixth value M6 are reproduced only as equation images FDA0003329381110000033 and FDA0003329381110000034 in the original document.)
wherein: F5 is the fifth value, M6 is the sixth value, m_thigh is the mass of the thigh in the first data, v_c,thigh is the velocity of the centroid of the thigh obtained from the second data, r_thigh is the vector from the centroid of the thigh to the reference point obtained from the first data and the second data, r_shank is the vector from the centroid of the lower leg to the reference point obtained from the first data and the second data, J_thigh is the moment of inertia of the thigh in the first data, and ω_thigh is the angular velocity of the thigh.
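Continuing the sketch shown after claim 7, the hip values F5 and M6 would follow by applying the same newton_euler_step one segment higher, feeding in the knee results with opposite sign (action and reaction at the knee). The numeric values below are placeholders purely for illustration; nothing here is taken from the patent.

import numpy as np

# Illustrative dummy numbers only; reuses newton_euler_step from the sketch after claim 7.
dt = 1 / 30
v_c_thigh = np.array([[0.10, 0.02], [0.12, 0.01]])  # thigh centroid velocity, two frames
omega_thigh = np.array([0.8, 0.9])                  # thigh angular velocity, two frames
r_to_hip = np.array([0.0, 0.22])                    # thigh centroid -> hip joint
r_to_knee = np.array([0.0, -0.21])                  # thigh centroid -> knee joint
F3, M4 = np.array([30.0, 410.0]), 25.0              # knee values from the previous sketch

F5, M6 = newton_euler_step(m=7.5, J=0.12,
                           v_c=v_c_thigh, omega=omega_thigh,
                           r_prox=r_to_hip, r_dist=r_to_knee,
                           F_dist=-F3, M_dist=-M4, dt=dt)  # knee reaction enters with flipped sign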
10. The method of any one of claims 1-9, wherein the first coordinate system comprises: a reference sub-coordinate system, a first sub-coordinate system, and a second sub-coordinate system;
the homogeneous transformation matrix is constructed based on the relationships among the reference sub-coordinate system, the first sub-coordinate system and the second sub-coordinate system, and these relationships comprise the distances and angles between their coordinate axes;
the first coordinates are coordinates of the human joint in the reference sub-coordinate system.
11. The method of any one of claims 1-10, wherein obtaining the ground clearance of the foot comprises:
displaying a first user interface, wherein the first user interface is used for displaying the setting of the ground clearance reference value, and the ground clearance reference value is used for judging the ground clearance state of the foot; and
receiving a setting operation for the ground clearance reference value.
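A small sketch of how a ground clearance reference value set on such an interface might be used to judge the ground clearance state of the foot; the variable names, the height comparison and the tolerance are assumptions, not the claim's wording.

def foot_off_ground(foot_height_m, ground_clearance_reference_m, margin_m=0.02):
    # The foot is judged to be off the ground once its measured height exceeds
    # the user-set reference value by a small tolerance (the tolerance is an
    # assumption, not part of the claim).
    return foot_height_m > ground_clearance_reference_m + margin_m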
12. The method according to any one of claims 1-11, further comprising:
displaying a second user interface, wherein the second user interface displays a first image of the target object, and a first region and a first identifier are superimposed on the first image; the first region is the region of the human body joint on the first image, and the first identifier is at least one of the first value and the second value of the ankle joint, the third value and the fourth value of the knee joint, and the fifth value and the sixth value of the hip joint.
13. The method of any one of claims 2-12, further comprising, after calculating the fifth and sixth values of the hip joint:
judging whether or not a risk of movement is generated based on at least one of the first and second values of the ankle joint, the third and fourth values of the knee joint, and the fifth and sixth values of the hip joint.
14. The method of claim 13, wherein said determining whether a risk of movement is generated comprises:
comparing, with a first threshold, the ratio of at least one of the first and second values of the ankle joint, the third and fourth values of the knee joint, and the fifth and sixth values of the hip joint to a first reference value; and
if the ratio is greater than the first threshold, outputting risk prompt information.
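A minimal sketch of the ratio test of claims 14 to 16, assuming an "any value exceeds" policy and the names below; the patent does not prescribe this exact implementation.

def risk_prompt_needed(joint_values, first_reference_value, first_threshold):
    # joint_values: any of the computed quantities, e.g. [F1, F2, F3, M4, F5, M6].
    # Returns True when the ratio of any value to the first reference value
    # (e.g. a joint stress threshold) exceeds the first threshold, i.e. when
    # risk prompt information should be output.
    return any(abs(v) / first_reference_value > first_threshold for v in joint_values)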
15. The method of claim 14, wherein outputting risk cues comprises:
outputting a first prompt;
or
outputting a first prompt, wherein the first prompt comprises a first option, receiving a second operation on the first option, and outputting a second prompt.
16. The method of claim 14 or 15, wherein the first reference value is the human joint stress threshold.
17. The method according to any one of claims 1-16, further comprising, prior to acquiring the first data of the target object according to the body parameters of the target object:
performing a body measurement assessment on the user, wherein the body measurement assessment comprises assessing a body state, and the body state comprises a damaged site and a degree of damage of the damaged site.
18. The method of claim 17, wherein the assessing of the body state comprises:
detecting that a first part is placed at the damaged site of the user, and detecting the time for which the first part is placed at the damaged site, wherein the first part is a body part of the user; and
determining the degree of damage according to the time.
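A minimal sketch of mapping the detected time to a degree of damage, assuming a simple three-level scale with illustrative cut-offs; the claim itself does not specify the mapping.

def damage_degree_from_time(dwell_seconds):
    # Longer placement of the first part (a body part of the user) on the
    # damaged site is read as a higher degree of damage; the cut-offs below
    # are illustrative placeholders, not values from the patent.
    if dwell_seconds < 3:
        return "mild"
    if dwell_seconds < 8:
        return "moderate"
    return "severe"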
19. The method of claim 17 or 18, wherein the first reference value is a load bearing reference value, the load bearing reference value being adjusted based on the body measurement assessment.
20. The method of any one of claims 1-19, wherein the target object is a moving image of the user or a selected exercise course.
21. A motion analysis device comprising means for performing the method of any one of claims 1 to 20.
22. An electronic device, comprising a touch screen, a memory, one or more processors, a plurality of applications, and one or more programs, wherein the one or more programs are stored in the memory, and wherein the one or more processors, when executing the one or more programs, cause the electronic device to implement the method of any one of claims 1-20.
23. A computer storage medium comprising computer instructions which, when run on an electronic device, cause the electronic device to perform the method of any of claims 1-20.
CN202111276279.2A 2021-10-29 2021-10-29 Motion analysis method, motion analysis device, electronic equipment and computer storage medium Pending CN116072291A (en)

Priority Applications (2)

Application Number Priority Date Filing Date Title
CN202111276279.2A CN116072291A (en) 2021-10-29 2021-10-29 Motion analysis method, motion analysis device, electronic equipment and computer storage medium
PCT/CN2022/127953 WO2023072195A1 (en) 2021-10-29 2022-10-27 Athletic analysis method and apparatus, and electronic device and computer storage medium

Publications (1)

Publication Number Publication Date
CN116072291A true CN116072291A (en) 2023-05-05

Family

ID=86160493

Country Status (2)

Country Link
CN (1) CN116072291A (en)
WO (1) WO2023072195A1 (en)


Also Published As

Publication number Publication date
WO2023072195A1 (en) 2023-05-04


Legal Events

Date Code Title Description
PB01 Publication
SE01 Entry into force of request for substantive examination