WO2023072195A1 - Athletic analysis method and apparatus, and electronic device and computer storage medium - Google Patents


Info

Publication number
WO2023072195A1
WO2023072195A1 · Application PCT/CN2022/127953
Authority
WO
WIPO (PCT)
Prior art keywords: value, data, user, electronic device, joint
Application number
PCT/CN2022/127953
Other languages: French (fr), Chinese (zh)
Inventors: Sun Yu (孙宇), Liu Hang (刘航), Xu Teng (徐腾), Chen Xiaohan (陈霄汉)
Original assignee: Huawei Technologies Co., Ltd. (华为技术有限公司)
Application filed by Huawei Technologies Co., Ltd. (华为技术有限公司)
Publication of WO2023072195A1

Classifications

    • A61B 5/00 Measuring for diagnostic purposes; Identification of persons
    • A61B 5/103 Detecting, measuring or recording devices for testing the shape, pattern, colour, size or movement of the body or parts thereof, for diagnostic purposes
    • A61B 5/11 Measuring movement of the entire body or parts thereof, e.g. head or hand tremor, mobility of a limb
    • A61B 5/1118 Determining activity level
    • A61B 5/1121 Determining geometric values, e.g. centre of rotation or angular range of movement
    • A61B 5/1122 Determining geometric values of movement trajectories
    • A61B 5/4528 Joints
    • A61B 5/4538 Evaluating a particular part of the musculoskeletal system or a particular medical condition
    • A61B 5/4585 Evaluating the knee
    • A61B 5/4595 Evaluating the ankle
    • A61B 5/72 Signal processing specially adapted for physiological signals or for diagnostic purposes
    • G06F 17/16 Matrix or vector computation, e.g. matrix-matrix or matrix-vector multiplication, matrix factorization
    • G06T 7/0002 Inspection of images, e.g. flaw detection
    • G06T 7/0012 Biomedical image inspection
    • G16H 20/30 ICT specially adapted for therapies or health-improving plans relating to physical therapies or activities, e.g. physiotherapy, acupressure or exercising
    • G16H 50/30 ICT specially adapted for medical diagnosis, medical simulation or medical data mining; for calculating health indices or individual health risk assessment
    • G06T 2207/30008 Bone (biomedical image processing)
    • G06T 2207/30196 Human being; Person
    • Y02T 90/00 Enabling technologies or technologies with a potential or indirect contribution to GHG emissions mitigation

Definitions

  • the present application relates to the field of motion analysis, and in particular to a motion analysis method and apparatus, an electronic device, and a computer storage medium.
  • exercise is an important part of daily human life and requires the cooperation of different joints of the human body. During exercise, people may suffer sports injuries of varying severity due to non-standard exercise posture or excessive joint loading.
  • the embodiments of the present application disclose a motion analysis method and apparatus, an electronic device, and a computer storage medium, which calculate joint loads in a relatively simple way so as to provide early warning of injury risk.
  • the embodiment of the present application provides a motion analysis method, which includes:
  • the first data includes the mass, center of mass, and moment of inertia of the human body link;
  • the second data includes the movement velocity and angular velocity of the center of mass of the human body link and the position information of the human body joints;
  • a first coordinate system is constructed based on the movement posture of the target object; the first coordinate system is used to construct a homogeneous transformation matrix and to obtain the first coordinates of the human joints in the first coordinate system;
  • the joint force and moment of the target object's ankle joint are calculated according to the momentum theorem and the moment-of-momentum theorem by obtaining the human body inertial parameters and motion posture data of the target object and combining them with the ground-off state; the joint force and moment of the knee joint are then calculated from the joint force and moment of the ankle joint.
  • compared with existing solutions, which must detect the ankle joint force with professional motion analysis equipment, calculate the ankle moment from that force, and then calculate the knee moment, this application can calculate the ankle joint force and moment directly from the ground-off state of the target object and then calculate the knee joint force and moment.
  • the calculation method is relatively simple, and no professional motion analysis equipment is needed, which reduces cost.
  • the method further includes: calculating the angular velocity of the thigh according to the first coordinates and the first data based on the homogeneous transformation matrix; and calculating the fifth value and the sixth value of the hip joint based on the first data, the second data, the first value and the second value of the ankle joint, the third value and the fourth value of the knee joint, and the angular velocity of the thigh.
  • in this way, the application can also calculate the joint force and moment of the hip joint by combining the angular velocity of the thigh with the human body inertial parameters and motion posture data.
  • calculating the first value and the second value of the ankle joint based on the ground-off state includes: when the ground-off state is the first state, both the first value and the second value of the ankle joint are 0; when the ground-off state is the second state, the first value and the second value of the ankle joint are calculated according to the first data and the second data; and when the ground-off state is the third state, the first value and the second value of the ankle joint are also calculated according to the first data and the second data.
  • in this case the second data is the position information of the human joints; the second coordinates are the projected coordinates of the center of gravity of the target object, and the third coordinates and the fourth coordinates are the coordinates of the ankle joints; the second, third, and fourth coordinates are obtained from the second data.
  • This application divides the ground-off state into three situations (one foot touching the ground, both feet touching the ground, and both feet in the air) and calculates the joint force and moment of the ankle joint according to the detected state, using different data in each case, so that the ankle joint force and moment can be obtained quickly and simply in every state.
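The three-way dispatch described above can be sketched as follows. The mapping of the claim labels ("first"/"second"/"third" state) to foot configurations and the per-state solver signatures are assumptions for illustration; the application's actual formulas are not reproduced here.

```python
import numpy as np

def single_support_solution(first_data, second_data):
    # placeholder for the momentum / moment-of-momentum formula
    raise NotImplementedError

def double_support_solution(first_data, second_data):
    # placeholder: additionally uses P_proj, P_1, P_2
    raise NotImplementedError

def ankle_force_moment(state, first_data, second_data):
    # Dispatch on the ground-off state; per the claims, in the first
    # state (taken here to be both feet in the air) the ankle joint
    # force and moment are both 0.
    if state == "first":
        return np.zeros(3), np.zeros(3)
    if state == "second":
        return single_support_solution(first_data, second_data)
    if state == "third":
        return double_support_solution(first_data, second_data)
    raise ValueError(f"unknown ground-off state: {state!r}")
```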
  • calculating the first value and the second value of the ankle joint according to the first data and the second data includes calculating the first value and the second value by the following formula, in which:
  • F1, F2 are the first values (joint forces) of the ankle joint;
  • M1, M2 are the second values (joint moments) of the ankle joint;
  • m_i is the mass of the human body link;
  • v_ci is the movement velocity of the center of mass of the human body link;
  • G is the weight of the user calculated from the body parameters;
  • J_i is the moment of inertia of the human body link;
  • ω_i is the angular velocity of the center of mass of the human body link;
  • r_i is the vector from the center of mass of the human body link to the reference point;
  • g is the acceleration due to gravity.
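The formula itself is not reproduced in this text extract. Given the quantities listed above and the momentum and moment-of-momentum theorems the application invokes, a plausible form (an editorial reconstruction, not the patent's verbatim equation) is:

```latex
\begin{aligned}
F_1 &= \sum_i m_i\,\dot{v}_{ci} \;-\; G, \qquad G = \sum_i m_i\, g,\\
M_1 &= \sum_i \left( J_i\,\dot{\omega}_i + r_i \times m_i \dot{v}_{ci} \right) \;-\; \sum_i r_i \times m_i\, g,
\end{aligned}
```

i.e. the reaction at the stance ankle balances the rate of change of whole-body linear momentum minus gravity, and the ankle moment balances the rate of change of the moment of momentum about the reference point minus the moment of gravity.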
  • this application addresses the ground-off state in which one foot touches the ground.
  • the joint force and moment of the ankle joint of the foot in the air are 0.
  • the joint force and moment of the ankle joint of the foot in contact with the ground can be calculated according to the above formula by combining the human body inertial parameters with the motion data of the human joints.
  • when the ground-off state is the third state, the first value and the second value of the ankle joint are calculated according to the first data and the second data; the second data is the position information of the human joints; the second coordinates are the projected coordinates of the center of gravity of the target object, and the third coordinates and the fourth coordinates are the coordinates of the two ankle joints; the second, third, and fourth coordinates are obtained from the second data. The calculation includes computing the first value and the second value by the following formula, in which:
  • F1, F2 are the first values of the ankle joints;
  • M1, M2 are the second values of the ankle joints;
  • m_i is the mass of the human body link;
  • v_ci is the movement velocity of the center of mass of the human body link;
  • G is the weight of the user calculated from the body parameters;
  • J_i is the moment of inertia of the human body link;
  • ω_i is the angular velocity of the center of mass of the human body link;
  • r_i is the vector from the center of mass of the human body link to the reference point;
  • P_proj is the second coordinate;
  • P_1 is the third coordinate;
  • P_2 is the fourth coordinate.
  • this application addresses the ground-off state in which both feet touch the ground; joint forces and moments can be calculated according to the above formula by combining the human body inertial parameters, the human joint motion data, and the joint node coordinates of the ankle joints.
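How the double-support formula splits the totals between the two ankles is not shown in this extract. A common scheme, consistent with its use of P_proj, P_1 and P_2, weights each foot by the nearness of the center-of-gravity projection; the function below is an illustrative assumption, not the patent's formula.

```python
import numpy as np

def split_ground_reaction(F_total, M_total, p_proj, p1, p2):
    # Weight each foot by the distance of the centre-of-gravity
    # projection (p_proj) to the *other* ankle, so the foot nearer
    # the projection carries the larger share of the load.
    d1 = np.linalg.norm(p_proj - p1)
    d2 = np.linalg.norm(p_proj - p2)
    w1 = d2 / (d1 + d2)
    w2 = d1 / (d1 + d2)
    return w1 * F_total, w2 * F_total, w1 * M_total, w2 * M_total
```

With the projection midway between the ankles, both feet carry half of the total force and moment.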
  • calculating the angular velocity of the calf according to the first coordinates and the first data includes: substituting the first coordinates into the following formula to calculate the rotation angle of the human joint, and calculating the angular velocity of the calf from that rotation angle, in which:
  • m is a coefficient;
  • p is the first coordinate;
  • a, d, and α are known distances or angles in the first coordinate system;
  • θ is the initial included angle plus the rotation angle of the human joint.
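The angle formula itself (in terms of the parameters a, d, α, θ) is not reproduced in this extract. For illustration only, a planar segment angle can be recovered from two joint coordinates and differentiated numerically to get an angular velocity; function names and the planar simplification are assumptions.

```python
import numpy as np

def joint_angle_from_coords(p_proximal, p_distal):
    # Recover a planar rotation angle of a limb segment from the
    # coordinates of its two end joints (illustrative simplification).
    dx = p_distal[0] - p_proximal[0]
    dy = p_distal[1] - p_proximal[1]
    return np.arctan2(dy, dx)

def angular_velocity(theta_prev, theta_curr, dt):
    # Finite-difference estimate of the segment angular velocity.
    return (theta_curr - theta_prev) / dt
```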
  • calculating the third value and the fourth value of the knee joint based on the first data, the second data, the first value and the second value of the ankle joint, and the angular velocity of the calf includes: calculating the third value and the fourth value by the following formula, in which:
  • F3 is the third value;
  • M4 is the fourth value;
  • m_shank is the calf mass in the first data;
  • r_shank is the vector from the center of mass of the calf to the reference point, obtained from the first data and the second data;
  • r_foot is the vector from the center of mass of the foot to the reference point, obtained from the first data and the second data;
  • J_shank is the moment of inertia of the calf in the first data, and the angular velocity of the calf also enters the formula.
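The knee computation can be read as a Newton-Euler recursion over the shank segment. The sketch below uses assumed sign conventions and argument names and is not the patent's verbatim equation: the knee force balances the segment's inertia, gravity, and the reaction transmitted from the ankle.

```python
import numpy as np

def knee_force_moment(m_shank, a_shank_com, F_ankle, M_ankle,
                      r_shank, r_foot, J_shank, alpha_shank,
                      g=np.array([0.0, 0.0, -9.81])):
    # Force balance for the shank: knee force = inertial term
    # + ankle reaction - gravity on the segment.
    F_knee = m_shank * a_shank_com + F_ankle - m_shank * g
    # Moment balance about the reference point, with lever arms
    # r_shank (shank centre of mass) and r_foot (ankle reaction point).
    M_knee = (J_shank * alpha_shank + M_ankle
              + np.cross(r_foot, F_ankle)
              - np.cross(r_shank, m_shank * g))
    return F_knee, M_knee
```

In the static case with no ankle reaction, the knee must simply hold the shank's weight.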
  • calculating the angular velocity of the thigh according to the first coordinates and the first data based on the homogeneous transformation matrix includes: substituting the first coordinates into the following formula to calculate the rotation angle of the human joint, and calculating the angular velocity of the thigh from that rotation angle, in which:
  • m is a coefficient;
  • p is the first coordinate;
  • a, d, and α are known distances or angles in the first coordinate system;
  • θ is the initial included angle plus the rotation angle of the human joint.
  • calculating the fifth value and the sixth value of the hip joint based on the first data, the second data, the first value and the second value of the ankle joint, the third value and the fourth value of the knee joint, and the angular velocity of the thigh includes: calculating the fifth value and the sixth value by the following formula, in which:
  • F5 is the fifth value;
  • M6 is the sixth value;
  • m_thigh is the thigh mass in the first data;
  • r_thigh is the vector from the center of mass of the thigh to the reference point, obtained from the first data and the second data;
  • r_shank is the vector from the center of mass of the calf to the reference point, obtained from the first data and the second data;
  • J_thigh is the moment of inertia of the thigh in the first data, and the angular velocity of the thigh also enters the formula.
  • the first coordinate system includes: a reference sub-coordinate system, a first sub-coordinate system, and a second sub-coordinate system;
  • the homogeneous transformation matrix is constructed based on the relationships among the reference sub-coordinate system, the first sub-coordinate system, and the second sub-coordinate system; these relationships include the distances and angles between the coordinate axes;
  • the first coordinates are coordinates of the human body joints in the reference sub-coordinate system.
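The patent only states that the homogeneous transformation matrix is built from the distances and angles between the axes of the sub-coordinate systems. Assuming a standard Denavit-Hartenberg convention (which uses exactly such parameters a, α, d, θ), the matrix can be formed as below; the convention choice is an assumption.

```python
import numpy as np

def dh_transform(a, alpha, d, theta):
    # Homogeneous transform between two sub-coordinate systems using
    # the classic Denavit-Hartenberg parameters: link length a, link
    # twist alpha, link offset d, joint angle theta.
    ct, st = np.cos(theta), np.sin(theta)
    ca, sa = np.cos(alpha), np.sin(alpha)
    return np.array([
        [ct, -st * ca,  st * sa, a * ct],
        [st,  ct * ca, -ct * sa, a * st],
        [0.0,      sa,       ca,      d],
        [0.0,     0.0,      0.0,    1.0],
    ])
```

With all four parameters zero the transform reduces to the identity, as expected.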
  • acquiring the ground lift state of the foot includes: displaying a first user interface used to display the setting of the ground lift reference value, where the ground lift reference value is used to judge the ground lift state of the foot, and receiving a setting operation for the ground lift reference value. By obtaining the setting of the ground lift reference value, the ground lift state of the foot can be judged.
  • the method further includes: displaying a second user interface, where the second user interface displays a first image of the target object with a first area and a first identifier superimposed on it; the first area is the area of the human joint on the first image, and the first identifier is at least one of the first value and the second value of the ankle joint, the third value and the fourth value of the knee joint, and the fifth value and the sixth value of the hip joint.
  • in this way, the joint forces and moments of the human joints can be displayed on the moving image, showing the loading of the target object more intuitively.
  • after calculating the fifth value and the sixth value of the hip joint, the method further includes: determining whether a movement risk occurs based on at least one of the first value and the second value of the ankle joint, the third value and the fourth value of the knee joint, and the fifth value and the sixth value of the hip joint.
  • judging whether a movement risk occurs includes: computing the ratio of at least one of the first value and the second value of the ankle joint, the third value and the fourth value of the knee joint, and the fifth value and the sixth value of the hip joint to a first reference value, and comparing that ratio with a first threshold; if the ratio is greater than the first threshold, risk warning information is output.
  • outputting the risk warning information includes: outputting a first prompt; or outputting a first prompt that includes a first option, receiving a second operation acting on the first option, and outputting a second prompt. In this way, the method can output the degree of injury risk for human joints and can also output a guidance plan for actions or exercise courses that carry injury risk, which helps users adjust their exercise posture and reduces the possibility of injury.
  • the first reference value is the human body joint stress threshold.
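The threshold test described above (ratio of each joint load to its reference value, compared against a threshold) can be sketched as follows; the joint names and dict-based interface are illustrative, not from the patent.

```python
def check_injury_risk(joint_values, reference_values, threshold):
    # Divide each joint force/moment by its load-bearing reference
    # value; any ratio above the threshold triggers a risk warning.
    warnings = []
    for joint, value in joint_values.items():
        ratio = abs(value) / reference_values[joint]
        if ratio > threshold:
            warnings.append((joint, ratio))
    return warnings
```

For example, a knee load of 900 against a reference of 600 gives a ratio of 1.5, which exceeds a threshold of 1.2 and would produce a warning.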
  • before obtaining the first data of the target object according to the physical parameters of the target object, the method further includes: performing an anthropometric assessment of the user; the anthropometric assessment includes assessing the physical state, and the physical state includes the injury site and the degree of injury at that site.
  • evaluating the physical state includes: detecting that a first body part is placed on the injured part of the user, detecting how long the first body part remains on the injured part, and determining the degree of injury based on that time.
  • the first reference value is a load-bearing reference value, and the load-bearing reference value is adjusted according to the anthropometric assessment. The load-bearing reference value can be adjusted according to the assessment information and dynamically re-adjusted whenever that information changes, so that it better reflects the user's own condition and injury risk can be reduced more accurately.
  • the target object is the user or a motion image in a selected exercise course.
  • when the target object is the user, the user's movement can be analyzed in real time.
  • when the target object is a motion image in a selected exercise course, detection can be simulated on that image to analyze the exercise in the course.
  • risk warnings about the selected exercise course can then be output to determine whether the exercise in the course is suitable for the user.
  • the present application provides a motion analysis device, including a unit for performing the method described in the first aspect above.
  • the present application provides an electronic device, including a touch screen, a memory, one or more processors, multiple application programs, and one or more programs, wherein the one or more programs are stored in the memory, and when the one or more processors execute the one or more programs, the electronic device implements the method described in the first aspect above.
  • the present application provides a computer storage medium, which is characterized by comprising computer instructions, and when the computer instructions are run on an electronic device, the electronic device is made to execute the method described in the first aspect above.
  • the embodiments of the present application calculate the force on the ankle joint by distinguishing between different ground-off states, and then obtain the forces on the knee joint and hip joint by establishing a coordinate system on the lower limbs, so that joint loads can be calculated in a relatively simple way to judge the risk of sports injury.
  • FIG. 1 is a schematic structural diagram of an electronic device provided in an embodiment of the present application.
  • FIG. 2 is a software structural block diagram of an electronic device provided by an embodiment of the present application.
  • FIG. 3A is a schematic diagram of a user interface for an application program menu on an electronic device provided by an embodiment of the present application.
  • FIGS. 3B-3E are schematic diagrams of a scenario involved in this application.
  • FIG. 3F is a schematic diagram of a scene involved in this application.
  • FIGS. 3G-3H are schematic diagrams of another scene of the present application.
  • FIGS. 4A-4D are a set of schematic interface diagrams provided by an embodiment of the present application.
  • FIGS. 4E-4I are another set of schematic interface diagrams provided by an embodiment of the present application.
  • FIGS. 4J-4N are another set of schematic interface diagrams provided by an embodiment of the present application.
  • FIGS. 5A-5C are another set of schematic interface diagrams provided by an embodiment of the present application.
  • FIG. 6 shows another set of interface diagrams provided by an embodiment of the present application.
  • FIG. 7 is a schematic diagram of another group of interfaces provided by an embodiment of the present application.
  • FIGS. 8A-8C are another set of schematic interface diagrams provided by an embodiment of the present application.
  • FIGS. 9A-9E are another set of schematic interface diagrams provided by an embodiment of the present application.
  • FIG. 10 is a flow chart of a motion analysis method.
  • FIG. 11 is a schematic diagram of skeleton nodes provided by an embodiment of the present application.
  • FIG. 12 is a schematic diagram of a reference coordinate system provided by an embodiment of the present application.
  • FIG. 13 is a flow chart of a motion analysis method provided by an embodiment of the present application.
  • FIG. 14 is a schematic diagram of a center-of-gravity ground projection provided by an embodiment of the present application.
  • FIG. 15 is a schematic diagram of a lower-limb coordinate system provided by an embodiment of the present application.
  • FIG. 16 is a schematic diagram of initial state quantities of a lower-limb coordinate system provided by an embodiment of the present application.
  • FIG. 17 is a flow chart of another motion analysis method provided by an embodiment of the present application.
  • the term “if” may be construed as “when” or “once” or “in response to determining” or “in response to detecting” depending on the context .
  • the phrase “if determined” or “if [the described condition or event] is detected” may be construed, depending on the context, to mean “once determined”, “in response to the determination”, “once [the described condition or event] is detected”, or “in response to detection of [the described condition or event]”.
  • the electronic device may be a portable electronic device that also includes other functions such as a personal digital assistant and/or a music player, for example a mobile phone, a tablet computer, or a wearable electronic device with a wireless communication function (such as a smart watch).
  • portable electronic devices include, but are not limited to, portable electronic devices running iOS, Android, Microsoft, or other operating systems.
  • UI user interface
  • the term "user interface (UI)" in the specification, claims, and drawings of this application refers to a medium interface for interaction and information exchange between an application program or an operating system and a user; it converts between the internal form of information and a form the user can accept.
  • the user interface of an application program is source code written in specific computer languages such as Java and extensible markup language (XML); the interface source code is parsed and rendered on the electronic device and is finally presented as content the user can recognize, such as pictures, text, buttons, and other controls.
  • a control, also known as a widget, is a basic element of the user interface. Typical controls include toolbars, menu bars, text boxes, buttons, scroll bars, images, and text.
  • the properties and contents of the controls in the interface are defined through labels or nodes.
  • a node corresponds to a control or property in the interface, and after the node is parsed and rendered, it is presented as the content visible to the user.
  • the interfaces of many applications usually include web pages.
  • a web page also called a page, can be understood as a special control embedded in an application program interface.
  • the source code of a web page is written in specific computer languages, such as hypertext markup language (HTML), cascading style sheets (CSS), and JavaScript (JS).
  • the specific content contained in the webpage is also defined by the tags or nodes in the source code of the webpage.
  • for example, HTML defines the elements and attributes of a webpage through tags such as <p>, <img>, <video>, and <canvas>.
  • a graphical user interface (GUI) refers to a user interface, related to computer operation, that is displayed in a graphical way. It may consist of icons, windows, controls, and other interface elements displayed on the display screen of the electronic device, where visible controls include icons, buttons, menus, tabs, text boxes, dialog boxes, status boxes, navigation bars, widgets, and the like.
  • the following embodiments of the present application provide a motion analysis method, a graphical user interface, and an electronic device, which can calculate joint forces/moments, determine the injury risk of related movements, provide injury-risk assessment and early warning, and reduce the possibility of sports injuries.
  • FIG. 1 shows a schematic structural diagram of an electronic device.
  • the electronic device may include a processor 110, an external memory interface 120, an internal memory 121, a universal serial bus (universal serial bus, USB) interface 130, a charging management module 140, a power management module 141, a battery 142, an antenna 1, an antenna 2, Mobile communication module 150, wireless communication module 160, audio module 170, speaker 170A, receiver 170B, microphone 170C, earphone jack 170D, sensor module 180, button 190, motor 191, indicator 192, camera 193, display screen 194, and user An identification module (subscriber identification module, SIM) card interface 195 and the like.
  • SIM subscriber identification module
  • the sensor module 180 may include a pressure sensor 180A, a gyroscope sensor 180B, an air pressure sensor 180C, a magnetic sensor 180D, an acceleration sensor 180E, a distance sensor 180F, a proximity sensor 180G, a fingerprint sensor 180H, a temperature sensor 180J, a touch sensor 180K, an ambient light sensor 180L, a bone conduction sensor 180M, etc.
  • the structure shown in the embodiment of the present application does not constitute a specific limitation on the electronic device.
  • the electronic device may include more or fewer components than shown in the illustrations, or combine certain components, or separate certain components, or arrange different components.
  • the illustrated components can be realized in hardware, software or a combination of software and hardware.
  • the processor 110 may include one or more processing units. For example, the processor 110 may include an application processor (AP), a modem processor, a graphics processing unit (GPU), an image signal processor (ISP), a controller, a memory, a video codec, a digital signal processor (DSP), a baseband processor, and/or a neural-network processing unit (NPU), etc. Different processing units may be independent devices, or may be integrated in one or more processors. In some embodiments, the electronic device may also include one or more processors 110.
  • the controller may be the nerve center and command center of the electronic equipment.
  • the controller can generate an operation control signal according to the instruction opcode and timing signal, and complete the control of fetching and executing the instruction.
  • a memory may also be provided in the processor 110 for storing instructions and data.
  • the memory in processor 110 is a cache memory.
  • the memory may hold instructions or data that the processor 110 has just used or used cyclically. If the processor 110 needs to use the instructions or data again, it can call them directly from this memory. This avoids repeated accesses and reduces the waiting time of the processor 110, thereby improving the efficiency of the electronic device.
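The caching behavior described above can be illustrated with a toy least-recently-used cache. This is a conceptual model only, not the processor's actual hardware cache:

```python
from collections import OrderedDict

class TinyCache:
    """Toy model of the processor-side cache described above: keeps the most
    recently used entries so repeated accesses skip the slower backing store."""

    def __init__(self, capacity=4):
        self.capacity = capacity
        self.data = OrderedDict()

    def get(self, key):
        if key in self.data:
            self.data.move_to_end(key)  # mark as recently used
            return self.data[key]
        return None  # cache miss: caller must fetch from backing store

    def put(self, key, value):
        self.data[key] = value
        self.data.move_to_end(key)
        if len(self.data) > self.capacity:
            self.data.popitem(last=False)  # evict least recently used
```

Re-reading a cached entry promotes it, so frequently used instructions or data stay resident while stale entries are evicted first.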
  • processor 110 may include one or more interfaces.
  • the interface may include an inter-integrated circuit (I2C) interface, an inter-integrated circuit sound (I2S) interface, a pulse code modulation (PCM) interface, a universal asynchronous receiver/transmitter (UART) interface, a mobile industry processor interface (MIPI), a general-purpose input/output (GPIO) interface, a SIM interface, and/or a USB interface, etc.
  • the I2C interface is a bidirectional synchronous serial bus, including a serial data line (SDA) and a serial clock line (SCL).
  • processor 110 may include multiple sets of I2C buses.
  • the processor 110 may be coupled to the touch sensor 180K through an I2C bus interface, so that the processor 110 communicates with the touch sensor 180K through the I2C bus interface, thereby realizing the touch function of the electronic device.
  • I2S can be used for audio communication.
  • processor 110 may include multiple sets of I2S buses.
  • the processor 110 may be coupled to the audio module 170 through an I2S bus to implement communication between the processor and the audio module 170 .
  • the audio module 170 can transmit audio signals to the wireless communication module 160 through the I2S interface, so as to realize the function of answering calls through the Bluetooth headset.
  • the PCM interface can also be used for audio communication, quantizing and encoding analog signal samples.
  • the audio module 170 and the wireless communication module 160 may be coupled through a PCM bus interface.
  • the audio module 170 can also transmit audio signals to the wireless communication module 160 through the PCM interface, so as to realize the function of answering calls through the Bluetooth headset.
  • the UART interface is a universal serial data bus used for asynchronous communication.
  • the bus can be a bidirectional communication bus. It converts the data to be transmitted between serial and parallel forms.
  • the UART interface is usually used to connect the processor 110 and the wireless communication module 160.
  • the processor 110 communicates with the Bluetooth module in the wireless communication module 160 through the UART interface to realize the Bluetooth function.
  • the audio module 170 can transmit audio signals to the wireless communication module 160 through the UART interface, so as to realize the function of playing music through the Bluetooth headset.
  • the MIPI interface can be used to connect the processor 110 with peripheral devices such as the display screen 194 and the camera 193 .
  • MIPI interface includes camera serial interface (camera serial interface, CSI), display serial interface (display serial interface, DSI), etc.
  • the processor 110 communicates with the camera 193 through the CSI interface to realize the shooting function of the electronic device.
  • the processor 110 communicates with the display screen 194 through the DSI interface to realize the display function of the electronic device.
  • the GPIO interface can be configured by software.
  • the GPIO interface can be configured as a control signal or as a data signal.
  • the GPIO interface can be used to connect the processor 110 with the camera 193, the display screen 194, the wireless communication module 160, the audio module 170, the sensor module 180, and the like.
  • the GPIO interface can also be configured as an I2C interface, I2S interface, UART interface, MIPI interface, etc.
  • the USB interface 130 is an interface conforming to the USB standard specification, specifically, it can be a Mini USB interface, a Micro USB interface, a USB Type C interface, and the like.
  • the USB interface 130 can be used to connect a charger to charge the electronic device, and can also be used to transmit data between the electronic device and peripheral devices. It can also be used to connect headphones and play audio through them. This interface can also be used to connect other electronic devices, such as AR devices.
  • the interface connection relationship between the modules shown in the embodiment of the present application is only a schematic illustration, and does not constitute a structural limitation of the electronic device.
  • the electronic device may also adopt different interface connection methods in the above embodiments, or a combination of multiple interface connection methods.
  • the charging management module 140 is configured to receive a charging input from a charger.
  • the charger may be a wireless charger or a wired charger.
  • the charging management module 140 can receive the charging input of the wired charger through the USB interface 130 .
  • the charging management module 140 may receive wireless charging input through a wireless charging coil of the electronic device. While charging the battery 142, the charging management module 140 can also supply power to the electronic device through the power management module 141.
  • the power management module 141 is used for connecting the battery 142 , the charging management module 140 and the processor 110 .
  • the power management module 141 receives the input from the battery 142 and/or the charging management module 140 to provide power for the processor 110 , the internal memory 121 , the external memory, the display screen 194 , the camera 193 , and the wireless communication module 160 .
  • the power management module 141 can also be used to monitor parameters such as battery capacity, battery cycle times, and battery health status.
  • the power management module 141 may also be disposed in the processor 110.
  • the power management module 141 and the charging management module 140 may also be disposed in the same device.
  • the wireless communication function of the electronic device can be realized by the antenna 1, the antenna 2, the mobile communication module 150, the wireless communication module 160, the modem processor and the baseband processor.
  • Antenna 1 and Antenna 2 are used to transmit and receive electromagnetic wave signals.
  • Each antenna in an electronic device can be used to cover a single or multiple communication frequency bands. Different antennas can also be multiplexed to improve the utilization of the antennas.
  • Antenna 1 can be multiplexed as a diversity antenna of a wireless local area network.
  • the antenna may be used in conjunction with a tuning switch.
  • the mobile communication module 150 can provide wireless communication solutions including 2G/3G/4G/5G applied to electronic devices.
  • the mobile communication module 150 may include at least one filter, switch, power amplifier, low noise amplifier (Low Noise Amplifier, LNA) and the like.
  • the mobile communication module 150 can receive electromagnetic waves through the antenna 1, filter and amplify the received electromagnetic waves, and send them to the modem processor for demodulation.
  • the mobile communication module 150 can also amplify the signals modulated by the modem processor, and convert them into electromagnetic waves through the antenna 1 for radiation.
  • at least part of the functional modules of the mobile communication module 150 may be set in the processor 110 .
  • at least part of the functional modules of the mobile communication module 150 and at least part of the modules in the processor 110 may be set in the same device.
  • a modem processor may include a modulator and a demodulator.
  • the modulator is used for modulating the low-frequency baseband signal to be transmitted into a medium-high frequency signal.
  • the demodulator is used to demodulate the received electromagnetic wave signal into a low frequency baseband signal. Then the demodulator sends the demodulated low-frequency baseband signal to the baseband processor for processing.
  • the low-frequency baseband signal is passed to the application processor after being processed by the baseband processor.
  • the application processor outputs sound signals through audio equipment (not limited to speaker 170A, receiver 170B, etc.), or displays images or videos through display screen 194 .
  • the modem processor may be a stand-alone device.
  • the modem processor may be independent from the processor 110, and be set in the same device as the mobile communication module 150 or other functional modules.
  • the wireless communication module 160 can provide wireless communication solutions applied to the electronic device, including wireless local area networks (WLAN) (such as a wireless fidelity (Wi-Fi) network), Bluetooth (BT), global navigation satellite system (GNSS), frequency modulation (FM), near field communication (NFC), infrared (IR), etc.
  • the wireless communication module 160 may be one or more devices integrating at least one communication processing module.
  • the wireless communication module 160 receives electromagnetic waves via the antenna 2 , frequency-modulates and filters the electromagnetic wave signals, and sends the processed signals to the processor 110 .
  • the wireless communication module 160 can also receive the signal to be sent from the processor 110 , frequency-modulate it, amplify it, and convert it into electromagnetic waves through the antenna 2 for radiation.
  • the electronic device realizes the display function through the GPU, the display screen 194, and the application processor.
  • the GPU is a microprocessor for image processing, and is connected to the display screen 194 and the application processor. GPUs are used to perform mathematical and geometric calculations for graphics rendering.
  • Processor 110 may include one or more GPUs that execute program instructions to generate or change display information.
  • the display screen 194 is used to display images, videos and the like.
  • the display screen 194 includes a display panel.
  • the display panel can be a liquid crystal display (LCD), an organic light-emitting diode (OLED), an active-matrix organic light-emitting diode (AMOLED), a flexible light-emitting diode (FLED), a Mini LED, a Micro LED, a Micro-OLED, quantum dot light-emitting diodes (QLED), etc.
  • the electronic device may include 1 or N display screens 194, where N is a positive integer greater than 1.
  • the display screen 194 displays moving images of the user.
  • the electronic device can realize the shooting function through the ISP, the camera 193, the video codec, the GPU, the display screen 194, and the application processor.
  • the ISP is used for processing the data fed back by the camera 193 .
  • when shooting, light is transmitted through the lens to the photosensitive element of the camera, which converts the light signal into an electrical signal and transmits the electrical signal to the ISP for processing, where it is converted into an image or video visible to the naked eye.
  • ISP can also perform algorithm optimization on image noise, brightness, etc.
  • ISP can also optimize the exposure, color temperature and other parameters of the shooting scene.
  • the ISP may be located in the camera 193 .
  • Camera 193 is used to capture still images or video.
  • the object generates an optical image through the lens and projects it to the photosensitive element.
  • the photosensitive element may be a charge coupled device (CCD) or a complementary metal-oxide-semiconductor (CMOS) phototransistor.
  • the photosensitive element converts the light signal into an electrical signal, and then transmits the electrical signal to the ISP for conversion into a digital image or video signal.
  • ISP outputs digital image or video signal to DSP for processing.
  • DSP converts digital images or video signals into standard RGB, YUV and other formats of images or video signals.
  • the electronic device may include 1 or N cameras 193, where N is a positive integer greater than 1.
  • the two-dimensional position information of the user's feet, knees, hips, wrists, elbows, head and neck, etc. is detected in real time by the camera 193 .
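Given 2D keypoints such as the hip, knee, and ankle detected by the camera, a joint angle can be computed with the standard vector-angle formula. This is a generic sketch under that assumption, not the specific method claimed here:

```python
import math

def joint_angle(a, b, c):
    """Angle at joint b (degrees) formed by 2D keypoints a-b-c,
    e.g. hip-knee-ankle for the knee flexion angle."""
    v1 = (a[0] - b[0], a[1] - b[1])  # vector from joint to first keypoint
    v2 = (c[0] - b[0], c[1] - b[1])  # vector from joint to second keypoint
    dot = v1[0] * v2[0] + v1[1] * v2[1]
    n1 = math.hypot(*v1)
    n2 = math.hypot(*v2)
    return math.degrees(math.acos(dot / (n1 * n2)))
```

A fully extended leg (collinear hip, knee, ankle) gives 180 degrees; a right-angle bend gives 90 degrees.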
  • Digital signal processors are used to process digital signals. In addition to digital image or video signals, they can also process other digital signals. For example, when an electronic device selects a frequency point, a digital signal processor is used to perform Fourier transform on the frequency point energy, etc.
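As a toy illustration of evaluating the energy at one frequency point with a Fourier transform (the actual DSP is a hardware unit; this is conceptual only):

```python
import cmath
import math

def dft_bin_energy(samples, k):
    """Energy |X[k]|^2 of DFT bin k for a real-valued sample sequence."""
    n = len(samples)
    acc = sum(samples[t] * cmath.exp(-2j * math.pi * k * t / n)
              for t in range(n))
    return abs(acc) ** 2
```

For an 8-sample cosine at bin 3, the energy concentrates in bin 3 (magnitude N/2 = 4, energy 16) while other bins stay near zero.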
  • Video codecs are used to compress or decompress digital video.
  • An electronic device may support one or more video codecs. In this way, the electronic device can play or record videos in various encoding formats, such as: Moving Picture Experts Group (Moving Picture Experts Group, MPEG) 1, MPEG2, MPEG3, MPEG4, etc.
  • NPU is a neural network (Neural-Network, NN) computing processor.
  • Applications such as intelligent cognition of electronic devices can be realized through NPU, such as: image recognition, face recognition, speech recognition, text understanding, etc.
  • the external memory interface 120 can be used to connect an external memory card, such as a Micro SD card, to expand the storage capacity of the electronic device.
  • the external memory card communicates with the processor 110 through the external memory interface 120 to implement a data storage function. Such as saving music, video and other files in the external memory card.
  • the internal memory 121 may be used to store computer-executable program codes including instructions.
  • the processor 110 executes various functional applications and data processing of the electronic device by executing instructions stored in the internal memory 121 .
  • the internal memory 121 may include an area for storing programs and an area for storing data.
  • the stored program area can store an operating system, at least one application program required by a function (such as a sound playing function, an image and video playing function, etc.) and the like.
  • the storage data area can store data (such as audio data, phone book, etc.) created during the use of the electronic device.
  • the internal memory 121 may include a high-speed random access memory, and may also include a non-volatile memory, such as at least one magnetic disk storage device, flash memory device, universal flash storage (universal flash storage, UFS) and the like.
  • the electronic device can implement audio functions through the audio module 170, the speaker 170A, the receiver 170B, the microphone 170C, the earphone interface 170D, and the application processor. Such as music playback, recording, etc.
  • the audio module 170 is used to convert digital audio information into analog audio signal output, and is also used to convert analog audio input into digital audio signal.
  • the speaker 170A, also referred to as a "horn", is used to convert audio electrical signals into sound signals.
  • the receiver 170B, also called an "earpiece", is used to convert audio electrical signals into sound signals.
  • when the electronic device receives a call or a voice message, the user can listen to the voice by placing the receiver 170B close to the ear.
  • the microphone 170C, also called a "mike" or "mic", is used to convert sound signals into electrical signals.
  • the user can make a sound with the mouth close to the microphone 170C, thereby inputting the sound signal into the microphone 170C.
  • the electronic device may be provided with at least one microphone 170C.
  • the earphone interface 170D is used for connecting wired earphones.
  • the sensor module 180 may include one or more sensors, which may be of the same type or of different types. It can be understood that the sensor module 180 shown in FIG. 1 is only an exemplary division manner, and there may be other division manners, which are not limited in the present application.
  • the pressure sensor 180A is used to sense the pressure signal and convert the pressure signal into an electrical signal.
  • pressure sensor 180A may be disposed on display screen 194 .
  • the electronic device detects the intensity of the touch operation according to the pressure sensor 180A.
  • the electronic device may also calculate the touched position according to the detection signal of the pressure sensor 180A.
  • touch operations acting on the same touch position but with different touch operation intensities may correspond to different operation instructions. For example, when a touch operation whose intensity is less than a first pressure threshold acts on the short message application icon, an instruction to view short messages is executed; when a touch operation whose intensity is greater than or equal to the first pressure threshold acts on the short message application icon, an instruction to create a new short message is executed.
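The pressure-threshold example above can be sketched as a simple dispatch. The threshold value and function name here are illustrative, not taken from the source:

```python
def dispatch_touch(intensity, first_pressure_threshold=0.5):
    """Map the intensity of a touch on the short-message icon to an
    instruction, as in the example above (threshold is illustrative)."""
    if intensity < first_pressure_threshold:
        return "view_messages"
    # intensity >= threshold: hard press creates a new message
    return "new_message"
```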
  • the gyro sensor 180B can be used to determine the motion posture of the electronic device. In some embodiments, the angular velocity of the electronic device about three axes (ie, x, y, and z axes) may be determined by the gyro sensor 180B. The gyro sensor 180B can be used for image stabilization.
  • the air pressure sensor 180C is used to measure air pressure.
  • the electronic device calculates the altitude through the air pressure value measured by the air pressure sensor 180C to assist in positioning and navigation.
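A common way to convert a measured air pressure into an altitude estimate is the international barometric formula; the electronic device may use something similar, though this specific formula is an assumption rather than taken from the source:

```python
def pressure_to_altitude(p_hpa, p0_hpa=1013.25):
    """Estimate altitude (m) from measured air pressure (hPa) using the
    international barometric formula; p0 is sea-level reference pressure."""
    return 44330.0 * (1.0 - (p_hpa / p0_hpa) ** (1.0 / 5.255))
```

At the sea-level reference pressure the estimate is 0 m; at 900 hPa it comes out to roughly 1000 m, which is the kind of value that can assist positioning and navigation.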
  • the magnetic sensor 180D includes a Hall sensor.
  • the electronic device may detect opening and closing of the flip holster using the magnetic sensor 180D.
  • the acceleration sensor 180E can detect the acceleration of the electronic device in various directions (generally three axes). When the electronic device is stationary, the magnitude and direction of gravity can be detected. It can also be used to identify the posture of electronic devices, and can be used in applications such as horizontal and vertical screen switching, pedometers, etc.
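Posture identification from the three-axis readings can be sketched with the standard static tilt formulas (a generic illustration assuming only gravity acts on the stationary device, not this device's actual algorithm):

```python
import math

def device_pitch_roll(ax, ay, az):
    """Estimate static pitch and roll (degrees) from 3-axis accelerometer
    readings (m/s^2) while the device is stationary, i.e. measuring only
    gravity."""
    pitch = math.degrees(math.atan2(-ax, math.hypot(ay, az)))
    roll = math.degrees(math.atan2(ay, az))
    return pitch, roll
```

A flat, face-up device reads roughly (0, 0, 9.81) and yields zero pitch and roll; rotating it onto its side moves roll toward 90 degrees, which is the signal a horizontal/vertical screen switch could key on.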
  • the distance sensor 180F is used to measure the distance.
  • Electronic devices can measure distance via infrared or laser light. In some embodiments, when shooting a scene, the electronic device can use the distance sensor 180F to measure the distance to achieve fast focusing.
  • Proximity light sensor 180G may include, for example, light emitting diodes (LEDs) and light detectors, such as photodiodes.
  • the light emitting diodes may be infrared light emitting diodes.
  • Electronic devices emit infrared light outwards through light-emitting diodes.
  • Electronic devices use photodiodes to detect infrared reflected light from nearby objects. When sufficient reflected light is detected, it can be determined that there is an object in the vicinity of the electronic device. When insufficient reflected light is detected, the electronic device may determine that there is no object in the vicinity of the electronic device.
  • the electronic device can use the proximity light sensor 180G to detect that the user holds the electronic device close to the ear to make a call, so as to automatically turn off the screen to save power.
  • the proximity light sensor 180G can also be used in leather case mode, automatic unlock and lock screen in pocket mode.
  • the ambient light sensor 180L is used for sensing ambient light brightness.
  • the electronic device can adaptively adjust the brightness of the display screen 194 according to the perceived ambient light brightness.
  • the fingerprint sensor 180H is used to acquire fingerprints. Electronic devices can use the acquired fingerprint features to unlock fingerprints, access application locks, take pictures with fingerprints, answer incoming calls with fingerprints, and so on.
  • the temperature sensor 180J is used to detect temperature.
  • the electronic device executes a temperature processing policy using the temperature detected by the temperature sensor 180J.
  • the touch sensor 180K is also called a touch panel or a touch-sensitive surface.
  • the touch sensor 180K can be disposed on the display screen 194, and the touch sensor 180K and the display screen 194 form a touch screen, also called a “touch screen”.
  • the touch sensor 180K is used to detect a touch operation on or near it.
  • the touch sensor can pass the detected touch operation to the application processor to determine the type of touch event.
  • Visual output related to the touch operation can be provided through the display screen 194 .
  • the touch sensor 180K may also be disposed on the surface of the electronic device, which is different from the position of the display screen 194 .
  • the bone conduction sensor 180M can acquire vibration signals. In some embodiments, the bone conduction sensor 180M can acquire the vibration signal of the vibrating bone mass of the human voice. The bone conduction sensor 180M can also contact the human pulse and receive the blood pressure beating signal. In some embodiments, the bone conduction sensor 180M can also be disposed in the earphone, combined into a bone conduction earphone.
  • the audio module 170 can analyze the voice signal based on the vibration signal of the vibrating bone mass of the vocal part acquired by the bone conduction sensor 180M, so as to realize the voice function.
  • the application processor can analyze the heart rate information based on the blood pressure beating signal acquired by the bone conduction sensor 180M, so as to realize the heart rate detection function.
  • the keys 190 include a power key, a volume key and the like.
  • the key 190 may be a mechanical key. It can also be a touch button.
  • the electronic device can receive key input and generate key signal input related to user settings and function control of the electronic device.
  • the motor 191 can generate a vibrating reminder.
  • the motor 191 can be used for incoming call vibration prompts, and can also be used for touch vibration feedback.
  • touch operations applied to different applications may correspond to different vibration feedback effects.
  • the motor 191 may also correspond to different vibration feedback effects for touch operations acting on different areas of the display screen 194 .
  • Different application scenarios for example: time reminder, receiving information, alarm clock, games, etc.
  • the touch vibration feedback effect can also support customization.
  • the indicator 192 can be an indicator light, and can be used to indicate charging status, power change, and can also be used to indicate messages, missed calls, notifications, and the like.
  • the SIM card interface 195 is used for connecting a SIM card.
  • the SIM card can be inserted into the SIM card interface 195 or pulled out from the SIM card interface 195 to realize contact and separation with the electronic device.
  • the electronic device can support 1 or N SIM card interfaces, where N is a positive integer greater than 1.
  • the electronic device interacts with the network through the SIM card to realize functions such as calling and data communication.
  • the electronic device adopts an eSIM, that is, an embedded SIM card.
  • a SIM card can be embedded in an electronic device and cannot be separated from the electronic device.
  • the electronic device exemplarily shown in FIG. 1 can display various user interfaces described in various embodiments through a display screen 194 .
  • the electronic device can detect a touch operation in each user interface through the touch sensor 180K, such as a click operation (for example, a touch operation on an icon, or a double-click operation), an upward or downward swipe in each user interface, a gesture of drawing a circle, and so on.
  • the electronic device may detect motion gestures performed by the user holding the electronic device, such as shaking the electronic device, through the gyroscope sensor 180B, the acceleration sensor 180E, and the like.
  • the electronic device can detect non-touch gesture operations through the camera 193 .
  • the electronic device can capture the moving image of the user through the camera 193 .
  • the software system of the electronic device may adopt a layered architecture, an event-driven architecture, a micro-kernel architecture, a micro-service architecture, or a cloud architecture.
  • the Android system with layered architecture is taken as an example to illustrate the software structure of the electronic device.
  • FIG. 2 is a block diagram of a software structure of an electronic device according to an embodiment of the present application.
  • the layered architecture divides the software into several layers, and each layer has a clear role and division of labor. Layers communicate through software interfaces.
  • the system is divided into four layers, which are, from top to bottom, the application layer, the application framework layer, the runtime (Android runtime) and system libraries, and the kernel layer.
  • the application layer can consist of a series of application packages.
  • the application package may include application programs (also called applications) such as camera, gallery, calendar, call, map, navigation, WLAN, Bluetooth, music, video, and short message.
  • the application framework layer provides an application programming interface (application programming interface, API) and a programming framework for applications in the application layer.
  • the application framework layer includes some predefined functions.
  • the application framework layer can include window managers, content providers, view systems, phone managers, resource managers, notification managers, and so on.
  • a window manager is used to manage window programs.
  • the window manager can get the size of the display screen, determine whether there is a status bar, lock the screen, capture the screen, etc.
  • Content providers are used to store and retrieve data and make it accessible to applications.
  • Said data may include video, images, audio, calls made and received, browsing history and bookmarks, phonebook, etc.
  • the view system includes visual controls, such as controls for displaying text, controls for displaying pictures, and so on.
  • the view system can be used to build applications.
  • a display interface can consist of one or more views.
  • a display interface including a text message notification icon may include a view for displaying text and a view for displaying pictures.
  • the phone manager is used to provide communication functions of electronic devices. For example, the management of call status (including connected, hung up, etc.).
  • the resource manager provides various resources for the application, such as localized strings, icons, pictures, layout files, video files, and so on.
  • the notification manager enables the application to display notification information in the status bar, which can be used to convey notification-type messages, and can automatically disappear after a short stay without user interaction.
  • the notification manager is used to notify the download completion, message reminder, etc.
  • the notification manager can also be a notification that appears on the top status bar of the system in the form of a chart or scroll bar text, such as a notification of an application running in the background, or a notification that appears on the screen in the form of a dialog interface. For example, prompting text information in the status bar, issuing a prompt sound, vibrating the electronic device, and flashing the indicator light, etc.
  • the runtime includes the core library and virtual machine.
  • the runtime is responsible for the scheduling and management of the system.
  • the core library includes two parts: one part is the functions that the programming language (for example, the Java language) needs to call, and the other part is the core library of Android.
  • the application layer and the application framework layer run in virtual machines.
  • the virtual machine executes programming files (for example, java files) of the application program layer and the application program framework layer as binary files.
  • the virtual machine is used to perform functions such as object life cycle management, stack management, thread management, security and exception management, and garbage collection.
  • a system library can include multiple function modules. For example: surface manager (surface manager), media library (media libraries), 3D graphics processing library (eg: OpenGL ES), 2D graphics engine (eg: SGL), etc.
  • the surface manager is used to manage the display subsystem, and provides fusion of two-dimensional (2-Dimensional, 2D) and three-dimensional (3-Dimensional, 3D) layers for multiple applications.
  • the media library supports playback and recording of various commonly used audio and video formats, as well as still image files, etc.
  • the media library can support a variety of audio and video encoding formats, such as: MPEG4, H.264, MP3, AAC, AMR, JPG, PNG, etc.
  • the 3D graphics processing library is used to implement 3D graphics drawing, image rendering, compositing, and layer processing, etc.
  • 2D graphics engine is a drawing engine for 2D drawing.
  • the kernel layer is the layer between hardware and software.
  • the kernel layer includes at least a display driver, a camera driver, an audio driver, a sensor driver, and a virtual card driver.
  • a corresponding hardware interrupt is sent to the kernel layer.
  • the kernel layer processes touch operations into original input events (including touch coordinates, timestamps of touch operations, and other information). The original input events are stored in the kernel layer.
  • the application framework layer obtains the original input event from the kernel layer and identifies the control corresponding to the input event. Take, as an example, a touch operation that is a click operation whose corresponding control is the control of the camera application icon.
  • the camera application calls the interface of the application framework layer to start the camera application, and then starts the camera driver by calling the kernel layer.
  • Camera 193 captures still images or video.
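The event flow described above (kernel raw input event → framework-layer control identification → application launch) can be sketched as follows. This is an illustrative sketch, not Android's actual implementation; the control layout, names such as `identify_control`, and the returned action strings are all assumptions.

```python
# Illustrative sketch of the input-event flow: the kernel packages a touch into
# a raw event, and the framework maps its coordinates to a control and decides
# which application to start. All names and values are assumed for illustration.

from dataclasses import dataclass
from typing import Optional

@dataclass
class RawInputEvent:
    x: int
    y: int
    timestamp_ms: int

# Hypothetical layout: each control occupies a rectangle (left, top, right, bottom).
CONTROLS = {
    "camera_icon": (0, 0, 100, 100),
    "gallery_icon": (100, 0, 200, 100),
}

def identify_control(event: RawInputEvent) -> Optional[str]:
    """Framework-layer step: find the control under the touch coordinates."""
    for name, (l, t, r, b) in CONTROLS.items():
        if l <= event.x < r and t <= event.y < b:
            return name
    return None

def dispatch(event: RawInputEvent) -> str:
    """Mimic 'a click on the camera icon starts the camera driver'."""
    control = identify_control(event)
    if control == "camera_icon":
        return "start_camera_driver"
    return "ignored"

print(dispatch(RawInputEvent(x=50, y=50, timestamp_ms=0)))  # start_camera_driver
```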
  • FIG. 3A exemplarily shows an exemplary user interface 31 on an electronic device for displaying applications installed on the electronic device.
  • the user interface 31 displays an interface on which application icons are placed, and the interface may include a plurality of application icons, for example, a clock application icon 309, a calendar application icon 311, a gallery application icon 313, a memo application icon 315, a file management application icon 317, an email application icon 319, a music application icon 319, a wallet application icon 323, a Huawei video application icon 325, a sports health application icon 327, a weather application icon 329, a browser application icon 331, a smart life application icon 333, a settings application icon 335, a recorder application icon 337, an application store application icon 339, and the like.
  • An interface indicator 349 may also be displayed below the above-mentioned multiple application icons to indicate the positional relationship between the currently displayed interface and other interfaces.
  • the tray icon remains displayed when the interface is switched.
  • the embodiment of the present application does not limit the content displayed on the user interface 31 .
  • FIG. 3A only exemplarily shows a user interface on an electronic device, and should not be construed as limiting the embodiment of the present application.
  • the application programs "Sports Health" and "Camera" of the electronic device can provide a "motion detection" function, which can be used to detect the user's motion posture during exercise, calculate the joint forces/moments at the relevant joints of the user during exercise, and learn the user's injury risk during exercise.
  • the motion analysis method provided by the embodiment of this application can be applied to various scenarios, including but not limited to:
  • the electronic device can detect a user operation 200 (such as a click operation on the icon 327) acting on the "exercise and health" icon 327, and in response to this operation, the user interface 32 exemplarily shown in FIG. 3C can be displayed.
  • User interface 32 may be the main user interface of the "Exercise and Health" application, which may include exercise mode list 351 , navigation bar 352 , search bar 353 , controls 354 , controls 355 , controls 356 , controls 357 , and controls 358 .
  • the exercise mode list 351 may display one or more exercise mode options.
  • the one or more exercise mode options may include: an indoor running option, a fitness option, a yoga option, a walking option, a cycling option, and a rope skipping option.
  • the one or more exercise mode options can be presented as text information on the interface. Not limited thereto, the one or more exercise mode options may also be represented as icons or other forms of interactive elements (interactive element, IE) on the interface.
  • the controls 354 and 356 can be used to monitor user operations that trigger opening of the "exercise course".
  • the electronic device 100 may detect a user operation on the control 354 (such as a click operation on the control 354 ), and in response to the operation, the electronic device 100 may display the user interface 33 shown in FIG. 3D .
  • User interface 33 may include controls 360 , 361 .
  • Control 360 may be used to listen for user actions that trigger opening of "Start Training”.
  • the electronic device may detect a user operation on the control 360 (such as the click operation 202 on the control 360), and in response to the operation, the electronic device 100 may display the user interface 34 as shown in FIG. 3E.
  • User interface 34 may include controls 362 , 363 .
  • the control 362 can be used to monitor the user's operation (such as the click operation 203 on the control 362 ) that triggers the selection of the motion gesture detection function control.
  • the control 363 can be used to monitor the user's operation (such as a click operation on the control 363 ) to trigger the control not to select the motion posture detection function.
  • the electronic device can detect the user's operation 200 (such as a click operation on the icon 327) acting on the icon 327 of "Exercise and Health", and in response to this operation, the user interface 32 exemplarily shown in FIG. 3C can be displayed.
  • the user interface 32 may be the user interface of the "Exercise and Health" application, and may include an exercise mode list 351, a navigation bar 352, a search bar 353, and controls 354, 355, 356, 357, and 358.
  • control 358 can be used to monitor the user operation that triggers the opening of "simulation detection".
  • the electronic device 100 may detect a user operation on the control 358 (such as the click operation 204 on the control 358 ), and in response to the operation, the electronic device 100 triggers the activation of the analog motion detection function.
  • the user interface 33 may include a control 361, and the control 361 may be used to monitor a user operation that triggers turning on "simulation detection".
  • the electronic device can detect a user operation on the control 361 (such as a click operation on the control 361 ), and in response to the operation, the user interface 32 exemplarily shown in FIG. 3C can be displayed.
  • the user interface 32 may be the user interface of the "sports and health" application program, and the electronic device 100 triggers to start the simulated exercise detection function.
  • the electronic device can detect a user operation 205 (such as a click operation on the icon 341) acting on the icon 341 of the "camera", and in response to the operation, the user interface 35 exemplarily shown in FIG. 3H can be displayed.
  • User interface 35 may be the user interface of the "camera" application, which may include a camera settings list 364, a shooting mode list 365, a motion detection option 366, a control 367, a control 368, a control 369, and an area 370. Wherein:
  • the camera setting list 364 can be used to display one or more camera setting options, so that the user can adjust the camera setting parameters.
  • the shooting mode list 365 may display one or more shooting mode options, which may include: aperture options, night scene options, portrait options, photo taking options, video recording options, professional options, and motion detection options.
  • the control 367 is used to monitor the user operation of starting to open the “Gallery”.
  • Control 368 is used to monitor the user's operation to start shooting.
  • the control 369 is used to monitor the user's operation of starting and switching the camera.
  • Area 370 may be used to display images captured by the camera.
  • the electronic device 100 may detect a user action on the motion detection option (such as the click operation 206 on the motion detection option 366), and in response to the action, the electronic device 100 may trigger the motion detection function.
  • the electronic device 100 may detect a user operation on the control 362 of the user interface 34, and in response to the operation, the electronic device may display the user interface 40 as exemplarily shown in FIG. 4A.
  • the user interface 40 can be used to remind the user that face recognition is about to be performed. For example, the text 401 "Face recognition is about to be performed" is displayed, and the prompt duration may be 5 seconds. After the prompt time is over, the electronic device 100 can display the user interface 41 and start face recognition. As shown in FIG. 4B, the user interface 41 exemplarily shows a face recognition interface used to collect face information.
  • After the electronic device 100 collects the face information through the camera, it can perform some necessary processing and match the processed face information with the stored face information templates, so as to retrieve the user's physical assessment information based on the matched face information template.
  • the face information template may be input by the user before the electronic device 100 performs face recognition.
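The matching step described above can be illustrated with a minimal sketch. The cosine-similarity matching, the 0.95 threshold, and the template/assessment tables are all assumptions for illustration; the patent does not specify the matching algorithm.

```python
# Illustrative sketch (not the patent's actual algorithm): match a processed
# face feature vector against stored templates by cosine similarity, then
# retrieve the physical assessment info keyed by the matched template.

import math

def cosine_similarity(a, b):
    dot = sum(x * y for x, y in zip(a, b))
    na = math.sqrt(sum(x * x for x in a))
    nb = math.sqrt(sum(y * y for y in b))
    return dot / (na * nb)

# Hypothetical stored templates and their linked physical assessment records.
TEMPLATES = {
    "user_a": [0.9, 0.1, 0.3],
}
ASSESSMENTS = {
    "user_a": {"height_cm": 175, "weight_kg": 70, "body_fat_pct": 18.0},
}

def retrieve_assessment(face_vector, threshold=0.95):
    """Return stored assessment info, or None for a new user (no match)."""
    best_id, best_sim = None, threshold
    for user_id, template in TEMPLATES.items():
        sim = cosine_similarity(face_vector, template)
        if sim >= best_sim:
            best_id, best_sim = user_id, sim
    return ASSESSMENTS.get(best_id)

print(retrieve_assessment([0.9, 0.1, 0.3]))   # matches user_a
print(retrieve_assessment([0.0, 1.0, 0.0]))   # None -> prompt new-user assessment
```

A `None` result corresponds to the new-user prompt 403 described below.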
  • the electronic device 100 may display a user interface 42 as shown in FIG. 4C .
  • the user interface 42 can display a prompt 403, which is used to prompt that the user is a new user, that is, the user's face information is not among the stored face information templates and the identity evaluation information cannot be retrieved, so the user needs to perform a body measurement assessment to obtain physical assessment information.
  • the prompt time of prompt 403 may be 5s, and after the prompt ends, the electronic device may display the user interface provided by the body measurement evaluation function exemplarily shown in Fig. 4E-Fig. 4I.
  • the electronic device 100 can detect the user's last login time, and if the time since the user's last login does not exceed a preset time period, the electronic device 100 can retrieve the user's stored identity assessment information.
  • the electronic device 100 can detect the user's last login time, and if the time since the user's last login exceeds the preset time period, the electronic device can display the user interface 43.
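The last-login rule above can be sketched as follows, assuming an example preset period of 30 days (the patent does not fix a value):

```python
# Sketch of the re-assessment rule: if the last login is older than a preset
# period, suggest a new body measurement; otherwise reuse stored data.
# The 30-day period is an assumed example value.

from datetime import datetime, timedelta

PRESET_PERIOD = timedelta(days=30)

def needs_reassessment(last_login: datetime, now: datetime) -> bool:
    return (now - last_login) > PRESET_PERIOD

now = datetime(2022, 10, 28)
print(needs_reassessment(datetime(2022, 10, 1), now))  # False (27 days ago)
print(needs_reassessment(datetime(2022, 8, 1), now))   # True
```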
  • User interface 43 may include prompt 404 , control 405 , control 406 .
  • Prompt 404 is used to inform a user whose last login time exceeds the preset time period that performing the body measurement assessment again is suggested.
  • Control 405 is used to listen for user operations that trigger re-measurement evaluation.
  • the electronic device may detect a user operation acting on the control 405 (such as a click operation on the control 405), and in response to the operation, the electronic device may display the user interfaces provided by the body measurement evaluation function as exemplarily shown in FIGS. 4E-4I.
  • Control 406 can be used to monitor user operations that trigger the use of original data.
  • the original data indicates the user's physical assessment information stored in the electronic device 100 before that.
  • Anthropometric assessment may include two aspects: physical parameter assessment and physical state assessment.
  • 4E-4I exemplarily illustrate relevant user interfaces for physical parameter assessment.
  • the electronic device 100 may display a control 407 and a control 408 on the user interface 44 .
  • Control 407 may listen for user operations that trigger device detection.
  • the electronic device 100 may detect a user operation on the control 407 (such as a click operation on the control 407 ), and in response to the operation, the electronic device 100 may turn on the camera and display the user interface 45 .
  • the user interface 45 may include a user image 409 detected by a camera and a prompt box 410 .
  • Prompt 410 may display the user's height information.
  • the electronic device 100 may turn on the camera, detect the user's height, and display the user's height information in the prompt box 410 .
  • the electronic device 100 can detect the height of the user as follows: aim the electronic device 100 at the user to be measured, point at the position of the user's feet and click to create a measurement point, then move the electronic device up to the user's head to measure the height information.
  • the electronic device 100 may display the user interface 46 .
  • the user interface 46 may include: a prompt box 411 , a control 412 and a control 413 .
  • the prompt box 411 can be used to ask whether the user will connect a body fat scale to obtain the user's weight and body fat percentage/body mass index (BMI) information.
  • Control 412 may listen for user operations that trigger connection to the body fat scale.
  • the electronic device 100 may detect a user operation on the control 412 (such as a click operation on the control 412 ), and in response to the operation, the electronic device 100 may connect to the body fat scale and display the user interface 47 .
  • the user interface 47 can display the body weight and body fat percentage information obtained through the body fat scale.
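As a side note on the body parameters gathered in this step, body mass index can be computed from height and weight (BMI = weight in kg divided by height in m squared); body fat percentage, by contrast, comes from the body fat scale itself. A minimal sketch:

```python
# BMI = weight (kg) / height (m) squared; rounded to one decimal place here
# purely for display. Body fat percentage cannot be derived from these two
# values and is reported by the body fat scale.

def bmi(weight_kg: float, height_cm: float) -> float:
    height_m = height_cm / 100.0
    return round(weight_kg / (height_m ** 2), 1)

print(bmi(70, 175))  # 22.9
```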
  • After the electronic device 100 acquires the user's height, weight, and body fat percentage information, it may display a user interface for body state assessment as shown in FIGS. 4J-4N.
  • Control 408 may listen for user operations that trigger user input of relevant data.
  • the electronic device 100 may detect a user operation on the control 408 (eg, a click operation on the control 408 ), and in response to the operation, the electronic device 100 may display the user interface 48 .
  • the user interface 48 may include an input box 416, which may be used to receive user-entered information on height, weight, and body fat percentage.
  • the electronic device 100 may detect a user operation acting on the input box 416 (such as an input operation on the input box 416), and in response to the operation, may display a user interface for body state assessment as shown in FIGS. 4J-4N.
  • 4J-4N exemplarily illustrate user interfaces for physical state assessment.
  • the user interface 49 may include: a prompt box 417 , a control 418 , and a control 419 .
  • the prompt box 417 may be used to prompt that a physical state assessment is about to be performed.
  • the control 418 may listen for user operations that trigger the device to detect a body state.
  • the electronic device 100 may detect a user operation on the control 418 (eg, a click operation on the control 418 ), and in response to the operation, the electronic device 100 may display the user interface 50 .
  • the user interface 50 may include a prompt box 420, which can be used to prompt the user to pay attention to the information when evaluating the physical state.
  • the information may be a prompt such as "The camera is about to be turned on. Please place your hand on the injured part; the severity will be reflected by how long the hand is kept there."
  • the prompt time of the prompt box 420 can be 5s, and when the prompt time ends, the electronic device can turn on the camera and display the user interface 51 .
  • the user interface 51 may include an area 421 and a display frame 422 .
  • the area 421 may display the image of the user collected by the electronic device 100 through the camera, and the area 421 may display the overall image of the user, or the image of the user's lower limbs.
  • the display frame 422 can be used to display the damage degree reflected by the time when the user puts the hand on the damage site.
  • the display frame 422 can include damage degree boxes corresponding to placement times of <2s, 2-4s, 4-6s, 6-8s, and >8s.
  • the colors of the damage degree boxes correspond to placement times from short to long and can be displayed as blue, green, yellow, orange, and red respectively. For example, when the electronic device 100 detects that the user's hand was left in place for 5s, the damage degree box corresponding to 4-6s lights up in its color.
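The placement-time-to-color mapping described above can be sketched as follows (the boundary handling at exactly 2s, 4s, etc. is an assumption):

```python
# Sketch of the mapping described above: how long the hand rests on the injured
# part selects a damage-degree box and its color (blue through red).

DEGREE_BOXES = [
    (2, "blue"),    # < 2 s
    (4, "green"),   # 2-4 s
    (6, "yellow"),  # 4-6 s
    (8, "orange"),  # 6-8 s
]

def damage_color(placement_s: float) -> str:
    for upper, color in DEGREE_BOXES:
        if placement_s < upper:
            return color
    return "red"    # > 8 s

print(damage_color(5))   # yellow (the 4-6 s box)
print(damage_color(9))   # red
```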
  • after the color of the damage degree box has been displayed, the electronic device 100 may display the user interface 52. As shown in FIG. 4M, the user interface 52 may include a prompt box 423.
  • the prompt box 423 can be used to display the percentage of the force on the damaged part of the user as the standard force.
  • the electronic device 100 may display the user interface for detecting bone nodes as shown in FIGS. 5A-5C .
  • Control 419 may listen for user operations that trigger manual input of the body state.
  • the electronic device 100 may detect a user operation on the control 419 (eg, a click operation on the control 419 ), and in response to the operation, the electronic device 100 may display the user interface 53 .
  • the user interface 53 may include input boxes 424 , 425 .
  • the input box 424 can be used to receive the name of the injury site input by the user.
  • the input box 425 can be used to receive the damage degree of the user's damaged part. The damage degree can be divided into grades 1-5, where a larger number indicates more serious damage. Grades 1-5 correspond to damage degree boxes that can be displayed as blue, green, yellow, orange, and red respectively.
  • the electronic device 100 may detect a user operation acting on the grade 1-5 damage degree boxes in the input box 425 (for example, a click operation on any one of grades 1-5 in the input box 425), and the corresponding damage degree box displays the corresponding color.
  • the electronic device 100 may display the user interface 52 after detecting that the user operation on the user interface 53 is completed.
  • the user interface 52 may include a prompt box 423 .
  • the prompt box 423 can be used to display the percentage of the force on the damaged part of the user as the standard force.
  • After the electronic device 100 acquires the user's body state assessment information, it may display the user interface for detecting bone nodes as shown in FIGS. 5A-5C.
  • 5A-5C exemplarily illustrate a user interface for detecting skeletal nodes.
  • the electronic device turns on a camera and displays a user interface 54 , which may include an area 501 and a prompt box 502 .
  • the area 501 can be used to display the user image collected by the camera in real time, and the electronic device 100 can refresh the displayed content therein in real time, so that the electronic device 100 can detect the position of the user's bone nodes through the collected user image.
  • the prompt box 502 can display the state of the detected bone node, which can be the text "Detecting the bone node".
  • the electronic device 100 may display a user interface for space attitude calibration as shown in FIG. 6 .
  • when the user image is collected by the electronic device 100 in real time, as shown in FIG. 5B and FIG. 5C, if the electronic device 100 detects that the user is not in an upright state and there is an abnormal posture, such as a bent left leg, it can output a prompt message 504 in the prompt box 502.
  • the prompt message 504 may be the text "abnormal posture, please keep upright", which may be used to prompt the user to adjust the posture and maintain an upright state.
  • After the electronic device 100 detects that the user has adjusted to the upright state, it may display the user interface 54.
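The patent does not specify how a bent leg is detected; one plausible sketch, using the detected hip/knee/ankle bone nodes, is to treat a knee angle well below 180 degrees as a non-upright posture that should trigger the prompt message 504:

```python
# Illustrative sketch (not the patent's stated rule): compute the knee angle
# from three 2D keypoints and flag a bent leg when it deviates from straight
# (180 degrees) by more than an assumed tolerance.

import math

def angle_deg(a, b, c):
    """Angle at point b formed by segments b->a and b->c, in degrees."""
    v1 = (a[0] - b[0], a[1] - b[1])
    v2 = (c[0] - b[0], c[1] - b[1])
    dot = v1[0] * v2[0] + v1[1] * v2[1]
    n1 = math.hypot(*v1)
    n2 = math.hypot(*v2)
    return math.degrees(math.acos(dot / (n1 * n2)))

def leg_is_bent(hip, knee, ankle, tolerance_deg=20):
    return angle_deg(hip, knee, ankle) < 180 - tolerance_deg

# Straight leg: hip, knee, ankle roughly collinear (image y grows downward).
print(leg_is_bent((100, 50), (100, 100), (100, 150)))   # False
# Bent leg: ankle displaced sideways.
print(leg_is_bent((100, 50), (100, 100), (150, 130)))   # True
```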
  • Fig. 6 exemplarily shows a user interface for space attitude calibration.
  • the user interface 60 may include a prompt box 601 and a prompt box 602 .
  • the prompt box 601 may be used to indicate that the user interface 60 is an interface for space attitude calibration, and the prompt box 601 may display the text "Space attitude calibration".
  • the prompt box 602 can be used to output a countdown, which can be the numbers 3, 2, 1 changing in turn, to remind the user of the time remaining for the space attitude calibration.
  • When the electronic device 100 detects that the space attitude calibration is completed, it may display the user interface for setting the ground lift reference value as shown in FIG. 7.
  • FIG. 7 exemplarily shows a user interface for setting the ground lift/ground contact reference value.
  • the user interface 70 may include a prompt box 701 , a control 702 and an input box 703 .
  • the prompt box 701 may be used to display that the user interface 70 is an interface for setting the ground lift reference value, and may be the text "set the ground lift reference value".
  • Control 702 may listen for a user operation that triggers the device to set the user's ground clearance reference value.
  • the electronic device 100 may detect a user operation on the control 702 (such as a click operation on the control 702 ), and in response to the operation, the electronic device 100 may set the user's ground clearance reference value. After the electronic device 100 detects that the user's ground lift/ground contact reference value is set, it may display a force detection user interface as shown in FIG. 8A or FIG. 8B .
  • the input box 703 can be used to receive the ground reference value input by the user.
  • the electronic device 100 may detect a user operation acting on the input box 703 (such as an input operation on the input box 703), and in response to the operation, may display a force detection user interface as shown in FIG. 8A or FIG. 8B.
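A possible interpretation of the ground lift reference value, sketched under assumed conventions: calibrate a baseline from the ankle bone node's vertical image coordinate while the user stands still, then treat a frame whose ankle is sufficiently above that baseline as lifted off the ground (image y grows downward, so smaller y means higher). The 15-pixel margin is an assumed example.

```python
# Hedged sketch of ground lift detection against a calibrated reference value.
# The averaging window and the pixel margin are illustrative assumptions.

def calibrate_ground_reference(ankle_y_samples):
    """Average the ankle's vertical coordinate over the calibration window."""
    return sum(ankle_y_samples) / len(ankle_y_samples)

def is_off_ground(ankle_y: float, reference_y: float, margin_px: float = 15) -> bool:
    # Smaller y means higher in image coordinates.
    return ankle_y < reference_y - margin_px

ref = calibrate_ground_reference([400, 401, 399])
print(ref)                      # 400.0
print(is_off_ground(380, ref))  # True  (20 px above the baseline)
print(is_off_ground(395, ref))  # False
```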
  • the user interface 80 may include an area 801 , an area 802 , a prompt box 803 , and a prompt box 804 .
  • the area 801 can be used to display the moving images of the user collected by the camera 193 in real time.
  • Area 802 may be used to display exemplary action images for an exercise session.
  • the prompt box 803 can be used to display the name of the current sports action.
  • the prompt box 804 can be used to display the user's exercise amount, which can be a combination of numbers and words, such as "5 kcal".
  • the user interface 81 may include an area 801 , an area 802 , a prompt box 803 , a prompt box 804 , an icon 805 , and a prompt box 806 .
  • for the area 801, the area 802, the prompt box 803, and the prompt box 804, reference may be made to the relevant description in the user interface 80, and details are not repeated here.
  • the icon 805 can be used to highlight the force-bearing parts of the user's joints, and highlight the magnitude of the force through color marking.
  • when the joint force is small, it can be displayed in green on the icon 805; when the joint force is large, it can be displayed in yellow on the icon 805; and when the joint force is excessive, it can be displayed in red on the icon 805.
  • the prompt box 806 can be used to display the force value of the corresponding joint.
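The green/yellow/red color marking for joint force described above can be sketched as a simple thresholding; the 300 N and 600 N thresholds are assumed example values, not taken from the patent:

```python
# Sketch of the joint-force color marking: green for a small force, yellow for
# a large one, red for an excessive one. Thresholds are assumed examples.

def joint_force_color(force_n: float, warn_n: float = 300, danger_n: float = 600) -> str:
    if force_n < warn_n:
        return "green"
    if force_n < danger_n:
        return "yellow"
    return "red"

print(joint_force_color(150))  # green
print(joint_force_color(450))  # yellow
print(joint_force_color(700))  # red
```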
  • when the electronic device 100 detects that the user has a certain risk of injury, it may display a user interface for outputting risk prompt information as shown in FIGS. 9A-9D.
  • FIG. 8C exemplarily shows a user interface for simulated testing of exercise sessions.
  • the user interface 82 includes an area 807 that may be used to display images of exemplary moves during an exercise session.
  • 9A-9E exemplarily show a user interface for outputting risk warning information.
  • the user interface 90 may include prompt boxes 901 , 902 and a control 903 .
  • the prompt box 901 may be used to display high-risk information of sports actions, which may be the text "It is detected that the current sports action has a high risk of injury".
  • the prompt box 902 can be used to display the reason why the current action has a higher risk, and it can be the text "excessive force on the left leg”.
  • the control 903 may monitor user operations that trigger returning to force detection.
  • the electronic device 100 may detect a user operation on the control 903 (such as a click operation on the control 903 ), and in response to the operation, the electronic device 100 may display a user interface for force detection as shown in FIG. 8A or 8B .
  • the electronic device 100 may display the user interface 91 when no user operation on the control 903 is detected within the preset time period, or when the display time of the user interface 90 exceeds the preset time period.
  • the user interface 91 may include a control 903 , a prompt box 904 , and controls 905 and 906 .
  • the prompt box 904 may display the text "Continue this exercise?".
  • the control 905 may listen for user operations that trigger continued movement and receive movement guidance.
  • the electronic device 100 may detect a user operation on the control 905 (eg, a click operation on the control 905 ), and in response to the operation, the electronic device 100 may display the user interface 92 .
  • the user interface 92 may include a control 903 and a prompt box 907 .
  • the prompt box 907 may display an adjustment plan for sports actions with higher risks.
  • the electronic device 100 may detect that the display time of the user interface 92 exceeds a preset time period, and may display the force detection user interface as shown in FIG. 8A or FIG. 8B .
  • Control 906 may listen for user operations that trigger switching of exercise classes.
  • the electronic device 100 may detect a user operation on the control 906 (such as a click operation on the control 906 ), and in response to the operation, the electronic device 100 may display the user interface 93 .
  • User interface 93 may include control 908 and control 909 .
  • Control 908 may listen for user operations that trigger a return to the original exercise session.
  • the electronic device 100 may detect a user operation acting on the control 908 (such as a click operation on the control 908), and in response to the operation, the electronic device 100 may display the force detection user interface as shown in FIG. 8A or FIG. 8B, or the user interface 92.
  • Control 909 may listen for user actions that trigger selection of a recommended exercise session.
  • the electronic device 100 may detect a user operation on the control 909 (such as a click operation on the control 909 ), and in response to the operation, the electronic device 100 may switch to the recommended exercise course selected by the user.
  • the electronic device 100 may display the user interface 92 after displaying the user interface 90 .
  • the user interface 92 may include prompt boxes 901 , 902 and a control 903 .
  • when the electronic device 100 detects that the display time of the user interface 90 exceeds a preset time period, it may display the user interface 92.
  • the user interface 92 may include a control 903 and a prompt box 907 .
  • the electronic device 100 may detect a user operation acting on the control 906 (such as a click operation on the control 906); in response to the operation, or upon detecting that the display time of the user interface 92 exceeds a preset time period, it may display the force detection user interface as shown in FIG. 8A or FIG. 8B.
  • FIG. 9E exemplarily shows a user interface for performing a simulation test on an exercise course.
  • the user interface 94 may include: a prompt box 910 , and the prompt box 910 may include prompts 911 , 912 , and 913 .
  • Prompt 911 can be used to indicate that the simulation test of the exercise course is completed.
  • Prompt 912 can be used to indicate the degree of injury risk of the exercise course.
  • Prompt 913 can be used to display the actions with a higher risk of injury in the exercise course and the adjustment plans for those actions.
  • FIG. 10 shows a detailed flow of a method for motion analysis. As shown in Figure 10, the method may include:
  • the electronic device receives a user operation of a user on a first application, where the user operation is used to instruct the electronic device to obtain identity evaluation information.
  • the first application may be a motion application in an electronic device, such as a motion application in a smart phone or a TV, a professional motion detection system, etc., or may be a camera.
  • the first application may be the "sports and health” application in the electronic device 100 in FIG. 3A, and may be the “camera” in the electronic device in FIG. 3A.
  • "Sports Health" is a sports and fitness application program on electronic devices such as smart phones and tablet computers. The embodiment of this application does not limit the name of the application program.
  • “Camera” is a photo-taking application program on electronic devices such as smart phones and tablet computers, and the embodiment of the present application does not limit the name of the application program.
  • the user operation for the first application may be a user's touch operation, or may be a user's voice operation, gesture operation, etc., which is not limited herein.
  • the identity evaluation information may include physical parameter evaluation information, and may also include physical state evaluation information.
  • the body parameter evaluation information may include the user's height, weight, and body fat percentage, and the body state evaluation information may include the user's injury site and injury degree.
  • Injury refers to the destruction of the human body's skin, muscles, bones, viscera and other tissue structures caused by various external traumatic factors and the resulting local or systemic reactions.
  • the damaged part refers to the damaged part of the human body, such as ankle and knee.
  • the way for the user to trigger the acquisition of physical assessment information on the electronic device may be to trigger a control with the function of starting motion detection in the main interface of the first application; alternatively, the user may trigger an exercise course control in the main interface of the first application to display the main interface of the exercise course, and then trigger a control with the function of starting motion detection in the main interface of the exercise course, so as to obtain the identity evaluation information.
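The identity evaluation information described in this step (body parameter assessment plus body state assessment) can be represented as a record like the following; the field names are illustrative assumptions:

```python
# Hedged sketch of the identity assessment information: the body parameter
# assessment (height, weight, body fat percentage) together with the body
# state assessment (injury site and degree, graded 1-5).

from dataclasses import dataclass, field
from typing import List

@dataclass
class Injury:
    site: str            # e.g. "ankle", "knee"
    degree: int          # 1-5; a larger number means more serious damage

@dataclass
class IdentityAssessment:
    height_cm: float
    weight_kg: float
    body_fat_pct: float
    injuries: List[Injury] = field(default_factory=list)

user = IdentityAssessment(175, 70, 18.0, [Injury("knee", 3)])
print(user.injuries[0].site)   # knee
```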
  • the user interface may be the user interface shown in FIG. 3B-FIG. 3E.
  • the electronic device may detect a user operation 200 (such as a click operation on the icon 327 ) acting on the icon 327 of "exercise and health", and in response to the operation, the user interface 32 exemplarily shown in FIG. 3C may be displayed.
  • the electronic device 100 may detect a user operation on the control 354 in the user interface 32 (such as a click operation on the control 354 ), and in response to the operation, the electronic device 100 may display the user interface 33 shown in FIG. 3D .
  • the user interface 33 is the main interface for introducing the exercise course.
  • the electronic device 100 may detect a user operation on the control 360 in the user interface 33 (such as a click operation on the control 360 ), and in response to the operation, the electronic device 100 may display the user interface 34 .
  • the electronic device may detect a user operation on the control 362 in the user interface 34 (such as a click operation on the control 362 ), and trigger the operation of acquiring identity assessment information.
  • the user interface may be the user interface shown in FIG. 3B-FIG. 3E.
  • the electronic device 100 may detect a user operation on the control 358 in the user interface 33 (such as a click operation on the control 358 ), and in response to the operation, the electronic device 100 triggers and activates a simulated motion detection function to obtain identity assessment information.
  • the user interface may be the user interface shown in FIG. 3B and FIG. 3C .
  • the electronic device may detect a user operation 200 (such as a click operation on the icon 327 ) acting on the icon 327 of "exercise and health", and in response to the operation, the user interface 32 exemplarily shown in FIG. 3C may be displayed.
  • the electronic device 100 may detect a user operation on the control 358 in the user interface 32 (such as a click operation on the control 358 ), and in response to the operation, the electronic device 100 triggers the activation of a simulated motion detection function to obtain identity assessment information.
  • the user interface may be the user interface shown in FIG. 3G and FIG. 3H .
  • the electronic device can detect a user operation 205 (such as a click operation on the icon 341) acting on the icon 341 of the "camera", and in response to the operation, the user interface 35 exemplarily shown in FIG. 3H can be displayed.
  • the electronic device 100 can detect a user operation acting on the motion detection option in the user interface 35 (such as a click operation on the shooting mode list 365), and in response to the operation, the electronic device 100 can trigger the motion detection function to obtain identity assessment information.
  • S102 The electronic device acquires identity assessment information of the user, where the identity assessment information includes physical parameter assessment information.
  • Physical parameter assessment information may include weight and height.
  • Identity assessment information may also include physical status assessment information.
  • the physical state evaluation information may include the user's injury site and injury degree.
  • the identity evaluation information may also include an exercise ability index, which refers to the exercise intensity that the user can bear.
  • the electronic device may detect or receive the identity assessment information input by the user through a detection device such as a camera.
  • the electronic device can turn on the camera, detect the user's image, and obtain the user's height, weight, injury site and injury degree through the detected user image; the electronic device can detect the user's weight by jumping to the installed weight measurement application.
  • by processing the obtained height data and weight data of the user, the user's BMI can be obtained.
  • the electronic device may obtain information such as height, weight or BMI value, injury site and injury degree input by the user.
  • the electronic device can use the obtained height and weight to calculate the BMI value as weight ÷ height².
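The BMI calculation above (weight in kilograms divided by the square of height in metres) can be sketched as follows; the function name is illustrative, not part of the described embodiment.

```python
def compute_bmi(weight_kg: float, height_m: float) -> float:
    """Body mass index: weight (kg) divided by the square of height (m)."""
    if height_m <= 0:
        raise ValueError("height must be positive")
    return weight_kg / (height_m ** 2)

# e.g. a 70 kg user who is 1.75 m tall
print(round(compute_bmi(70.0, 1.75), 1))  # 22.9
```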
  • the electronic device may obtain the user's identity evaluation information through a detection device such as a camera.
  • the electronic device can turn on the camera, detect the user's image, and obtain the user's height through the detected user image; the electronic device can connect to the body fat scale through Bluetooth, and obtain the user's weight and BMI value;
  • the electronic device may detect that the user places a certain part of the body on the damaged part, and the duration of the placement, to obtain the user's physical state assessment information.
  • the electronic device can obtain the user's physical state assessment information by turning on the camera to detect the user's hand placed on the damaged part and the placement duration. Specifically, it can be shown in FIG. 4E-FIG. 4H and FIG. 4J-FIG. 4M, which will not be repeated here.
  • the electronic device receives identity assessment information input by a user.
  • the electronic device may receive the user's input of height, weight, BMI value, injury site and injury degree.
  • the electronic device 100 may detect a user operation on the control 408 (such as a click operation on the control 408 ), and in response to the operation, the electronic device 100 may display the user interface 48 .
  • the user interface 48 may include an input box 416, which may be used to receive user-entered information on height, weight, and body fat percentage.
  • the electronic device 100 may detect a user operation on the input box 416 (such as an input operation on the input box 416), and in response to the operation, may display a user interface for body state assessment as shown in FIG.
  • the electronic device 100 may detect a user operation on the control 419 in the user interface 49 (such as a click operation on the control 419 ), and in response to the operation, the electronic device 100 may display the user interface 53 .
  • the user interface 53 may include input boxes 424 , 425 .
  • the input box 424 can be used to receive the name of the injury site input by the user.
  • the input box 425 can be used to receive the damage degree of the user's damaged part, and the damage degree can be divided into grades 1-5; the larger the number, the more serious the damage. Grades 1-5 correspond to damage degree boxes, which can be displayed as blue, green, yellow, orange, and red respectively.
  • the electronic device 100 can detect a user operation acting on the 1-5 damage degree box on the input box 425 (for example, a click operation on any one of the 1-5 grades on the input box 425), and the corresponding damage degree box displays a corresponding color.
  • the electronic device 100 may display the user interface 52 after detecting that the user operation on the user interface 53 is completed.
  • the user interface 52 may include a prompt box 423 .
  • the prompt box 423 can be used to display the force on the user's damaged part as a percentage of the standard force.
  • the electronic device may acquire the identity feature of the user, and acquire the identity evaluation information based on the identity feature.
  • Identity features can be face information, fingerprint information, etc.
  • the electronic device can turn on the camera, obtain the face image, and obtain the face information after processing; match the processed face information with the face information template stored in the electronic device, and call the body assessment information; for example:
  • the electronic device obtains the identity feature of the user, and when the identity feature stored in the electronic device does not include the identity feature, it can detect or receive the identity evaluation information input by the user through a detection device such as a camera.
  • an electronic device may obtain a face image through a camera, and obtain face information after processing. Match the processed face information with the stored face information template, and if the electronic device detects that the matching fails, the electronic device can detect or receive identity evaluation information input by the user through a detection device such as a camera.
  • the electronic device obtains the user's identity feature, and the identity feature is included among the identity features already stored in the electronic device.
  • the electronic device detects that the user's last login time exceeds a preset time period, and the electronic device can detect or receive the identity assessment information input by the user through a camera or other detection device.
  • an electronic device may obtain a face image through a camera, and obtain face information after processing. The processed face information is matched with the stored face information templates; if the electronic device finds a matched face information template but detects that the user's last login time exceeds the preset time period, the electronic device can detect or receive the identity assessment information input by the user through a detection device such as a camera.
  • the user interface may be the user interface shown in FIGS. 4A-4D .
  • after the electronic device 100 collects the face information through the camera, it can perform some necessary processing, and match the processed face information with the stored face information templates, so as to retrieve the user's physical assessment information based on the matched face information template.
  • the face information template may be input by the user before the electronic device 100 performs face recognition.
  • the embodiment of the present application does not limit the devices and specific algorithms for face recognition, as long as the face recognition can be realized.
  • the electronic device 100 may display a user interface 42 as shown in FIG. 4C .
  • the user interface 42 can display a prompt 403, which is used to prompt that the user is a new user, that is, the user's face information is not among the stored face information templates and the identity evaluation information cannot be retrieved; the user needs to perform a body measurement assessment to obtain the physical assessment information.
  • the prompt time of prompt 403 may be 5s, and after the prompt ends, the electronic device may display the user interface provided by the body measurement evaluation function exemplarily shown in Fig. 4E-Fig. 4I.
  • the electronic device 100 can detect the user's last login time, and if the user's last login time does not exceed the preset time period, the electronic device 100 can directly retrieve the user's stored identity assessment information.
  • the electronic device 100 can detect the user's last login time, and if the user's last login time exceeds the preset time period, the electronic device can display the user interface 43 .
  • User interface 43 may include prompt 404 , control 405 , control 406 .
  • Prompt 404 is used to prompt that the user's last login time exceeds the preset time period, suggesting that the user re-perform the body measurement assessment.
  • Control 405 is used to listen for user operations that trigger re-measurement evaluation.
  • the electronic device may detect a user operation acting on the control 405 (such as a click operation on the control 405), and in response to the operation, the electronic device may display the user interfaces provided by the body measurement evaluation function as exemplarily shown in FIGS. 4E-4I.
  • Control 406 can be used to monitor user operations that trigger the use of the original data.
  • the original data indicates the user's physical assessment information stored in the electronic device 100 before that.
  • the electronic device can distribute the body mass to each part of the user by acquiring the user's body parameter evaluation information; it can also calculate the exercise capacity index according to the body parameter evaluation information and the user's exercise amount within a preset time period, and the BMI and the exercise capacity index can be used to calculate the user's load-bearing reference value.
  • Electronic equipment can set the limit that the joints of a standard human body can bear as the standard force. Users of different body types have different limits on the bearing capacity of their joints.
  • the force reference value may be equal to the standard force multiplied by the parameter.
  • the electronic device can divide the obtained BMI value into multiple BMI value intervals, and set different percentages of standard force corresponding to the multiple BMI value intervals as the BMI force of different BMI value intervals.
  • this percentage is the initial benchmark ratio.
  • the electronic device can set a standard amount of exercise according to the user's BMI, and then dynamically adjust it according to the percentage of the user's daily exercise amount relative to the maximum amount of exercise. As the user's daily exercise amount increases from 0 to the standard amount of exercise, the initial benchmark ratio is dynamically reduced by 10%.
  • the BMI value can be divided into n BMI value intervals, and the BMI forces of the n BMI value intervals can be: 100% × standard force, 90% × standard force, 80% × standard force, …, (100 − 10 × (n − 1))% × standard force.
  • the electronic device can divide the obtained BMI value into five intervals of <24, 24-27, 27-30, 30-35, and >35, and the corresponding BMI forces can be: 100% × standard force, 90% × standard force, 80% × standard force, 70% × standard force, 60% × standard force; the BMI force is the load-bearing reference value.
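The interval mapping just described can be sketched as follows; the interval boundaries and percentages are taken from the text, while the function names are illustrative only.

```python
def bearing_reference_ratio(bmi: float) -> float:
    """Map a BMI value to the percentage of the standard force used as the
    load-bearing reference value (intervals: <24, 24-27, 27-30, 30-35, >35)."""
    if bmi < 24:
        return 1.00
    elif bmi < 27:
        return 0.90
    elif bmi < 30:
        return 0.80
    elif bmi <= 35:
        return 0.70
    else:
        return 0.60

def bearing_reference_value(standard_force: float, bmi: float) -> float:
    """Load-bearing reference value = BMI force = ratio x standard force."""
    return standard_force * bearing_reference_ratio(bmi)

print(bearing_reference_value(1000.0, 28.5))  # 800.0
```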
  • the electronic device can also obtain the user's physical state evaluation information to learn whether there is a damaged part on the user's body and the damage degree of that part, so as to calculate a reduced force evaluation for the damaged part; combined with the user's BMI force and the standard amount of exercise, the force reference value of the injured part can be calculated.
  • the electronic device can divide the damage degree of the user's damaged part into multiple degrees, evaluate the relationship between the force of the damaged part and the standard force according to the degree of damage, and then calculate the BMI force corresponding to the multiple BMI intervals and the damage-weighted standard force.
  • the user's placement time on the damaged part can be divided into less than 2s, 2-4s, 4-6s, 6-8s, and >8s, and the placement time is in one-to-one correspondence with the force evaluation of the damaged part.
  • the force evaluation of the damaged part is 90%, 80%, 70%, 60% and 50% of the standard force respectively.
  • if the electronic device detects that the user's hand is placed on the knee joint for 4-6 seconds, the force evaluation of the damaged part corresponds to 70% of the standard force; the BMI force is then calculated correspondingly according to the multiple BMI intervals and the force evaluation of the damaged part.
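The duration-to-percentage mapping above can be sketched as follows; the thresholds and percentages are those listed in the text, and the function name is illustrative.

```python
def injury_force_ratio(placement_time_s: float) -> float:
    """Map the duration the user's hand rests on the injured part to the
    force evaluation as a fraction of the standard force
    (<2 s, 2-4 s, 4-6 s, 6-8 s, >8 s -> 90%, 80%, 70%, 60%, 50%)."""
    if placement_time_s < 2:
        return 0.90
    elif placement_time_s < 4:
        return 0.80
    elif placement_time_s < 6:
        return 0.70
    elif placement_time_s <= 8:
        return 0.60
    else:
        return 0.50

# hand held on the knee joint for 5 s -> 70% of the standard force
print(injury_force_ratio(5.0))  # 0.7
```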
  • step S102 may also be performed after step S103 or S104.
  • the electronic device detects the positions of the skeletal nodes of the target object, so as to obtain the spatial positional relationship of the skeletal nodes.
  • the above-mentioned target object can be a user, and can also be a moving image in a selected exercise course, such as an image of a standard demonstration action in an exercise course.
  • through bone point recognition technology, the electronic device can analyze the bone points, such as ankle joint points, knee joint points, and hip joint points, of the user or of a moving image in a video, such as an image of a standard demonstration action in an exercise course.
  • the electronic device can detect the user's skeletal nodes through cameras, sensors, etc., and the positions of the above-mentioned skeletal nodes are used to indicate the user's joints and the connection relationship between each joint and the spatial position relationship of the skeletal nodes.
  • the electronic device can incorporate a depth camera module, or analyze the user's body proportions and build (fat or thin) according to the above body parameter evaluation information.
  • the electronic device may detect the user's bone nodes through sensors such as cameras, infrared sensors, optical markers, and 3D scanners. Electronic devices can also build bone models through deep learning networks such as skinned multi-person linear (SMPL) and visual background extractor (VIBE). For example, when building a human skeleton model through the SMPL model, the height, weight, etc. in the user's physical parameter evaluation information collected above can be used to construct it in conjunction with the national standard GB-T17245-2004.
  • the electronic device can detect the user's bone node by using the acquired user image and the human body bone point positioning algorithm, where the bone node refers to the coordinates of the determined bone point. Further, the body shape of the user may be determined in combination with the coordinates of the skeletal nodes and the above body parameter evaluation information.
  • the input of the human skeleton point positioning algorithm may be the user's image
  • the output may be the coordinates of the skeleton nodes.
  • the electronic device can detect basic human skeleton nodes as shown in FIG. 11 , such as left hip joint, right hip joint, left knee joint, right knee joint, and the like. It can be understood that, not limited to the skeleton nodes shown in FIG. 11 , the electronic device can detect more or less skeleton nodes.
  • the electronic device can turn on the camera, acquire user images, and identify skeletal nodes by analyzing the user images, and the acquired skeletal node graph can be shown in FIG. 11 .
  • the electronic device turns on the camera to acquire an image; if the electronic device fails to detect a skeletal node from the image, it indicates that the electronic device has failed to detect the user, and the electronic device can output a prompt message that the user cannot be detected, which can be the text "No user detected".
  • the electronic device turns on the camera, acquires an image, and identifies the user's bone nodes based on the image.
  • the electronic device may detect the skeletal nodes of the user, and during the detection process, if the electronic device detects that the posture of the user is abnormal, it may output a relevant abnormal prompt.
  • Abnormal posture means that the user does not maintain an upright state when performing bone node detection before the user is in motion. Upright refers to the natural standing state in which the upper limbs naturally hang down, the toes are forward, and the eyes are forward.
  • an abnormal posture of the lower limbs can be that the legs are bent or the feet are off the ground. If the user does not maintain an upright state, the skeletal node graph detected by the electronic device may be inaccurate, resulting in errors in subsequent force detection. As shown in the figures, when the electronic device 100 detects that the posture of the user's lower limbs is abnormal, it outputs a prompt 504 of abnormal posture, prompting the user to adjust the posture, maintain an upright state, and continue bone node detection.
  • the reminder of the abnormal posture can be a text display on the screen of the electronic device, or a voice reminder, etc.
  • S104 The electronic device performs space attitude calibration, and constructs a reference coordinate system.
  • Spatial posture calibration refers to setting a reference coordinate system for the movement of the target object according to the above-mentioned skeletal nodes when the target object is in an upright state, so as to calibrate the motion state of the target object. For example, taking the detection of the user as an example, the ground and the user's upright direction can be used to construct a reference system, then the user's jumping action presents a motion state of both feet away from the ground and upward jumping relative to this coordinate system. Without spatial attitude calibration, the user's motion state cannot be specifically determined.
  • the reference coordinate system can take the user's waist or feet, the midpoint of the line connecting the two feet, etc., as the origin and coordinate axes to establish a spatial coordinate system.
  • the reference coordinate system is the coordinate system relative to which the user moves in subsequent movements.
  • for example, taking the midpoint of the line connecting the user's feet as the origin, the direction of the line connecting the two feet as the u-axis, the vertical upright direction from the feet towards the head and neck as the v-axis, and the direction perpendicular to the uv plane as the w-axis, a reference coordinate system uvw can be established.
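The uvw frame construction can be sketched as follows, assuming 3D ankle positions are available from skeletal-node detection; the helper names and the world up-direction hint are illustrative assumptions, not part of the described embodiment.

```python
import numpy as np

def build_reference_frame(left_ankle, right_ankle, up_hint=(0.0, 1.0, 0.0)):
    """Build the uvw reference frame: origin at the midpoint between the
    ankles, u along the line joining the feet, v the upright direction,
    w perpendicular to the uv plane."""
    p1, p2 = np.asarray(left_ankle, float), np.asarray(right_ankle, float)
    origin = (p1 + p2) / 2.0
    u = p2 - p1
    u /= np.linalg.norm(u)
    # make the upright hint orthogonal to u, then normalise
    v = np.asarray(up_hint, float)
    v = v - np.dot(v, u) * u
    v /= np.linalg.norm(v)
    w = np.cross(u, v)
    return origin, u, v, w

def to_frame(point, origin, u, v, w):
    """Express a world-space point in uvw coordinates."""
    p = np.asarray(point, float) - origin
    return np.array([np.dot(p, u), np.dot(p, v), np.dot(p, w)])

origin, u, v, w = build_reference_frame([-0.2, 0.0, 0.0], [0.2, 0.0, 0.0])
ankle_uvw = to_frame([0.0, 0.15, 0.0], origin, u, v, w)  # a point 15 cm up the v-axis
```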
  • the coordinates of the bone nodes obtained by the aforementioned bone node detection can be expressed in this reference coordinate system.
  • the electronic device when it performs space attitude calibration, it may display a calibration countdown on the user interface.
  • the electronic device may display a user interface 60 as shown in FIG. 6 , and the user interface 60 is used to display countdowns 3 , 2 , and 1 of space attitude calibration.
  • the calibration countdown can be a text display on the screen of the electronic device, or a voice reminder, etc.
  • S105 The electronic device acquires a ground lift reference value setting, and the ground lift reference value is used to determine the user's ground lift state.
  • the ground lift reference value refers to the minimum distance of the left and right ankle joints relative to the above reference coordinate system at which the user's feet are judged to be in the ground lift state, relative to the state of both feet touching the ground.
  • the electronic device can obtain the ground lift reference value by setting itself, so as to judge the user's ground lift state.
  • the electronic device can obtain the ground lift reference value by setting itself or receiving user input, so as to judge the user's ground lift/bottomed state.
  • the electronic device can set the user's ground clearance reference value by itself.
  • the electronic device 100 can detect a user operation acting on the control 702 (such as a click operation on the control 702), and in response to this operation, the electronic device 100 can set the user's ground lift/bottomed reference value by itself.
  • the electronic device may detect the ground clearance reference value input by the user. As shown in FIG. 7, the electronic device 100 can detect a user operation acting on the input box 703 (such as an input operation on the input box 703), and in response to the operation, the electronic device 100 can receive the ground clearance reference value set by the user.
  • the electronic device may also set the user's ground clearance reference value according to the user's physical parameter evaluation information.
  • the electronic device can dynamically adjust the user's ground clearance reference value according to the level of the BMI value obtained from the body parameter evaluation information. For example, the user's BMI value is divided into multiple BMI intervals. When the BMI value is in the normal interval, the user's ground clearance reference value can be set to xcm. The BMI interval with a larger BMI value has a lower ground clearance reference value.
  • the reference coordinate system is the above-mentioned reference coordinate system uvw
  • the electronic device can receive a user-set ground clearance reference value of 15 cm, which means that when the distance between the user's ankle joint and the uv plane of the reference coordinate system is greater than or equal to 15 cm, the electronic device can detect that the user's feet are off the ground, and the user may be doing jumping exercises.
  • the electronic device acquires the force condition of the target object based on the bone node.
  • the above-mentioned target object may be a user, or a moving image in a video, such as an image of a standard demonstration action in an exercise course.
  • for the user, the electronic device can obtain the user's force values and analyze the force on the user during the exercise.
  • the electronic device can also analyze the stress situation in the image based on the moving image in the video, such as the image of the standard demonstration action in the exercise course, combined with the user's body parameter evaluation information and/or body state evaluation information.
  • S201 Obtain the first data of the target object according to the body parameters of the target object; the first data includes the mass, center of mass, and moment of inertia of the human body links.
  • the inertial parameters of the human body include the mass of the human body, the position of the center of mass and its moment of inertia, which are the basic parameters for the research on human motion and sports injury and prevention.
  • Human body links include: thighs, calves, feet, upper arms, forearms, etc.
  • the mass, center of mass, and moment of inertia of human body links such as the mass, center of mass, and moment of inertia of thighs, calves, and feet.
  • the coordinates of the center of mass of the human body link can be obtained through the above reference coordinate system.
  • the first data obtained in S201 may also be obtained in step S102.
  • S202 Obtain second data of the target object and the ground-off state of the foot; the second data includes movement speed, angular velocity, and position information of human joints.
  • the motion velocity and angular velocity of the center of mass of the above-mentioned human body links can refer to the motion speed and angular velocity of human body links such as thighs and calves, and the position information of human body joints can be obtained by detecting the coordinate values of human body joints in the above-mentioned reference coordinate system.
  • the foot-off state may include a first state, a second state, and a third state.
  • the first state indicates the state of both feet in the air
  • the second state indicates the state of one foot touching the ground
  • the third state indicates the state of both feet touching the ground.
  • the ground lift state can be judged according to whether the distance between the target object's feet relative to the reference plane (such as the uv plane in the above reference coordinate system) exceeds the ground lift reference value, which can be referred to the description in step S105.
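The three-state classification and the ground-lift threshold can be sketched together as follows; the function name is illustrative and the 0.15 m default is taken from the 15 cm example earlier in the text.

```python
def foot_off_state(left_height_m: float, right_height_m: float,
                   threshold_m: float = 0.15) -> str:
    """Classify the ground-off state from the ankle heights above the uv
    plane. threshold_m is the ground-lift reference value. 'first' = both
    feet in the air, 'second' = one foot touching the ground, 'third' =
    both feet touching the ground."""
    left_off = left_height_m >= threshold_m
    right_off = right_height_m >= threshold_m
    if left_off and right_off:
        return "first"
    if left_off or right_off:
        return "second"
    return "third"

print(foot_off_state(0.20, 0.18))  # first
print(foot_off_state(0.02, 0.01))  # third
```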
  • S203 Calculate a first numerical value and a second numerical value of the ankle joint based on the ground-off state.
  • the first numerical value and the second numerical value of the ankle joint can be calculated by judging the ground-off state of the foot.
  • the first value is the joint force of the ankle joint
  • the second value is the moment of the ankle joint.
  • the human body has left and right feet, that is, left and right ankle joints
  • the first value may include the joint forces of the left and right ankle joints
  • the second value may include the moments of the left and right ankle joints.
  • the joint force or moment of the left/right ankle joint is correspondingly used.
  • the first numerical value and the second numerical value of the ankle joint can be calculated according to the three states of the foot.
  • in the first state, both feet are in the air, and the first value and the second value of the left and right ankle joints are both 0.
  • in the second state, one ankle joint is airborne and its force and moment are 0; the joint force of the other ankle joint can be calculated by summing the products of the mass of each human body link and the velocity of that link, and the moment of the other ankle joint can be calculated by summing the differences of the products involving the vector from the mass center of each human body link to the reference point and the weight of that link.
  • the reference point can be the origin in the above reference coordinate system, and the vector from the centroid of the human body link to the reference point can be obtained by calculating the coordinates of the centroid of the human body link in the above reference coordinate system and the coordinates of the reference point.
  • the force of the other ankle joint can be calculated by the following formula, where F1, M1 or F2, M2 are 0, and the values of F2, M2 or F1, M1 can then be calculated, in which:
  • F1 and F2 are the first numerical values of the ankle joints
  • M1 and M2 are the second numerical values of the ankle joints
  • mi is the mass of human body link i
  • vci is the movement speed of the mass center of human body link i
  • G is the weight of the user calculated from the body parameters
  • Ji is the moment of inertia of human body link i
  • ωi is the angular velocity of the center of mass of human body link i
  • ri is the vector from the center of mass of human body link i to the reference point
  • g is the acceleration of gravity. These symbols will not be repeated below.
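One consistent reading of the single-support balance implied by the symbols above can be sketched as a Newton-Euler sum over the body links. The patent's formula images are not reproduced here, so the sign conventions and exact terms below are assumptions, not the patent's stated formula.

```python
import numpy as np

GRAVITY = np.array([0.0, -9.81, 0.0])  # gravity along the negative v-axis

def single_support_ankle_load(masses, accels, inertias, ang_accels, r_vectors):
    """Sketch of the second state (one foot on the ground): the airborne
    ankle's F and M are 0, and the grounded ankle balances all body links:
        F = sum_i (m_i * a_ci - m_i * g)
        M = sum_i (J_i * alpha_i - r_i x (m_i * g))
    """
    F = np.zeros(3)
    M = np.zeros(3)
    for m, a, J, alpha, r in zip(masses, accels, inertias, ang_accels, r_vectors):
        F += m * np.asarray(a, float) - m * GRAVITY
        M += J * np.asarray(alpha, float) - np.cross(np.asarray(r, float), m * GRAVITY)
    return F, M

# a 70 kg user standing still on one foot: all accelerations are zero,
# so the grounded ankle supports the full body weight
F, M = single_support_ankle_load(
    masses=[70.0], accels=[[0, 0, 0]], inertias=[0.0],
    ang_accels=[[0, 0, 0]], r_vectors=[[0.0, 1.0, 0.0]])
```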
  • when the ground-off state is the third state, both feet touch the ground, and the first value and the second value of the ankle joints can be calculated according to the first data and the second data, where the first data is the mass, center of mass, and moment of inertia of the human body links, and the second data is the position information of the human body joints; the second coordinate is the projection coordinate of the center of gravity of the target object, the third and fourth coordinates are the ankle joint coordinates, and the second, third, and fourth coordinates are obtained from the second data.
  • the sum of the joint forces of the left and right ankle joints can be calculated by summing the products of the mass of each human body link and the velocity of that link, and the sum of the moments of the left and right ankle joints can be calculated by summing the differences between the products of the moment of inertia and angular velocity of each link and the products of the vector from the link's mass center to the reference point and the link's weight.
  • the reference point can be the origin in the above reference coordinate system, and the vector from the centroid of the human body link to the reference point can be obtained by calculating the coordinates of the centroid of the human body link in the above reference coordinate system and the coordinates of the reference point.
  • the projected coordinates of the center of gravity can be determined according to the vertical mapping between the center of gravity and the reference plane (such as the uv plane mentioned above) of the above reference coordinate system.
  • the third coordinate and the fourth coordinate are the coordinates of the left and right ankle joints in the above reference coordinate system, which can be obtained according to the above reference coordinate system, as shown in FIG. 14 .
  • let Pproj be the projected coordinates of the center of gravity, P1 the coordinates of the left ankle joint, and P2 the coordinates of the right ankle joint.
  • the first value and the second value can be calculated by the following formula:
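The source's formula image for the double-support case is not reproduced here. One common way to split the totals between the two ankles, consistent with the coordinates Pproj, P1, and P2 above, is in inverse proportion to each ankle's distance from the projected centre of gravity; this split rule is an assumption for illustration, not the patent's stated formula.

```python
import numpy as np

def split_double_support(F_total, M_total, p_proj, p1, p2):
    """Share the total ankle force/moment between the left (p1) and right
    (p2) ankles: the ankle closer to the projected centre of gravity
    (p_proj) takes the larger share."""
    d1 = np.linalg.norm(np.asarray(p_proj, float) - np.asarray(p1, float))
    d2 = np.linalg.norm(np.asarray(p_proj, float) - np.asarray(p2, float))
    w1, w2 = d2 / (d1 + d2), d1 / (d1 + d2)
    F_total, M_total = np.asarray(F_total, float), np.asarray(M_total, float)
    return w1 * F_total, w2 * F_total, w1 * M_total, w2 * M_total

# centre of gravity projected midway between the ankles -> an even split
F1, F2, M1, M2 = split_double_support(
    [0.0, 686.7, 0.0], [0.0, 0.0, 0.0],
    p_proj=[0.0, 0.0, 0.0], p1=[-0.2, 0.0, 0.0], p2=[0.2, 0.0, 0.0])
```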
  • S204 Based on the motion posture of the target object, construct a first coordinate system, the first coordinate system is used to construct a homogeneous transformation matrix and obtain first coordinates of human joints in the first coordinate system.
  • the first coordinate system may include: a reference sub-coordinate system, a first sub-coordinate system, and a second sub-coordinate system; the homogeneous transformation matrix is constructed based on the relationships between the reference sub-coordinate system, the first sub-coordinate system, and the second sub-coordinate system; these relationships include the distances and angles between the coordinate axes; the first coordinate is the coordinate of the human body joint in the reference sub-coordinate system. It is understandable that a lower-limb coordinate system can be established according to the bone node positions of the target object, with coordinate systems established at the hip joint, knee joint, and ankle joint.
  • the reference sub-coordinate system is the coordinate system established based on one bone node, and the first and second sub-coordinate systems are the coordinate systems established based on the other two bone nodes.
  • the reference coordinate system can be established at the center of the ball of the hip joint's ball-and-socket structure.
  • the hip joint contains three rotational degrees of freedom
  • the knee joint contains one rotational degree of freedom
  • the ankle joint contains two rotational degrees of freedom.
  • the electronic device can also establish an artificial coordinate system on the foot to describe the orientation of the foot.
  • the lower limb coordinate system as shown in Figure 15 can be established.
  • the establishment process is as follows: assume the reference coordinate system is established at the hip joint and that the first rotation is around the Z 0 axis; X 0 Y 0 Z 0 is the reference coordinate system, with X 0 pointing toward the front of the foot, Z 0 pointing toward the side of the human body, and the Y 0 direction determined by the right-hand rule. Assume the second rotation, lifting the leg at the hip joint, is around Z 1 ; the direction of X 1 is determined by the right-hand rule, and Z 0 is transformed into Z 1 .
  • a is the distance between the z-axes of adjacent sub-coordinate systems
  • d is the distance between the x-axes of adjacent sub-coordinate systems
  • α is the included angle between the z-axes of adjacent sub-coordinate systems
  • θ is the initial included angle between the x-axes plus the joint rotation angle to be calculated
  • the initial state table shown in Figure 16 can be obtained.
  • X 0 and X 1 intersect, so d is 0; Z 0 and Z 1 intersect, so a is 0; the angle between Z 0 and Z 1 is 90°, so α is 90°.
  • S205 Based on the homogeneous transformation matrix, calculate the angular velocity of the calf according to the first coordinates and the first data.
  • the distance and included angle between coordinate axes of adjacent joint coordinate systems can be obtained.
  • the distance between the x and z axes of adjacent coordinate systems and the included angle between the z axes of adjacent coordinate systems can be obtained according to the lower limb coordinate system shown in FIG. 15 .
  • the homogeneous transformation matrix can be constructed through the distance and angle between the coordinate axes of the adjacent joint coordinate systems.
  • the homogeneous transformation matrix is as follows:
  • p is the coordinate value of the ankle joint, knee joint and hip joint in the above reference coordinate system
  • a is the distance between the z-axes of adjacent sub-coordinate systems
  • d is the distance between the x-axes of adjacent sub-coordinate systems
  • α is the included angle between the z-axes of adjacent sub-coordinate systems
  • θ is the initial included angle between the x-axes plus the joint rotation angle to be calculated
  • the initial state quantities are as shown in Figure 16.
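The a, d, α, θ parameters listed above match the standard Denavit–Hartenberg convention. The patent's own matrix image is not reproduced in this text, so the transform below is a sketch under the assumption that the standard DH form is intended:

```python
import numpy as np

def dh_transform(a, d, alpha, theta):
    """Homogeneous transform between adjacent sub-coordinate systems built
    from the four parameters listed above (standard Denavit-Hartenberg
    convention, assumed here since the patent's matrix image is not shown).
    a, d in meters; alpha, theta in radians."""
    ca, sa = np.cos(alpha), np.sin(alpha)
    ct, st = np.cos(theta), np.sin(theta)
    return np.array([
        [ct, -st * ca,  st * sa, a * ct],
        [st,  ct * ca, -ct * sa, a * st],
        [0.0,      sa,       ca,      d],
        [0.0,     0.0,      0.0,    1.0],
    ])
```

With the initial values described for the first pair of axes (a = 0, d = 0, α = 90°), the transform reduces to a pure rotation.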
  • m is the coefficient obtained by multiplying the A matrices, that is, the coefficient of the trigonometric-function products
  • p is the first coordinate
  • a is the distance between the z-axes of adjacent coordinate systems
  • d is the distance between the x-axes of adjacent coordinate systems
  • α is the phase
  • θ is the initial included angle plus the rotation angle of the human joint.
  • the value of the rotation angle θ can be solved, and the angular velocity of this link can be calculated by differentiation.
  • the angular velocity of the link can be derived from the rotation angle of the joint through a first-order differential.
  • the angular velocity of the lower leg can be calculated.
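The first-order differencing of the solved rotation angle described above might look like this (the function name and sampling layout are illustrative):

```python
def angular_velocities(angles, dt):
    """Angular velocity of a link by first-order differencing of the solved
    joint rotation angles (rad), sampled every dt seconds; a sketch of the
    'first-order differential' mentioned above."""
    return [(b - a) / dt for a, b in zip(angles, angles[1:])]
```

Feeding in the knee rotation angles per frame yields the calf angular velocity; feeding in the hip angles yields the thigh angular velocity.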
  • the angular velocity of the thigh can also be calculated by substituting the above-mentioned detected hip joint data into the T matrix and the homogeneous transformation matrix.
  • S206 Calculate a third value and a fourth value of the knee joint based on the first data, the second data, the first value and the second value of the ankle joint, and the angular velocity of the calf.
  • the third value and the fourth value of the knee joint can be calculated.
  • the third value refers to the joint force of the knee joint
  • the fourth value refers to the torque of the knee joint.
  • F3 is the third numerical value
  • M4 is the fourth numerical value
  • m shank is the calf mass in the first data
  • r shank is the vector from the center of mass of the calf to the reference point
  • r foot is the vector from the center of mass of the foot to the reference point
  • J shank is the moment of inertia of the calf
  • the mass of the thigh can be used to calculate the fifth value and the sixth value of the hip joint.
  • the fifth numerical value refers to the joint force of the hip joint
  • the sixth value refers to the torque of the hip joint.
  • the fifth and sixth values of the hip joint can be calculated by the following formula:
  • F 5 is the fifth value
  • M 6 is the sixth value
  • m thigh is the mass of the thigh
  • r thigh is the vector from the center of mass of the thigh to the reference point
  • r shank is the vector from the center of mass of the calf to the reference point
  • J thigh is the moment of inertia of the thigh in the first data
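The patent's exact knee and hip formulas are given only as images. As a hedged illustration, the sketch below uses one common Newton–Euler formulation that takes a segment's inertial data plus the distal joint load and returns the proximal joint force and moment; the signs and conventions are assumptions, not the patent's own equations.

```python
import numpy as np

def proximal_joint_load(m, a_cm, J, alpha, F_dist, M_dist, r_dist, r_prox,
                        g=np.array([0.0, 0.0, -9.81])):
    """Single-segment Newton-Euler step (here: the calf). Given the distal
    (ankle) joint force/moment, solve for the proximal (knee) force and
    moment. Conventions are illustrative assumptions.

    m, J           : segment mass and moment of inertia (first data)
    a_cm, alpha    : CoM linear acceleration / angular acceleration
    F_dist, M_dist : force and moment acting on the segment at the distal joint
    r_dist, r_prox : vectors from the segment CoM to the distal/proximal joints
    """
    # Momentum theorem: m * a_cm = F_prox + F_dist + m * g
    F_prox = m * a_cm - F_dist - m * g
    # Moment-of-momentum theorem about the CoM:
    # J * alpha = M_prox + M_dist + r_prox x F_prox + r_dist x F_dist
    M_prox = (J * alpha - M_dist
              - np.cross(r_prox, F_prox) - np.cross(r_dist, F_dist))
    return F_prox, M_prox
```

Applied to the calf with the ankle load, F_prox and M_prox play the role of the third and fourth values (knee joint force and torque); repeating the step for the thigh with the knee load yields the fifth and sixth values of the hip joint.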
  • when the electronic device detects the motion of the user, it may display the detected motion image of the user on the screen. Furthermore, the electronic device can also display the user's joint forces on the displayed motion image, such as the position and the magnitude of each joint force.
  • the electronic device may display the moving image of the user.
  • the electronic device can display the user's moving image while displaying the action demonstration in the exercise course, or the electronic device can only display the user's moving image.
  • the electronic device displays a user interface 80
  • the user interface 80 may display an image 802 of an exercise course demonstration action and a detected motion image 801 of the user.
  • the value of the joint force or moment obtained through the above calculation may also be superimposed and displayed on the corresponding part of the user in the moving image through a color code.
  • a circle may be superimposed on the corresponding force-bearing part of the user, and the numerical value of the joint force/moment of the part is displayed beside the circle.
  • a colored circle can include yellow, green and red.
  • when the corresponding force value is small, yellow can be displayed; as the force value gradually increases, the color changes from yellow to green and then to red.
  • the shape and color of the superimposed color scale are not limited thereto. It can be understood that the user image shown in FIG. 8B is marked to illustrate the force-bearing parts more clearly; in actual use, the electronic device displays the actual moving image of the user and marks it on the moving image.
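The yellow, green, and red color code described above can be sketched as a simple threshold mapping; the cut-off values 0.4 and 0.8 are illustrative assumptions, not from the patent.

```python
def force_color(ratio):
    """Map a joint force ratio (value / threshold) to the yellow -> green
    -> red color code described above. The cut-offs 0.4 and 0.8 are
    illustrative assumptions."""
    if ratio < 0.4:
        return "yellow"
    if ratio < 0.8:
        return "green"
    return "red"
```

The returned color would drive the circle superimposed on the force-bearing part, with the numeric joint force/moment value displayed beside it.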
  • when the electronic device performs simulated detection of the exercise course, it may display images of the standard demonstration actions in the exercise course, as shown in FIG. 8C.
  • the electronic device can detect the ground projection of the center of gravity in real time before or during the force detection, so as to determine whether the user's center of gravity deviates from the stable range. If the user's center of gravity deviates from the stable range, the user may have an unstable posture or fall, and the electronic device can prompt the user to adjust the body's center of gravity. The prompt can be given as text on the user interface, or, when it is detected that the user's center of gravity deviates from the stable range during exercise, the user can be prompted by voice to adjust the center of gravity.
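The stability check described above can be sketched as testing whether the projected center of gravity P_proj stays within a support region spanned by the two ankle points; the rectangle-plus-margin test and the margin value are assumptions, since the patent only states that deviation from a stable range is detected.

```python
import numpy as np

def center_of_gravity_stable(p_proj, p1, p2, margin=0.1):
    """Rough stability check: is the ground projection of the center of
    gravity within the support region spanned by the two ankle positions?
    All points are 2-D ground-plane coordinates (meters); the margin and
    the axis-aligned-box test are illustrative assumptions."""
    lo = np.minimum(p1, p2) - margin
    hi = np.maximum(p1, p2) + margin
    return bool(np.all(lo <= p_proj) and np.all(p_proj <= hi))
```

When this returns False, the device would issue the text or voice prompt to adjust the body's center of gravity.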
  • S107 The electronic device judges the occurrence of a movement risk based on the stress on each joint when the target object moves, and outputs risk prompt information.
  • when the electronic device detects the force on the target object, it can obtain the joint force data of each joint when the target object moves, such as at least one of the first value and second value of the ankle joint, the third value and fourth value of the knee joint, and the fifth value and sixth value of the hip joint, and compare them with the reference data; when the comparison result exceeds a preset threshold, the electronic device can display risk warning information.
  • the preset threshold can be set according to actual needs, which is not limited in the present application.
  • the stress situation of the joint may be a numerical value of the joint force.
  • the above-mentioned reference data of joint force may be the stress threshold of human joints (it may be the above-mentioned BMI force), or it may be the above-mentioned force-bearing reference value.
  • the joint force threshold can be evaluated by counting the maximum joint force of a certain number of users in the experiment.
  • the electronic device can determine the risk of sports injury by calculating whether the ratio of the joint force to the human joint force threshold, or to the force-bearing reference value, exceeds a preset threshold (such as a value of 1).
  • the electronic device can also judge the occurrence of movement risk based on the joint torque, and output a risk prompt.
  • the electronic device can calculate the cumulative work from the joint torque of the target object, and judge the risk of sports injury by comparing the cumulative work with a cumulative work threshold: a risk is identified when the ratio of the cumulative work to the cumulative work threshold reaches the preset threshold (such as 1).
  • the electronic device may display the first prompt.
  • the electronic device 100 may display a user interface 90 based on the above force detection.
  • the user interface 90 may display a prompt 901; when the electronic device 100 detects a user operation acting on the control 903 (such as a click operation on the control 903), in response to the operation, the electronic device 100 may display the user interface for force detection.
  • the electronic device may display the second prompt. For example, based on the above-mentioned force detection, the electronic device obtains the corresponding joint force value of each part when the user is exercising, and compares the value with the corresponding joint force threshold. If the joint force value is less than the joint force threshold, the electronic device repeats the force detection; if the joint force value is greater than or equal to the joint force threshold, the electronic device outputs a first prompt, for example, as shown in FIG. 9A.
  • when the electronic device 100 detects that the value of the joint force is greater than the joint force threshold, it can display the user interface 90, and the user interface 90 can display a prompt box 901 to prompt the user that the current motion action has a higher injury risk; it can also show the reason why the user is at risk of injury, for example, that the user's left leg is under too much force.
  • after the electronic device has displayed the prompt box 901 for longer than a preset time period, it may display the user interface 92, and the user interface 92 may display the prompt box 907 to guide the user to adjust the exercise posture.
  • the electronic device 100 may detect that the user chooses to return to the original exercise course or detect that the display time of the user interface 92 exceeds a preset time period, and may display the force detection user interface as shown in FIG. 8A or FIG. 8B .
  • after the electronic device displays the first information, it may display the first option and the second option; when a user operation on the first option is detected, in response to the operation, the electronic device may display the second prompt; when the electronic device detects a user operation on the second option, in response to the operation, the electronic device may display a third prompt.
  • the electronic device 100 may display a user interface 90 to prompt that the current exercise action has a high risk.
  • the electronic device 100 may display the user interface 91 when no user operation on the control 903 is detected within the preset time period, or when the display time of the user interface 90 exceeds the preset time period.
  • the electronic device can detect a user operation on the control 905 (eg, a click operation on the control 905 ), and in response to the operation, the electronic device 100 can display the user interface 92 .
  • the user interface 92 may include a prompt box 907, and the prompt box 907 may display an adjustment plan for an exercise action with a higher risk.
  • the electronic device 100 may detect that the display time of the user interface 92 exceeds a preset time period, and may display the force detection user interface as shown in FIG. 8A or FIG. 8B .
  • the electronic device 100 may display a user interface 90 to prompt that the current exercise action has a high risk.
  • the electronic device 100 may display the user interface 91 when no user operation on the control 903 is detected within the preset time period, or when the display time of the user interface 90 exceeds the preset time period.
  • the electronic device can detect a user operation on the control 906 (such as a click operation on the control 906 ), and in response to the operation, the electronic device 100 can display the user interface 93 .
  • the electronic device 100 can detect a user operation (such as a click operation on the control 908) acting on the control 908 in the user interface 93, and in response to the operation, the electronic device 100 can display the force detection as shown in FIG. 8A or FIG. 8B The user interface or user interface 92.
  • the electronic device 100 may also detect a user operation acting on the control 909 (such as a click operation on the control 909), and in response to the operation, the electronic device 100 may switch to the recommended exercise course selected by the user.
  • the electronic device can output a risk prompt when it detects that an exercise action poses a certain injury risk to the user; the electronic device can also output a risk warning for the exercise course after the simulated detection of the exercise course.
  • the electronic device 100 can detect that a sports action in the exercise course has a relatively high risk of injury to the user, and output a risk reminder that the sports action has a higher risk.
  • the specific steps are not repeated here.
  • the electronic device 100 displays the user interface 94.
  • the user interface 94 is used to prompt that the simulation test has been completed, and can also display the degree of injury risk of the exercise course, the sports actions with a relatively high risk of injury, and so on.
  • based on the above-mentioned force detection, the electronic device obtains the corresponding joint force value of each part when the target object moves and can compare the value with the force reference value: if the ratio of the joint force value to the joint force threshold is less than 0.6, the movement is a low-risk movement, and the electronic device repeatedly performs force detection; if the ratio of the joint force value to the joint force threshold is between 0.6 and 0.9, the movement is a medium-risk movement, and the electronic device can continue to perform force detection.
  • in this case, the electronic device can also output guidance information for adjusting the movement; if the ratio of the joint force value to the joint force threshold is greater than 0.9, the movement is a high-risk movement, and the electronic device can output risk warning information.
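The risk tiers stated above (ratio of joint force value to joint force threshold: below 0.6 low, 0.6 to 0.9 medium, above 0.9 high) can be expressed directly:

```python
def risk_level(joint_force, force_threshold):
    """Risk tier from the ratio of joint force value to joint force
    threshold, using the cut-offs stated above: < 0.6 low risk,
    0.6 to 0.9 medium risk, > 0.9 high risk."""
    ratio = joint_force / force_threshold
    if ratio < 0.6:
        return "low"
    if ratio <= 0.9:
        return "medium"
    return "high"
```

A "medium" result corresponds to continuing detection while outputting adjustment guidance, and "high" to outputting the risk warning information.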
  • FIG. 17 shows the flow of another motion analysis method, which is to perform simulated detection of joint force/torque for a motion course.
  • the method may include:
  • S301 The electronic device receives a user operation by a user on a first application, where the user operation is used to instruct the electronic device to obtain identity evaluation information of the user.
  • For the electronic device receiving the user's operation on the first application and the electronic device acquiring the user's identity evaluation information, reference may be made to step S101 above.
  • S302 The electronic device acquires identity evaluation information of the user.
  • S303 The electronic device detects the position of the bone nodes in the image of the standard demonstration action in the exercise course, so as to obtain the spatial position relationship of the bone nodes.
  • S304 The electronic device performs spatial attitude calibration on images of standard demonstration actions in the exercise course, and constructs a reference coordinate system.
  • the electronic device needs to set a ground-off reference value by itself; the ground-off reference value is used to determine whether the image of the standard demonstration action in the exercise course is in the off-ground state.
  • S305 The electronic device can obtain the corresponding height and weight information for the image of the standard demonstration action in the exercise course, and combine the height and weight information with the positions of the bone nodes to obtain the mass of the segments between the bone nodes, the positions of the centers of mass, the moments of inertia, and so on.
  • S306 The electronic device judges the occurrence of exercise risk based on the stress of each joint in the image of the standard demonstration action in the exercise course, and outputs risk prompt information.

Abstract

Provided are an athletic analysis method and apparatus, and an electronic device and a computer storage medium. The method comprises: acquiring first data of a target object according to body parameters of the target object (S201); acquiring second data of the target object, and the off-ground state of feet (S202); calculating a first numerical value and a second numerical value of an ankle joint on the basis of the off-ground state (S203); constructing a first coordinate system on the basis of an athletic posture of the target object, wherein the first coordinate system is used for constructing a homogeneous transformation matrix and acquiring first coordinates of a human body joint in the first coordinate system (S204); calculating the angular velocity of lower legs on the basis of the homogeneous transformation matrix and according to the first coordinates and the first data (S205); and calculating a third numerical value and a fourth numerical value of a knee joint on the basis of the first data, the second data, the first numerical value and the second numerical value of the ankle joint, and the angular velocity of the lower legs (S206). By means of the present application, a relatively simple method can be acquired to calculate a joint stress condition, so as to determine an athletic injury risk.

Description

Motion analysis method, device, electronic equipment and computer storage medium
This application claims priority to the Chinese patent application with application number 202111276279.2, entitled "Motion analysis method, device, electronic equipment and computer storage medium", filed with the China Patent Office on October 29, 2021, the entire contents of which are incorporated herein by reference.
Technical Field
The present application relates to the technical field of motion analysis, and in particular to a motion analysis method, device, electronic equipment and computer storage medium.
Background Art
Exercise is an important part of human daily life, and exercise requires the coordination of different joints of the human body. During exercise, people may suffer sports injuries of different degrees due to non-standard exercise postures and excessive joint forces.
At present, a professional motion capture system and inverse dynamics analysis software can be used to simulate and solve the joint forces, and then provide motion guidance. However, the algorithm of this scheme is complicated and the equipment cost is high, so it is not convenient for daily use by ordinary users.
Therefore, how to obtain a relatively simple, low-cost method to calculate joint forces so as to effectively provide early warning of injury risk is a technical problem to be solved by those skilled in the art.
Summary of the Invention
The embodiments of the present application disclose a motion analysis method, device, electronic equipment and computer storage medium, which calculate the stress on the joints in a relatively simple way to provide early warning of injury risk.
In a first aspect, an embodiment of the present application provides a motion analysis method, the method including:
acquiring first data of a target object according to body parameters of the target object, the first data including the mass, center of mass, and moment of inertia of the human body links;
acquiring second data of the target object and the off-ground state of the feet, the second data including the movement velocity and angular velocity of the centers of mass of the human body links and the position information of the human body joints;
calculating a first value and a second value of the ankle joint based on the off-ground state;
constructing a first coordinate system based on the movement posture of the target object, the first coordinate system being used to construct a homogeneous transformation matrix and to obtain first coordinates of the human joints in the first coordinate system;
calculating the angular velocity of the calf according to the first coordinates and the first data based on the homogeneous transformation matrix; and
calculating a third value and a fourth value of the knee joint based on the first data, the second data, the first value and the second value of the ankle joint, and the angular velocity of the calf.
In this application, by acquiring the human body inertial parameters and motion posture data of the target object in combination with the off-ground state, the joint force and moment of the target object's ankle joint are calculated according to the momentum theorem and the moment-of-momentum theorem, and the joint force and moment of the knee joint are then calculated from those of the ankle joint. In existing solutions, by contrast, the joint force of the ankle joint must be detected by professional motion analysis equipment, the moment of the ankle joint calculated based on that joint force, and then the moment of the knee joint calculated. The present application can calculate the joint force and moment of the ankle joint based on the off-ground state of the target object, and then calculate the joint force and moment of the knee joint; the calculation method is relatively simple, and no professional motion analysis equipment is needed during use, which reduces cost.
With reference to the first aspect, in one possible implementation, the method further includes: calculating the angular velocity of the thigh according to the first coordinates and the first data based on the homogeneous transformation matrix; and calculating a fifth value and a sixth value of the hip joint based on the first data, the second data, the first value and the second value of the ankle joint, the third value and the fourth value of the knee joint, and the angular velocity of the thigh.
On the basis of calculating the joint forces and moments of the ankle joint and the knee joint, this application can also calculate the joint force and moment of the hip joint by combining the angular velocity of the thigh with the human body inertial parameters and motion posture data.
With reference to the first aspect, in one possible implementation, calculating the first value and the second value of the ankle joint based on the off-ground state includes: when the off-ground state is a first state, the first value and the second value of the ankle joint are both 0; when the off-ground state is a second state, calculating the first value and the second value of the ankle joint according to the first data and the second data; and when the off-ground state is a third state, calculating the first value and the second value of the ankle joint according to the first data and the second data, where the second data is the position information of the human joints, a second coordinate is the projected coordinate of the center of gravity of the target object, a third coordinate and a fourth coordinate are the ankle joint coordinates, and the second, third, and fourth coordinates are obtained from the second data.
This application divides the off-ground state into three situations: one foot touching the ground, both feet touching the ground, and both feet in the air. Based on the detected off-ground state, the joint force and moment of the ankle joint are calculated case by case. Using different data for the different off-ground states, the joint forces and moments of the ankle joint in each state can be calculated quickly and easily.
With reference to the first aspect, in one implementation, calculating the first value and the second value of the ankle joint according to the first data and the second data includes: calculating the first value and the second value by the following formulas,
F 1+F 2=Σm iΔv ci+G F 1 +F 2 =Σm i Δv ci +G
M 1+M 2=Σ(J iΔω i-r i×m ig) M 1 +M 2 =Σ(J i Δω i -r i ×m i g)
Wherein: F1 and F2 are the first value of the ankle joint, M1 and M2 are the second value of the ankle joint, m_i is the mass of the human body link, v_ci is the movement velocity of the center of mass of the human body link, G is the weight of the user calculated according to the body parameters, J_i is the moment of inertia of the human body link, ω_i is the angular velocity of the center of mass of the human body link, r_i is the vector from the center of mass of the human body link to the reference point, and g is the acceleration due to gravity.
This application is based on the off-ground state in which one foot touches the ground: the joint force and moment of the ankle joint of the airborne leg are 0, and the joint force and moment of the ankle joint touching the ground can be calculated by the above formulas, combining the relevant information of the human body inertial parameters with the human joint motion data.
With reference to the first aspect, in one possible implementation, the first value and the second value of the ankle joint are calculated according to the first data and the second data, where the second data is the position information of the human joints, the second coordinate is the projected coordinate of the center of gravity of the target object, the third and fourth coordinates are the ankle joint coordinates, and the second, third, and fourth coordinates are obtained from the second data; this includes: calculating the first value and the second value by the following formulas,
F 1+F 2=Σm iΔv ci+G F 1 +F 2 =Σm i Δv ci +G
M 1+M 2=Σ(J iΔω i-r i×m ig) M 1 +M 2 =Σ(J i Δω i -r i ×m i g)
Figure PCTCN2022127953-appb-000001
Figure PCTCN2022127953-appb-000002
Figure PCTCN2022127953-appb-000003
Figure PCTCN2022127953-appb-000004
Wherein: F1 and F2 are the first value of the ankle joint, M1 and M2 are the second value of the ankle joint, m_i is the mass of the human body link, v_ci is the movement velocity of the center of mass of the human body link, G is the weight of the user calculated according to the body parameters, J_i is the moment of inertia of the human body link, ω_i is the angular velocity of the center of mass of the human body link, r_i is the vector from the center of mass of the human body link to the reference point, P_proj is the second coordinate, P_1 is the third coordinate, and P_2 is the fourth coordinate.
This application is based on the off-ground state in which both feet touch the ground; the joint forces and moments can be calculated by the above formulas, combining the relevant information of the human body inertial parameters, the human joint motion data, and the joint node coordinates of the ankle joints.
With reference to the first aspect, in one possible implementation, calculating the angular velocity of the calf according to the first coordinates and the first data based on the homogeneous transformation matrix includes: calculating the rotation angle of the human joint by substituting the first coordinates into the following formula, and calculating the angular velocity of the calf based on the rotation angle of the human joint,
Figure PCTCN2022127953-appb-000005
Figure PCTCN2022127953-appb-000006
(1 ≤ i ≤ 6, i is a positive integer)
where m is a coefficient, p is the first coordinates, a, d and α are known distances or angles in the first coordinate system, and θ is the initial included angle plus the rotation angle of the human-body joint.
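The parameters a, d, α and θ resemble the Denavit-Hartenberg convention used in robotics, in which each link's sub-coordinate system is related to the previous one by a 4×4 homogeneous transform. The sketch below builds such a transform and differentiates successive joint angles to obtain a segment angular velocity; the DH layout is an assumption, since the patent's exact matrix is in the unreproduced equation images.

```python
import numpy as np

def dh_transform(theta, d, a, alpha):
    """Homogeneous transform between adjacent sub-coordinate systems in
    the standard Denavit-Hartenberg parameterisation: rotate theta about
    z, translate d along z, translate a along x, rotate alpha about x."""
    ct, st = np.cos(theta), np.sin(theta)
    ca, sa = np.cos(alpha), np.sin(alpha)
    return np.array([
        [ct, -st * ca,  st * sa, a * ct],
        [st,  ct * ca, -ct * sa, a * st],
        [0.0,      sa,       ca,      d],
        [0.0,     0.0,      0.0,    1.0],
    ])

def segment_angular_velocity(theta_prev, theta_curr, dt):
    """Angular velocity of a segment from joint angles at two frames
    (finite difference)."""
    return (theta_curr - theta_prev) / dt
```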
With reference to the first aspect, in one possible implementation, calculating the third value and the fourth value of the knee joint based on the first data, the second data, the first and second values of the ankle joint, and the angular velocity of the shank includes: calculating the third value and the fourth value by the following formulas,
Figure PCTCN2022127953-appb-000007
Figure PCTCN2022127953-appb-000008
Here, F3 is the third value, M4 is the fourth value, m_shank is the shank mass in the first data,
Figure PCTCN2022127953-appb-000009
is the velocity of the shank's center of mass in the second data, r_shank is the vector from the shank's center of mass to the reference point derived from the first data and the second data, r_foot is the vector from the foot's center of mass to the reference point derived from the first data and the second data, J_shank is the shank's moment of inertia in the first data, and
Figure PCTCN2022127953-appb-000010
is the angular velocity of the shank.
With reference to the first aspect, in one possible implementation, calculating the angular velocity of the thigh from the first coordinates and the first data based on the homogeneous transformation matrix includes: substituting the first coordinates into the following formulas to calculate the rotation angles of the human-body joints, and calculating the angular velocity of the thigh from those rotation angles,
Figure PCTCN2022127953-appb-000011
Figure PCTCN2022127953-appb-000012
(1 ≤ i ≤ 6, i is a positive integer)
where m is a coefficient, p is the first coordinates, a, d and α are known distances or angles in the first coordinate system, and θ is the initial included angle plus the rotation angle of the human-body joint.
With reference to the first aspect, in one possible implementation, calculating the fifth value and the sixth value of the hip joint based on the first data, the second data, the first and second values of the ankle joint, the third and fourth values of the knee joint, and the angular velocity of the thigh includes: calculating the fifth value and the sixth value by the following formulas,
Figure PCTCN2022127953-appb-000013
Figure PCTCN2022127953-appb-000014
Here, F5 is the fifth value, M6 is the sixth value, m_thigh is the thigh mass in the first data,
Figure PCTCN2022127953-appb-000015
is the velocity of the thigh's center of mass in the second data, r_thigh is the vector from the thigh's center of mass to the reference point derived from the first data and the second data, r_shank is the vector from the shank's center of mass to the reference point derived from the first data and the second data, J_thigh is the thigh's moment of inertia in the first data, and
Figure PCTCN2022127953-appb-000016
is the angular velocity of the thigh.
With reference to the first aspect, in one possible implementation, the first coordinate system includes a reference sub-coordinate system, a first sub-coordinate system and a second sub-coordinate system;
the homogeneous transformation matrix is constructed based on the relationships among the reference sub-coordinate system, the first sub-coordinate system and the second sub-coordinate system, where those relationships include the distances and angles between their coordinate axes;
the first coordinates are the coordinates of the human-body joints in the reference sub-coordinate system.
With reference to the first aspect, in one possible implementation, acquiring the lift-off state of the feet includes: displaying a first user interface, where the first user interface is used to display the setting of a lift-off reference value, and the lift-off reference value is used to determine the lift-off state of the feet; and receiving a setting operation for the lift-off reference value. Once the lift-off reference value has been set, it can be used to determine the lift-off state of the feet.
With reference to the first aspect, in one possible implementation, the method further includes: displaying a second user interface that shows a first image of the target object, with a first area and a first mark superimposed on the first image; the first area is the area of a human-body joint in the first image, and the first mark is at least one of the first and second values of the ankle joint, the third and fourth values of the knee joint, and the fifth and sixth values of the hip joint. In this way, the joint forces and moments can be displayed on the motion image, showing the loads on the target object more intuitively.
With reference to the first aspect, in one possible implementation, after the fifth value and the sixth value of the hip joint are calculated, the method further includes: determining whether a movement risk arises based on at least one of the first and second values of the ankle joint, the third and fourth values of the knee joint, and the fifth and sixth values of the hip joint.
In one possible implementation, determining whether a movement risk arises includes: comparing, against a first threshold, the ratio of at least one of the first and second values of the ankle joint, the third and fourth values of the knee joint, and the fifth and sixth values of the hip joint to a first reference value; and outputting risk prompt information if the ratio is greater than the first threshold.
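The threshold comparison described above can be sketched as follows; the function and joint names are illustrative, and the reference values and first threshold are configuration data of a concrete implementation.

```python
def check_movement_risk(joint_values, reference_values, first_threshold):
    """Return the joints whose value-to-reference ratio exceeds the
    first threshold, i.e. the joints that warrant a risk prompt."""
    risky = []
    for joint, value in joint_values.items():
        ratio = abs(value) / reference_values[joint]
        if ratio > first_threshold:
            risky.append((joint, round(ratio, 2)))
    return risky
```

If the returned list is non-empty, the device would output the risk prompt information (the first prompt) for the listed joints.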
In a possible implementation, outputting the risk prompt information includes: outputting a first prompt; or outputting a first prompt that includes a first option, receiving a second operation acting on the first option, and outputting a second prompt. The above method can therefore report the degree of injury risk for the human-body joints, and can also output guidance for the actions or exercise courses that carry an injury risk, helping the user adjust the exercise posture and reducing the likelihood of injury.
In a possible implementation, the first reference value is a force-bearing threshold of the human-body joint.
With reference to the first aspect, in one possible implementation, before the first data of the target object is acquired from the target object's body parameters, the method further includes: performing an anthropometric assessment of the user, where the anthropometric assessment includes assessing the physical state, covering any injured part and the degree of injury of that part. By detecting the user's physical state, it is possible to learn whether the user has an injured part and how severe the injury is, so the corresponding movements can be adjusted, or the user can be prompted to reduce the load on the injured part, lowering the risk of sports injury.
In a possible implementation, assessing the physical state includes: detecting that a first part is placed on the user's injured part and detecting how long the first part remains on the injured part, where the first part is a body part of the user; and determining the degree of injury from that time. By detecting how long a body part is held on the injured part, the injury can be reported simply, without a complicated procedure.
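One way to map the detected hold time to a degree of injury is a simple banding rule. The bands below are purely illustrative assumptions; the patent only states that the degree is determined from the time.

```python
def injury_degree_from_hold_time(hold_seconds):
    """Map how long the user holds a body part on the injured spot to a
    coarse injury degree (hypothetical thresholds, for illustration)."""
    if hold_seconds < 2.0:
        return "mild"
    if hold_seconds < 5.0:
        return "moderate"
    return "severe"
```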
In a possible implementation, the first reference value is a load-bearing reference value that is adjusted according to the anthropometric assessment. Because the load-bearing reference value is adjusted dynamically from the information in the anthropometric assessment, and re-adjusted whenever that information changes, it tracks the user's own condition more closely and can reduce the injury risk more precisely.
With reference to the first aspect, in one possible implementation, the target object is the user or a motion image in a selected exercise course. By detecting the user's actual movement, the user's exercise can be analyzed in real time. By detecting the motion images in a selected exercise course, the exercise in that course can be analyzed through simulation. Further, a risk prompt about the selected course can be output to determine whether the exercise in the course is suitable for the user.
In a second aspect, the present application provides a motion analysis apparatus including units for performing the method described in the first aspect above.
In a third aspect, the present application provides an electronic device including a touch screen, a memory, one or more processors, a plurality of application programs, and one or more programs, where the one or more programs are stored in the memory; when the one or more processors execute the one or more programs, the electronic device implements the method described in the first aspect above.
In a fourth aspect, the present application provides a computer storage medium including computer instructions that, when run on an electronic device, cause the electronic device to perform the method described in the first aspect above.
In summary, the embodiments of the present application calculate the forces on the ankle joints by distinguishing between different lift-off states, and then obtain the forces on the knee and hip joints by establishing a coordinate system on the lower limbs, providing a relatively simple way to calculate joint loads and thereby judge the risk of sports injury.
Description of the Drawings
To describe the technical solutions in the embodiments of the present application more clearly, the following briefly introduces the drawings used in the description of the embodiments.
FIG. 1 is a schematic structural diagram of an electronic device provided by an embodiment of the present application;
FIG. 2 is a block diagram of the software structure of an electronic device provided by an embodiment of the present application;
FIG. 3A is a schematic diagram of a user interface for an application menu on an electronic device provided by an embodiment of the present application;
FIG. 3B to FIG. 3E are schematic diagrams of a scenario involved in the present application;
FIG. 3F is a schematic diagram of a scenario involved in the present application;
FIG. 3G and FIG. 3H are schematic diagrams of another scenario of the present application;
FIG. 4A to FIG. 4D are schematic diagrams of a group of interfaces provided by an embodiment of the present application;
FIG. 4E to FIG. 4I are schematic diagrams of another group of interfaces provided by an embodiment of the present application;
FIG. 4J to FIG. 4N are schematic diagrams of another group of interfaces provided by an embodiment of the present application;
FIG. 5A to FIG. 5C are schematic diagrams of another group of interfaces provided by an embodiment of the present application;
FIG. 6 is a schematic diagram of another group of interfaces provided by an embodiment of the present application;
FIG. 7 is a schematic diagram of another group of interfaces provided by an embodiment of the present application;
FIG. 8A to FIG. 8C are schematic diagrams of another group of interfaces provided by an embodiment of the present application;
FIG. 9A to FIG. 9E are schematic diagrams of another group of interfaces provided by an embodiment of the present application;
FIG. 10 is a flowchart of a motion analysis method;
FIG. 11 is a schematic diagram of skeleton nodes provided by an embodiment of the present application;
FIG. 12 is a schematic diagram of a reference coordinate system provided by an embodiment of the present application;
FIG. 13 is a flowchart of a motion analysis method provided by an embodiment of the present application;
FIG. 14 is a schematic diagram of a ground projection of the center of gravity provided by an embodiment of the present application;
FIG. 15 is a schematic diagram of a lower-limb coordinate system provided by an embodiment of the present application;
FIG. 16 is a schematic diagram of the initial state quantities of a lower-limb coordinate system provided by an embodiment of the present application;
FIG. 17 is a flowchart of another motion analysis method provided by an embodiment of the present application.
Detailed Description of Embodiments
The technical solutions in the embodiments of the present application are described clearly and completely below with reference to the accompanying drawings of those embodiments.
It should be understood that, when used in this specification and the appended claims, the terms "include" and "comprise" indicate the presence of the described features, integers, steps, operations, elements and/or components, but do not exclude the presence or addition of one or more other features, integers, steps, operations, elements, components and/or combinations thereof.
It should also be understood that the terminology used in this specification is for the purpose of describing particular embodiments only and is not intended to limit the application. As used in this specification and the appended claims, the singular forms "a", "an" and "the" are intended to include the plural forms as well, unless the context clearly indicates otherwise.
It should further be understood that the term "and/or" used in this specification and the appended claims refers to any and all possible combinations of one or more of the associated listed items, and includes those combinations.
As used in this specification and the appended claims, the term "if" may be interpreted, depending on the context, as "when", "once", "in response to determining" or "in response to detecting". Similarly, the phrase "if it is determined" or "if [the described condition or event] is detected" may be interpreted, depending on the context, as "once it is determined", "in response to determining", "once [the described condition or event] is detected" or "in response to detecting [the described condition or event]".
The following describes electronic devices, user interfaces for such electronic devices, and embodiments for using such electronic devices. In some embodiments, the electronic device may be a portable electronic device that also provides other functions such as personal digital assistant and/or music player functions, for example a mobile phone, a tablet computer, or a wearable electronic device with wireless communication capability (such as a smart watch). Exemplary embodiments of portable electronic devices include, but are not limited to, portable electronic devices running iOS, Android, Microsoft or other operating systems.
The term "user interface (UI)" in the specification, claims and drawings of this application refers to the medium through which an application or operating system interacts and exchanges information with the user; it converts between the internal form of information and a form the user can accept. The user interface of an application is source code written in a specific computer language such as Java or extensible markup language (XML); the interface source code is parsed and rendered on the terminal device and finally presented as content the user can recognize, such as pictures, text, buttons and other controls. A control, also called a widget, is a basic element of a user interface; typical controls include toolbars, menu bars, text boxes, buttons, scroll bars, pictures and text. The attributes and content of the controls in an interface are defined by tags or nodes: a node corresponds to a control or an attribute in the interface, and after being parsed and rendered the node is presented as content visible to the user. In addition, the interfaces of many applications, such as hybrid applications, usually contain web pages. A web page, also called a page, can be understood as a special control embedded in an application interface. A web page is source code written in a specific computer language, such as hypertext markup language (HTML), cascading style sheets (CSS) or JavaScript (JS); the web-page source code can be loaded and displayed as user-recognizable content by a browser or by a web-page display component with browser-like functionality. The specific content of a web page is likewise defined by tags or nodes in its source code; for example, HTML defines the elements and attributes of a web page through <p>, <img>, <video> and <canvas>.
A commonly used form of user interface is the graphical user interface (GUI), which refers to a user interface related to computer operations that is displayed graphically. It may be an interface element such as an icon, a window or a control shown on the display screen of the electronic device, where the visible controls include interface elements such as icons, buttons, menus, tabs, text boxes, dialog boxes, status boxes, navigation bars and widgets.
The following embodiments of the present application provide a motion analysis method, graphical user interfaces and an electronic device that can calculate joint forces/moments, judge the injury risk of the relevant movement, provide injury risk assessment and early warning, and reduce the likelihood of sports injuries.
An exemplary electronic device provided in the following embodiments of the present application is introduced below.
FIG. 1 shows a schematic structural diagram of the electronic device.
The electronic device may include a processor 110, an external memory interface 120, an internal memory 121, a universal serial bus (USB) interface 130, a charging management module 140, a power management module 141, a battery 142, an antenna 1, an antenna 2, a mobile communication module 150, a wireless communication module 160, an audio module 170, a speaker 170A, a receiver 170B, a microphone 170C, an earphone jack 170D, a sensor module 180, a button 190, a motor 191, an indicator 192, a camera 193, a display screen 194, a subscriber identification module (SIM) card interface 195, and the like. The sensor module 180 may include a pressure sensor 180A, a gyroscope sensor 180B, an air pressure sensor 180C, a magnetic sensor 180D, an acceleration sensor 180E, a distance sensor 180F, a proximity light sensor 180G, a fingerprint sensor 180H, a temperature sensor 180J, a touch sensor 180K, an environment sensor 180L, a bone conduction sensor 180M, and the like.
It can be understood that the structure illustrated in this embodiment of the present application does not constitute a specific limitation on the electronic device. In other embodiments of the present application, the electronic device may include more or fewer components than shown, combine some components, split some components, or arrange the components differently. The illustrated components may be implemented in hardware, in software, or in a combination of software and hardware.
The processor 110 may include one or more processing units. For example, the processor 110 may include an application processor (AP), a modem processor, a graphics processing unit (GPU), an image signal processor (ISP), a controller, a memory, a video codec, a digital signal processor (DSP), a baseband processor, and/or a neural-network processing unit (NPU). The different processing units may be independent devices or may be integrated into one or more processors. In some embodiments, the electronic device may also include one or more processors 110.
The controller may be the nerve center and command center of the electronic device. The controller can generate operation control signals according to instruction operation codes and timing signals, and complete the control of instruction fetching and instruction execution.
A memory may also be provided in the processor 110 for storing instructions and data. In some embodiments, the memory in the processor 110 is a cache. This memory may hold instructions or data that the processor 110 has just used or uses cyclically. If the processor 110 needs to use the instructions or data again, they can be called directly from this memory. Repeated accesses are thereby avoided and the waiting time of the processor 110 is reduced, which improves the efficiency of the electronic device.
In some embodiments, the processor 110 may include one or more interfaces. The interfaces may include an inter-integrated circuit (I2C) interface, an inter-integrated circuit sound (I2S) interface, a pulse code modulation (PCM) interface, a universal asynchronous receiver/transmitter (UART) interface, a mobile industry processor interface (MIPI), a general-purpose input/output (GPIO) interface, a SIM interface, and/or a USB interface, among others.
The I2C interface is a bidirectional synchronous serial bus that includes a serial data line (SDA) and a serial clock line (SCL). In some embodiments, the processor 110 may include multiple sets of I2C buses. The processor 110 may be coupled to the touch sensor 180K through different I2C bus interfaces, so that the processor 110 communicates with the touch sensor 180K through an I2C bus interface to implement the touch function of the electronic device.
The I2S interface can be used for audio communication. In some embodiments, the processor 110 may include multiple sets of I2S buses. The processor 110 may be coupled to the audio module 170 through an I2S bus to implement communication between the processor and the audio module 170. In some embodiments, the audio module 170 may transmit an audio signal to the wireless communication module 160 through the I2S interface to implement the function of answering calls through a Bluetooth headset.
The PCM interface can also be used for audio communication, sampling, quantizing and encoding an analog signal. In some embodiments, the audio module 170 and the wireless communication module 160 may be coupled through a PCM bus interface. In some embodiments, the audio module 170 may also transmit an audio signal to the wireless communication module 160 through the PCM interface to implement the function of answering calls through a Bluetooth headset.
The UART interface is a universal serial data bus used for asynchronous communication. The bus may be a bidirectional communication bus. It converts the data to be transmitted between serial and parallel form. In some embodiments, the UART interface is usually used to connect the processor 110 and the wireless communication module 160. For example, the processor 110 communicates with the Bluetooth module in the wireless communication module 160 through the UART interface to implement the Bluetooth function. In some embodiments, the audio module 170 may transmit an audio signal to the wireless communication module 160 through the UART interface to implement the function of playing music through a Bluetooth headset.
The MIPI interface can be used to connect the processor 110 to peripheral devices such as the display screen 194 and the camera 193. The MIPI interface includes a camera serial interface (CSI), a display serial interface (DSI), and the like. In some embodiments, the processor 110 communicates with the camera 193 through the CSI interface to implement the shooting function of the electronic device, and communicates with the display screen 194 through the DSI interface to implement the display function of the electronic device.
GPIO接口可以通过软件配置。GPIO接口可以被配置为控制信号,也可被配置为数据信号。在一些实施例中,GPIO接口可以用于连接处理器110与摄像头193,显示屏194,无线通信模块160,音频模块170,传感器模块180等。GPIO接口还可以被配置为I2C接口,I2S接口,UART接口,MIPI接口等。The GPIO interface can be configured by software. The GPIO interface can be configured as a control signal or as a data signal. In some embodiments, the GPIO interface can be used to connect the processor 110 with the camera 193, the display screen 194, the wireless communication module 160, the audio module 170, the sensor module 180, and the like. The GPIO interface can also be configured as an I2C interface, I2S interface, UART interface, MIPI interface, etc.
USB接口130是符合USB标准规范的接口,具体可以是Mini USB接口,Micro USB接口,USB Type C接口等。USB接口130可以用于连接充电器为电子设备充电,也可以用于电子设备与外围设备之间传输数据。也可以用于连接耳机,通过耳机播放音频。该接口还可以用于连接其他电子设备,例如AR设备等。The USB interface 130 is an interface conforming to the USB standard specification, specifically, it can be a Mini USB interface, a Micro USB interface, a USB Type C interface, and the like. The USB interface 130 can be used to connect a charger to charge the electronic device, and can also be used to transmit data between the electronic device and peripheral devices. It can also be used to connect headphones and play audio through them. This interface can also be used to connect other electronic devices, such as AR devices.
可以理解的是,本申请实施例示意的各模块间的接口连接关系,只是示意性说明,并不构成对电子设备的结构限定。在另一些实施例中,电子设备也可以采用上述实施例中不同的接口连接方式,或多种接口连接方式的组合。It can be understood that the interface connection relationship between the modules shown in the embodiment of the present application is only a schematic illustration, and does not constitute a structural limitation of the electronic device. In other embodiments, the electronic device may also adopt different interface connection methods in the above embodiments, or a combination of multiple interface connection methods.
充电管理模块140用于从充电器接收充电输入。其中,充电器可以是无线充电器,也可以是有线充电器。在一些有线充电器的实施例中,充电管理模块140可以通过USB接口130接收有线充电器的充电输入。在一些无线充电的实施例中,充电管理模块140可以通过电子设备的无线充电线圈接收无线充电输入。充电管理模块140为电池142充电的同时,还可以通过电源管理模块141为电子设备供电。The charging management module 140 is configured to receive a charging input from a charger. The charger may be a wireless charger or a wired charger. In some embodiments of wired charging, the charging management module 140 can receive the charging input of the wired charger through the USB interface 130. In some embodiments of wireless charging, the charging management module 140 may receive a wireless charging input through a wireless charging coil of the electronic device. While charging the battery 142, the charging management module 140 can also supply power to the electronic device through the power management module 141.
电源管理模块141用于连接电池142,充电管理模块140与处理器110。电源管理模块141接收电池142和/或充电管理模块140的输入,为处理器110,内部存储器121,外部存储器,显示屏194,摄像头193,和无线通信模块160等供电。电源管理模块141还可以用于监测电池容量,电池循环次数,电池健康状态等参数。在其他一些实施例中,电源管理模块141也可以设置于处理器110中。在另一些实施例中,电源管理模块141和充电管理模块140也可以设置于同一器件中。The power management module 141 is used to connect the battery 142, the charging management module 140, and the processor 110. The power management module 141 receives input from the battery 142 and/or the charging management module 140, and supplies power to the processor 110, the internal memory 121, the external memory, the display screen 194, the camera 193, the wireless communication module 160, and the like. The power management module 141 can also be used to monitor parameters such as battery capacity, battery cycle count, and battery health status. In some other embodiments, the power management module 141 may also be disposed in the processor 110. In some other embodiments, the power management module 141 and the charging management module 140 may also be disposed in the same device.
电子设备的无线通信功能可以通过天线1,天线2,移动通信模块150,无线通信模块160,调制解调处理器以及基带处理器等实现。The wireless communication function of the electronic device can be realized by the antenna 1, the antenna 2, the mobile communication module 150, the wireless communication module 160, the modem processor and the baseband processor.
天线1和天线2用于发射和接收电磁波信号。电子设备中的每个天线可用于覆盖单个或多个通信频带。不同的天线还可以复用,以提高天线的利用率。例如:可以将天线1复用为无线局域网的分集天线。在另一些实施例中,天线可以和调谐开关结合使用。 Antenna 1 and Antenna 2 are used to transmit and receive electromagnetic wave signals. Each antenna in an electronic device can be used to cover a single or multiple communication frequency bands. Different antennas can also be multiplexed to improve the utilization of the antennas. For example: Antenna 1 can be multiplexed as a diversity antenna of a wireless local area network. In other embodiments, the antenna may be used in conjunction with a tuning switch.
移动通信模块150可以提供应用在电子设备上的包括2G/3G/4G/5G等无线通信的解决方案。移动通信模块150可以包括至少一个滤波器,开关,功率放大器,低噪声放大器(Low Noise Amplifier,LNA)等。移动通信模块150可以由天线1接收电磁波,并对接收的电磁波进行滤波,放大等处理,传送至调制解调处理器进行解调。移动通信模块150还可以对经调制解调处理器调制后的信号放大,经天线1转为电磁波辐射出去。在一些实施例中,移动通信模块150的至少部分功能模块可以被设置于处理器110中。在一些实施例中,移动通信模块150的至少部分功能模块可以与处理器110中的至少部分模块被设置在同一器件中。The mobile communication module 150 can provide wireless communication solutions including 2G/3G/4G/5G applied to electronic devices. The mobile communication module 150 may include at least one filter, switch, power amplifier, low noise amplifier (Low Noise Amplifier, LNA) and the like. The mobile communication module 150 can receive electromagnetic waves through the antenna 1, filter and amplify the received electromagnetic waves, and send them to the modem processor for demodulation. The mobile communication module 150 can also amplify the signals modulated by the modem processor, and convert them into electromagnetic waves through the antenna 1 for radiation. In some embodiments, at least part of the functional modules of the mobile communication module 150 may be set in the processor 110 . In some embodiments, at least part of the functional modules of the mobile communication module 150 and at least part of the modules in the processor 110 may be set in the same device.
调制解调处理器可以包括调制器和解调器。其中,调制器用于将待发送的低频基带信号调制成中高频信号。解调器用于将接收的电磁波信号解调为低频基带信号。随后解调器将解调得到的低频基带信号传送至基带处理器处理。低频基带信号经基带处理器处理后,被传递给应用处理器。应用处理器通过音频设备(不限于扬声器170A,受话器170B等)输出声音信号,或通过显示屏194显示图像或视频。在一些实施例中,调制解调处理器可以是独立的器件。在另一些实施例中,调制解调处理器可以独立于处理器110,与移动通信模块150或其他功能模块设置在同一个器件中。A modem processor may include a modulator and a demodulator. Wherein, the modulator is used for modulating the low-frequency baseband signal to be transmitted into a medium-high frequency signal. The demodulator is used to demodulate the received electromagnetic wave signal into a low frequency baseband signal. Then the demodulator sends the demodulated low-frequency baseband signal to the baseband processor for processing. The low-frequency baseband signal is passed to the application processor after being processed by the baseband processor. The application processor outputs sound signals through audio equipment (not limited to speaker 170A, receiver 170B, etc.), or displays images or videos through display screen 194 . In some embodiments, the modem processor may be a stand-alone device. In some other embodiments, the modem processor may be independent from the processor 110, and be set in the same device as the mobile communication module 150 or other functional modules.
无线通信模块160可以提供应用在电子设备上的包括无线局域网(wireless local area networks,WLAN)(如无线保真(wireless fidelity,Wi-Fi)网络),蓝牙(bluetooth,BT),全球导航卫星系统(global navigation satellite system,GNSS),调频(frequency modulation,FM),近距离无线通信技术(near field communication,NFC),红外技术(infrared,IR)等无线通信的解决方案。无线通信模块160可以是集成至少一个通信处理模块的一个或多个器件。无线通信模块160经由天线2接收电磁波,将电磁波信号调频以及滤波处理,将处理后的信号发送到处理器110。无线通信模块160还可以从处理器110接收待发送的信号,对其进行调频,放大,经天线2转为电磁波辐射出去。The wireless communication module 160 can provide solutions for wireless communication applied to the electronic device, including wireless local area networks (WLAN) (such as wireless fidelity (Wi-Fi) networks), Bluetooth (BT), global navigation satellite system (GNSS), frequency modulation (FM), near field communication (NFC), infrared (IR), and the like. The wireless communication module 160 may be one or more devices integrating at least one communication processing module. The wireless communication module 160 receives electromagnetic waves via the antenna 2, frequency-modulates and filters the electromagnetic wave signals, and sends the processed signals to the processor 110. The wireless communication module 160 can also receive the signal to be sent from the processor 110, frequency-modulate it, amplify it, and convert it into electromagnetic waves through the antenna 2 for radiation.
电子设备通过GPU,显示屏194,以及应用处理器等实现显示功能。GPU为图像处理的微处理器,连接显示屏194和应用处理器。GPU用于执行数学和几何计算,用于图形渲染。处理器110可包括一个或多个GPU,其执行程序指令以生成或改变显示信息。The electronic device realizes the display function through the GPU, the display screen 194, and the application processor. The GPU is a microprocessor for image processing, and is connected to the display screen 194 and the application processor. GPUs are used to perform mathematical and geometric calculations for graphics rendering. Processor 110 may include one or more GPUs that execute program instructions to generate or change display information.
显示屏194用于显示图像,视频等。显示屏194包括显示面板。显示面板可以采用液晶显示屏(liquid crystal display,LCD),有机发光二极管(organic light-emitting diode,OLED),有源矩阵有机发光二极体或主动矩阵有机发光二极体(active-matrix organic light emitting diode,AMOLED),柔性发光二极管(flex light-emitting diode,FLED),Mini LED,Micro LED,Micro-OLED,量子点发光二极管(quantum dot light emitting diodes,QLED)等。在一些实施例中,电子设备可以包括1个或N个显示屏194,N为大于1的正整数。The display screen 194 is used to display images, videos, and the like. The display screen 194 includes a display panel. The display panel may be a liquid crystal display (LCD), an organic light-emitting diode (OLED), an active-matrix organic light-emitting diode (AMOLED), a flexible light-emitting diode (FLED), a Mini LED, a Micro LED, a Micro-OLED, quantum dot light-emitting diodes (QLED), or the like. In some embodiments, the electronic device may include 1 or N display screens 194, where N is a positive integer greater than 1.
在本申请的一些实施例中,显示屏194显示用户的运动图像。In some embodiments of the present application, the display screen 194 displays moving images of the user.
电子设备可以通过ISP,摄像头193,视频编解码器,GPU,显示屏194以及应用处理器等实现拍摄功能。The electronic device can implement the shooting function through the ISP, the camera 193, the video codec, the GPU, the display screen 194, the application processor, and the like.
ISP用于处理摄像头193反馈的数据。例如,拍照时,打开快门,光线通过镜头被传递到摄像头感光元件上,光信号转换为电信号,摄像头感光元件将所述电信号传递给ISP处理,转化为肉眼可见的图像或视频。ISP还可以对图像的噪点,亮度等进行算法优化。ISP还可以对拍摄场景的曝光,色温等参数优化。在一些实施例中,ISP可以设置在摄像头193中。The ISP is used for processing the data fed back by the camera 193 . For example, when taking a picture, open the shutter, the light is transmitted to the photosensitive element of the camera through the lens, the light signal is converted into an electrical signal, and the photosensitive element of the camera transmits the electrical signal to the ISP for processing, and converts it into an image or video visible to the naked eye. ISP can also perform algorithm optimization on image noise, brightness, etc. ISP can also optimize the exposure, color temperature and other parameters of the shooting scene. In some embodiments, the ISP may be located in the camera 193 .
摄像头193用于捕获静态图像或视频。物体通过镜头生成光学图像投射到感光元件。感光元件可以是电荷耦合器件(charge coupled device,CCD)或互补金属氧化物半导体(complementary metal-oxide-semiconductor,CMOS)光电晶体管。感光元件把光信号转换成电信号,之后将电信号传递给ISP转换成数字图像或视频信号。ISP将数字图像或视频信号输出到DSP加工处理。DSP将数字图像或视频信号转换成标准的RGB,YUV等格式的图像或视频信号。在一些实施例中,电子设备可以包括1个或N个摄像头193,N为大于1的正整数。Camera 193 is used to capture still images or video. The object generates an optical image through the lens and projects it to the photosensitive element. The photosensitive element may be a charge coupled device (CCD) or a complementary metal-oxide-semiconductor (CMOS) phototransistor. The photosensitive element converts the light signal into an electrical signal, and then transmits the electrical signal to the ISP for conversion into a digital image or video signal. ISP outputs digital image or video signal to DSP for processing. DSP converts digital images or video signals into standard RGB, YUV and other formats of images or video signals. In some embodiments, the electronic device may include 1 or N cameras 193, where N is a positive integer greater than 1.
在本申请的一些实施例中,通过摄像头193实时检测用户足、膝、髋、腕、肘、头颈等部位的二维位置信息。In some embodiments of the present application, the two-dimensional position information of the user's feet, knees, hips, wrists, elbows, head and neck, etc. is detected in real time by the camera 193 .
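As a non-limiting sketch of how the two-dimensional joint positions above could feed the motion analysis this application describes, the following Python computes the angle at a joint from three 2D keypoints. The coordinates, function name, and straight-leg example are illustrative assumptions, not part of the claimed embodiment:

```python
import math

def joint_angle(a, b, c):
    """Angle at joint b (in degrees) between segments b->a and b->c,
    given 2D pixel-coordinate keypoints."""
    v1 = (a[0] - b[0], a[1] - b[1])
    v2 = (c[0] - b[0], c[1] - b[1])
    dot = v1[0] * v2[0] + v1[1] * v2[1]
    norm = math.hypot(*v1) * math.hypot(*v2)
    # Clamp to [-1, 1] to guard against floating-point drift.
    cos_t = max(-1.0, min(1.0, dot / norm))
    return math.degrees(math.acos(cos_t))

# Hypothetical hip/knee/ankle keypoints from one camera frame.
hip, knee, ankle = (100, 50), (100, 150), (100, 250)
knee_angle = joint_angle(hip, knee, ankle)  # collinear points -> 180 degrees
```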
数字信号处理器用于处理数字信号,除了可以处理数字图像或视频信号,还可以处理其他数字信号。例如,当电子设备在频点选择时,数字信号处理器用于对频点能量进行傅里叶变换等。Digital signal processors are used to process digital signals. In addition to digital image or video signals, they can also process other digital signals. For example, when an electronic device selects a frequency point, a digital signal processor is used to perform Fourier transform on the frequency point energy, etc.
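The frequency-point energy analysis mentioned above can be illustrated with a naive discrete Fourier transform. This is a simplified sketch only (a real DSP would use an optimized FFT, and the signal below is synthetic):

```python
import cmath
import math

def dft_energy(samples):
    """Per-bin energy |X[k]|^2 of a naive DFT of a real-valued signal."""
    n = len(samples)
    out = []
    for k in range(n):
        xk = sum(samples[t] * cmath.exp(-2j * math.pi * k * t / n)
                 for t in range(n))
        out.append(abs(xk) ** 2)
    return out

# A pure cosine at bin 1: its energy concentrates in bins 1 and n-1.
n = 8
tone = [math.cos(2 * math.pi * t / n) for t in range(n)]
energy = dft_energy(tone)
```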
视频编解码器用于对数字视频压缩或解压缩。电子设备可以支持一种或多种视频编解码器。这样,电子设备可以播放或录制多种编码格式的视频,例如:动态图像专家组(Moving Picture Experts Group,MPEG)1,MPEG2,MPEG3,MPEG4等。Video codecs are used to compress or decompress digital video. An electronic device may support one or more video codecs. In this way, the electronic device can play or record videos in various encoding formats, such as: Moving Picture Experts Group (Moving Picture Experts Group, MPEG) 1, MPEG2, MPEG3, MPEG4, etc.
NPU为神经网络(Neural-Network,NN)计算处理器,通过借鉴生物神经网络结构,例 如借鉴人脑神经元之间传递模式,对输入信息快速处理,还可以不断的自学习。通过NPU可以实现电子设备的智能认知等应用,例如:图像识别,人脸识别,语音识别,文本理解等。NPU is a neural network (Neural-Network, NN) computing processor. By referring to the structure of biological neural networks, such as the transmission mode between neurons in the human brain, it can quickly process input information and can continuously learn by itself. Applications such as intelligent cognition of electronic devices can be realized through NPU, such as: image recognition, face recognition, speech recognition, text understanding, etc.
外部存储器接口120可以用于连接外部存储卡,例如Micro SD卡,实现扩展电子设备的存储能力。外部存储卡通过外部存储器接口120与处理器110通信,实现数据存储功能。例如将音乐,视频等文件保存在外部存储卡中。The external memory interface 120 can be used to connect an external memory card, such as a Micro SD card, to expand the storage capacity of the electronic device. The external memory card communicates with the processor 110 through the external memory interface 120 to implement a data storage function. Such as saving music, video and other files in the external memory card.
内部存储器121可以用于存储计算机可执行程序代码,所述可执行程序代码包括指令。处理器110通过运行存储在内部存储器121的指令,从而执行电子设备的各种功能应用以及数据处理。内部存储器121可以包括存储程序区和存储数据区。其中,存储程序区可存储操作系统,至少一个功能所需的应用程序(比如声音播放功能,图像视频播放功能等)等。存储数据区可存储电子设备使用过程中所创建的数据(比如音频数据,电话本等)等。此外,内部存储器121可以包括高速随机存取存储器,还可以包括非易失性存储器,例如至少一个磁盘存储器件,闪存器件,通用闪存存储器(universal flash storage,UFS)等。The internal memory 121 may be used to store computer-executable program code, where the executable program code includes instructions. The processor 110 executes various functional applications and data processing of the electronic device by running the instructions stored in the internal memory 121. The internal memory 121 may include a program storage area and a data storage area. The program storage area can store the operating system and applications required by at least one function (such as a sound playback function, an image/video playback function, etc.). The data storage area can store data (such as audio data, a phone book, etc.) created during use of the electronic device. In addition, the internal memory 121 may include a high-speed random access memory, and may also include a non-volatile memory, such as at least one magnetic disk storage device, a flash memory device, a universal flash storage (UFS), and the like.
电子设备可以通过音频模块170,扬声器170A,受话器170B,麦克风170C,耳机接口170D,以及应用处理器等实现音频功能。例如音乐播放,录音等。The electronic device can implement audio functions through the audio module 170, the speaker 170A, the receiver 170B, the microphone 170C, the earphone interface 170D, and the application processor. Such as music playback, recording, etc.
音频模块170用于将数字音频信息转换成模拟音频信号输出,也用于将模拟音频输入转换为数字音频信号。The audio module 170 is used to convert digital audio information into analog audio signal output, and is also used to convert analog audio input into digital audio signal.
扬声器170A,也称“喇叭”,用于将音频电信号转换为声音信号。电子设备可以通过扬声器170A收听音乐,或收听免提通话。Speaker 170A, also referred to as a "horn", is used to convert audio electrical signals into sound signals. The electronic device can play music or conduct a hands-free call through the speaker 170A.
受话器170B,也称“听筒”,用于将音频电信号转换成声音信号。当电子设备接听电话或语音信息时,可以通过将受话器170B靠近人耳接听语音。Receiver 170B, also called "earpiece", is used to convert audio electrical signals into sound signals. When the electronic device receives a call or a voice message, it can listen to the voice by placing the receiver 170B close to the human ear.
麦克风170C,也称“话筒”,“传声器”,用于将声音信号转换为电信号。当拨打电话或发送语音信息时,用户可以通过人嘴靠近麦克风170C发声,将声音信号输入到麦克风170C。电子设备可以设置至少一个麦克风170C。The microphone 170C, also called "microphone" or "microphone", is used to convert sound signals into electrical signals. When making a phone call or sending a voice message, the user can put his mouth close to the microphone 170C to make a sound, and input the sound signal to the microphone 170C. The electronic device may be provided with at least one microphone 170C.
耳机接口170D用于连接有线耳机。The earphone interface 170D is used for connecting wired earphones.
传感器模块180可以包括1个或多个传感器,这些传感器可以为相同类型或不同类型。可理解,图1所示的传感器模块180仅为一种示例性的划分方式,还可能有其他划分方式,本申请对此不作限制。The sensor module 180 may include one or more sensors, which may be of the same type or of different types. It can be understood that the sensor module 180 shown in FIG. 1 is only an exemplary division manner, and there may be other division manners, which are not limited in the present application.
压力传感器180A用于感受压力信号,可以将压力信号转换成电信号。在一些实施例中,压力传感器180A可以设置于显示屏194。当有触摸操作作用于显示屏194,电子设备根据压力传感器180A检测所述触摸操作强度。电子设备也可以根据压力传感器180A的检测信号计算触摸的位置。在一些实施例中,作用于相同触摸位置,但不同触摸操作强度的触摸操作,可以对应不同的操作指令。例如:当有触摸操作强度小于第一压力阈值的触摸操作作用于短消息应用图标时,执行查看短消息的指令。当有触摸操作强度大于或等于第一压力阈值的触摸操作作用于短消息应用图标时,执行新建短消息的指令。The pressure sensor 180A is used to sense a pressure signal and can convert the pressure signal into an electrical signal. In some embodiments, the pressure sensor 180A may be disposed on the display screen 194. When a touch operation acts on the display screen 194, the electronic device detects the intensity of the touch operation according to the pressure sensor 180A. The electronic device may also calculate the touch position according to the detection signal of the pressure sensor 180A. In some embodiments, touch operations acting on the same touch position but with different touch operation intensities may correspond to different operation instructions. For example, when a touch operation whose intensity is less than a first pressure threshold acts on the short message application icon, an instruction to view short messages is executed; when a touch operation whose intensity is greater than or equal to the first pressure threshold acts on the short message application icon, an instruction to create a new short message is executed.
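The pressure-threshold dispatch described above (a light press views short messages, a firm press composes a new one) can be summarized by the following illustrative Python sketch; the threshold value, icon name, and instruction names are assumptions for illustration only:

```python
# Hypothetical normalized force threshold; the embodiment only requires
# that some "first pressure threshold" exists.
FIRST_PRESSURE_THRESHOLD = 0.5

def dispatch_touch(icon, force):
    """Map a touch on the short-message icon to an instruction by force."""
    if icon != "sms":
        return "ignore"
    if force < FIRST_PRESSURE_THRESHOLD:
        return "view_messages"      # light press: view short messages
    return "compose_message"        # firm press: create a new short message
```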
陀螺仪传感器180B可以用于确定电子设备的运动姿态。在一些实施例中,可以通过陀螺仪传感器180B确定电子设备围绕三个轴(即,x,y和z轴)的角速度。陀螺仪传感器180B可以用于拍摄防抖。The gyro sensor 180B can be used to determine the motion posture of the electronic device. In some embodiments, the angular velocity of the electronic device about three axes (ie, x, y, and z axes) may be determined by the gyro sensor 180B. The gyro sensor 180B can be used for image stabilization.
气压传感器180C用于测量气压。在一些实施例中,电子设备通过气压传感器180C测得的气压值计算海拔高度,辅助定位和导航。The air pressure sensor 180C is used to measure air pressure. In some embodiments, the electronic device calculates the altitude through the air pressure value measured by the air pressure sensor 180C to assist in positioning and navigation.
磁传感器180D包括霍尔传感器。电子设备可以利用磁传感器180D检测翻盖皮套的开合。The magnetic sensor 180D includes a Hall sensor. The electronic device may detect opening and closing of the flip holster using the magnetic sensor 180D.
加速度传感器180E可检测电子设备在各个方向上(一般为三轴)加速度的大小。当电子设备静止时可检测出重力的大小及方向。还可以用于识别电子设备姿态,应用于横竖屏切换,计步器等应用。The acceleration sensor 180E can detect the acceleration of the electronic device in various directions (generally three axes). When the electronic device is stationary, the magnitude and direction of gravity can be detected. It can also be used to identify the posture of electronic devices, and can be used in applications such as horizontal and vertical screen switching, pedometers, etc.
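As an illustrative sketch of how 3-axis accelerometer readings could drive the portrait/landscape switching described above: this simplification assumes the device lies roughly in the sensor's x-y plane, and the gravity components (in m/s²) are hypothetical:

```python
import math

def screen_orientation(ax, ay):
    """Choose portrait/landscape from the in-plane gravity components
    (simplified: ignores the z axis and face-up/face-down cases)."""
    return "portrait" if abs(ay) >= abs(ax) else "landscape"

def gravity_magnitude(ax, ay, az):
    """Magnitude of the measured acceleration; ~9.81 m/s^2 at rest."""
    return math.sqrt(ax * ax + ay * ay + az * az)
```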
距离传感器180F,用于测量距离。电子设备可以通过红外或激光测量距离。在一些实施例中,拍摄场景,电子设备可以利用距离传感器180F测距以实现快速对焦。The distance sensor 180F is used to measure the distance. Electronic devices can measure distance via infrared or laser light. In some embodiments, when shooting a scene, the electronic device can use the distance sensor 180F to measure the distance to achieve fast focusing.
接近光传感器180G可以包括例如发光二极管(LED)和光检测器,例如光电二极管。发光二极管可以是红外发光二极管。电子设备通过发光二极管向外发射红外光。电子设备使用光电二极管检测来自附近物体的红外反射光。当检测到充分的反射光时,可以确定电子设备附近有物体。当检测到不充分的反射光时,电子设备可以确定电子设备附近没有物体。电子设备可以利用接近光传感器180G检测用户手持电子设备贴近耳朵通话,以便自动熄灭屏幕达到省电的目的。接近光传感器180G也可用于皮套模式,口袋模式自动解锁与锁屏。Proximity light sensor 180G may include, for example, light emitting diodes (LEDs) and light detectors, such as photodiodes. The light emitting diodes may be infrared light emitting diodes. Electronic devices emit infrared light outwards through light-emitting diodes. Electronic devices use photodiodes to detect infrared reflected light from nearby objects. When sufficient reflected light is detected, it can be determined that there is an object in the vicinity of the electronic device. When insufficient reflected light is detected, the electronic device may determine that there is no object in the vicinity of the electronic device. The electronic device can use the proximity light sensor 180G to detect that the user holds the electronic device close to the ear to make a call, so as to automatically turn off the screen to save power. The proximity light sensor 180G can also be used in leather case mode, automatic unlock and lock screen in pocket mode.
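The proximity-light decision logic above (sufficient reflected light implies a nearby object; during a call, the screen is dimmed when the ear is near) can be sketched as follows; the threshold constant and function names are hypothetical:

```python
# Hypothetical ADC-count threshold for "sufficient" reflected IR light.
REFLECTION_THRESHOLD = 100

def object_nearby(reflected_light):
    """Sufficient reflected light implies an object near the sensor."""
    return reflected_light >= REFLECTION_THRESHOLD

def screen_should_dim(in_call, reflected_light):
    """During a call, dim the screen when the ear is close to the device."""
    return in_call and object_nearby(reflected_light)
```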
环境光传感器180L用于感知环境光亮度。电子设备可以根据感知的环境光亮度自适应调节显示屏194亮度。The ambient light sensor 180L is used for sensing ambient light brightness. The electronic device can adaptively adjust the brightness of the display screen 194 according to the perceived ambient light brightness.
指纹传感器180H用于获取指纹。电子设备可以利用获取的指纹特性实现指纹解锁,访问应用锁,指纹拍照,指纹接听来电等。The fingerprint sensor 180H is used to acquire fingerprints. Electronic devices can use the acquired fingerprint features to unlock fingerprints, access application locks, take pictures with fingerprints, answer incoming calls with fingerprints, and so on.
温度传感器180J用于检测温度。在一些实施例中,电子设备利用温度传感器180J检测的温度,执行温度处理策略。The temperature sensor 180J is used to detect temperature. In some embodiments, the electronic device uses the temperature detected by the temperature sensor 180J to implement a temperature treatment strategy.
触摸传感器180K,也称触控面板或触敏表面。触摸传感器180K可以设置于显示屏194,由触摸传感器180K与显示屏194组成触摸屏,也称“触控屏”。触摸传感器180K用于检测作用于其上或附近的触摸操作。触摸传感器可以将检测到的触摸操作传递给应用处理器,以确定触摸事件类型。可以通过显示屏194提供与触摸操作相关的视觉输出。在另一些实施例中,触摸传感器180K也可以设置于电子设备的表面,与显示屏194所处的位置不同。The touch sensor 180K is also called a touch panel or a touch-sensitive surface. The touch sensor 180K can be disposed on the display screen 194, and the touch sensor 180K and the display screen 194 form a touch screen, also called a “touch screen”. The touch sensor 180K is used to detect a touch operation on or near it. The touch sensor can pass the detected touch operation to the application processor to determine the type of touch event. Visual output related to the touch operation can be provided through the display screen 194 . In some other embodiments, the touch sensor 180K may also be disposed on the surface of the electronic device, which is different from the position of the display screen 194 .
骨传导传感器180M可以获取振动信号。在一些实施例中,骨传导传感器180M可以获取人体声部振动骨块的振动信号。骨传导传感器180M也可以接触人体脉搏,接收血压跳动信号。在一些实施例中,骨传导传感器180M也可以设置于耳机中,结合成骨传导耳机。音频模块170可以基于骨传导传感器180M获取的声部振动骨块的振动信号,解析出语音信号,实现语音功能。应用处理器可以基于骨传导传感器180M获取的血压跳动信号解析心率信息,实现心率检测功能。The bone conduction sensor 180M can acquire vibration signals. In some embodiments, the bone conduction sensor 180M can acquire the vibration signal of the vibrating bone mass of the human voice. The bone conduction sensor 180M can also contact the human pulse and receive the blood pressure beating signal. In some embodiments, the bone conduction sensor 180M can also be disposed in the earphone, combined into a bone conduction earphone. The audio module 170 can analyze the voice signal based on the vibration signal of the vibrating bone mass of the vocal part acquired by the bone conduction sensor 180M, so as to realize the voice function. The application processor can analyze the heart rate information based on the blood pressure beating signal acquired by the bone conduction sensor 180M, so as to realize the heart rate detection function.
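The heart-rate detection mentioned above can be illustrated by converting beat timestamps, such as might be extracted from the bone-conduction pulse signal, into beats per minute; the timestamps below are synthetic and the function name is an assumption:

```python
def heart_rate_bpm(beat_times):
    """Estimate beats per minute from beat timestamps in seconds."""
    if len(beat_times) < 2:
        return None  # not enough beats to form an interval
    intervals = [b - a for a, b in zip(beat_times, beat_times[1:])]
    mean_interval = sum(intervals) / len(intervals)
    return 60.0 / mean_interval

# Synthetic beats 0.8 s apart -> 75 bpm.
bpm = heart_rate_bpm([0.0, 0.8, 1.6, 2.4])
```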
按键190包括开机键,音量键等。按键190可以是机械按键。也可以是触摸式按键。电子设备可以接收按键输入,产生与电子设备的用户设置以及功能控制有关的键信号输入。The keys 190 include a power key, a volume key and the like. The key 190 may be a mechanical key. It can also be a touch button. The electronic device can receive key input and generate key signal input related to user settings and function control of the electronic device.
马达191可以产生振动提示。马达191可以用于来电振动提示,也可以用于触摸振动反馈。例如,作用于不同应用(例如拍照,音频播放等)的触摸操作,可以对应不同的振动反馈效果。作用于显示屏194不同区域的触摸操作,马达191也可对应不同的振动反馈效果。不同的应用场景(例如:时间提醒,接收信息,闹钟,游戏等)也可以对应不同的振动反馈效果。触摸振动反馈效果还可以支持自定义。The motor 191 can generate a vibrating reminder. The motor 191 can be used for incoming call vibration prompts, and can also be used for touch vibration feedback. For example, touch operations applied to different applications (such as taking pictures, playing audio, etc.) may correspond to different vibration feedback effects. The motor 191 may also correspond to different vibration feedback effects for touch operations acting on different areas of the display screen 194 . Different application scenarios (for example: time reminder, receiving information, alarm clock, games, etc.) can also correspond to different vibration feedback effects. The touch vibration feedback effect can also support customization.
指示器192可以是指示灯,可以用于指示充电状态,电量变化,也可以用于指示消息,未接来电,通知等。The indicator 192 can be an indicator light, and can be used to indicate charging status, power change, and can also be used to indicate messages, missed calls, notifications, and the like.
SIM卡接口195用于连接SIM卡。SIM卡可以通过插入SIM卡接口195,或从SIM卡接口195拔出,实现和电子设备的接触和分离。电子设备可以支持1个或N个SIM卡接口,N为大于1的正整数。电子设备通过SIM卡和网络交互,实现通话以及数据通信等功能。在一些实施例中,电子设备采用eSIM,即:嵌入式SIM卡。eSIM卡可以嵌在电子设备中,不能和电子设备分离。The SIM card interface 195 is used to connect a SIM card. The SIM card can be inserted into or pulled out of the SIM card interface 195 to come into contact with or be separated from the electronic device. The electronic device can support 1 or N SIM card interfaces, where N is a positive integer greater than 1. The electronic device interacts with the network through the SIM card to implement functions such as calls and data communication. In some embodiments, the electronic device uses an eSIM, that is, an embedded SIM card. The eSIM card can be embedded in the electronic device and cannot be separated from the electronic device.
图1示例性所示的电子设备可以通过显示屏194显示以下各个实施例中所描述的各个用户界面。电子设备可以通过触摸传感器180K在各个用户界面中检测触控操作,例如在各个用户界面中的点击操作(如在图标上的触摸操作、双击操作),又例如在各个用户界面中的向上或向下的滑动操作,或执行画圆圈手势的操作,等等。在一些实施例中,电子设备可以通过陀螺仪传感器180B、加速度传感器180E等检测用户手持电子设备执行的运动手势,例如晃动电子设备。在一些实施例中,电子设备可以通过摄像头193检测非触控的手势操作。The electronic device exemplarily shown in FIG. 1 can display, through the display screen 194, the user interfaces described in the following embodiments. The electronic device can detect touch operations in each user interface through the touch sensor 180K, such as a tap operation (for example, a touch operation or a double-tap operation on an icon), an upward or downward swipe operation, or an operation of drawing a circle gesture, and so on. In some embodiments, the electronic device may detect, through the gyroscope sensor 180B, the acceleration sensor 180E, and the like, motion gestures performed by the user holding the electronic device, such as shaking the electronic device. In some embodiments, the electronic device can detect non-touch gesture operations through the camera 193.
在本申请的一些实施例中,电子设备可以通过摄像头193捕捉用户的运动图像。In some embodiments of the present application, the electronic device can capture the moving image of the user through the camera 193 .
电子设备的软件系统可以采用分层架构,事件驱动架构,微核架构,微服务架构,或云架构。本申请实施例以分层架构的Android系统为例,示例性说明电子设备的软件结构。The software system of the electronic device may adopt a layered architecture, an event-driven architecture, a micro-kernel architecture, a micro-service architecture, or a cloud architecture. In this embodiment of the present application, the Android system with layered architecture is taken as an example to illustrate the software structure of the electronic device.
图2是本申请实施例的电子设备的软件结构框图。FIG. 2 is a block diagram of a software structure of an electronic device according to an embodiment of the present application.
分层架构将软件分成若干个层,每一层都有清晰的角色和分工。层与层之间通过软件接口通信。在一些实施例中,将系统分为四层,从上至下分别为应用程序层,应用程序框架层,运行时(Android runtime)和系统库,以及内核层。The layered architecture divides the software into several layers, and each layer has a clear role and division of labor. Layers communicate through software interfaces. In some embodiments, the system is divided into four layers, which are application program layer, application program framework layer, runtime (Android runtime) and system library, and kernel layer from top to bottom.
应用程序层可以包括一系列应用程序包。The application layer can consist of a series of application packages.
如图2所示,应用程序包可以包括相机,图库,日历,通话,地图,导航,WLAN,蓝牙,音乐,视频,短信息等应用程序(也可以称为应用)。As shown in FIG. 2, the application package may include application programs (also called applications) such as camera, gallery, calendar, call, map, navigation, WLAN, Bluetooth, music, video, and short message.
应用程序框架层为应用程序层的应用程序提供应用编程接口(application programming interface,API)和编程框架。应用程序框架层包括一些预先定义的函数。The application framework layer provides an application programming interface (application programming interface, API) and a programming framework for applications in the application layer. The application framework layer includes some predefined functions.
如图2所示,应用程序框架层可以包括窗口管理器,内容提供器,视图系统,电话管理器,资源管理器,通知管理器等。As shown in Figure 2, the application framework layer can include window managers, content providers, view systems, phone managers, resource managers, notification managers, and so on.
窗口管理器用于管理窗口程序。窗口管理器可以获取显示屏大小,判断是否有状态栏,锁定屏幕,截取屏幕等。A window manager is used to manage window programs. The window manager can get the size of the display screen, determine whether there is a status bar, lock the screen, capture the screen, etc.
内容提供器用来存放和获取数据,并使这些数据可以被应用程序访问。所述数据可以包括视频,图像,音频,拨打和接听的电话,浏览历史和书签,电话簿等。Content providers are used to store and retrieve data and make it accessible to applications. Said data may include video, images, audio, calls made and received, browsing history and bookmarks, phonebook, etc.
视图系统包括可视控件,例如显示文字的控件,显示图片的控件等。视图系统可用于构建应用程序。显示界面可以由一个或多个视图组成。例如,包括短信通知图标的显示界面,可以包括显示文字的视图以及显示图片的视图。The view system includes visual controls, such as controls for displaying text, controls for displaying pictures, and so on. The view system can be used to build applications. A display interface can consist of one or more views. For example, a display interface including a text message notification icon may include a view for displaying text and a view for displaying pictures.
电话管理器用于提供电子设备的通信功能。例如通话状态的管理(包括接通,挂断等)。The phone manager is used to provide communication functions of electronic devices. For example, the management of call status (including connected, hung up, etc.).
资源管理器为应用程序提供各种资源,比如本地化字符串,图标,图片,布局文件,视频文件等等。The resource manager provides various resources for the application, such as localized strings, icons, pictures, layout files, video files, and so on.
通知管理器使应用程序可以在状态栏中显示通知信息,可以用于传达告知类型的消息,可以短暂停留后自动消失,无需用户交互。比如通知管理器被用于告知下载完成,消息提醒等。通知管理器还可以是以图表或者滚动条文本形式出现在系统顶部状态栏的通知,例如后台运行的应用程序的通知,还可以是以对话界面形式出现在屏幕上的通知。例如在状态栏提示文本信息,发出提示音,电子设备振动,指示灯闪烁等。The notification manager enables the application to display notification information in the status bar, which can be used to convey notification-type messages, and can automatically disappear after a short stay without user interaction. For example, the notification manager is used to notify the download completion, message reminder, etc. The notification manager can also be a notification that appears on the top status bar of the system in the form of a chart or scroll bar text, such as a notification of an application running in the background, or a notification that appears on the screen in the form of a dialog interface. For example, prompting text information in the status bar, issuing a prompt sound, vibrating the electronic device, and flashing the indicator light, etc.
运行时包括核心库和虚拟机。运行时负责系统的调度和管理。The runtime includes the core library and virtual machine. The runtime is responsible for the scheduling and management of the system.
核心库包含两部分：一部分是编程语言（例如，java语言）需要调用的功能函数，另一部分是安卓的核心库。The core library consists of two parts: one part is the functions that the programming language (for example, the Java language) needs to call, and the other part is the core library of Android.
应用程序层和应用程序框架层运行在虚拟机中。虚拟机将应用程序层和应用程序框架层的编程文件(例如,java文件)执行为二进制文件。虚拟机用于执行对象生命周期的管理,堆栈管理,线程管理,安全和异常的管理,以及垃圾回收等功能。The application layer and the application framework layer run in virtual machines. The virtual machine executes programming files (for example, java files) of the application program layer and the application program framework layer as binary files. The virtual machine is used to perform functions such as object life cycle management, stack management, thread management, security and exception management, and garbage collection.
系统库可以包括多个功能模块。例如:表面管理器(surface manager),媒体库(media libraries),三维图形处理库(例如:OpenGL ES),二维图形引擎(例如:SGL)等。A system library can include multiple function modules. For example: surface manager (surface manager), media library (media libraries), 3D graphics processing library (eg: OpenGL ES), 2D graphics engine (eg: SGL), etc.
表面管理器用于对显示子系统进行管理,并且为多个应用程序提供了二维(2-Dimensional,2D)和三维(3-Dimensional,3D)图层的融合。The surface manager is used to manage the display subsystem, and provides fusion of two-dimensional (2-Dimensional, 2D) and three-dimensional (3-Dimensional, 3D) layers for multiple applications.
媒体库支持多种常用的音频,视频格式回放和录制,以及静态图像文件等。媒体库可以支持多种音视频编码格式,例如:MPEG4,H.264,MP3,AAC,AMR,JPG,PNG等。The media library supports playback and recording of various commonly used audio and video formats, as well as still image files, etc. The media library can support a variety of audio and video encoding formats, such as: MPEG4, H.264, MP3, AAC, AMR, JPG, PNG, etc.
三维图形处理库用于实现3D图形绘图,图像渲染,合成,和图层处理等。The 3D graphics processing library is used to implement 3D graphics drawing, image rendering, compositing, and layer processing, etc.
2D图形引擎是2D绘图的绘图引擎。2D graphics engine is a drawing engine for 2D drawing.
内核层是硬件和软件之间的层。内核层至少包含显示驱动,摄像头驱动,音频驱动,传感器驱动,虚拟卡驱动。The kernel layer is the layer between hardware and software. The kernel layer includes at least a display driver, a camera driver, an audio driver, a sensor driver, and a virtual card driver.
下面结合捕获拍照场景,示例性说明电子设备软件以及硬件的工作流程。The workflow of the software and hardware of the electronic device will be exemplarily described below in conjunction with capturing a photographing scene.
当触摸传感器180K接收到触摸操作,相应的硬件中断被发给内核层。内核层将触摸操作加工成原始输入事件(包括触摸坐标,触摸操作的时间戳等信息)。原始输入事件被存储在内核层。应用程序框架层从内核层获取原始输入事件,识别该输入事件所对应的控件。以该触摸操作是触摸单击操作,该单击操作所对应的控件为相机应用图标的控件为例,相机应用调用应用框架层的接口,启动相机应用,进而通过调用内核层启动摄像头驱动,通过摄像头193捕获静态图像或视频。When the touch sensor 180K receives a touch operation, a corresponding hardware interrupt is sent to the kernel layer. The kernel layer processes touch operations into original input events (including touch coordinates, time stamps of touch operations, and other information). Raw input events are stored at the kernel level. The application framework layer obtains the original input event from the kernel layer, and identifies the control corresponding to the input event. Take the touch operation as a touch click operation, and the control corresponding to the click operation is the control of the camera application icon as an example. The camera application calls the interface of the application framework layer to start the camera application, and then starts the camera driver by calling the kernel layer. Camera 193 captures still images or video.
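The touch-event flow just described (the kernel wraps a touch into a raw input event carrying coordinates and a timestamp, the framework layer maps the coordinates to a control, and the matched control's application is launched) can be sketched in plain Java. This is an illustrative simplification only: `RawInputEvent`, `Control` and `TouchPipeline` are invented names for this sketch, not real Android framework APIs.

```java
import java.util.ArrayList;
import java.util.List;

public class TouchPipeline {
    // Raw input event produced by the kernel layer (coordinates + timestamp).
    static class RawInputEvent {
        final int x, y;
        final long timestampMs;
        RawInputEvent(int x, int y, long timestampMs) {
            this.x = x; this.y = y; this.timestampMs = timestampMs;
        }
    }

    // A control registered with the framework layer, e.g. an app icon.
    static class Control {
        final String name;
        final int left, top, right, bottom;  // hit rectangle on screen
        Control(String name, int left, int top, int right, int bottom) {
            this.name = name; this.left = left; this.top = top;
            this.right = right; this.bottom = bottom;
        }
        boolean contains(int x, int y) {
            return x >= left && x < right && y >= top && y < bottom;
        }
    }

    private final List<Control> controls = new ArrayList<>();

    void register(Control c) { controls.add(c); }

    // Framework hit-test: find which control the raw event falls on,
    // returning its name (the app to launch), or null if no control is hit.
    String dispatch(RawInputEvent e) {
        for (Control c : controls) {
            if (c.contains(e.x, e.y)) return c.name;
        }
        return null;
    }

    public static void main(String[] args) {
        TouchPipeline framework = new TouchPipeline();
        framework.register(new Control("camera", 0, 0, 100, 100));
        RawInputEvent tap = new RawInputEvent(40, 60, System.currentTimeMillis());
        System.out.println("hit control: " + framework.dispatch(tap));
    }
}
```

In the real system the "launch" step would go back down through the framework and kernel layers (starting the camera driver); here it is reduced to returning the matched control's name.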
本申请的说明书和权利要求书及附图中的术语“用户界面”,是应用程序或操作系统与用户之间进行交互和信息交换的介质接口,它实现信息的内部形式与用户可以接收形式之间的转换。用户界面常用的表现形式是图形用户界面(graphic user interface,GUI),是指采用图形方式显示的与计算机操作相关的用户界面。它可以是在电子设备的显示屏中显示的一个图标、窗口、控件等界面元素,其中控件可以包括图标、按钮、菜单、选项卡、文本框、对话框、状态栏、导航栏、Widget等可视的界面元素。The term "user interface" in the specification, claims and drawings of this application is a medium interface for interaction and information exchange between an application program or an operating system and a user. conversion between. The commonly used form of user interface is the graphical user interface (graphic user interface, GUI), which refers to the user interface related to computer operation displayed in a graphical way. It can be an icon, window, control and other interface elements displayed on the display screen of the electronic device, where the control can include icons, buttons, menus, tabs, text boxes, dialog boxes, status bars, navigation bars, widgets, etc. Visual interface elements.
下面介绍电子设备100上的用于应用程序菜单的示例性用户界面。An exemplary user interface for the application menu on the electronic device 100 is described below.
图3A示例性示出了电子设备上的用于展示电子设备安装的应用程序的示例性用户界面31。FIG. 3A exemplarily shows an exemplary user interface 31 on an electronic device for displaying applications installed on the electronic device.
用户界面31显示一个放置有应用图标的界面，该界面可包括多个应用图标，例如，时钟应用图标309、日历应用图标311、图库应用图标313、备忘录应用图标315、文件管理应用图标317、电子邮件应用图标319、音乐应用图标321、钱包应用图标323、华为视频应用图标325、运动健康应用图标327、天气应用图标329、浏览器应用图标331、智慧生活应用图标333、设置应用图标335、录音机应用图标337、应用商城应用图标339等。上述多个应用图标上方可显示有移动通信信号的一个或多个信号强度指示符303、时间指示符305、电池状态指示符307等。上述多个应用图标下方还可显示有界面指示符349，以表明当前显示界面与其他界面的位置关系。界面指示符下方有多个托盘图标，如电话应用图标345、信息应用图标347、通讯录应用图标343、相机应用图标341。托盘图标在界面切换时保持显示。本申请实施例对用户界面31上显示的内容不作限定。The user interface 31 displays an interface on which application icons are placed, and the interface may include a plurality of application icons, for example, a clock application icon 309, a calendar application icon 311, a gallery application icon 313, a memo application icon 315, a file management application icon 317, an email application icon 319, a music application icon 321, a wallet application icon 323, a Huawei Video application icon 325, a sports health application icon 327, a weather application icon 329, a browser application icon 331, a smart life application icon 333, a settings application icon 335, a recorder application icon 337, an app store application icon 339, and the like. One or more signal strength indicators 303 of mobile communication signals, a time indicator 305, a battery status indicator 307, etc. may be displayed above the plurality of application icons. An interface indicator 349 may also be displayed below the plurality of application icons to indicate the positional relationship between the currently displayed interface and other interfaces. There are multiple tray icons below the interface indicator, such as a phone application icon 345, a messaging application icon 347, an address book application icon 343, and a camera application icon 341. The tray icons remain displayed when the interface is switched. The embodiment of the present application does not limit the content displayed on the user interface 31.
可以理解的是,图3A仅仅示例性示出了电子设备上的用户界面,不应构成对本申请实施例的限定。It can be understood that FIG. 3A only exemplarily shows a user interface on an electronic device, and should not be construed as limiting the embodiment of the present application.
本申请以下实施例中,电子设备的应用程序“运动健康”、“相机”可提供“运动检测”的功能,其中,“运动检测”功能可用于在运动过程中,对用户的运动姿势进行检测,计算出用户在运动过程中相关关节的关节力/力矩,获知用户在该运动过程中的损伤风险情况。In the following embodiments of the present application, the application programs "Sports Health" and "Camera" of the electronic device can provide the function of "motion detection", wherein the "motion detection" function can be used to detect the user's motion posture during the exercise process , calculate the joint force/moment of the relevant joints of the user during the exercise, and learn the user's injury risk during the exercise.
“运动健康”、“相机”是电子设备上安装的应用程序,本申请对该应用程序的名称不作限制。"Sports and health" and "camera" are applications installed on electronic devices, and this application does not limit the names of the applications.
本申请实施例提供的运动分析的方法可以应用于多种场景,包括但不限于:The motion analysis method provided by the embodiment of this application can be applied to various scenarios, including but not limited to:
(1)在运动类应用中进行运动检测的场景(1) Scenarios for motion detection in motion applications
场景一:scene one:
如图3B所示，电子设备可以检测到作用于“运动健康”图标327的用户操作200（如在图标327上的点击操作），响应于该操作，可以显示图3C示例性所示的用户界面32。用户界面32可以是“运动健康”应用程序的主用户界面，该用户界面可包括运动模式列表351、导航栏352、搜索栏353、控件354、控件355、控件356、控件357、控件358。As shown in FIG. 3B, the electronic device may detect a user operation 200 acting on the "Sports Health" icon 327 (such as a click operation on the icon 327), and in response to the operation, may display the user interface 32 exemplarily shown in FIG. 3C. The user interface 32 may be the main user interface of the "Sports Health" application, and may include an exercise mode list 351, a navigation bar 352, a search bar 353, a control 354, a control 355, a control 356, a control 357, and a control 358.
其中，运动模式列表351可以显示有一个或多个运动模式选项。这一个或多个运动模式选项可以包括：室内跑步选项、健身选项、瑜伽选项、步行选项、骑行选项、跳绳选项。这一个或多个运动模式选项在界面上可以表现为文字信息。不限于此，这一个或多个运动模式选项在界面上还可以表现为图标或者其他形式的交互元素（interactive element，IE）。The exercise mode list 351 may display one or more exercise mode options, which may include: an indoor running option, a fitness option, a yoga option, a walking option, a cycling option, and a rope skipping option. These options may be presented on the interface as text information; without being limited thereto, they may also be presented as icons or other forms of interactive elements (interactive element, IE).
其中,控件354、356可用于监听触发打开“运动课程”的用户操作。电子设备100可以检测到作用于控件354的用户操作(如在控件354上的点击操作),响应于该操作,电子设备100可以显示图3D所示的用户界面33。用户界面33可包括控件360、控件361。控件360可用于监听触发打开“开始训练”的用户操作。电子设备可以检测到作用于控件360的用户操作(如在控件360上的点击操作202),响应于该操作,电子设备100可以显示如图3E所示的用户界面34。Among them, the controls 354 and 356 can be used to monitor user operations that trigger opening of the "exercise course". The electronic device 100 may detect a user operation on the control 354 (such as a click operation on the control 354 ), and in response to the operation, the electronic device 100 may display the user interface 33 shown in FIG. 3D . User interface 33 may include controls 360 , 361 . Control 360 may be used to listen for user actions that trigger opening of "Start Training". The electronic device may detect a user operation on the control 360 (such as the click operation 202 on the control 360), and in response to the operation, the electronic device 100 may display the user interface 34 as shown in FIG. 3E.
用户界面34可包括控件362、控件363。控件362可用于监听用户触发选择运动姿势检测功能控件的用户操作(如在控件362上的点击操作203)。控件363可用于监听用户触发不选择运动姿势检测功能控件的用户操作(如在控件363上的点击操作)。 User interface 34 may include controls 362 , 363 . The control 362 can be used to monitor the user's operation (such as the click operation 203 on the control 362 ) that triggers the selection of the motion gesture detection function control. The control 363 can be used to monitor the user's operation (such as a click operation on the control 363 ) to trigger the control not to select the motion posture detection function.
场景二:Scene two:
如图3B所示，电子设备可以检测到作用于“运动健康”的图标327的用户操作200（如在图标327上的点击操作），响应于该操作，可以显示图3C示例性所示的用户界面32。用户界面32可以是“运动健康”应用程序的用户界面，该用户界面可包括运动模式列表351、导航栏352、搜索栏353、控件354、控件355、控件356、控件357、控件358。As shown in FIG. 3B, the electronic device may detect a user operation 200 acting on the "Sports Health" icon 327 (such as a click operation on the icon 327), and in response to the operation, may display the user interface 32 exemplarily shown in FIG. 3C. The user interface 32 may be the user interface of the "Sports Health" application, and may include an exercise mode list 351, a navigation bar 352, a search bar 353, a control 354, a control 355, a control 356, a control 357, and a control 358.
其中,控件358可用于监听触发打开“模拟检测”的用户操作。如图3F所示,电子设备100可以检测到作用于控件358的用户操作(如在控件358上的点击操作204),响应于该操作,电子设备100触发启动模拟运动检测功能。Among them, the control 358 can be used to monitor the user operation that triggers the opening of "simulation detection". As shown in FIG. 3F , the electronic device 100 may detect a user operation on the control 358 (such as the click operation 204 on the control 358 ), and in response to the operation, the electronic device 100 triggers the activation of the analog motion detection function.
另一种情形，如图3D所示，用户界面33上可以包括控件361，控件361可用于监听触发打开“模拟检测”的用户操作。电子设备可以检测到作用于控件361的用户操作（如在控件361上的点击操作），响应于该操作，可以显示图3C示例性所示的用户界面32。用户界面32可以是“运动健康”应用程序的用户界面，电子设备100触发启动模拟运动检测功能。In another situation, as shown in FIG. 3D, the user interface 33 may include a control 361, and the control 361 may be used to listen for a user operation that triggers turning on "simulation detection". The electronic device may detect a user operation acting on the control 361 (such as a click operation on the control 361), and in response to the operation, may display the user interface 32 exemplarily shown in FIG. 3C. The user interface 32 may be the user interface of the "Sports Health" application, and the electronic device 100 triggers the simulated motion detection function.
场景三:Scene three:
如图3G所示，电子设备可以检测到作用于“相机”的图标341的用户操作205（如在图标341上的点击操作），响应于该操作，可以显示图3H示例性所示的用户界面35。用户界面35可以是“相机”应用程序的用户界面，该用户界面可包括相机设置列表364、拍摄模式列表365、运动检测选项366、控件367、控件368、控件369、区域370。其中：As shown in FIG. 3G, the electronic device may detect a user operation 205 acting on the "Camera" icon 341 (such as a click operation on the icon 341), and in response to the operation, may display the user interface 35 exemplarily shown in FIG. 3H. The user interface 35 may be the user interface of the "Camera" application, and may include a camera settings list 364, a shooting mode list 365, a motion detection option 366, a control 367, a control 368, a control 369, and an area 370. Among them:
相机设置列表364可用于显示一个或多个相机设置选项，以便用户调整相机设置参数。拍摄模式列表365可以显示一个或多个拍摄模式选项，可以包括：光圈选项、夜景选项、人像选项、拍照选项、录像选项、专业选项、运动检测选项。控件367用于监听触发打开“图库”的用户操作。控件368用于监听触发拍摄的用户操作。控件369用于监听触发切换摄像头的用户操作。区域370可以用于显示摄像头采集的图像。The camera settings list 364 may be used to display one or more camera setting options so that the user can adjust camera setting parameters. The shooting mode list 365 may display one or more shooting mode options, which may include: an aperture option, a night scene option, a portrait option, a photo option, a video option, a professional option, and a motion detection option. The control 367 is used to listen for a user operation that triggers opening the "Gallery". The control 368 is used to listen for a user operation that triggers shooting. The control 369 is used to listen for a user operation that triggers switching the camera. The area 370 may be used to display images captured by the camera.
电子设备100可以检测到作用于运动检测选项的用户操作(如在运动检测选项366的点击操作206),响应于该操作,电子设备100可以触发运动检测功能。The electronic device 100 may detect a user action on the motion detection option (such as the click operation 206 on the motion detection option 366), and in response to the action, the electronic device 100 may trigger the motion detection function.
可理解的,以上场景仅为示例,本申请实施例提供的运动分析的方法还可以应用到其他场景中,这里不做限制。It can be understood that the above scenarios are only examples, and the motion analysis method provided in the embodiment of the present application can also be applied to other scenarios, which is not limited here.
基于上述场景,下面介绍电子设备100上实现的用户界面(user interface,UI)的一些实施例。Based on the above scenario, some embodiments of a user interface (user interface, UI) implemented on the electronic device 100 are introduced below.
图4A-图4D示例性示出了“运动健康”应用程序中人脸识别模块的用户界面。电子设备100可以检测到作用于用户界面34的控件362的用户操作，响应于该操作，电子设备可以显示如图4A示例性所示的用户界面40。用户界面40可用于提示用户即将进行人脸识别，如显示文字“即将进行人脸识别”401，提示时间可以为5s，提示时间结束，电子设备100可以显示用户界面41，并开始人脸识别。如图4B所示，用户界面41示例性示出了人脸识别界面，采集人脸信息。电子设备100通过摄像头采集到人脸信息之后，可以进行一些必要的处理，将处理后的人脸信息与已存储的人脸信息模板进行匹配，以基于该人脸信息模板调取用户的身体评估信息。其中该人脸信息模板可以是用户在电子设备100进行人脸识别之前输入的。FIGS. 4A-4D exemplarily show the user interface of the face recognition module in the "Sports Health" application. The electronic device 100 may detect a user operation acting on the control 362 of the user interface 34, and in response to the operation, the electronic device may display the user interface 40 exemplarily shown in FIG. 4A. The user interface 40 may be used to prompt the user that face recognition is about to start, for example by displaying the text "Face recognition is about to start" 401; the prompt duration may be 5s, and when it ends, the electronic device 100 may display the user interface 41 and start face recognition. As shown in FIG. 4B, the user interface 41 exemplarily shows a face recognition interface for collecting face information. After the electronic device 100 collects the face information through the camera, it may perform some necessary processing and match the processed face information against the stored face information templates, so as to retrieve the user's body assessment information based on the matched template. The face information template may be input by the user before the electronic device 100 performs face recognition.
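The application does not specify how the processed face information is matched against the stored templates. As a hedged illustration only, one common approach represents each face as a feature vector and compares it to enrolled templates by cosine similarity, accepting the best match above a threshold; the vector representation, the `FaceTemplateStore` class and the 0.8 threshold below are all assumptions for this sketch, not taken from the application.

```java
import java.util.HashMap;
import java.util.Map;

public class FaceTemplateStore {
    private final Map<String, double[]> templates = new HashMap<>();
    private final double threshold;  // minimum similarity to accept a match

    public FaceTemplateStore(double threshold) { this.threshold = threshold; }

    // Store a template entered by the user before recognition is performed.
    public void enroll(String userId, double[] template) {
        templates.put(userId, template);
    }

    static double cosine(double[] a, double[] b) {
        double dot = 0, na = 0, nb = 0;
        for (int i = 0; i < a.length; i++) {
            dot += a[i] * b[i];
            na += a[i] * a[i];
            nb += b[i] * b[i];
        }
        return dot / (Math.sqrt(na) * Math.sqrt(nb));
    }

    // Returns the best-matching user id, or null for a new user
    // (no stored template is similar enough).
    public String match(double[] probe) {
        String best = null;
        double bestScore = threshold;
        for (Map.Entry<String, double[]> e : templates.entrySet()) {
            double s = cosine(probe, e.getValue());
            if (s >= bestScore) { bestScore = s; best = e.getKey(); }
        }
        return best;
    }

    public static void main(String[] args) {
        FaceTemplateStore store = new FaceTemplateStore(0.8);
        store.enroll("userA", new double[]{1, 0, 0});
        System.out.println(store.match(new double[]{0.9, 0.1, 0}));  // userA
        System.out.println(store.match(new double[]{0, 0, 1}));     // null
    }
}
```

A `null` result corresponds to the "new user" branch described below, where the stored templates contain no face information for this user.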
若采集到的人脸信息与已存储的人脸信息模板匹配失败，电子设备100可以显示如图4C所示的用户界面42。用户界面42可以显示提示403，用于提示用户为新用户，即已存储的人脸信息模板中未存储该用户的人脸信息，无法调取身体评估信息，用户需要进行身体测量评估以获取用户的身体评估信息。提示403的提示时间可以为5s，提示结束后，电子设备可以显示图4E-图4I示例性所示的身体测量评估功能所提供的用户界面。If the collected face information fails to match the stored face information templates, the electronic device 100 may display the user interface 42 shown in FIG. 4C. The user interface 42 may display a prompt 403 indicating that the user is a new user, that is, the stored face information templates do not contain this user's face information, so the body assessment information cannot be retrieved, and the user needs to perform a body measurement assessment to obtain it. The prompt 403 may last 5s, after which the electronic device may display the user interfaces provided by the body measurement assessment function exemplarily shown in FIGS. 4E-4I.
若采集到的人脸信息与已存储的人脸信息模板匹配成功，电子设备100可以检测用户的上一次登录时间，若用户的上一次登录时间未超出预设时间段，电子设备100可以获取用户的身体评估信息。If the collected face information matches a stored face information template successfully, the electronic device 100 may check the user's last login time; if the last login time does not exceed the preset time period, the electronic device 100 may retrieve the user's body assessment information.
若采集到的人脸信息与已存储的人脸信息模板匹配成功，电子设备100可以检测用户的上一次登录时间，若用户的上一次登录时间超出预设时间段，电子设备可以显示用户界面43。用户界面43可以包括提示404、控件405、控件406。提示404用于提示用户为登录时间超出预设时间段的用户，建议用户重新进行身体测量评估。控件405用于监听触发重新测量评估的用户操作。电子设备可以检测到作用于控件405的用户操作（如在控件405上的点击操作），响应于该操作，电子设备可以显示如图4E-图4I示例性所示的身体测量评估功能所提供的用户界面。控件406可用于监听触发沿用原有数据的用户操作。原有数据指示用户在此之前存储于电子设备100中的身体评估信息。If the collected face information matches a stored face information template successfully, the electronic device 100 may check the user's last login time; if the last login time exceeds the preset time period, the electronic device may display the user interface 43. The user interface 43 may include a prompt 404, a control 405, and a control 406. The prompt 404 is used to indicate that the user's login time exceeds the preset time period and to suggest that the user perform the body measurement assessment again. The control 405 is used to listen for a user operation that triggers re-measurement. The electronic device may detect a user operation acting on the control 405 (such as a click operation on the control 405), and in response, may display the user interfaces provided by the body measurement assessment function exemplarily shown in FIGS. 4E-4I. The control 406 may be used to listen for a user operation that triggers reusing the original data, i.e. the body assessment information previously stored in the electronic device 100.
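The freshness check described in the two paragraphs above (reuse the stored body assessment data if the last login is within the preset period, otherwise suggest re-measurement) can be sketched as follows. The 30-day validity period in the example is an assumption for demonstration; the application only speaks of a "preset time period".

```java
import java.time.Duration;
import java.time.Instant;

public class AssessmentFreshness {
    public enum Action { REUSE_STORED_DATA, SUGGEST_REMEASURE }

    // After a successful face match: reuse the stored body-assessment data
    // only if the last login falls within the preset validity period.
    public static Action decide(Instant lastLogin, Instant now, Duration validity) {
        Duration elapsed = Duration.between(lastLogin, now);
        return elapsed.compareTo(validity) <= 0
                ? Action.REUSE_STORED_DATA
                : Action.SUGGEST_REMEASURE;
    }

    public static void main(String[] args) {
        Instant now = Instant.parse("2022-10-28T00:00:00Z");
        Duration validity = Duration.ofDays(30);  // assumed preset period
        System.out.println(decide(Instant.parse("2022-10-10T00:00:00Z"), now, validity));
        System.out.println(decide(Instant.parse("2022-08-01T00:00:00Z"), now, validity));
    }
}
```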
图4E-图4I示例性示出了身体测量评估功能所提供的用户界面。身体测量评估可以包括身体参数评估和身体状态评估两个方面。4E-4I exemplarily illustrate user interfaces provided by the anthropometric assessment function. Anthropometric assessment may include two aspects: physical parameter assessment and physical state assessment.
图4E-图4I示例性示出了身体参数评估的相关用户界面。4E-4I exemplarily illustrate relevant user interfaces for physical parameter assessment.
如图4E所示,电子设备100可以在用户界面44上显示控件407和控件408。As shown in FIG. 4E , the electronic device 100 may display a control 407 and a control 408 on the user interface 44 .
控件407可以监听触发设备检测的用户操作。电子设备100可以检测到作用于控件407的用户操作（如在控件407上的点击操作），响应于该操作，电子设备100可以开启摄像头，并显示用户界面45。如图4F所示，用户界面45可以包括通过摄像头检测到的用户图像409和提示框410。提示框410可以显示用户的身高信息。具体实现中，电子设备100可以开启摄像头，检测用户身高，并将用户身高信息显示于提示框410。电子设备100检测用户身高可以为：将电子设备100对准待测量用户，对准脚的位置点击创建测量点，移动电子设备向上至待测量用户的头部位置即可测量出身高信息。电子设备100检测身高信息完成后，可以显示用户界面46。如图4G所示，用户界面46可以包括：提示框411、控件412和控件413。提示框411可以用于显示用户是否连接体脂秤以获取用户的体重和体脂率信息。控件412可以监听触发连接体脂秤的用户操作。电子设备100可以检测到作用于控件412的用户操作（如在控件412上的点击操作），响应于该操作，电子设备100可以连接体脂秤，显示用户界面47。如图4H所示，用户界面47可以显示通过体脂秤获取的体重和体脂率信息。电子设备100获取用户的身高、体重、体脂率信息后，可以显示如图4J-图4N所示的身体状态评估的用户界面。The control 407 may listen for a user operation that triggers device-based detection. The electronic device 100 may detect a user operation acting on the control 407 (such as a click operation on the control 407), and in response, may turn on the camera and display the user interface 45. As shown in FIG. 4F, the user interface 45 may include a user image 409 captured by the camera and a prompt box 410. The prompt box 410 may display the user's height information. In a specific implementation, the electronic device 100 may turn on the camera, detect the user's height, and display the height information in the prompt box 410. The electronic device 100 may detect the user's height as follows: aim the electronic device 100 at the user to be measured, tap at the position of the feet to create a measurement point, and move the electronic device up to the position of the user's head to obtain the height information. After the height detection is completed, the electronic device 100 may display the user interface 46. As shown in FIG. 4G, the user interface 46 may include a prompt box 411, a control 412 and a control 413. The prompt box 411 may be used to ask whether the user connects a body fat scale to obtain the user's weight and body fat percentage information. The control 412 may listen for a user operation that triggers connecting the body fat scale. The electronic device 100 may detect a user operation acting on the control 412 (such as a click operation on the control 412), and in response, may connect to the body fat scale and display the user interface 47. As shown in FIG. 4H, the user interface 47 may display the weight and body fat percentage information obtained through the body fat scale. After the electronic device 100 obtains the user's height, weight and body fat percentage information, it may display the body state assessment user interfaces shown in FIGS. 4J-4N.
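The two-point height measurement described above (create an anchor point at the user's feet, then move the device up to the head) reduces to taking the distance between two points in space. The 3D-point representation below is an assumption for illustration; an actual implementation would obtain these anchor points from AR/depth measurement rather than hard-coded coordinates.

```java
public class HeightMeasure {
    // Distance between the foot anchor point and the head anchor point,
    // each given as {x, y, z} coordinates in metres.
    static double height(double[] foot, double[] head) {
        double dx = head[0] - foot[0];
        double dy = head[1] - foot[1];
        double dz = head[2] - foot[2];
        return Math.sqrt(dx * dx + dy * dy + dz * dz);
    }

    public static void main(String[] args) {
        double[] foot = {0.0, 0.0, 2.0};   // anchor created at the feet
        double[] head = {0.0, 1.75, 2.0};  // anchor reached at the head
        System.out.printf("height: %.2f m%n", height(foot, head));
    }
}
```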
控件408可以监听触发用户输入相关数据的用户操作。电子设备100可以检测到作用于控件408的用户操作(如在控件408上的点击操作),响应于该操作,电子设备100可以显示用户界面48。用户界面48可包括输入框416,该输入框可用于接收用户输入的身高、体重和体脂率的信息。电子设备100可以检测到作用于输入框416的用户操作(如在输入框416上的输入操作),响应于该操作,可以显示如图4J-图4N所示的身体状态评估的用户界面。 Control 408 may listen for user operations that trigger user input of relevant data. The electronic device 100 may detect a user operation on the control 408 (eg, a click operation on the control 408 ), and in response to the operation, the electronic device 100 may display the user interface 48 . The user interface 48 may include an input box 416, which may be used to receive user-entered information on height, weight, and body fat percentage. The electronic device 100 may detect a user operation acting on the input box 416 (such as an input operation on the input box 416), and in response to the operation, may display a user interface for body state assessment as shown in FIGS. 4J-4N.
图4J-图4N示例性示出了身体状态评估的用户界面。4J-4N exemplarily illustrate user interfaces for physical state assessment.
如图4J所示,用户界面49可以包括:提示框417、控件418、控件419。提示框417可用于提示即将进行身体状态评估。As shown in FIG. 4J , the user interface 49 may include: a prompt box 417 , a control 418 , and a control 419 . The prompt box 417 may be used to prompt that a physical state assessment is about to be performed.
控件418可以监听触发设备检测身体状态的用户操作。电子设备100可以检测到作用于控件418的用户操作（如在控件418上的点击操作），响应于该操作，电子设备100可以显示用户界面50。如图4K所示，用户界面50可包括提示框420，提示框420可用于提示用户进行身体状态评估时需要注意的信息，该信息可以是提示“即将开启摄像仪，请将手放置于损伤部位，将根据手放置时间反映严重程度。”提示框420的提示时间可以为5s，提示时间结束，电子设备可以开启摄像头，显示用户界面51。如图4L所示，用户界面51可以包括区域421、显示框422。区域421可以显示电子设备100通过摄像头采集到的用户图像，区域421可以显示用户的整体图像，也可以是显示用户的下肢图像。显示框422可用于显示用户将手放置于损伤部位的时间反映的损伤程度。显示框422可包括放置时间为<2s、2-4s、4-6s、6-8s、>8s时反映的损伤程度框，<2s、2-4s、4-6s、6-8s、>8s的损伤程度框随放置时间从短至长颜色逐渐变化，可分别显示为蓝、绿、黄、橙、红色。例如，电子设备100检测到用户的放置时间为5s时，对应4-6s的损伤程度框将变为黄色。电子设备100可以检测到损伤程度框颜色显示完成后，显示用户界面52。如图4M所示，用户界面52可包括提示框423。提示框423可用于显示用户损伤部位受力为标准受力的百分比。电子设备100获取用户的身体状态评估信息后，可以显示如图5A-图5C所示的检测骨骼节点的用户界面。控件419可以监听触发手动输入身体状态的用户操作。电子设备100可以检测到作用于控件419的用户操作（如在控件419上的点击操作），响应于该操作，电子设备100可以显示用户界面53。用户界面53可包括输入框424、425。输入框424可用于接收用户输入的损伤部位名称。输入框425可用于接收用户该损伤部位的损伤程度，损伤程度可以分为1-5档，数字越大，损伤程度越严重。1-5档分别对应相应的损伤程度框，可以分别显示为蓝、绿、黄、橙、红色。电子设备100可以检测到作用于输入框425上1-5档损伤程度框的用户操作（如在输入框425上1-5档中任一档的点击操作），对应损伤程度框显示对应颜色。电子设备100可以检测到作用于用户界面53的用户操作完成后，显示用户界面52。用户界面52可包括提示框423。提示框423可用于显示用户损伤部位受力为标准受力的百分比。电子设备100获取用户的身体状态评估信息后，可以显示如图5A-图5C所示的检测骨骼节点的用户界面。The control 418 may listen for a user operation that triggers the device to detect the body state. The electronic device 100 may detect a user operation acting on the control 418 (such as a click operation on the control 418), and in response, may display the user interface 50. As shown in FIG. 4K, the user interface 50 may include a prompt box 420, which may be used to show information the user should note before the body state assessment, for example the prompt "The camera is about to be turned on; please place your hand on the injured part, and the severity will be reflected by how long the hand is kept there." The prompt box 420 may last 5s, after which the electronic device may turn on the camera and display the user interface 51. As shown in FIG. 4L, the user interface 51 may include an area 421 and a display frame 422. The area 421 may display the user image collected by the camera of the electronic device 100, either the user's whole body or only the lower limbs. The display frame 422 may be used to display the damage degree reflected by how long the user keeps the hand on the injured part. The display frame 422 may include damage degree boxes for placement times of <2s, 2-4s, 4-6s, 6-8s and >8s; from the shortest to the longest placement time, these boxes are displayed in blue, green, yellow, orange and red respectively. For example, when the electronic device 100 detects that the placement time is 5s, the damage degree box corresponding to 4-6s turns yellow. After the electronic device 100 detects that the color display of the damage degree boxes is completed, it may display the user interface 52. As shown in FIG. 4M, the user interface 52 may include a prompt box 423, which may display the force on the user's injured part as a percentage of the standard force. After obtaining the user's body state assessment information, the electronic device 100 may display the user interfaces for detecting skeleton nodes shown in FIGS. 5A-5C. The control 419 may listen for a user operation that triggers manual input of the body state. The electronic device 100 may detect a user operation acting on the control 419 (such as a click operation on the control 419), and in response, may display the user interface 53. The user interface 53 may include input boxes 424 and 425. The input box 424 may be used to receive the name of the injured part input by the user. The input box 425 may be used to receive the damage degree of that part, which can be divided into grades 1-5; the larger the number, the more severe the damage. Grades 1-5 correspond to the damage degree boxes displayed in blue, green, yellow, orange and red respectively. The electronic device 100 may detect a user operation acting on the grade 1-5 damage degree boxes in the input box 425 (such as a click operation on any one of the grades), and the corresponding damage degree box displays the corresponding color. After the electronic device 100 detects that the user operations on the user interface 53 are completed, it may display the user interface 52, which may include the prompt box 423 displaying the force on the injured part as a percentage of the standard force. After obtaining the user's body state assessment information, the electronic device 100 may display the user interfaces for detecting skeleton nodes shown in FIGS. 5A-5C.
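The two color mappings described above (hand-placement time buckets <2s, 2-4s, 4-6s, 6-8s, >8s, and manual severity grades 1-5, both mapped to blue, green, yellow, orange, red) can be sketched directly. The exact handling of values on a bucket boundary (e.g. exactly 2s) is not specified in the text and is an assumption here.

```java
public class DamageColor {
    static final String[] COLORS = {"blue", "green", "yellow", "orange", "red"};

    // Bucket index from placement time in seconds:
    // <2s, 2-4s, 4-6s, 6-8s, >8s.
    static int bucketForSeconds(double s) {
        if (s < 2) return 0;
        if (s < 4) return 1;
        if (s < 6) return 2;
        if (s < 8) return 3;
        return 4;
    }

    static String colorForSeconds(double s) {
        return COLORS[bucketForSeconds(s)];
    }

    // Manual input path: severity grade 1-5 (larger = more severe).
    static String colorForGrade(int grade) {
        if (grade < 1 || grade > 5) {
            throw new IllegalArgumentException("grade must be 1-5");
        }
        return COLORS[grade - 1];
    }

    public static void main(String[] args) {
        System.out.println(colorForSeconds(5.0));  // falls in the 4-6s bucket
        System.out.println(colorForGrade(5));      // most severe grade
    }
}
```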
图5A-图5C示例性示出了检测骨骼节点的用户界面。5A-5C exemplarily illustrate a user interface for detecting skeletal nodes.
如图5A所示,电子设备开启摄像头,显示用户界面54,该用户界面可包括区域501、提示框502。区域501可用于显示摄像头实时采集的用户图像,电子设备100可以实时刷新其中的显示内容,以便电子设备100通过采集的用户图像检测该用户的骨骼节点位置。提示框502可显示检测骨骼节点的状态,可以是文字“骨骼节点检测中…”。电子设备100检测骨骼节点完成后,可以显示如图6所示的空间姿态校准的用户界面。As shown in FIG. 5A , the electronic device turns on a camera and displays a user interface 54 , which may include an area 501 and a prompt box 502 . The area 501 can be used to display the user image collected by the camera in real time, and the electronic device 100 can refresh the displayed content therein in real time, so that the electronic device 100 can detect the position of the user's bone nodes through the collected user image. The prompt box 502 can display the state of the detected bone node, which can be the text "Detecting the bone node...". After the detection of the skeletal nodes is completed, the electronic device 100 may display a user interface for space attitude calibration as shown in FIG. 6 .
在一些实施例中，电子设备100实时采集用户图像时，如图5B和图5C所示，如果电子设备100检测到用户未呈直立状态，存在姿势异常情形，如左腿弯曲等，则可以在提示框502中输出提示信息504，提示信息504可以是文字“姿势异常，请保持直立”，可用于提示用户调整姿势，保持直立状态。电子设备100检测到用户调整为直立状态后，可以显示用户界面54。In some embodiments, when the electronic device 100 collects user images in real time, as shown in FIG. 5B and FIG. 5C, if the electronic device 100 detects that the user is not upright and the posture is abnormal, for example the left leg is bent, it may output a prompt message 504 in the prompt box 502. The prompt message 504 may be the text "Abnormal posture, please keep upright", which prompts the user to adjust the posture and stay upright. After the electronic device 100 detects that the user has adjusted to an upright state, it may display the user interface 54.
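The application does not state how a bent leg is detected from the skeleton nodes. As a hedged illustration, one common approach computes the angle at the knee from three keypoints (hip, knee, ankle): an angle near 180° means the leg is straight. The 160° straightness threshold below is an assumption for demonstration, not taken from the application.

```java
public class PostureCheck {
    // Angle in degrees at point b, formed by the segments b->a and b->c.
    static double angleDeg(double[] a, double[] b, double[] c) {
        double[] u = {a[0] - b[0], a[1] - b[1]};
        double[] v = {c[0] - b[0], c[1] - b[1]};
        double dot = u[0] * v[0] + u[1] * v[1];
        double nu = Math.hypot(u[0], u[1]);
        double nv = Math.hypot(v[0], v[1]);
        return Math.toDegrees(Math.acos(dot / (nu * nv)));
    }

    // A leg is considered straight when the hip-knee-ankle angle is
    // close to 180 degrees (assumed threshold: 160 degrees).
    static boolean legStraight(double[] hip, double[] knee, double[] ankle) {
        return angleDeg(hip, knee, ankle) >= 160.0;
    }

    public static void main(String[] args) {
        // Straight leg: hip, knee and ankle roughly collinear.
        System.out.println(legStraight(
                new double[]{0, 2}, new double[]{0, 1}, new double[]{0, 0}));
        // Bent leg: ankle displaced sideways relative to the thigh line.
        System.out.println(legStraight(
                new double[]{0, 2}, new double[]{0, 1}, new double[]{0.8, 0.4}));
    }
}
```

When the check fails, the device would show the "Abnormal posture, please keep upright" prompt described above until the check passes again.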
FIG. 6 exemplarily shows a user interface for spatial posture calibration.
As shown in FIG. 6, the user interface 60 may include a prompt box 601 and a prompt box 602.
The prompt box 601 may be used to indicate that the user interface 60 is a spatial posture calibration interface, and may display the text "Spatial posture calibration in progress…".
The prompt box 602 may be used to output a countdown, which may be the digits 3, 2, 1 changing in sequence, to indicate to the user the time required for the spatial posture calibration.
After the electronic device 100 detects that the spatial posture calibration is completed, it may display the user interface for setting the off-ground reference value shown in FIG. 7.
FIG. 7 exemplarily shows a user interface for setting the off-ground/touch-ground reference value.
As shown in FIG. 7, the user interface 70 may include a prompt box 701, a control 702, and an input box 703.
The prompt box 701 may be used to indicate that the user interface 70 is an interface for setting the off-ground reference value, and may display the text "Set off-ground reference value".
The control 702 may listen for a user operation that triggers the device to set the user's off-ground reference value. The electronic device 100 may detect a user operation acting on the control 702 (for example, a click operation on the control 702), and in response to the operation, the electronic device 100 may set the user's off-ground reference value. After the electronic device 100 detects that the setting of the user's off-ground/touch-ground reference value is completed, it may display the force detection user interface shown in FIG. 8A or FIG. 8B.
The input box 703 may be used to receive an off-ground reference value input by the user. The electronic device 100 may detect a user operation acting on the input box 703 (for example, an input operation on the input box 703), and in response to the operation, may display the force detection user interface shown in FIG. 8A or FIG. 8B.
FIGS. 8A-8C exemplarily show user interfaces for force detection.
As shown in FIG. 8A, the user interface 80 may include an area 801, an area 802, a prompt box 803, and a prompt box 804.
The area 801 may be used to display the user's motion image captured in real time by the camera 193. The area 802 may be used to display demonstration action images of the exercise course. The prompt box 803 may be used to display the name of the current exercise action. The prompt box 804 may be used to display the user's exercise amount, which may be a combination of numbers and text, such as "5 kcal".
In some embodiments, as shown in FIG. 8B, the user interface 81 may include an area 801, an area 802, a prompt box 803, a prompt box 804, an icon 805, and a prompt box 806. For the area 801, the area 802, the prompt box 803, and the prompt box 804, reference may be made to the relevant description of the user interface 80, which is not repeated here. The icon 805 may be used to highlight the joint parts of the user that bear force, with colors indicating the magnitude of the force. For example, when the joint force is small, the icon 805 may be displayed in green; when the joint force is large, the icon 805 may be displayed in yellow; and when the joint force exceeds the joint force threshold and may create a risk of injury, the icon 805 is displayed in red. The prompt box 806 may be used to display the force value of the corresponding joint.
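By way of illustration only, the color coding of the icon 805 described above can be sketched as follows. The function name, the concrete threshold value, and the 0.7 factor separating "small" from "large" joint forces are assumptions made for this sketch; they are not specified by this application.

```python
# Hypothetical sketch of the icon 805 color coding; thresholds are assumptions.
def joint_force_color(force: float, threshold: float) -> str:
    """Map a joint force value to the display color of the icon 805."""
    if force > threshold:           # exceeds the joint force threshold: injury risk
        return "red"
    if force > 0.7 * threshold:     # large force, but still below the threshold
        return "yellow"
    return "green"                  # small joint force

print(joint_force_color(50.0, 100.0))   # green
print(joint_force_color(80.0, 100.0))   # yellow
print(joint_force_color(120.0, 100.0))  # red
```

In this sketch the same threshold would be compared per joint, so each highlighted joint part in the user interface 81 can carry its own color.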
When the electronic device 100 detects that the user has a certain risk of injury, it may display the user interfaces for outputting risk prompt information shown in FIGS. 9A-9D.
FIG. 8C exemplarily shows a user interface for performing a simulation test on an exercise course. The user interface 82 includes an area 807, which may be used to display images of demonstration actions in the exercise course.
FIGS. 9A-9E exemplarily show user interfaces for outputting risk prompt information.
As shown in FIG. 9A, the user interface 90 may include prompt boxes 901 and 902 and a control 903. The prompt box 901 may be used to display high-risk information about the exercise action, which may be the text "The current exercise action is detected to carry a high risk of injury". The prompt box 902 may be used to display the reason why the current action carries a high risk, which may be the text "Excessive force on the left leg". The control 903 may listen for a user operation that triggers a return to force detection. The electronic device 100 may detect a user operation acting on the control 903 (for example, a click operation on the control 903), and in response to the operation, may display the force detection user interface shown in FIG. 8A or FIG. 8B.
When the electronic device 100 does not detect a user operation acting on the control 903 within a preset time period, or when the display time of the user interface 90 exceeds the preset time period, it may display the user interface 91. As shown in FIG. 9B, the user interface 91 may include the control 903, a prompt box 904, and controls 905 and 906. For the control 903, reference may be made to the relevant description of the user interface 90, which is not repeated here. The prompt box 904 may display the text "Continue this exercise?".
The control 905 may listen for a user operation that triggers continuing the exercise and receiving exercise guidance. The electronic device 100 may detect a user operation acting on the control 905 (for example, a click operation on the control 905), and in response to the operation, may display the user interface 92. As shown in FIG. 9C, the user interface 92 may include the control 903 and a prompt box 907. The prompt box 907 may display an adjustment plan for the high-risk exercise action. When the electronic device 100 detects that the display time of the user interface 92 exceeds a preset time period, it may display the force detection user interface shown in FIG. 8A or FIG. 8B.
The control 906 may listen for a user operation that triggers switching the exercise course. The electronic device 100 may detect a user operation acting on the control 906 (for example, a click operation on the control 906), and in response to the operation, may display the user interface 93. The user interface 93 may include a control 908 and a control 909. The control 908 may listen for a user operation that triggers returning to the original exercise course. The electronic device 100 may detect a user operation acting on the control 908 (for example, a click operation on the control 908), and in response to the operation, may display the force detection user interface shown in FIG. 8A or FIG. 8B, or the user interface 92. The control 909 may listen for a user operation that triggers selecting a recommended exercise course. The electronic device 100 may detect a user operation acting on the control 909 (for example, a click operation on the control 909), and in response to the operation, may switch to the recommended exercise course selected by the user.
In some embodiments, after displaying the user interface 90, the electronic device 100 may display the user interface 92. When the electronic device 100 detects that the display time of the user interface 90 exceeds a preset time period, it may display the user interface 92, which may include the control 903 and the prompt box 907. The electronic device 100 may detect a user operation acting on the control 906 (for example, a click operation on the control 906); in response to that operation, or upon detecting that the display time of the user interface 92 exceeds a preset time period, it may display the force detection user interface shown in FIG. 8A or FIG. 8B.
FIG. 9E exemplarily shows a user interface for performing a simulation test on an exercise course. The user interface 94 may include a prompt box 910, which may include prompts 911, 912, and 913. The prompt 911 may be used to indicate that the simulation test of the exercise course is completed, the prompt 912 may be used to indicate the injury risk level of the exercise course, and the prompt 913 may be used to display the actions in the exercise course that carry a higher risk of injury and the adjustment plans for those actions.
In the following, a motion analysis method provided by this application is described in detail, taking motion analysis performed by an electronic device as an example.
FIG. 10 shows a detailed flow of a motion analysis method. As shown in FIG. 10, the method may include:
S101: The electronic device receives a user operation of a user on a first application, where the user operation is used to instruct the electronic device to acquire identity assessment information.
The first application may be a sports application in the electronic device, such as a sports application in a smartphone or a television, or a professional motion detection system; it may also be a camera application.
Exemplarily, the first application may be the "Sports Health" application in the electronic device 100 in FIG. 3A, or the "Camera" in the electronic device in FIG. 3A. "Sports Health" is a sports and fitness application on electronic devices such as smartphones and tablet computers; the embodiments of this application do not limit the name of the application. "Camera" is a photographing application on electronic devices such as smartphones and tablet computers; the embodiments of this application do not limit the name of the application.
The user operation on the first application may be a touch operation of the user, or a voice operation, a gesture operation, or the like of the user, which is not limited here.
The identity assessment information may include body parameter assessment information, and may further include body state assessment information. The body parameter assessment information may include the user's height, weight, and body fat percentage, and the body state assessment information may include the user's injured part and injury degree. An injury refers to the destruction of tissue structures of the human body, such as skin, muscle, tendon, bone, and viscera, caused by various external traumatic factors, and the resulting local or systemic reactions. The injured part is the part of the human body that is injured, for example, an ankle or a knee.
The user may trigger the acquisition of body assessment information on the electronic device by triggering, in the main interface of the first application, a control with a function of starting motion detection; or the user may trigger an exercise course control in the main interface of the first application to display the main interface of that exercise course, and trigger, in the main interface of the exercise course, a control with a function of starting motion detection, so as to acquire the identity assessment information.
In a possible implementation, the user interface may be the user interfaces shown in FIGS. 3B-3E. The electronic device may detect a user operation 200 acting on the icon 327 of "Sports Health" (for example, a click operation on the icon 327), and in response to the operation, may display the user interface 32 exemplarily shown in FIG. 3C. The electronic device 100 may detect a user operation acting on the control 354 in the user interface 32 (for example, a click operation on the control 354), and in response to the operation, may display the user interface 33 shown in FIG. 3D. The user interface 33 is the main introduction interface of the exercise course. The electronic device 100 may detect a user operation acting on the control 360 in the user interface 33 (for example, a click operation on the control 360), and in response to the operation, may display the user interface 34. The electronic device may detect a user operation acting on the control 362 in the user interface 34 (for example, a click operation on the control 362), which triggers the operation of acquiring the identity assessment information.
In a possible implementation, the user interface may be the user interfaces shown in FIGS. 3B-3E. The electronic device 100 may detect a user operation acting on the control 358 in the user interface 33 (for example, a click operation on the control 358), and in response to the operation, the electronic device 100 triggers the start of a simulated motion detection function so as to acquire the identity assessment information.
In a possible implementation, the user interface may be the user interfaces shown in FIG. 3B and FIG. 3C. The electronic device may detect a user operation 200 acting on the icon 327 of "Sports Health" (for example, a click operation on the icon 327), and in response to the operation, may display the user interface 32 exemplarily shown in FIG. 3C. The electronic device 100 may detect a user operation acting on the control 358 in the user interface 32 (for example, a click operation on the control 358), and in response to the operation, the electronic device 100 triggers the start of a simulated motion detection function so as to acquire the identity assessment information.
In another possible implementation, the user interface may be the user interfaces shown in FIG. 3G and FIG. 3H. As shown in FIG. 3G, the electronic device may detect a user operation 205 acting on the icon 341 of "Camera" (for example, a click operation on the icon 341), and in response to the operation, may display the user interface 35 exemplarily shown in FIG. 3H. The electronic device 100 may detect a user operation acting on the motion detection option in the user interface 35 (for example, a click operation on the shooting mode list 365), and in response to the operation, may trigger the motion detection function so as to acquire the identity assessment information.
S102: The electronic device acquires the identity assessment information of the user, where the identity assessment information includes body parameter assessment information.
The body parameter assessment information may include weight and height.
The identity assessment information may further include body state assessment information. The body state assessment information may include the user's injured part and injury degree.
The identity assessment information may further include an exercise capacity index, which refers to the exercise intensity that the user can bear.
In some embodiments, the electronic device may detect the identity assessment information through a detection device such as a camera, or receive it as user input. For example, the electronic device may turn on the camera, detect a user image, and obtain the user's height, weight, injured part, and injury degree from the detected user image; the electronic device may detect the user's weight by jumping to an installed weight measurement application; and by processing the acquired height data and weight data, the user's BMI can be obtained. For another example, the electronic device may acquire information such as the height, weight or BMI value, injured part, and injury degree input by the user. From the acquired height and weight, the electronic device may calculate the BMI value as weight ÷ height².
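The BMI calculation above (weight ÷ height²) can be sketched as follows. This is a minimal illustrative sketch; the function name and the units (kilograms and meters) are assumptions for illustration.

```python
def bmi(weight_kg: float, height_m: float) -> float:
    """Compute BMI = weight / height**2 (weight in kg, height in meters)."""
    return weight_kg / height_m ** 2

# For a 70 kg user who is 1.75 m tall:
print(round(bmi(70.0, 1.75), 1))  # 22.9
```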
In one implementation, the electronic device may acquire the user's identity assessment information through a detection device such as a camera. For example, the electronic device may turn on the camera, detect a user image, and obtain the user's height from the detected user image; the electronic device may connect to a body fat scale via Bluetooth to obtain the user's weight and BMI value; and the electronic device may turn on the camera to detect that the user places a body part on the injured part, and the duration of the placement, so as to acquire the user's body state assessment information. For example, the electronic device may acquire the user's body state assessment information by turning on the camera to detect that the user places a hand on the injured part, and the duration of the placement. Details may be as shown in FIGS. 4E-4H and FIGS. 4J-4M, which are not repeated here.
In another implementation, the electronic device receives the identity assessment information input by the user. For example, the electronic device may receive the height, weight, BMI value, injured part, and injury degree input by the user. As shown in FIG. 4E and FIG. 4I, the electronic device 100 may detect a user operation acting on the control 408 (for example, a click operation on the control 408), and in response to the operation, may display the user interface 48. The user interface 48 may include an input box 416, which may be used to receive the height, weight, and body fat percentage input by the user. The electronic device 100 may detect a user operation acting on the input box 416 (for example, an input operation on the input box 416), and in response to the operation, may display the user interface for body state assessment shown in FIG. 4J. The electronic device 100 may detect a user operation acting on the control 419 in the user interface 49 (for example, a click operation on the control 419), and in response to the operation, may display the user interface 53. The user interface 53 may include input boxes 424 and 425. The input box 424 may be used to receive the name of the injured part input by the user. The input box 425 may be used to receive the injury degree of the user's injured part; the injury degree may be divided into grades 1-5, where a larger number indicates a more serious injury. Grades 1-5 correspond to damage-degree boxes, which may be displayed in blue, green, yellow, orange, and red, respectively. The electronic device 100 may detect a user operation acting on the grade 1-5 damage-degree boxes in the input box 425 (for example, a click operation on any one of grades 1-5 in the input box 425), and the corresponding damage-degree box displays the corresponding color. After detecting that the user operation on the user interface 53 is completed, the electronic device 100 may display the user interface 52. The user interface 52 may include a prompt box 423, which may be used to display the force on the user's injured part as a percentage of the standard force.
In some other embodiments, the electronic device may acquire an identity feature of the user and acquire the identity assessment information based on the identity feature. The identity feature may be face information, fingerprint information, or the like. For example, after receiving a user operation, the electronic device may turn on the camera, acquire a face image, and obtain face information after processing; it then matches the processed face information against a face information template stored in the electronic device and retrieves the body assessment information. For another example:
In one implementation, the electronic device acquires the identity feature of the user; when the identity features already stored in the electronic device do not include this identity feature, the electronic device may detect, or receive as user input, the identity assessment information through a detection device such as a camera. For example, the electronic device may acquire a face image through the camera and obtain face information after processing. The processed face information is matched against the stored face information templates; if the electronic device detects that the matching fails, it may detect or receive the identity assessment information input by the user through a detection device such as a camera.
In another implementation, the electronic device acquires the identity feature of the user, and the identity features already stored in the electronic device include this identity feature; when the electronic device detects that the user's last login time exceeds a preset time period, it may detect, or receive as user input, the identity assessment information through a detection device such as a camera. For example, the electronic device may acquire a face image through the camera and obtain face information after processing. The processed face information is matched against the stored face information templates; if the electronic device detects a matching face information template but also detects that the user's last login time exceeds the preset time period, it may detect or receive the identity assessment information input by the user through a detection device such as a camera.
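The retrieval logic of the two implementations above (match failure → new measurement; match success but stale login → new measurement; otherwise reuse stored data) can be sketched as follows. The function and variable names, the data layout, and the numeric time values are assumptions used only for illustration and are not part of this application.

```python
# Hypothetical sketch: returning None signals that a new body measurement
# assessment is needed (new user, or last login beyond the preset period).
def retrieve_assessment_info(face_id, stored_info, last_login, now, preset_period):
    """Return stored assessment info, or None when a new measurement is needed."""
    if face_id not in stored_info:
        return None                 # matching failed: new user, measure again
    if now - last_login[face_id] > preset_period:
        return None                 # last login too long ago: measure again
    return stored_info[face_id]     # reuse the stored assessment information

stored = {"user_a": {"height_m": 1.75, "weight_kg": 70.0}}
logins = {"user_a": 100.0}
print(retrieve_assessment_info("user_a", stored, logins, now=200.0, preset_period=500.0))
# {'height_m': 1.75, 'weight_kg': 70.0}
print(retrieve_assessment_info("user_b", stored, logins, now=200.0, preset_period=500.0))
# None
```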
Exemplarily, the user interface may be the user interfaces shown in FIGS. 4A-4D. After collecting the face information through the camera, the electronic device 100 may perform some necessary processing and match the processed face information against the stored face information templates, so as to retrieve the user's body assessment information based on the matched face information template. The face information template may be input by the user before the electronic device 100 performs face recognition. The embodiments of this application do not limit the device or the specific algorithm used for face recognition, as long as face recognition can be realized.
If the collected face information fails to match the stored face information templates, the electronic device 100 may display the user interface 42 shown in FIG. 4C. The user interface 42 may display a prompt 403, which is used to indicate that the user is a new user, that is, the user's face information is not stored among the stored face information templates and the identity assessment information cannot be retrieved, so the user needs to perform a body measurement assessment to acquire the body assessment information. The prompt time of the prompt 403 may be 5 s; after the prompt ends, the electronic device may display the user interfaces provided by the body measurement assessment function exemplarily shown in FIGS. 4E-4I.
If the collected face information matches a stored face information template successfully, the electronic device 100 may check the user's last login time; if the user's last login time does not exceed the preset time period, the electronic device 100 may retrieve the user's identity assessment information.
If the collected face information matches a stored face information template successfully, the electronic device 100 may check the user's last login time; if the user's last login time exceeds the preset time period, the electronic device may display the user interface 43. The user interface 43 may include a prompt 404, a control 405, and a control 406. The prompt 404 is used to indicate that the user's last login time exceeds the preset time period, and to suggest that the user perform the body measurement assessment again. The control 405 is used to listen for a user operation that triggers re-measurement. The electronic device may detect a user operation acting on the control 405 (for example, a click operation on the control 405), and in response to the operation, may display the user interfaces provided by the body measurement assessment function exemplarily shown in FIGS. 4E-4I. The control 406 may be used to listen for a user operation that triggers reusing the original data. The original data refers to the user's body assessment information previously stored in the electronic device 100.
In a specific implementation, the electronic device may distribute the body mass over the user's body parts from the acquired body parameter assessment information, and may also calculate the exercise capacity index from the body parameter assessment information and the user's exercise amount within a preset time period; from the BMI and the exercise capacity index, the user's bearing reference value can be calculated accordingly. The electronic device may set the limit of force that a standard body can bear at a human joint as the standard force. Users of different body types have different limits on the bearing capacity of their joints. In an implementation, the bearing reference value may be equal to the standard force multiplied by a parameter. A specific implementation may include: the electronic device may divide the acquired BMI value into multiple BMI value intervals, and set different percentages of the standard force, corresponding to the multiple BMI value intervals, as the BMI force of the different BMI value intervals; this percentage is the initial baseline ratio. The electronic device may set a standard exercise amount according to the user's BMI, and may thereafter dynamically adjust the standard exercise amount according to the percentage of the maximum amount in the user's daily exercise amount. When the user's daily exercise amount increases from 0 to the standard exercise amount, the initial baseline ratio is dynamically reduced by 10%. For example, the BMI value may be divided into n BMI value intervals, and the BMI forces of the n BMI value intervals may be: 100% × standard force, 90% × standard force, 80% × standard force, …, (100 − 10 × (n − 1))% × standard force.
In some embodiments, the electronic device may divide the obtained BMI values into five intervals: &lt;24, 24-27, 27-30, 30-35, and &gt;35. The corresponding BMI loads may be 100% × standard load, 90% × standard load, 80% × standard load, 70% × standard load, and 60% × standard load; this BMI load is the load-bearing reference value.
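As an illustration, the five-interval embodiment above can be sketched as a simple lookup. This is a minimal sketch, assuming the standard load is already known as a single number; boundary values (e.g. a BMI of exactly 24) are assigned to the higher interval here, which the text does not specify.

```python
def bmi_load_ratio(bmi: float) -> float:
    """Fraction of the standard load assigned to the user's BMI interval."""
    if bmi < 24:
        return 1.00
    elif bmi < 27:
        return 0.90
    elif bmi < 30:
        return 0.80
    elif bmi < 35:
        return 0.70
    else:
        return 0.60

def bearing_reference(bmi: float, standard_load: float) -> float:
    """Load-bearing reference value = standard load x interval percentage."""
    return standard_load * bmi_load_ratio(bmi)
```

For example, with a hypothetical standard load of 1000 (units as defined elsewhere in the system), `bearing_reference(28.5, 1000.0)` yields 800.0.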
Further, in a specific implementation, the electronic device may also obtain the user's body state evaluation information to learn whether the user's body has an injured part and the degree of that injury, and accordingly compute a reduced load evaluation for the injured part; combined with the user's BMI load and standard amount of exercise, the load-bearing reference value of the injured part can then be computed. The electronic device may classify the injury of the injured part into multiple degrees, evaluate the relationship between the injured part's load and the standard load according to the degree of injury, and then compute the BMI load from the multiple BMI intervals and the injury-weighted standard load.
In some embodiments, the time the user's hand rests on the injured part may be divided into &lt;2 s, 2-4 s, 4-6 s, 6-8 s, and &gt;8 s, with a one-to-one correspondence between the placement time and the load evaluation of the injured part: from the shortest to the longest placement time, the load evaluation is 90%, 80%, 70%, 60%, and 50% of the standard load respectively. For example, if the electronic device detects that the user's hand is placed on the knee joint for 4-6 s, the load evaluation of the injured part corresponds to 70% of the standard load. The BMI load is then computed from the multiple BMI intervals and the load evaluation of the injured part.
It can be understood that step S102 may also be performed after step S103 or step S104.
S103: The electronic device detects the positions of the skeletal nodes of the target object to obtain the spatial positional relationship of the skeletal nodes.
The target object may be the user, or a motion image in a selected exercise course, such as an image of a standard demonstration action in the course.
For the user, or for a video such as an image of a standard demonstration action in an exercise course, the electronic device may analyze the skeletal points through skeletal point recognition technology, such as the ankle joint points, knee joint points, and hip joint points.
The electronic device may detect the user's skeletal nodes through a camera, sensors, and the like; the positions of these skeletal nodes indicate the user's joints, the connection relationships between the joints, and the spatial positional relationship of the skeletal nodes.
Further, the electronic device may use a depth camera module, or the body parameter evaluation information described above, to analyze the user's body shape, such as body proportions and build.
For example, the electronic device may detect the user's skeletal nodes through sensors such as a camera, an infrared sensor, an optical marker device, or a 3D scanner. The electronic device may also construct a skeleton model through deep learning networks such as the skinned multi-person linear (SMPL) model or the VIBE (video inference for body pose and shape estimation) network. For example, when constructing a human skeleton model with the SMPL model, the height, weight, and other items in the collected user body parameter evaluation information may be used, in combination with the national standard GB/T 17245-2004.
Specifically, the electronic device may detect the user's skeletal nodes from the acquired user image using a human skeletal point localization algorithm; here, detecting a skeletal node means determining the coordinates of the skeletal point. Further, the user's body shape may be determined by combining the coordinates of the skeletal nodes with the body parameter evaluation information described above. The input of the human skeletal point localization algorithm may be an image of the user, and the output may be the coordinates of the skeletal nodes. The electronic device can detect the basic human skeletal nodes shown in FIG. 11, such as the left hip joint point, right hip joint point, left knee joint point, and right knee joint point. It can be understood that the electronic device is not limited to the skeletal nodes shown in FIG. 11 and may detect more or fewer skeletal nodes.
In some embodiments, the electronic device may turn on the camera, acquire a user image, and identify skeletal nodes by analyzing the user image; the acquired skeletal node graph may be as shown in FIG. 11.
In one implementation, the electronic device turns on the camera and acquires an image. If the electronic device fails to detect any skeletal node in the image, this indicates that the electronic device has failed to detect the user, and it may output a prompt that no user was detected, for example the text "No user detected".
In another implementation, as shown in FIG. 5A, the electronic device turns on the camera, acquires an image, and identifies the user's skeletal nodes based on the image.
In a specific implementation, the electronic device may perform skeletal node detection on the user. If, during detection, the electronic device detects that the user's posture is abnormal, it may output a corresponding abnormality prompt. An abnormal posture means that the user is not upright during skeletal node detection before entering the motion state. Upright refers to a natural standing state in which the upper limbs hang down naturally, the toes point forward, and the eyes look ahead; an abnormal lower-limb posture may be bent legs, feet off the ground, and so on. If the user is not upright, the skeletal node graph detected by the electronic device may be inaccurate, causing errors in the subsequent load detection. As shown in FIG. 5B and FIG. 5C, when the electronic device 100 detects that the user's lower-limb posture is abnormal, it outputs an abnormal-posture prompt 504, prompting the user to adjust the posture, stay upright, and continue skeletal node detection. It can be understood that the abnormal-posture prompt may be text displayed on the screen of the electronic device, a voice reminder, or the like.
S104: The electronic device performs spatial posture calibration and constructs a reference coordinate system.
Spatial posture calibration means setting, while the target object is upright, a reference coordinate system for the target object's motion according to the skeletal nodes described above, so as to calibrate the target object's motion state. For example, taking detection of the user as an example, a reference frame may be constructed from the ground and the user's upright direction; relative to this coordinate system, the user's jumping action then presents a motion state in which both feet leave the ground and the body jumps upward. Without spatial posture calibration, the user's motion state cannot be specifically determined.
In a specific embodiment, the reference coordinate system may take the user's waist, a foot, the midpoint of the line connecting the two feet, or the like as the origin, and take the frontal direction, the head-to-neck direction, the direction of the line connecting the two feet, the vertical direction, or the like as coordinate axes, thereby establishing a spatial coordinate system. In subsequent exercise, the user moves relative to this reference coordinate system.
To facilitate understanding of spatial posture calibration, as an example shown in FIG. 12, the reference coordinate system uvw may be established with the midpoint of the line connecting the user's feet as the origin, the direction of the line connecting the two feet as the u axis, the vertical head-to-neck direction as the v axis, and the direction perpendicular to the uv plane as the w axis.
It can be understood that, after the reference coordinate system is established, the coordinates of the skeletal nodes detected above can be obtained in the reference coordinate system.
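The construction of the uvw frame can be sketched from detected keypoints. This is a minimal sketch, assuming keypoints arrive as (x, y, z) tuples in the camera's coordinate system and that the skeletal-point algorithm exposes foot, neck, and head positions; the parameter names are placeholders, not identifiers from the patent.

```python
import math

def _sub(p, q):
    return (p[0] - q[0], p[1] - q[1], p[2] - q[2])

def _cross(p, q):
    return (p[1]*q[2] - p[2]*q[1], p[2]*q[0] - p[0]*q[2], p[0]*q[1] - p[1]*q[0])

def _norm(p):
    n = math.sqrt(p[0]**2 + p[1]**2 + p[2]**2)
    return (p[0]/n, p[1]/n, p[2]/n)

def reference_frame(left_foot, right_foot, neck, head):
    """Origin at the midpoint of the feet; u along the foot line,
    v along the neck-to-head direction (re-orthogonalised), w = u x v."""
    origin = tuple((a + b) / 2 for a, b in zip(left_foot, right_foot))
    u = _norm(_sub(right_foot, left_foot))
    v_raw = _norm(_sub(head, neck))
    w = _norm(_cross(u, v_raw))   # perpendicular to the uv plane
    v = _cross(w, u)              # make the triad exactly orthogonal
    return origin, u, v, w
```

Once the frame is known, every later skeletal-node coordinate can be expressed relative to `origin` and the `u`, `v`, `w` axes, which is exactly what the subsequent force computations assume.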
Optionally, when the electronic device performs spatial posture calibration, it may display a calibration countdown in the user interface. For example, the electronic device may display a user interface 60 as shown in FIG. 6, which displays the spatial posture calibration countdown 3, 2, 1. It can be understood that the calibration countdown may be text displayed on the screen of the electronic device, a voice reminder, or the like.
S105: The electronic device obtains a ground-lift reference value setting; the ground-lift reference value is used to determine the user's ground-lift state.
The ground-lift reference value is the minimum distance of the left and right ankle joints, relative to the reference coordinate system described above, at which the user's feet are detected as off the ground, compared with the state in which both feet touch the ground.
In some embodiments, the electronic device may obtain the ground-lift reference value by setting it itself or by receiving user input, so as to determine the user's ground-lift/grounded state.
In one implementation, the electronic device may set the user's ground-lift reference value by itself. As shown in FIG. 7, the electronic device 100 may detect a user operation acting on control 702 (such as a tap on control 702), and in response to that operation, the electronic device 100 may set the user's ground-lift/grounded reference value by itself.
In another implementation, the electronic device may detect a ground-lift reference value input by the user. As shown in FIG. 7, the electronic device 100 may detect a user operation acting on input box 703 (such as an input operation in input box 703), and in response to that operation, the electronic device 100 may receive the ground-lift reference value set by the user.
In some other embodiments, the electronic device may also set the user's ground-lift reference value according to the user's body parameter evaluation information, dynamically adjusting the reference value according to the BMI value obtained from that information. For example, the user's BMI values may be divided into multiple BMI intervals; when the BMI value lies in the normal interval, the user's ground-lift reference value may be set to x cm, and intervals with larger BMI values are given successively lower ground-lift reference values.
For example, with the reference coordinate system uvw described above, the electronic device may receive a user ground-lift reference value of 15 cm. This means that only when the distance between the user's ankle joint and the uv plane of the reference coordinate system is greater than or equal to 15 cm does the electronic device detect that the user's feet are off the ground, in which case the user may be doing a jumping exercise.
S106: Based on the skeletal nodes, the electronic device obtains the force condition of the target object.
The target object may be the user, or a motion image in a video, such as an image of a standard demonstration action in an exercise course.
For the user, the electronic device may obtain the user's force values and analyze the forces on the user during exercise. The electronic device may also analyze the force condition in a motion image in a video, such as an image of a standard demonstration action in an exercise course, in combination with the user's body parameter evaluation information and/or body state evaluation information.
The specific steps for obtaining the force condition of the target object are shown in FIG. 13.
S201: Obtain first data of the target object according to the target object's body parameters; the first data includes the mass, center of mass, and moment of inertia of the body segments.
The inertial parameters of the human body include the mass, center-of-mass position, and moment of inertia of each body segment; they are basic parameters for research on human motion and on sports injury and its prevention. Body segments include the thigh, calf, foot, upper arm, forearm, and so on; the first data is, for example, the mass, center of mass, and moment of inertia of the thigh, calf, and foot.
It can be understood that the mass, center-of-mass position, and moment of inertia of the thigh, calf, and foot can be obtained from the target object's body parameter information, such as height and weight; these data can be derived from the body parameter information together with the national standard for inertial parameters of the adult human body (GB/T 17245-2004). For example, the mass or center-of-mass position of a man's thigh can be calculated through the regression equation Y = B0 + B1·X1 + B2·X2, where B0 is the constant term of the regression equation, B1 is the regression coefficient of body weight (X1), and B2 is the regression coefficient of height (X2).
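The regression form above can be sketched directly. The coefficient values in the usage example are placeholders only, not values from GB/T 17245-2004; real use would substitute the standard's per-segment, per-sex coefficients.

```python
def segment_parameter(b0: float, b1: float, b2: float,
                      weight_kg: float, height_cm: float) -> float:
    """Regression estimate Y = B0 + B1*X1 + B2*X2 of a segment parameter
    (e.g. thigh mass or centre-of-mass position), with X1 = body weight
    and X2 = height."""
    return b0 + b1 * weight_kg + b2 * height_cm
```

For instance, with hypothetical coefficients b0 = 1.0, b1 = 0.1, b2 = 0.01, a 70 kg, 175 cm subject gives a segment estimate of about 9.75.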
After the center of mass of each body segment is obtained from the human inertial parameters, the coordinates of each segment's center of mass can be obtained in the reference coordinate system described above.
It can be understood that the first data obtained in S201 may also be obtained in step S102.
S202: Obtain second data of the target object and the ground-lift state of the feet; the second data includes the movement velocity, angular velocity, and position information of the human joints.
The movement velocity and angular velocity of the segment centers of mass may refer to the velocity and angular velocity of body segments such as the thigh and calf; the position information of the human joints can be obtained by detecting the coordinate values of the joints in the reference coordinate system described above.
The ground-lift state of the feet may include a first state, a second state, and a third state. The first state indicates that both feet are in the air, the second state indicates that one foot touches the ground, and the third state indicates that both feet touch the ground. Specifically, the ground-lift state may be judged according to whether the distance of the target object's feet from the reference plane (such as the uv plane of the reference coordinate system described above) exceeds the ground-lift reference value; see the description in step S105.
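The three-state decision can be sketched as follows. This assumes each ankle's height above the uv plane (its w coordinate) and the ground-lift reference value are in the same units; the state constants are illustrative names, not identifiers from the patent.

```python
FIRST_STATE, SECOND_STATE, THIRD_STATE = 1, 2, 3  # airborne / one foot down / both feet down

def foot_state(left_ankle_h: float, right_ankle_h: float, ref: float) -> int:
    """Classify the ground-lift state from the two ankle heights."""
    left_off = left_ankle_h >= ref
    right_off = right_ankle_h >= ref
    if left_off and right_off:
        return FIRST_STATE    # both feet in the air
    if left_off or right_off:
        return SECOND_STATE   # exactly one foot touching the ground
    return THIRD_STATE        # both feet touching the ground
```

With the 15 cm example from step S105, `foot_state(20.0, 18.0, 15.0)` classifies the user as airborne.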
S203: Calculate a first value and a second value of the ankle joint based on the ground-lift state.
By judging the ground-lift state of the feet, the first value and the second value of the ankle joint can be calculated. The first value is the joint force of the ankle joint, and the second value is the ankle joint moment. It can be understood that the human body has left and right feet, i.e., left and right ankle joints; the first value may include the joint forces of the left and right ankle joints, and the second value may include the moments of the left and right ankle joints. In the force calculations for the left/right knee joint, hip joint, and so on, the joint force or moment of the corresponding left/right ankle joint is used.
Understandably, the first value and the second value of the ankle joint can be calculated according to the three states of the feet.
When the ground-lift state is the first state, both feet are in the air, and the first value and the second value of both the left and right ankle joints are 0.
When the ground-lift state is the second state, one foot touches the ground; the force on one of the left and right ankle joints is 0, and only the force on the other ankle joint needs to be calculated. The joint force of the other ankle joint can be calculated by summing the products of segment mass and segment velocity change; its moment can be calculated by summing, over the segments, the product of the segment's moment of inertia and its angular-velocity change, minus the cross product of the vector from the segment's center of mass to the reference point and the segment's weight. The reference point may be the origin of the reference coordinate system described above, and the vector from a segment's center of mass to the reference point can be computed from the segment's center-of-mass coordinates and the reference point's coordinates in that coordinate system.
Specifically, the force on the other ankle joint can be calculated by the following formulas, in which F1, M1 (or F2, M2) are 0 and the values of F2, M2 (or F1, M1) are computed:
F1 + F2 = Σ m_i·Δv_ci + G
M1 + M2 = Σ (J_i·Δω_i - r_i × m_i·g)
where F1 and F2 are the first values of the ankle joints, M1 and M2 are the second values of the ankle joints, m_i is the mass of body segment i, v_ci is the motion velocity of the segment's center of mass, G is the user's weight calculated from the body parameters, J_i is the segment's moment of inertia, ω_i is the angular velocity of the segment's center of mass, r_i is the vector from the segment's center of mass to the reference point, and g is the gravitational acceleration. These symbols are not repeated below.
Taking the left foot touching the ground as an example, the force/moment on the right foot is 0. Let the force and moment on the left foot be F_left and M_left, and the force and moment on the right foot be F_right and M_right. According to the momentum theorem and the theorem of moment of momentum:
F_left + F_right = Σ m_i·Δv_ci + G
M_left + M_right = Σ (J_i·Δω_i - r_i × m_i·g)
Since F_right and M_right are 0, F_left and M_left are:
F_left = Σ m_i·Δv_ci + G
M_left = Σ (J_i·Δω_i - r_i × m_i·g)
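The single-support computation above can be sketched as below. Assumptions beyond the text: segments are passed as dicts of mass `m`, center-of-mass velocity change `dv`, a scalar moment of inertia `J` (the patent's J_i may well be a tensor; a scalar is used here for brevity), angular-velocity change `dw`, and vector `r` from the segment's center of mass to the reference point; `G` is the user's weight as a force vector in the uvw frame, with gravity along -w.

```python
def _cross(p, q):
    return (p[1]*q[2] - p[2]*q[1], p[2]*q[0] - p[0]*q[2], p[0]*q[1] - p[1]*q[0])

def _add(p, q):
    return tuple(a + b for a, b in zip(p, q))

def _scale(p, s):
    return tuple(a * s for a in p)

G_ACC = (0.0, 0.0, -9.8)  # gravitational acceleration along -w (assumed m/s^2)

def grounded_ankle_load(segments, G):
    """F = sum(m_i * dv_ci) + G;  M = sum(J_i * dw_i - r_i x (m_i * g)).
    Returns (F, M) for the grounded ankle; the airborne ankle's values are 0."""
    F, M = G, (0.0, 0.0, 0.0)
    for s in segments:
        F = _add(F, _scale(s["dv"], s["m"]))             # m_i * dv_ci
        gravity = _scale(G_ACC, s["m"])                   # m_i * g
        M = _add(M, _scale(s["dw"], s["J"]))              # J_i * dw_i
        M = _add(M, _scale(_cross(s["r"], gravity), -1.0))  # - r_i x (m_i * g)
    return F, M
```

A single-segment call with m = 10, dv = (0, 0, 1), J = 2, dw = (0, 0, 0.5), r = (1, 0, 0) and G = (0, 0, -686) illustrates each term of the two formulas.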
When the ground-lift state is the third state, both feet touch the ground, and the first value and the second value of the ankle joints can be calculated according to the first data and the second data. The first data is the mass, center of mass, and moment of inertia of the body segments; the second data is the position information of the human joints; the second coordinate is the projected coordinate of the target object's center of gravity; the third coordinate and the fourth coordinate are the ankle joint coordinates; and the second, third, and fourth coordinates are obtained from the second data. Specifically, the sum of the joint forces of the left and right ankle joints can be calculated by summing the products of segment mass and segment velocity change, and the sum of the moments of the left and right ankle joints can be calculated by summing, over the segments, the product of the segment's moment of inertia and its angular-velocity change, minus the cross product of the vector from the segment's center of mass to the reference point and the segment's weight. The reference point may be the origin of the reference coordinate system described above, and the vector from a segment's center of mass to the reference point can be computed from the segment's center-of-mass coordinates and the reference point's coordinates in that coordinate system. The joint force and moment of each of the left and right ankle joints are then computed separately from the projected coordinates of the center of gravity, the coordinates of the left and right ankle joints, the sum of the joint forces of the left and right ankle joints, and the sum of their moments. The projected coordinate of the center of gravity can be determined by the vertical mapping of the center of gravity onto the reference plane of the reference coordinate system (such as the uv plane described above). The third and fourth coordinates are the coordinates of the left and right ankle joints in the reference coordinate system and can be obtained from that coordinate system, as shown in FIG. 14. Let P_proj be the projected coordinate of the center of gravity, P1 the left ankle joint coordinate, and P2 the right ankle joint coordinate.
In this state, the first value and the second value can be calculated by the following formulas:
F1 + F2 = Σ m_i·Δv_ci + G
M1 + M2 = Σ (J_i·Δω_i - r_i × m_i·g)
[The four formulas apportioning F1, F2, M1, and M2 between the left and right ankle joints according to P_proj, P1, and P2 are rendered as images (PCTCN2022127953-appb-000017 to appb-000020) in the original publication.]
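The patent's four apportioning formulas appear only as images in the original publication, so the exact rule is not recoverable here. As a labeled assumption, one rule consistent with the surrounding description (the nearer the center-of-gravity projection P_proj lies to an ankle, the larger that ankle's share of the summed force or moment) is an inverse-distance weighting; this sketch is NOT the patent's formula.

```python
import math

def _dist(p, q):
    return math.sqrt(sum((a - b) ** 2 for a, b in zip(p, q)))

def apportion(total, p_proj, p1, p2):
    """Hypothetical split of a summed force/moment component between the
    left ankle (at P1) and the right ankle (at P2)."""
    d1, d2 = _dist(p_proj, p1), _dist(p_proj, p2)
    if d1 + d2 == 0.0:          # degenerate case: split evenly
        return total / 2, total / 2
    w1 = d2 / (d1 + d2)         # left share grows as P_proj approaches P1
    return total * w1, total * (1.0 - w1)
```

With P_proj exactly over the left ankle the left ankle takes the whole total; with P_proj midway between the ankles the split is even.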
S204: Based on the motion posture of the target object, construct a first coordinate system; the first coordinate system is used to construct a homogeneous transformation matrix and to obtain the first coordinates of the human joints in the first coordinate system.
The first coordinate system may include a base sub-coordinate system, a first sub-coordinate system, and a second sub-coordinate system. The homogeneous transformation matrix is constructed based on the relationships among the base, first, and second sub-coordinate systems; these relationships include the distances and angles between their coordinate axes. The first coordinates are the coordinates of the human joints in the base sub-coordinate system. It can be understood that a lower-limb coordinate system can be constructed from the skeletal node positions of the target object, with coordinate systems established at the hip joint, the knee joint, and the ankle joint. The base sub-coordinate system is a coordinate system established on one skeletal node, and the first and second sub-coordinate systems are established on the other two skeletal nodes.
The base coordinate system may be established at the center of the ball of the hip joint; the hip joint has three rotational degrees of freedom, the knee joint one, and the ankle joint two. The electronic device may also establish an artificial coordinate system on the foot to express the orientation of the foot.
For example, the lower-limb coordinate system shown in FIG. 15 can be established as follows. Suppose the base coordinate system X0Y0Z0 is established at the hip joint and the first rotation is about the Z0 axis, with X0 pointing toward the front of the foot, Z0 pointing toward the side of the body, and the direction of Y0 determined by the right-hand rule. Suppose the second hip rotation is a sideways leg lift about Z1; the direction of X1 is determined by the right-hand rule, and Z0 is transformed into Z1. Suppose the third hip rotation is a leg rotation about Z2; the direction of X2 is determined by the right-hand rule, and Z1 is transformed into Z2. These three rotations thus complete the rotation of the hip joint. Suppose the fourth rotation is about the knee joint with rotation axis Z3; the direction of X3 is determined by the right-hand rule, and Z2 is transformed into Z3. Continuing in this way establishes the lower-limb coordinate system shown in FIG. 15. Let a be the z-axis distance between adjacent sub-coordinate systems, d the x-axis distance between adjacent sub-coordinate systems, α the angle between the z axes of adjacent sub-coordinate systems, and θ the initial angle between the x axes plus the joint rotation angle to be calculated; the initial state table shown in FIG. 16 can then be obtained. As one example, X0 and X1 intersect, so d is 0; Z0 and Z1 intersect, so a is 0; and the angle between Z0 and Z1 is 90°, so α is 90°.
It can be understood that the establishment of the base coordinate system is not limited.
S205: Based on the homogeneous transformation matrix, calculate the angular velocity of the calf according to the first coordinates and the first data.
From the lower-limb coordinate system described above, the distances and angles between the coordinate axes of adjacent joint coordinate systems can be obtained. For example, from the lower-limb coordinate system shown in FIG. 15, the distances between the x and z axes of adjacent coordinate systems and the angles between their z axes can be obtained. The homogeneous transformation matrix can then be constructed from these distances and angles, as follows:
Figure PCTCN2022127953-appb-000021
(1≤i≤6,i为正整数)
Figure PCTCN2022127953-appb-000021
(1≤i≤6, i is a positive integer)
其中,p为踝关节、膝关节、髋关节在上述基准坐标系中的坐标值,a为相邻子坐标系z轴距离,d为相邻子坐标系x轴距离,α为相邻子坐标系z轴夹角,θ为X轴间初始夹角+待计算关节转动角。如,依据图15所示的下肢坐标系,a为相邻子坐标系z轴距离,d为相邻子坐标系x轴距离,α为相邻子坐标系z轴夹角,其初始状态量可如图16所示。Here, p is the coordinate value of the ankle joint, knee joint, or hip joint in the above reference coordinate system, a is the distance between the z axes of adjacent sub-coordinate systems, d is the distance between their x axes, α is the included angle between their z axes, and θ is the initial angle between the X axes plus the joint rotation angle to be calculated. For example, according to the lower limb coordinate system shown in Figure 15, the initial values of these state quantities can be as shown in Figure 16.
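Under the assumption that the homogeneous transformation matrix (shown above only as an equation image) follows the standard Denavit-Hartenberg form built from a, d, α and θ, it can be sketched as:

```python
import numpy as np

def dh_transform(a, d, alpha, theta):
    """Homogeneous transform between adjacent joint frames.

    Assumes the standard Denavit-Hartenberg convention; the patent's
    exact matrix layout is only available as an image, so this is a
    sketch. a, d are link distances; alpha, theta are angles (radians).
    """
    ca, sa = np.cos(alpha), np.sin(alpha)
    ct, st = np.cos(theta), np.sin(theta)
    return np.array([
        [ct, -st * ca,  st * sa, a * ct],
        [st,  ct * ca, -ct * sa, a * st],
        [0.0,      sa,       ca,      d],
        [0.0,     0.0,      0.0,    1.0],
    ])

# Chaining A_1 ... A_i of adjacent frames gives the pose of frame i in
# the base (hip) frame.
T = dh_transform(0.0, 0.0, np.pi / 2, 0.0)  # alpha = 90 deg, as in the example
print(T[2, 1])  # sin(alpha) = 1.0
```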
依据人体关节在上述基准坐标系(如髋关节坐标系)中的坐标构建如下矩阵:Construct the following matrix according to the coordinates of the human body joints in the above-mentioned reference coordinate system (such as the hip joint coordinate system):
Figure PCTCN2022127953-appb-000022
Figure PCTCN2022127953-appb-000022
其中,m为依据A矩阵相乘获得的系数,即三角函数相乘的系数,p为第一坐标,a为相邻坐标系z轴距离、d为相邻坐标系x轴距离、α为相邻坐标系z轴之间的夹角,θ为初始夹角+人体关节的转动角。Here, m is a coefficient obtained by multiplying the A matrices, that is, a coefficient formed by products of trigonometric functions; p is the first coordinate; a is the distance between the z axes of adjacent coordinate systems; d is the distance between their x axes; α is the included angle between their z axes; and θ is the initial included angle plus the rotation angle of the human body joint.
通过将T矩阵中的数据对应至齐次变换矩阵可以求解出转动角θ值,通过求导计算出该环节的角速度。示例性的,可以将该关节的转动角通过一阶微分推导出该环节的角速度。By corresponding the data in the T matrix to the homogeneous transformation matrix, the value of the rotation angle θ can be solved, and the angular velocity of this link can be calculated by derivation. Exemplarily, the angular velocity of the link can be derived from the rotation angle of the joint through a first-order differential.
通过将T矩阵中的数据对应至齐次变换矩阵,并带入检测到的膝关节的上述数据,可以计算出小腿的角速度。By matching the data in the T matrix to the homogeneous transformation matrix and substituting the above detected data of the knee joint, the angular velocity of the lower leg can be calculated.
进一步的,通过在T矩阵和齐次变换矩阵带入检测到的髋关节的上述数据,还可以计算出大腿的角速度。Further, the angular velocity of the thigh can also be calculated by bringing the above-mentioned detected data of the hip joint into the T matrix and the homogeneous transformation matrix.
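Once θ has been solved frame by frame, the first-order differential mentioned above amounts to differencing the per-frame angle sequence. A minimal sketch using `np.diff` (the angle values and frame rate below are illustrative, not from the patent):

```python
import numpy as np

# Joint rotation angles theta solved from the T matrix at successive
# video frames (illustrative values, in radians), sampled every dt s.
theta = np.array([0.00, 0.05, 0.12, 0.21, 0.32])
dt = 0.04  # e.g. 25 frames per second

# First-order difference approximates d(theta)/dt, i.e. the angular
# velocity of the segment (e.g. the lower leg).
omega = np.diff(theta) / dt
print(omega)
```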
S206:基于第一数据、第二数据、踝关节的第一数值和第二数值及膝关节的角速度计算膝关节的第三数值和第四数值。S206: Calculate a third value and a fourth value of the knee joint based on the first data, the second data, the first value and the second value of the ankle joint, and the angular velocity of the knee joint.
基于通过上述获取的小腿质量、质心和转动惯量、小腿质心的速度、膝关节和踝关节的位置信息、踝关节的关节力和力矩及上述计算出的小腿的角速度可以计算出膝关节的第三数值和第四数值。其中,第三数值是指膝关节的关节力,第四数值是指膝关节的力矩。可以通过以下公式计算膝关节的第三数值和第四数值:Based on the calf mass, center of mass and moment of inertia obtained above, the velocity of the calf's center of mass, the position information of the knee and ankle joints, the joint force and moment of the ankle joint, and the angular velocity of the calf calculated above, the third value and the fourth value of the knee joint can be calculated. Here, the third value refers to the joint force of the knee joint, and the fourth value refers to the moment of the knee joint. The third and fourth values of the knee joint can be calculated by the following formulas:
Figure PCTCN2022127953-appb-000023
Figure PCTCN2022127953-appb-000023
Figure PCTCN2022127953-appb-000024
Figure PCTCN2022127953-appb-000024
其中,F3为第三数值,M4为第四数值,m shank为所述第一数据中的小腿质量,
Figure PCTCN2022127953-appb-000025
为所述第二数据中小腿质心的速度,r shank为小腿质心到参考点的矢量,r foot为足部质心到参考点的矢量,J shank为小腿的转动惯量,
Figure PCTCN2022127953-appb-000026
为小腿的角速度。
Wherein, F3 is the third numerical value, M4 is the fourth numerical value, m shank is the calf mass in the first data,
Figure PCTCN2022127953-appb-000025
is the velocity of the center of mass of the calf in the second data, r shank is the vector from the center of mass of the calf to the reference point, r foot is the vector from the center of mass of the foot to the reference point, J shank is the moment of inertia of the calf,
Figure PCTCN2022127953-appb-000026
is the angular velocity of the leg.
可理解的是,计算出膝关节的第三数值和第四数值后,可以结合大腿质量、质心、转动惯量、运动速度、膝关节和髋关节的位置信息、膝关节的关节力和力矩及上述计算出的大腿的角速度计算出髋关节的第五数值和第六数值。其中,第五数值是指髋关节的关节力,第六数值是指髋关节的力矩。可以通过以下公式计算髋关节的第五数值和第六数值:It can be understood that after the third value and the fourth value of the knee joint are calculated, the fifth value and the sixth value of the hip joint can be calculated by combining the thigh mass, center of mass, moment of inertia and movement speed, the position information of the knee and hip joints, the joint force and moment of the knee joint, and the angular velocity of the thigh calculated above. Here, the fifth value refers to the joint force of the hip joint, and the sixth value refers to the moment of the hip joint. The fifth and sixth values of the hip joint can be calculated by the following formulas:
Figure PCTCN2022127953-appb-000027
Figure PCTCN2022127953-appb-000027
Figure PCTCN2022127953-appb-000028
Figure PCTCN2022127953-appb-000028
其中,F 5为第五数值,M 6为第六数值,m thigh为大腿质量,
Figure PCTCN2022127953-appb-000029
为大腿质心的速度,r thigh为大腿质心到参考点的矢量,r shank为小腿质心到参考点的矢量,J thigh为所述第一数据中大腿的转动惯量,
Figure PCTCN2022127953-appb-000030
为所述大腿的角速度。
Among them, F 5 is the fifth value, M 6 is the sixth value, m thigh is the mass of the thigh,
Figure PCTCN2022127953-appb-000029
is the velocity of the center of mass of the thigh, r thigh is the vector from the center of mass of the thigh to the reference point, rshank is the vector from the center of mass of the calf to the reference point, J thigh is the moment of inertia of the thigh in the first data,
Figure PCTCN2022127953-appb-000030
is the angular velocity of the thigh.
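The knee and hip calculations above share the same bottom-up structure: each step combines the segment's inertial data, the force and moment already computed at the distal joint, and the segment's angular velocity. Since the exact equations are shown only as images, the sketch below uses a generic textbook Newton-Euler step as an assumption about their form; all numeric values are illustrative:

```python
import numpy as np

def newton_euler_step(m, dv_c, J, domega, F_distal, M_distal,
                      r_seg, r_distal, g=(0.0, 0.0, -9.81)):
    """One bottom-up Newton-Euler step for a body segment.

    A generic textbook form, used here as an assumption (the patent's
    exact equations for F3/M4 and F5/M6 are only available as images):
      F_prox = m * dv_c - m * g - F_distal
      M_prox = J * domega - M_distal
               - r_seg x F_prox - r_distal x F_distal
    """
    g = np.asarray(g)
    F_prox = m * np.asarray(dv_c) - m * g - np.asarray(F_distal)
    M_prox = (J * np.asarray(domega) - np.asarray(M_distal)
              - np.cross(r_seg, F_prox) - np.cross(r_distal, F_distal))
    return F_prox, M_prox

# Shank step (ankle -> knee), then thigh step (knee -> hip).
F_ankle, M_ankle = np.array([0.0, 0.0, 400.0]), np.zeros(3)
F_knee, M_knee = newton_euler_step(
    m=3.5, dv_c=np.zeros(3), J=0.05, domega=np.zeros(3),
    F_distal=F_ankle, M_distal=M_ankle,
    r_seg=np.array([0.0, 0.0, 0.2]), r_distal=np.array([0.0, 0.0, 0.4]))
F_hip, M_hip = newton_euler_step(
    m=7.0, dv_c=np.zeros(3), J=0.12, domega=np.zeros(3),
    F_distal=F_knee, M_distal=M_knee,
    r_seg=np.array([0.0, 0.0, 0.25]), r_distal=np.array([0.0, 0.0, 0.45]))
print(F_knee, F_hip)
```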
可理解,电子设备检测用户的运动动作时,可以将检测到的用户的运动图像显示在屏幕上。进一步的,电子设备还可以在显示的运动图像上显示用户的关节受力情况,如关节受力部位和关节力大小。It can be understood that when the electronic device detects the motion of the user, it may display the detected motion image of the user on the screen. Furthermore, the electronic device can also display the user's joint force on the displayed motion image, such as the position of the joint force and the magnitude of the joint force.
在一种可能的实施方式中,电子设备可以显示用户的运动图像。如电子设备可以在显示运动课程中的动作示范的同时显示用户的运动图像,电子设备也可以只显示用户的运动图像。如图8A所示,电子设备显示用户界面80,用户界面80可以显示有运动课程示范动作的图像802和检测到的用户的运动图像801。In a possible implementation manner, the electronic device may display the moving image of the user. For example, the electronic device can display the user's moving image while displaying the action demonstration in the exercise course, or the electronic device can only display the user's moving image. As shown in FIG. 8A , the electronic device displays a user interface 80 , and the user interface 80 may display an image 802 of an exercise course demonstration action and a detected motion image 801 of the user.
在另一种可能的实施方式中,显示用户运动图像时还可以将上述计算获得的关节力或力矩的数值通过色标叠加显示于运动图像中用户的对应部位。具体的,如图8B所示,可以在用户对应受力部位叠加圆,并在圆旁显示该部位的关节力/力矩的数值。还可以在用户对应受力部位叠加具有颜色的圆,颜色可以包括黄绿红:对应受力数值较小,可以显示黄色;受力数值逐渐增大,则从黄色变为绿色,再变为红色。可以理解的是,叠加色标的形状和颜色等并不受此限制。还可以理解的是,图8B中显示的用户图像是为了更清晰地示明,可以通过标记注明受力部位;图8B中的用户图像可以是显示通过上述检测出的骨骼节点图,也可以显示用户的实际运动图像,并在该运动图像上进行标记。In another possible implementation manner, when the user's moving image is displayed, the value of the joint force or moment obtained through the above calculation may also be superimposed on the corresponding part of the user in the moving image through a color code. Specifically, as shown in FIG. 8B, a circle may be superimposed on the corresponding force-bearing part of the user, and the value of the joint force/moment of that part may be displayed beside the circle. A colored circle may also be superimposed on the corresponding force-bearing part; the colors may include yellow, green and red: a small force value may be displayed in yellow, and as the force value gradually increases, the color changes from yellow to green and then to red. It can be understood that the shape and color of the superimposed color code are not limited thereto. It can also be understood that the force-bearing parts in the user image shown in FIG. 8B are marked for clearer illustration; the user image in FIG. 8B may be the skeletal node diagram detected as described above, or the user's actual moving image with the markings superimposed on it.
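The yellow-to-green-to-red color coding described above can be sketched as a simple threshold mapping. The 0.6 and 0.9 cut-offs below are illustrative assumptions, not values given for this overlay:

```python
def force_color(force_ratio):
    """Map a joint-force value, expressed as a fraction of its
    threshold, to the overlay color described in the text: small
    values show yellow, larger values turn green, the largest red.
    The 0.6 / 0.9 cut-offs are illustrative assumptions.
    """
    if force_ratio < 0.6:
        return "yellow"
    if force_ratio < 0.9:
        return "green"
    return "red"

print(force_color(0.3), force_color(0.7), force_color(0.95))
```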
在一种可能的实现中,电子设备对运动课程进行模拟检测时,可以显示运动课程中标准示范动作的图像,如图8C所示。In a possible implementation, when the electronic device simulates and detects the exercise course, it may display images of standard demonstration actions in the exercise course, as shown in FIG. 8C .
可选的,电子设备在受力检测前或受力检测过程中可以实时检测重心地面投影,以判断用户重心是否偏离稳定区间。若用户重心偏离稳定区间,用户可能出现运动姿势不稳或摔倒情形,电子设备可以提示用户调整身体重心。该提示可以通过用户界面文字提示,也可以在运动过程中检测到用户有偏离稳定区间情形时,语音提示用户调整身体重心。Optionally, the electronic device can detect the ground projection of the center of gravity in real time before or during the force detection, so as to determine whether the user's center of gravity deviates from the stable range. If the user's center of gravity deviates from the stable range, the user may experience unstable posture or fall, and the electronic device can prompt the user to adjust the body's center of gravity. The prompt can be prompted through text on the user interface, or when it is detected that the user deviates from the stable range during exercise, the user can be prompted to adjust the center of gravity by voice.
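The stability check described here can be sketched as testing whether the projected center of gravity lies inside a support region. The axis-aligned rectangle below is an assumption, since the embodiment does not specify the region's shape:

```python
def com_within_support(com_xy, rect_min, rect_max):
    """Check whether the ground projection of the center of gravity
    falls inside a stable region.

    Minimal sketch: the embodiment does not specify the region's
    shape, so an axis-aligned rectangle (e.g. a bounding box around
    the feet) is assumed here.
    """
    x, y = com_xy
    return (rect_min[0] <= x <= rect_max[0]
            and rect_min[1] <= y <= rect_max[1])

# If the projection leaves the region, the device would prompt the
# user (on screen or by voice) to adjust the body's center of gravity.
print(com_within_support((0.1, 0.0), (-0.2, -0.1), (0.2, 0.1)))
```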
S107:电子设备基于目标对象运动时各关节的受力情况判断产生运动风险的情况,输出风险提示信息。S107: The electronic device judges the occurrence of a movement risk based on the stress on each joint when the target object moves, and outputs risk prompt information.
在具体实施例中,电子设备对目标对象进行受力检测时,可以获得目标对象运动时各关节的对应关节受力数据,如踝关节的第一数值和第二数值、所述膝关节的第三数值和第四数值及所述髋关节的第五数值和第六数值中的至少一项,将该数据与参考数据比较,数值比较结果超过预设阈值时,电子设备可以显示风险提示信息。In a specific embodiment, when the electronic device performs force detection on the target object, it can obtain the joint force data of each joint while the target object moves, such as at least one of the first value and the second value of the ankle joint, the third value and the fourth value of the knee joint, and the fifth value and the sixth value of the hip joint. The data is compared with reference data, and when the comparison result exceeds a preset threshold, the electronic device may display risk prompt information.
可以理解的是,预设阈值可以根据实际需要进行设置,本申请对此不作限制。It can be understood that the preset threshold can be set according to actual needs, which is not limited in the present application.
关节受力情况可以是关节力数值。关节力的上述参考数据可以是人体关节受力阈值(可以是上述BMI受力),也可以是上述承力参考值。关节受力阈值可以是在实验中,统计一定数量用户的关节力最大值评估而来。电子设备可以通过判断关节力与人体关节受力阈值或承力参考值的比值是否超过预设阈值(如数值1)来判断运动损伤风险。The joint stress situation may be a joint force value. The above reference data of the joint force may be a human joint force threshold (which may be the above BMI-based force), or the above force-bearing reference value. The joint force threshold may be evaluated experimentally by collecting statistics on the maximum joint forces of a certain number of users. The electronic device can judge the risk of sports injury by determining whether the ratio of the joint force to the human joint force threshold or the force-bearing reference value exceeds a preset threshold (such as the value 1).
可理解的,电子设备还可以基于关节力矩判断产生运动风险的情况,输出风险提示。电子设备可以通过目标对象的关节力矩计算累计做功,并判断累计做功与累计做功阈值的比值是否超过预设阈值(如1),以此判断运动损伤风险。It is understandable that the electronic device can also judge the occurrence of exercise risk based on the joint moment and output a risk prompt. The electronic device can calculate the cumulative work from the joint moments of the target object, and judge the risk of sports injury by determining whether the ratio of the cumulative work to a cumulative-work threshold exceeds a preset threshold (such as 1).
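The cumulative-work comparison can be sketched as follows, under the assumption that work is accumulated as the sum of |moment × angular velocity| over time steps (the embodiment does not give the exact integration rule):

```python
import numpy as np

def injury_risk_exceeded(moments, omegas, dt, work_threshold,
                         ratio_threshold=1.0):
    """Accumulate mechanical work from joint moment * angular velocity
    and compare it with a cumulative-work threshold.

    Assumption: work is integrated as sum(|M * omega|) * dt; the
    embodiment does not specify the integration rule.
    """
    work = float(np.sum(np.abs(np.asarray(moments)
                               * np.asarray(omegas))) * dt)
    return work / work_threshold >= ratio_threshold

# Illustrative values: two samples of knee moment (N*m) and angular
# velocity (rad/s) at 25 fps, against an arbitrary threshold.
print(injury_risk_exceeded([10.0, 12.0], [2.0, 2.5],
                           dt=0.04, work_threshold=1.0))
```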
在一些实施例中,电子设备可以显示第一提示。如图9A所示,电子设备100基于上述受力检测,可以显示用户界面90。用户界面90可以显示提示901,电子设备100检测到作用于控件903的用户操作(如在控件903上的点击操作),响应于该操作,电子设备100可以显示如图8A或图8B所示的受力检测的用户界面。In some embodiments, the electronic device may display a first prompt. As shown in FIG. 9A, the electronic device 100 may display a user interface 90 based on the above force detection. The user interface 90 may display a prompt 901. When the electronic device 100 detects a user operation acting on the control 903 (such as a click operation on the control 903), in response to the operation, the electronic device 100 may display the force-detection user interface shown in FIG. 8A or FIG. 8B.
在一些实施例中,电子设备显示第一提示后,可以显示第二提示。举例说明,电子设备基于上述受力检测,获得用户运动时各部位的对应关节力数值,将该数值与对应的关节受力阈值比较,若关节力数值小于关节受力阈值,则电子设备重复执行受力检测;若关节力数值大于或等于关节受力阈值,则电子设备输出第一提示。举例说明,如图9A和图9C所示,电子设备100检测到关节力数值大于关节受力阈值时,可以显示用户界面90,用户界面90可以显示提示框901以提示用户当前运动动作存在较高损伤风险,还可以显示用户存在损伤风险的原因,如用户左腿受力过大。电子设备显示提示框901超出预设时间段后,可以显示用户界面92,用户界面92可以显示提示框907以指导用户调整运动姿态。电子设备100可以检测到用户选择返回原运动课程的操作或检测到用户界面92显示时间超出预设时间段,可以显示如图8A或图8B所示的受力检测的用户界面。In some embodiments, after displaying the first prompt, the electronic device may display a second prompt. For example, based on the above force detection, the electronic device obtains the joint force value of each body part while the user is exercising and compares the value with the corresponding joint force threshold. If the joint force value is less than the joint force threshold, the electronic device repeats the force detection; if the joint force value is greater than or equal to the joint force threshold, the electronic device outputs the first prompt. For example, as shown in FIG. 9A and FIG. 9C, when the electronic device 100 detects that the joint force value is greater than the joint force threshold, it may display the user interface 90, and the user interface 90 may display a prompt box 901 to warn the user that the current exercise action carries a high injury risk; it may also display the reason for the injury risk, for example that the user's left leg bears too much force. After the prompt box 901 has been displayed beyond a preset time period, the electronic device may display a user interface 92, and the user interface 92 may display a prompt box 907 to guide the user to adjust the exercise posture. When the electronic device 100 detects that the user chooses to return to the original exercise course, or detects that the display time of the user interface 92 exceeds a preset time period, it may display the force-detection user interface shown in FIG. 8A or FIG. 8B.
在另一些实施例中,电子设备显示第一信息后,可以显示第一选项和第二选项,检测到用户作用于第一选项的用户操作时,响应于该操作,电子设备可以显示第二提示;电子设备检测到用户作用于第二选项的用户操作时,响应于该操作,电子设备可以显示第三提示。In some other embodiments, after displaying the first information, the electronic device may display a first option and a second option. When a user operation acting on the first option is detected, in response to the operation, the electronic device may display the second prompt; when the electronic device detects a user operation acting on the second option, in response to the operation, the electronic device may display a third prompt.
具体的,如图9A、图9B、图9C所示,电子设备100基于受力检测,可以显示用户界面90,提示当前运动动作具有较高风险。电子设备100在预设时间段内未检测到作用于控件903的用户操作,或用户界面90显示时间超出预设时间段时,可以显示用户界面91。电子设备可以检测到作用于控件905的用户操作(如在控件905上的点击操作),响应于该操作,电子设备100可以显示用户界面92。用户界面92可以包括提示框907,提示框907可以显示存在较高风险的运动动作的调整方案。电子设备100可以检测到用户界面92显示时间超出预设时间段,可以显示如图8A或图8B所示的受力检测的用户界面。Specifically, as shown in FIG. 9A, FIG. 9B, and FIG. 9C, based on the force detection, the electronic device 100 may display the user interface 90 to prompt that the current exercise action carries a high risk. When the electronic device 100 does not detect a user operation acting on the control 903 within a preset time period, or the display time of the user interface 90 exceeds the preset time period, it may display a user interface 91. The electronic device may detect a user operation acting on the control 905 (such as a click operation on the control 905), and in response to the operation, the electronic device 100 may display the user interface 92. The user interface 92 may include a prompt box 907, and the prompt box 907 may display an adjustment plan for the high-risk exercise action. When the electronic device 100 detects that the display time of the user interface 92 exceeds a preset time period, it may display the force-detection user interface shown in FIG. 8A or FIG. 8B.
在具体实施例中,如图9A、图9B、图9D所示,电子设备100基于受力检测,可以显示用户界面90,提示当前运动动作具有较高风险。电子设备100在预设时间段内未检测到作用于控件903的用户操作,或用户界面90显示时间超出预设时间段时,可以显示用户界面91。电子设备可以检测到作用于控件906的用户操作(如在控件906上的点击操作),响应于该操作,电子设备100可以显示用户界面93。电子设备100可以检测到作用于用户界面93中控件908的用户操作(如在控件908上的点击操作),响应于该操作,电子设备100可以显示如图8A或图8B所示的受力检测的用户界面或用户界面92。电子设备100也可以检测到作用于控件909的用户操作(如在控件909上的点击操作),响应于该操作,电子设备100可以切换为用户选择的推荐运动课程。In a specific embodiment, as shown in FIG. 9A, FIG. 9B, and FIG. 9D, based on the force detection, the electronic device 100 may display the user interface 90 to prompt that the current exercise action carries a high risk. When the electronic device 100 does not detect a user operation acting on the control 903 within a preset time period, or the display time of the user interface 90 exceeds the preset time period, it may display a user interface 91. The electronic device may detect a user operation acting on the control 906 (such as a click operation on the control 906), and in response to the operation, the electronic device 100 may display a user interface 93. The electronic device 100 may detect a user operation acting on a control 908 in the user interface 93 (such as a click operation on the control 908), and in response to the operation, may display the force-detection user interface shown in FIG. 8A or FIG. 8B, or the user interface 92. The electronic device 100 may also detect a user operation acting on a control 909 (such as a click operation on the control 909), and in response to the operation, may switch to the recommended exercise course selected by the user.
在对运动课程进行模拟检测的情形中,电子设备可以在检测到运动动作相对用户而言具有一定损伤风险时,即输出风险提示;电子设备也可以在对运动课程模拟检测完成后,输出该运动课程的风险提示。举例说明,如图9A-图9D所示,电子设备100可以在检测到运动课程中的运动动作相对用户而言具有较高损伤风险时,即输出该运动动作存在较高风险的风险提示,具体步骤不再赘述。又例如,如图9E所示,电子设备100在对运动课程检测完成后,显示用户界面94,用户界面94用于提示模拟检测已完成,还可以显示该运动课程的损伤风险程度、具有较高风险的运动动作等。In the case of simulated detection of an exercise course, the electronic device may output a risk prompt as soon as it detects that an exercise action carries a certain injury risk for the user; the electronic device may also output the risk prompt for the exercise course after the simulated detection of the course is completed. For example, as shown in FIG. 9A to FIG. 9D, when the electronic device 100 detects that an exercise action in the exercise course carries a high injury risk for the user, it may output a risk prompt indicating that the action carries a high risk; the specific steps are not repeated here. For another example, as shown in FIG. 9E, after completing the detection of the exercise course, the electronic device 100 displays a user interface 94. The user interface 94 is used to prompt that the simulated detection has been completed, and may also display the injury risk level of the exercise course, the high-risk exercise actions in it, and so on.
在一些实施例中,电子设备基于上述受力检测,获得目标对象运动时各部位的对应关节力数值,可以将该数值与承力参考值比较,若关节力数值/关节力阈值的比值小于0.6,则该运动动作为低风险运动动作,电子设备重复执行受力检测;若关节力数值/关节力阈值的比值为0.6-0.9,则该运动动作为中风险运动动作,电子设备可以继续进行受力检测,同时电子设备还可以输出调整该运动动作的指导信息;若关节力数值/关节力阈值的比值大于0.9,则该运动动作为高风险运动动作,电子设备可以输出风险提示信息。In some embodiments, based on the above force detection, the electronic device obtains the corresponding joint force value of each part when the target object moves, and can compare the value with the force reference value, if the ratio of the joint force value/joint force threshold is less than 0.6 , the movement is a low-risk movement, and the electronic device repeatedly performs force detection; if the ratio of the joint force value/joint force threshold is 0.6-0.9, the movement is a medium-risk movement, and the electronic device can continue to perform stress detection. At the same time, the electronic device can also output guidance information for adjusting the movement; if the ratio of joint force value/joint force threshold is greater than 0.9, the movement is a high-risk movement, and the electronic device can output risk warning information.
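The three risk tiers described in this paragraph map directly onto a small classification function; the 0.6 and 0.9 cut-offs are the ones given in the text, while the returned labels and the commented actions are a sketch of the described behavior:

```python
def classify_motion_risk(joint_force, force_threshold):
    """Classify an exercise action by the ratio of joint force to its
    threshold, using the cut-offs from the text:
    < 0.6 low risk, 0.6-0.9 medium risk, > 0.9 high risk.
    """
    ratio = joint_force / force_threshold
    if ratio < 0.6:
        return "low"     # keep repeating the force detection
    if ratio <= 0.9:
        return "medium"  # keep detecting and output guidance info
    return "high"        # output risk prompt information

print(classify_motion_risk(500.0, 1000.0))  # low
```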
图17示出了另一种运动分析的方法的流程,该方法是针对运动课程进行关节力/力矩的模拟检测。如图17所示,该方法可以包括:FIG. 17 shows the flow of another motion analysis method, which is to perform simulated detection of joint force/torque for a motion course. As shown in Figure 17, the method may include:
S301:电子设备接收用户针对第一应用的用户操作,该用户操作用于指示电子设备获取该用户的身份评估信息。S301: The electronic device receives a user operation by a user on a first application, where the user operation is used to instruct the electronic device to obtain identity evaluation information of the user.
上述电子设备接收用户针对第一应用的用户操作和电子设备获取该用户的身份评估信息可以参考上述步骤S101。For the electronic device receiving the user's user operation on the first application and the electronic device acquiring the user's identity evaluation information, reference may be made to the above step S101.
S302:电子设备获取用户的身份评估信息。S302: The electronic device acquires identity evaluation information of the user.
S302可以参考上述步骤S102中的相关描述。For S302, reference may be made to relevant descriptions in the above step S102.
S303:电子设备检测运动课程中标准示范动作的图像中骨骼节点的位置,以获得骨骼节点的空间位置关系。S303: The electronic device detects the position of the bone nodes in the image of the standard demonstration action in the exercise course, so as to obtain the spatial position relationship of the bone nodes.
S303可以参考上述步骤S103的相关描述。For S303, reference may be made to the relevant description of the above-mentioned step S103.
S304:电子设备对运动课程中标准示范动作的图像进行空间姿态校准,构建参考坐标系。S304: The electronic device performs spatial attitude calibration on images of standard demonstration actions in the exercise course, and constructs a reference coordinate system.
S304可以参考上述步骤S104的相关描述。For S304, reference may be made to the relevant description of the above-mentioned step S104.
S305:电子设备基于骨骼节点,结合运动课程中标准示范动作的图像进行受力分析,获取受力情况。S305: Based on the skeletal nodes, the electronic device performs force analysis in combination with images of standard demonstration actions in the exercise course to obtain force conditions.
在该步骤中,电子设备需自行设定离地参考值,离地参考值用于判定运动课程中标准示范动作的图像是否处于离地状态。具体可以参考上述步骤S105中的相关描述。In this step, the electronic device needs to set an off-ground reference value by itself; the off-ground reference value is used to determine whether the image of the standard demonstration action in the exercise course is in an off-ground state. For details, reference may be made to the relevant description in step S105 above.
可理解,电子设备可以针对运动课程中标准示范动作的图像获取相应的身高、体重信息,结合身高体重信息和骨骼节点位置,可获得骨骼节点之间的质量、质心位置、转动惯量等。具体可参考上述步骤S106中的相关描述。It can be understood that the electronic device can obtain the corresponding height and weight information for the image of the standard demonstration action in the exercise course, and combine the height and weight information and the position of the bone nodes to obtain the mass between the bone nodes, the position of the center of mass, the moment of inertia, etc. For details, reference may be made to relevant descriptions in the above step S106.
S306:电子设备基于运动课程中标准示范动作的图像中各关节的受力情况判断产生运动风险的情况,输出风险提示信息。S306: The electronic device judges the occurrence of exercise risk based on the stress of each joint in the image of the standard demonstration action in the exercise course, and outputs risk prompt information.
S306可以参考上述步骤S107中的相关描述。For S306, reference may be made to relevant descriptions in the above-mentioned step S107.
以上所述,以上实施例仅用以说明本申请的技术方案,而非对其限制;尽管参照前述实施例对本申请进行了详细的说明,本领域的普通技术人员应当理解:其依然可以对前述各实施例所记载的技术方案进行修改,或者对其中部分技术特征进行等同替换;而这些修改或者替换,并不使相应技术方案的本质脱离本申请各实施例技术方案的范围。The above embodiments are only used to illustrate the technical solutions of the present application, not to limit them. Although the present application has been described in detail with reference to the foregoing embodiments, those of ordinary skill in the art should understand that they may still modify the technical solutions described in the foregoing embodiments, or make equivalent replacements for some of the technical features therein; and these modifications or replacements do not make the essence of the corresponding technical solutions depart from the scope of the technical solutions of the embodiments of the present application.

Claims (23)

  1. 一种运动分析方法,其特征在于,包括:A motion analysis method, characterized in that, comprising:
    依据目标对象的身体参数获取所述目标对象的第一数据;所述第一数据包括人体环节的质量、质心、转动惯量;Acquiring the first data of the target object according to the physical parameters of the target object; the first data includes the mass, center of mass, and moment of inertia of the human body link;
    获取所述目标对象的第二数据和足部的离地状态;所述第二数据包括人体环节质心的运动速度、角速度及人体关节位置信息;Obtaining the second data of the target object and the ground-off state of the feet; the second data includes the movement velocity, angular velocity and position information of the human body joints of the center of mass of the human body link;
    基于所述离地状态计算所述踝关节的第一数值和第二数值;calculating a first value and a second value of the ankle joint based on the ground lift state;
    基于所述目标对象的运动姿势,构建第一坐标系,所述第一坐标系用于构建齐次变换矩阵和获取所述人体关节在所述第一坐标系的第一坐标;Constructing a first coordinate system based on the movement posture of the target object, the first coordinate system is used to construct a homogeneous transformation matrix and obtain the first coordinates of the human joints in the first coordinate system;
    基于所述齐次变换矩阵,根据所述第一坐标和所述第一数据计算小腿的角速度;calculating the angular velocity of the calf according to the first coordinates and the first data based on the homogeneous transformation matrix;
    基于所述第一数据、所述第二数据、所述踝关节的第一数值和第二数值及所述小腿的角速度计算所述膝关节的第三数值和第四数值。calculating a third value and a fourth value of the knee joint based on the first data, the second data, the first value and the second value of the ankle joint, and the angular velocity of the lower leg.
  2. 根据权利要求1所述的方法,其特征在于,所述方法还包括:The method according to claim 1, further comprising:
    基于所述齐次变换矩阵,根据所述第一坐标和所述第一数据计算大腿的角速度;calculating the angular velocity of the thigh according to the first coordinates and the first data based on the homogeneous transformation matrix;
    基于所述第一数据、所述第二数据、所述踝关节的第一数值和第二数值、所述膝关节的第三数值和第四数值及所述大腿的角速度计算所述髋关节的第五数值和第六数值。Calculate the hip joint based on the first data, the second data, the first value and the second value of the ankle joint, the third value and the fourth value of the knee joint and the angular velocity of the thigh fifth and sixth values.
  3. 根据权利要求1或2所述的方法,其特征在于,基于所述离地状态计算所述踝关节的第一数值和第二数值,包括:The method according to claim 1 or 2, wherein calculating the first value and the second value of the ankle joint based on the off-the-ground state comprises:
    当所述离地状态为第一状态的情况下,所述踝关节的第一数值和第二数值均为0;When the ground-off state is the first state, both the first value and the second value of the ankle joint are 0;
    当所述离地状态为第二状态的情况下,根据所述第一数据和所述第二数据计算所述踝关节的第一数值和第二数值;When the ground-off state is a second state, calculate a first value and a second value of the ankle joint according to the first data and the second data;
    当所述离地状态为第三状态的情况下,根据所述第一数据、所述第二数据计算所述踝关节的第一数值和第二数值;所述第二数据为所述人体关节的位置信息,第二坐标为所述目标对象的重心投影坐标,第三坐标、第四坐标为所述踝关节坐标,所述第二坐标、第三坐标、第四坐标根据所述第二数据获得。When the ground-off state is the third state, calculating the first value and the second value of the ankle joint according to the first data and the second data; the second data is the position information of the human body joints, the second coordinate is the projected coordinate of the center of gravity of the target object, the third coordinate and the fourth coordinate are the coordinates of the ankle joints, and the second coordinate, the third coordinate and the fourth coordinate are obtained according to the second data.
  4. 根据权利要求3所述的方法,其特征在于,所述根据所述第一数据和所述第二数据计算所述踝关节的第一数值和第二数值,包括:通过以下公式计算所述第一数值和第二数值,The method according to claim 3, wherein the calculating the first value and the second value of the ankle joint according to the first data and the second data comprises: calculating the first value by the following formula a value and a second value,
    F 1+F 2=Σm iΔv ci+G F 1 +F 2 =Σm i Δv ci +G
    M 1+M 2=Σ(J iΔω i-r i×m ig) M 1 +M 2 =Σ(J i Δω i -r i ×m i g)
    其中:F1、F2为所述踝关节的第一数值,M1、M2为所述踝关节的第二数值,m i为所述人体环节的质量,v ci为所述人体环节质心的运动速度,G为根据所述身体参数计算的所述用户的重量,J i为所述人体环节的转动惯量,ω i为所述人体环节质心的角速度,r i为依据所述人体环节质心到参考点的矢量,g为重力加速度。Wherein F1 and F2 are the first values of the ankle joints, M1 and M2 are the second values of the ankle joints, m i is the mass of the human body link, v ci is the movement velocity of the center of mass of the human body link, G is the weight of the user calculated according to the body parameters, J i is the moment of inertia of the human body link, ω i is the angular velocity of the center of mass of the human body link, r i is the vector from the center of mass of the human body link to the reference point, and g is the gravitational acceleration.
  5. 根据权利要求3所述的方法,其特征在于,所述根据所述第一数据、所述第二数据计算所述踝关节的第一数值和第二数值;所述第二数据为所述人体关节的位置信息,第二坐标为所述目标对象的重心投影坐标,第三坐标、第四坐标为所述踝关节坐标,所述第二坐标、第三坐标、第四坐标根据所述第二数据获得,包括:通过以下公式计算所述第一数值和第二数值,The method according to claim 3, wherein the first value and the second value of the ankle joint are calculated according to the first data and the second data; the second data is the position information of the human body joints, the second coordinate is the projected coordinate of the center of gravity of the target object, the third coordinate and the fourth coordinate are the coordinates of the ankle joints, and the second coordinate, the third coordinate and the fourth coordinate are obtained according to the second data; and the calculating includes: calculating the first value and the second value by the following formulas,
    F 1+F 2=Σm iΔv ci+G F 1 +F 2 =Σm i Δv ci +G
    M 1+M 2=Σ(J iΔω i-r i×m ig) M 1 +M 2 =Σ(J i Δω i -r i ×m i g)
    Figure PCTCN2022127953-appb-100001
    Figure PCTCN2022127953-appb-100001
    Figure PCTCN2022127953-appb-100002
    Figure PCTCN2022127953-appb-100002
    Figure PCTCN2022127953-appb-100003
    Figure PCTCN2022127953-appb-100003
    Figure PCTCN2022127953-appb-100004
    Figure PCTCN2022127953-appb-100004
    其中:F1、F2为所述踝关节的第一数值,M1、M2为所述踝关节的第二数值,m i为所述人体环节的质量,v ci为所述人体环节质心的运动速度,G为根据所述身体参数计算的所述用户的重量,J i为所述人体环节的转动惯量,ω i为所述人体环节质心的角速度,r i为所述人体环节质心到参考点的矢量,P proj为所述第二坐标,P 1为所述第三坐标,P 2为所述第四坐标。Wherein F1 and F2 are the first values of the ankle joints, M1 and M2 are the second values of the ankle joints, m i is the mass of the human body link, v ci is the movement velocity of the center of mass of the human body link, G is the weight of the user calculated according to the body parameters, J i is the moment of inertia of the human body link, ω i is the angular velocity of the center of mass of the human body link, r i is the vector from the center of mass of the human body link to the reference point, P proj is the second coordinate, P 1 is the third coordinate, and P 2 is the fourth coordinate.
  6. 根据权利要求1-5任一项所述的方法,其特征在于,基于所述齐次变换矩阵,根据所述第一坐标和所述第一数据计算小腿的角速度,包括:将所述第一坐标对应以下公式计算所述人体关节的转动角,基于所述人体关节的转动角计算所述小腿的角速度,The method according to any one of claims 1-5, wherein, based on the homogeneous transformation matrix, calculating the angular velocity of the lower leg according to the first coordinates and the first data comprises: substituting the first coordinates into the following formulas to calculate the rotation angle of the human body joint, and calculating the angular velocity of the lower leg based on the rotation angle of the human body joint,
    Figure PCTCN2022127953-appb-100005
    Figure PCTCN2022127953-appb-100005
    Figure PCTCN2022127953-appb-100006
    Figure PCTCN2022127953-appb-100006
    其中,m为系数,p为所述第一坐标,a、d、α为所述第一坐标系中已知距离或角度,θ为初始夹角+所述人体关节的转动角。Wherein, m is a coefficient, p is the first coordinate, a, d, and α are known distances or angles in the first coordinate system, and θ is the initial included angle + the rotation angle of the human joint.
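The published rotation-angle formulas are image-only, so the sketch below takes the segment angle directly from two successive joint positions and differentiates it by finite differences. The sagittal-plane projection, the array layout, and the function signature are all assumptions made for illustration.

```python
import numpy as np

def segment_angular_velocity(joint_xyz, dt):
    """Angular velocity of a segment from two successive frames.

    joint_xyz: (2, 2, 3) array indexed [frame, endpoint, xyz] holding the
               segment's proximal and distal joint positions at t and t+dt.
    The claim derives joint rotation angles through a homogeneous
    transformation; taking the angle from the segment direction in the
    x-z (sagittal) plane is an illustrative simplification.
    """
    def angle(frame):
        v = frame[1] - frame[0]        # proximal -> distal direction
        return np.arctan2(v[2], v[0])  # quadrant-correct angle in x-z plane
    theta0, theta1 = angle(joint_xyz[0]), angle(joint_xyz[1])
    return (theta1 - theta0) / dt      # finite-difference angular velocity
```

A segment that rotates 0.1 rad over 0.1 s yields an angular velocity of 1 rad/s.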
  7. The method according to claim 6, wherein calculating the third value and the fourth value of the knee joint based on the first data, the second data, the first value and the second value of the ankle joint, and the angular velocity of the lower leg comprises: calculating the third value and the fourth value by the following formulas,
    [formula image PCTCN2022127953-appb-100007]
    [formula image PCTCN2022127953-appb-100008]
    wherein F3 is the third value, M4 is the fourth value, m_shank is the mass of the lower leg in the first data, [PCTCN2022127953-appb-100009] is the velocity of the center of mass of the lower leg in the second data, r_shank is the vector from the center of mass of the lower leg to the reference point derived from the first data and the second data, r_foot is the vector from the center of mass of the foot to the reference point derived from the first data and the second data, J_shank is the moment of inertia of the lower leg in the first data, and [PCTCN2022127953-appb-100010] is the angular velocity of the lower leg.
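This claim has the shape of one Newton-Euler step from the ankle up to the knee. Since the published formulas are image-only, the signs below follow the standard Newton-Euler recursion and are an assumption, as are the function name and the scalar treatment of J_shank.

```python
import numpy as np

G = np.array([0.0, 0.0, -9.81])  # gravitational acceleration vector

def knee_force_moment(f_ankle, m_ankle, m_shank, dv_shank, j_shank,
                      domega_shank, r_shank, r_foot):
    """One assumed Newton-Euler step: ankle values in, knee values out.

    f_ankle, m_ankle  : (3,) force and moment at the ankle (claim 5 values)
    m_shank           : lower-leg mass from the first data
    dv_shank          : (3,) change of the lower-leg CoM velocity per step
    j_shank           : lower-leg moment of inertia (scalar, an assumption)
    domega_shank      : (3,) change of the lower-leg angular velocity
    r_shank, r_foot   : (3,) CoM-to-reference-point vectors
    """
    # Force balance on the lower leg: the knee force equals the segment's
    # rate of change of momentum minus gravity and the ankle force.
    f_knee = m_shank * dv_shank - m_shank * G - f_ankle
    # Moment balance about the reference point.
    m_knee = (j_shank * domega_shank
              - np.cross(r_shank, m_shank * G)
              - m_ankle
              - np.cross(r_foot, f_ankle))
    return f_knee, m_knee
```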
  8. The method according to claim 7, wherein calculating the angular velocity of the thigh from the first coordinates and the first data, based on the homogeneous transformation matrix, comprises: substituting the first coordinates into the following formulas to calculate the rotation angles of the human body joints, and calculating the angular velocity of the thigh based on the rotation angles of the human body joints,
    [formula image PCTCN2022127953-appb-100011]
    [formula image PCTCN2022127953-appb-100012]
    wherein m is a coefficient, p is the first coordinate, a, d, and α are known distances or angles in the first coordinate system, and θ is the initial included angle plus the rotation angle of the human body joint.
  9. The method according to claim 8, wherein calculating the fifth value and the sixth value of the hip joint based on the first data, the second data, the first value and the second value of the ankle joint, the third value and the fourth value of the knee joint, and the angular velocity of the thigh comprises: calculating the fifth value and the sixth value by the following formulas,
    [formula image PCTCN2022127953-appb-100013]
    [formula image PCTCN2022127953-appb-100014]
    wherein F5 is the fifth value, M6 is the sixth value, m_thigh is the mass of the thigh in the first data, [PCTCN2022127953-appb-100015] is the velocity of the center of mass of the thigh in the second data, r_thigh is the vector from the center of mass of the thigh to the reference point derived from the first data and the second data, r_shank is the vector from the center of mass of the lower leg to the reference point derived from the first data and the second data, J_thigh is the moment of inertia of the thigh in the first data, and [PCTCN2022127953-appb-100016] is the angular velocity of the thigh.
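Claims 7 and 9 apply the same recursion to successive segments, so the whole chain from ankle to knee to hip can be expressed as one shared step. The published formulas are image-only, so the signs follow the textbook Newton-Euler recursion; the dictionary keys, function names, and scalar inertia are hypothetical choices for this sketch.

```python
import numpy as np

G = np.array([0.0, 0.0, -9.81])  # gravitational acceleration vector

def next_joint_load(seg, f_below, m_below):
    """One recursion step shared by the knee (claim 7) and hip (claim 9).

    seg: dict with 'm' (mass), 'dv' (CoM velocity change), 'J' (inertia),
         'domega' (angular velocity change), 'r_cm' and 'r_below'
         (CoM-to-reference-point vectors). Keys are illustrative.
    """
    f = seg["m"] * seg["dv"] - seg["m"] * G - f_below
    m = (seg["J"] * seg["domega"]
         - np.cross(seg["r_cm"], seg["m"] * G)
         - m_below
         - np.cross(seg["r_below"], f_below))
    return f, m

def leg_loads(shank, thigh, f_ankle, m_ankle):
    """Ankle values in; knee (F3, M4) and hip (F5, M6) values out."""
    f_knee, m_knee = next_joint_load(shank, f_ankle, m_ankle)
    f_hip, m_hip = next_joint_load(thigh, f_knee, m_knee)
    return (f_knee, m_knee), (f_hip, m_hip)
```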
  10. The method according to any one of claims 1-9, wherein the first coordinate system comprises a reference sub-coordinate system, a first sub-coordinate system, and a second sub-coordinate system;
    the homogeneous transformation matrix is constructed based on the relationships among the reference sub-coordinate system, the first sub-coordinate system, and the second sub-coordinate system; the relationships among the reference sub-coordinate system, the first sub-coordinate system, and the second sub-coordinate system include the distances and angles between their coordinate axes; and
    the first coordinates are the coordinates of the human body joints in the reference sub-coordinate system.
  11. The method according to any one of claims 1-10, wherein acquiring the off-ground state of the foot comprises:
    displaying a first user interface for setting an off-ground reference value, the off-ground reference value being used to determine the off-ground state of the foot; and
    receiving a setting operation for the off-ground reference value.
  12. The method according to any one of claims 1-11, wherein the method further comprises:
    displaying a second user interface, the second user interface displaying a first image of the target object with a first area and a first indicator superimposed on the first image; the first area is the area of the human body joints in the first image, and the first indicator is at least one of the first value and the second value of the ankle joint, the third value and the fourth value of the knee joint, and the fifth value and the sixth value of the hip joint.
  13. The method according to any one of claims 2-12, wherein, after calculating the fifth value and the sixth value of the hip joint, the method further comprises:
    determining whether a movement risk arises based on at least one of the first value and the second value of the ankle joint, the third value and the fourth value of the knee joint, and the fifth value and the sixth value of the hip joint.
  14. The method according to claim 13, wherein determining whether a movement risk arises comprises:
    comparing, against a first threshold, the ratio of at least one of the first value and the second value of the ankle joint, the third value and the fourth value of the knee joint, and the fifth value and the sixth value of the hip joint to a first reference value; and
    if the ratio is greater than the first threshold, outputting risk prompt information.
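The ratio-versus-threshold test of claim 14 can be sketched as follows; the dictionary input, the message text, and the return convention are illustrative assumptions rather than anything specified by the claims.

```python
def risk_check(joint_values, reference, threshold):
    """Flag a movement risk when a joint value / reference ratio exceeds
    the first threshold (claim 14). Message wording is illustrative."""
    for name, value in joint_values.items():
        ratio = value / reference
        if ratio > threshold:
            return f"Risk warning: {name} load ratio {ratio:.2f} exceeds {threshold}"
    return None  # no risk prompt needed
```

For example, a knee value of 900 against a reference of 600 gives a ratio of 1.5, which exceeds a threshold of 1.2 and triggers the prompt.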
  15. The method according to claim 14, wherein outputting risk prompt information comprises:
    outputting a first prompt;
    or,
    outputting a first prompt, the first prompt including a first option; and receiving a second operation acting on the first option, and outputting a second prompt.
  16. The method according to claim 14 or 15, wherein the first reference value is a force-bearing threshold of the human body joint.
  17. The method according to any one of claims 1-16, wherein, before acquiring the first data of the target object according to the body parameters of the target object, the method further comprises:
    performing an anthropometric assessment of the user; the anthropometric assessment includes assessing a physical state, the physical state including an injury site and a degree of injury of the injury site.
  18. The method according to claim 17, wherein assessing the physical state comprises:
    detecting that a first part is placed on the injury site of the user, and detecting the duration for which the first part is placed on the injury site, the first part being a body part of the user; and
    determining the degree of injury according to the duration.
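Claim 18's mapping from touch duration to degree of injury might look like the sketch below; the time thresholds and the three-level scale are purely illustrative assumptions, not values taken from the patent.

```python
def injury_degree(touch_seconds):
    """Map how long the user holds the injured site to an injury degree.
    Thresholds (3 s, 6 s) and labels are hypothetical."""
    if touch_seconds >= 6:
        return "severe"
    if touch_seconds >= 3:
        return "moderate"
    return "mild"
```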
  19. The method according to claim 17 or 18, wherein the first reference value is a load-bearing reference value, and the load-bearing reference value is adjusted according to the anthropometric assessment.
  20. The method according to any one of claims 1-19, wherein the target object is the user or a moving image in a selected exercise course.
  21. A motion analysis apparatus, comprising units for performing the method according to any one of claims 1 to 20.
  22. An electronic device, comprising a touch screen, a memory, one or more processors, a plurality of application programs, and one or more programs, wherein the one or more programs are stored in the memory, and wherein, when the one or more processors execute the one or more programs, the electronic device is caused to implement the method according to any one of claims 1-20.
  23. A computer storage medium, comprising computer instructions which, when run on an electronic device, cause the electronic device to perform the method according to any one of claims 1-20.
PCT/CN2022/127953 2021-10-29 2022-10-27 Athletic analysis method and apparatus, and electronic device and computer storage medium WO2023072195A1 (en)

Applications Claiming Priority (2)

Application Number Priority Date Filing Date Title
CN202111276279.2A CN116072291A (en) 2021-10-29 2021-10-29 Motion analysis method, motion analysis device, electronic equipment and computer storage medium
CN202111276279.2 2021-10-29

Publications (1)

Publication Number Publication Date
WO2023072195A1

Family

ID=86160493

Family Applications (1)

Application Number Title Priority Date Filing Date
PCT/CN2022/127953 WO2023072195A1 (en) 2021-10-29 2022-10-27 Athletic analysis method and apparatus, and electronic device and computer storage medium

Country Status (2)

Country Link
CN (1) CN116072291A (en)
WO (1) WO2023072195A1 (en)

Citations (6)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN105596021A (en) * 2014-11-19 2016-05-25 株式会社东芝 Image analyzing device and image analyzing method
US10213645B1 (en) * 2011-10-03 2019-02-26 Swingbyte, Inc. Motion attributes recognition system and methods
US20190224528A1 (en) * 2018-01-22 2019-07-25 K-Motion Interactive, Inc. Method and System for Human Motion Analysis and Instruction
CN111062247A (en) * 2019-11-07 2020-04-24 郑州大学 Human body movement intention prediction method oriented to exoskeleton control
CN112957033A (en) * 2021-02-01 2021-06-15 山东大学 Human body real-time indoor positioning and motion posture capturing method and system in man-machine cooperation
CN113283116A (en) * 2021-06-16 2021-08-20 北京理工大学 Multi-information fusion human motion analysis method and device


Also Published As

Publication number Publication date
CN116072291A (en) 2023-05-05


Legal Events

Date Code Title Description
121 Ep: the epo has been informed by wipo that ep was designated in this application

Ref document number: 22886070

Country of ref document: EP

Kind code of ref document: A1