CN114145710B - Body data detection method and device and electronic equipment - Google Patents

Body data detection method and device and electronic equipment

Info

Publication number
CN114145710B
CN114145710B (application number CN202010936369.9A)
Authority
CN
China
Prior art keywords
angle
value
user
leg
length
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Active
Application number
CN202010936369.9A
Other languages
Chinese (zh)
Other versions
CN114145710A (en)
Inventor
易炜
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
Oppo Chongqing Intelligent Technology Co Ltd
Original Assignee
Oppo Chongqing Intelligent Technology Co Ltd
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by Oppo Chongqing Intelligent Technology Co Ltd filed Critical Oppo Chongqing Intelligent Technology Co Ltd
Priority to CN202010936369.9A
Publication of CN114145710A
Application granted
Publication of CN114145710B

Classifications

    • A: HUMAN NECESSITIES
    • A61: MEDICAL OR VETERINARY SCIENCE; HYGIENE
    • A61B: DIAGNOSIS; SURGERY; IDENTIFICATION
    • A61B 5/00: Measuring for diagnostic purposes; Identification of persons
    • A61B 5/45: For evaluating or diagnosing the musculoskeletal system or teeth
    • A61B 5/4533: Ligaments
    • A61B 5/103: Detecting, measuring or recording devices for testing the shape, pattern, colour, size or movement of the body or parts thereof, for diagnostic purposes
    • A61B 5/107: Measuring physical dimensions, e.g. size of the entire body or parts thereof
    • A61B 5/1071: Measuring angles, e.g. using goniometers
    • A61B 5/1072: Measuring distances on the body, e.g. measuring length, height or thickness
    • A61B 5/1075: Measuring dimensions by non-invasive methods, e.g. for determining thickness of tissue layer
    • A61B 5/11: Measuring movement of the entire body or parts thereof, e.g. head or hand tremor, mobility of a limb
    • A61B 5/68: Arrangements of detecting, measuring or recording means, e.g. sensors, in relation to patient
    • A61B 5/6801: Specially adapted to be attached to or worn on the body surface
    • A61B 5/6802: Sensor mounted on worn items
    • A61B 5/681: Wristwatch-type devices

Landscapes

  • Health & Medical Sciences (AREA)
  • Life Sciences & Earth Sciences (AREA)
  • Heart & Thoracic Surgery (AREA)
  • Public Health (AREA)
  • Medical Informatics (AREA)
  • Veterinary Medicine (AREA)
  • Physics & Mathematics (AREA)
  • General Health & Medical Sciences (AREA)
  • Biophysics (AREA)
  • Pathology (AREA)
  • Molecular Biology (AREA)
  • Biomedical Technology (AREA)
  • Animal Behavior & Ethology (AREA)
  • Surgery (AREA)
  • Engineering & Computer Science (AREA)
  • Dentistry (AREA)
  • Oral & Maxillofacial Surgery (AREA)
  • Orthopedic Medicine & Surgery (AREA)
  • Rheumatology (AREA)
  • Physiology (AREA)
  • Rehabilitation Therapy (AREA)
  • Telephone Function (AREA)
  • Length Measuring Devices With Unspecified Measuring Means (AREA)
  • User Interface Of Digital Computer (AREA)

Abstract

The embodiment of the application discloses a body data detection method and device and an electronic device. The method comprises: in response to a data detection instruction, acquiring acceleration data of the electronic device; calculating a first angle value based on the acceleration data, the first angle value representing the angle to which the user's arm is raised; calculating a second angle value based on the first angle value, a first length value, a second length value and a third length value, the second angle value representing the user's leg lifting angle, the first length value representing the user's shoulder height, the second length value representing the user's arm length, and the third length value representing the user's leg length; and outputting a leg flexibility detection result for the user based on the leg lifting angle. In this way, once the user's shoulder height, arm length and leg length are known, detecting the angle to which the user's arm is raised allows the user's leg lifting angle to be calculated and, from it, the user's leg flexibility detection result to be obtained.

Description

Body data detection method and device and electronic equipment
Technical Field
The present application relates to the field of computer technologies, and in particular, to a method and an apparatus for detecting body data, and an electronic device.
Background
As users pay more and more attention to their physical condition, they increasingly expect to be able to check it with portable electronic devices. For example, with the popularity of smart watches, more users wear a smart watch every day; however, related smart watches can only detect heart rate, blood oxygen saturation and step counts. As users' functional demands grow, they may expect a smart watch to detect more kinds of body data, for example the flexibility of their legs. However, the inventors found in the course of research that related electronic devices do not provide a function for detecting leg flexibility.
Disclosure of Invention
In view of the above, the present application provides a body data detection method and apparatus and an electronic device that alleviate the above problem.
In a first aspect, the present application provides a body data detection method, applied to an electronic device, the method comprising: responding to a data detection instruction, and acquiring acceleration data of the electronic equipment; calculating a first angle value based on the acceleration data, wherein the first angle value represents the lifting angle of the arm of the user; calculating a second angle value based on the first angle value, the first length value, the second length value and a third length value, wherein the second angle value represents a leg lifting angle of a user, the first length value represents shoulder height of the user, the second length value represents arm length of the user, and the third length value represents leg length of the user; and outputting a leg flexibility detection result of the user based on the leg lifting angle.
In a second aspect, the present application provides a body data detection apparatus for operation in an electronic device, the apparatus comprising: the acceleration data acquisition unit is used for responding to the data detection instruction and acquiring acceleration data of the electronic equipment; the data calculation unit is used for calculating a first angle value based on the acceleration data, wherein the first angle value represents the lifting angle of the arm of the user; the data calculation unit is further configured to calculate a second angle value based on the first angle value, the first length value, the second length value and the third length value, where the second angle value represents a leg lifting angle of the user, the first length value represents shoulder height of the user, the second length value represents arm length of the user, and the third length value represents leg length of the user; and the detection unit is used for outputting a leg flexibility detection result of the user based on the leg lifting angle.
In a third aspect, the present application provides an electronic device comprising an acceleration sensor, a processor, and a memory; one or more programs are stored in the memory and configured to be executed by the processor to implement the methods described above.
In a fourth aspect, the present application provides a computer readable storage medium having program code stored therein, wherein the program code, when executed by a processor, performs the method described above.
According to the body data detection method and apparatus and the electronic device, after acceleration data of the electronic device is acquired in response to a data detection instruction, a first angle value representing the angle to which the user's arm is raised is calculated from the acceleration data, and a second angle value representing the user's leg lifting angle is then calculated from the first angle value, a first length value representing the user's shoulder height, a second length value representing the user's arm length and a third length value representing the user's leg length, so that a leg flexibility detection result for the user is output based on the leg lifting angle. In this way, once the user's shoulder height, arm length and leg length are known, detecting the angle to which the user's arm is raised is enough to further calculate the user's leg lifting angle and thus obtain the user's leg flexibility detection result.
Drawings
In order to more clearly illustrate the technical solutions of the embodiments of the present application, the drawings that are needed in the description of the embodiments will be briefly described below, it being obvious that the drawings in the following description are only some embodiments of the present application, and that other drawings may be obtained according to these drawings without inventive effort for a person skilled in the art.
FIG. 1 is a flow chart of a method for detecting body data according to an embodiment of the present application;
FIG. 2 is a schematic diagram of a control interface in an embodiment of the application;
FIG. 3 is a schematic diagram of another control interface in an embodiment of the application;
FIG. 4 is a flow chart of a method for detecting body data according to another embodiment of the present application;
FIG. 5 is a schematic diagram of a specified action by a user in an embodiment of the application;
FIG. 6 is a schematic diagram showing a leg lifting angle corresponding to a specified action performed by a user in an embodiment of the present application;
FIG. 7 is a flow chart of a method for detecting body data according to still another embodiment of the present application;
FIG. 8 is a schematic diagram of a prompt asking whether to perform detection in association with another electronic device in an embodiment of the application;
FIG. 9 is a schematic diagram of a confirmation of an imaging position in an embodiment of the application;
FIG. 10 is a block diagram of a body data detection apparatus according to another embodiment of the present application;
FIG. 11 is a block diagram of a body data detection apparatus according to still another embodiment of the present application;
FIG. 12 is a block diagram of an electronic device for performing the body data detection method according to an embodiment of the present application;
FIG. 13 is a block diagram of a storage unit for storing or carrying program code implementing the body data detection method according to an embodiment of the present application.
Detailed Description
The following description of the embodiments of the present application will be made clearly and completely with reference to the accompanying drawings, in which it is apparent that the embodiments described are only some embodiments of the present application, but not all embodiments. All other embodiments, which can be made by those skilled in the art based on the embodiments of the application without making any inventive effort, are intended to be within the scope of the application.
As users pay more and more attention to their physical condition, they increasingly expect to be able to check it with portable electronic devices. For example, with the popularity of smart watches, more users wear a smart watch every day; however, related smart watches can only detect heart rate, blood oxygen saturation and step counts. As users' functional demands grow, they may expect a smart watch to detect more kinds of body data, for example the flexibility of their legs. However, the inventors found in the course of research that related electronic devices do not provide a function for detecting leg flexibility.
Therefore, the inventor proposes a body data detection method, a body data detection device and an electronic device capable of improving the problems, acceleration data of the electronic device are obtained through responding to data detection instructions, a first angle value representing the lifting angle of the arm of a user is calculated based on the acceleration data, and then a second angle value representing the leg lifting angle of the user is calculated based on the first angle value, a first length value representing the shoulder height of the user, a second length value representing the arm length of the user and a third length value representing the leg length of the user, so that a leg flexibility detection result of the user is output based on the leg lifting angle. Therefore, under the condition that the shoulder height, the arm length and the leg length of the user are obtained, after the lifting angle of the arm of the user is detected, the lifting angle of the leg of the user is further calculated, and then the leg flexibility detection result of the user is obtained.
Embodiments of the present application will be described in detail below with reference to the accompanying drawings.
Referring to fig. 1, a method for detecting body data according to an embodiment of the present application includes:
s110: and responding to the data detection instruction, and acquiring acceleration data of the electronic equipment.
It should be noted that, the data detection instruction is an instruction for triggering and executing the body data detection method in the embodiment. In this embodiment, the data detection instruction may be generated in a variety of ways.
As one approach, the data detection instruction may be generated in response to a touch operation by the user. Optionally, a control interface may be configured in the electronic device, and a control may be arranged in that interface; when the electronic device detects a touch operation acting on the control, generation of the data detection instruction is triggered. For example, taking detection of leg flexibility as shown in fig. 2, the interface 10 identifies the control content of the current control interface, and when a touch operation acting on the control 11 is detected, the data detection instruction is generated accordingly. After the touch operation is detected, the control 11 may switch to the style shown in fig. 3, and in that style, if a further touch operation acting on the control 11 is detected, a detection end instruction is generated. It should be noted that, if the detection end instruction is generated while the body data detection method provided in this embodiment is being executed, the electronic device no longer executes the remaining steps. For example, if generation of the detection end instruction is detected after S120 has been executed, the subsequent S130 and S140 are not executed.
In addition, when the control interface shown in fig. 2 is displayed, the data detection instruction may also be triggered by pressing a physical key; correspondingly, when the control interface shown in fig. 3 is displayed, the detection end instruction may be generated by pressing the physical key.
As still another way, in the case where the electronic device is configured with a voice recognition function, the user may trigger generation of the data detection instruction by means of a voice instruction. For example, a voice assistant may be configured in the electronic device; when the electronic device recognizes a wake-up instruction, it invokes the voice assistant to run, and when the voice assistant then recognizes a further instruction concerning leg flexibility detection, generation of the data detection instruction is triggered. The instruction concerning leg flexibility detection may be, for example, "flexibility detection" or "I want to detect leg flexibility". Optionally, after generation of the data detection instruction has been triggered by voice recognition, the electronic device may display the interface shown in fig. 3.
After the data detection instruction has been generated in any of the foregoing ways, acquisition of the acceleration data may be started. Optionally, the acceleration data includes acceleration collected by the electronic device in multiple directions.
In one mode, the electronic device can trigger the acceleration module responsible for collecting acceleration data to start when responding to the data detection instruction, and then the acceleration data collected by the acceleration module is obtained.
S120: and calculating a first angle value based on the acceleration data, wherein the first angle value represents the lifting angle of the arm of the user.
It should be noted that, in the embodiment of the present application, the electronic device may be a smart watch worn on the user's arm, and the flexibility of a leg may be understood as the greatest height to which the user can raise the leg when lifting it. During detection of the user's leg flexibility, the user is required to perform a leg lifting action: the user first holds one arm level, is then prompted to lift the leg until the foot touches the palm, and after the foot touches the palm continues to lift the leg with the arm following it upward. The angle to which the user's arm is raised when the leg reaches its greatest height is then taken as the first angle value.
S130: based on the first angle value, the first length value, the second length value and the third length value, a second angle value is calculated, the second angle value represents the leg lifting angle of the user, the first length value represents the shoulder height of the user, the second length value represents the arm length of the user, and the third length value represents the leg length of the user.
During the leg lift, the angle to which the arm is raised and the angle to which the leg is raised are geometrically related, so the user's leg lifting angle can be calculated from the arm lifting angle, and the leg lifting angle can in turn be used to characterize the flexibility of the user's leg.
S140: and outputting a leg flexibility detection result of the user based on the leg lifting angle.
In this embodiment, the greater the determined leg lifting angle, the better the flexibility of the user's leg is indicated.
According to the body data detection method, after the acceleration data of the electronic equipment are obtained in response to the data detection instruction, a first angle value representing the lifting angle of the arm of the user is calculated based on the acceleration data, and then a second angle value representing the leg lifting angle of the user is calculated based on the first angle value, a first length value representing the shoulder height of the user, a second length value representing the arm length of the user and a third length value representing the leg length of the user, so that a leg flexibility detection result of the user is output based on the leg lifting angle. Therefore, under the condition that the shoulder height, the arm length and the leg length of the user are obtained, after the lifting angle of the arm of the user is detected, the lifting angle of the leg of the user is further calculated, and then the leg flexibility detection result of the user is obtained.
Referring to fig. 4, a method for detecting body data according to an embodiment of the present application includes:
s210: and responding to the data detection instruction, and acquiring acceleration data of the electronic equipment.
As shown in the foregoing embodiment, during data detection the user is required to perform a specified action with both the arm and the leg (for example, the leg follows the lifting action of the arm), and the acceleration data of the user while performing the specified action is acquired. In this embodiment, the acceleration data is acquired in order to later calculate the angle to which the user's arm is raised. To detect the user's leg flexibility more accurately, the arm angle that is used may be the one corresponding to the highest position of the user's leg. It should be noted that, when the arm is raised to its highest angle, the acceleration of the electronic device in the lifting direction reaches its maximum; the acceleration data acquired at that moment may therefore be used as the acceleration data from which the first angle value is subsequently calculated.
For example, as shown in fig. 5, as one way, in response to the data detection instruction the user may first be prompted to assume the ready state shown in fig. 5. In the ready state the user is prompted to keep one arm horizontal, so that the included angle d1 between the horizontal arm and the body is 90 degrees. The user may then be prompted to start lifting the leg from the ready state; when the leg touches the horizontal arm, the user continues to lift the leg and the arm follows it upward until the leg reaches its highest position, as shown in the kicking state of fig. 5. Optionally, the acceleration data includes acceleration in a first direction, acceleration in a second direction and acceleration in a third direction, where the included angle between any two of the first, second and third directions is 90 degrees. Taking a three-axis acceleration sensor as an example, the acceleration data of the electronic device may include acceleration in the x, y and z directions, where the x axis is parallel to the arm, the y axis is perpendicular to the arm, and the z axis points out of the face of the electronic device (for example, a smart watch). The z axis direction may then be the aforementioned lifting direction, and the acceleration data acquired when the acceleration in the z-axis direction is at its maximum is used as the acceleration data for the subsequent calculation of the first angle value. A sketch of this selection is given below.
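Purely as an illustrative sketch that is not part of the original disclosure, selecting the sample used for the subsequent angle calculation could be expressed in Python as follows; the sample structure and field names are assumptions:

def pick_peak_sample(samples):
    # Return the accelerometer sample whose z component (the lifting direction
    # described above) is largest, e.g. from a list of dicts such as
    # {"x": 0.1, "y": -0.2, "z": 9.6}.
    return max(samples, key=lambda sample: sample["z"])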
S220: and calculating a first angle value based on the acceleration data, wherein the first angle value represents the lifting angle of the arm of the user.
As one way, the calculating, based on the acceleration data, a first angle value, the first angle value representing an angle at which the arm of the user is raised, includes: acquiring the square sum of the acceleration in the first direction, the acceleration in the second direction and the acceleration in the third direction, and calculating the square root of the square sum; obtaining a ratio of the acceleration in the third direction to the square root; performing inverse cosine calculation on the ratio to obtain a third angle value to be processed; and obtaining the sum of the third angle value to be processed and a third reference angle value as a first angle value. The first direction may be the aforementioned x-axis direction, the second direction may be the y-axis direction, the third direction may be the z-axis direction, and the calculated first angle value may be the angle d2 shown in fig. 5. The calculation of the first angle value is described below by the following formula:
d2=arccos(z/sqrt(x^2+y^2+z^2))+90°
wherein x is acceleration in the x-axis direction, y is acceleration in the y-axis direction, z is acceleration in the z-axis direction, 90 ° is a third reference angle value, and d2 is a first angle value obtained by calculation.
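Purely as an illustrative sketch that is not part of the original disclosure, the calculation of S220 could be written in Python as follows, assuming x, y and z are the acceleration components of the sample selected above:

import math

def first_angle_value(x, y, z, reference_deg=90.0):
    # Sketch of S220: d2 = arccos(z / sqrt(x^2 + y^2 + z^2)) + 90 degrees.
    magnitude = math.sqrt(x * x + y * y + z * z)
    return math.degrees(math.acos(z / magnitude)) + reference_deg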
S230: and calculating a designated height value based on the first angle value, the first length value and the second length value, wherein the designated height value represents the leg lifting height of the user.
As one way, the calculating, based on the first angle value, the first length value, and the second length value, obtains a specified height value, where the specified height value characterizes a leg lifting height of the user, includes: performing sine calculation on the difference value between the first angle value and the first reference angle value to obtain a first angle value to be processed; obtaining the product of a second length value and the first angle value to be processed; and obtaining the sum of the product and the first length value as a designated height value. The calculation of the specified height value is explained below by the following formula:
h2=h1+arm*sin(d2-90°)
where, as shown in fig. 6, h1 represents the shoulder height of the user, arm represents the arm length of the user, where 90 ° is a first reference angle value, and h2 is a calculated specified height value.
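A corresponding sketch of S230, again purely illustrative, with the unit of the length values (for example centimetres) left to the caller:

import math

def specified_height(d2_deg, shoulder_height, arm_length, reference_deg=90.0):
    # Sketch of S230: h2 = h1 + arm * sin(d2 - 90 degrees).
    return shoulder_height + arm_length * math.sin(math.radians(d2_deg - reference_deg))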
S240: and calculating a second angle value based on the appointed height value and a third length value, wherein the second angle value represents the leg lifting angle of the user, the first length value represents the shoulder height of the user, the second length value represents the arm length of the user, and the third length value represents the leg length of the user.
As one way, the calculating the second angle value based on the specified height value and the third length value includes: acquiring a difference value between the specified height value and the third length value; acquiring a ratio of the difference value to the third length value; performing arcsine calculation on the ratio to obtain a second angle value to be processed; and obtaining the sum of the second angle value to be processed and a second reference angle value as a second angle value. The calculation of the second angle value is described below by the following formula:
b=arcsin((h2-leg)/leg)+90°
where leg denotes the leg length of the user, 90° is the second reference angle value, and b is the calculated second angle value, i.e. the leg lifting angle.
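A sketch of S240 under the same assumptions; note that the formula presumes the ratio (h2 - leg) / leg lies within [-1, 1]:

import math

def second_angle_value(h2, leg_length, reference_deg=90.0):
    # Sketch of S240: b = arcsin((h2 - leg) / leg) + 90 degrees.
    # The ratio is assumed to lie within [-1, 1].
    return math.degrees(math.asin((h2 - leg_length) / leg_length)) + reference_deg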
S250: and outputting a leg flexibility detection result of the user based on the leg lifting angle.
It should be noted that the flexibility of the leg portion characterizes the highest height that the user can lift when lifting the leg, and the better the flexibility of the user's leg portion, the higher the height that the user can lift. Then as an evaluation means, a plurality of angle sections may be divided in advance, and a leg flexibility detection result may be configured for each angle section. In this manner, the outputting the leg flexibility detection result of the user based on the leg lifting angle includes: acquiring angle intervals corresponding to the leg lifting angles, wherein each angle interval corresponds to one leg flexibility detection result, and leg flexibility detection results corresponding to different angle intervals are different; and outputting the leg flexibility detection result corresponding to the corresponding angle interval as the leg flexibility detection result of the user.
For example, if the pre-divided sections include an angle section a, an angle section b, an angle section c, and an angle section d, the minimum value of the angle corresponding to the angle section a is greater than the maximum value of the angle corresponding to the angle section b, the minimum value of the angle corresponding to the angle section b is greater than the maximum value of the angle corresponding to the angle section c, and the minimum value of the angle corresponding to the angle section c is greater than the maximum value of the angle corresponding to the angle section d. Correspondingly, the leg flexibility detection result corresponding to the angle interval a is excellent, the leg flexibility detection result corresponding to the angle interval b is good, the leg flexibility detection result corresponding to the angle interval c is general, and the leg flexibility detection result corresponding to the angle interval d is poor.
If the leg lifting angle calculated based on S240 is within the angle section b, the output leg flexibility detection result is good. If the leg lifting angle calculated based on S240 is within the angle section a, the output leg flexibility detection result is excellent.
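Purely as an illustrative sketch, the mapping described above could look as follows; the concrete interval boundaries are assumptions, since the disclosure does not give numeric values:

def flexibility_result(leg_angle_deg):
    # Assumed boundaries for the angle intervals a, b, c and d.
    if leg_angle_deg >= 150:
        return "excellent"  # angle interval a
    if leg_angle_deg >= 120:
        return "good"       # angle interval b
    if leg_angle_deg >= 100:
        return "general"    # angle interval c
    return "poor"           # angle interval d

Chaining the sketches above with assumed figures (not values from the disclosure): accelerations x ≈ 6.93, y = 0, z ≈ 6.93 m/s² give d2 = 135°; with h1 = 130 cm and arm = 60 cm this gives h2 ≈ 172.4 cm; with leg = 90 cm the leg lifting angle b is about 156°, which would fall into the highest of the assumed intervals.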
According to the body data detection method, after the acceleration data of the electronic equipment are obtained in response to the data detection instruction, a first angle value representing the lifting angle of the arm of the user is calculated based on the acceleration data, a designated height value representing the lifting height of the user is calculated based on the first angle value, the first length value and the second length value, and a second angle value is calculated based on the designated height value and the third length value. Therefore, under the condition that the shoulder height, the arm length and the leg length of the user are obtained, the leg lifting height of the user is calculated, the leg lifting angle of the user is calculated based on the leg lifting height and the leg length of the user, and further the leg flexibility detection result of the user is obtained according to the angle interval where the leg lifting angle is located.
Referring to fig. 7, a method for detecting body data according to an embodiment of the present application includes:
s310: and responding to the data detection instruction, and acquiring whether the electronic equipment is currently in a wearing state.
As described above, the data detection method provided by the embodiment of the present application requires the electronic device to be worn on the user's arm in order for the other data to be calculated; if the electronic device is not worn on the user's arm, the acceleration data it acquires cannot be used to calculate the user's leg lifting angle accurately. Therefore, to avoid an invalid calculation, after the data detection instruction is generated it may first be detected whether the electronic device is in the wearing state. The wearing state is understood to be the state of being worn on the user's arm. If the wearing state is detected, the electronic device is currently worn on the user's arm; otherwise, if the electronic device is not detected to be in the wearing state, it is not currently worn on the user's arm.
As an example, taking a smart watch as the electronic device, a distance sensor may be disposed on the side of the watch that faces the user when the watch is worn; if the distance value detected by the distance sensor is 0, the smart watch is determined to be in the state of being worn by the user. A sketch of this check is given after step S330 below.
S320: and if the electronic equipment is in the wearing state, acquiring acceleration data of the electronic equipment.
S330: and if the electronic equipment is not in the wearing state, sending out prompt information for prompting the user to wear the electronic equipment.
S340: calculating a first angle value based on the acceleration data, wherein the first angle value represents the lifting angle of the arm of the user;
s350: calculating a second angle value based on the first angle value, the first length value, the second length value and a third length value, wherein the second angle value represents a leg lifting angle of a user, the first length value represents shoulder height of the user, the second length value represents arm length of the user, and the third length value represents leg length of the user;
s360: and outputting a leg flexibility detection result of the user based on the leg lifting angle.
The body data detection method provided by the application can, once the user's shoulder height, arm length and leg length have been obtained, further calculate the user's leg lifting angle after detecting the angle to which the user's arm is raised, so as to obtain the user's leg flexibility detection result. In this embodiment, whether the electronic device is in the wearing state is first detected in response to the data detection instruction, and acquisition of the acceleration data is only triggered when the electronic device is detected to be in the wearing state, which prevents the electronic device from performing an invalid calculation and wasting power.
It should be noted that, during execution of the body data detection method provided in the embodiment of the present application, the user needs to cooperate by performing a specified action; however, a user who uses the method for the first time may not be skilled in the action, or may not be clear about which specific actions need to be made. Then, as one way, after the acceleration data of the electronic device is obtained in response to the data detection instruction, if real-time monitoring of the acceleration data shows that the user has not made any action, dynamic prompt information may be displayed on the screen of the electronic device to prompt the user to make the aforementioned specified action. The specified action includes the movement from the ready state to the kicking state shown in fig. 5, and the corresponding prompt information includes a continuous picture sequence of that movement.
Further, in one mode, the body data detection method may be performed by the electronic device that performs the body data detection method provided by the present embodiment in cooperation with another electronic device.
In the process of executing the body data detection method, the user is required to make a specified action, but it is difficult for the electronic device worn on the user's arm to recognize whether the user has actually made that specified action. Then, as one way, the user's actions may be detected by a further electronic device, so that the execution of the aforementioned body data detection method can be controlled according to the actions the user actually makes.
Alternatively, the electronic device performing the body data detection method provided in this embodiment may be a smart watch, and the other electronic device may be a smart phone. As one way, the smart watch and the smart phone may establish wireless communication via Bluetooth in advance, and the smart phone may then transmit the images collected by its camera to the smart watch, or transmit the recognition result of the collected images to the smart watch.
As one way, after the electronic device responds to the data detection instruction, the user may first be asked whether detection should be performed in conjunction with another electronic device. As shown in fig. 8, the electronic device may display the prompt "whether to detect in association with another electronic device". If the user selects no, the electronic device switches to the interface shown in fig. 3; if the user selects yes, it switches to the interface shown in fig. 9, on which "please confirm that the electronic device is at the image capturing position" is displayed, so that the user moves into the image capturing range of the camera of the other electronic device. If the user confirms that he or she is within the image capturing range of the camera of the other electronic device, the control labelled yes in fig. 9 can be tapped to trigger the start of acquiring the acceleration data of the electronic device.
In this way, while the electronic device executes the steps of the body data detection method, the other electronic device detects in real time the action made by the user. If the other electronic device recognizes that the user has made the specified action, it transmits notification information indicating that the user has completed the specified action to the electronic device (for example, a smart watch). After the electronic device has obtained the user's leg flexibility detection result in the manner described above, it outputs that result to the user if the notification information has also been received from the other electronic device. If, however, the electronic device has obtained the leg flexibility detection result but has not received the notification information transmitted by the other electronic device indicating that the user has completed the specified action, prompt information is displayed telling the user that the action was incorrect and asking the user to complete the specified action again, thereby improving the accuracy of the output leg flexibility detection result.
Referring to fig. 10, a body data detection apparatus 400 according to an embodiment of the present application is operated in an electronic device, where the apparatus 400 includes:
and an acceleration data acquisition unit 410, configured to acquire acceleration data of the electronic device in response to a data detection instruction.
As one way, the acceleration data obtaining unit 410 is specifically configured to obtain, in response to a data detection instruction, whether the electronic device is currently in a wearing state; and if the electronic equipment is in the wearing state, acquiring acceleration data of the electronic equipment. Correspondingly, as shown in fig. 11, the apparatus 400 further includes a prompting unit 440, configured to send a prompting message for prompting the user to wear the electronic device if the electronic device is not in the wearing state.
A data calculation unit 420, configured to calculate a first angle value based on the acceleration data, where the first angle value represents an angle at which an arm of a user is raised;
the data calculating unit 420 is further configured to calculate, based on the first angle value, the first length value, the second length value, and the third length value, a second angle value, where the second angle value represents a leg lifting angle of the user, the first length value represents shoulder height of the user, the second length value represents arm length of the user, and the third length value represents leg length of the user;
and the detection unit 430 is configured to output a leg flexibility detection result of the user based on the leg lifting angle.
As one way, the data calculating unit 420 is specifically configured to calculate, based on the first angle value, the first length value, and the second length value, a specified height value, where the specified height value characterizes a leg lifting height of the user; and calculating a second angle value based on the specified height value and the third length value.
Optionally, the data calculating unit 420 is specifically configured to perform sinusoidal calculation on the difference between the first angle value and the first reference angle value to obtain a first angle value to be processed; obtaining the product of a second length value and the first angle value to be processed; and obtaining the sum of the product and the first length value as a designated height value.
Optionally, the data calculating unit 420 is specifically configured to obtain a difference between the specified height value and the third length value; acquiring a ratio of the difference value to the third length value; performing arcsine calculation on the ratio to obtain a second angle value to be processed; and obtaining the sum of the second angle value to be processed and a second reference angle value as a second angle value.
Optionally, the acceleration data includes acceleration in a first direction, acceleration in a second direction, and acceleration in a third direction, where an included angle between any two directions of the first direction, the second direction, and the third direction is 90 degrees, and the data calculating unit 420 is specifically configured to obtain a sum of squares of the acceleration in the first direction, the acceleration in the second direction, and the acceleration in the third direction, and calculate a square root of the sum of squares; obtaining a ratio of the acceleration in the third direction to the square root; performing inverse cosine calculation on the ratio to obtain a third angle value to be processed; and obtaining the sum of the third angle value to be processed and a third reference angle value as a first angle value.
As a way, the detecting unit 430 is specifically configured to obtain angle intervals corresponding to the leg lifting angles, where each angle interval corresponds to one leg flexibility detection result, and leg flexibility detection results corresponding to different angle intervals are different; and outputting the leg flexibility detection result corresponding to the corresponding angle interval as the leg flexibility detection result of the user.
According to the body data detection device, after the acceleration data of the electronic equipment are obtained in response to the data detection instruction, a first angle value representing the lifting angle of the arm of the user is calculated based on the acceleration data, and then a second angle value representing the leg lifting angle of the user is calculated based on the first angle value, a first length value representing the shoulder height of the user, a second length value representing the arm length of the user and a third length value representing the leg length of the user, so that a leg flexibility detection result of the user is output based on the leg lifting angle. Therefore, under the condition that the shoulder height, the arm length and the leg length of the user are obtained, after the lifting angle of the arm of the user is detected, the lifting angle of the leg of the user is further calculated, and then the leg flexibility detection result of the user is obtained.
It should be noted that, in the present application, the device embodiment and the foregoing method embodiment correspond to each other, and specific principles in the device embodiment may refer to the content in the foregoing method embodiment, which is not described herein again.
An electronic device according to the present application will be described with reference to fig. 12.
Referring to fig. 12, based on the above-mentioned body data detection method and apparatus, an electronic device 200 capable of executing the above-mentioned body data detection method is further provided in an embodiment of the present application. The electronic device 200 includes one or more (only one shown in the figure) processors 102, a memory 104, a network module 106 and an acceleration module 108 coupled to each other. The memory 104 stores a program capable of executing the contents of the foregoing embodiments, and the processor 102 can execute the program stored in the memory 104.
The processor 102 may include one or more processing cores. The processor 102 connects the various parts of the electronic device 200 using various interfaces and lines, and performs the various functions of the electronic device 200 and processes data by running or executing instructions, programs, code sets or instruction sets stored in the memory 104 and invoking data stored in the memory 104. Optionally, the processor 102 may be implemented in hardware in at least one of digital signal processing (Digital Signal Processing, DSP), field programmable gate array (Field-Programmable Gate Array, FPGA) and programmable logic array (Programmable Logic Array, PLA). The processor 102 may integrate one or a combination of a central processing unit (Central Processing Unit, CPU), a graphics processing unit (Graphics Processing Unit, GPU), a modem and the like. The CPU mainly handles the operating system, user interface, application programs and the like; the GPU is responsible for rendering and drawing display content; the modem is used to handle wireless communication. It will be appreciated that the modem may also not be integrated into the processor 102 and may instead be implemented by a separate communication chip.
The memory 104 may include random access memory (Random Access Memory, RAM) or read-only memory (Read-Only Memory, ROM). The memory 104 may be used to store instructions, programs, code, code sets or instruction sets. The memory 104 may include a program storage area and a data storage area, where the program storage area may store instructions for implementing the operating system, instructions for implementing at least one function (such as a touch function, a sound playing function or an image displaying function), instructions for implementing the method embodiments described above, and the like. The data storage area may store data created by the electronic device 200 during use (such as a phonebook, audio and video data, and chat records), and the like.
The network module 106 is configured to receive and transmit electromagnetic waves, and to implement mutual conversion between the electromagnetic waves and the electrical signals, so as to communicate with a communication network or other devices, such as an audio playing device. The network module 106 may include various existing circuit elements for performing these functions, such as an antenna, a radio frequency transceiver, a digital signal processor, an encryption/decryption chip, a Subscriber Identity Module (SIM) card, memory, and the like. The network module 106 may communicate with various networks such as the Internet, intranets, wireless networks, or other devices via wireless networks. The wireless network may include a cellular telephone network, a wireless local area network, or a metropolitan area network. For example, the network module 106 may interact with base stations.
Referring to fig. 13, a block diagram of a computer readable storage medium according to an embodiment of the present application is shown. The computer readable medium 1100 has stored therein program code that can be invoked by a processor to perform the methods described in the method embodiments above.
The computer readable storage medium 1100 may be an electronic memory such as a flash memory, an EEPROM (electrically erasable programmable read only memory), an EPROM, a hard disk, or a ROM. Optionally, computer readable storage medium 1100 includes non-volatile computer readable medium (non-transitory computer-readable storage medium). The computer readable storage medium 1100 has storage space for program code 1110 that performs any of the method steps described above. The program code can be read from or written to one or more computer program products. Program code 1110 may be compressed, for example, in a suitable form.
In summary, according to the body data detection method, the body data detection device and the electronic equipment provided by the application, after the acceleration data of the electronic equipment is obtained in response to the data detection instruction, a first angle value representing the lifting angle of the arm of the user is calculated based on the acceleration data, and then a second angle value representing the leg lifting angle of the user is calculated based on the first angle value, a first length value representing the shoulder height of the user, a second length value representing the arm length of the user and a third length value representing the leg length of the user, so that a leg flexibility detection result of the user is output based on the leg lifting angle. Therefore, under the condition that the shoulder height, the arm length and the leg length of the user are obtained, after the lifting angle of the arm of the user is detected, the lifting angle of the leg of the user is further calculated, and then the leg flexibility detection result of the user is obtained.
Finally, it should be noted that the above embodiments are only intended to illustrate the technical solution of the present application and not to limit it. Although the application has been described in detail with reference to the foregoing embodiments, those of ordinary skill in the art will appreciate that the technical solutions described in the foregoing embodiments can still be modified, or some of their technical features can be replaced by equivalents; such modifications and substitutions do not cause the essence of the corresponding technical solutions to depart from the spirit and scope of the technical solutions of the embodiments of the present application.

Claims (11)

1. A body data detection method, characterized by being applied to an electronic device, the method comprising:
responding to a data detection instruction, and acquiring acceleration data of the electronic equipment;
calculating a first angle value based on the acceleration data, wherein the first angle value represents the lifting angle of the arm of the user;
calculating a second angle value based on the first angle value, the first length value, the second length value and a third length value, wherein the second angle value represents a leg lifting angle of a user, the first length value represents shoulder height of the user, the second length value represents arm length of the user, and the third length value represents leg length of the user;
and outputting a leg flexibility detection result of the user based on the leg lifting angle.
2. The method of claim 1, wherein calculating the second angle value based on the first angle value, the first length value, the second length value, and the third length value comprises:
calculating a designated height value based on the first angle value, the first length value and the second length value, wherein the designated height value represents the leg lifting height of a user;
and calculating a second angle value based on the specified height value and the third length value.
3. The method of claim 2, wherein the calculating a specified height value based on the first angle value, the first length value, and the second length value, the specified height value characterizing a leg lifting height of the user comprises:
performing sine calculation on the difference value between the first angle value and the first reference angle value to obtain a first angle value to be processed;
obtaining the product of a second length value and the first angle value to be processed;
and obtaining the sum of the product and the first length value as a designated height value.
4. The method of claim 2, wherein calculating a second angle value based on the specified height value and a third length value comprises:
acquiring a difference value between the specified height value and the third length value;
acquiring a ratio of the difference value to the third length value;
performing arcsine calculation on the ratio to obtain a second angle value to be processed;
and obtaining the sum of the second angle value to be processed and a second reference angle value as a second angle value.
5. The method of claim 1, wherein the acceleration data includes a first direction of acceleration, a second direction of acceleration, and a third direction of acceleration, wherein an included angle between any two of the first direction, the second direction, and the third direction is 90 degrees, wherein a first angle value is calculated based on the acceleration data, the first angle value representing an angle at which an arm of a user is raised, the method comprising:
acquiring the square sum of the acceleration in the first direction, the acceleration in the second direction and the acceleration in the third direction, and calculating the square root of the square sum;
obtaining a ratio of the acceleration in the third direction to the square root;
performing inverse cosine calculation on the ratio to obtain a third angle value to be processed;
and obtaining the sum of the third angle value to be processed and a third reference angle value as a first angle value.
6. The method according to any one of claims 1-5, wherein outputting the leg flexibility detection result of the user based on the leg lifting angle includes:
acquiring angle intervals corresponding to the leg lifting angles, wherein each angle interval corresponds to one leg flexibility detection result, and leg flexibility detection results corresponding to different angle intervals are different;
and outputting the leg flexibility detection result corresponding to the corresponding angle interval as the leg flexibility detection result of the user.
7. The method of any of claims 1-5, wherein the electronic device is a smart watch, and the acquiring acceleration data of the electronic device in response to the data detection instruction comprises:
responding to the data detection instruction, and acquiring whether the electronic equipment is currently in a wearing state;
and if the electronic equipment is in the wearing state, acquiring acceleration data of the electronic equipment.
8. The method of claim 7, wherein the method further comprises:
and if the electronic equipment is not in the wearing state, sending out prompt information for prompting the user to wear the electronic equipment.
9. A body data detection apparatus, operable in an electronic device, the apparatus comprising:
the acceleration data acquisition unit is used for responding to the data detection instruction and acquiring acceleration data of the electronic equipment;
the data calculation unit is used for calculating a first angle value based on the acceleration data, wherein the first angle value represents the lifting angle of the arm of the user;
the data calculation unit is further configured to calculate a second angle value based on the first angle value, the first length value, the second length value and the third length value, where the second angle value represents a leg lifting angle of the user, the first length value represents shoulder height of the user, the second length value represents arm length of the user, and the third length value represents leg length of the user;
and the detection unit is used for outputting a leg flexibility detection result of the user based on the leg lifting angle.
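The three units of claim 9 can be pictured as one object whose methods reuse the sketch functions above (acquisition, calculation, detection); the class and attribute names are again hypothetical:

class BodyDataDetectionApparatus:
    def __init__(self, device, shoulder_height, arm_length, leg_length):
        self.device = device                    # electronic device with an acceleration sensor
        self.shoulder_height = shoulder_height  # first length value
        self.arm_length = arm_length            # second length value
        self.leg_length = leg_length            # third length value

    def acquire_acceleration(self):
        # Acceleration data acquisition unit
        return self.device.read_acceleration()

    def compute_leg_lift_angle(self, a1, a2, a3, ref1, ref2, ref3):
        # Data calculation unit: first angle value, specified height value, second angle value
        theta1 = arm_raise_angle(a1, a2, a3, ref3)
        height = specified_height(theta1, ref1, self.shoulder_height, self.arm_length)
        return leg_lift_angle(height, self.leg_length, ref2)

    def output_result(self, angle):
        # Detection unit: leg flexibility detection result from the leg lifting angle
        return flexibility_result(angle)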
10. An electronic device, characterized by comprising an acceleration sensor, a processor, and a memory;
wherein one or more programs are stored in the memory and configured to be executed by the processor to implement the method of any one of claims 1-8.
11. A computer readable storage medium, characterized in that the computer readable storage medium has stored therein a program code, wherein the program code, when executed by a processor, performs the method of any one of claims 1-8.
CN202010936369.9A 2020-09-08 2020-09-08 Body data detection method and device and electronic equipment Active CN114145710B (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
CN202010936369.9A CN114145710B (en) 2020-09-08 2020-09-08 Body data detection method and device and electronic equipment

Applications Claiming Priority (1)

Application Number Priority Date Filing Date Title
CN202010936369.9A CN114145710B (en) 2020-09-08 2020-09-08 Body data detection method and device and electronic equipment

Publications (2)

Publication Number Publication Date
CN114145710A CN114145710A (en) 2022-03-08
CN114145710B true CN114145710B (en) 2023-08-29

Family

ID=80460870

Family Applications (1)

Application Number Title Priority Date Filing Date
CN202010936369.9A Active CN114145710B (en) 2020-09-08 2020-09-08 Body data detection method and device and electronic equipment

Country Status (1)

Country Link
CN (1) CN114145710B (en)

Family Cites Families (1)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US20130225378A1 (en) * 2012-02-16 2013-08-29 Denis E Burek Leg Stretching Machine For Simultaneously Stretching All Stride Muscles And Method Of Using

Patent Citations (15)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
KR20120041896A (en) * 2010-10-22 2012-05-03 주식회사 사이보그-랩 Robot for evaluating and improving hamstring flexibility
JP4714799B1 (en) * 2010-11-01 2011-06-29 酒井医療株式会社 angle gauge
CN104524742A (en) * 2015-01-05 2015-04-22 河海大学常州校区 Cerebral palsy child rehabilitation training method based on Kinect sensor
CN104864886A (en) * 2015-05-20 2015-08-26 华南师范大学 Micro-nano scale based movement monitoring method and system for three-axis acceleration sensor
CN205126436U (en) * 2015-10-21 2016-04-06 冯明光 Intraoperative tibial rotation control measuring device for fracture surgery
CN106618584A (en) * 2015-11-10 2017-05-10 北京纳通科技集团有限公司 Method for monitoring lower limb movement of user
KR20170074669A (en) * 2015-12-22 2017-06-30 경상남도 (교육청) Comprehensive flexibility training and measuring instrument
CN206102749U (en) * 2016-07-13 2017-04-19 江门市新会区人民医院 Novel medical electronic leg-lifting machine
CN207532391U (en) * 2017-04-05 2018-06-26 王祝香 Upper and lower limb girth measurement instrument
CN108245164A (en) * 2017-12-22 2018-07-06 北京精密机电控制设备研究所 Human gait information acquisition and computation method based on a wearable inertial device
CN210543128U (en) * 2019-06-26 2020-05-19 黎周浩 Flexibility exercise device for sports
CN210612136U (en) * 2019-07-18 2020-05-26 黄海容 Sit-and-reach (seated forward flexion) testing device
CN210992810U (en) * 2019-10-17 2020-07-14 陕西职业技术学院 Body flexibility training device for dance training
CN110801232A (en) * 2019-11-05 2020-02-18 湖南师范大学 Human body flexibility measurement training device and measurement method
CN111419630A (en) * 2020-03-05 2020-07-17 重庆三峡学院 Flexibility training device for cheerleading training

Non-Patent Citations (1)

* Cited by examiner, † Cited by third party
Title
Research on flexibility measurement methods and the attributes they intend to measure; Xu Yuming, Zhang Guohai; Journal of Beijing Sport University (北京体育大学学报); 484-485 *

Also Published As

Publication number Publication date
CN114145710A (en) 2022-03-08

Legal Events

Date Code Title Description
PB01 Publication
SE01 Entry into force of request for substantive examination
GR01 Patent grant