CN112580602A - Method and device for standardizing grip strength test - Google Patents


Info

Publication number
CN112580602A
Authority
CN
China
Prior art keywords: wireless, grip, data processing, module, grip strength
Prior art date: 2020-12-30
Legal status: Pending (the legal status is an assumption and is not a legal conclusion)
Application number
CN202011613390.1A
Other languages
Chinese (zh)
Inventor
张开宇
李建伟
杨召阳
钱德省
Current Assignee (the listed assignee may be inaccurate)
Beijing Sport University
Original Assignee
Beijing Sport University
Priority date: 2020-12-30 (the priority date is an assumption and is not a legal conclusion)
Filing date: 2020-12-30
Publication date: 2021-03-30
Application filed by Beijing Sport University
Priority to CN202011613390.1A
Publication of CN112580602A
Legal status: Pending

Classifications

    • G PHYSICS
    • G06 COMPUTING; CALCULATING OR COUNTING
    • G06V IMAGE OR VIDEO RECOGNITION OR UNDERSTANDING
    • G06V40/00 Recognition of biometric, human-related or animal-related patterns in image or video data
    • G06V40/10 Human or animal bodies, e.g. vehicle occupants or pedestrians; Body parts, e.g. hands
    • G06V40/103 Static body considered as a whole, e.g. static pedestrian or occupant recognition
    • A HUMAN NECESSITIES
    • A61 MEDICAL OR VETERINARY SCIENCE; HYGIENE
    • A61B DIAGNOSIS; SURGERY; IDENTIFICATION
    • A61B5/00 Measuring for diagnostic purposes; Identification of persons
    • A61B5/22 Ergometry; Measuring muscular strength or the force of a muscular blow
    • A61B5/224 Measuring muscular strength
    • A61B5/225 Measuring muscular strength of the fingers, e.g. by monitoring hand-grip force
    • G PHYSICS
    • G06 COMPUTING; CALCULATING OR COUNTING
    • G06V IMAGE OR VIDEO RECOGNITION OR UNDERSTANDING
    • G06V40/00 Recognition of biometric, human-related or animal-related patterns in image or video data
    • G06V40/10 Human or animal bodies, e.g. vehicle occupants or pedestrians; Body parts, e.g. hands
    • G06V40/107 Static hand or arm

Landscapes

  • Engineering & Computer Science (AREA)
  • Health & Medical Sciences (AREA)
  • Life Sciences & Earth Sciences (AREA)
  • Physics & Mathematics (AREA)
  • Human Computer Interaction (AREA)
  • General Physics & Mathematics (AREA)
  • Multimedia (AREA)
  • Theoretical Computer Science (AREA)
  • Pathology (AREA)
  • Biophysics (AREA)
  • Physical Education & Sports Medicine (AREA)
  • Biomedical Technology (AREA)
  • Heart & Thoracic Surgery (AREA)
  • Medical Informatics (AREA)
  • Molecular Biology (AREA)
  • Surgery (AREA)
  • Animal Behavior & Ethology (AREA)
  • General Health & Medical Sciences (AREA)
  • Public Health (AREA)
  • Veterinary Medicine (AREA)
  • Measurement Of The Respiration, Hearing Ability, Form, And Blood Characteristics Of Living Organisms (AREA)

Abstract

The invention relates to a method and a device for standardizing the grip strength test and belongs to the field of physical fitness testing. The subject turns the side face of the grip strength tester that carries the wireless ranging and attitude sensing module toward his or her own body and then begins the grip test. The wireless RGB-D depth camera module captures images of the subject's limb movements throughout the test, while the wireless ranging and attitude sensing module measures the distance from the grip strength tester to the body and the attitude of the arm. The images and sensor data are transmitted wirelessly to the embedded data processing and interaction terminal, which judges whether the grip action conforms to the standard and drives the display screen and loudspeaker to prompt the subject, thereby standardizing the subject's grip test action. Measurement with the device requires no manual observation, prevents cheating by subjects and test administrators, and quickly and automatically judges how well the grip test action conforms to the standard.

Description

Method and device for standardizing grip strength test
Technical Field
The invention relates to a method and a device for standardizing the grip strength test, and in particular to the use of a sensor network to standardize the grip strength test procedure; it belongs to the field of physical fitness testing.
Background
The grip strength test measures the maximum muscle strength of a subject's forearm and hand and is an important item for evaluating upper-limb strength in the national physical fitness measurement standards. The standard test procedure is as follows: the subject turns the grip-span adjusting knob of the grip strength tester to a comfortable span, then grips the tester firmly with one hand while standing upright, with the feet naturally apart (shoulder-width) and both arms hanging naturally at the sides, and squeezes the upper and lower handles with maximum force. The test is performed twice; the larger value is recorded in kilograms to one decimal place. At present the grip strength test relies mainly on mechanical or electronic grip strength testers, whose focus is accurate measurement of the grip force itself; whether the test is performed in a standard way, for example whether the subject swings the arm, squats, or lets the tester touch the body while exerting force, is still judged by manual observation.
In addition, the Outline for Building a Leading Sports Nation issued by China in 2019 proposes that by 2035 more than 92% of urban and rural residents should reach the qualified level of the national physical fitness measurement standards. How to carry out large-scale standardized grip strength testing across the whole population has therefore become an urgent problem.
Disclosure of Invention
The invention aims to provide a method and a device for standardizing the grip strength test, which automatically judge how well the grip test action conforms to the standard, replace the existing manual-observation approach, and offer a solution for nationwide large-scale standardized grip strength testing.
The aim of the invention is achieved by the following technical solution.
A device for standardizing the grip strength test mainly comprises a grip strength tester, a wireless RGB-D depth camera module, a wireless ranging and attitude sensing module, and an embedded data processing and display terminal.
The grip strength tester is an ordinary, commercially available grip strength tester.
The wireless RGB-D depth camera module is placed directly in front of the subject; the wireless ranging and attitude sensing module is mounted on the side face of the grip strength tester; and the embedded data processing and display terminal is placed to the front and side of the subject.
The subject turns the side face of the grip strength tester that carries the wireless ranging and attitude sensing module toward his or her own body and then begins the grip test. The wireless RGB-D depth camera module captures images of the subject's limb movements during the test, while the wireless ranging and attitude sensing module measures the distance from the grip strength tester to the body and the attitude of the arm. The images and data are transmitted wirelessly to the embedded data processing and interaction terminal, which judges whether the grip action conforms to the standard and drives the display screen and loudspeaker to prompt the subject, thereby standardizing the subject's grip test action.
The wireless RGB-D depth camera module comprises an RGB-D image acquisition sub-module, a fill-light sub-module and a wireless transceiver sub-module; the RGB-D image acquisition sub-module is connected to a video output and control interface and to the brightness control interface of the fill-light sub-module. The RGB-D image acquisition sub-module captures images of the subject's limb movements during the grip test and sends them through the wireless transceiver sub-module to the embedded data processing and display terminal, which judges whether the subject stands upright with the feet naturally apart (shoulder-width) and the arms hanging naturally, and whether violations such as arm swinging or squatting occur; the fill-light sub-module supplements the lighting of the scene being photographed.
The wireless ranging and attitude sensing module comprises a ranging sensor, an inertial attitude sensor, a controller and a wireless transceiver sub-module; the data and control interfaces of the ranging sensor, the inertial attitude sensor and the wireless transceiver sub-module are connected to the controller. The controller drives the ranging sensor and the inertial attitude sensor to acquire, respectively, the distance from the grip strength tester to the subject's body and the attitude of the grip strength tester, and transmits these data wirelessly to the embedded data processing and interaction terminal, which judges whether the grip strength tester is kept off the subject's body.
The embedded data processing and interaction terminal comprises an embedded signal processing and control unit, a display screen, a loudspeaker and a wireless transceiver sub-module; the data and control interface of the wireless transceiver sub-module is connected to the embedded signal processing and control unit, and the signal inputs of the display screen and the loudspeaker are connected to the embedded signal processing and control unit. From the image data acquired by the wireless RGB-D depth camera module and the distance and attitude data acquired by the wireless ranging and attitude sensing module, the embedded signal processing and control unit judges whether the subject's grip test is being performed in a standard way and drives the display screen and loudspeaker to give the subject audible and visual prompts. The embedded signal processing and control unit also communicates with and controls the wireless RGB-D depth camera module and the wireless ranging and attitude sensing module through the wireless transceiver sub-module.
A method for standardizing the grip strength test comprises the following steps.
Step one: the power switches of the wireless RGB-D depth camera module, the wireless ranging and attitude sensing module and the embedded data processing and interaction terminal are turned on, and the modules and the terminal complete initialization and pairing; the embedded data processing and interaction terminal then gives the subject an audible and visual prompt that the system is ready and the test can begin.
Step two: the subject turns the grip-span adjusting knob of the grip strength tester to a comfortable span, turns the side face of the tester that carries the wireless ranging and attitude sensing module toward his or her own body, and then grips the tester firmly while standing upright, with the feet naturally apart (shoulder-width) and both arms hanging naturally.
Step three: the wireless RGB-D depth camera module continuously captures several images in real time and transmits them to the embedded data processing and interaction terminal, which judges whether the current ambient light is sufficient for stable recognition of human joint points; if not, it controls the fill-light sub-module in the wireless RGB-D depth camera module to add light until the condition is met.
Step four: the embedded data processing and interaction terminal prompts the subject audibly and visually to start the measurement, and the subject begins the grip strength test.
During the test, the wireless RGB-D depth camera module captures images of the subject's limb movements in real time and transmits them wirelessly to the embedded data processing and interaction terminal, which analyzes the subject's current motion to judge whether the conditions of standing upright, feet naturally apart (shoulder-width), arms hanging naturally and no squatting are met.
At the same time, the wireless ranging and attitude sensing module measures in real time the distance from the grip strength tester to the body and the attitude of the arm and transmits these data wirelessly to the embedded data processing and interaction terminal, which judges whether the condition that the grip strength tester does not touch the body is met.
If both conditions are met, the embedded data processing and interaction terminal reports to the subject that the test conforms to the standard; if either condition is not met, the terminal reports that the test does not conform to the standard and indicates which condition is violated, and steps three and four are repeated until all conditions are met.
Step five: the embedded data processing and interaction terminal reports that the test is finished and gives the subject the standardization evaluation result as an audible and visual prompt.
The algorithm for judging whether the current ambient light is sufficient for stable recognition of human joint points is as follows: from N RGB images and depth images captured by the wireless RGB-D depth camera module, the key points of the face, legs and arms in each image are identified with a common human key-point recognition algorithm, and the number of key points found in each image is compared with a preset value. If the number of key points in any of the N images is below the preset value, the current ambient light is considered insufficient for stable recognition of human joint points.
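A minimal sketch of this check is given below. The detect_keypoints helper, the camera.capture and fill_light.increase_brightness interfaces, and the preset key-point count are illustrative assumptions and do not come from the patent.

    EXPECTED_KEYPOINTS = 13   # assumed preset number of face, leg and arm key points

    def ambient_light_ok(rgb_frames, depth_frames, detect_keypoints):
        """True if every captured frame yields the full preset number of key points,
        i.e. the lighting is good enough for stable key-point recognition."""
        for rgb, depth in zip(rgb_frames, depth_frames):
            keypoints = detect_keypoints(rgb, depth)   # assumed wrapper around an off-the-shelf detector
            if len(keypoints) < EXPECTED_KEYPOINTS:    # any frame short of key points fails the check
                return False
        return True

    def ensure_lighting(camera, fill_light, detect_keypoints, n_frames=10):
        """Raise the fill light until the ambient-light condition holds (step three)."""
        while True:
            rgb_frames, depth_frames = camera.capture(n_frames)   # assumed camera API
            if ambient_light_ok(rgb_frames, depth_frames, detect_keypoints):
                return
            fill_light.increase_brightness()                      # assumed fill-light API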
Step four judges whether the subject stands upright with the feet naturally apart (shoulder-width) and the arms hanging naturally, and does not squat, with the following algorithm. A human key-joint-point extraction algorithm identifies the key joint points of the subject's whole body in the images captured in real time. The identified whole-body key joint points are compared for similarity with preset key joint points of an upright human posture; when the similarity exceeds a set threshold, the subject is judged to be upright and not squatting. The shoulder key joint points, the key joint points of the two feet, and the key joint points of the left and right arms are then connected in turn and evaluated geometrically: when the difference between the Euclidean distance between the shoulder key joint points and the Euclidean distance between the foot key joint points is smaller than the set shoulder-width threshold, the feet are judged to be naturally apart (shoulder-width); when the angle in the image plane between the line through the left-arm key joint points and the line through the right-arm key joint points is smaller than the set threshold angle for naturally hanging arms, the arms are judged to hang naturally.
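A minimal sketch of these three geometric checks follows, assuming 2-D key joint points keyed by name and purely illustrative threshold values; the similarity measure shown is one simple possibility, not the one prescribed by the patent.

    import math

    def euclid(p, q):
        return math.hypot(p[0] - q[0], p[1] - q[1])

    def pose_similarity(kp, template):
        """Assumed similarity measure: higher when joints lie close to the upright template."""
        dists = [euclid(kp[name], template[name]) for name in template]
        return 1.0 / (1.0 + sum(dists) / len(dists))

    def posture_ok(kp, upright_template,
                   sim_thresh=0.9, shoulder_width_tol=20.0, arm_angle_tol_deg=15.0):
        """kp and upright_template map joint names to (x, y) image coordinates."""
        # 1. Upright and not squatting: similarity to the preset upright posture.
        if pose_similarity(kp, upright_template) <= sim_thresh:
            return False
        # 2. Feet naturally apart (shoulder width): compare the two Euclidean distances.
        shoulder_dist = euclid(kp["left_shoulder"], kp["right_shoulder"])
        feet_dist = euclid(kp["left_ankle"], kp["right_ankle"])
        if abs(shoulder_dist - feet_dist) >= shoulder_width_tol:
            return False
        # 3. Arms hanging naturally: image-plane angle between the left-arm line
        #    and the right-arm line must stay below the sag threshold.
        def line_angle_deg(a, b):
            return math.degrees(math.atan2(b[1] - a[1], b[0] - a[0]))
        left = line_angle_deg(kp["left_shoulder"], kp["left_wrist"])
        right = line_angle_deg(kp["right_shoulder"], kp["right_wrist"])
        if abs(left - right) >= arm_angle_tol_deg:
            return False
        return True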
Step four judges whether the grip strength tester is kept off the body with the following algorithm: if the measured distance from the grip strength tester to the body is greater than the set minimum body-contact distance threshold, and the three-axis angular velocity and three-axis acceleration measured by the attitude sensor are both below the set arm-stability thresholds, the grip strength tester is judged not to be touching the body.
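A corresponding sketch of this contact check is shown below, with placeholder thresholds (the patent gives no numeric values); the acceleration is assumed to be gravity-compensated so that a steady arm reads near zero.

    def tester_clear_of_body(distance_mm, gyro_dps, accel_g,
                             min_distance_mm=50.0, gyro_limit_dps=30.0, accel_limit_g=0.3):
        """distance_mm: reading of the ranging sensor on the tester's side face;
        gyro_dps / accel_g: three-axis angular rate and gravity-compensated acceleration."""
        far_enough = distance_mm > min_distance_mm                      # not pressed against the body
        arm_steady = (all(abs(w) < gyro_limit_dps for w in gyro_dps)    # arm not swinging
                      and all(abs(a) < accel_limit_g for a in accel_g))
        return far_enough and arm_steady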
Advantageous effects
(1) The method and device for standardizing the grip strength test require no manual observation, prevent cheating by subjects and test administrators, and quickly and automatically judge how well the grip test action conforms to the standard.
(2) The method and device work with grip strength testers already on the market and therefore have strong universality.
Drawings
FIG. 1 is a schematic diagram of a wireless RGB-D depth camera module in accordance with an embodiment of the present invention.
Fig. 2 is a schematic diagram of a wireless ranging and attitude sensing module according to an embodiment of the present invention.
Fig. 3 is a schematic diagram of an embedded data processing and display terminal in an embodiment of the present invention.
FIG. 4 is a flowchart of a method according to an embodiment of the present invention.
In the figures:
1 - wireless RGB-D depth camera module; 2 - wireless ranging and attitude sensing module; 3 - embedded data processing and display terminal.
In the wireless RGB-D depth camera module 1: 11 - RGB-D image acquisition sub-module; 12 - power switch; 13 - rechargeable power supply; 14 - wireless transceiver sub-module.
In the wireless ranging and attitude sensing module 2: 21 - infrared photoelectric proximity switch; 22 - inertial attitude sensor; 23 - power switch; 24 - rechargeable power supply; 25 - controller; 26 - wireless transceiver sub-module.
In the embedded data processing and display terminal 3: 31 - embedded signal processing and control unit; 32 - display screen; 33 - loudspeaker; 34 - rechargeable power supply; 35 - power switch; 36 - wireless transceiver sub-module.
Detailed Description
The present invention will be described in detail below with reference to the accompanying drawings and examples.
A device for standardizing the grip strength test comprises a wireless RGB-D depth camera module 1, a wireless ranging and attitude sensing module 2, and an embedded data processing and display terminal 3.
Wherein:
the wireless RGB-D depth camera module 1, whose schematic diagram is shown in fig. 1, includes: the system comprises an RGB-D image acquisition sub-module 11, a power switch 12, a rechargeable power supply 13 and a wireless transceiver sub-module 14, and is mainly used for shooting face and body images and video information of a subject. The RGB-D image acquisition sub-module 11 has a resolution of 1920 × 1080, a frame rate of 30 frames, a resolution of 1280 × 768 for a depth image, and a frame rate of 30 frames, and has a power supply interface connected to the power switch 12, and a video output and control interface connected to the data interface of the wireless transceiver sub-module 14. The power switch 12 is used for controlling the on-off of the power supply of the wireless RGB-D depth camera module 1, one end of the power switch is connected with the rechargeable power supply 13, and the other end of the power switch is connected with the power supply interfaces of the RGB-D image acquisition submodule 11 and the wireless transceiver submodule 14. And the rechargeable power supply 13 is used for supplying power to other modules in the wireless RGB-D depth camera module 1 and is connected with the power switch 12. And the wireless transceiving submodule 14 is used for wireless communication between the RGB-D image acquisition submodule 11 and the embedded data processing and display terminal 3, a data interface of the wireless transceiving submodule is connected with an image data interface of the RGB-D image acquisition submodule 11, and a power supply interface of the wireless transceiving submodule is connected with the power switch 12.
The wireless ranging and attitude sensing module 2, shown schematically in FIG. 2, comprises an infrared photoelectric proximity switch 21, an inertial attitude sensor 22, a power switch 23, a rechargeable power supply 24, a controller 25 and a wireless transceiver sub-module 26, and is used to measure the distance from the grip strength tester to the subject's body and the attitude of the grip strength tester, and to transmit these data wirelessly to the embedded data processing and interaction terminal 3. The infrared photoelectric proximity switch 21 measures the distance from the grip strength tester to the subject's body; its power interface is connected to the power switch 23 and its data output to a data interface of the controller 25. The inertial attitude sensor 22 measures the attitude of the grip strength tester; its power interface is connected to the power switch 23 and its data output to a data interface of the controller 25. The power switch 23 controls the power supply of the wireless ranging and attitude sensing module 2; one side is connected to the rechargeable power supply 24 and the other side to the power interfaces of the infrared photoelectric proximity switch 21, the inertial attitude sensor 22, the controller 25 and the wireless transceiver sub-module 26. The rechargeable power supply 24 powers the other components of the module and is connected to the power switch 23. The controller 25 acquires the distance data from the infrared photoelectric proximity switch 21 and the attitude data from the inertial attitude sensor 22 and transmits them by wireless communication to the embedded data processing and display terminal 3; its data interfaces are connected to the infrared photoelectric proximity switch 21 and the inertial attitude sensor 22, and its wireless communication interface to the wireless transceiver sub-module 26. The wireless transceiver sub-module 26 handles wireless communication with the embedded data processing and display terminal 3; its data interface is connected to the controller 25 and its power interface to the power switch 23.
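The controller's acquire-and-transmit cycle might look like the following sketch; proximity_switch, imu and radio stand for hypothetical driver objects, and the packet format and 20 Hz rate are illustrative choices rather than details taken from the patent.

    import json
    import time

    def sensing_module_loop(proximity_switch, imu, radio, period_s=0.05):
        """Read the infrared photoelectric proximity switch and the inertial attitude
        sensor, then push one packet to the terminal through the wireless transceiver."""
        while True:
            packet = {
                "timestamp": time.time(),
                "distance_mm": proximity_switch.read_distance(),   # tester-to-body distance
                "gyro_dps": imu.read_angular_rate(),               # three-axis angular rate
                "accel_g": imu.read_acceleration(),                # three-axis acceleration
            }
            radio.send(json.dumps(packet).encode("utf-8"))
            time.sleep(period_s)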
The embedded data processing and display terminal 3, shown schematically in FIG. 3, comprises an embedded signal processing and control unit 31, a display screen 32, a loudspeaker 33, a rechargeable power supply 34, a power switch 35 and a wireless transceiver sub-module 36. It analyzes the data sent by the wireless RGB-D depth camera module 1 and the wireless ranging and attitude sensing module 2, uses the human key-point information obtained by the processing algorithms to judge whether the subject's actions during the grip test meet the test standard, and announces the result to the subject with audible and visual prompts. The embedded signal processing and control unit 31 analyzes the image, distance and attitude data received by the terminal, judges from the extracted human key-point information whether the subject's actions meet the grip test standard, and sends the result to the display screen 32 and, as an audible prompt, to the loudspeaker 33; its wireless data interface is connected to the wireless transceiver sub-module 36, its video and audio interfaces to the display screen 32 and the loudspeaker 33 respectively, and its power interface to the power switch 35. The display screen 32 shows whether the action is standard; its data interface is connected to the embedded signal processing and control unit 31 and its power interface to the power switch 35. The loudspeaker 33 plays the audio prompts indicating whether the subject's action is standard; its data interface is connected to the embedded signal processing and control unit 31 and its power interface to the power switch 35. The rechargeable power supply 34 powers the other components of the terminal and is connected to the power switch 35. The power switch 35 controls the power supply of the terminal; one side is connected to the rechargeable power supply 34 and the other side to the embedded signal processing and control unit 31, the display screen 32, the loudspeaker 33 and the wireless transceiver sub-module 36. The wireless transceiver sub-module 36 handles wireless communication between the embedded signal processing and control unit 31 and the wireless RGB-D depth camera module 1 and the wireless ranging and attitude sensing module 2; its data interface is connected to the wireless data interface of the embedded signal processing and control unit 31 and its power interface to the power switch 35.
The device for standardizing the grip strength test works as follows.
The subject turns the side face of the grip strength tester that carries the wireless ranging and attitude sensing module toward his or her own body and then begins the grip test. The wireless RGB-D depth camera module captures images of the subject's limb movements during the test, while the wireless ranging and attitude sensing module measures the distance from the grip strength tester to the body and the attitude of the arm. The images and data are transmitted wirelessly to the embedded data processing and interaction terminal, which judges whether the grip action conforms to the standard and drives the display screen and loudspeaker to prompt the subject, thereby standardizing the subject's grip test action.
The method for standardizing the grip strength test, whose flowchart is shown in FIG. 4, is carried out in the following steps.
Step one: the modules and the terminal complete initialization and pairing. The power switches of the wireless RGB-D depth camera module, the wireless ranging and attitude sensing module and the embedded data processing and interaction terminal are turned on, the modules and the terminal complete initialization and pairing, and the embedded data processing and interaction terminal gives the subject an audible and visual prompt that the system is ready and the test can begin.
Step two: the subject prepares for the grip test. The subject turns the grip-span adjusting knob of the grip strength tester to a comfortable span, turns the side face of the tester that carries the wireless ranging and attitude sensing module toward his or her own body, and then grips the tester firmly while standing upright, with the feet naturally apart (shoulder-width) and both arms hanging naturally.
Step three: judge whether the current ambient light is sufficient for stable recognition of human joint points. The wireless RGB-D depth camera module continuously captures several images in real time and transmits them to the embedded data processing and interaction terminal, which judges whether the current ambient light is sufficient for stable recognition of human joint points; if not, it controls the fill-light sub-module in the wireless RGB-D depth camera module to add light until the condition is met.
The algorithm for judging whether the current ambient light is sufficient for stable recognition of human joint points is as follows: from N RGB images and depth images captured by the wireless RGB-D depth camera module, the key points of the face, legs and arms in each image are identified with a common human key-point recognition algorithm, and the number of key points found in each image is compared with a preset value. If the number of key points in any of the N images is below the preset value, the current ambient light is considered insufficient for stable recognition of human joint points.
Step four: the subject begins the grip strength test and the standardness of the action is judged. The embedded data processing and interaction terminal prompts the subject audibly and visually to start the measurement, and the subject begins the grip strength test.
During the test, the wireless RGB-D depth camera module captures images of the subject in real time and transmits them wirelessly to the embedded data processing and interaction terminal, which analyzes the subject's current motion to judge whether the conditions of standing upright, feet naturally apart (shoulder-width), arms hanging naturally and no squatting are met.
Whether the subject stands upright with the feet naturally apart (shoulder-width) and the arms hanging naturally, and does not squat, is judged with the following algorithm. A human key-joint-point extraction algorithm identifies the key joint points of the subject's whole body in the images captured in real time. The identified whole-body key joint points are compared for similarity with preset key joint points of an upright human posture; when the similarity exceeds a set threshold, the subject is judged to be upright and not squatting. The shoulder key joint points, the key joint points of the two feet, and the key joint points of the left and right arms are then connected in turn and evaluated geometrically: when the difference between the Euclidean distance between the shoulder key joint points and the Euclidean distance between the foot key joint points is smaller than the set shoulder-width threshold, the feet are judged to be naturally apart (shoulder-width); when the angle in the image plane between the line through the left-arm key joint points and the line through the right-arm key joint points is smaller than the set threshold angle for naturally hanging arms, the arms are judged to hang naturally.
At the same time, the wireless ranging and attitude sensing module measures in real time the distance from the grip strength tester to the body and the attitude of the arm and transmits these data wirelessly to the embedded data processing and interaction terminal, which judges whether the condition that the grip strength tester does not touch the body is met.
The algorithm for judging whether the grip strength tester is kept off the body is as follows: if the measured distance from the grip strength tester to the body is greater than the set minimum body-contact distance threshold, and the three-axis angular velocity and three-axis acceleration measured by the attitude sensor are both below the set arm-stability thresholds, the grip strength tester is judged not to be touching the body.
If both conditions are met, the embedded data processing and interaction terminal reports to the subject that the test conforms to the standard; if either condition is not met, the terminal reports that the test does not conform to the standard and indicates which condition is violated, and steps three and four are repeated until all conditions are met; a sketch of this decision step is given after step five.
Step five: the subject is reminded that the test is finished. The embedded data processing and interaction terminal reports that the test is finished and gives the subject the standardization evaluation result as an audible and visual prompt.
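The terminal-side decision loop for step four, combining the posture check and the contact check described above, might be sketched as follows; frame_stream, sensor_stream and terminal are hypothetical stand-ins for the wireless receive queues and the display/loudspeaker outputs, not interfaces defined by the patent.

    def evaluate_grip_test(frame_stream, sensor_stream, terminal,
                           check_posture, check_tester_clear):
        """Check every frame/packet pair against the two conditions and prompt the subject."""
        for frame, packet in zip(frame_stream, sensor_stream):
            posture_good = check_posture(frame)            # upright, feet apart, arms down, no squat
            tester_good = check_tester_clear(packet["distance_mm"],
                                             packet["gyro_dps"],
                                             packet["accel_g"])
            if posture_good and tester_good:
                terminal.prompt("Test conforms to the standard")
                return True
            failed = []
            if not posture_good:
                failed.append("posture not standard")
            if not tester_good:
                failed.append("tester touching body or arm moving")
            terminal.prompt("Test does not conform: " + ", ".join(failed))
        return False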
The invention requires no manual observation, prevents cheating by subjects and test administrators, and quickly and automatically judges how well the grip test action conforms to the standard; at the same time, the device works with grip strength testers already on the market and therefore has strong universality.
The above description is only for the preferred embodiment of the present invention, and is not intended to limit the scope of the present invention. Any modification, equivalent replacement, or improvement made within the spirit and principle of the present invention should be included in the protection scope of the present invention.

Claims (8)

1. A device for standardizing the grip strength test, comprising a grip strength tester, characterized by further comprising: a wireless RGB-D depth camera module, a wireless ranging and attitude sensing module, and an embedded data processing and display terminal;
the wireless RGB-D depth camera module is placed directly in front of the subject; the wireless ranging and attitude sensing module is mounted on the side face of the grip strength tester; the embedded data processing and display terminal is placed to the front and side of the subject;
the subject turns the side face of the grip strength tester that carries the wireless ranging and attitude sensing module toward his or her own body and then begins the grip test; the wireless RGB-D depth camera module captures images of the subject's limb movements during the test, and the wireless ranging and attitude sensing module measures the distance from the grip strength tester to the body and the attitude of the arm; the images and data are transmitted wirelessly to the embedded data processing and interaction terminal, which judges whether the grip action conforms to the standard and drives the display screen and loudspeaker to prompt the subject, thereby standardizing the subject's grip test action.
2. The device for standardizing the grip strength test according to claim 1, characterized in that: the wireless RGB-D depth camera module comprises an RGB-D image acquisition sub-module, a fill-light sub-module and a wireless transceiver sub-module; the RGB-D image acquisition sub-module is connected to a video output and control interface and to the brightness control interface of the fill-light sub-module; the RGB-D image acquisition sub-module captures images of the subject's limb movements during the grip test and sends them through the wireless transceiver sub-module to the embedded data processing and display terminal, which judges whether the subject stands upright with the feet naturally apart and the arms hanging naturally, and whether violations such as arm swinging or squatting occur; the fill-light sub-module supplements the lighting of the scene being photographed.
3. The device for standardizing the grip strength test according to claim 1, characterized in that: the wireless ranging and attitude sensing module comprises a ranging sensor, an inertial attitude sensor, a controller and a wireless transceiver sub-module, and the data and control interfaces of the ranging sensor, the inertial attitude sensor and the wireless transceiver sub-module are connected to the controller; the controller drives the ranging sensor and the inertial attitude sensor to acquire, respectively, the distance from the grip strength tester to the subject's body and the attitude of the grip strength tester, and transmits these data wirelessly to the embedded data processing and interaction terminal, which judges whether the grip strength tester is kept off the subject's body.
4. The device for standardizing the grip strength test according to claim 1, characterized in that: the embedded data processing and interaction terminal comprises an embedded signal processing and control unit, a display screen, a loudspeaker and a wireless transceiver sub-module, and the data and control interface of the wireless transceiver sub-module is connected to the embedded signal processing and control unit; from the image data acquired by the wireless RGB-D depth camera module and the distance and attitude data acquired by the wireless ranging and attitude sensing module, the embedded signal processing and control unit judges whether the subject's grip test is being performed in a standard way and drives the display screen and loudspeaker to give the subject audible and visual prompts; in addition, the embedded signal processing and control unit communicates with and controls the wireless RGB-D depth camera module and the wireless ranging and attitude sensing module through the wireless transceiver sub-module.
5. A method for standardizing the grip strength test using the device according to any one of claims 1 to 4, characterized by comprising the following steps:
step one, the power switches of the wireless RGB-D depth camera module, the wireless ranging and attitude sensing module and the embedded data processing and interaction terminal are turned on, the modules and the terminal complete initialization and pairing, and the embedded data processing and interaction terminal gives the subject an audible and visual prompt that the system is ready and the test can begin;
step two, the subject turns the grip-span adjusting knob of the grip strength tester to a comfortable span, turns the side face of the tester that carries the wireless ranging and attitude sensing module toward his or her own body, and then grips the tester firmly while standing upright, with the feet naturally apart (shoulder-width) and both arms hanging naturally;
step three, the wireless RGB-D depth camera module continuously captures several images in real time and transmits them to the embedded data processing and interaction terminal, which judges whether the current ambient light is sufficient for stable recognition of human joint points, and if not, controls the fill-light sub-module in the wireless RGB-D depth camera module to add light until the condition is met;
step four, the embedded data processing and interaction terminal prompts the subject audibly and visually to start the measurement, and the subject begins the grip strength test;
during the test, the wireless RGB-D depth camera module captures images of the subject's limb movements in real time and transmits them wirelessly to the embedded data processing and interaction terminal, which analyzes the subject's current motion to judge whether the conditions of standing upright, feet naturally apart (shoulder-width), arms hanging naturally and no squatting are met;
at the same time, the wireless ranging and attitude sensing module measures in real time the distance from the grip strength tester to the body and the attitude of the arm and transmits these data wirelessly to the embedded data processing and interaction terminal, which judges whether the condition that the grip strength tester does not touch the body is met;
if both conditions are met, the embedded data processing and interaction terminal reports to the subject that the test conforms to the standard; if either condition is not met, the terminal reports that the test does not conform to the standard and indicates which condition is violated, and steps three and four are repeated until all conditions are met;
step five, the embedded data processing and interaction terminal reports that the test is finished and gives the subject the standardization evaluation result as an audible and visual prompt.
6. The method according to claim 5, characterized in that the algorithm for judging whether the current ambient light is sufficient for stable recognition of human joint points is as follows: from N RGB images and depth images captured by the wireless RGB-D depth camera module, the key points of the face, legs and arms in each image are identified with a common human key-point recognition algorithm, and the number of key points found in each image is compared with a preset value; if the number of key points in any of the N images is below the preset value, the current ambient light is considered insufficient for stable recognition of human joint points.
7. The method according to claim 5, characterized in that in step four, whether the subject stands upright with the feet naturally apart and the arms hanging naturally, and does not squat, is judged with the following algorithm: a human key-joint-point extraction algorithm identifies the key joint points of the subject's whole body in the images captured in real time; the identified whole-body key joint points are compared for similarity with preset key joint points of an upright human posture, and when the similarity exceeds a set threshold, the subject is judged to be upright and not squatting; the shoulder key joint points, the key joint points of the two feet, and the key joint points of the left and right arms are then connected in turn and evaluated geometrically: when the difference between the Euclidean distance between the shoulder key joint points and the Euclidean distance between the foot key joint points is smaller than the set shoulder-width threshold, the feet are judged to be naturally apart; when the angle in the image plane between the line through the left-arm key joint points and the line through the right-arm key joint points is smaller than the set threshold angle for naturally hanging arms, the arms are judged to hang naturally.
8. The method according to claim 5, characterized in that in step four the algorithm for judging whether the grip strength tester is kept off the body is as follows: if the measured distance from the grip strength tester to the body is greater than the set minimum body-contact distance threshold, and the three-axis angular velocity and three-axis acceleration measured by the attitude sensor are both below the set arm-stability thresholds, the grip strength tester is judged not to be touching the body.
CN202011613390.1A, priority date 2020-12-30, filing date 2020-12-30: Method and device for standardizing grip strength test (Pending; published as CN112580602A)

Priority Applications (1)

Application Number: CN202011613390.1A; Priority date: 2020-12-30; Filing date: 2020-12-30; Title: Method and device for standardizing grip strength test


Publications (1)

Publication Number Publication Date
CN112580602A (published 2021-03-30)

Family

ID=75145096

Family Applications (1)

Application Number: CN202011613390.1A; Status: Pending; Priority date: 2020-12-30; Filing date: 2020-12-30; Title: Method and device for standardizing grip strength test

Country Status (1)

Country Link
CN (1) CN112580602A (en)

Patent Citations (5)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN104688233A (en) * 2015-02-11 2015-06-10 深圳泰山在线科技有限公司 Physique test machine
CN104598896A (en) * 2015-02-12 2015-05-06 南通大学 Automatic human tumble detecting method based on Kinect skeleton tracking
CN109298779A (en) * 2018-08-10 2019-02-01 济南奥维信息科技有限公司济宁分公司 Virtual training System and method for based on virtual protocol interaction
CN109815907A (en) * 2019-01-25 2019-05-28 深圳市象形字科技股份有限公司 A kind of sit-ups attitude detection and guidance method based on computer vision technique
CN110458061A (en) * 2019-07-30 2019-11-15 四川工商学院 A kind of method and company robot of identification Falls in Old People

Cited By (1)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN113617017A (en) * 2021-07-23 2021-11-09 广州偶游网络科技有限公司 Squatting action identification method, device, equipment, computer equipment and storage medium


Legal Events

PB01: Publication
SE01: Entry into force of request for substantive examination
RJ01: Rejection of invention patent application after publication (application publication date: 2021-03-30)