CN116758627A - Automatic evaluation method for straight-up style long jump flight technique movements - Google Patents

Automatic evaluation method for straight-up style long jump flight technique movements

Info

Publication number
CN116758627A
CN116758627A (application CN202310606665.6A)
Authority
CN
China
Prior art keywords
gesture
posture
standard
jump
angle
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Pending
Application number
CN202310606665.6A
Other languages
Chinese (zh)
Inventor
陆声链
麦贤健
向军
邱志良
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
Guangxi Normal University
Original Assignee
Guangxi Normal University
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by Guangxi Normal University filed Critical Guangxi Normal University
Priority claimed: CN202310606665.6A
Publication: CN116758627A
Legal status: Pending

Classifications

    • AHUMAN NECESSITIES
    • A63SPORTS; GAMES; AMUSEMENTS
    • A63BAPPARATUS FOR PHYSICAL TRAINING, GYMNASTICS, SWIMMING, CLIMBING, OR FENCING; BALL GAMES; TRAINING EQUIPMENT
    • A63B24/00Electric or electronic controls for exercising apparatus of preceding groups; Controlling or monitoring of exercises, sportive games, training or athletic performances
    • A63B24/0003Analysing the course of a movement or motion sequences during an exercise or trainings sequence, e.g. swing for golf or tennis
    • A63B24/0006Computerised comparison for qualitative assessment of motion sequences or the course of a movement
    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06FELECTRIC DIGITAL DATA PROCESSING
    • G06F16/00Information retrieval; Database structures therefor; File system structures therefor
    • G06F16/50Information retrieval; Database structures therefor; File system structures therefor of still image data
    • G06F16/58Retrieval characterised by using metadata, e.g. metadata not derived from the content or metadata generated manually
    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06VIMAGE OR VIDEO RECOGNITION OR UNDERSTANDING
    • G06V10/00Arrangements for image or video recognition or understanding
    • G06V10/40Extraction of image or video features
    • G06V10/44Local feature extraction by analysis of parts of the pattern, e.g. by detecting edges, contours, loops, corners, strokes or intersections; Connectivity analysis, e.g. of connected components
    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06VIMAGE OR VIDEO RECOGNITION OR UNDERSTANDING
    • G06V10/00Arrangements for image or video recognition or understanding
    • G06V10/70Arrangements for image or video recognition or understanding using pattern recognition or machine learning
    • G06V10/74Image or video pattern matching; Proximity measures in feature spaces
    • G06V10/761Proximity, similarity or dissimilarity measures
    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06VIMAGE OR VIDEO RECOGNITION OR UNDERSTANDING
    • G06V20/00Scenes; Scene-specific elements
    • G06V20/40Scenes; Scene-specific elements in video content
    • G06V20/46Extracting features or characteristics from the video content, e.g. video fingerprints, representative shots or key frames
    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06VIMAGE OR VIDEO RECOGNITION OR UNDERSTANDING
    • G06V40/00Recognition of biometric, human-related or animal-related patterns in image or video data
    • G06V40/20Movements or behaviour, e.g. gesture recognition
    • G06V40/23Recognition of whole body movements, e.g. for sport training
    • YGENERAL TAGGING OF NEW TECHNOLOGICAL DEVELOPMENTS; GENERAL TAGGING OF CROSS-SECTIONAL TECHNOLOGIES SPANNING OVER SEVERAL SECTIONS OF THE IPC; TECHNICAL SUBJECTS COVERED BY FORMER USPC CROSS-REFERENCE ART COLLECTIONS [XRACs] AND DIGESTS
    • Y02TECHNOLOGIES OR APPLICATIONS FOR MITIGATION OR ADAPTATION AGAINST CLIMATE CHANGE
    • Y02PCLIMATE CHANGE MITIGATION TECHNOLOGIES IN THE PRODUCTION OR PROCESSING OF GOODS
    • Y02P90/00Enabling technologies with a potential contribution to greenhouse gas [GHG] emissions mitigation
    • Y02P90/30Computing systems specially adapted for manufacturing

Landscapes

  • Engineering & Computer Science (AREA)
  • Theoretical Computer Science (AREA)
  • General Physics & Mathematics (AREA)
  • Physics & Mathematics (AREA)
  • Computer Vision & Pattern Recognition (AREA)
  • Multimedia (AREA)
  • General Health & Medical Sciences (AREA)
  • Health & Medical Sciences (AREA)
  • Databases & Information Systems (AREA)
  • Psychiatry (AREA)
  • Human Computer Interaction (AREA)
  • Social Psychology (AREA)
  • Artificial Intelligence (AREA)
  • Computing Systems (AREA)
  • Physical Education & Sports Medicine (AREA)
  • Evolutionary Computation (AREA)
  • Medical Informatics (AREA)
  • Software Systems (AREA)
  • Library & Information Science (AREA)
  • Data Mining & Analysis (AREA)
  • General Engineering & Computer Science (AREA)
  • Image Analysis (AREA)

Abstract

The invention discloses a method for automatically evaluating the flight technique movements of the straight-up style long jump, comprising the following steps: S1, determining the camera placement position and shooting parameters; S2, estimating the human body posture and extracting feature angles; S3, establishing a standard library of flight technique actions; S4, matching the flight technique postures; S5, evaluating the flight technique postures. The method quantitatively evaluates a jumper's take-off and flight technique postures and provides quantified data on the straight-up style long jump flight technique, which can serve as an important reference index in performance assessment and reduce the influence of subjective human factors on the evaluation. During ordinary training, a jumper can consult the evaluation results to correct non-standard technique in time, and the quantified feedback data supports targeted coaching, making personalized training more feasible.

Description

Automatic evaluation method for straight-up style long jump flight technique movements
Technical Field
The invention relates to the technical field of machine vision, and in particular to an automatic evaluation method for the flight technique movements of the straight-up style long jump.
Background
As a complex track-and-field event, the straight-up style long jump is generally divided into four technical phases: run-up, take-off, flight, and landing. From the instant of take-off to the moment of landing, every limb movement has a crucial influence on the jump result, so the jumper's technique receives close attention during ordinary training and examinations. However, in the teaching scenario of a university long jump elective course, teachers differ in their understanding of each detailed movement and lack a unified standard, so student assessment often depends on personal experience, lacks objectivity, provides no quantitative feedback data, and makes personalized training difficult. In addition, professional long jump analysis equipment is expensive and hard to deploy widely. To address these problems, the invention analyses and evaluates student postures based on a human body pose estimation algorithm and provides quantified feedback data, so that technique problems in a student's jump can be discovered and corrected in time, while an automated evaluation system improves the objectivity of performance assessment.
In recent years, analysing sports movements with machine vision has become a research hotspot: for example, judging and counting standard squats and push-ups with pose-estimation-based methods, checking whether yoga postures are in place, and analysing swimming and standing long jump postures. Aimed at the problems in straight-up style long jump course teaching, the invention combines human body pose estimation and related machine-vision algorithms to analyse the flight technique, and establishes an automatic evaluation method for the straight-up style long jump flight technique based on human body pose estimation.
Disclosure of Invention
Aiming at the need for automatic evaluation of students' straight-up style long jump flight technique in a university teaching scenario, the invention provides an automatic evaluation method for straight-up style long jump flight technique movements. The method quantitatively evaluates a jumper's take-off and flight technique postures and provides quantified data on the straight-up style long jump flight technique, which can serve as an important reference index in performance assessment and reduce the influence of subjective human factors on the evaluation. During ordinary training, a jumper can consult the evaluation results to correct non-standard technique in time, and the quantified feedback data supports targeted coaching, making personalized training more feasible.
The technical scheme achieving the aim of the invention is as follows:
An automatic evaluation method for straight-up style long jump flight technique movements comprises the following steps:
S1, determining the camera placement position and shooting parameters: according to the characteristics of the on-campus venue for straight-up style long jump performance assessment, determine where the camera is placed and its shooting parameters, with the camera D metres from the edge of the run-up area, and adjust the camera to a suitable angle for recording the whole process from the jumper's take-off to landing;
S2, human body pose estimation and feature angle extraction: obtain the coordinates of the body's joint points in the image as the basis for the subsequent analysis of the jumper's airborne postures:
S2-1, extract the body key point coordinates with the BlazePose human pose estimation algorithm, and discard key points such as the ears and eyes that are irrelevant to studying the jumper's postures;
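BlazePose (MediaPipe Pose) outputs 33 landmarks, of which indices 0-10 cover the face (nose, eyes, ears, mouth). The filtering in step S2-1 can be sketched as follows — a minimal illustration assuming MediaPipe's standard landmark ordering; the function name and data layout are hypothetical, not from the patent:

```python
# Sketch of the key point filtering in step S2-1. Assumes MediaPipe's
# 33-landmark BlazePose topology, in which indices 0-10 are face points.
FACE_LANDMARKS = set(range(0, 11))  # nose, eyes, ears, mouth

def filter_body_keypoints(landmarks):
    """Keep only body key points (shoulders to feet), dropping face points.

    `landmarks` is a list of (x, y) tuples in BlazePose index order;
    returns a dict mapping the original landmark index to its coordinates.
    """
    return {i: pt for i, pt in enumerate(landmarks) if i not in FACE_LANDMARKS}

# 33 dummy landmarks -> the 22 body key points (indices 11-32) remain.
dummy = [(float(i), float(i)) for i in range(33)]
body = filter_body_keypoints(dummy)
print(len(body))  # 22
```

Keeping the original landmark indices as dictionary keys makes the later joint-angle lookups (ankle, knee, hip) direct.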
S2-2, joint angle calculation: taking the jumper's right leg as an example, the angle formed at the right knee joint B by its lines to the right ankle joint A and the right hip joint C, as detected and output by BlazePose, is θ. With the coordinates of A, B, C being (x_A, y_A), (x_B, y_B), (x_C, y_C) respectively, the signed angle in radians between BA and BC is:

rad = atan2(y_A − y_B, x_A − x_B) − atan2(y_C − y_B, x_C − x_B)  (1)

Since the value given by formula (1) may be positive or negative, it is converted to an absolute angle so that the limb angle θ ∈ [0°, 180°], the final θ being obtained by formula (2):

θ = |rad × 180°/π|,  with θ = 360° − θ whenever θ > 180°  (2)

Formula (2) gives the angle between joints. For the angle between a limb and the horizontal and vertical directions, again taking the jumper's right leg as an example: let D be the intersection of the line through point A parallel to the x-axis with the line through point B perpendicular to the x-axis, so that D has coordinates (x_B, y_A); let D' be the intersection of the line through point A perpendicular to the x-axis with the x-axis, so that D' has coordinates (x_A, 0). θ_1 is the angle between BA and AD, AD being parallel to the x-axis; θ_2 is the angle between BA and AD', AD' being perpendicular to the x-axis. Both θ_1 and θ_2 are obtained from formulas (1) and (2).
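The angle computation of step S2-2 can be sketched in Python as follows — a minimal illustration of formulas (1) and (2) and the auxiliary angles θ_1, θ_2; the function names are ours, not the patent's:

```python
import math

def joint_angle(a, b, c):
    """Angle at joint b between rays b->a and b->c, in degrees within
    [0, 180] (formulas (1) and (2)). Points are (x, y) tuples."""
    rad = (math.atan2(a[1] - b[1], a[0] - b[0])
           - math.atan2(c[1] - b[1], c[0] - b[0]))
    theta = abs(math.degrees(rad))     # |rad * 180/pi|
    if theta > 180.0:
        theta = 360.0 - theta          # fold into [0, 180]
    return theta

def angle_to_horizontal(b, a):
    """theta_1: angle between limb BA and AD, where D = (x_B, y_A)."""
    d = (b[0], a[1])
    return joint_angle(b, a, d)

def angle_to_vertical(b, a):
    """theta_2: angle between limb BA and AD', where D' = (x_A, 0)."""
    d_prime = (a[0], 0.0)
    return joint_angle(b, a, d_prime)

# Right angle at the knee: ankle A=(0,0), knee B=(0,1), hip C=(1,1)
print(round(joint_angle((0, 0), (0, 1), (1, 1)), 1))  # 90.0
```

Using `atan2` rather than the dot-product arccos keeps the computation stable when the three points are nearly collinear.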
S3, establishing a standard library of flight technique actions: as a complex sports event, the standardized definition of straight-up style long jump technique has long been a difficult problem, and the uncertainty of the technique standards makes a consistent understanding and evaluation hard to reach in teaching and practice. For this purpose, three important technique actions are selected for evaluation: take-off, body-straightening with hip extension, and abdomen tuck. Standardized definitions and scoring criteria for these three actions are established through comprehensive evaluation and deliberation; the body key point coordinates extracted in step S2-1 are used to extract the corresponding posture features, and a standard posture library is built;
S4, matching the flight technique postures: a matching algorithm extracts from the long jump video under test the key frames most similar to the standard postures — the take-off, hip-extension, and abdomen-tuck-before-landing posture key frames — for use in the subsequent posture assessment. The posture matching proceeds as follows:
S4-1, extracting posture feature angles: three feature angles are defined for each of the take-off, hip-extension and abdomen-tuck actions — the take-off posture feature angles are θ_11, θ_12, θ_13; the hip-extension posture feature angles are θ_21, θ_22, θ_23; and the abdomen-tuck posture feature angles are θ_31, θ_32, θ_33;
S4-2, computing the weighted Euclidean distance between the feature angles of the posture under test and the standard posture: the posture under test is extracted frame by frame from the long jump video, and for each of the take-off, hip-extension and abdomen-tuck postures, the feature angle vector formed by the corresponding feature angles is compared with the feature angle vector of the corresponding standard posture by a weighted Euclidean distance, which measures the similarity of the posture under test to each of the three standard postures. Let the feature angle vector of the posture under test be p^(i) = (p_1^(i), p_2^(i), …, p_n^(i)) and the feature angle vector of the corresponding standard posture be q^(i) = (q_1^(i), q_2^(i), …, q_n^(i)); the weighted Euclidean distance D^(i)(p, q) is:

D^(i)(p, q) = sqrt( Σ_{j=1}^{n} w_j^(i) (p_j^(i) − q_j^(i))² )  (3)

where i = 1, 2, 3, with 1 denoting the take-off posture, 2 the hip-extension posture and 3 the abdomen-tuck posture; p^(i), q^(i) are the feature angle vectors of the corresponding posture; D^(1), D^(2) and D^(3) are the weighted Euclidean distances between the take-off, hip-extension and abdomen-tuck feature angle vectors of the posture under test and their corresponding standard vectors; n is the number of feature angles; and w_j^(i) is the weight of the j-th feature angle of posture i;
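The weighted Euclidean distance of step S4-2 can be sketched as follows; the angle values and the weights w_j^(i) are hypothetical, since the patent does not specify concrete weights:

```python
import math

def weighted_euclidean(p, q, w):
    """Formula (3): weighted Euclidean distance between feature-angle vectors."""
    assert len(p) == len(q) == len(w)
    return math.sqrt(sum(wj * (pj - qj) ** 2 for pj, qj, wj in zip(p, q, w)))

# Hypothetical take-off posture under test vs. the standard one (degrees).
p = [170.0, 95.0, 160.0]   # theta_11, theta_12, theta_13 of the frame
q = [175.0, 90.0, 165.0]   # standard take-off feature angles
w = [0.5, 0.3, 0.2]        # assumed weights summing to 1
print(round(weighted_euclidean(p, q, w), 3))  # 5.0
```

Per-angle weights let angles that are more diagnostic of a given posture (e.g. the knee angle at take-off) dominate the match.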
S4-3, computing the cosine distance: to improve the accuracy of posture matching, a threshold t^(i) is set, and the cosine distance between the coordinate vector of all key points of the posture under test and the key point coordinate vector of the corresponding standard posture is computed. Let the coordinate vector of the posture under test be X = (x_1, x_2, …, x_m) and that of the standard posture be Y = (y_1, y_2, …, y_m); the cosine distance between them is:

d_cos^(i) = 1 − (X · Y) / (‖X‖ ‖Y‖)  (4)
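A minimal sketch of the cosine distance in step S4-3, assuming the common 1 − cosine-similarity form (so that, like the Euclidean distance, smaller values mean more similar postures):

```python
import math

def cosine_distance(x, y):
    """Formula (4): 1 - cosine similarity between flattened keypoint vectors."""
    dot = sum(a * b for a, b in zip(x, y))
    nx = math.sqrt(sum(a * a for a in x))
    ny = math.sqrt(sum(b * b for b in y))
    return 1.0 - dot / (nx * ny)

# Key points flattened as (x1, y1, x2, y2, ...); identical postures -> 0.
x = [0.1, 0.9, 0.2, 0.7, 0.3, 0.5]
print(round(cosine_distance(x, list(x)), 6))  # 0.0
```

Because cosine distance depends only on the direction of the coordinate vector, it is less sensitive than the angle distance to the jumper's overall scale in the frame.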
S4-4, computing the combined distance: the feature-angle weighted Euclidean distance and the coordinate-vector cosine distance between the posture under test and the standard posture are considered together, giving the combined distance d of formula (5):

d^(i) = μ_i D^(i)(p, q) + ρ_i d_cos^(i)  (5)

where μ_i and ρ_i are the coefficient weights for matching the corresponding posture, and i = 1, 2, 3, with 1 denoting the take-off posture, 2 the hip-extension posture and 3 the abdomen-tuck posture;

S4-5, for each posture i, the frame attaining min(d^(i)) is taken as the final match;
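Steps S4-4 and S4-5 can be sketched as follows — the coefficient weights μ_i, ρ_i and the per-frame distances are hypothetical:

```python
def combined_distance(d_euc, d_cos, mu, rho):
    """Formula (5): weighted sum of the feature-angle and coordinate distances."""
    return mu * d_euc + rho * d_cos

def match_key_frames(frames, n_postures=3):
    """Step S4-5: for each posture i, pick the frame minimising d^(i).

    `frames[t][i]` is the combined distance of frame t to standard
    posture i; returns the best-matching frame index per posture.
    """
    return [min(range(len(frames)), key=lambda t: frames[t][i])
            for i in range(n_postures)]

# Hypothetical combined distances for 4 frames x 3 postures.
frames = [(0.9, 0.8, 0.7),
          (0.2, 0.6, 0.9),   # closest to the standard take-off
          (0.7, 0.1, 0.8),   # closest to the standard hip extension
          (0.8, 0.5, 0.3)]   # closest to the standard abdomen tuck
print(match_key_frames(frames))  # [1, 2, 3]
```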
S5, evaluating the flight technique postures: score the three postures matched in step S4;
S5-1, determining the scoring function: the three key postures obtained by the matching algorithm — take-off, hip extension and abdomen tuck — are evaluated and a score is output for each. Scoring is based on the similarity between the posture under test and the standard posture, measured by the combined distance d computed with formula (5): the smaller d is, the more similar the postures and the higher the score. The combined distance is converted to a score by the linear function:
Score = k × d + c  (6)
wherein k and c are constants;
S5-2, computing the posture scores: to obtain the values of k and c and to improve the accuracy and objectivity of the scores, n representative straight-up style long jump videos are selected, and the take-off, hip-extension and abdomen-tuck key postures in each video are scored independently, free from interference, on a 100-point scale, the average being taken as the final score of each posture. The combined distance d between every scored key posture in the n videos and its corresponding standard posture is computed, finally yielding three classes of samples (d_t^(i), Score_t^(i)), where i = 1, 2, 3, with 1 denoting the take-off posture, 2 the hip-extension posture and 3 the abdomen-tuck posture, and the video number t = 1, 2, …, n; d_t^(i) is the combined distance, computed by formula (5), between posture i in video t and its standard posture, and Score_t^(i) is the average score given to that posture. Least-squares fitting is applied to each of the three sample classes to obtain the optimal k and c for the take-off, hip-extension and abdomen-tuck postures respectively; a combined distance d is then obtained for each matched posture and substituted into formula (6) to give the corresponding score.
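The least-squares fit of k and c in step S5-2 can be sketched for one posture class as follows; the teacher-scored samples are hypothetical:

```python
def fit_score_line(distances, scores):
    """Ordinary least-squares fit of Score = k*d + c (formula (6))
    for one posture class; returns (k, c)."""
    n = len(distances)
    mean_d = sum(distances) / n
    mean_s = sum(scores) / n
    cov = sum((d - mean_d) * (s - mean_s) for d, s in zip(distances, scores))
    var = sum((d - mean_d) ** 2 for d in distances)
    k = cov / var                 # slope: negative, since larger d scores lower
    c = mean_s - k * mean_d       # intercept
    return k, c

# Hypothetical scored samples: smaller combined distance -> higher score.
d_samples = [0.1, 0.3, 0.5, 0.8]
s_samples = [95.0, 85.0, 75.0, 60.0]
k, c = fit_score_line(d_samples, s_samples)
print(round(k, 2), round(c, 2))  # -50.0 100.0
```

Fitting k and c separately per posture class lets each posture's score scale reflect the spread of distances observed for that posture.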
The key points of the technical scheme are as follows:
(1) Key point coordinates of the jumper are extracted with a human pose estimation algorithm, and feature angles are defined according to the movement characteristics of the straight-up style long jump. Posture matching computes a weighted Euclidean distance between the feature angle vector of the posture under test and that of the standard posture and, to improve matching accuracy, adds a cosine distance over the posture coordinates so that the match is decided on the combined result.
(2) Several groups of videos are scored to build a scoring data set, from which the constants of the scoring function are fitted to improve the accuracy and objectivity of the scores.
The method quantitatively evaluates a jumper's take-off and flight technique postures and provides quantified data on the straight-up style long jump flight technique, which can serve as an important reference index in performance assessment and reduce the influence of subjective human factors on the evaluation. During ordinary training, a jumper can consult the evaluation results to correct non-standard technique in time, and the quantified feedback data supports targeted coaching, making personalized training more feasible.
Drawings
FIG. 1 is a flow chart of the embodiment;
FIG. 2 is a diagram of the camera and marker placement in the embodiment;
FIG. 3 is a schematic diagram of the human key point topology in the embodiment;
FIG. 4 is a diagram of the jumper's right-leg joint angle in the embodiment;
FIG. 5 is a diagram of the angles between a limb and the horizontal and vertical directions in the embodiment;
FIG. 6 is a diagram of the key-action joint feature angles in the embodiment;
FIG. 7 is a diagram of the extracted key actions and their corresponding scores in the embodiment.
Detailed Description
The present invention will now be further illustrated with reference to the drawings and examples, but is not limited thereto.
Examples:
Referring to fig. 1, an automatic evaluation method for straight-up style long jump flight technique movements comprises the following steps:
S1, determining the camera placement position and shooting parameters: according to the characteristics of the on-campus venue for straight-up style long jump performance assessment, determine where the camera is placed and its shooting parameters, the placement being shown in fig. 2; the camera is D metres from the edge of the run-up area and is adjusted to a suitable angle for recording the whole process from the jumper's take-off to landing;
s2, human body posture estimation and feature angle extraction: the coordinates of the joint points of the human body in the image are obtained as a basis for the subsequent analysis of the air action gesture of the long jump person:
S2-1, extract the body key point coordinates with the BlazePose human pose estimation algorithm, and discard key points such as the ears and eyes that are irrelevant to the jumper's postures; the extracted key point topology is shown in fig. 3;
S2-2, as shown in fig. 4, taking the angle of the jumper's right leg as an example, the angle formed at the right knee joint B by its lines to the right ankle joint A and the right hip joint C, as detected and output by BlazePose, is θ. With the coordinates of A, B, C being (x_A, y_A), (x_B, y_B), (x_C, y_C) respectively, the signed angle in radians between BA and BC is:

rad = atan2(y_A − y_B, x_A − x_B) − atan2(y_C − y_B, x_C − x_B)  (1)

Since the value given by formula (1) may be positive or negative, it is converted to an absolute angle so that the limb angle θ ∈ [0°, 180°], the final θ being obtained by formula (2):

θ = |rad × 180°/π|,  with θ = 360° − θ whenever θ > 180°  (2)

Formula (2) gives the angle between joints. For the angle between a limb and the horizontal and vertical directions, again taking the jumper's right leg as an example: let D be the intersection of the line through point A parallel to the x-axis with the line through point B perpendicular to the x-axis, so that D has coordinates (x_B, y_A); let D' be the intersection of the line through point A perpendicular to the x-axis with the x-axis, so that D' has coordinates (x_A, 0). As shown in fig. 5, θ_1 in fig. 5(a) is the angle between BA and AD, AD being parallel to the x-axis, and θ_2 in fig. 5(b) is the angle between BA and AD', AD' being perpendicular to the x-axis; both θ_1 and θ_2 are obtained from formulas (1) and (2).
S3, establishing a standard library of flight technique actions: as a complex sports event, the standardized definition of straight-up style long jump technique has long been a difficult problem, and the uncertainty of the technique standards makes a consistent understanding and evaluation hard to reach in teaching and practice. For this purpose, three important technique actions are selected for evaluation: take-off, body-straightening with hip extension, and abdomen tuck. Standardized definitions and scoring criteria for these three actions are established through comprehensive evaluation and deliberation; the body key point coordinates extracted in step S2-1 are used to extract the corresponding posture features, and a standard posture library is built;
S4, matching the flight technique postures: a matching algorithm extracts from the long jump video under test the key frames most similar to the standard postures — the take-off, hip-extension, and abdomen-tuck-before-landing posture key frames — for use in the subsequent posture assessment. The posture matching proceeds as follows:
S4-1, extracting posture feature angles: three feature angles are defined for each of the take-off, hip-extension and abdomen-tuck actions, namely θ_11, θ_12, θ_13 in fig. 6(a), θ_21, θ_22, θ_23 in fig. 6(b), and θ_31, θ_32, θ_33 in fig. 6(c);
S4-2, computing the weighted Euclidean distance between the feature angles of the posture under test and the standard posture: the posture under test is extracted frame by frame from the long jump video, and for each of the take-off (fig. 6(a)), hip-extension (fig. 6(b)) and abdomen-tuck (fig. 6(c)) postures, the feature angle vector formed by the corresponding feature angles is compared with the feature angle vector of the corresponding standard posture by a weighted Euclidean distance, which measures the similarity of the posture under test to each of the three standard postures. Let the feature angle vector of the posture under test be p^(i) = (p_1^(i), p_2^(i), …, p_n^(i)) and the feature angle vector of the corresponding standard posture be q^(i) = (q_1^(i), q_2^(i), …, q_n^(i)); the weighted Euclidean distance D^(i)(p, q) is:

D^(i)(p, q) = sqrt( Σ_{j=1}^{n} w_j^(i) (p_j^(i) − q_j^(i))² )  (3)

where i = 1, 2, 3, with 1 denoting the take-off posture, 2 the hip-extension posture and 3 the abdomen-tuck posture; p^(i), q^(i) are the feature angle vectors of the corresponding posture; D^(1), D^(2) and D^(3) are the weighted Euclidean distances between the take-off, hip-extension and abdomen-tuck feature angle vectors of the posture under test and their corresponding standard vectors; n is the number of feature angles; and w_j^(i) is the weight of the j-th feature angle of posture i;
S4-3, computing the cosine distance: to improve the accuracy of posture matching, a threshold t^(i) is set, and the cosine distance between the coordinate vector of all key points of the posture under test and the key point coordinate vector of the corresponding standard posture is computed. Let the coordinate vector of the posture under test be X = (x_1, x_2, …, x_m) and that of the standard posture be Y = (y_1, y_2, …, y_m); the cosine distance between them is:

d_cos^(i) = 1 − (X · Y) / (‖X‖ ‖Y‖)  (4)
S4-4, computing the combined distance: the feature-angle weighted Euclidean distance and the coordinate-vector cosine distance between the posture under test and the standard posture are considered together, giving the combined distance d of formula (5):

d^(i) = μ_i D^(i)(p, q) + ρ_i d_cos^(i)  (5)

where μ_i and ρ_i are the coefficient weights for matching the corresponding posture, and i = 1, 2, 3, with 1 denoting the take-off posture, 2 the hip-extension posture and 3 the abdomen-tuck posture;

S4-5, for each posture i, the frame attaining min(d^(i)) is taken as the final match;
S5, evaluating the flight technique postures: score the three postures matched in step S4;
S5-1, determining the scoring function: the three key postures obtained by the matching algorithm — take-off, hip extension and abdomen tuck — are evaluated and a score is output for each. Scoring is based on the similarity between the posture under test and the standard posture, measured by the combined distance d computed with formula (5): the smaller d is, the more similar the postures and the higher the score. The combined distance is converted to a score by the linear function:
Score = k × d + c  (6)
wherein k and c are constants;
S5-2, computing the posture scores: to obtain the values of k and c and to improve the accuracy and objectivity of the scores, n representative straight-up style long jump videos are selected, and the take-off, hip-extension and abdomen-tuck key postures in each video are scored independently, free from interference, on a 100-point scale, the average being taken as the final score of each posture. The combined distance d between every scored key posture in the n videos and its corresponding standard posture is computed, finally yielding three classes of samples (d_t^(i), Score_t^(i)), where i = 1, 2, 3, with 1 denoting the take-off posture, 2 the hip-extension posture and 3 the abdomen-tuck posture, and the video number t = 1, 2, …, n; d_t^(i) is the combined distance, computed by formula (5), between posture i in video t and its standard posture, and Score_t^(i) is the average score given to that posture. Least-squares fitting is applied to each of the three sample classes to obtain the optimal k and c for the take-off, hip-extension and abdomen-tuck postures respectively; a combined distance d is then obtained for each matched posture and substituted into formula (6) to give the corresponding score.
Simulation experiment: in a university sports teaching scenario, the camera was placed according to step S1 and straight-up style long jump videos of one class of students were recorded. From 30 representative jump videos carefully selected by a professional sports teacher, the take-off, hip-extension and abdomen-tuck key action images in each video were captured, and three sports teachers scored the captured key actions; the best fitting result for each posture was obtained according to step S5-2. New long jump video samples were then tested; the extracted key postures and corresponding scores are shown in fig. 7.

Claims (1)

1. An automatic evaluation method for the aerial technique actions of the straight-body long jump, characterized by comprising the following steps:
S1, determining the camera placement position and shooting parameters: according to the characteristics of the long jump examination site on campus, the camera placement position and shooting parameters are determined, with the distance between the camera and the edge of the run-up area set to D meters, and the camera is adjusted to a suitable angle so as to record the jumper's whole process from take-off to landing;
s2, human body posture estimation and feature angle extraction: acquiring coordinates of joint points of a human body in an image:
S2-1, extracting the human body key point coordinates using the BlazePose human pose estimation algorithm, and discarding key point information irrelevant to the analysis of the jumper's posture actions;
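As an illustration of step S2-1, the sketch below filters the 33 landmarks output by BlazePose (the pose model used in MediaPipe) down to the torso and limb joints typically used for long jump posture analysis. The chosen index set and helper name are assumptions for illustration; the patent does not specify which key points it discards.

```python
# Indices follow MediaPipe BlazePose's 33-landmark layout:
# 11/12 shoulders, 13/14 elbows, 15/16 wrists,
# 23/24 hips, 25/26 knees, 27/28 ankles.
# Face and finger/foot detail points are dropped as irrelevant here.
RELEVANT = (11, 12, 13, 14, 15, 16, 23, 24, 25, 26, 27, 28)

def filter_keypoints(landmarks):
    """Keep only the joints relevant to long jump posture analysis.

    landmarks: sequence of 33 (x, y) pairs as produced by BlazePose.
    Returns {landmark_index: (x, y)} for the relevant joints.
    """
    if len(landmarks) != 33:
        raise ValueError("BlazePose outputs 33 landmarks")
    return {i: landmarks[i] for i in RELEVANT}

# Dummy frame: 33 placeholder coordinates standing in for detector output.
pts = filter_keypoints([(float(i), float(i)) for i in range(33)])
```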
S2-2, joint angle calculation method: taking the jumper's right leg as an example, let the right ankle joint A, right knee joint B, and right hip joint C detected and output by BlazePose have coordinates (x_A, y_A), (x_B, y_B), (x_C, y_C), and let θ be the angle formed at B by the lines B-A and B-C; the angle in radians between AB and BC is given by formula (1);
since the radian value rad computed by formula (1) may be positive or negative, it is converted to an absolute angle so that the limb angle θ ∈ [0°, 180°]; the final θ is obtained by formula (2);
the angle between joints is thus obtained by formula (2); for the angle between a limb and the horizontal or vertical direction, again taking the right leg as an example, let D be the intersection of the line through point A parallel to the x-axis with the line through point B perpendicular to the x-axis, so that D has coordinates (x_B, y_A), and let D′ be the intersection of the line through point A perpendicular to the x-axis with the x-axis, so that D′ has coordinates (x_A, 0); θ_1 is the angle between BA and AD, with AD parallel to the x-axis, and θ_2 is the angle between BA and AD′, with AD′ perpendicular to the x-axis; θ_1 and θ_2 are obtained from formulas (1) and (2);
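The angle computations of step S2-2 can be sketched as below. Since formulas (1) and (2) themselves are omitted images, this is a reconstruction from the textual description (signed radian difference of the two segment directions, folded into [0°, 180°]), not the patent's exact formulas.

```python
import math

def joint_angle(A, B, C):
    """Angle theta at joint B formed by segments BA and BC, in [0, 180] degrees.

    The signed radian difference of the two segment directions (formula (1))
    is converted to an absolute angle in [0, 180] (formula (2)).
    """
    rad = math.atan2(C[1] - B[1], C[0] - B[0]) - math.atan2(A[1] - B[1], A[0] - B[0])
    deg = abs(math.degrees(rad))
    return 360.0 - deg if deg > 180.0 else deg

def limb_reference_angles(A, B):
    """theta_1 (BA vs. the horizontal through A) and theta_2 (BA vs. AD')."""
    D = (B[0], A[1])     # on the line through A parallel to the x-axis
    Dp = (A[0], 0.0)     # foot of the perpendicular from A onto the x-axis
    return joint_angle(B, A, D), joint_angle(B, A, Dp)

# Right angle at the knee: ankle directly above B, hip to the side of B.
theta = joint_angle((0.0, 1.0), (0.0, 0.0), (1.0, 0.0))
```

Note that when x_A = x_B the reference point D coincides with A and θ_1 is undefined (the limb is vertical); a production implementation would special-case this.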
S3, establishing a standard library of aerial technique actions: the corresponding posture features are extracted from the human body key point coordinates obtained in step S2-1, and a standard action posture library is established;
S4, matching the aerial technique action postures: a matching algorithm is used to extract from the long jump video under test the key frames most similar to the standard action postures, comprising the three action posture key frames of take-off, straight-body hip-extension, and abdomen-tuck before landing; the specific posture matching process is as follows:
S4-1, extracting posture feature angles: the feature angles of the three actions (take-off, straight-body hip-extension, and abdomen-tuck) are defined, where the feature angles of the take-off posture are θ_11, θ_12, θ_13; the feature angles of the straight-body hip-extension posture are θ_21, θ_22, θ_23; and the feature angles of the abdomen-tuck posture are θ_31, θ_32, θ_33;
S4-2, calculating the weighted Euclidean distance between the feature angles of the posture to be measured and the standard postures: the posture to be measured is extracted frame by frame from the long jump video, and its similarity to the three standard postures (take-off, straight-body hip-extension, and abdomen-tuck) is evaluated by computing the weighted Euclidean distance between the feature angle vector formed by the feature angles of the posture to be measured and the feature angle vector of each of the three standard postures; denoting the feature angle vector of the posture to be measured by p^(i) and the corresponding standard posture feature angle vector by q^(i), the weighted Euclidean distance d_E^(i) is given by formula (3):
where i = 1, 2, 3, with 1 denoting the take-off posture, 2 the straight-body hip-extension posture, and 3 the abdomen-tuck posture; p^(i) and q^(i) are the feature angle vectors of the corresponding posture; d_E^(1) is the weighted Euclidean distance between the take-off feature angle vector of the posture to be measured and the corresponding standard take-off feature angle vector, d_E^(2) the corresponding distance for the straight-body hip-extension posture, and d_E^(3) the corresponding distance for the abdomen-tuck posture; n is the number of feature angles; w_j^(i) denotes the weight of the j-th feature angle of posture i;
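A minimal sketch of the weighted Euclidean distance of formula (3), assuming the standard definition (square root of the weighted sum of squared angle differences); the sample angles and weights below are hypothetical:

```python
import math

def weighted_euclidean(p, q, w):
    """Weighted Euclidean distance between feature angle vectors p and q.

    w[j] is the weight of the j-th feature angle of the posture class.
    """
    return math.sqrt(sum(wj * (pj - qj) ** 2 for pj, qj, wj in zip(p, q, w)))

# Hypothetical take-off feature angles (degrees) vs. the standard posture.
E = weighted_euclidean([170.0, 95.0, 40.0], [165.0, 90.0, 45.0], [0.5, 0.3, 0.2])
# Each angle differs by 5 degrees, and the weights sum to 1, so E = 5.0 here.
```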
S4-3, calculating the cosine distance: to improve the accuracy of posture matching, a threshold t^(i) is set, and the cosine distance between the coordinate vector formed by all key points of the posture to be measured and the coordinate vector of the key points of the corresponding standard posture is calculated; the cosine distance between the key point coordinate vector of the posture to be measured and that of the standard posture is given by formula (4):
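The cosine distance of formula (4) can be sketched as follows, assuming the usual definition (1 minus the cosine similarity) over the flattened key point coordinate vectors:

```python
import math

def cosine_distance(u, v):
    """1 minus the cosine similarity of two key point coordinate vectors."""
    dot = sum(a * b for a, b in zip(u, v))
    nu = math.sqrt(sum(a * a for a in u))
    nv = math.sqrt(sum(b * b for b in v))
    return 1.0 - dot / (nu * nv)

# Proportional (same-direction) vectors give distance 0;
# orthogonal vectors give distance 1.
same = cosine_distance([1.0, 2.0, 3.0], [2.0, 4.0, 6.0])
```

Because the cosine distance ignores overall scale, it complements the feature angle distance by comparing the overall shape of the pose rather than individual joint angles.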
S4-4, calculating the composite distance: considering both the weighted Euclidean distance of the feature angles and the cosine distance of the coordinate point vectors of the posture to be measured and the standard posture, the composite distance d between the posture to be measured and the standard posture is obtained as in formula (5):
where μ_i and ρ_i are the matching coefficient weights of the corresponding action posture, and i = 1, 2, 3, with 1 denoting the take-off posture, 2 the straight-body hip-extension posture, and 3 the abdomen-tuck posture;
S4-5, taking the minimum composite distance min(d^(i)) as the final matching result;
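Steps S4-4 and S4-5 can be sketched as below. The linear combination d = μ·E + ρ·C is an assumption about the omitted formula (5), and the per-frame scan with the matching threshold t is illustrative only:

```python
def match_keyframe(frame_distances, mu, rho, t=float("inf")):
    """Pick the frame whose composite distance to the standard posture is smallest.

    frame_distances: list of (E, C) pairs per video frame, where E is the
    weighted Euclidean feature angle distance and C the cosine distance.
    Returns (best_frame_index, best_d), or (None, None) if no frame passes
    the matching threshold t.
    """
    best_i, best_d = None, None
    for i, (E, C) in enumerate(frame_distances):
        d = mu * E + rho * C   # assumed linear form of the composite distance (5)
        if d <= t and (best_d is None or d < best_d):
            best_i, best_d = i, d
    return best_i, best_d

# Hypothetical per-frame distances for one posture class.
idx, d = match_keyframe([(6.0, 0.4), (3.0, 0.1), (5.0, 0.2)], mu=0.8, rho=0.2)
```

Running this once per posture class yields the three matched key frames (take-off, straight-body hip-extension, abdomen-tuck) that step S5 then scores.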
S5, evaluating the aerial technique action postures: the three postures matched in step S4 are scored;
S5-1, determining the scoring function: the three key action postures obtained by the posture matching algorithm (take-off, straight-body hip-extension, and abdomen-tuck) are evaluated and the score of each posture is output; scoring is based on the degree of similarity between the posture to be measured and the standard posture, measured by the composite distance d computed with formula (5); the smaller d is, the more similar the two postures and the higher the score, and the composite distance is converted into a score by the linear function:
Score = k × d + c (6)
wherein k and c are constants;
S5-2, calculating posture scores: to obtain the values of k and c and to improve the accuracy and objectivity of the scores, n representative straight-body long jump videos are selected, and the 3 key action postures in each video (take-off, straight-body hip-extension, and abdomen-tuck) are scored separately under interference-free conditions; each key posture is scored out of 100, and the average is taken as the final score of that posture. The composite distance d between each scored key posture in the n videos and its corresponding standard posture is then calculated, finally yielding 3 classes of samples (d_t^(i), Score_t^(i)), where i = 1, 2, 3, with 1 denoting the take-off posture, 2 the straight-body hip-extension posture, and 3 the abdomen-tuck posture, and the video number t = 1, 2, …, n; d_t^(i) denotes the composite distance, computed by formula (5), between posture i in video t and its corresponding standard posture, and Score_t^(i) is the average score assigned to that posture. Least-squares fitting is applied to each of the 3 sample classes to obtain the optimal k and c for the take-off, straight-body hip-extension, and abdomen-tuck postures respectively; the composite distance d is then computed for each matched posture and substituted into formula (6) to obtain the corresponding score.
CN202310606665.6A 2023-05-26 2023-05-26 Automatic evaluation method for motion of straight jump air technology Pending CN116758627A (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
CN202310606665.6A CN116758627A (en) 2023-05-26 2023-05-26 Automatic evaluation method for motion of straight jump air technology


Publications (1)

Publication Number Publication Date
CN116758627A true CN116758627A (en) 2023-09-15

Family

ID=87958074

Family Applications (1)

Application Number Title Priority Date Filing Date
CN202310606665.6A Pending CN116758627A (en) 2023-05-26 2023-05-26 Automatic evaluation method for motion of straight jump air technology

Country Status (1)

Country Link
CN (1) CN116758627A (en)

Cited By (2)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN118016241A (en) * 2024-04-09 2024-05-10 南京康尼机电股份有限公司 Human body movement function assessment and correction training method and system
CN118016241B (en) * 2024-04-09 2024-06-07 南京康尼机电股份有限公司 Human body movement function assessment and correction training method and system

Similar Documents

Publication Publication Date Title
CN109522850B (en) Action similarity evaluation method based on small sample learning
CN111368791B (en) Pull-up test counting method and system based on Quick-OpenPose model
CN109948459A (en) A kind of football movement appraisal procedure and system based on deep learning
CN109190446A (en) Pedestrian&#39;s recognition methods again based on triple focused lost function
CN113762133A (en) Self-weight fitness auxiliary coaching system, method and terminal based on human body posture recognition
CN111860267B (en) Multichannel body-building exercise identification method based on human body skeleton joint point positions
CN111401260B (en) Sit-up test counting method and system based on Quick-OpenPose model
CN110717392B (en) Sitting posture detection and correction method and device
CN107392939A (en) Indoor sport observation device, method and storage medium based on body-sensing technology
CN109508661B (en) Method for detecting hand lifter based on object detection and posture estimation
CN116758627A (en) Automatic evaluation method for motion of straight jump air technology
Yang et al. Human exercise posture analysis based on pose estimation
CN112446313A (en) Volleyball action recognition method based on improved dynamic time warping algorithm
CN113197572A (en) Human body work correction system based on vision
CN116844084A (en) Sports motion analysis and correction method and system integrating blockchain
CN114998986A (en) Computer vision-based pull-up action specification intelligent identification method and system
CN110309786A (en) A kind of milking sow posture conversion identification method based on deep video
CN112633083A (en) Method for detecting abnormal behaviors of multiple persons and wearing of mask based on improved Openpos examination
CN111563443A (en) Continuous motion action accuracy evaluation method
CN116189301A (en) Standing long jump motion standardability assessment method based on attitude estimation
CN115761901A (en) Horse riding posture detection and evaluation method
CN115512435A (en) Single-stage multi-person human body posture estimation method and device by using human body positioning
CN112802051B (en) Fitting method and system of basketball shooting curve based on neural network
CN115731608A (en) Physical exercise training method and system based on human body posture estimation
CN111507185B (en) Tumble detection method based on stack cavity convolution network

Legal Events

Date Code Title Description
PB01 Publication
SE01 Entry into force of request for substantive examination