CN113065505A - Body action rapid identification method and system - Google Patents


Publication number
CN113065505A
CN113065505A (application CN202110408054.1A); granted as CN113065505B
Authority
CN
China
Prior art keywords
action; joint; motion; human body; discrimination
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Granted
Application number
CN202110408054.1A
Other languages
Chinese (zh)
Other versions
CN113065505B (en)
Inventor
刘曦泽
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
China National Institute of Standardization
Original Assignee
China National Institute of Standardization
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by China National Institute of Standardization filed Critical China National Institute of Standardization
Priority to CN202110408054.1A priority Critical patent/CN113065505B/en
Publication of CN113065505A publication Critical patent/CN113065505A/en
Application granted granted Critical
Publication of CN113065505B publication Critical patent/CN113065505B/en
Active legal-status Critical Current
Anticipated expiration legal-status Critical


Classifications

    • G PHYSICS
    • G06 COMPUTING; CALCULATING OR COUNTING
    • G06V IMAGE OR VIDEO RECOGNITION OR UNDERSTANDING
    • G06V40/00 Recognition of biometric, human-related or animal-related patterns in image or video data
    • G06V40/20 Movements or behaviour, e.g. gesture recognition
    • G06V40/23 Recognition of whole body movements, e.g. for sport training
    • G PHYSICS
    • G06 COMPUTING; CALCULATING OR COUNTING
    • G06F ELECTRIC DIGITAL DATA PROCESSING
    • G06F18/00 Pattern recognition
    • G06F18/20 Analysing
    • G06F18/22 Matching criteria, e.g. proximity measures
    • G PHYSICS
    • G06 COMPUTING; CALCULATING OR COUNTING
    • G06F ELECTRIC DIGITAL DATA PROCESSING
    • G06F18/00 Pattern recognition
    • G06F18/20 Analysing
    • G06F18/24 Classification techniques
    • G06F18/241 Classification techniques relating to the classification model, e.g. parametric or non-parametric approaches

Landscapes

  • Engineering & Computer Science (AREA)
  • Theoretical Computer Science (AREA)
  • Data Mining & Analysis (AREA)
  • Computer Vision & Pattern Recognition (AREA)
  • General Physics & Mathematics (AREA)
  • Physics & Mathematics (AREA)
  • General Engineering & Computer Science (AREA)
  • Evolutionary Biology (AREA)
  • Evolutionary Computation (AREA)
  • Bioinformatics & Computational Biology (AREA)
  • Bioinformatics & Cheminformatics (AREA)
  • Artificial Intelligence (AREA)
  • Life Sciences & Earth Sciences (AREA)
  • Health & Medical Sciences (AREA)
  • General Health & Medical Sciences (AREA)
  • Psychiatry (AREA)
  • Social Psychology (AREA)
  • Human Computer Interaction (AREA)
  • Multimedia (AREA)
  • Measurement Of The Respiration, Hearing Ability, Form, And Blood Characteristics Of Living Organisms (AREA)
  • Image Analysis (AREA)

Abstract

The invention provides a method and system for rapid body action recognition, belonging to the field of body action recognition. The method comprises the following steps: selecting body action discrimination joint points from the human body joint points and establishing a simplified human joint model; taking the joint included angle parameters and joint vertical height parameters of the simplified model as action discrimination parameters; establishing a human body central projection coordinate system for acquiring the action discrimination parameters, with a body action discrimination joint point of the simplified model as the coordinate origin; establishing an action recognition model from the variation characteristics of the action discrimination parameters corresponding to each body action; and continuously acquiring action image frames, obtaining the action discrimination parameters in the human body central projection coordinate system from those frames, and recognizing specific actions with the action recognition model according to the acquired parameters. On the basis of guaranteeing the recognition rate, the method and system effectively simplify the recognition process and improve the recognition speed.

Description

Body action rapid identification method and system
Technical Field
The invention relates to the field of body action recognition, and in particular to a method and a system for rapid body action recognition.
Background
With the advancement of science and technology, computer-vision-based action recognition has become an important source of human behavior data, and also provides a basis for human-centered design decisions.
Human body action recognition methods based on joint point data fall mainly into two categories. The first classifies actions dynamically by probabilistic statistics, chiefly through methods based on dynamic Bayesian networks, hidden Markov models, and support vector machines. The second matches templates against a database of classification features; a typical representative is the dynamic time warping algorithm, which matches collected action data against standard templates in real time to achieve rapid recognition. In recent years, deep learning has also been introduced into the field of action recognition and can provide better recognition results. However, the above research all takes recognition accuracy as the primary objective; it requires large data sets and heavy computation, has long recognition times, and does not consider the requirements that different application scenarios place on the recognition target.
Application scenarios for action recognition are very rich, including event monitoring, human-computer interaction, medical rehabilitation, machine and equipment simulation, and so on. Different scenarios place different requirements on the efficiency, precision, and cost of action recognition. In most scenarios, such as human-computer interaction during physical exercise and medical rehabilitation, only main body actions such as walking, squatting, running, and jumping need to be recognized, not the detailed actions of the extremities, while the demands on recognition response time are high so as to improve interaction efficiency. In such scenarios, therefore, the need to identify the joint points of the extremities and the overall displacement of the human body is low. Previous research has rarely discussed comprehensively the effects of joint point simplification and overall displacement on recognition results, and work aimed at improving recognition speed is relatively scarce.
Disclosure of Invention
In order to solve the above problems, the present invention provides a method and a system for quickly recognizing body movements.
In order to achieve the above object, a first aspect of the present invention provides a body motion fast recognition method, including:
selecting body action discrimination joint points from the human body joint points and establishing a simplified human joint model, wherein the body action discrimination joint points at least comprise the head, the spine-shoulder intersection, the hip joint center, the left elbow, the right elbow, the left knee, the right knee, the left hand, the right hand, the left foot, and the right foot;
taking a joint included angle parameter and a joint vertical height parameter in the human body joint simplified model as action discrimination parameters;
establishing a human body center projection coordinate system (X, Y, Z) for acquiring motion discrimination parameters by taking a human body center as a coordinate origin O, wherein the human body center corresponds to a human body motion discrimination joint point in a human body joint simplification model;
establishing a motion recognition model according to the change characteristics of the motion discrimination parameters corresponding to the body motion;
and continuously acquiring action image frames, acquiring action distinguishing parameters from a human body central projection coordinate system according to the action image frames, and realizing specific action recognition by using an action recognition model according to the acquired action distinguishing parameters.
Optionally, the establishing a motion recognition model according to the variation characteristics of the motion discrimination parameters corresponding to the body motion includes:
continuously collecting action image frames, acquiring action distinguishing parameters in a plurality of action periods from a human body central projection coordinate system according to the action image frames, and extracting action distinguishing characteristic values from the acquired action distinguishing parameters in the plurality of action periods;
classifying actions, and collecting a preset number of action distinguishing characteristic values of each type of action as action samples;
establishing an action recognition model through the action sample;
the action recognition model comprises an action discriminant function and an action recognition rule base;
the action discrimination function is used for calculating an action discrimination index value according to the action discrimination characteristic value;
and the action recognition rule base is built as follows: the collected action samples of each class of action are processed by the action discriminant function to obtain a set of action discriminant index values for each class, and the rule base is established according to the differences and coverage ranges between the index value sets corresponding to each class of action.
Optionally, the motion discrimination characteristic values include an included angle characteristic value and a joint vertical height characteristic value;
the included angle characteristic value is the average, over a plurality of action periods, of the included angle parameter among the action discrimination parameters;
the joint vertical height characteristic value is the average, over a plurality of action periods, of the joint vertical height parameter among the action discrimination parameters.
Optionally, the establishing of the action identification rule base according to the difference and the coverage between the action discrimination index value sets corresponding to each type of action includes:
when there exist two classes of actions whose action discriminant index value sets intersect, so that the two classes cannot be distinguished within the intersection range, a special joint point corresponding to each class is defined according to the distinguishing characteristics of the two classes of actions;
and when action recognition cannot be performed from the action recognition rule base, the coordinates of the special joint point corresponding to the action are acquired from the human body central coordinate system, and secondary action recognition is performed in combination with those coordinates.
Optionally, the obtaining of the motion discrimination parameter from the human body central projection coordinate system according to the motion image frame, and implementing specific motion recognition by using the motion recognition model according to the obtained motion discrimination parameter includes:
acquiring action distinguishing parameters in a plurality of action periods from a human body central projection coordinate system according to the action image frame, and extracting action distinguishing characteristic values from the acquired action distinguishing parameters to input into an action recognition model;
the action discrimination function in the action recognition model calculates an action discriminant index value from the input action discrimination characteristic values, and identifies the specific action from the action recognition rule base according to the calculated index value.
Optionally, the action discriminant functions include a broad-class action comprehensive discriminant function and a specific-action comprehensive discriminant function;
the broad-class action comprehensive discriminant function calculates a broad-class action comprehensive discriminant index value from the non-dimensionalized included angle characteristic value and the non-dimensionalized joint vertical height characteristic value;
and the specific-action comprehensive discriminant function calculates a specific-action comprehensive discriminant index value from the non-dimensionalized included angle characteristic value.
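As a rough illustration of the two discriminant functions, the sketch below assumes min-max scaling as the non-dimensionalization scheme and a simple sum as the comprehensive index; the patent specifies neither form, so both are assumptions, and all function names are hypothetical.

```python
def normalize(value, vmin, vmax):
    """Min-max non-dimensionalization to [0, 1]; an assumed scheme, as the
    patent does not name one."""
    return (value - vmin) / (vmax - vmin)

def broad_class_index(angle_feat, height_feat, angle_range, height_range):
    """Broad-class comprehensive discriminant index: assumed here to be the
    sum of the non-dimensionalized angle and vertical-height values."""
    return (normalize(angle_feat, *angle_range)
            + normalize(height_feat, *height_range))

def specific_action_index(angle_feats, angle_ranges):
    """Specific-action comprehensive discriminant index from angle
    characteristic values only; the summation form is an assumption."""
    return sum(normalize(v, lo, hi)
               for v, (lo, hi) in zip(angle_feats, angle_ranges))
```

A resulting index value would then be looked up in the action recognition rule base.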
Optionally, the taking the joint included angle parameter and the joint vertical height parameter in the human body joint reduced model as the motion discrimination parameters includes:
taking the included angle between two body segments connected at the same joint in the simplified human joint model as a joint included angle parameter, wherein the joint included angle parameters comprise the included angle φ1 between the left arm and the torso, the included angle φ2 between the right arm and the torso, the included angle φ3 between the left and right legs, the right elbow joint angle φ4, the left elbow joint angle φ5, the right knee joint angle φ6, and the left knee joint angle φ7; the included angle parameters are used for identifying specific actions;
the joint vertical height parameters comprise the height from the hip joint center to the feet and the height from the head to the feet, and are used for identifying broad classes of action, wherein the broad classes comprise vertical movement and horizontal movement.
Optionally, the establishing a human body center projection coordinate system (X, Y, Z) for obtaining the motion discrimination parameter with the human body center as the origin of coordinates O, where the human body center corresponds to a human body motion discrimination joint point of the human body joint reduced model, includes:
the hip joint center point among the body action discrimination joint points is taken as the coordinate origin O of the human body central projection coordinate system; the upward vertical direction is the positive direction of the Y axis, the direction from the video acquisition device toward the hip joint center is the positive direction of the Z axis, and the direction perpendicular to the YOZ plane is the X axis.
Optionally, the video acquisition device is aimed at the hip joint center so that the XOY plane of the human body central projection coordinate system displays the side projection of the human body, and the action discrimination parameters are obtained from the XOY plane.
A second aspect of the present invention provides a rapid body action recognition system, comprising a memory and a processor, wherein computer program instructions are stored in the memory, and the processor executes the computer program instructions to implement the rapid body action recognition method described above.
Through the above technical scheme, the simplified human joint model, the human body central projection coordinate system, and the action recognition model are established, and when a body action is to be recognized, rapid recognition can be achieved simply by acquiring the action discrimination parameters and inputting them into the action recognition model. The recognition process is effectively simplified on the basis of guaranteeing the recognition rate, and the recognition speed is improved.
Additional features and advantages of embodiments of the invention will be set forth in the detailed description which follows.
Drawings
The accompanying drawings, which are included to provide a further understanding of the embodiments of the invention and are incorporated in and constitute a part of this specification, illustrate embodiments of the invention and together with the description serve to explain the embodiments of the invention without limiting the embodiments of the invention. In the drawings:
FIG. 1 is a simplified model diagram of a human joint for a method for rapidly recognizing body movement according to an embodiment of the present invention;
FIG. 2 is a schematic diagram of a central projection coordinate system of a method for rapidly recognizing body motion according to an embodiment of the present invention;
FIG. 3 is a schematic diagram of motion discrimination parameters based on a human joint reduction model according to an embodiment of the present invention;
FIG. 4 is a schematic diagram of the variation of the angle parameters φ1 and φ4 during a running action according to an embodiment of the present invention;
FIG. 5 is an image frame of a motion cycle of a method for rapidly recognizing body motion according to an embodiment of the present invention;
fig. 6 is a flowchart of a body motion fast recognition method according to an embodiment of the present invention.
Description of the reference numerals
101-head; 102-spinal shoulder intersection; 103-hip joint center; 104-left elbow;
105-right elbow; 106-left knee; 107-right knee; 108-left hand;
109-right hand; 110-left foot; 111-right foot.
Detailed Description
The following detailed description of embodiments of the invention refers to the accompanying drawings. It should be understood that the detailed description and specific examples, while indicating the present invention, are given by way of illustration and explanation only, not limitation.
Example one
One embodiment of the invention provides a body motion rapid recognition method, which comprises the following steps:
selecting body action discrimination joint points from the human body joint points and establishing a simplified human joint model, wherein the body action discrimination joint points at least comprise the head, the spine-shoulder intersection, the hip joint center, the left elbow, the right elbow, the left knee, the right knee, the left hand, the right hand, the left foot, and the right foot;
taking a joint included angle parameter and a joint vertical height parameter in the human body joint simplified model as action discrimination parameters;
establishing a human body center projection coordinate system (X, Y, Z) for acquiring motion discrimination parameters by taking a human body center as a coordinate origin O, wherein the human body center corresponds to a human body motion discrimination joint point in a human body joint simplification model;
establishing a motion recognition model according to the change characteristics of the motion discrimination parameters corresponding to the body motion;
and continuously acquiring action image frames, acquiring action distinguishing parameters from a human body central projection coordinate system according to the action image frames, and realizing specific action recognition by using an action recognition model according to the acquired action distinguishing parameters.
Because the number of joint points determines the computational complexity of action recognition, the more joint points there are, the longer the calculation period, and an action judgment may therefore not be delivered in time. The rapid body action recognition method of this embodiment selects the joint points that best represent body actions as the body action discrimination joint points and uses them to establish a simplified human joint model, which effectively reduces the number of joint points while preserving action recognition and thereby improves the speed of recognition.
In addition, the body motion fast recognition method provided by the embodiment of the invention establishes the body central projection coordinate system by taking a body motion discrimination joint point of a human body in the human body joint simplified model as the coordinate origin. When the human body acts, the human body is projected in a human body central projection coordinate system according to the human body joint simplified model, and the action judgment parameters of the human body action can be quickly and effectively acquired according to the projection of the human body in the human body central projection coordinate system without being influenced by the displacement change of the human body.
By establishing the human joint simplification model, the human body central projection coordinate system and the action recognition model, the quick body action recognition method provided by the embodiment of the invention can realize quick body action recognition only by acquiring the action discrimination parameters and inputting the action recognition model when the body action is recognized. The identification process is effectively simplified on the basis of ensuring the identification rate, and the identification speed is improved.
Fig. 1 is a schematic diagram of a simplified human joint model of a method for rapidly recognizing a body motion according to an embodiment of the present invention.
As shown in fig. 1, diagram (A) is the prior-art joint point model generated by Kinect V2. The model comprises 25 joint points in total: the hip joint center, spine center, neck, head, left shoulder, left elbow, left wrist, left hand, right shoulder, right elbow, right wrist, right hand, left hip, left knee, left ankle, left foot, right hip, right knee, right ankle, right foot, spine-shoulder intersection, left hand tip, left thumb, right hand tip, and right thumb.
Based on GB/T 10000-1988 "Human dimensions of Chinese adults" and research in human factors engineering, the 25 key points of the joint point model generated by Kinect V2 are classified hierarchically, as shown in Table 1.
The first layer comprises the torso joint points, the joints of the human trunk, which are part of the source of the characteristic information required by some actions.
The second layer comprises the limb joint points; most human movements are expressed mainly through the limbs, and these points contain a large amount of characteristic information about human movement or posture.
The third layer comprises the hand and foot joint points, which are the information source for fine capture of local motion (e.g., gesture recognition, foot movements).
A large amount of experimental research was carried out on the main body actions (such as walking, running, squatting, standing, and jumping) in scenarios such as physical exercise and medical rehabilitation. The research shows that among the first-layer joint points, the relative positions of the head, neck, spine-shoulder intersection, left shoulder, right shoulder, spine center, hip joint center, left hip, and right hip are basically fixed during movement and do not affect the judgment of body actions; among the second-layer joint points, the left wrist, right wrist, left ankle, and right ankle can be omitted without affecting action classification; and among the third-layer joint points, the details of the hands and feet have no influence on overall body action recognition.
Based on the above analysis, an embodiment of the present invention merges some adjacent joint points in the existing human joint model that have little influence on body motion recognition, simplifies the detailed joint points of hands and feet to obtain simplified joint points as shown in table 1, and obtains a human joint simplified model according to the simplified joint points. As shown in fig. 1 (B), the human joint reduction model includes a head 101, a spine-shoulder intersection 102, a hip joint center 103, a left elbow 104, a right elbow 105, a left knee 106, a right knee 107, a left hand 108, a right hand 109, a left foot 110, and a right foot 111. The human joint simplification model comprises 11 body motion distinguishing joint points.
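The reduction from the 25 Kinect V2 joints to the 11 discrimination joints described above can be sketched as follows; the string identifiers are illustrative stand-ins, not the SDK's actual joint names.

```python
# The 25 Kinect V2 joints of diagram (A) in FIG. 1 (illustrative names).
KINECT_V2_JOINTS = [
    "hip_center", "spine_center", "neck", "head",
    "left_shoulder", "left_elbow", "left_wrist", "left_hand",
    "right_shoulder", "right_elbow", "right_wrist", "right_hand",
    "left_hip", "left_knee", "left_ankle", "left_foot",
    "right_hip", "right_knee", "right_ankle", "right_foot",
    "spine_shoulder", "left_hand_tip", "left_thumb",
    "right_hand_tip", "right_thumb",
]

# The 11 body action discrimination joints of diagram (B) in FIG. 1.
SIMPLIFIED_JOINTS = [
    "head", "spine_shoulder", "hip_center",
    "left_elbow", "right_elbow", "left_knee", "right_knee",
    "left_hand", "right_hand", "left_foot", "right_foot",
]

def simplify(skeleton):
    """Keep only the 11 discrimination joints from a full 25-joint
    skeleton given as a dict of joint name -> coordinates."""
    return {name: skeleton[name] for name in SIMPLIFIED_JOINTS}
```

Dropping 14 of the 25 joints per frame is where the reduction in per-frame computation comes from.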
TABLE 1
Specifically, as shown in Table 1, according to the hierarchical division of the simplified joint points, joint points of higher levels may be defined and identified preferentially; for example, vertical or horizontal motion may be determined from the change in the vertical heights of the higher-level head and hip joint center.
Specifically, when the body moves, different included angles formed between limbs of the body along with the position change of joint points of the body are defined as joint included angle parameters, and the joint included angle parameters and the joint vertical height parameters in the human body joint simplification model are used as motion judgment parameters.
Fig. 2 is a schematic diagram of the central projection coordinate system of the rapid body action recognition method according to an embodiment of the present invention. As shown in fig. 2, existing human action recognition is generally established on an image coordinate system, and the data acquired in that system reflects both the relative motion of each body part and the overall displacement of the body. In action recognition, the overall displacement helps only in recognizing the direction of movement and is otherwise an invalid parameter. The projection of the moving body in the human body central projection coordinate system captures the changes of the joint included angle parameters and joint vertical height parameters while stripping away the interference of the body's displacement, concentrating the recognition process on the motion itself and further improving the accuracy and speed of action recognition.
Further, as shown in fig. 2, an embodiment of the present invention establishes the human body central projection coordinate system with the hip joint center as the coordinate origin O: the upward vertical direction is the positive Y axis, the direction from the video capture device toward the hip joint center is the positive Z axis, and the direction perpendicular to the YOZ plane is the X axis. The joint included angle parameters and joint vertical height parameters among the action discrimination parameters are acquired from the projection of the human body on the XOY plane of this coordinate system.
Furthermore, the video acquisition device is aimed at the hip joint center so that the XOY plane of the human body central projection coordinate system displays the side projection of the human body, from which the action discrimination parameters are then obtained.
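A minimal sketch of moving camera-space joints into the body-centred system and projecting them onto the XOY plane might look like this; it assumes the camera axes are already aligned with the Y-up, Z-toward-hip convention above, so only a translation is needed, and the joint names are hypothetical.

```python
def to_body_centered(frame):
    """Translate camera-space joint coordinates so that the hip joint
    center becomes the origin O of the body-centred system."""
    ox, oy, oz = frame["hip_center"]
    return {name: (x - ox, y - oy, z - oz)
            for name, (x, y, z) in frame.items()}

def project_xoy(frame):
    """Side projection onto the XOY plane: drop the Z coordinate."""
    return {name: (x, y) for name, (x, y, z) in frame.items()}
```

After this step, the body's overall displacement no longer appears in the coordinates, which is the "stripping" of displacement described above.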
Fig. 3 is a schematic diagram of motion discrimination parameters based on a human joint reduction model according to an embodiment of the present invention.
As shown in fig. 3, the included angle between two body segments connected at the same joint in the simplified human joint model is taken as a joint included angle parameter. The joint included angle parameters comprise the included angle φ1 between the left arm and the torso, the included angle φ2 between the right arm and the torso, the included angle φ3 between the left and right legs, the right elbow joint angle φ4, the left elbow joint angle φ5, the right knee joint angle φ6, and the left knee joint angle φ7. The angle parameters may be used to identify specific actions.
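A joint included angle such as φ4 can be computed from the projected 2-D coordinates of the joint and the endpoints of its two segments. The function below is a generic angle-between-vectors sketch, not code from the patent.

```python
import math

def joint_angle(a, joint, b):
    """Angle in degrees at `joint` between the segments joint->a and
    joint->b, given 2-D coordinates projected onto the XOY plane."""
    v1 = (a[0] - joint[0], a[1] - joint[1])
    v2 = (b[0] - joint[0], b[1] - joint[1])
    dot = v1[0] * v2[0] + v1[1] * v2[1]
    n1 = math.hypot(*v1)
    n2 = math.hypot(*v2)
    return math.degrees(math.acos(dot / (n1 * n2)))
```

For example, φ4 would be evaluated at the right elbow with the endpoints of the upper-arm and forearm segments as `a` and `b`.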
As shown in fig. 3, the joint vertical height parameters, comprising the height h of the hip joint center from the feet and the height H of the head from the feet, may be used to identify a broad class of motion, including vertical and horizontal motion.
Specifically, obtain the Y-axis coordinate y_head of the head, the Y-axis coordinate y_center of the hip joint center, and the mean y_foot of the Y-axis coordinates of the left and right feet in the human body central projection coordinate system. The height from the hip joint center to the feet is h = |y_center - y_foot|, and the height from the head to the feet is H = |y_head - y_foot|.
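The h and H computation above translates directly into code; the dictionary keys are hypothetical joint names, and coordinates are (x, y) projections on the XOY plane.

```python
def vertical_heights(frame):
    """Compute h = |y_center - y_foot| and H = |y_head - y_foot| from a
    dict of joint name -> (x, y) in the body-centred projection system."""
    y_foot = (frame["left_foot"][1] + frame["right_foot"][1]) / 2.0
    h = abs(frame["hip_center"][1] - y_foot)  # hip joint center to feet
    H = abs(frame["head"][1] - y_foot)        # head to feet
    return h, H
```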
Further, according to the variation characteristics of the motion discrimination parameters corresponding to the body motion, establishing a motion recognition model, including:
continuously collecting action image frames, acquiring action distinguishing parameters in a plurality of action periods from a human body central projection coordinate system according to the action image frames, and extracting action distinguishing characteristic values from the acquired action distinguishing parameters in the plurality of action periods;
classifying actions, and collecting a preset number of action distinguishing characteristic values of each type of action as action samples;
establishing an action recognition model through the action sample;
the action recognition model comprises an action discriminant function and an action recognition rule base;
the action discrimination function is used for calculating an action discrimination index value according to the action discrimination characteristic value;
and the action recognition rule base is built as follows: the collected action samples of each class of action are processed by the action discriminant function to obtain a set of action discriminant index values for each class, and the rule base is established according to the differences and coverage ranges between the index value sets corresponding to each class of action.
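One plausible realization of such a rule base is an interval per action class covering that class's discriminant index values; the interval form is an assumption (the patent speaks only of differences and coverage ranges), and the names are hypothetical.

```python
def build_rule_base(samples):
    """samples: dict of action label -> list of discriminant index values
    computed from that class's action samples. The rule base records the
    interval covered by each class."""
    return {label: (min(vals), max(vals)) for label, vals in samples.items()}

def classify(index_value, rule_base):
    """Return all labels whose interval covers the index value. An empty
    or multi-label result corresponds to the case where the rule base
    alone cannot decide, triggering the secondary recognition via
    special joint points described above."""
    return [label for label, (lo, hi) in rule_base.items()
            if lo <= index_value <= hi]
```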
Further, the action discrimination characteristic values are the values that best represent the action characteristics, and comprise included angle characteristic values and joint vertical height characteristic values;
the included angle characteristic value is the average, over a plurality of action periods, of the included angle parameter among the action discrimination parameters;
the joint vertical height characteristic value is the average, over a plurality of action periods, of the joint vertical height parameter among the action discrimination parameters.
Specifically, the selection rule for the included angle parameter of each motion period in the included angle characteristic value is as follows: if, before the action, the included angle parameter is at a larger value, so that within the action period the included angle changes from large to small and then from small to large, the minimum value of the included angle parameter in that period is selected; conversely, if, before the action, the included angle parameter is at a smaller value, so that within the action period the included angle changes from small to large and then from large to small, the maximum value of the included angle parameter in that period is selected.
Similarly, the selection rule for the joint vertical height parameter of each motion period in the joint vertical height characteristic value is as follows: if, before the action, the joint vertical height parameter is at a larger value, so that within the action period the joint vertical height changes from large to small and then from small to large, the minimum value of the parameter in that period is selected; conversely, if, before the action, the parameter is at a smaller value, so that within the action period the joint vertical height changes from small to large and then from large to small, the maximum value of the parameter in that period is selected.
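Assuming the per-period parameter samples have already been extracted from the image frames, the selection rules above can be sketched as follows; the comparison of the pre-action (resting) value against the midpoint of the parameter range stands in for the "larger value / smaller value before the action" criterion, and all names are illustrative:

```python
def select_period_feature(samples, resting_value, value_range):
    """Pick the characteristic value of one motion period.

    If the parameter rests at a larger value before the action (it will
    dip and recover within the period), take the period minimum; if it
    rests at a smaller value (it will rise and fall back), take the
    period maximum.
    """
    lo, hi = value_range
    midpoint = (lo + hi) / 2.0
    return min(samples) if resting_value >= midpoint else max(samples)

def feature_value(periods, resting_value, value_range):
    """Average the selected per-period values over several action periods."""
    picks = [select_period_feature(p, resting_value, value_range)
             for p in periods]
    return sum(picks) / len(picks)
```

For the right elbow included angle φ4, which rests near 180° when standing, this selects the period minimum, matching the running example discussed below.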
FIG. 4 is a schematic diagram of the variation of the included angle parameter φ1 and the included angle parameter φ4 in a running action, provided according to one embodiment of the present invention. Image frames are collected during running, and the included angle parameters are obtained from the projection of the human body in the human body central projection coordinate system in those image frames. As shown in FIG. 4, during running the included angle parameter φ1 between the left arm and the trunk and the included angle parameter φ4 of the right elbow joint vary cyclically with the motion period. As shown in FIG. 4(A), the included angle parameter φ1 within a running action period changes from small to large and then from large to small, so according to the selection rule for the included angle parameter characteristic value, the maximum value in each running action period is selected as the characteristic value of φ1. Similarly, as shown in FIG. 4(B), the included angle parameter φ4 within a running action period changes from large to small and then from small to large, so the minimum value in each running action period is selected as the characteristic value of φ4. The point corresponding to each dotted line in FIG. 4 is the point corresponding to the included angle parameter characteristic value in one action period.
Specifically, regarding the collection of action image frames, Total Video Converter V3.11 video/audio conversion software can be used to convert the video shot by the video capture device into still image frames in JPG format. The video is segmented using the duration of one action period of each action as the standard, and the complete cyclic action contained in each video segment is framed at a fixed time interval per frame.
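The framing step described above can be sketched with OpenCV; this is an assumption for illustration, since the source names Total Video Converter as its tool:

```python
import os

def frame_step(fps, interval_s):
    """Number of source frames between two saved still images."""
    return max(1, int(round(fps * interval_s)))

def extract_frames(video_path, out_dir, interval_s=0.2):
    """Split a video into still JPG frames at a fixed time interval."""
    import cv2  # deferred so frame_step stays usable without OpenCV
    os.makedirs(out_dir, exist_ok=True)
    cap = cv2.VideoCapture(video_path)
    fps = cap.get(cv2.CAP_PROP_FPS) or 30.0
    step = frame_step(fps, interval_s)
    idx = saved = 0
    while True:
        ok, frame = cap.read()
        if not ok:
            break
        if idx % step == 0:
            cv2.imwrite(os.path.join(out_dir, f"frame_{saved:04d}.jpg"), frame)
            saved += 1
        idx += 1
    cap.release()
    return saved
```

Each video segment covering one action period would be passed through `extract_frames` to obtain the per-period image frames.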
Fig. 5 shows image frames of one motion period in a body action rapid recognition method according to an embodiment of the present invention. As shown in Fig. 5, image frames of one motion period of walking, jumping, squatting-to-standing and running are illustrated; the action discrimination characteristic values of a plurality of motion periods can be obtained from the projection of multiple image frames onto the human body central projection coordinate system.
Further, classifying the actions and collecting a preset number of action discrimination characteristic values for each type of action as action samples comprises the following steps:
classifying the actions according to the direction of motion into large classes (such as vertical motion and horizontal motion), and subdividing each large class into subclasses (for example, vertical motion includes jumping and squatting-to-standing actions, and horizontal motion includes running and walking actions);
selecting a plurality of persons as action sample collection objects according to the actual requirements of the sample experiment, and collecting a preset number of action discrimination characteristic values for each type of action as its action samples.
Preferably, the more people selected as collection objects and the larger the preset number for each type of action, the more representative and diverse the action samples are, and the more accurate the action recognition model established from them.
Preferably, in order to ensure the quality of the action samples, the video capture device is positioned so that it always shoots the person from the side: for example, the direction of motion during walking or running forms a 90-degree angle with the direction of the camera lens, and the person's frontal direction forms a 90-degree angle with the camera lens during actions such as jumping.
Preferably, to ensure the quality of the action samples, each speed-related action is divided by speed (e.g., slow, normal, fast), action samples are collected at each speed, and the samples collected at all speeds are combined as the final action samples of that action. As shown in Table 2 below, the preset sample number for each type of action is set to 100, and 100 action discrimination characteristic values are collected as the action samples of each type; since walking and running are speed-related, the walking and running actions are divided by speed and sampled at each speed.
[Table 2 is rendered as an image in the source; it lists, for each type of action (with walking and running subdivided by speed), the preset sample number of 100 action discrimination characteristic values.]
TABLE 2
Further, the action discriminant function comprises a large-class action comprehensive discriminant function and a specific action comprehensive discriminant function;
the large-class action comprehensive discriminant function is used for calculating a large-class action comprehensive discriminant index value according to the numerical value after the non-dimensionalization processing of the included angle characteristic value and the numerical value after the non-dimensionalization processing of the joint vertical height characteristic value;
and the specific action comprehensive discriminant function is used for calculating a specific action comprehensive discriminant index value according to the numerical value after the non-dimensionalization processing of the included angle characteristic value.
One embodiment of the present invention provides a method for non-dimensionalizing an included angle characteristic value and a joint vertical height characteristic value.
Specifically, one embodiment of the present invention provides the non-dimensionalization processing of the joint vertical height characteristic value, in which the ratio of the height H from the center of the hip joint to the foot to the height h from the head to the foot is used as the central parameter η, as shown in formula (1):
η = H / h    (1)
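Given the joint Y coordinates in the human body central projection coordinate system, the central parameter of formula (1) can be sketched as follows (the coordinate argument names are illustrative):

```python
def central_parameter(hip_y, head_y, foot_y):
    """η = H / h: hip-centre-to-foot height over head-to-foot height.

    Y values are taken along the vertical axis of the human body central
    projection coordinate system (Y up). Standing naturally, the hip
    centre sits near half the body height, so η ≈ 1/2.
    """
    H = hip_y - foot_y   # height from hip-joint centre to foot
    h = head_y - foot_y  # height from head to foot
    return H / h
```

During a jump or a squat-to-stand the hip centre moves markedly relative to total height, so η departs from 1/2; during walking or running it stays close to 1/2.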
The upper and lower values of each angle parameter under different types of actions are defined as shown in table 3 below.
Angle Lower limit/° (state) Upper limit/° (state) Range/°
φ1 0 (standing) 180 (jumping) 0~180
φ2 0 (standing) 180 (jumping) 0~180
φ3 0 (standing) 60 (running) 0~60
φ4 35 (run/jump) 180 (standing) 40~180
φ5 35 (run/jump) 180 (standing) 40~180
φ6 30 (jumping/squatting) 180 (standing) 30~180
φ7 30 (jumping/squatting) 180 (standing) 30~180
TABLE 3
According to the collected action samples of each type of action, the average of the included angle characteristic value of each angle in each type of action is calculated and taken as the angle reference value of that angle under that type of action. Likewise, according to the joint vertical height characteristic values in the action samples of each type of action, the average of the central parameter of each type of action is calculated by formula (1) and taken as the central reference value of that type of action. As shown in Table 4, the angle reference values of each angle and the central reference values are listed for the four types of actions: walking, running, squatting-to-standing and jumping.
[Table 4 is rendered as an image in the source; it lists the angle reference value of each included angle and the central reference value for walking, running, squatting-to-standing and jumping.]
TABLE 4
Using the included angle parameter range of each angle given in Table 3, non-dimensionalization processing can be performed on the large-class action angles and the specific action angles respectively, as shown in formulas (2) and (3).
ε_i = (φ_i − φ_n) / (φ_max − φ_min)    (2)
μ_i = (φ_i − φ_ref) / (φ_max − φ_min)    (3)
where φ_i represents the measured included angle characteristic value of the corresponding angle, φ_ref is the angle reference value of that angle (see Table 4), φ_n is the included angle of each joint when the human body stands naturally, and φ_max and φ_min are the theoretical limit values of each joint angle during human motion (see Table 3).
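Formulas (2) and (3) are rendered as images in the source; assuming one plausible reading consistent with the variables defined above (deviation from the standing angle for the large-class case, deviation from the per-action reference angle for the specific case, each scaled by the theoretical range), the non-dimensionalization can be sketched as:

```python
def nondim_general(phi, phi_n, phi_max, phi_min):
    """ε: deviation of the measured angle from the natural standing
    angle, scaled by the theoretical angle range (formula (2), as
    reconstructed above)."""
    return (phi - phi_n) / (phi_max - phi_min)

def nondim_specific(phi, phi_ref, phi_max, phi_min):
    """μ: deviation of the measured angle from the per-action reference
    angle, scaled by the theoretical angle range (formula (3), as
    reconstructed above)."""
    return (phi - phi_ref) / (phi_max - phi_min)
```

Either function maps an angle in degrees to a dimensionless value, so angles with very different ranges (e.g. φ3 at 0~60° and φ1 at 0~180°) become comparable in the weighted sums of formulas (4) and (5).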
Further, when the human body stands still, the central parameter η ≈ 1/2; the central parameter changes little during walking or running but changes significantly during jumping or squatting-to-standing. Two types of discriminant functions are therefore established: a large-class action comprehensive discriminant function, as shown in formula (4) below, and a specific action comprehensive discriminant function, as shown in formula (5) below. The large-class action comprehensive discriminant function is based on the central parameter and the included angle parameters; twice the central parameter can be used as the denominator to enhance the discrimination significance, and the function is used to discriminate the large classes of horizontal and vertical motion. The specific action comprehensive discriminant function is based on the included angle parameters and is used to identify specific actions.
F1 = (Σ_j w_j · ε_j) / (2η)    (4)
F2 = Σ_j w_j · μ_j    (5)
where w_j represents the weight of each parameter and can be calculated by the analytic hierarchy process, ε_j and μ_j are the non-dimensionalized angle values, η is the measured central parameter, F1 is the large-class action comprehensive discrimination index value, and F2 is the specific action comprehensive discrimination index value. F1 is used to judge whether the central parameter changes significantly, that is, whether the action belongs to horizontal motion (such as walking or running) or vertical motion (such as jumping or squatting-to-standing); F2 is used to determine the specific action.
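The source renders formulas (4) and (5) as images; assuming the structure described in the text (a weighted sum of the dimensionless angle values, with twice the central parameter as the denominator of the large-class function), the two discrimination indexes can be sketched as:

```python
def f1_general(eps, weights, eta):
    """Large-class comprehensive discrimination index: weighted sum of
    the dimensionless angles ε_j, divided by 2η as described in the
    text for formula (4)."""
    return sum(w * e for w, e in zip(weights, eps)) / (2.0 * eta)

def f2_specific(mu, weights):
    """Specific-action comprehensive discrimination index: weighted sum
    of the dimensionless angles μ_j (formula (5))."""
    return sum(w * m for w, m in zip(weights, mu))
```

Because η stays near 1/2 for horizontal motion, the 2η denominator is close to 1 for walking or running, while a vertical motion pulls η away from 1/2 and shifts F1 noticeably.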
Further, combining formulas (1), (2) and (3), the included angle characteristic values and joint vertical height characteristic values in the collected action samples of each type of action are non-dimensionalized; the action discrimination indexes are then calculated from the non-dimensionalized data of each type of action using the discriminant functions of formulas (4) and (5), yielding a large-class comprehensive discrimination index value set and a specific action comprehensive discrimination index value set for each type of action. The action recognition rule base is established according to the thresholds and ranges of the large-class and specific action comprehensive discrimination index value sets calculated from the collected action samples of each type of action.
For example, the set of large-class comprehensive discrimination index values F1 and the set of specific action comprehensive discrimination index values F2 are calculated for each of the running, walking, jumping and squatting-to-standing actions; the ranges of F1 and F2 corresponding to each type of action are determined from the F1 and F2 sets; and the action recognition rule base is established according to the F1 range and F2 range corresponding to each type of action, as shown in Table 5 below.
[Table 5 is rendered as an image in the source; it lists the value ranges of F1 and F2 corresponding to each type of action in the action recognition rule base.]
TABLE 5
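Rule matching against the rule base amounts to interval tests on (F1, F2). A minimal sketch follows; the numeric ranges are illustrative placeholders only, since the actual thresholds of Table 5 are rendered as an image in the source:

```python
RULES = {
    # Placeholder ranges for illustration; real values come from Table 5.
    "walk":  {"F1": (0.0, 0.3), "F2": (0.0, 0.2)},
    "run":   {"F1": (0.0, 0.3), "F2": (0.2, 0.6)},
    "squat": {"F1": (0.3, 1.0), "F2": (0.6, 0.9)},
    "jump":  {"F1": (0.3, 1.0), "F2": (0.8, 1.2)},
}

def match_action(f1, f2, rules=RULES):
    """Return every action whose (F1, F2) ranges contain the measured
    values; more than one hit means the ranges intersect and secondary
    recognition via special joint points is needed."""
    hits = []
    for action, r in rules.items():
        lo1, hi1 = r["F1"]
        lo2, hi2 = r["F2"]
        if lo1 <= f1 <= hi1 and lo2 <= f2 <= hi2:
            hits.append(action)
    return hits
```

With these placeholder ranges, a measurement falling in the overlap of the squat and jump F2 intervals returns both candidates, which is exactly the case the special joint points are defined for.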
Further, establishing an action identification rule base according to the difference and the coverage range between the action discrimination index value sets corresponding to each type of action, including:
when the action discrimination index value sets of two types of actions intersect and the two types cannot be distinguished within the intersection range, defining a special joint point for each of the two types of actions according to their distinguishing characteristics;
the special joint points are used, when action recognition cannot be completed from the action recognition rule base alone, to acquire from the human body central coordinate system the coordinates of the special joint point corresponding to the action, and to perform secondary action recognition in combination with those coordinates.
As shown in Table 5, the value ranges of the specific action comprehensive discrimination index value F2 for the jumping and squatting-to-standing actions intersect. When the F2 value calculated during an actual measurement falls within the intersection range, the rule base shown in Table 5 cannot identify which type the action belongs to. Therefore, according to the distinguishing characteristic between jumping and squatting-to-standing (during jumping the left and right hand joint points rise above the head, while during squatting-to-standing they do not), whether the action is jumping or squatting-to-standing can be further judged by checking whether the Y coordinate of the wrist joints is higher than the Y coordinate of the head.
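The secondary judgment can be sketched as follows (joint argument names and the Y-up convention follow the human body central projection coordinate system; the function and label names are illustrative):

```python
def secondary_recognition(wrist_left_y, wrist_right_y, head_y):
    """Disambiguate jumping from squatting-to-standing: during a jump
    the hand/wrist joint points rise above the head, whereas when
    standing up from a squat they do not."""
    if max(wrist_left_y, wrist_right_y) > head_y:
        return "jump"
    return "squat_to_stand"
```

This check is only invoked when the (F1, F2) rule matching returns more than one candidate action.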
Fig. 6 is a flowchart of a body action rapid recognition method according to an embodiment of the present invention. As shown in Fig. 6, the method is divided into 3 stages, including:
Stage 1: establishing the simplified human body joint model, defining the action discrimination parameters and establishing the human body central projection coordinate system, comprising: acquiring a motion video, extracting the joint points to establish the simplified human body joint model, establishing the human body central projection coordinate system according to the simplified model, and defining the joint included angle parameters and the central parameter;
Stage 2: designing the action discriminant functions, obtaining the angle reference values and central reference values from the collected action samples of each type of action, substituting the reference values into the action discriminant functions, determining the action discrimination thresholds and the ranges of the action discrimination index values, and establishing the action classification library and the action recognition rule base accordingly;
Stage 3: actual action recognition, namely acquiring the coordinates of the human body joint points in real time, obtaining the joint included angle characteristic values and joint vertical height characteristic values, calculating the action discrimination index values, and matching them against the action recognition rule base; if the matching succeeds, the action recognition result is output, and if the matching fails, the special joint point coordinates of the action are acquired for secondary recognition.
According to the body action rapid recognition method provided by the embodiment of the invention, a simplified human body joint model and a human body central projection coordinate system are established, the included angle parameters and joint vertical height parameters are defined as action discrimination parameters, the action discriminant functions are designed, the actions are classified, a preset number of action discrimination characteristic values are collected for each type of action as its action samples, and the action recognition rule base is established according to the collected action samples and the action discriminant functions. In an actual test, action recognition is achieved simply by acquiring the action discrimination characteristic values of the action, calculating the action discrimination index values with the action discriminant functions, and matching the calculated values against the established action recognition rule base.
According to the body action rapid recognition method provided by the embodiment of the invention, the parameter reference values are obtained from the action samples and the action recognition rule base is established from them; in the actual recognition process, rapid body action recognition is achieved merely by passing the collected action discrimination parameters through a simple calculation of the action discriminant functions and then matching the result against the action recognition rule base. The invention does not require a large amount of data as a database for one-dimensional or two-dimensional matching; the discrimination is simple and the computational load is small.
As shown in Table 6, on the basis of ensuring the action recognition rate, the method provided by the present invention effectively simplifies the recognition process, increases the recognition speed, and keeps the recognition time stable.
[Table 6 is rendered as an image in the source; it lists the recognition rate and recognition time results of the method.]
TABLE 6
The embodiment of the invention also provides a body action rapid recognition system, comprising a memory and a processor, wherein computer program instructions are stored in the memory, and the processor executes the computer program instructions in the memory to implement the body action rapid recognition method described above.
Those skilled in the art will appreciate that all or part of the steps of the method in the above embodiments may be implemented by a program, which is stored in a storage medium and includes several instructions to enable a single-chip microcomputer, a chip, or a processor to execute all or part of the steps of the method according to the embodiments of the present invention. The aforementioned storage medium includes various media capable of storing program code, such as a U-disk, a removable hard disk, a Read-Only Memory (ROM), a Random Access Memory (RAM), a magnetic disk, or an optical disk.
While the embodiments of the present invention have been described in detail with reference to the accompanying drawings, the embodiments of the present invention are not limited to the details of the above embodiments, and various simple modifications can be made to the technical solution of the embodiments of the present invention within the technical idea of the embodiments of the present invention, and the simple modifications are within the scope of the embodiments of the present invention. It should be noted that the various features described in the above embodiments may be combined in any suitable manner without departing from the scope of the invention. In order to avoid unnecessary repetition, the embodiments of the present invention will not be described separately for the various possible combinations.
In addition, any combination of the various embodiments of the present invention is also possible, and the same should be considered as disclosed in the embodiments of the present invention as long as it does not depart from the spirit of the embodiments of the present invention.

Claims (10)

1. A method for quickly recognizing body motion, which is characterized by comprising the following steps:
selecting body motion distinguishing joint points in the human body joint points, and establishing a human body joint simplification model, wherein the body motion distinguishing joint points at least comprise human body head, spine and shoulder intersection points, hip joint center points, left elbows, right elbows, left knees, right knees, left hands, right hands, left feet and right feet;
taking a joint included angle parameter and a joint vertical height parameter in the human body joint simplified model as action discrimination parameters;
establishing a human body center projection coordinate system (X, Y, Z) for acquiring motion discrimination parameters by taking a human body center as a coordinate origin O, wherein the human body center corresponds to a human body motion discrimination joint point in a human body joint simplification model;
establishing a motion recognition model according to the change characteristics of the motion discrimination parameters corresponding to the body motion;
and continuously acquiring action image frames, acquiring action distinguishing parameters from a human body central projection coordinate system according to the action image frames, and realizing specific action recognition by using an action recognition model according to the acquired action distinguishing parameters.
2. The method for rapidly recognizing body movement according to claim 1, wherein the establishing of the movement recognition model according to the variation characteristics of the movement discrimination parameters corresponding to the body movement comprises:
continuously collecting action image frames, acquiring action distinguishing parameters in a plurality of action periods from a human body central projection coordinate system according to the action image frames, and extracting action distinguishing characteristic values from the acquired action distinguishing parameters in the plurality of action periods;
classifying actions, and collecting a preset number of action distinguishing characteristic values of each type of action as action samples;
establishing an action recognition model through the action sample;
the action recognition model comprises an action discriminant function and an action recognition rule base;
the action discrimination function is used for calculating an action discrimination index value according to the action discrimination characteristic value;
and the action recognition rule base is established by applying the action discriminant function to the collected action samples of each type of action to obtain an action discrimination index value set for each type of action, and then according to the differences and coverage ranges between the action discrimination index value sets corresponding to each type of action.
3. The method for rapidly recognizing body motion according to claim 2, wherein the motion discrimination feature values include an included angle feature value and a joint vertical height feature value;
the included angle characteristic value is the average, over a plurality of action periods, of the included angle parameter selected from the action discrimination parameters in each action period;
the joint vertical height characteristic value is the average, over a plurality of action periods, of the joint vertical height parameter selected from the action discrimination parameters in each action period.
4. The method for rapidly recognizing body actions according to claim 3, wherein the step of establishing an action recognition rule base according to differences and coverage ranges between action discriminant index value sets corresponding to each type of actions comprises:
when two types of actions with intersection exist in the action judging index value set, and the two types of actions cannot be distinguished in the intersection range, defining a special joint point corresponding to each type of action according to distinguishing characteristics of the two types of actions;
and the special joint points are used for acquiring the coordinates of the special joint points corresponding to the action from the human body central coordinate system when the action identification can not be carried out according to the action identification rule base, and carrying out secondary action identification by combining the coordinates of the special joint points.
5. The method for rapidly recognizing body motion according to claim 4, wherein the obtaining of motion discrimination parameters from a human body central projection coordinate system according to motion image frames and the implementation of specific motion recognition using a motion recognition model according to the obtained motion discrimination parameters comprises:
acquiring action distinguishing parameters in a plurality of action periods from a human body central projection coordinate system according to the action image frame, and extracting action distinguishing characteristic values from the acquired action distinguishing parameters to input into an action recognition model;
the action discrimination function in the action discrimination model calculates an action discrimination index value according to the input action discrimination feature value, and identifies a specific action from the action discrimination rule base according to the calculated action discrimination index value.
6. The method for rapidly recognizing body motion according to claim 5, wherein the motion discriminant function comprises a general motion comprehensive discriminant function and a specific motion comprehensive discriminant function;
the large-class action comprehensive discriminant function is used for calculating a large-class action comprehensive discriminant index value according to the numerical value after the non-dimensionalization processing of the included angle characteristic value and the numerical value after the non-dimensionalization processing of the joint vertical height characteristic value;
and the specific action comprehensive discriminant function is used for calculating a specific action comprehensive discriminant index value according to the numerical value after the non-dimensionalization processing of the included angle characteristic value.
7. The method for rapidly recognizing body motion according to any one of claims 1-6, wherein the step of using the joint angle parameter and the joint vertical height parameter in the human body joint reduced model as motion discrimination parameters comprises:
taking the included angle parameter between two body segments connected by the same joint in the simplified human body joint model as a joint included angle parameter, wherein the joint included angle parameters comprise the included angle φ1 between the left arm and the trunk, the included angle φ2 between the right arm and the trunk, the included angle φ3 between the left leg and the right leg, the right elbow joint included angle φ4, the left elbow joint included angle φ5, the right knee joint included angle φ6 and the left knee joint included angle φ7, the included angle parameters being used for identifying specific actions;
the joint vertical height parameters comprise the height from the center of the hip joint to the feet and the height from the head to the feet, and are used for identifying large actions, wherein the large actions comprise vertical movement and horizontal movement.
8. The method for rapidly recognizing body motion according to claim 7, wherein the establishing a projected coordinate system (X, Y, Z) of the body center for obtaining the motion-determining parameter with the body center as the origin of coordinates O, the body center corresponding to a body motion-determining joint point of the simplified model of the body joint comprises:
the central point of the hip joint of the human body among the body action discrimination joint points is taken as the coordinate origin O of the human body central projection coordinate system; the direction perpendicular to the horizontal plane and pointing upward is the positive Y-axis direction; the direction in which the video capture device points toward the center of the hip joint is the positive Z-axis direction; and the direction perpendicular to the YOZ plane is the X-axis direction.
9. The method for rapidly identifying body movement according to claim 8, wherein the video capture device is set to point in the direction of the center of the hip joint, so that the XOY plane in the projection coordinate system of the center of the human body displays the side projection of the human body, and the movement discrimination parameters are obtained from the XOY plane.
10. A system for rapid recognition of body movements, the system comprising: a memory having computer program instructions stored therein and a processor executing the computer program instructions in the memory to implement the method of rapid body motion recognition according to any of claims 1-9.
CN202110408054.1A 2021-04-15 2021-04-15 Method and system for quickly identifying body actions Active CN113065505B (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
CN202110408054.1A CN113065505B (en) 2021-04-15 2021-04-15 Method and system for quickly identifying body actions

Applications Claiming Priority (1)

Application Number Priority Date Filing Date Title
CN202110408054.1A CN113065505B (en) 2021-04-15 2021-04-15 Method and system for quickly identifying body actions

Publications (2)

Publication Number Publication Date
CN113065505A true CN113065505A (en) 2021-07-02
CN113065505B CN113065505B (en) 2023-05-09

Family

ID=76566834

Family Applications (1)

Application Number Title Priority Date Filing Date
CN202110408054.1A Active CN113065505B (en) 2021-04-15 2021-04-15 Method and system for quickly identifying body actions

Country Status (1)

Country Link
CN (1) CN113065505B (en)

Cited By (4)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN113627365A (en) * 2021-08-16 2021-11-09 南通大学 Group movement identification and time sequence analysis method
CN115223240A (en) * 2022-07-05 2022-10-21 北京甲板智慧科技有限公司 Motion real-time counting method and system based on dynamic time warping algorithm
CN115861381A (en) * 2023-02-28 2023-03-28 中国民用航空飞行学院 Detection method and system for fishing jump action standard in ball body cooperative motion
CN115966016A (en) * 2022-12-19 2023-04-14 天翼爱音乐文化科技有限公司 Jumping state identification method and system, electronic equipment and storage medium

Citations (10)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
JP2012065723A (en) * 2010-09-21 2012-04-05 Dainippon Printing Co Ltd Walking state display system or the like
CN103390174A (en) * 2012-05-07 2013-11-13 深圳泰山在线科技有限公司 Physical education assisting system and method based on human body posture recognition
JP2015077643A (en) * 2013-10-16 2015-04-23 国立大学法人信州大学 Parameter identification method for vertical multijoint hydraulic manipulator, identification apparatus, and identification program
CN106022213A (en) * 2016-05-04 2016-10-12 北方工业大学 Human body motion recognition method based on three-dimensional skeleton information
CN107301370A (en) * 2017-05-08 2017-10-27 上海大学 Body action recognition method based on the Kinect three-dimensional skeleton model
CN107930048A (en) * 2017-03-20 2018-04-20 深圳市太空科技南方研究院 Space somatosensory recognition and motion analysis system and method
CN109086706A (en) * 2018-07-24 2018-12-25 西北工业大学 Action recognition method based on a segmented human body model, applied to human-robot collaboration
CN109308438A (en) * 2017-07-28 2019-02-05 上海形趣信息科技有限公司 Method for building an action recognition library, electronic device, and storage medium
CN109344694A (en) * 2018-08-13 2019-02-15 西安理工大学 Real-time recognition method for basic human body actions based on a three-dimensional human skeleton
CN111353346A (en) * 2018-12-21 2020-06-30 上海形趣信息科技有限公司 Action recognition method, apparatus, system, electronic device, and storage medium


Cited By (5)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN113627365A (en) * 2021-08-16 2021-11-09 南通大学 Group motion recognition and time-series analysis method
CN115223240A (en) * 2022-07-05 2022-10-21 北京甲板智慧科技有限公司 Real-time exercise counting method and system based on the dynamic time warping algorithm
CN115966016A (en) * 2022-12-19 2023-04-14 天翼爱音乐文化科技有限公司 Jump state recognition method and system, electronic device, and storage medium
CN115966016B (en) * 2022-12-19 2024-07-05 天翼爱音乐文化科技有限公司 Jump state recognition method, system, electronic device, and storage medium
CN115861381A (en) * 2023-02-28 2023-03-28 中国民用航空飞行学院 Method and system for detecting the standardization of diving-save actions in cooperative ball sports
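One of the cited-by patents (CN115223240A) counts exercise repetitions in real time using the dynamic time warping algorithm. As a plain illustration of that underlying technique, not of the patent's own implementation, a minimal DTW distance over a one-dimensional signal (e.g. a joint-angle track extracted from skeleton data) can be sketched as:

```python
def dtw_distance(a, b):
    """Dynamic time warping distance between two 1-D sequences.

    Illustrative sketch only: O(n*m) dynamic programming, no window
    constraint or normalization, as in textbook DTW.
    """
    n, m = len(a), len(b)
    INF = float("inf")
    # cost[i][j] = best alignment cost of a[:i] against b[:j]
    cost = [[INF] * (m + 1) for _ in range(n + 1)]
    cost[0][0] = 0.0
    for i in range(1, n + 1):
        for j in range(1, m + 1):
            d = abs(a[i - 1] - b[j - 1])
            cost[i][j] = d + min(cost[i - 1][j],      # step in a only
                                 cost[i][j - 1],      # step in b only
                                 cost[i - 1][j - 1])  # step in both
    return cost[n][m]

# A time-shifted copy of a motion template aligns with near-zero cost
# under DTW, whereas point-by-point Euclidean distance would not.
template = [0, 1, 2, 3, 2, 1, 0]
observed = [0, 0, 1, 2, 3, 2, 1, 0]  # same motion, delayed by one frame
print(dtw_distance(template, observed))  # → 0.0
```

Repetition counting then reduces to sliding this distance over the incoming joint-angle stream and registering a count whenever the distance to the template drops below a threshold.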

Also Published As

Publication number Publication date
CN113065505B (en) 2023-05-09

Similar Documents

Publication Publication Date Title
CN113065505B (en) Method and system for quickly identifying body actions
CN103839040B Gesture recognition method and device based on depth images
Pazhoumand-Dar et al. Joint movement similarities for robust 3D action recognition using skeletal data
Chaudhari et al. Yog-guru: Real-time yoga pose correction system using deep learning methods
Wen et al. A robust method of detecting hand gestures using depth sensors
CN104932804B Intelligent virtual assembly action recognition method
WO2013027091A1 (en) Systems and methods of detecting body movements using globally generated multi-dimensional gesture data
CN105868779B Activity recognition method based on feature enhancement and decision fusion
CN109308437B (en) Motion recognition error correction method, electronic device, and storage medium
JP2012518236A (en) Method and system for gesture recognition
CN112800892B Human body posture recognition method based on OpenPose
CN109308438A Method for building an action recognition library, electronic device, and storage medium
McCall et al. Macro-class Selection for Hierarchical k-NN Classification of Inertial Sensor Data.
CN108875586B (en) Functional limb rehabilitation training detection method based on depth image and skeleton data multi-feature fusion
Monir et al. Rotation and scale invariant posture recognition using Microsoft Kinect skeletal tracking feature
Xu et al. Robust hand gesture recognition based on RGB-D Data for natural human–computer interaction
CN109543644A Multi-modal gesture recognition method
KR20120089948A (en) Real-time gesture recognition using mhi shape information
Sooai et al. Comparison of recognition accuracy on dynamic hand gesture using feature selection
Kong et al. Design of computer interactive system for sports training based on artificial intelligence and improved support vector
Sharma et al. Real-time recognition of yoga poses using computer vision for smart health care
CN113975775A Wearable inertial somatosensory table-tennis training system and working method thereof
Hachaj et al. Human actions recognition on multimedia hardware using angle-based and coordinate-based features and multivariate continuous hidden Markov model classifier
Xu et al. A novel method for hand posture recognition based on depth information descriptor
KR101447958B1 (en) Method and apparatus for recognizing body point

Legal Events

Date Code Title Description
PB01 Publication
SE01 Entry into force of request for substantive examination
GR01 Patent grant