CN110991292A - Action identification comparison method and system, computer storage medium and electronic device - Google Patents

Action identification comparison method and system, computer storage medium and electronic device

Info

Publication number
CN110991292A
CN110991292A CN201911171881.2A
Authority
CN
China
Prior art keywords
human body
depth image
standard
action posture
posture
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Pending
Application number
CN201911171881.2A
Other languages
Chinese (zh)
Inventor
胡佳文
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
Everest Shenzhen Technology Co Ltd
Original Assignee
Everest Shenzhen Technology Co Ltd
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by Everest Shenzhen Technology Co Ltd filed Critical Everest Shenzhen Technology Co Ltd
Priority to CN201911171881.2A
Publication of CN110991292A
Legal status: Pending

Classifications

    • G PHYSICS
    • G06 COMPUTING; CALCULATING OR COUNTING
    • G06V IMAGE OR VIDEO RECOGNITION OR UNDERSTANDING
    • G06V40/00 Recognition of biometric, human-related or animal-related patterns in image or video data
    • G06V40/20 Movements or behaviour, e.g. gesture recognition

Abstract

The invention provides a motion recognition and comparison method and system, a computer storage medium and an electronic device. Depth image data of a target space containing a human body are acquired, the depth image of the human body is separated from that data, and the positions and angles of the skeletal joints of the human body are identified and detected from it. The current action posture of the human body is then judged according to these joint positions and angles, and finally the difference between the current action posture and a preset standard action posture corresponding to it is calculated. The invention can be applied to fields such as dance, fitness, and sports, can accurately identify the difference between the current action posture of the human body and the corresponding standard action posture, and provides action guidance for users.

Description

Action identification comparison method and system, computer storage medium and electronic device
Technical Field
The invention relates to machine vision recognition technology, and in particular to an action recognition and comparison method and system, a computer storage medium and an electronic device.
Background
In visual recognition, human body information can be captured and used to recognize actions and gestures. However, comparing a recognized action against a defined standard action is error-prone: because an individual's body data differ from the standard data, the positions and angles obtained visually also differ, and it is difficult to accurately judge the difference between the human body's action and the standard action from the image alone. The human body information and the standard action information therefore need to be quantified for comparison.
Disclosure of Invention
The invention aims to provide an action recognition and comparison method and system, a computer storage medium and an electronic device, so as to solve the problem of inaccurate action recognition in the prior art.
The invention is realized by the following technical scheme:
a motion recognition comparison method comprises the following steps:
Step A: acquiring a depth image of a target space containing a human body;
Step B: separating the depth image of the human body from the depth image of the target space;
Step C: identifying the bone joints of the human body from the depth image of the human body;
Step D: detecting the positions and angles of the identified bone joints of the human body;
Step E: judging the current action posture of the human body according to the position and the angle of each bone joint of the human body;
Step F: calculating the difference between the current action posture of the human body and a preset standard action posture corresponding to the current action posture of the human body;
Step G: outputting the calculation result of the difference.
Further, each standard action posture is provided with a standard position and a standard angle of each bone joint, and the step F is specifically as follows:
Calculating the difference between the position and the angle of each bone joint of the human body in the current action posture and the standard position and the standard angle of each bone joint of the preset standard action posture corresponding to the current action posture of the human body.
Further, the skeletal joints of the human body comprise a head, a neck, a trunk, a left shoulder, a left elbow, a left wrist, a right shoulder, a right elbow, a right wrist, a left hip, a left knee, a left heel, a right hip, a right knee, and a right heel.
Further, in the step a, a depth image of a target space including a human body is acquired by a depth camera.
Further, in the step a, the depth image of the target space is a continuous depth image of the target space acquired in real time.
A motion recognition comparison system, comprising:
the depth image acquisition module is used for acquiring a depth image of a target space containing a human body;
the image separation module is used for separating the depth image of the human body from the depth image of the target space;
a bone joint identification module for identifying each bone joint of the human body from the depth image of the human body;
the bone joint detection module is used for detecting the positions and angles of all the identified bone joints of the human body;
the motion posture calculation module is used for judging the current motion posture of the human body according to the position and the angle of each bone joint of the human body;
the difference calculation module is used for calculating the difference between the current action posture of the human body and a preset standard action posture corresponding to the current action posture of the human body;
and the result output module is used for outputting the calculation result of the difference.
Further, each standard action posture is provided with a standard position and a standard angle of each bone joint, and the difference calculation module is specifically configured to:
Calculating the difference between the position and the angle of each bone joint of the human body in the current action posture and the standard position and the standard angle of each bone joint of the preset standard action posture corresponding to the current action posture of the human body.
Further, the skeletal joints of the human body comprise a head, a neck, a trunk, a left shoulder, a left elbow, a left wrist, a right shoulder, a right elbow, a right wrist, a left hip, a left knee, a left heel, a right hip, a right knee, and a right heel.
Further, the depth image acquisition module is a depth camera.
Further, the depth image of the target space is a continuous depth image of the target space acquired by the depth image acquisition module in real time.
A computer storage medium having stored thereon a computer program which, when executed by a processor, implements a method of motion recognition and comparison as described above.
An electronic device includes a memory, a processor, and a computer program stored in the memory and executable on the processor, wherein the processor implements the motion recognition and comparison method as described above when executing the computer program.
Compared with the prior art, the action recognition and comparison method and system, computer storage medium and electronic device provided by the invention acquire depth image data of a target space containing a human body, separate the depth image of the human body from that data, and identify and detect the positions and angles of the skeletal joints of the human body from it. The current action posture of the human body is then judged according to these joint positions and angles, and finally the difference between the current action posture and the preset standard action posture corresponding to it is calculated. The invention can be applied to fields such as dance, fitness, and sports, can accurately identify the difference between the current action posture of the human body and the corresponding standard action posture, and provides action guidance for users.
Drawings
FIG. 1 is a schematic general flow chart of a motion recognition and comparison method according to the present invention;
FIG. 2 is a schematic illustration of a human bone joint;
FIG. 3 is a flowchart illustrating an exemplary embodiment of a motion recognition comparison method according to the present invention;
FIG. 4 is a schematic diagram illustrating the principle of the motion recognition and comparison system of the present invention.
Detailed Description
In order to make the objects, technical solutions and advantages of the present invention more apparent, the present invention will be further described in detail with reference to the following embodiments and the accompanying drawings.
As shown in fig. 1, the method for identifying and comparing actions provided by the embodiment of the present invention may include the following steps a to G.
Step A: a depth image of a target space containing a human body is acquired. Specifically, the depth image of the target space may be obtained through a depth camera (e.g., a ToF camera), a structured-light projector, or the like. Because the human body is located in the target space, the acquired depth image also contains the human body.
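A minimal sketch of step A in Python: in practice the frame would come from the depth camera's SDK, but for illustration a previously captured 16-bit depth image is loaded with OpenCV; the file name "depth_frame.png" is a hypothetical placeholder.

```python
import cv2  # OpenCV; pip install opencv-python

# Hypothetical stand-in for step A: in a deployed system the depth frame comes
# from the ToF camera's SDK. Here a previously captured 16-bit depth PNG
# (values in millimetres) is loaded instead; the file name is an assumption.
depth_frame = cv2.imread("depth_frame.png", cv2.IMREAD_UNCHANGED)
if depth_frame is None:
    raise FileNotFoundError("depth_frame.png not found")
print("depth map:", depth_frame.shape, depth_frame.dtype)  # e.g. (480, 640) uint16
```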
Step B: the depth image of the human body is separated from the depth image of the target space. The separated human-body depth image is used for the analysis and calculation in the subsequent steps.
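The segmentation method for step B is not prescribed; a minimal sketch, assuming the person is isolated by a depth band plus a largest-connected-component rule, might look as follows (the band limits are illustrative defaults):

```python
import cv2
import numpy as np

def separate_human(depth_frame, near_mm=500, far_mm=4000):
    """Illustrative foreground extraction for step B: keep pixels whose depth
    lies inside [near_mm, far_mm] and return the largest connected component
    as the assumed human-body depth image (all other pixels set to 0)."""
    mask = ((depth_frame > near_mm) & (depth_frame < far_mm)).astype(np.uint8)
    num_labels, labels, stats, _ = cv2.connectedComponentsWithStats(mask)
    if num_labels <= 1:                      # nothing but background found
        return np.zeros_like(depth_frame)
    # stats[0] describes the background; pick the largest remaining blob
    largest = 1 + int(np.argmax(stats[1:, cv2.CC_STAT_AREA]))
    return np.where(labels == largest, depth_frame, 0)
```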
Step C: each skeletal joint of the human body is identified from the depth image of the human body. Skeletal joint identification can be realized by analyzing the human-body depth image with a somatosensory (skeleton-tracking) algorithm such as that of Microsoft Kinect.
Step D: the position and angle of each identified skeletal joint of the human body are detected. The positions and angles of the skeletal joints can likewise be obtained by analyzing, with the Microsoft Kinect somatosensory algorithm, the skeletal joints identified from the human-body depth image.
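While the joint detection itself is delegated to the somatosensory algorithm, the angle at a joint follows from the two adjacent bone vectors once 3D joint positions are available. The sketch below shows that calculation; the example coordinates are invented for illustration.

```python
import numpy as np

def joint_angle(parent, joint, child):
    """Angle in degrees at `joint`, formed by the bone vectors joint->parent
    and joint->child; each argument is a 3D position (x, y, z)."""
    v1 = np.asarray(parent, dtype=float) - np.asarray(joint, dtype=float)
    v2 = np.asarray(child, dtype=float) - np.asarray(joint, dtype=float)
    cos_a = np.dot(v1, v2) / (np.linalg.norm(v1) * np.linalg.norm(v2) + 1e-9)
    return float(np.degrees(np.arccos(np.clip(cos_a, -1.0, 1.0))))

# Example: left elbow angle from shoulder, elbow and wrist positions (metres)
print(joint_angle((0.0, 1.4, 2.0), (0.30, 1.10, 2.0), (0.30, 0.80, 1.7)))  # ~120 degrees
```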
Step E: the current action posture of the human body is judged according to the position and angle of each skeletal joint. The action posture of the human body is determined by the positions and angles of the skeletal joints of the whole body, so the posture can be judged from these joint positions and angles. Because the human body has many skeletal joints, and the precision required for general motion recognition only calls for accurately recognizing the body's main motions rather than its fine motions, only the positions and angles of the main skeletal joints need to be recognized, not those of every joint. Accordingly, when determining which skeletal joints to recognize, as shown in fig. 2, 15 joints may be selected: the head 101, neck 102, trunk 103, left shoulder 104, left elbow 105, left wrist 106, right shoulder 107, right elbow 108, right wrist 109, left hip 110, left knee 111, left heel 112, right hip 113, right knee 114, and right heel 115. By recognizing the positions and angles of these 15 skeletal joints, the main action postures of the human body can be recognized.
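How the current posture is matched to a preset posture in step E is left open; a minimal sketch, assuming each preset standard action posture stores a standard angle (and position) per joint, is a nearest-neighbour match over the 15 joint angles:

```python
import numpy as np

JOINTS = ["head", "neck", "trunk", "left_shoulder", "left_elbow", "left_wrist",
          "right_shoulder", "right_elbow", "right_wrist", "left_hip", "left_knee",
          "left_heel", "right_hip", "right_knee", "right_heel"]

def classify_posture(current_angles, standard_postures):
    """Step E sketch: pick the preset standard posture whose joint angles are
    closest (smallest mean absolute difference) to the observed ones.
    `current_angles` maps joint name -> degrees; each posture in
    `standard_postures` maps joint name -> {"pos": (x, y, z), "angle": degrees}."""
    best_name, best_score = None, float("inf")
    for name, posture in standard_postures.items():
        diffs = [abs(current_angles[j] - posture[j]["angle"])
                 for j in JOINTS if j in posture and j in current_angles]
        score = float(np.mean(diffs)) if diffs else float("inf")
        if score < best_score:
            best_name, best_score = name, score
    return best_name, best_score
```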
Step F: the difference between the current action posture of the human body and a preset standard action posture corresponding to it is calculated. A plurality of standard action postures can be preset, each provided with a standard position and a standard angle for every skeletal joint; this step therefore calculates, for each skeletal joint, the difference between its position and angle in the current action posture and its standard position and angle in the corresponding preset standard action posture. In the specific calculation, taking the above 15 skeletal joints as an example, as shown in fig. 3, the differences for the 15 joints may be calculated in a batch and finally summarized.
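The batch comparison of fig. 3 could be realised along the lines of the sketch below, in which per-joint position and angle errors against the matched standard posture are computed and then averaged; the error measures and the absence of weighting are assumptions, not requirements of the method.

```python
import numpy as np

def posture_difference(current, standard):
    """Step F sketch: compare each skeletal joint of the current posture with
    the matched standard posture. Both arguments map joint name ->
    {"pos": (x, y, z), "angle": degrees}. Returns a per-joint report plus
    simple overall averages; thresholds and weights would be application-specific."""
    report = {}
    for joint, std in standard.items():
        cur = current[joint]
        pos_err = float(np.linalg.norm(np.subtract(cur["pos"], std["pos"])))
        ang_err = abs(cur["angle"] - std["angle"])
        report[joint] = {"position_error_m": pos_err, "angle_error_deg": ang_err}
    summary = {
        "mean_position_error_m": float(np.mean([r["position_error_m"] for r in report.values()])),
        "mean_angle_error_deg": float(np.mean([r["angle_error_deg"] for r in report.values()])),
    }
    return report, summary
```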
Step G: the calculation result of the difference is output. The similarity between the current action posture of the human body and the standard action posture can be judged from this difference, which can be used for action guidance in fields such as dance, fitness, and sports.
In step A, the depth image of the target space may be a continuous depth image acquired in real time, so that after the processing of subsequent steps B through G a difference result between the real-time action posture of the human body and the corresponding standard action posture is obtained, enabling real-time dynamic monitoring of that difference.
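Assuming the helper functions sketched above, and with the camera and the skeleton tracker injected as callables (placeholders for the vendor SDK and the somatosensory algorithm), a real-time loop over steps A through G could look roughly like this:

```python
def run_realtime(capture_depth_frame, extract_skeleton, standard_postures, display=print):
    """Illustrative real-time loop over steps A-G. `capture_depth_frame` wraps
    the depth camera (step A); `extract_skeleton` wraps the skeleton-tracking
    algorithm (steps C-D) and returns {joint name: {"pos": (x, y, z), "angle": deg}}.
    Relies on separate_human, classify_posture and posture_difference from the
    sketches above; `standard_postures` maps posture name -> per-joint standards."""
    while True:
        depth = capture_depth_frame()                                  # step A
        human = separate_human(depth)                                  # step B
        joints = extract_skeleton(human)                               # steps C-D
        angles = {name: j["angle"] for name, j in joints.items()}
        posture, _ = classify_posture(angles, standard_postures)       # step E
        report, summary = posture_difference(joints, standard_postures[posture])  # step F
        display(posture, summary)                                      # step G: guidance output
```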
As shown in fig. 4, based on the above motion recognition and comparison method, the present invention further provides a motion recognition and comparison system. The action recognition comparison system comprises a depth image acquisition module 1, an image separation module 2, a bone joint recognition module 3, a bone joint detection module 4, an action posture calculation module 5, a difference calculation module 6 and a result output module 7. Wherein:
the depth image acquisition module 1 is used for acquiring a depth image of a target space containing a human body;
the image separation module 2 is used for separating the depth image of the human body from the depth image of the target space;
the bone joint identification module 3 is used for identifying each bone joint of the human body from the depth image of the human body;
the bone joint detection module 4 is used for detecting the positions and angles of all the identified bone joints of the human body;
the action posture calculation module 5 is used for judging the current action posture of the human body according to the position and the angle of each bone joint of the human body;
the difference calculation module 6 is configured to calculate a difference between the current action posture of the human body and a preset standard action posture corresponding to the current action posture of the human body;
the result output module 7 is used for outputting the calculation result of the difference.
Each standard action posture is provided with a standard position and a standard angle of each bone joint, and the difference calculation module 6 is specifically configured to calculate a difference between the position and the angle of each bone joint of the human body in the current action posture and a preset standard position and a preset standard angle of each bone joint of the human body in the standard action posture corresponding to the current action posture of the human body.
The skeletal joints of the human body include a head 101, a neck 102, a torso 103, a left shoulder 104, a left elbow 105, a left wrist 106, a right shoulder 107, a right elbow 108, a right wrist 109, a left hip 110, a left knee 111, a left heel 112, a right hip 113, a right knee 114, and a right heel 115.
The depth image obtaining module 1 may be a depth camera.
The depth image of the target space may be a continuous depth image of the target space acquired by the depth image acquiring module 1 in real time.
The motion recognition and comparison system corresponds to the motion recognition and comparison method, and the motion recognition and comparison method can be referred to specifically, which is not described herein again.
The invention also provides a computer storage medium, on which a computer program is stored, and when the computer program is executed by a processor, the method for recognizing and comparing actions as described above is realized.
The invention also provides an electronic device, which comprises a memory, a processor and a computer program which is stored in the memory and can run on the processor, wherein the action recognition and comparison method described above is realized when the processor executes the computer program.
The above embodiments are only preferred embodiments and are not intended to limit the scope of the present invention. Any modification, equivalent replacement, and improvement made within the spirit and principle of the present invention should be included in the protection scope of the present invention.

Claims (12)

1. A motion recognition and comparison method is characterized by comprising the following steps:
Step A: acquiring a depth image of a target space containing a human body;
Step B: separating the depth image of the human body from the depth image of the target space;
Step C: identifying the bone joints of the human body from the depth image of the human body;
Step D: detecting the positions and angles of the identified bone joints of the human body;
Step E: judging the current action posture of the human body according to the position and the angle of each bone joint of the human body;
Step F: calculating the difference between the current action posture of the human body and a preset standard action posture corresponding to the current action posture of the human body;
Step G: outputting the calculation result of the difference.
2. The motion recognition and comparison method according to claim 1, wherein each standard motion posture is provided with a standard position and a standard angle of each bone joint, and the step F specifically comprises:
Calculating the difference between the position and the angle of each bone joint of the human body in the current action posture and the standard position and the standard angle of each bone joint of the preset standard action posture corresponding to the current action posture of the human body.
3. The motion recognition comparison method according to claim 2, wherein the skeletal joints of the human body comprise a head, a neck, a torso, a left shoulder, a left elbow, a left wrist, a right shoulder, a right elbow, a right wrist, a left hip, a left knee, a left heel, a right hip, a right knee, and a right heel.
4. The motion recognition and comparison method according to claim 1, wherein in the step a, a depth image of a target space including a human body is obtained by a depth camera.
5. The method for motion recognition and comparison according to claim 1, wherein in the step a, the depth image of the target space is a continuous depth image of the target space obtained in real time.
6. A motion recognition comparison system, comprising:
the depth image acquisition module is used for acquiring a depth image of a target space containing a human body;
the image separation module is used for separating the depth image of the human body from the depth image of the target space;
a bone joint identification module for identifying each bone joint of the human body from the depth image of the human body;
the bone joint detection module is used for detecting the positions and angles of all the identified bone joints of the human body;
the motion posture calculation module is used for judging the current motion posture of the human body according to the position and the angle of each bone joint of the human body;
the difference calculation module is used for calculating the difference between the current action posture of the human body and a preset standard action posture corresponding to the current action posture of the human body;
and the result output module is used for outputting the calculation result of the difference.
7. The motion recognition comparison system of claim 6, wherein each standard motion pose is provided with a standard position and a standard angle of each skeletal joint, and the difference calculation module is specifically configured to:
Calculating the difference between the position and the angle of each bone joint of the human body in the current action posture and the standard position and the standard angle of each bone joint of the preset standard action posture corresponding to the current action posture of the human body.
8. The motion recognition comparison system of claim 7, wherein the skeletal joints of the human body comprise a head, a neck, a torso, a left shoulder, a left elbow, a left wrist, a right shoulder, a right elbow, a right wrist, a left hip, a left knee, a left heel, a right hip, a right knee, and a right heel.
9. The motion recognition and comparison system of claim 6, wherein the depth image capture module is a depth camera.
10. The motion recognition and comparison system of claim 6, wherein the depth image of the target space is a continuous depth image of the target space acquired by the depth image acquisition module in real time.
11. A computer storage medium having a computer program stored thereon, wherein the computer program, when executed by a processor, implements the motion recognition and comparison method according to any one of claims 1 to 5.
12. An electronic device, comprising a memory, a processor and a computer program stored in the memory and executable on the processor, wherein the processor executes the computer program to implement the motion recognition and comparison method according to any one of claims 1 to 5.
CN201911171881.2A 2019-11-26 2019-11-26 Action identification comparison method and system, computer storage medium and electronic device Pending CN110991292A (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
CN201911171881.2A CN110991292A (en) 2019-11-26 2019-11-26 Action identification comparison method and system, computer storage medium and electronic device

Applications Claiming Priority (1)

Application Number Priority Date Filing Date Title
CN201911171881.2A CN110991292A (en) 2019-11-26 2019-11-26 Action identification comparison method and system, computer storage medium and electronic device

Publications (1)

Publication Number Publication Date
CN110991292A true CN110991292A (en) 2020-04-10

Family

ID=70087000

Family Applications (1)

Application Number Title Priority Date Filing Date
CN201911171881.2A Pending CN110991292A (en) 2019-11-26 2019-11-26 Action identification comparison method and system, computer storage medium and electronic device

Country Status (1)

Country Link
CN (1) CN110991292A (en)

Citations (7)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN105551182A (en) * 2015-11-26 2016-05-04 吉林大学 Driving state monitoring system based on Kinect human body posture recognition
CN106022213A (en) * 2016-05-04 2016-10-12 北方工业大学 Human body motion recognition method based on three-dimensional bone information
CN106650687A (en) * 2016-12-30 2017-05-10 山东大学 Posture correction method based on depth information and skeleton information
CN109086754A (en) * 2018-10-11 2018-12-25 天津科技大学 A kind of human posture recognition method based on deep learning
CN109344706A (en) * 2018-08-28 2019-02-15 杭州电子科技大学 It is a kind of can one man operation human body specific positions photo acquisition methods
US20190251341A1 (en) * 2017-12-08 2019-08-15 Huawei Technologies Co., Ltd. Skeleton Posture Determining Method and Apparatus, and Computer Readable Storage Medium
CN110245623A (en) * 2019-06-18 2019-09-17 重庆大学 A kind of real time human movement posture correcting method and system

Patent Citations (7)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN105551182A (en) * 2015-11-26 2016-05-04 吉林大学 Driving state monitoring system based on Kinect human body posture recognition
CN106022213A (en) * 2016-05-04 2016-10-12 北方工业大学 Human body motion recognition method based on three-dimensional bone information
CN106650687A (en) * 2016-12-30 2017-05-10 山东大学 Posture correction method based on depth information and skeleton information
US20190251341A1 (en) * 2017-12-08 2019-08-15 Huawei Technologies Co., Ltd. Skeleton Posture Determining Method and Apparatus, and Computer Readable Storage Medium
CN109344706A (en) * 2018-08-28 2019-02-15 杭州电子科技大学 It is a kind of can one man operation human body specific positions photo acquisition methods
CN109086754A (en) * 2018-10-11 2018-12-25 天津科技大学 A kind of human posture recognition method based on deep learning
CN110245623A (en) * 2019-06-18 2019-09-17 重庆大学 A kind of real time human movement posture correcting method and system

Non-Patent Citations (1)

* Cited by examiner, † Cited by third party
Title
LI Hongbo et al., "Human action and posture recognition method based on Kinect skeleton data", Computer Engineering and Design, vol. 37, no. 4, 30 April 2016 (2016-04-30), pages 1 *

Cited By (3)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN111709365A (en) * 2020-06-17 2020-09-25 成都工业学院 Automatic human motion posture detection method based on convolutional neural network
CN112804575A (en) * 2021-01-18 2021-05-14 珠海格力电器股份有限公司 Resource pushing method and device based on TOF
CN113657278A (en) * 2021-08-18 2021-11-16 成都信息工程大学 Motion gesture recognition method, device, equipment and storage medium

Similar Documents

Publication Publication Date Title
CN108399367B (en) Hand motion recognition method and device, computer equipment and readable storage medium
Kishore et al. Motionlets matching with adaptive kernels for 3-d indian sign language recognition
US8428311B2 (en) Capturing and recognizing hand postures using inner distance shape contexts
US8355529B2 (en) Motion capture apparatus and method, and motion capture program
US9286694B2 (en) Apparatus and method for detecting multiple arms and hands by using three-dimensional image
Ye et al. A depth camera motion analysis framework for tele-rehabilitation: Motion capture and person-centric kinematics analysis
CN110546644B (en) Identification device, identification method, and recording medium
Dikovski et al. Evaluation of different feature sets for gait recognition using skeletal data from Kinect
Gritai et al. On the use of anthropometry in the invariant analysis of human actions
CN110991292A (en) Action identification comparison method and system, computer storage medium and electronic device
JP2016091108A (en) Human body portion detection system and human body portion detection method
JP2016099982A (en) Behavior recognition device, behaviour learning device, method, and program
US10755422B2 (en) Tracking system and method thereof
KR20190099537A (en) Motion learning device, function determining device and function determining system
CN111488775B (en) Device and method for judging degree of visibility
CN111860196B (en) Hand operation action scoring device, method and computer readable storage medium
CN110910426A (en) Action process and action trend identification method, storage medium and electronic device
KR20200113743A (en) Method and apparatus for estimating and compensating human's pose
KR102371127B1 (en) Gesture Recognition Method and Processing System using Skeleton Length Information
US20220095959A1 (en) Feigned Injury Detection Systems And Methods
CN113033501A (en) Human body classification method and device based on joint quaternion
US11527090B2 (en) Information processing apparatus, control method, and non-transitory storage medium
JP7375806B2 (en) Image processing device and image processing method
KR20130081126A (en) Method for hand-gesture recognition and apparatus thereof
CN114373091A (en) Gait recognition method based on deep learning fusion SVM

Legal Events

Date Code Title Description
PB01 Publication
SE01 Entry into force of request for substantive examination