CN114582021A - Sit-up counting method based on image vision technology - Google Patents

Sit-up counting method based on image vision technology

Info

Publication number
CN114582021A
CN114582021A
Authority
CN
China
Prior art keywords
sit
thres
standard
action
crotch
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Withdrawn
Application number
CN202210223146.7A
Other languages
Chinese (zh)
Inventor
张伟
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
Suzhou Qingyanjun Positron Technology Co ltd
Original Assignee
Suzhou Qingyanjun Positron Technology Co ltd
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by Suzhou Qingyanjun Positron Technology Co ltd filed Critical Suzhou Qingyanjun Positron Technology Co ltd
Priority to CN202210223146.7A
Publication of CN114582021A
Legal status: Withdrawn

Classifications

    • A HUMAN NECESSITIES
    • A63 SPORTS; GAMES; AMUSEMENTS
    • A63B APPARATUS FOR PHYSICAL TRAINING, GYMNASTICS, SWIMMING, CLIMBING, OR FENCING; BALL GAMES; TRAINING EQUIPMENT
    • A63B71/00 Games or sports accessories not covered in groups A63B1/00 - A63B69/00
    • A63B71/06 Indicating or scoring devices for games or players, or for other sports activities
    • G PHYSICS
    • G06 COMPUTING; CALCULATING OR COUNTING
    • G06N COMPUTING ARRANGEMENTS BASED ON SPECIFIC COMPUTATIONAL MODELS
    • G06N3/00 Computing arrangements based on biological models
    • G06N3/02 Neural networks
    • G06N3/04 Architecture, e.g. interconnection topology
    • G06N3/045 Combinations of networks

Landscapes

  • Engineering & Computer Science (AREA)
  • Physics & Mathematics (AREA)
  • Health & Medical Sciences (AREA)
  • Theoretical Computer Science (AREA)
  • General Health & Medical Sciences (AREA)
  • Biomedical Technology (AREA)
  • Computing Systems (AREA)
  • Computational Linguistics (AREA)
  • Data Mining & Analysis (AREA)
  • Evolutionary Computation (AREA)
  • Artificial Intelligence (AREA)
  • Molecular Biology (AREA)
  • Biophysics (AREA)
  • General Engineering & Computer Science (AREA)
  • General Physics & Mathematics (AREA)
  • Mathematical Physics (AREA)
  • Software Systems (AREA)
  • Life Sciences & Earth Sciences (AREA)
  • Physical Education & Sports Medicine (AREA)
  • Image Analysis (AREA)
  • Image Processing (AREA)

Abstract

The invention discloses a sit-up counting method based on image vision technology. Human body joint points are detected and located with a convolutional neural network; whether each sit-up action is standard is judged from the ankle position, the joint angle, and the slope of the head-crotch line; the number of actions is determined from the extreme points of the slope sequence; and after timing ends, the number of actions meeting the standard is determined from all recorded results. The method counts sit-up tests effectively and improves the accuracy and real-time performance of sit-up counting.

Description

Sit-up counting method based on image vision technology
Technical Field
The invention relates to the field of physical fitness tests, in particular to a sit-up counting method based on an image vision technology.
Background
A sit-up assessment method based on deep learning and image vision technology judges whether each action is standard by modeling and analyzing the test subject, and then counts the number of standard actions.
Current sit-up counting methods mainly have the following defects:
(1) high cost;
(2) inability to judge whether the action is standard;
(3) counting requires manual operation.
These disadvantages affect the accuracy and real-time performance of sit-up counting.
Disclosure of Invention
The invention aims to provide a sit-up counting method based on image vision technology, which detects and locates human joint points with a convolutional neural network, judges whether each sit-up action is standard from the ankle position and joint angle, determines the number of actions from extreme points, and, after timing ends, determines the number of actions meeting the standard from all recorded results.
The technical scheme of the invention is as follows:
a sit-up counting method based on image vision technology comprises the following steps:
S101, starting timing;
S102, inputting a single-frame image;
S103, detecting and locating the human body joint points with the trained convolutional neural network, the located joint points comprising: head a, shoulder b, crotch c, knee d, ankle e;
S104, inputting an ordinate threshold ey_thres for point e; if the ordinate e_y of point e satisfies e_y < ey_thres, judging that the feet have left the ground, and ending the timing;
S105, calculating the slope of the straight line ac connecting the head a and the crotch c, and adding the result to a set K;
S106, calculating the crotch bending angle bcd, and adding the result to a set A;
S107, after the timing is finished, no further images are input; the set K is divided at its minima to obtain n subsets K_n, where n is the number of sit-ups performed within the timed period;
S108, taking the two endpoint values K_n1 and K_n2 of each subset K_n;
S109, dividing the set A into n subsets A_n corresponding to the n subsets K_n, and taking the minimum value A_n_min of each subset A_n;
S110, judging whether each sit-up action is standard: inputting a slope threshold K_thres for the line ac and a crotch bending angle threshold α; if the three conditions A_n_min < α, K_n1 < K_thres and K_n2 < K_thres are satisfied simultaneously, the sit-up action is considered standard and the count is increased by 1; otherwise, the action is considered non-standard and is not counted;
S111, outputting the counting result.
Preferably, the crotch bending angle threshold α is 80° to 100° (an illustrative sketch of the per-frame measurements in steps S103-S106 follows).
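For concreteness, steps S103 to S106 can be written out as a short per-frame routine. The Python code below is an illustrative reading of those steps rather than the patent's implementation: it assumes the trained convolutional neural network has already returned the five joint points as (x, y) pixel coordinates, and the function and variable names (line_slope, bend_angle, process_frame, ey_thres) are hypothetical.

```python
import numpy as np

def line_slope(p1, p2):
    """Slope of the straight line through points p1 and p2 in image coordinates."""
    dx = p2[0] - p1[0]
    if dx == 0:
        return float("inf")  # vertical line; a real system would need a convention here
    return (p2[1] - p1[1]) / dx

def bend_angle(p_b, p_c, p_d):
    """Bending angle b-c-d at the crotch point c, in degrees."""
    v1 = np.asarray(p_b, dtype=float) - np.asarray(p_c, dtype=float)
    v2 = np.asarray(p_d, dtype=float) - np.asarray(p_c, dtype=float)
    cos_a = np.dot(v1, v2) / (np.linalg.norm(v1) * np.linalg.norm(v2))
    return float(np.degrees(np.arccos(np.clip(cos_a, -1.0, 1.0))))

def process_frame(joints, ey_thres, K, A):
    """Per-frame handling of steps S104-S106.

    `joints` maps the labels "a", "b", "c", "d", "e" to (x, y) pixel
    coordinates returned by the trained network (S103).  Returns False when
    the ankle ordinate e_y is below ey_thres (S104), i.e. the feet are judged
    to have left the ground and timing should end; otherwise the slope of
    line ac is appended to K (S105) and the crotch angle bcd to A (S106).
    """
    a, b, c, d, e = (joints[k] for k in ("a", "b", "c", "d", "e"))
    if e[1] < ey_thres:            # S104: ankle above the threshold height (image y grows downward)
        return False
    K.append(line_slope(a, c))     # S105
    A.append(bend_angle(b, c, d))  # S106
    return True
```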
The invention has the advantages that:
the sit-up counting method based on the image vision technology detects and positions human joints based on the convolutional neural network, judges whether sit-up actions are standard or not according to ankle positions and joint angles, judges the action times through extreme points, judges the action quantity meeting the standard according to all recorded results after timing is finished, and can effectively count sit-up tests.
Drawings
The invention is further described with reference to the following figures and examples:
FIG. 1 is a flow chart of a sit-up counting method based on image vision technology according to the present invention;
FIG. 2 is a schematic view of the joint point positioning.
Detailed Description
As shown in fig. 1, the sit-up counting method based on image vision technology of the present invention includes the steps of:
S101, starting timing;
S102, inputting a single-frame image;
S103, detecting and locating the human body joint points with the trained convolutional neural network, the located joint points comprising: head a, shoulder b, crotch c, knee d, ankle e, as shown in fig. 2;
S104, inputting an ordinate threshold ey_thres for point e; if the ordinate e_y of point e satisfies e_y < ey_thres, judging that the feet have left the ground, and ending the timing;
S105, calculating the slope of the straight line ac connecting the head a and the crotch c, and adding the result to a set K;
S106, calculating the crotch bending angle bcd, and adding the result to a set A;
S107, after the timing is finished, no further images are input; the set K is divided at its minima to obtain n subsets K_n, where n is the number of sit-ups performed within the timed period;
S108, taking the two endpoint values K_n1 and K_n2 of each subset K_n;
S109, dividing the set A into n subsets A_n corresponding to the n subsets K_n, and taking the minimum value A_n_min of each subset A_n;
S110, judging whether each sit-up action is standard: inputting a slope threshold K_thres for the line ac and setting the crotch bending angle threshold to 90°; if the three conditions A_n_min < 90°, K_n1 < K_thres and K_n2 < K_thres are satisfied simultaneously, the sit-up action is considered standard and the count is increased by 1; otherwise, the action is considered non-standard and is not counted;
S111, outputting the counting result (an illustrative sketch of the post-timing counting in steps S107-S110 follows).
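The post-timing processing of steps S107 to S110 amounts to splitting the recorded slope sequence at its minima and applying three threshold tests to each segment. The Python sketch below is likewise illustrative rather than the patent's implementation: the handling of plateaus in the minima search and the convention that adjacent subsets share their boundary sample are assumptions, alpha defaults to the 90° used in this embodiment, and the commented usage at the end relies on the hypothetical process_frame routine from the earlier sketch.

```python
def count_standard_situps(K, A, K_thres, alpha=90.0):
    """Illustrative post-timing counting for steps S107-S110.

    K and A are the slope and crotch-angle values collected frame by frame
    while the timer was running.  K is split at its local minima into subsets
    K_n, one per repetition (S107); A is split at the same indices into A_n
    (S109).  A repetition is counted only when A_n_min < alpha and both
    endpoint slopes of K_n are below K_thres (S110).
    """
    if not K:
        return 0
    # S107: indices of the local minima of K.  How plateaus and ties are
    # handled is an assumption; the patent only states that the set is
    # divided at its minima.
    minima = [i for i in range(1, len(K) - 1) if K[i] < K[i - 1] and K[i] <= K[i + 1]]
    boundaries = [0] + minima + [len(K) - 1]

    count = 0
    for start, end in zip(boundaries[:-1], boundaries[1:]):
        # Adjacent subsets share their boundary sample (an assumption).
        K_n, A_n = K[start:end + 1], A[start:end + 1]
        K_n1, K_n2 = K_n[0], K_n[-1]   # S108: endpoint values of the subset
        A_n_min = min(A_n)             # S109: minimum crotch angle in the subset
        # S110: all three conditions must hold simultaneously.
        if A_n_min < alpha and K_n1 < K_thres and K_n2 < K_thres:
            count += 1
    return count


# Hypothetical end-to-end use, with detect_joints standing in for the trained
# network of step S103 and process_frame taken from the earlier sketch:
#
#   K, A = [], []
#   while timer_running():
#       joints = detect_joints(next_frame())
#       if not process_frame(joints, ey_thres, K, A):
#           break                                    # S104: feet left the ground
#   print(count_standard_situps(K, A, K_thres=0.3))  # thresholds are illustrative
```

Splitting at the minima relies on the head-crotch slope returning toward its lying-flat value between repetitions; the endpoint tests K_n1 < K_thres and K_n2 < K_thres can then be read as checking that the subject lay back down fully before and after each sit-up.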
The above embodiments merely illustrate the technical ideas and features of the invention; their purpose is to enable those skilled in the art to understand and implement the invention, not to limit its scope of protection. All modifications made according to the spirit of the main technical scheme of the invention fall within the scope of protection of the invention.

Claims (2)

1. A sit-up counting method based on image vision technology is characterized by comprising the following steps:
S101, starting timing;
S102, inputting a single-frame image;
S103, detecting and locating the human body joint points with the trained convolutional neural network, the located joint points comprising: head a, shoulder b, crotch c, knee d, ankle e;
S104, inputting an ordinate threshold ey_thres for point e; if the ordinate e_y of point e satisfies e_y < ey_thres, judging that the feet have left the ground, and ending the timing;
S105, calculating the slope of the straight line ac connecting the head a and the crotch c, and adding the result to a set K;
S106, calculating the crotch bending angle bcd, and adding the result to a set A;
S107, after the timing is finished, no further images are input; the set K is divided at its minima to obtain n subsets K_n, where n is the number of sit-ups performed within the timed period;
S108, taking the two endpoint values K_n1 and K_n2 of each subset K_n;
S109, dividing the set A into n subsets A_n corresponding to the n subsets K_n, and taking the minimum value A_n_min of each subset A_n;
S110, judging whether each sit-up action is standard: inputting a slope threshold K_thres for the line ac and a crotch bending angle threshold α; if the three conditions A_n_min < α, K_n1 < K_thres and K_n2 < K_thres are satisfied simultaneously, the sit-up action is considered standard and the count is increased by 1; otherwise, the action is considered non-standard and is not counted;
S111, outputting the counting result.
2. The sit-up counting method based on image vision technology according to claim 1, wherein the crotch bending angle threshold α is 80° to 100°.
CN202210223146.7A 2022-03-09 2022-03-09 Sit-up counting method based on image vision technology Withdrawn CN114582021A (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
CN202210223146.7A CN114582021A (en) 2022-03-09 2022-03-09 Sit-up counting method based on image vision technology

Applications Claiming Priority (1)

Application Number Priority Date Filing Date Title
CN202210223146.7A CN114582021A (en) 2022-03-09 2022-03-09 Sit-up counting method based on image vision technology

Publications (1)

Publication Number Publication Date
CN114582021A true CN114582021A (en) 2022-06-03

Family

ID=81779195

Family Applications (1)

Application Number Title Priority Date Filing Date
CN202210223146.7A Withdrawn CN114582021A (en) 2022-03-09 2022-03-09 Sit-up counting method based on image vision technology

Country Status (1)

Country Link
CN (1) CN114582021A (en)

Cited By (1)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN115394400A (en) * 2022-08-24 2022-11-25 杭州闪动信息服务有限公司 Online AI intelligent motion management method and detection system

Similar Documents

Publication Publication Date Title
CN109815907B (en) Sit-up posture detection and guidance method based on computer vision technology
CN110170159A (en) A kind of human health's action movement monitoring system
CN114582021A (en) Sit-up counting method based on image vision technology
TWI393579B (en) The state of the muscle movement state analysis system, methods and computer program products
CN105913045B (en) The method of counting and system of sit-ups test
CN111401260B (en) Sit-up test counting method and system based on Quick-OpenPose model
CN105892674A (en) Swimming stroke recognition method based on smart band and sports plan based on smart band
CN109731316B (en) Shooting training system
CN112464915A (en) Push-up counting method based on human body bone point detection
CN106422206A (en) Motion standardization recognition method based on intelligent bracelet
CN115188078A (en) Ping-pong intelligent action training method based on voice interaction and attitude estimation
CN106334310A (en) Shuttle run exercise monitoring and evaluating method and system thereof
CN111814700B (en) Behavior action recognition method based on child behavior characteristics
CN114582020A (en) Pull-up counting method based on image vision technology
Williams et al. The kinematic differences between skill levels in the squash forehand drive, volley and drop strokes
CN105833466A (en) Method and device for measuring and counting sit-up
CN109830278B (en) Anaerobic exercise fitness recommendation method and device, anaerobic exercise equipment and storage medium
CN111353345B (en) Method, apparatus, system, electronic device, and storage medium for providing training feedback
CN108079521A (en) A kind of running test method and system
CN114582022A (en) Push-up automatic counting method based on image vision technology
US11944870B2 (en) Movement determination method, movement determination device and computer-readable storage medium
CN113599776A (en) Real-time push-up counting and standard judging method and system
CN112784699A (en) Method and system for realizing posture evaluation guidance of sports coach
CN111291656B (en) Human body trunk posture matching method in measurement 2d image
CN113663314A (en) Intelligent sit-up and push-up counting system

Legal Events

Date Code Title Description
PB01 Publication
SE01 Entry into force of request for substantive examination
WW01 Invention patent application withdrawn after publication (application publication date: 20220603)