CN112509668A - Method for identifying whether hand is gripping or not - Google Patents
- Publication number
- CN112509668A (application CN202011484298.XA)
- Authority
- CN
- China
- Prior art keywords
- hand
- thumb
- user
- index finger
- left hand
- Prior art date
- Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
- Pending
Classifications
-
- G—PHYSICS
- G16—INFORMATION AND COMMUNICATION TECHNOLOGY [ICT] SPECIALLY ADAPTED FOR SPECIFIC APPLICATION FIELDS
- G16H—HEALTHCARE INFORMATICS, i.e. INFORMATION AND COMMUNICATION TECHNOLOGY [ICT] SPECIALLY ADAPTED FOR THE HANDLING OR PROCESSING OF MEDICAL OR HEALTHCARE DATA
- G16H20/00—ICT specially adapted for therapies or health-improving plans, e.g. for handling prescriptions, for steering therapy or for monitoring patient compliance
- G16H20/30—ICT specially adapted for therapies or health-improving plans, e.g. for handling prescriptions, for steering therapy or for monitoring patient compliance relating to physical therapies or activities, e.g. physiotherapy, acupressure or exercising
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06V—IMAGE OR VIDEO RECOGNITION OR UNDERSTANDING
- G06V40/00—Recognition of biometric, human-related or animal-related patterns in image or video data
- G06V40/20—Movements or behaviour, e.g. gesture recognition
- G06V40/28—Recognition of hand or arm movements, e.g. recognition of deaf sign language
-
- G—PHYSICS
- G16—INFORMATION AND COMMUNICATION TECHNOLOGY [ICT] SPECIALLY ADAPTED FOR SPECIFIC APPLICATION FIELDS
- G16H—HEALTHCARE INFORMATICS, i.e. INFORMATION AND COMMUNICATION TECHNOLOGY [ICT] SPECIALLY ADAPTED FOR THE HANDLING OR PROCESSING OF MEDICAL OR HEALTHCARE DATA
- G16H50/00—ICT specially adapted for medical diagnosis, medical simulation or medical data mining; ICT specially adapted for detecting, monitoring or modelling epidemics or pandemics
- G16H50/20—ICT specially adapted for medical diagnosis, medical simulation or medical data mining; ICT specially adapted for detecting, monitoring or modelling epidemics or pandemics for computer-aided diagnosis, e.g. based on medical expert systems
Landscapes
- Engineering & Computer Science (AREA)
- Health & Medical Sciences (AREA)
- General Health & Medical Sciences (AREA)
- Public Health (AREA)
- Medical Informatics (AREA)
- Epidemiology (AREA)
- Biomedical Technology (AREA)
- Primary Health Care (AREA)
- Social Psychology (AREA)
- Physics & Mathematics (AREA)
- Theoretical Computer Science (AREA)
- Life Sciences & Earth Sciences (AREA)
- Biophysics (AREA)
- Physical Education & Sports Medicine (AREA)
- General Physics & Mathematics (AREA)
- Multimedia (AREA)
- Human Computer Interaction (AREA)
- Psychiatry (AREA)
- Computer Vision & Pattern Recognition (AREA)
- Data Mining & Analysis (AREA)
- Databases & Information Systems (AREA)
- Pathology (AREA)
- Measurement Of The Respiration, Hearing Ability, Form, And Blood Characteristics Of Living Organisms (AREA)
Abstract
The invention relates to a method for identifying whether a hand is gripping, comprising the following steps: establish a coordinate system with its origin (0,0,0) at the optical motion capture device, the positive X axis pointing to the right of the device, the positive Y axis pointing downward, and the positive Z axis pointing to the front of the device; continuously acquire the three-dimensional coordinates of the user's whole-body characteristic joint points in the device's spatial coordinate system; and judge whether the user's hand is continuously gripping by detecting, over consecutive frames of joint data, the change in the area formed by the three points of the index fingertip, thumb and palm, feeding the result back to assist the user in hand-gripping training. The invention solves the problems that hand-gripping training traditionally requires supervision in a physical rehabilitation setting, cannot be analyzed quantitatively, and cannot have its training amount counted; it is also radiation-free, fast to recognize, simple in its recognition scheme, and easy to use.
Description
Technical Field
The invention relates to the technical field of computer vision and pattern recognition, and in particular to a method for recognizing whether a hand is gripping.
Background
Computer vision is the use of computers to acquire and process visual information in a way that simulates the human visual system, performing tasks such as image target detection, recognition and tracking. It draws on statistics, computer science, neurobiology and related disciplines; its ultimate goal is for a computer to understand the three-dimensional real world and reproduce the functions of the human visual system. More abstractly, computer vision can be viewed as a perception problem over high-dimensional data such as images, encompassing both image processing and image understanding.
Pattern recognition addresses the fundamental problem of finding regularities in data; the field focuses on automatically discovering rules in data with computer algorithms and acting on them, for example by using the discovered rules to classify data.
Scapulohumeral periarthritis, commonly called frozen shoulder, is an adhesive, aseptic inflammation of the glenohumeral joint of the upper limb that restricts and stiffens movement; radiographs show only a reduction in bone mass of the shoulder joint, with no other pathological findings. The inflammatory response causes pain in the shoulder joint and surrounding tissues and also limits the joint's forward flexion, backward extension and rotation. The principal therapeutic goals for scapulohumeral periarthritis are to relieve the pain and cure the joint stiffness. In sports rehabilitation, continuous gripping training with the arms held at different angles exercises the nerves and muscles of the upper limb well, thereby relieving pain and easing joint stiffness.
At present, traditional rehabilitation training for scapulohumeral periarthritis consists mainly of passive manipulation and traction training. Its main drawback is that it focuses only on restoring shoulder-joint mobility and neglects the recovery of upper-limb nerve and muscle capacity, so the rehabilitation effect is incomplete.
Disclosure of Invention
The invention aims to overcome the above defects in the prior art by providing a method for identifying whether a hand is gripping, remedying the shortcomings of existing rehabilitation training.
The purpose of the invention is achieved by the following technical scheme. A method of identifying whether a hand is gripping comprises:
establishing a coordinate system with its origin (0,0,0) at the optical motion capture device, the positive X axis pointing to the right of the device, the positive Y axis downward, and the positive Z axis to the front of the device;
continuously acquiring the three-dimensional coordinates of the user's whole-body characteristic joint points in the device's spatial coordinate system;
and judging from consecutive frames of joint data whether the user's hand is continuously gripping, by detecting the change in the area formed by the three points of the index fingertip, thumb and palm, and feeding the result back to assist the user's hand-gripping training.
Judging from consecutive frames of joint data whether the user's hand is continuously gripping, by detecting the change in the area formed by the index fingertip, thumb and palm, comprises the following steps:
In the i-th frame of data, the spatial position of the right or left index fingertip (HANDTIP_RIGHT or HANDTIP_LEFT) is (x_HTR, y_HTR, z_HTR), the spatial position of the right or left thumb (THUMB_RIGHT or THUMB_LEFT) is (x_TR, y_TR, z_TR), and the spatial position of the right or left palm (HAND_RIGHT or HAND_LEFT) is (x_HR, y_HR, z_HR);
Compute the spatial vector from the palm to the index fingertip, a = (x_HTR − x_HR, y_HTR − y_HR, z_HTR − z_HR), and the spatial vector from the palm to the thumb, b = (x_TR − x_HR, y_TR − y_HR, z_TR − z_HR), and then their vector (cross) product a × b;
From the cross product, the area of the triangle formed by the palm, index fingertip and thumb in the i-th frame is S_i = |a × b| / 2;
Compute in the same way the triangle area S_{i+1} from the (i+1)-th frame of data, and obtain the absolute change of the triangle area between the two adjacent frames, ΔS = |S_i − S_{i+1}|;
When this absolute change exceeds a preset value, the palm, index fingertip and thumb of the right or left hand are considered to be changing cooperatively, i.e. the user's right or left hand is considered to be continuously gripping.
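The judgment steps above can be sketched in code as follows — a minimal illustration, not code from the patent; the joints are passed as (x, y, z) tuples in the device coordinate system, and the preset threshold is a parameter whose unit depends on the device's coordinate unit:

```python
import math

def triangle_area(palm, index_tip, thumb_tip):
    """Area of the triangle formed by the palm, index fingertip and thumb.

    Each argument is an (x, y, z) tuple in the motion-capture coordinate
    system (origin at the device, +X right, +Y down, +Z forward).
    """
    # Space vectors from the palm to the index fingertip and to the thumb.
    a = [index_tip[k] - palm[k] for k in range(3)]
    b = [thumb_tip[k] - palm[k] for k in range(3)]
    # Vector (cross) product a x b; its length is twice the triangle area.
    cross = (a[1] * b[2] - a[2] * b[1],
             a[2] * b[0] - a[0] * b[2],
             a[0] * b[1] - a[1] * b[0])
    return 0.5 * math.sqrt(sum(c * c for c in cross))

def is_gripping(area_prev, area_curr, threshold=1.0):
    """True when the triangle area changes by more than the preset value
    between two adjacent frames, i.e. the hand is continuously gripping."""
    return abs(area_prev - area_curr) > threshold
```

For a right triangle with legs of length 3 and 4 the function returns an area of 6; a drop to 4 in the next frame (ΔS = 2, above the preset value 1.0) would then be reported as a grip.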
The invention has the following advantages: the method solves the problems that hand-gripping training traditionally must be supervised in a physical rehabilitation setting, cannot be analyzed quantitatively, and cannot have its training amount counted; it is also radiation-free, fast to recognize, simple in its recognition scheme, and easy to use.
Drawings
FIG. 1 is a schematic flow diagram of the present invention;
FIG. 2 is a schematic diagram of the whole-body characteristic joint points.
Detailed Description
To make the objects, technical solutions and advantages of the embodiments of the present application clearer, the technical solutions in the embodiments are described fully below with reference to the drawings. The described embodiments are only a part of the embodiments of the present application, not all of them. The components of the embodiments, as generally described and illustrated in the figures, can be arranged and designed in a wide variety of configurations. The detailed description below is therefore not intended to limit the scope of the claimed application, but is merely representative of selected embodiments. All other embodiments obtained by a person skilled in the art from these embodiments without creative effort fall within the protection scope of the present application. The invention is further described below with reference to the accompanying drawings.
As shown in fig. 1 and fig. 2, the invention measures with an optical motion capture device and recognizes whether a hand is gripping, so that the gripping state is recognized unobtrusively while the user performs gripping training. The implementation comprises the following steps:
s1, establishing a coordinate system, wherein an origin (0,0,0) is positioned at the optical motion capture device, and a coordinate positive X axis faces to the right of the device; positive Y axis is downward; the positive Z-axis is towards the front of the device.
S2, continuously acquire the user's characteristic joint point data, specifically the three-dimensional coordinates of 32 joint points (HEAD, NOSE, etc.) in the spatial coordinate system of the optical motion capture device.
S3, recognize from consecutive frames of joint data whether the user's hand is continuously gripping, then feed the result back to assist the user's hand-gripping training.
Specifically, if the user's hand is continuously gripping, the spatial distances between the index fingertip (HANDTIP), thumb (THUMB) and palm (HAND) change cooperatively and continuously: the fingertip-to-palm and thumb-to-palm distances increase together (palm opening) or decrease together (palm clenching). Whether the hand is continuously gripping can therefore be judged by detecting the change in the area formed by the three points of the user's index fingertip, thumb and palm.
In the i-th frame of data, the spatial position of the right index fingertip (HANDTIP_RIGHT) is (x_HTR, y_HTR, z_HTR), that of the right thumb (THUMB_RIGHT) is (x_TR, y_TR, z_TR), and that of the right palm (HAND_RIGHT) is (x_HR, y_HR, z_HR). The spatial vector from the right palm to the right index fingertip is a = (x_HTR − x_HR, y_HTR − y_HR, z_HTR − z_HR); the spatial vector from the right palm to the right thumb is b = (x_TR − x_HR, y_TR − y_HR, z_TR − z_HR); their vector (cross) product is a × b.
The area S_i of the triangle formed by the right palm, right index fingertip and right thumb in the i-th frame equals half the length of this cross product: S_i = |a × b| / 2.
Correspondingly, the triangle area computed from the (i+1)-th frame of data is S_{i+1}, and the absolute change of the triangle area between the two adjacent frames is ΔS = |S_i − S_{i+1}|. By definition, when ΔS > 1.0 the right palm, right index fingertip and right thumb are deemed to be changing cooperatively, i.e. the user's right hand is considered to be continuously gripping.
The left hand is judged in the same way as the right hand, performing the above computation and decision with the spatial position coordinates of the left index fingertip (HANDTIP_LEFT), left thumb (THUMB_LEFT) and left palm (HAND_LEFT).
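To feed results back and count the training amount, the per-frame decisions must be aggregated over time. The description does not spell out a counting rule, so the run-length scheme below is an assumption: each maximal run of adjacent frames whose area change exceeds the threshold is tallied as one grip movement.

```python
def count_grips(areas, threshold=1.0):
    """Count grip events in a sequence of per-frame triangle areas.

    A grip event is a maximal run of consecutive frame pairs whose area
    change exceeds `threshold`.  This counting rule is an assumption, not
    taken from the patent text.
    """
    grips, in_grip = 0, False
    for prev, curr in zip(areas, areas[1:]):
        moving = abs(curr - prev) > threshold
        if moving and not in_grip:
            grips += 1          # a new open/close movement started
        in_grip = moving
    return grips

# Hypothetical area trace: two open-close cycles separated by a rest.
trace = [24.0, 10.0, 24.0, 24.0, 24.0, 9.0, 23.0, 23.0]
print(count_grips(trace))  # 2
```

The same function serves both hands: run it once on the right-hand area sequence and once on the left-hand sequence.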
The foregoing describes preferred embodiments of the invention. It should be understood that the invention is not limited to the precise forms disclosed above; various other combinations, modifications and environments are possible within the scope of the concept disclosed herein, whether through the teaching above or through the skill or knowledge of the relevant art. Modifications and variations made by those skilled in the art without departing from the spirit and scope of the invention fall within the protection scope of the appended claims.
Claims (2)
1. A method of identifying whether a hand is gripping, characterized in that the method comprises the following steps:
establishing a coordinate system with its origin (0,0,0) at the optical motion capture device, the positive X axis pointing to the right of the device, the positive Y axis downward, and the positive Z axis to the front of the device;
continuously acquiring the three-dimensional coordinates of the user's whole-body characteristic joint points in the device's spatial coordinate system;
and judging from consecutive frames of joint data whether the user's hand is continuously gripping, by detecting the change in the area formed by the three points of the index fingertip, thumb and palm, and feeding the result back to assist the user's hand-gripping training.
2. A method of identifying whether a hand is gripping according to claim 1, characterized in that judging from consecutive frames of joint data whether the user's hand is continuously gripping, by detecting the change in the area formed by the index fingertip, thumb and palm, comprises the following steps:
In the i-th frame of data, the spatial position of the right or left index fingertip (HANDTIP_RIGHT or HANDTIP_LEFT) is (x_HTR, y_HTR, z_HTR), the spatial position of the right or left thumb (THUMB_RIGHT or THUMB_LEFT) is (x_TR, y_TR, z_TR), and the spatial position of the right or left palm (HAND_RIGHT or HAND_LEFT) is (x_HR, y_HR, z_HR);
Compute the spatial vector from the palm to the index fingertip, a = (x_HTR − x_HR, y_HTR − y_HR, z_HTR − z_HR), and the spatial vector from the palm to the thumb, b = (x_TR − x_HR, y_TR − y_HR, z_TR − z_HR), and then their vector (cross) product a × b;
From the cross product, the area of the triangle formed by the palm, index fingertip and thumb in the i-th frame is S_i = |a × b| / 2;
Compute in the same way the triangle area S_{i+1} from the (i+1)-th frame of data, and obtain the absolute change of the triangle area between the two adjacent frames, ΔS = |S_i − S_{i+1}|;
When this absolute change exceeds a preset value, the palm, index fingertip and thumb of the right or left hand are considered to be changing cooperatively, i.e. the user's right or left hand is considered to be continuously gripping.
Priority Applications (1)
Application Number | Priority Date | Filing Date | Title |
---|---|---|---|
CN202011484298.XA CN112509668A (en) | 2020-12-16 | 2020-12-16 | Method for identifying whether hand is gripping or not |
Publications (1)
Publication Number | Publication Date |
---|---|
CN112509668A (en) | 2021-03-16 |
Family
ID=74972446
Family Applications (1)
Application Number | Title | Priority Date | Filing Date |
---|---|---|---|
CN202011484298.XA Pending CN112509668A (en) | 2020-12-16 | 2020-12-16 | Method for identifying whether hand is gripping or not |
Country Status (1)
Country | Link |
---|---|
CN (1) | CN112509668A (en) |
Citations (9)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
US20020146672A1 (en) * | 2000-11-16 | 2002-10-10 | Burdea Grigore C. | Method and apparatus for rehabilitation of neuromotor disorders |
US20140204015A1 (en) * | 2013-01-23 | 2014-07-24 | Wistron Corporation | Gesture recognition module and gesture recognition method |
CN106598227A (en) * | 2016-11-15 | 2017-04-26 | 电子科技大学 | Hand gesture identification method based on Leap Motion and Kinect |
CN108261175A (en) * | 2017-12-26 | 2018-07-10 | 上海大学 | Healing hand function quantitative evaluating method of the one kind based on human hand " column grasping " action |
US20190103033A1 (en) * | 2017-10-03 | 2019-04-04 | ExtendView Inc. | Augmented reality system for providing movement sequences and monitoring performance |
CN109634415A (en) * | 2018-12-11 | 2019-04-16 | 哈尔滨拓博科技有限公司 | It is a kind of for controlling the gesture identification control method of analog quantity |
CN110069133A (en) * | 2019-03-29 | 2019-07-30 | 湖北民族大学 | Demo system control method and control system based on gesture identification |
CN110275610A (en) * | 2019-05-27 | 2019-09-24 | 山东科技大学 | A kind of collaboration gesture control coal mining simulation control method based on LeapMotion motion sensing control device |
CN111145865A (en) * | 2019-12-26 | 2020-05-12 | 中国科学院合肥物质科学研究院 | Vision-based hand fine motion training guidance system and method |
- 2020-12-16: application CN202011484298.XA filed; publication CN112509668A, status Pending
Similar Documents
Publication | Publication Date | Title |
---|---|---|
CN107301370B (en) | Kinect three-dimensional skeleton model-based limb action identification method | |
JP6207510B2 (en) | Apparatus and method for analyzing golf swing | |
CN112464918B (en) | Body-building action correcting method and device, computer equipment and storage medium | |
CN104524742A (en) | Cerebral palsy child rehabilitation training method based on Kinect sensor | |
CN105832343B (en) | Multidimensional vision hand function rehabilitation quantitative evaluation system and evaluation method | |
Ghasemzadeh et al. | Wearable coach for sport training: A quantitative model to evaluate wrist-rotation in golf | |
de San Roman et al. | Saliency driven object recognition in egocentric videos with deep CNN: toward application in assistance to neuroprostheses | |
KR102320960B1 (en) | Personalized home training behavior guidance and correction system | |
WO2022193425A1 (en) | Exercise data display method and system | |
US20110212810A1 (en) | Balance training system | |
WO2017161733A1 (en) | Rehabilitation training by means of television and somatosensory accessory and system for carrying out same | |
CN112435731A (en) | Method for judging whether real-time posture meets preset rules | |
CN111883229B (en) | Intelligent movement guidance method and system based on visual AI | |
CN113709411A (en) | Sports auxiliary training system of MR intelligent glasses based on eye movement tracking technology | |
CN109126045A (en) | intelligent motion analysis and training system | |
Yang et al. | Hand rehabilitation using virtual reality electromyography signals | |
CN116966056A (en) | Upper limb rehabilitation evaluation system for cerebral apoplexy patient | |
Krabben et al. | How wide should you view to fight? Establishing the size of the visual field necessary for grip fighting in judo | |
CN117503115A (en) | Rehabilitation training system and training method for nerve injury | |
CN112861606A (en) | Virtual reality hand motion recognition and training method based on skeleton animation tracking | |
CN112509668A (en) | Method for identifying whether hand is gripping or not | |
CN115624338A (en) | Upper limb stimulation feedback rehabilitation device and control method thereof | |
JP3686418B2 (en) | Measuring device and method | |
Li et al. | [Retracted] Application of Biomechanics Based on Intelligent Technology and Big Data in Physical Fitness Training of Athletes | |
CN116963807A (en) | Motion data display method and system |
Legal Events
Date | Code | Title | Description |
---|---|---|---|
PB01 | Publication | ||
SE01 | Entry into force of request for substantive examination | ||