CN112132020A - Hand grip judgment method and device - Google Patents
- Publication number: CN112132020A
- Application number: CN202011003481.3A
- Authority
- CN
- China
- Prior art keywords
- hand
- experimenter
- key point
- point information
- image
- Prior art date
- Legal status: Pending (an assumption from the register, not a legal conclusion; no legal analysis has been performed)
Classifications
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06V—IMAGE OR VIDEO RECOGNITION OR UNDERSTANDING
- G06V40/00—Recognition of biometric, human-related or animal-related patterns in image or video data
- G06V40/20—Movements or behaviour, e.g. gesture recognition
- G06V40/28—Recognition of hand or arm movements, e.g. recognition of deaf sign language
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06F—ELECTRIC DIGITAL DATA PROCESSING
- G06F18/00—Pattern recognition
- G06F18/20—Analysing
- G06F18/21—Design or setup of recognition systems or techniques; Extraction of features in feature space; Blind source separation
- G06F18/214—Generating training patterns; Bootstrap methods, e.g. bagging or boosting
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06F—ELECTRIC DIGITAL DATA PROCESSING
- G06F18/00—Pattern recognition
- G06F18/20—Analysing
- G06F18/24—Classification techniques
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06N—COMPUTING ARRANGEMENTS BASED ON SPECIFIC COMPUTATIONAL MODELS
- G06N3/00—Computing arrangements based on biological models
- G06N3/02—Neural networks
- G06N3/04—Architecture, e.g. interconnection topology
- G06N3/045—Combinations of networks
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06N—COMPUTING ARRANGEMENTS BASED ON SPECIFIC COMPUTATIONAL MODELS
- G06N3/00—Computing arrangements based on biological models
- G06N3/02—Neural networks
- G06N3/08—Learning methods
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06V—IMAGE OR VIDEO RECOGNITION OR UNDERSTANDING
- G06V2201/00—Indexing scheme relating to image or video recognition or understanding
- G06V2201/07—Target detection
Abstract
The invention provides a hand grip judgment method and device. The method comprises: acquiring at least one frame of a hand image of an experimenter from an experiment video; extracting hand key point information from the hand image; and inputting the hand key point information into a classification model to obtain a judgment of whether the experimenter's hand is gripping the experimental instrument. The invention greatly improves the accuracy of judging the hand gripping action of experimenters and provides a reliable basis for airtightness judgment in experiments.
Description
Technical Field
The present invention relates to the field of machine learning technologies, and in particular, to a hand grip determination method, a hand grip determination apparatus, a computer device, a non-transitory computer-readable storage medium, and a computer program product.
Background
In physics or chemistry experiments, it is often necessary to check the airtightness of an apparatus by gripping a test tube. At present, intelligent laboratory benches can score the airtightness of an apparatus automatically, and a key criterion of this automatic scoring is whether the experimenter's hand grips the test tube tightly.
The traditional method of judging whether a hand grips an object is to detect the hand and the object separately and then compare their relative positions; this method suffers from a high misjudgment rate.
Disclosure of Invention
To solve the above technical problem, the invention provides a hand grip judgment method and device, which greatly improve the accuracy of judging the hand gripping action of experimenters and provide a reliable basis for airtightness judgment in experiments.
The technical scheme adopted by the invention is as follows:
A hand grip judgment method comprises the following steps: acquiring at least one frame of a hand image of an experimenter from an experiment video; extracting hand key point information from the hand image; and inputting the hand key point information into a classification model to obtain a judgment of whether the experimenter's hand is gripping the experimental instrument.
The hand grip judgment method further comprises: acquiring a data set comprising multiple hand key point samples in a gripping state and multiple hand key point samples in a non-gripping state; and training a neural network on the data set to obtain the classification model.
Acquiring at least one frame of a hand image of the experimenter from the experiment video specifically comprises: detecting hand targets in each frame of the experiment video; and filtering out the hand targets of non-experimenters to determine the hand images of the experimenter.
Acquiring the hand key point information from the experimenter's hand image specifically comprises: preprocessing the hand image to obtain the experimenter's hand region; and detecting the hand key point information in the hand region with OpenPose.
The neural network is EfficientNet.
A hand grip determination device comprises: a first acquisition module for acquiring at least one frame of a hand image of an experimenter from an experiment video; a second acquisition module for extracting hand key point information from the hand image; and a judgment module for inputting the hand key point information into the classification model to obtain a judgment of whether the experimenter's hand is gripping the experimental instrument.
The hand grip determination device further comprises: a third acquisition module for acquiring a data set comprising multiple hand key point samples in a gripping state and multiple hand key point samples in a non-gripping state; and a training module for training a neural network on the data set to obtain the classification model.
A computer device comprises a memory, a processor, and a computer program stored in the memory and executable on the processor; when the processor executes the computer program, the hand grip judgment method described above is implemented.
A non-transitory computer-readable storage medium having stored thereon a computer program which, when executed by a processor, implements the hand grip determination method described above.
A computer program product, wherein instructions of the computer program product, when executed by a processor, perform the hand grip determination method.
The invention has the beneficial effects that:
By acquiring at least one frame of a hand image of the experimenter from the experiment video, extracting the hand key point information from it, and inputting the key point information into the classification model to judge whether the experimenter's hand is in a gripping state, the method greatly improves the accuracy of judging the hand gripping action of experimenters and provides a reliable basis for airtightness judgment in experiments.
Drawings
FIG. 1 is a flowchart of a hand grip determination method according to an embodiment of the present invention;
FIG. 2 is an illustration of OpenPose hand key point detection according to an embodiment of the present invention;
FIG. 3 is a diagram illustrating actually detected key points according to an embodiment of the present invention;
FIG. 4 is a block diagram of a hand grip determination device according to an embodiment of the present invention.
Detailed Description
The technical solutions in the embodiments of the present invention will be clearly and completely described below with reference to the drawings in the embodiments of the present invention, and it is obvious that the described embodiments are only a part of the embodiments of the present invention, and not all of the embodiments. All other embodiments, which can be derived by a person skilled in the art from the embodiments given herein without making any creative effort, shall fall within the protection scope of the present invention.
As shown in fig. 1, the hand grip determination method according to the embodiment of the present invention includes the following steps:
S1: acquire at least one frame of a hand image of the experimenter from the experiment video.
In one embodiment of the invention, hand targets are detected in each frame of the experiment video, and the detected targets are then filtered to remove the hands of non-experimenters and determine the experimenter's hand images.
Specifically, hand targets in an image can be detected with an object detection algorithm, which may be a two-stage algorithm such as R-CNN, or a one-stage algorithm such as YOLO or SSD.
It should be understood that the experiment video is generally shot facing the experimenter, and most other people in the scene stand behind them, so in the video image the hands of non-experimenters are mostly smaller (and occasionally larger) than the experimenter's hands. Therefore, in one embodiment of the invention, among multiple hand targets detected in a single frame, the targets with a smaller detection-box area can be filtered out and the target with a larger area retained to generate the experimenter's hand image. In another embodiment, among multiple frames in which a hand target is detected, frames whose detection-box area is small can be filtered out and frames with a large area retained. In yet another embodiment, frames whose detection-box area differs greatly from that of the other frames can be filtered out and frames with similar areas retained. The thresholds for these size comparisons depend on the resolution of the video image, the required filtering accuracy, and other conditions, so no specific threshold is fixed here.
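The first, area-based filtering strategy above can be sketched as follows. The `keep_ratio` threshold is an illustrative assumption; as the text notes, the patent leaves the actual threshold to the deployment:

```python
def filter_hand_boxes(boxes, keep_ratio=0.5):
    """Keep detection boxes whose area is at least keep_ratio times the
    largest box area in the frame. Smaller boxes are assumed to belong to
    bystanders standing behind the experimenter.

    boxes: list of (x1, y1, x2, y2) tuples.
    """
    if not boxes:
        return []
    areas = [(x2 - x1) * (y2 - y1) for (x1, y1, x2, y2) in boxes]
    largest = max(areas)
    return [b for b, a in zip(boxes, areas) if a >= keep_ratio * largest]
```

With `keep_ratio=0.5`, a 100x100 experimenter hand box is retained while a 20x20 bystander hand box is dropped.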
Holding an experimental instrument is an important feature of the experimenter's hands. Therefore, in one embodiment of the invention, for hand targets detected by a two-stage algorithm, the second-stage candidate-box classification can additionally check whether the hand region contains an experimental instrument, such as a test tube, so as to filter out hand targets or hand images that are not holding an instrument and determine the experimenter's hand image.
S2: acquire the hand key point information from the experimenter's hand image.
In one embodiment of the invention, after the experimenter's hand image is determined, it may first be preprocessed, e.g. cropped and padded, to obtain the experimenter's hand region. The cropping and padding can be implemented with TensorFlow.
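The crop-and-pad preprocessing can be sketched framework-free as below (the patent mentions a TensorFlow-based implementation, e.g. via its image utilities; this pure-Python version over nested pixel lists only illustrates the operation, and the top-left placement of the crop is an assumption):

```python
def crop_and_pad(image, box, out_size, fill=0):
    """Crop box = (x1, y1, x2, y2) out of image (a row-major list of rows
    of pixel values), then pad with `fill` to an out_size x out_size square,
    placing the crop in the top-left corner."""
    x1, y1, x2, y2 = box
    h = len(image)
    w = len(image[0]) if h else 0
    # Clamp the box to the image bounds before slicing.
    crop = [row[max(x1, 0):min(x2, w)] for row in image[max(y1, 0):min(y2, h)]]
    out = [[fill] * out_size for _ in range(out_size)]
    for r, row in enumerate(crop[:out_size]):
        for c, v in enumerate(row[:out_size]):
            out[r][c] = v
    return out
```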
Then, the hand key point information in the experimenter's hand region can be detected with OpenPose. As shown in fig. 2, OpenPose can detect 21 hand key points covering the finger joints, fingertips, and wrist. In one embodiment of the invention, OpenPose detects the 11 key points of the experimenter's hand shown in fig. 3; the hand key point information includes the distribution of these key points, which can be understood as their relative positions.
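The "relative positions" can be made concrete by expressing each key point relative to a reference point and normalizing the scale. Anchoring on the wrist point and dividing by the largest coordinate offset are assumptions for illustration; the patent does not fix a specific normalization:

```python
def normalize_keypoints(keypoints):
    """Express each (x, y) hand key point relative to the wrist point,
    assumed to be keypoints[0], and divide by the largest absolute offset
    so the feature is translation- and scale-invariant."""
    wx, wy = keypoints[0]
    rel = [(x - wx, y - wy) for (x, y) in keypoints]
    scale = max(max(abs(dx), abs(dy)) for dx, dy in rel) or 1.0
    return [(dx / scale, dy / scale) for dx, dy in rel]
```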
S3: input the hand key point information into the classification model to obtain a judgment of whether the experimenter's hand is gripping the experimental instrument.
The classification model is a neural network model trained in advance. That is, the hand grip judgment method according to the embodiment of the invention further comprises: acquiring a data set comprising multiple hand key point samples in a gripping state and multiple hand key point samples in a non-gripping state, and training a neural network on the data set to obtain the classification model.
Given the input key point information, the classification model outputs whether the hand is in a gripping state: if it is, the experimenter's hand is judged to be gripping the experimental instrument tightly; if not, the hand is judged not to be gripping it.
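The mapping from model output to judgment can be sketched as follows. The `curl_score` function here is a hand-crafted toy stand-in (mean key point distance from the wrist, which is small for a curled, gripping hand), not the patent's trained EfficientNet classifier, and the 0.6 threshold is an assumption:

```python
def curl_score(norm_keypoints):
    """Toy stand-in for the trained classifier: the mean Euclidean distance
    of the normalized key points from the wrist (index 0). Curled (gripping)
    hands yield small values; open hands yield large ones."""
    dists = [(dx * dx + dy * dy) ** 0.5 for dx, dy in norm_keypoints[1:]]
    return sum(dists) / len(dists)

def judge_grip(score, threshold=0.6):
    """Map a curl score to the final gripping / not-gripping judgment."""
    return "gripping" if score < threshold else "not gripping"
```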
In an embodiment of the present invention, the neural network is an EfficientNet.
After judging whether the experimenter's hand grips the experimental instrument tightly, the airtightness scoring point of the experimental apparatus can be decided accordingly.
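One plausible way to turn per-frame judgments into the airtightness scoring point is to require a grip in most sampled frames; the patent does not specify an aggregation rule, so both the voting scheme and the 0.8 fraction below are assumptions:

```python
def airtightness_grip_point(frame_judgements, min_grip_fraction=0.8):
    """Award the airtightness scoring point if the hand was judged to be
    gripping in at least min_grip_fraction of the sampled frames."""
    if not frame_judgements:
        return False
    grip = sum(1 for j in frame_judgements if j == "gripping")
    return grip / len(frame_judgements) >= min_grip_fraction
```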
According to the hand grip judgment method of the embodiment of the invention, at least one frame of a hand image of the experimenter is acquired from the experiment video, the hand key point information is extracted from it, and the key point information is input into the classification model to judge whether the experimenter's hand is in a gripping state. This greatly improves the accuracy of judging the hand gripping action of experimenters and provides a reliable basis for airtightness judgment in experiments.
Corresponding to the hand grip determination method of the above embodiment, the invention further provides a hand grip determination device.
As shown in fig. 4, the hand grip determination device according to the embodiment of the present invention includes a first acquisition module 10, a second acquisition module 20, and a judgment module 30. The first acquisition module 10 acquires at least one frame of a hand image of the experimenter from the experiment video; the second acquisition module 20 extracts the hand key point information from the hand image; and the judgment module 30 inputs the hand key point information into the classification model to obtain a judgment of whether the experimenter's hand is gripping the experimental instrument.
In an embodiment of the present invention, the first acquisition module 10 detects hand targets in each frame of the experiment video and filters the detected targets to remove the hands of non-experimenters and determine the experimenter's hand image.
Specifically, the first acquisition module 10 can detect hand targets in an image with an object detection algorithm, which may be a two-stage algorithm such as R-CNN, or a one-stage algorithm such as YOLO or SSD.
It should be understood that the experiment video is generally shot facing the experimenter, and most other people in the scene stand behind them, so in the video image the hands of non-experimenters are mostly smaller (and occasionally larger) than the experimenter's hands. Therefore, in one embodiment of the invention, among multiple hand targets detected in a single frame, the targets with a smaller detection-box area can be filtered out and the target with a larger area retained to generate the experimenter's hand image. In another embodiment, among multiple frames in which a hand target is detected, frames whose detection-box area is small can be filtered out and frames with a large area retained. In yet another embodiment, frames whose detection-box area differs greatly from that of the other frames can be filtered out and frames with similar areas retained. The thresholds for these size comparisons depend on the resolution of the video image, the required filtering accuracy, and other conditions, so no specific threshold is fixed here.
Holding an experimental instrument is an important feature of the experimenter's hands. Therefore, in one embodiment of the invention, for hand targets detected by a two-stage algorithm, the second-stage candidate-box classification can additionally check whether the hand region contains an experimental instrument, such as a test tube, so as to filter out hand targets or hand images that are not holding an instrument and determine the experimenter's hand image.
In one embodiment of the present invention, after the experimenter's hand image is determined, the second acquisition module 20 may first preprocess it, e.g. crop and pad it, to obtain the experimenter's hand region. The cropping and padding can be implemented with TensorFlow.
Then, the second acquisition module 20 can detect the hand key point information in the experimenter's hand region with OpenPose. As shown in fig. 2, OpenPose can detect 21 hand key points covering the finger joints, fingertips, and wrist. In one embodiment of the invention, OpenPose detects the 11 key points of the experimenter's hand shown in fig. 3; the hand key point information includes the distribution of these key points, which can be understood as their relative positions.
The classification model is a neural network model trained in advance. That is, the hand grip determination device according to the embodiment of the invention may further include a third acquisition module and a training module: the third acquisition module acquires a data set comprising multiple hand key point samples in a gripping state and multiple hand key point samples in a non-gripping state, and the training module trains a neural network on the data set to obtain the classification model.
Given the input key point information, the classification model outputs whether the hand is in a gripping state: if it is, the experimenter's hand is judged to be gripping the experimental instrument tightly; if not, the hand is judged not to be gripping it.
In an embodiment of the present invention, the neural network is an EfficientNet.
After judging whether the experimenter's hand grips the experimental instrument tightly, the airtightness scoring point of the experimental apparatus can be decided accordingly.
According to the hand grip determination device of the embodiment of the invention, at least one frame of a hand image of the experimenter is acquired from the experiment video, the hand key point information is extracted from it, and the key point information is input into the classification model to judge whether the experimenter's hand is in a gripping state. This greatly improves the accuracy of judging the hand gripping action of experimenters and provides a reliable basis for airtightness judgment in experiments.
The invention further provides a computer device corresponding to the embodiment.
The computer device according to the embodiment of the present invention includes a memory, a processor, and a computer program stored in the memory and executable on the processor; when the processor executes the computer program, the hand grip determination method according to the above-described embodiment of the present invention is implemented.
According to the computer device of the embodiment of the invention, when the processor executes the computer program stored in the memory, at least one frame of a hand image of the experimenter is acquired from the experiment video, the hand key point information is extracted from it, and the key point information is input into the classification model to judge whether the experimenter's hand is in a gripping state. This greatly improves the accuracy of judging the hand gripping action of experimenters and provides a reliable basis for airtightness judgment in experiments.
The invention also provides a non-transitory computer readable storage medium corresponding to the above embodiment.
A non-transitory computer-readable storage medium of an embodiment of the present invention stores thereon a computer program, which, when executed by a processor, can implement the hand-grip determination method according to the above-described embodiment of the present invention.
According to the non-transitory computer-readable storage medium of the embodiment of the invention, when the processor executes the computer program stored on the medium, at least one frame of a hand image of the experimenter is acquired from the experiment video, the hand key point information is extracted from it, and the key point information is input into the classification model to judge whether the experimenter's hand is in a gripping state. This greatly improves the accuracy of judging the hand gripping action of experimenters and provides a reliable basis for airtightness judgment in experiments.
The present invention also provides a computer program product corresponding to the above embodiments.
When the instructions in the computer program product of the embodiment of the present invention are executed by the processor, the hand-grip determining method according to the above-mentioned embodiment of the present invention can be executed.
According to the computer program product of the embodiment of the invention, when the processor executes the instructions, at least one frame of a hand image of the experimenter is acquired from the experiment video, the hand key point information is extracted from it, and the key point information is input into the classification model to judge whether the experimenter's hand is in a gripping state. This greatly improves the accuracy of judging the hand gripping action of experimenters and provides a reliable basis for airtightness judgment in experiments.
In the description of the present invention, the terms "first" and "second" are used for descriptive purposes only and are not to be construed as indicating or implying relative importance or implying any number of technical features indicated. Thus, a feature defined as "first" or "second" may explicitly or implicitly include one or more of that feature. The meaning of "plurality" is two or more unless specifically limited otherwise.
In the present invention, unless otherwise expressly stated or limited, the terms "mounted," "connected," "secured," and the like are to be construed broadly and can, for example, be fixedly connected, detachably connected, or integrally formed; can be mechanically or electrically connected; either directly or indirectly through intervening media, either internally or in any other relationship. The specific meanings of the above terms in the present invention can be understood by those skilled in the art according to specific situations.
In the present invention, unless otherwise expressly stated or limited, a first feature "on" or "under" a second feature may mean that the first and second features are in direct contact, or in indirect contact through an intermediate medium. Also, a first feature "on," "over," or "above" a second feature may be directly or obliquely above the second feature, or may simply indicate that the first feature is at a higher level than the second feature. A first feature "under," "below," or "beneath" a second feature may be directly or obliquely below the second feature, or may simply indicate that the first feature is at a lower level than the second feature.
In the description herein, references to the description of the term "one embodiment," "some embodiments," "an example," "a specific example," or "some examples," etc., mean that a particular feature, structure, material, or characteristic described in connection with the embodiment or example is included in at least one embodiment or example of the invention. In this specification, the schematic representations of the terms used above are not necessarily intended to refer to the same embodiment or example. Furthermore, the particular features, structures, materials, or characteristics described may be combined in any suitable manner in any one or more embodiments or examples. Furthermore, various embodiments or examples and features of different embodiments or examples described in this specification can be combined and combined by one skilled in the art without contradiction.
Any process or method descriptions in flow charts or otherwise described herein may be understood as representing modules, segments, or portions of code which include one or more executable instructions for implementing specific logical functions or steps of the process, and alternate implementations are included within the scope of the preferred embodiment of the present invention in which functions may be executed out of order from that shown or discussed, including substantially concurrently or in reverse order, depending on the functionality involved, as would be understood by those reasonably skilled in the art of the present invention.
The logic and/or steps represented in the flowcharts or otherwise described herein, e.g., an ordered listing of executable instructions that can be considered to implement logical functions, can be embodied in any computer-readable medium for use by or in connection with an instruction execution system, apparatus, or device, such as a computer-based system, processor-containing system, or other system that can fetch the instructions from the instruction execution system, apparatus, or device and execute the instructions. For the purposes of this description, a "computer-readable medium" can be any means that can contain, store, communicate, propagate, or transport the program for use by or in connection with the instruction execution system, apparatus, or device. More specific examples (a non-exhaustive list) of the computer-readable medium would include the following: an electrical connection (electronic device) having one or more wires, a portable computer diskette (magnetic device), a Random Access Memory (RAM), a read-only memory (ROM), an erasable programmable read-only memory (EPROM or flash memory), an optical fiber device, and a portable compact disc read-only memory (CDROM). Additionally, the computer-readable medium could even be paper or another suitable medium upon which the program is printed, as the program can be electronically captured, via for instance optical scanning of the paper or other medium, then compiled, interpreted or otherwise processed in a suitable manner if necessary, and then stored in a computer memory.
It should be understood that portions of the present invention may be implemented in hardware, software, firmware, or a combination thereof. In the above embodiments, the various steps or methods may be implemented in software or firmware stored in memory and executed by a suitable instruction execution system. For example, if implemented in hardware, as in another embodiment, any one or combination of the following techniques, which are known in the art, may be used: a discrete logic circuit having a logic gate circuit for implementing a logic function on a data signal, an application specific integrated circuit having an appropriate combinational logic gate circuit, a Programmable Gate Array (PGA), a Field Programmable Gate Array (FPGA), or the like.
It will be understood by those skilled in the art that all or part of the steps of the methods in the above embodiments may be implemented by a program instructing the relevant hardware. The program may be stored in a computer-readable storage medium and, when executed, performs one or a combination of the steps of the method embodiments.
In addition, the functional units in the embodiments of the present invention may be integrated into one processing module, or each unit may exist alone physically, or two or more units may be integrated into one module. The integrated module may be implemented in the form of hardware or in the form of a software functional module. The integrated module, if implemented in the form of a software functional module and sold or used as a stand-alone product, may also be stored in a computer-readable storage medium.
The storage medium mentioned above may be a read-only memory, a magnetic disk, an optical disk, or the like.

Although embodiments of the present invention have been shown and described above, it is understood that the above embodiments are exemplary and should not be construed as limiting the present invention; variations, modifications, substitutions and alterations can be made to the above embodiments by those of ordinary skill in the art within the scope of the present invention.
Claims (10)
1. A hand grip judgment method is characterized by comprising the following steps:
acquiring at least one frame of a hand image of the experimenter in an experimental video;
acquiring hand key point information in the hand image of the experimenter;
and inputting the hand key point information into a classification model to obtain a judgment result of whether the hands of the experimenters grip the experimental instrument.
2. The hand grip judgment method according to claim 1, further comprising:
acquiring a data set, wherein the data set comprises a plurality of pieces of hand key point information in a gripping state and a plurality of pieces of hand key point information in a non-gripping state;
and training a neural network through the data set to obtain the classification model.
3. The hand grip judgment method according to claim 1 or 2, wherein the acquiring of at least one frame of experimenter hand image in the experimental video specifically comprises:
detecting a hand target of each frame of image in the experimental video;
and filtering the detected hand targets to remove the hand targets of non-experimenters, so as to determine the hand images of the experimenter.
4. The hand grip judgment method according to claim 3, wherein the acquiring of the hand key point information in the experimenter hand image specifically comprises:
preprocessing the hand image of the experimenter to obtain a hand area of the experimenter;
and detecting the hand key point information in the hand area of the experimenter through OpenPose.
5. The hand grip judgment method according to claim 2, wherein the neural network is EfficientNet.
6. A hand grip judgment device, comprising:
the first acquisition module is used for acquiring at least one frame of hand image of the experimenter in the experimental video;
the second acquisition module is used for acquiring hand key point information in the hand image of the experimenter;
and the judgment module is used for inputting the hand key point information into the classification model to obtain a judgment result of whether the hands of the experimenter grip the experimental instrument.
7. The hand grip judgment device according to claim 6, further comprising:
the third acquisition module is used for acquiring a data set, wherein the data set comprises a plurality of pieces of hand key point information in a gripping state and a plurality of pieces of hand key point information in a non-gripping state;
and the training module is used for training the neural network through the data set to obtain the classification model.
8. A computer device comprising a memory, a processor and a computer program stored on the memory and executable on the processor, wherein the processor implements the hand grip judgment method according to any one of claims 1-5 when executing the computer program.
9. A non-transitory computer-readable storage medium having stored thereon a computer program, wherein the computer program, when executed by a processor, implements the hand grip judgment method according to any one of claims 1-5.
10. A computer program product, characterized in that instructions in the computer program product, when executed by a processor, perform the hand grip judgment method according to any one of claims 1-5.
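The pipeline recited in claims 1-5 (detecting the experimenter's hand per frame, extracting hand key point information, and classifying the key points as gripping or non-gripping) can be illustrated with a minimal, self-contained sketch. The claims name OpenPose for key point detection and EfficientNet as the classifier; neither is reproduced here, so a synthetic 21-key-point hand generator and a plain logistic-regression classifier (both hypothetical stand-ins) are used purely to show the claimed data flow.

```python
# Minimal sketch of the claimed flow: hand key points -> grip / non-grip judgment.
# OpenPose and EfficientNet (named in claims 4-5) are replaced by stand-ins:
# a synthetic key point generator and logistic regression trained by gradient descent.
import numpy as np

NUM_KEYPOINTS = 21  # OpenPose's hand model reports 21 (x, y) joints per hand

def synthetic_hand(gripping, rng):
    """Stand-in for OpenPose: 21 (x, y) key points around a wrist at the origin.
    A gripping hand curls its joints toward the palm (smaller radii)."""
    angles = rng.uniform(0.0, 2.0 * np.pi, NUM_KEYPOINTS)
    radii = rng.uniform(0.1, 0.4, NUM_KEYPOINTS) if gripping else rng.uniform(0.6, 1.0, NUM_KEYPOINTS)
    return np.stack([radii * np.cos(angles), radii * np.sin(angles)], axis=1)

def features(keypoints):
    """One feature per hand: mean wrist-relative spread of the key points."""
    return np.array([np.linalg.norm(keypoints, axis=1).mean()])

def train_grip_classifier(X, y, lr=0.5, epochs=500):
    """Logistic regression by gradient descent (stand-in for the EfficientNet
    classification model of claim 5). Labels: 0 = gripping, 1 = non-gripping."""
    w, b = np.zeros(X.shape[1]), 0.0
    for _ in range(epochs):
        p = 1.0 / (1.0 + np.exp(-(X @ w + b)))  # predicted P(non-gripping)
        g = p - y                               # gradient of the log loss
        w -= lr * (X.T @ g) / len(y)
        b -= lr * g.mean()
    return w, b

def predict_grip(keypoints, w, b):
    """True if the hand is judged to be gripping (small key point spread)."""
    p = 1.0 / (1.0 + np.exp(-(features(keypoints) @ w + b)))
    return bool(p < 0.5)

# Build a labelled data set of both states (claim 2) and train the classifier.
rng = np.random.default_rng(0)
hands = [synthetic_hand(True, rng) for _ in range(50)] + \
        [synthetic_hand(False, rng) for _ in range(50)]
X = np.array([features(h) for h in hands])
y = np.array([0] * 50 + [1] * 50)
w, b = train_grip_classifier(X, y)

print(predict_grip(synthetic_hand(True, rng), w, b))   # curled hand -> True
print(predict_grip(synthetic_hand(False, rng), w, b))  # open hand -> False
```

In the patented system, `synthetic_hand` would be replaced by OpenPose key point detection on the filtered experimenter hand image, and the logistic regression by a trained EfficientNet model; the judgment interface is the same either way: key point information in, grip/non-grip result out.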
Priority Applications (1)
Application Number | Priority Date | Filing Date | Title |
---|---|---|---|
CN202011003481.3A CN112132020A (en) | 2020-09-22 | 2020-09-22 | Hand grip judgment method and device |
Publications (1)
Publication Number | Publication Date |
---|---|
CN112132020A (en) | 2020-12-25 |
Family
ID=73842453
Family Applications (1)
Application Number | Title | Priority Date | Filing Date |
---|---|---|---|
CN202011003481.3A Pending CN112132020A (en) | 2020-09-22 | 2020-09-22 | Hand grip judgment method and device |
Country Status (1)
Country | Link |
---|---|
CN (1) | CN112132020A (en) |
Citations (6)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
CN108227912A (en) * | 2017-11-30 | 2018-06-29 | 北京市商汤科技开发有限公司 | Apparatus control method and device, electronic equipment, computer storage media |
CN110796051A (en) * | 2019-10-19 | 2020-02-14 | 北京工业大学 | Real-time access behavior detection method and system based on container scene |
CN111325128A (en) * | 2020-02-13 | 2020-06-23 | 上海眼控科技股份有限公司 | Illegal operation detection method and device, computer equipment and storage medium |
CN111414813A (en) * | 2020-03-03 | 2020-07-14 | 南京领行科技股份有限公司 | Dangerous driving behavior identification method, device, equipment and storage medium |
CN111444764A (en) * | 2020-02-21 | 2020-07-24 | 广东工业大学 | Gesture recognition method based on depth residual error network |
CN111563480A (en) * | 2020-06-01 | 2020-08-21 | 北京嘀嘀无限科技发展有限公司 | Conflict behavior detection method and device, computer equipment and storage medium |
Similar Documents
Publication | Publication Date | Title |
---|---|---|
CN109948469B (en) | Automatic inspection robot instrument detection and identification method based on deep learning | |
CN111310645B (en) | Method, device, equipment and storage medium for warning overflow bin of goods accumulation | |
CN110472082B (en) | Data processing method, data processing device, storage medium and electronic equipment | |
CN111242899B (en) | Image-based flaw detection method and computer-readable storage medium | |
CN107808126A (en) | Vehicle retrieval method and device | |
EP3629019A3 (en) | Blood coagulation analyzing method, blood coagulation analyzing apparatus, and non-transitory computer-readable storage medium | |
CN115586256B (en) | Method, device, equipment and storage medium for detecting cleaning grade of experimental equipment | |
CN115619778A (en) | Power equipment defect identification method and system, readable storage medium and equipment | |
CN112991343B (en) | Method, device and equipment for identifying and detecting macular region of fundus image | |
CN113095445B (en) | Target identification method and device | |
CN113554645B (en) | Industrial anomaly detection method and device based on WGAN | |
CN113781483B (en) | Industrial product appearance defect detection method and device | |
CN110111311A (en) | A kind of image quality evaluating method and device | |
CN113780484A (en) | Industrial product defect detection method and device | |
CN112132020A (en) | Hand grip judgment method and device | |
CN115359412B (en) | Hydrochloric acid neutralization experiment scoring method, device, equipment and readable storage medium | |
CN116442786A (en) | Power battery differential pressure abnormality identification method, device, server and storage medium | |
CN115249316A (en) | Industrial defect detection method and device | |
CN110738077A (en) | foreign matter detection method and device | |
CN112132139A (en) | Character recognition method and device | |
JP2001314374A (en) | Corneal endothelial cell measuring apparatus | |
CN116109543A (en) | Method and device for quickly identifying and reading data and computer readable storage medium | |
CN113326749A (en) | Target detection method and device, storage medium and electronic equipment | |
CN112966762A (en) | Wild animal detection method and device, storage medium and electronic equipment | |
KR20210088423A (en) | Occlusal pressure analysis program |
Legal Events
Date | Code | Title | Description |
---|---|---|---|
PB01 | Publication | ||
SE01 | Entry into force of request for substantive examination | |
RJ01 | Rejection of invention patent application after publication | Application publication date: 20201225 |