CN111625157A - Fingertip key point detection method, device, equipment and readable storage medium - Google Patents


Info

Publication number
CN111625157A
CN111625157A
Authority
CN
China
Prior art keywords
fingertip
hand
category
detection
gesture
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Granted
Application number
CN202010431810.8A
Other languages
Chinese (zh)
Other versions
CN111625157B (en)
Inventor
高原
沈辉
张演龙
孙昊
文石磊
丁二锐
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
Beijing Baidu Netcom Science and Technology Co Ltd
Original Assignee
Beijing Baidu Netcom Science and Technology Co Ltd
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by Beijing Baidu Netcom Science and Technology Co Ltd filed Critical Beijing Baidu Netcom Science and Technology Co Ltd
Priority to CN202010431810.8A priority Critical patent/CN111625157B/en
Publication of CN111625157A publication Critical patent/CN111625157A/en
Application granted granted Critical
Publication of CN111625157B publication Critical patent/CN111625157B/en
Active legal-status Critical Current
Anticipated expiration legal-status Critical

Classifications

    • G - PHYSICS
    • G06 - COMPUTING; CALCULATING OR COUNTING
    • G06F - ELECTRIC DIGITAL DATA PROCESSING
    • G06F 3/00 - Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
    • G06F 3/01 - Input arrangements or combined input and output arrangements for interaction between user and computer
    • G06F 3/048 - Interaction techniques based on graphical user interfaces [GUI]
    • G06F 3/0481 - Interaction techniques based on graphical user interfaces [GUI] based on specific properties of the displayed interaction object or a metaphor-based environment, e.g. interaction with desktop elements like windows or icons, or assisted by a cursor's changing behaviour or appearance
    • G - PHYSICS
    • G06 - COMPUTING; CALCULATING OR COUNTING
    • G06F - ELECTRIC DIGITAL DATA PROCESSING
    • G06F 18/00 - Pattern recognition
    • G06F 18/20 - Analysing
    • G06F 18/24 - Classification techniques
    • G - PHYSICS
    • G06 - COMPUTING; CALCULATING OR COUNTING
    • G06F - ELECTRIC DIGITAL DATA PROCESSING
    • G06F 3/00 - Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
    • G06F 3/01 - Input arrangements or combined input and output arrangements for interaction between user and computer
    • G06F 3/048 - Interaction techniques based on graphical user interfaces [GUI]
    • G06F 3/0487 - Interaction techniques based on graphical user interfaces [GUI] using specific features provided by the input device, e.g. functions controlled by the rotation of a mouse with dual sensing arrangements, or of the nature of the input device, e.g. tap gestures based on pressure sensed by a digitiser
    • G06F 3/0488 - Interaction techniques based on graphical user interfaces [GUI] using specific features provided by the input device, e.g. tap gestures based on pressure sensed by a digitiser, using a touch-screen or digitiser, e.g. input of commands through traced gestures
    • G06F 3/04883 - Interaction techniques based on graphical user interfaces [GUI] using a touch-screen or digitiser for inputting data by handwriting, e.g. gesture or text
    • G - PHYSICS
    • G06 - COMPUTING; CALCULATING OR COUNTING
    • G06N - COMPUTING ARRANGEMENTS BASED ON SPECIFIC COMPUTATIONAL MODELS
    • G06N 3/00 - Computing arrangements based on biological models
    • G06N 3/02 - Neural networks
    • G - PHYSICS
    • G06 - COMPUTING; CALCULATING OR COUNTING
    • G06V - IMAGE OR VIDEO RECOGNITION OR UNDERSTANDING
    • G06V 40/00 - Recognition of biometric, human-related or animal-related patterns in image or video data
    • G06V 40/10 - Human or animal bodies, e.g. vehicle occupants or pedestrians; Body parts, e.g. hands
    • G06V 40/107 - Static hand or arm

Landscapes

  • Engineering & Computer Science (AREA)
  • Theoretical Computer Science (AREA)
  • General Engineering & Computer Science (AREA)
  • Physics & Mathematics (AREA)
  • General Physics & Mathematics (AREA)
  • Data Mining & Analysis (AREA)
  • Human Computer Interaction (AREA)
  • Life Sciences & Earth Sciences (AREA)
  • Artificial Intelligence (AREA)
  • Evolutionary Computation (AREA)
  • Computer Vision & Pattern Recognition (AREA)
  • Biophysics (AREA)
  • Bioinformatics & Computational Biology (AREA)
  • Bioinformatics & Cheminformatics (AREA)
  • Multimedia (AREA)
  • Health & Medical Sciences (AREA)
  • Biomedical Technology (AREA)
  • Evolutionary Biology (AREA)
  • Computational Linguistics (AREA)
  • General Health & Medical Sciences (AREA)
  • Molecular Biology (AREA)
  • Computing Systems (AREA)
  • Mathematical Physics (AREA)
  • Software Systems (AREA)
  • User Interface Of Digital Computer (AREA)

Abstract

The application discloses a fingertip key point detection method, device, equipment, and readable storage medium, relating to the technical field of computer vision. The specific implementation scheme is as follows: acquire a hand region in an image to be detected; perform gesture detection on the hand region to obtain the gesture category of the hand region; and, in response to the gesture category being a fingertip trigger category, output the position information of the fingertip key point corresponding to the fingertip trigger category, obtained by detecting hand key points in the hand region. The embodiments can improve the detection precision of fingertip key points.

Description

Fingertip key point detection method, device, equipment and readable storage medium
Technical Field
The application relates to computer technology, in particular to the technical field of computer vision.
Background
The fingertip key point detection task is to locate the position of a fingertip by analyzing a given input image; it is a necessary capability for the fingertip point-and-read function in current education scenarios.
Compared with the detection of other key points, fingertip key point detection is affected by factors such as severe occlusion, a large range of variation, and incomplete hands, so its detection precision is low and it is difficult to put into practical application.
Disclosure of Invention
The embodiment of the application provides a fingertip key point detection method, a fingertip key point detection device, fingertip key point detection equipment and a readable storage medium.
In a first aspect, an embodiment of the present application provides a method for detecting a fingertip key point, including:
acquiring a hand area in an image to be detected;
performing gesture detection on the hand area to obtain the gesture category of the hand area;
and, in response to the gesture category being a fingertip trigger category, outputting position information of the fingertip key point corresponding to the fingertip trigger category, obtained by detecting hand key points in the hand region.
In a second aspect, an embodiment of the present application provides a fingertip keypoint detection device, including:
the acquisition module is used for acquiring a hand area in an image to be detected;
the gesture detection module is used for carrying out gesture detection on the hand area to obtain the gesture category of the hand area;
and the output module is used for outputting, in response to the gesture category being a fingertip trigger category, position information of the fingertip key point corresponding to the fingertip trigger category, obtained by detecting hand key points in the hand region.
In a third aspect, an embodiment of the present application further provides an electronic device, including:
at least one processor; and
a memory communicatively coupled to the at least one processor; wherein
the memory stores instructions executable by the at least one processor to enable the at least one processor to perform a fingertip keypoint detection method as provided in any of the embodiments.
In a fourth aspect, the present application further provides a non-transitory computer-readable storage medium storing computer instructions for causing a computer to execute a fingertip keypoint detection method provided in any embodiment.
According to the technology of the application, the detection precision of the fingertip key points can be improved.
It should be understood that the statements in this section do not necessarily identify key or critical features of the embodiments of the present disclosure, nor do they limit the scope of the present disclosure. Other features of the present disclosure will become apparent from the following description.
Drawings
The drawings are included to provide a better understanding of the present solution and are not intended to limit the present application. Wherein:
FIG. 1a is a flow chart of a first fingertip keypoint detection method in an embodiment of the present application;
FIG. 1b is a diagram illustrating the gestures of the index finger tip trigger category in the embodiment of the present application;
FIG. 1c is a schematic illustration of a human hand keypoint in an embodiment of the present application;
FIG. 2 is a flowchart of a second fingertip key point detection method in the embodiment of the present application;
FIG. 3a is a flow chart of a third fingertip key point detection method in the embodiment of the present application;
FIG. 3b is a schematic pointing diagram of a target finger in the embodiment of the present application;
FIG. 4 is a flowchart of a fourth fingertip key point detection method in the embodiment of the present application;
FIG. 5 is a structural diagram of a fingertip key point detection device in the embodiment of the present application;
FIG. 6 is a block diagram of an electronic device for implementing the fingertip key point detection method according to the embodiment of the present application.
Detailed Description
The following description of the exemplary embodiments of the present application, taken in conjunction with the accompanying drawings, includes various details of the embodiments of the application for the understanding of the same, which are to be considered exemplary only. Accordingly, those of ordinary skill in the art will recognize that various changes and modifications of the embodiments described herein can be made without departing from the scope and spirit of the present application. Also, descriptions of well-known functions and constructions are omitted in the following description for clarity and conciseness.
Fig. 1a is a flowchart of a first fingertip key point detection method in an embodiment of the present application; the embodiment is applicable to cases where fingertip key points are detected from an image. The method is executed by a fingertip key point detection device, which is implemented in software and/or hardware and configured in an electronic device with a certain data-processing capability.
The fingertip key point detection method shown in fig. 1a includes:
and S110, acquiring a hand area in the image to be detected.
When a user points at an electronic screen with a finger to read, the screen is photographed to obtain the image to be detected. A hand appears in the image to be detected and may be partially occluded or incomplete.
Specifically, hand detection is performed on the image to be detected through an object detection algorithm to obtain the hand region. Object detection algorithms include, but are not limited to, R-CNN (Region-based CNN).
And S120, performing gesture detection on the hand region to obtain the gesture category of the hand region.
In this embodiment, gesture detection is performed on a human hand region first, and a gesture type in the human hand region is detected.
In this embodiment, the fingertip triggering category is a gesture category when the fingertip of one finger is used to trigger the electronic screen. The fingertip trigger categories include a thumb fingertip trigger category, an index finger fingertip trigger category, a middle finger fingertip trigger category, a ring finger fingertip trigger category, and a little finger fingertip trigger category. FIG. 1b is a diagram illustrating the gesture of the index finger tip trigger category in the embodiment of the present application.
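The gesture taxonomy described above can be written out as a small set of constants; the label strings below are illustrative assumptions, since the patent fixes only the five fingertip trigger gestures plus the non-fingertip and background classes.

```python
# Five fingertip trigger categories, one per finger (labels illustrative).
FINGERTIP_TRIGGER_CATEGORIES = {
    "thumb_tip", "index_tip", "middle_tip", "ring_tip", "pinky_tip",
}

# Everything else the classifier may report.
OTHER_CATEGORIES = {"non_fingertip", "background"}

def is_fingertip_trigger(category):
    """True when `category` is one of the five fingertip trigger gestures."""
    return category in FINGERTIP_TRIGGER_CATEGORIES
```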
And S130, outputting the position information of the fingertip key points corresponding to the fingertip triggering type, which is obtained by detecting the human hand key points in the human hand region, in response to the fact that the gesture type is the fingertip triggering type.
If the gesture category is a fingertip trigger category, the user is indeed triggering with the fingertip of the finger corresponding to that category and has made a fingertip trigger gesture. The probability that the fingertip key point exists in the hand region is therefore high, and the precision of the fingertip key point detection result is correspondingly high.
In an optional implementation, in response to the gesture category being a fingertip trigger category, hand key point detection is performed on the hand region to obtain the position information of the fingertip key point corresponding to the fingertip trigger category. FIG. 1c is a schematic diagram of the key points of a human hand in an embodiment of the present application. Different fingertip trigger categories correspond to different fingertip key points: the thumb fingertip trigger category corresponds to thumb fingertip key point 4, the index finger category to index fingertip key point 8, the middle finger category to middle fingertip key point 12, the ring finger category to ring fingertip key point 16, and the little finger category to little fingertip key point 20.
Specifically, a hand key point detection model (for example, a Multi-context model or a Learning Feature model) is used to detect the hand key points in the hand region and obtain their position information. The model outputs at least the position information of the fingertip key point corresponding to the fingertip trigger category. Referring to FIGS. 1b and 1c, if the gesture is the index finger fingertip trigger category, the model outputs at least the position information of fingertip key point 8. Based on this, the position information of the fingertip key point corresponding to the fingertip trigger category needs to be screened out from the position information of the hand key points.
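Under the 21-point hand numbering of FIG. 1c, the category-to-key-point correspondence and the screening step can be sketched as follows; the category names and the dictionary-based model output are illustrative assumptions, not the patent's interfaces.

```python
# Fingertip key point index per trigger category (numbering of Fig. 1c:
# thumb tip = 4, index = 8, middle = 12, ring = 16, little = 20).
TIP_INDEX = {
    "thumb_tip": 4,
    "index_tip": 8,
    "middle_tip": 12,
    "ring_tip": 16,
    "pinky_tip": 20,
}

def select_fingertip(keypoints, category):
    """Screen out the fingertip position matching `category`.

    `keypoints` maps keypoint index -> (x, y), as a hand key point
    detection model might return it.
    """
    return keypoints[TIP_INDEX[category]]

# Dummy 21-point model output: keypoint i sits at (i, i).
kps = {i: (float(i), float(i)) for i in range(21)}
tip_xy = select_fingertip(kps, "index_tip")
```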
In another alternative embodiment, before responding to the gesture category being the fingertip trigger category, the position information of the human hand key point is obtained by detecting the human hand key point in the human hand area. Based on the above, in response to the gesture type being the fingertip trigger type, the position information of the fingertip key point corresponding to the fingertip trigger type is screened from the position information of the human hand key point.
It should be noted that, in the two alternative embodiments, the position information of the hand key points may include the position of only one key point, in which case that position is used directly as the position information of the fingertip key point.
In this embodiment, gesture detection is performed on the hand region first. When the gesture category is a fingertip trigger category, the user is indeed triggering with the fingertip of the corresponding finger and has made a fingertip trigger gesture, so the fingertip key point corresponding to that gesture is very likely present in the hand region; under this condition, the precision of the fingertip key point detection result is high. Moreover, because fingertip key points correspond to fingertip trigger categories, the detection precision of fingertip key points is further improved.
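The three-step flow of Fig. 1a (S110, S120, S130) can be sketched end to end; `detect_hand`, `classify_gesture`, and `detect_keypoints` are assumed stand-ins for the detection, classification, and key point models, and the category-to-index map is illustrative.

```python
# Illustrative mapping: trigger category -> fingertip keypoint index.
TRIGGER_TIPS = {"index_tip": 8}

def fingertip_pipeline(image, detect_hand, classify_gesture, detect_keypoints):
    hand = detect_hand(image)           # S110: acquire the hand region
    if hand is None:
        return None
    category = classify_gesture(hand)   # S120: gesture category of the region
    if category not in TRIGGER_TIPS:    # background / non-fingertip: stop
        return None
    keypoints = detect_keypoints(hand)  # S130: hand key point detection
    return keypoints[TRIGGER_TIPS[category]]

# Stub models standing in for the trained networks.
result = fingertip_pipeline(
    "image",
    detect_hand=lambda img: "hand-crop",
    classify_gesture=lambda h: "index_tip",
    detect_keypoints=lambda h: {8: (120, 64)},
)
```

Note that key point detection only runs once the gesture check passes, which is the ordering the first embodiment describes.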
According to the embodiment of the present application, fig. 2 is a flowchart of a second method for detecting fingertip keypoints in the embodiment of the present application, and the embodiment of the present application is optimized based on the technical solutions of the above embodiments.
Optionally, the operation "performing gesture detection on the hand region to obtain the gesture category of the hand region" is refined into any one of the following: (1) performing gesture detection on the hand region through a gesture classification model to obtain that the hand region is a background category; (2) performing gesture detection on the hand region through a gesture classification model to obtain that the hand region is a non-fingertip trigger category; (3) performing gesture detection on the hand region through a gesture classification model to obtain that the hand region belongs to a fingertip trigger category; (4) performing gesture detection on the hand region through a gesture classification model to obtain that the hand region belongs to a fingertip trigger category, detecting hand key points in the hand region through a hand key point detection model to obtain the position information of the hand key points, and determining that the hand region belongs to the fingertip trigger category in response to the detection precision of the fingertip trigger category and the detection precision of the hand key point position information meeting a set requirement.
Optionally, the operation "in response to the gesture category being a fingertip trigger category, outputting position information of the fingertip key point corresponding to the fingertip trigger category, obtained by detecting hand key points in the hand region" is refined into: in response to the gesture category being a fingertip trigger category, acquiring the position information of the hand key points obtained by detecting hand key points in the hand region through the hand key point detection model; and screening out, according to the positional relationship among the hand key points, the position information of the fingertip key point corresponding to the fingertip trigger category.
The fingertip key point detection method shown in fig. 2 includes:
s210, acquiring a human hand area in the image to be detected. S221, S222, S223, or S224 is continuously performed.
S221, performing gesture detection on the hand region through the gesture classification model to obtain that the hand region is a background category, and jumping to S230.
S222, performing gesture detection on the hand region through the gesture classification model to obtain that the hand region is in a non-fingertip triggering type, and jumping to S230.
And S223, performing gesture detection on the hand region through the gesture classification model to obtain that the hand region belongs to the fingertip trigger category. Execution continues with S250.
S224, performing gesture detection on the hand region through the gesture classification model to obtain that the hand region belongs to a fingertip trigger category; and detecting the hand key points of the hand region through the hand key point detection model to obtain the position information of the hand key points. Execution continues with S240.
And S230, ending the operation.
And S240, determining that the human hand area belongs to the fingertip triggering category in response to the fact that the detection precision of the fingertip triggering category and the detection precision of the position information of the human hand key point meet set requirements. Execution continues with S250.
And S250, responding to the fact that the gesture type is the fingertip triggering type, and acquiring position information of the hand key points, which is obtained by detecting the hand key points of the hand region through the hand key point detection model.
And S260, according to the position relation among the key points of the human hand, screening the position information of the fingertip key points corresponding to the fingertip triggering types from the position information of the key points of the human hand.
Specifically, the gesture classification model may be a classification network (built, for example, with Keras) used to perform gesture detection on the hand region and obtain its gesture category. Illustratively, the gesture categories output by the model include: fingertip trigger categories, non-fingertip trigger categories, and a background category. The non-fingertip trigger categories include various traffic gesture categories, referee gesture categories, and various unknown gesture categories. In some cases, the detected hand region may contain only background and no hand at all, which is why the gesture categories also include a background category.
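Over these categories, the classifier's decision amounts to picking the highest-scoring label; the label names and score values below are illustrative.

```python
def classify(scores):
    """Pick the gesture category with the highest score.

    `scores` maps label -> probability; returns (label, confidence).
    """
    label = max(scores, key=scores.get)
    return label, scores[label]

# Illustrative softmax-like output over the three kinds of category.
label, conf = classify({
    "index_tip": 0.91,      # a fingertip trigger category
    "non_fingertip": 0.06,  # e.g. traffic / referee / unknown gestures
    "background": 0.03,     # region contains no hand
})
```

The confidence returned alongside the label is what the precision check at S240 consumes.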
At S221 and S222, if the human hand region belongs to the background category or the non-fingertip trigger category, the operation is ended.
At S223, if the gesture classification model detects that the human hand region belongs to the fingertip trigger category, it may be directly determined that the detection result of the gesture classification model is correct, and S250 is continuously performed.
At S224, if the gesture classification model detects that the hand region belongs to the fingertip trigger category, it is determined that the accuracy of the detection result of the gesture classification model is to be determined, and auxiliary determination needs to be performed according to the detection result of the hand key point detection model, so as to improve the accuracy of gesture determination. Specifically, the hand key point detection model is used for detecting the hand key points in the hand area to obtain the position information of the hand key points.
Optionally, the hand key points include a fingertip key point and at least one key point located on the same finger as the fingertip key point; for convenience of description, the finger where the fingertip key point is located is referred to as the target finger. As shown in FIGS. 1b and 1c, the hand key points include fingertip key point 8 and key points 7, 6, and 5 located on the index finger. If multiple key points on the same target finger are detected, the target finger is in an extended state, which conforms to a fingertip trigger gesture and effectively assists in judging whether the gesture belongs to a fingertip trigger category.
Optionally, the hand key points further include at least one key point on a finger other than the target finger that is required to form a gesture of the fingertip trigger category. Besides the target finger, forming a fingertip trigger gesture requires the cooperation of other fingers. As shown in FIG. 1b, forming the index finger fingertip trigger category requires the index finger (target finger) and the thumb to cooperate; therefore, in addition to fingertip key point 8 and key points 7, 6, and 5 on the index finger, the hand key points include key points 4 and 3 on the thumb. Key points on other fingers thus further improve the accuracy of gesture judgment.
In this embodiment, detecting that the hand region is a background category effectively filters out backgrounds falsely detected as hand regions. In practical applications, only specific gestures require the fingertip position to be detected, and the fingertip positions of unconscious gestures should be filtered out; this is achieved by detecting that the hand region is a non-fingertip trigger category.
At S240, the detection precision may be represented by a confidence: the higher the confidence, the higher the detection precision. If the position information of multiple hand key points is detected, the average confidence over those key points is used as the detection precision of the hand key point position information.
Optionally, S240 is preceded by either of two optional embodiments: 1) calculating a comprehensive detection precision from the confidence of the fingertip trigger category and the confidence of the hand key point position information, and determining that the set requirement is met in response to the comprehensive detection precision being greater than a set threshold; or 2) determining that the set requirement is met in response to both the confidence of the fingertip trigger category and the confidence of the hand key point position information being greater than a set threshold.
Specifically, in the first optional implementation, a mathematical operation such as addition or multiplication is applied to the two confidences to obtain the comprehensive detection precision; if it exceeds a set threshold, for example 0.8, the two detection precisions are judged to meet the set requirement. In the second optional implementation, the two confidences are each compared with a set threshold, for example 0.8; if both exceed it, the detection precisions are determined to meet the set requirement.
Both optional implementations can accurately evaluate the detection precision of the fingertip trigger category and of the hand key point position information, thereby improving the accuracy of gesture judgment.
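The two acceptance rules can be sketched directly, using the text's example threshold of 0.8 and multiplication as the example combining operation (both are the embodiment's examples, not values fixed by the patent):

```python
def keypoint_precision(confidences):
    """Average confidence over multiple key points (the S240 precision)."""
    return sum(confidences) / len(confidences)

def accept_combined(gesture_conf, keypoint_conf, threshold=0.8):
    """Scheme 1: combine the two confidences (here by multiplication),
    then compare the comprehensive precision with the threshold."""
    return gesture_conf * keypoint_conf > threshold

def accept_separate(gesture_conf, keypoint_conf, threshold=0.8):
    """Scheme 2: each confidence must clear the threshold on its own."""
    return gesture_conf > threshold and keypoint_conf > threshold

kp_conf = keypoint_precision([0.95, 0.90, 0.85])
```

Note that the combined scheme is stricter under multiplication: two confidences of 0.85 and 0.9 pass the separate check but fail the combined one.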
At S250, if hand key point detection has not yet been performed, it is performed on the hand region through the hand key point detection model to obtain the position information of the hand key points. If it has already been performed (i.e., S224 was executed), the position information of the detected hand key points can be acquired directly.
At S260, fingertip key points are generally located at the ends of fingers, so the hand key point at the edge position is taken as the fingertip key point; in this way, when multiple hand key points exist, the fingertip key point is accurately screened out through the positional relationship.
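One illustrative way to realize this positional screening is to take, among the candidate key points, the one farthest from the wrist point (index 0 in Fig. 1c), since the fingertip sits at the end of the finger; the distance criterion is an assumption here, not the patent's stated rule.

```python
import math

def farthest_from(anchor, candidates):
    """Return the index of the candidate farthest from `anchor`.

    `anchor` is (x, y); `candidates` maps keypoint index -> (x, y).
    """
    return max(candidates, key=lambda i: math.dist(anchor, candidates[i]))

# Index-finger chain of Fig. 1c (base 5 -> tip 8) with a wrist at origin.
wrist = (0.0, 0.0)
index_finger = {5: (1.0, 1.0), 6: (2.0, 2.0), 7: (3.0, 3.0), 8: (4.0, 4.0)}
tip = farthest_from(wrist, index_finger)
```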
According to the embodiment of the present application, fig. 3a is a flowchart of a third method for detecting fingertip keypoints in the embodiment of the present application, and the embodiment of the present application is optimized based on the technical solutions of the above embodiments.
Optionally, after "in response to the gesture category being the fingertip trigger category, outputting position information of a fingertip key point corresponding to the fingertip trigger category, which is obtained by detecting a human hand key point in a human hand region", the operation "determining the pointing direction of the target finger according to the position information of the human hand key point" is added; and determining the pointed position of the fingertip key point according to the pointing direction of the target finger and the position information of the fingertip key point.
The fingertip key point detection method shown in fig. 3a includes:
s310, acquiring a human hand area in the image to be detected. Execution continues with S320, S321, S322, or S323.
S320, performing gesture detection on the hand region through the gesture classification model to obtain that the hand region is a background category, and jumping to S370.
S321, performing gesture detection on the hand region through the gesture classification model to obtain that the hand region is in a non-fingertip triggering type, and jumping to S370.
S322, performing gesture detection on the hand region through the gesture classification model to obtain that the hand region belongs to the fingertip triggering category, and jumping to S330.
S323, performing gesture detection on the hand region through a gesture classification model to obtain that the hand region belongs to a fingertip trigger category; and detecting the hand key points of the hand region through the hand key point detection model to obtain the position information of the hand key points. Execution continues with S324.
And S324, responding to the fact that the detection precision of the fingertip trigger category and the detection precision of the position information of the key point of the human hand meet set requirements, determining that the human hand area belongs to the fingertip trigger category, and continuing to execute S331.
S330, in response to the gesture category being a fingertip trigger category, detecting hand key points in the hand region through the hand key point detection model to obtain the position information of the hand key points. Execution continues with S340.
If the gesture classification model detects that the hand region belongs to the fingertip triggering category, the detection result of the gesture classification model can be directly determined to be correct; and detecting the hand key points of the hand region through the hand key point detection model to obtain the position information of the hand key points.
And S331, responding to the fact that the gesture type is the fingertip trigger type, and acquiring position information of the hand key point, which is obtained by detecting the hand key point in the hand area through the hand key point detection model. Execution continues with S340.
If the gesture classification model and the human hand key point detection model jointly determine the gesture category, the position information of the human hand key points has already been obtained by the human hand key point detection model and can be acquired directly.
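The joint determination described in S323-S324 can be sketched as follows. This is a minimal sketch under assumed interfaces: `classify_gesture` stands in for the gesture classification model (returning a category label and a confidence) and `detect_keypoints` stands in for the human hand key point detection model (returning keypoint coordinates and a confidence); neither these interfaces nor the 0.8 threshold appears in the patent.

```python
def determine_category(hand_region, classify_gesture, detect_keypoints,
                       threshold=0.8):
    """Jointly confirm a fingertip-trigger category with both models (S323-S324)."""
    category, cls_conf = classify_gesture(hand_region)
    # Background or non-fingertip-trigger categories end processing (S370).
    if category in ("background", "non-fingertip"):
        return category, None
    keypoints, kp_conf = detect_keypoints(hand_region)
    # Both detection precisions must meet the set requirement (S324).
    if cls_conf > threshold and kp_conf > threshold:
        return category, keypoints
    return "background", None
```

A mock usage: with a classifier reporting `("index_fingertip", 0.9)` and a keypoint detector reporting confidence 0.95, the region is confirmed as the index-finger fingertip trigger category and the keypoints are passed on to S331.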
S340, according to the position relation among the key points of the human hand, screening the position information of the fingertip key points corresponding to the fingertip triggering types from the position information of the key points of the human hand.
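The screening in S340 can be sketched as an index lookup, assuming a 21-keypoint hand layout in which the index-finger keypoints are 5-8 as in Fig. 3b; the fingertip indices for the other fingers (4, 12, 16, 20) are an assumption based on common hand keypoint layouts and are not stated in the patent.

```python
# Assumed fingertip indices in a 21-keypoint hand layout (index finger = 8,
# matching keypoints 5-8 of Fig. 3b; the other entries are hypothetical).
FINGERTIP_INDEX = {
    "thumb": 4, "index": 8, "middle": 12, "ring": 16, "little": 20,
}

def screen_fingertip(keypoints, trigger_finger):
    """Screen out the fingertip keypoint position for the triggering finger."""
    return keypoints[FINGERTIP_INDEX[trigger_finger]]
```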
And S350, determining the pointing direction of the target finger according to the position information of the key point of the human hand.
In an application scenario of performing fingertip point reading on an electronic screen, the position pointed to by the fingertip key point needs to be determined so as to identify the content pointed to on the electronic screen. A fingertip key point may point in any direction; according to the user's reading habit, the pointing direction of the fingertip key point should be consistent with that of the finger. Based on this, the pointing direction of the target finger where the fingertip key point is located is determined first.
Fig. 3b is a schematic pointing diagram of the target finger in the embodiment of the present application. As shown in fig. 3b, the target finger is the index finger, and the human hand key points comprise fingertip key point 8 and key points 7, 6, and 5 located on the target finger. The human hand key points are distributed at different positions on the target finger, thereby indicating the bending degree of the target finger and reflecting its pointing direction. Based on this, the human hand key points are fitted according to their position information to obtain a line; and the pointing direction of the target finger is determined according to the extending direction of the line on one side of the fingertip key point.
The fitting can adopt a straight-line fitting method or a curve fitting method to obtain a straight line or a curve. As shown in fig. 3b, a straight-line fitting method is adopted to fit fingertip key point 8 and key points 7, 6, and 5; specifically, a straight-line equation y = ax + b is solved according to the coordinates of any two of fingertip key point 8, key point 7, key point 6, and key point 5.
The line has one side and the other side relative to the fingertip key point. As shown in fig. 3b, the extending direction of the straight line on one side of fingertip key point 8 is determined according to the slope of the straight-line equation (the extending direction is indicated by an arrow); optionally, for a curve, the extending direction is determined according to the slope of the tangent line of the curve equation at the fingertip key point. The extending direction is then determined as the pointing direction of the target finger.
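The straight-line fitting and extension-direction steps can be sketched as follows. Using `np.polyfit` is our choice of fitting routine (the patent names no library), and the sketch assumes the finger is not vertical in image coordinates and that the keypoints are ordered base to tip (e.g., 5, 6, 7, 8).

```python
import numpy as np

def finger_direction(finger_points):
    """Return a unit vector along the fitted line, oriented toward the fingertip.

    `finger_points` are (x, y) keypoints ordered base -> tip, e.g. 5, 6, 7, 8.
    Assumes the finger is not vertical (finite slope for y = a*x + b).
    """
    pts = np.asarray(finger_points, dtype=float)
    xs, ys = pts[:, 0], pts[:, 1]
    a, b = np.polyfit(xs, ys, 1)        # straight-line equation y = a*x + b
    # Orient along the fingertip side: take the x step from base -> tip and
    # let the y step follow the fitted slope a (fall back to +x if degenerate).
    dx = np.sign(xs[-1] - xs[0]) or 1.0
    direction = np.array([dx, a * dx])
    return direction / np.linalg.norm(direction)
```

For the four collinear keypoints of Fig. 3b lying on y = x, this yields the unit vector (√2/2, √2/2), i.e., the arrow direction past keypoint 8.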
This embodiment adopts a key point fitting method to obtain the pointing direction of the target finger, which is ingenious in conception and simple in calculation.
And S360, determining the pointed position of the fingertip key point according to the pointing direction of the target finger and the position information of the fingertip key point.
Specifically, a set length is extended from the fingertip key point along the pointing direction of the target finger to reach the position pointed to by the fingertip key point. The set length can be, for example, 0.2 mm or 0.5 mm. As shown in fig. 3b, the index finger points as illustrated, and the fingertip key point points to the position marked "null".
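S360 can be sketched as stepping a set length from the fingertip keypoint along the unit pointing direction. The pixel-coordinate framing is an assumption for illustration; the patent gives the set length in millimetres on the physical screen, so a real system would convert between the two.

```python
import numpy as np

def pointed_position(fingertip_xy, direction, set_length):
    """Extend `set_length` from the fingertip along the target finger's direction."""
    tip = np.asarray(fingertip_xy, dtype=float)
    d = np.asarray(direction, dtype=float)
    return tip + set_length * d / np.linalg.norm(d)  # normalize, then step
```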
And S370, ending the operation.
In this embodiment, the pointing direction of the target finger is obtained from the fingertip key point and at least one other key point on the target finger, providing a completely new direction determination method. Moreover, the position pointed to by the fingertip key point can be accurately obtained from the pointing direction of the target finger, which improves the accuracy of fingertip point reading.
According to the embodiment of the present application, fig. 4 is a flowchart of a fourth method for detecting fingertip key points provided in the embodiment of the present application, and the embodiment of the present application optimizes a process of determining a fingertip trigger category on the basis of the technical solutions of the above embodiments.
The fingertip key point detection method shown in fig. 4 includes:
and S410, providing options of a plurality of fingertip trigger categories for the user, wherein the fingertip trigger categories comprise a thumb fingertip trigger category, an index finger fingertip trigger category, a middle finger fingertip trigger category, a ring finger fingertip trigger category and a little finger fingertip trigger category.
Specifically, the options for the five fingertip trigger categories are displayed on the electronic screen so that the user can select one according to his or her point-reading habit.
The user may select the option of one of the fingertip triggered categories by triggering the option, such as the index finger fingertip triggered category.
And S420, acquiring the option of the fingertip trigger category selected by the user, and determining the fingertip trigger category according to the option of the fingertip trigger category.
If the user selects the option of the index finger fingertip trigger category, the fingertip trigger category is determined to be the index finger fingertip trigger category; if the user selects the option of the little finger fingertip trigger category, the fingertip trigger category is determined to be the little finger fingertip trigger category. The other fingertip trigger categories follow by analogy and are not described in detail.
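The option-to-category mapping of S410-S420 can be sketched as below; the option labels and the returned category strings are illustrative identifiers, not ones named in the patent.

```python
# Hypothetical option labels for the five fingertip trigger categories (S410).
OPTIONS = ["thumb", "index", "middle", "ring", "little"]

def choose_trigger_category(selected_option):
    """Determine the fingertip trigger category from the user's selection (S420)."""
    if selected_option not in OPTIONS:
        raise ValueError(f"unknown option: {selected_option}")
    return f"{selected_option}_fingertip_trigger"
```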
And S430, acquiring a human hand area in the image to be detected.
S440, performing gesture detection on the hand region to obtain the gesture category of the hand region.
S450, responding to the fact that the gesture type is the fingertip triggering type, and outputting position information of the fingertip key points corresponding to the fingertip triggering type, wherein the position information is obtained by detecting the human hand key points in the human hand area.
In the embodiment of the application, the options of a plurality of fingertip triggering categories are provided for the user, so that the user can select a proper fingertip triggering gesture according to the reading habit of the user, and the intelligent degree and the use experience of the user are improved.
Fig. 5 is a structural diagram of a fingertip keypoint detection device in an embodiment of the present application, and the embodiment of the present application is applied to a case where a fingertip keypoint is detected from an image, and the device is implemented by software and/or hardware and is specifically configured in an electronic device having a certain data calculation capability.
Fig. 5 shows a fingertip keypoint detection apparatus 500, which includes: an acquisition module 501, a gesture detection module 502, and an output module 503; wherein:
an obtaining module 501, configured to obtain a human hand region in an image to be detected;
the gesture detection module 502 is used for performing gesture detection on the hand region to obtain the gesture category of the hand region;
and the output module 503 is configured to output, in response to the gesture category being the fingertip trigger category, position information of a fingertip keypoint corresponding to the fingertip trigger category, which is obtained by performing human hand keypoint detection on a human hand region.
In the embodiment of the application, firstly, gesture detection is performed on a hand region, when the gesture type is a fingertip trigger type, it is indicated that a user really uses fingertips of fingers corresponding to the fingertip trigger type to perform triggering, and a fingertip trigger gesture is made, so that the probability that fingertip key points corresponding to the fingertip trigger gesture exist in the hand region is high, and under the condition, the precision of detection results of the fingertip key points is high; moreover, the fingertip key points and the fingertip triggering types have a corresponding relation, and the detection precision of the fingertip key points is further improved.
Further, the gesture detection module is specifically configured to perform any one of the following operations:
performing gesture detection on the hand region through a gesture classification model to obtain that the hand region is a background category;
performing gesture detection on the hand region through a gesture classification model to obtain that the hand region is a non-fingertip triggering type;
performing gesture detection on the hand region through a gesture classification model to obtain that the hand region belongs to a fingertip trigger category;
performing gesture detection on the hand region through a gesture classification model to obtain that the hand region belongs to a fingertip trigger category; detecting the hand key points of the hand region through the hand key point detection model to obtain the position information of the hand key points; determining that the hand region belongs to the fingertip triggering category in response to the fact that the detection precision of the fingertip triggering category and the detection precision of the position information of the hand key point meet set requirements; wherein, the key points of the human hand comprise the key points of the finger tip.
Further, the output module 503 is specifically configured to: responding to the gesture type as a fingertip triggering type, and acquiring position information of a human hand key point obtained by detecting the human hand key point on a human hand area through a human hand key point detection model; and according to the position relation among the key points of the human hand, screening the position information of the fingertip key points corresponding to the fingertip triggering types from the position information of the key points of the human hand.
Furthermore, the device also comprises an accuracy calculation module and a first setting requirement determination module, wherein the accuracy calculation module is used for calculating the comprehensive detection accuracy according to the confidence coefficient of the fingertip trigger category and the confidence coefficient of the position information of the human hand key point; the first setting requirement determining module is used for responding to the fact that the comprehensive detection precision is larger than a set threshold value, and determining that the detection precision of the fingertip trigger category and the detection precision of the position information of the human hand key point meet the setting requirement.
Further, the device further comprises a second setting requirement determining module, which is used for determining that the detection precision of the fingertip triggering category and the detection precision of the position information of the human hand key point meet the setting requirement in response to the fact that the confidence coefficient of the fingertip triggering category and the confidence coefficient of the position information of the human hand key point are both greater than the setting threshold.
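The two requirement checks above can be sketched as follows. Taking the product of the two confidences as the comprehensive detection precision is an assumption — the patent only says the comprehensive precision is calculated from the confidence of the fingertip trigger category and the confidence of the keypoint positions — and the thresholds are illustrative.

```python
def meets_requirement_combined(cls_conf, kp_conf, threshold=0.6):
    """First variant: one comprehensive precision (here, a product) vs. a threshold."""
    return cls_conf * kp_conf > threshold

def meets_requirement_separate(cls_conf, kp_conf, threshold=0.8):
    """Second variant: each confidence must clear the set threshold on its own."""
    return cls_conf > threshold and kp_conf > threshold
```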
Further, the human hand key points also include at least one key point located on the same target finger as the fingertip key points.
Further, the hand key points also include at least one key point on a finger other than the target finger required to form a gesture of the fingertip trigger category.
Furthermore, the device also comprises a pointing direction determining module, which is used for determining the pointing direction of the target finger according to the position information of the key point of the human hand; the device also comprises a position determining module which is used for determining the pointed position of the fingertip key point according to the pointing direction of the target finger and the position information of the fingertip key point.
Further, the pointing direction determining module is specifically configured to: fit the human hand key points according to the position information of the human hand key points to obtain a line; and determine the pointing direction of the target finger according to the extending direction of the line on one side of the fingertip key point.
Further, the device also comprises a fingertip trigger category determining module for providing a plurality of fingertip trigger category options for the user, wherein the fingertip trigger categories comprise a thumb fingertip trigger category, an index finger fingertip trigger category, a middle finger fingertip trigger category, a ring finger fingertip trigger category and a little finger fingertip trigger category; and acquiring a fingertip trigger category option selected by a user, and determining a fingertip trigger category according to the fingertip trigger category option.
The fingertip key point detection device can execute the fingertip key point detection method provided by any embodiment of the application, and has the corresponding functional module and the beneficial effects of executing the fingertip key point detection method.
According to an embodiment of the present application, an electronic device and a readable storage medium are also provided.
Fig. 6 is a block diagram of an electronic device implementing the fingertip keypoint detection method according to the embodiment of the present application. Electronic devices are intended to represent various forms of digital computers, such as laptops, desktops, workstations, personal digital assistants, servers, blade servers, mainframes, and other appropriate computers. The electronic device may also represent various forms of mobile devices, such as personal digital processing, cellular phones, smart phones, wearable devices, and other similar computing devices. The components shown herein, their connections and relationships, and their functions, are meant to be examples only, and are not meant to limit implementations of the present application that are described and/or claimed herein.
As shown in fig. 6, the electronic apparatus includes: one or more processors 601, a memory 602, and interfaces for connecting the various components, including a high-speed interface and a low-speed interface. The various components are interconnected using different buses and may be mounted on a common motherboard or in other manners as desired. The processor may process instructions for execution within the electronic device, including instructions stored in or on the memory to display graphical information of a GUI on an external input/output apparatus (such as a display device coupled to the interface). In other embodiments, multiple processors and/or multiple buses may be used, along with multiple memories, as desired. Also, multiple electronic devices may be connected, with each device providing portions of the necessary operations (e.g., as a server array, a group of blade servers, or a multi-processor system). In fig. 6, one processor 601 is taken as an example.
The memory 602 is a non-transitory computer readable storage medium as provided herein. The memory stores instructions executable by the at least one processor to cause the at least one processor to perform the fingertip keypoint detection method provided herein. The non-transitory computer readable storage medium of the present application stores computer instructions for causing a computer to perform the fingertip keypoint detection method provided by the present application.
The memory 602, as a non-transitory computer readable storage medium, may be used to store non-transitory software programs, non-transitory computer executable programs, and modules, such as program instructions/modules corresponding to the fingertip keypoint detection method in the embodiment of the present application (for example, the acquisition module 501, the gesture detection module 502, and the output module 503 shown in fig. 5). The processor 601 executes various functional applications of the server and performs data processing by running the non-transitory software programs, instructions, and modules stored in the memory 602, thereby implementing the fingertip keypoint detection method in the above-described method embodiments.
The memory 602 may include a storage program area and a storage data area, wherein the storage program area may store an operating system, an application program required for at least one function; the storage data area may store data created by use of an electronic device that implements the fingertip keypoint detection method, and the like. Further, the memory 602 may include high speed random access memory, and may also include non-transitory memory, such as at least one magnetic disk storage device, flash memory device, or other non-transitory solid state storage device. In some embodiments, the memory 602 optionally includes memory remotely located from the processor 601, and these remote memories may be connected over a network to an electronic device that performs the fingertip keypoint detection method. Examples of such networks include, but are not limited to, the internet, intranets, local area networks, mobile communication networks, and combinations thereof.
The electronic device performing the fingertip keypoint detection method may further include: an input device 603 and an output device 604. The processor 601, the memory 602, the input device 603 and the output device 604 may be connected by a bus or other means, and fig. 6 illustrates the connection by a bus as an example.
The input device 603 may receive input numeric or character information and generate key signal inputs related to user settings and function control of an electronic apparatus performing the fingertip key point detection method, such as a touch screen, a keypad, a mouse, a track pad, a touch pad, a pointing stick, one or more mouse buttons, a track ball, a joystick, or the like. The output devices 604 may include a display device, auxiliary lighting devices (e.g., LEDs), and tactile feedback devices (e.g., vibrating motors), among others. The display device may include, but is not limited to, a Liquid Crystal Display (LCD), a Light Emitting Diode (LED) display, and a plasma display. In some implementations, the display device can be a touch screen.
Various implementations of the systems and techniques described here can be realized in digital electronic circuitry, integrated circuitry, ASICs (application-specific integrated circuits), computer hardware, firmware, software, and/or combinations thereof. These various embodiments may include: implementation in one or more computer programs that are executable and/or interpretable on a programmable system including at least one programmable processor, which may be special or general purpose, receiving data and instructions from, and transmitting data and instructions to, a storage system, at least one input device, and at least one output device.
These computer programs (also known as programs, software applications, or code) include machine instructions for a programmable processor, and may be implemented using high-level procedural and/or object-oriented programming languages, and/or assembly/machine languages. As used herein, the terms "machine-readable medium" and "computer-readable medium" refer to any computer program product, apparatus, and/or device (e.g., magnetic discs, optical disks, memory, Programmable Logic Devices (PLDs)) used to provide machine instructions and/or data to a programmable processor, including a machine-readable medium that receives machine instructions as a machine-readable signal. The term "machine-readable signal" refers to any signal used to provide machine instructions and/or data to a programmable processor.
To provide for interaction with a user, the systems and techniques described here can be implemented on a computer having: a display device (e.g., a CRT (cathode ray tube) or LCD (liquid crystal display) monitor) for displaying information to a user; and a keyboard and a pointing device (e.g., a mouse or a trackball) by which a user can provide input to the computer. Other kinds of devices may also be used to provide for interaction with a user; for example, feedback provided to the user can be any form of sensory feedback (e.g., visual feedback, auditory feedback, or tactile feedback); and input from the user may be received in any form, including acoustic, speech, or tactile input.
The systems and techniques described here can be implemented in a computing system that includes a back-end component (e.g., as a data server), or that includes a middleware component (e.g., an application server), or that includes a front-end component (e.g., a user computer having a graphical user interface or a web browser through which a user can interact with an implementation of the systems and techniques described here), or any combination of such back-end, middleware, or front-end components. The components of the system can be interconnected by any form or medium of digital data communication (e.g., a communication network). Examples of communication networks include: local Area Networks (LANs), Wide Area Networks (WANs), the internet, and blockchain networks.
The computer system may include clients and servers. A client and server are generally remote from each other and typically interact through a communication network. The relationship of client and server arises by virtue of computer programs running on the respective computers and having a client-server relationship to each other.
It should be understood that various forms of the flows shown above may be used, with steps reordered, added, or deleted. For example, the steps described in the present application may be executed in parallel, sequentially, or in different orders, and the present application is not limited thereto as long as the desired results of the technical solutions disclosed in the present application can be achieved.
The above-described embodiments should not be construed as limiting the scope of the present application. It should be understood by those skilled in the art that various modifications, combinations, sub-combinations and substitutions may be made in accordance with design requirements and other factors. Any modification, equivalent replacement, and improvement made within the spirit and principle of the present application shall be included in the protection scope of the present application.

Claims (16)

1. A fingertip key point detection method is characterized by comprising the following steps:
acquiring a hand area in an image to be detected;
performing gesture detection on the hand area to obtain the gesture category of the hand area;
and outputting position information of fingertip key points corresponding to the fingertip trigger type, which is obtained by detecting the human hand key points in the human hand area, in response to the gesture type being the fingertip trigger type.
2. The method according to claim 1, wherein the performing gesture detection on the human hand region to obtain the gesture category of the human hand region comprises any one of the following operations:
performing gesture detection on the hand region through a gesture classification model to obtain that the hand region is a background category;
performing gesture detection on the hand region through a gesture classification model to obtain that the hand region is a non-fingertip triggering type;
performing gesture detection on the hand region through a gesture classification model to obtain that the hand region belongs to a fingertip trigger category;
performing gesture detection on the hand region through a gesture classification model to obtain that the hand region belongs to a fingertip trigger category; detecting the hand key points of the hand region through a hand key point detection model to obtain the position information of the hand key points; determining that the human hand area belongs to the fingertip triggering category in response to the fact that the detection precision of the fingertip triggering category and the detection precision of the position information of the human hand key point meet set requirements;
wherein the human hand key points comprise fingertip key points.
3. The method according to claim 2, wherein the outputting, in response to the gesture category being a fingertip trigger category, position information of a fingertip keypoint corresponding to the fingertip trigger category obtained by performing human hand keypoint detection on the human hand region comprises:
responding to the gesture type as a fingertip triggering type, and acquiring position information of the hand key points, which is obtained by performing hand key point detection on the hand region through a hand key point detection model;
and according to the position relation among the key points of the human hand, screening the position information of the fingertip key points corresponding to the fingertip trigger category from the position information of the key points of the human hand.
4. The method according to claim 2, before determining that the human hand region belongs to a fingertip trigger category when the detection accuracy of the position information in response to the fingertip trigger category and the detection accuracy of the human hand key point satisfy a set requirement, further comprising:
calculating comprehensive detection precision according to the confidence coefficient of the fingertip triggering category and the confidence coefficient of the position information of the human hand key point;
and determining that the detection precision of the fingertip trigger category and the detection precision of the position information of the key point of the human hand meet set requirements in response to the fact that the comprehensive detection precision is larger than a set threshold value.
5. The method according to claim 2, before determining that the human hand region belongs to a fingertip trigger category when the detection accuracy of the position information in response to the fingertip trigger category and the detection accuracy of the human hand key point satisfy a set requirement, further comprising:
and determining that the detection precision of the fingertip trigger category and the detection precision of the position information of the human hand key point meet set requirements in response to the fact that the confidence coefficient of the fingertip trigger category and the confidence coefficient of the position information of the human hand key point are both greater than set thresholds.
6. The method of any of claims 2-5, wherein the human hand keypoints further comprise at least one keypoint located on the same target finger as the fingertip keypoint.
7. The method of claim 6, wherein the human hand keypoints further comprise at least one keypoint on a finger other than the target finger required to form a gesture of the fingertip trigger category.
8. The method according to claim 6, further comprising, after outputting, in response to the gesture category being a fingertip trigger category, position information of a fingertip keypoint corresponding to the fingertip trigger category obtained by performing human hand keypoint detection on the human hand region:
determining the pointing direction of the target finger according to the position information of the key point of the human hand;
and determining the pointed position of the fingertip key point according to the pointing direction of the target finger and the position information of the fingertip key point.
9. The method of claim 8, wherein determining the pointing direction of the target finger based on the location information of the hand keypoints comprises:
fitting the key points of the human hand according to the position information of the key points of the human hand to obtain lines;
and determining the pointing direction of the target finger according to the extending direction of the line at one side of the fingertip key point.
10. The method according to any one of claims 1-5, wherein before the performing the gesture detection on the human hand region to obtain the gesture classification of the human hand region, further comprising:
providing a user with options for a plurality of fingertip trigger categories, the fingertip trigger categories including a thumb fingertip trigger category, an index finger fingertip trigger category, a middle finger fingertip trigger category, a ring finger fingertip trigger category, and a little finger fingertip trigger category;
and acquiring a fingertip trigger category option selected by a user, and determining a fingertip trigger category according to the fingertip trigger category option.
11. A fingertip keypoint detection device, comprising:
the acquisition module is used for acquiring a hand area in an image to be detected;
the gesture detection module is used for carrying out gesture detection on the hand area to obtain the gesture category of the hand area;
and the output module is used for responding to the fact that the gesture type is a fingertip trigger type, and outputting position information of fingertip key points corresponding to the fingertip trigger type, which is obtained by detecting the human hand key points in the human hand area.
12. The apparatus of claim 11, wherein the gesture detection module is specifically configured to perform any one of the following:
performing gesture detection on the hand region through a gesture classification model to obtain that the hand region is a background category;
performing gesture detection on the hand region through a gesture classification model to obtain that the hand region is a non-fingertip triggering type;
performing gesture detection on the hand region through a gesture classification model to obtain that the hand region belongs to a fingertip trigger category;
performing gesture detection on the hand region through a gesture classification model to obtain that the hand region belongs to a fingertip trigger category; detecting the hand key points of the hand region through a hand key point detection model to obtain the position information of the hand key points; determining that the human hand area belongs to the fingertip triggering category in response to the fact that the detection precision of the fingertip triggering category and the detection precision of the position information of the human hand key point meet set requirements;
wherein the human hand key points comprise fingertip key points.
13. The apparatus of claim 12, wherein said human hand keypoints further comprise at least one keypoint located on the same target finger as said fingertip keypoint.
14. The apparatus of claim 13, further comprising:
the pointing determination module is used for determining the pointing direction of the target finger according to the position information of the key point of the human hand;
and the position determining module is used for determining the position pointed by the fingertip key point according to the pointing direction of the target finger and the position information of the fingertip key point.
15. An electronic device, comprising:
at least one processor; and
a memory communicatively coupled to the at least one processor; wherein
the memory stores instructions executable by the at least one processor to enable the at least one processor to perform a fingertip keypoint detection method as claimed in any one of claims 1 to 10.
16. A non-transitory computer readable storage medium storing computer instructions for causing a computer to perform a fingertip keypoint detection method according to any one of claims 1 to 10.
CN202010431810.8A 2020-05-20 2020-05-20 Fingertip key point detection method, device, equipment and readable storage medium Active CN111625157B (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
CN202010431810.8A CN111625157B (en) 2020-05-20 2020-05-20 Fingertip key point detection method, device, equipment and readable storage medium

Applications Claiming Priority (1)

Application Number Priority Date Filing Date Title
CN202010431810.8A CN111625157B (en) 2020-05-20 2020-05-20 Fingertip key point detection method, device, equipment and readable storage medium

Publications (2)

Publication Number Publication Date
CN111625157A true CN111625157A (en) 2020-09-04
CN111625157B CN111625157B (en) 2021-09-17

Family

ID=72259960

Family Applications (1)

Application Number Title Priority Date Filing Date
CN202010431810.8A Active CN111625157B (en) 2020-05-20 2020-05-20 Fingertip key point detection method, device, equipment and readable storage medium

Country Status (1)

Country Link
CN (1) CN111625157B (en)

Cited By (5)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN112558810A (en) * 2020-12-11 2021-03-26 北京百度网讯科技有限公司 Method, device, equipment and storage medium for detecting fingertip position
CN112651298A (en) * 2020-11-27 2021-04-13 深圳点猫科技有限公司 Point reading method, device, system and medium based on finger joint positioning
CN113065458A (en) * 2021-03-29 2021-07-02 新疆爱华盈通信息技术有限公司 Voting method and system based on gesture recognition and electronic device
CN113792651A (en) * 2021-09-13 2021-12-14 广州广电运通金融电子股份有限公司 Gesture interaction method, device and medium integrating gesture recognition and fingertip positioning
WO2023077665A1 (en) * 2021-11-05 2023-05-11 深圳市鸿合创新信息技术有限责任公司 Palm position determination method and apparatus, and electronic device and storage medium

Citations (7)

Publication number Priority date Publication date Assignee Title
CN103809794A (en) * 2012-11-13 2014-05-21 联想(北京)有限公司 Information processing method and electronic device
CN108475145A (en) * 2016-01-13 2018-08-31 精工爱普生株式会社 Pattern recognition device, image-recognizing method and image identification unit
CN109446994A (en) * 2018-10-30 2019-03-08 北京达佳互联信息技术有限公司 Gesture critical point detection method, apparatus, electronic equipment and storage medium
CN109657537A (en) * 2018-11-05 2019-04-19 北京达佳互联信息技术有限公司 Image-recognizing method, system and electronic equipment based on target detection
CN110516582A (en) * 2019-08-22 2019-11-29 阿依瓦(北京)技术有限公司 A kind of books reading method and system
CN110597450A (en) * 2019-09-16 2019-12-20 广东小天才科技有限公司 False touch prevention identification method and device, touch reading equipment and touch reading identification method thereof
WO2020095876A1 (en) * 2018-11-05 2020-05-14 京セラドキュメントソリューションズ株式会社 Image forming device and numerical value counting method

Cited By (8)

Publication number Priority date Publication date Assignee Title
CN112651298A (en) * 2020-11-27 2021-04-13 深圳点猫科技有限公司 Point reading method, device, system and medium based on finger joint positioning
CN112558810A (en) * 2020-12-11 2021-03-26 北京百度网讯科技有限公司 Method, device, equipment and storage medium for detecting fingertip position
CN112558810B (en) * 2020-12-11 2023-10-03 北京百度网讯科技有限公司 Method, apparatus, device and storage medium for detecting fingertip position
CN113065458A (en) * 2021-03-29 2021-07-02 新疆爱华盈通信息技术有限公司 Voting method and system based on gesture recognition and electronic device
CN113065458B (en) * 2021-03-29 2024-05-28 芯算一体(深圳)科技有限公司 Voting method and system based on gesture recognition and electronic equipment
CN113792651A (en) * 2021-09-13 2021-12-14 广州广电运通金融电子股份有限公司 Gesture interaction method, device and medium integrating gesture recognition and fingertip positioning
CN113792651B (en) * 2021-09-13 2024-04-05 广州广电运通金融电子股份有限公司 Gesture interaction method, device and medium integrating gesture recognition and fingertip positioning
WO2023077665A1 (en) * 2021-11-05 2023-05-11 深圳市鸿合创新信息技术有限责任公司 Palm position determination method and apparatus, and electronic device and storage medium

Also Published As

Publication number Publication date
CN111625157B (en) 2021-09-17

Similar Documents

Publication Publication Date Title
CN111625157B (en) Fingertip key point detection method, device, equipment and readable storage medium
EP3926526A2 (en) Optical character recognition method and apparatus, electronic device and storage medium
RU2711029C2 (en) Touch classification
KR102460737B1 (en) Method, apparatus, apparatus and computer readable storage medium for public handwriting recognition
CN110659600B (en) Object detection method, device and equipment
US9262012B2 (en) Hover angle
US9218060B2 (en) Virtual mouse driving apparatus and virtual mouse simulation method
CN111738072A (en) Training method and device of target detection model and electronic equipment
EP4307096A1 (en) Key function execution method, apparatus and device, and storage medium
Vivek Veeriah et al. Robust hand gesture recognition algorithm for simple mouse control
CN112036315A (en) Character recognition method, character recognition device, electronic equipment and storage medium
CN110532415B (en) Image search processing method, device, equipment and storage medium
CN116483246A (en) Input control method and device, electronic equipment and storage medium
CN106569716B (en) Single-hand control method and control system
CN106598422B (en) hybrid control method, control system and electronic equipment
Nguyen et al. Hand segmentation and fingertip tracking from depth camera images using deep convolutional neural network and multi-task segnet
CN110727383A (en) Touch interaction method and device based on small program, electronic equipment and storage medium
CN111708477B (en) Key identification method, device, equipment and storage medium
EP2618237B1 (en) Gesture-based human-computer interaction method and system, and computer storage media
CN111638787B (en) Method and device for displaying information
US20220050528A1 (en) Electronic device for simulating a mouse
CN111507944B (en) Determination method and device for skin smoothness and electronic equipment
CN113655906A (en) Folding screen control method and device
Waybhase et al. Towards Controlling Mouse through Hand Gestures: A Novel and Efficient Approach.
CN116301361B (en) Target selection method and device based on intelligent glasses and electronic equipment

Legal Events

Date Code Title Description
PB01 Publication
SE01 Entry into force of request for substantive examination
GR01 Patent grant