CN111443831A - Gesture recognition method and device - Google Patents

Gesture recognition method and device

Info

Publication number
CN111443831A
CN111443831A
Authority
CN
China
Prior art keywords
finger
touch
gesture recognition
minimum distance
operating
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Pending
Application number
CN202010234458.9A
Other languages
Chinese (zh)
Inventor
翟新刚 (Zhai Xingang)
张楠赓 (Zhang Nangeng)
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
Canaan Bright Sight Co Ltd
Original Assignee
Canaan Creative Co Ltd
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by Canaan Creative Co Ltd filed Critical Canaan Creative Co Ltd
Priority to CN202010234458.9A priority Critical patent/CN111443831A/en
Publication of CN111443831A publication Critical patent/CN111443831A/en
Pending legal-status Critical Current

Classifications

    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06FELECTRIC DIGITAL DATA PROCESSING
    • G06F3/00Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
    • G06F3/01Input arrangements or combined input and output arrangements for interaction between user and computer
    • G06F3/03Arrangements for converting the position or the displacement of a member into a coded form
    • G06F3/041Digitisers, e.g. for touch screens or touch pads, characterised by the transducing means
    • G06F3/042Digitisers, e.g. for touch screens or touch pads, characterised by the transducing means by opto-electronic means
    • G06F3/0425Digitisers, e.g. for touch screens or touch pads, characterised by the transducing means by opto-electronic means using a single imaging device like a video camera for tracking the absolute position of a single or a plurality of objects with respect to an imaged reference surface, e.g. video camera imaging a display or a projection screen, a table or a wall surface, on which a computer generated image is displayed or projected
    • G06F3/0426Digitisers, e.g. for touch screens or touch pads, characterised by the transducing means by opto-electronic means using a single imaging device like a video camera for tracking the absolute position of a single or a plurality of objects with respect to an imaged reference surface, e.g. video camera imaging a display or a projection screen, a table or a wall surface, on which a computer generated image is displayed or projected tracking fingers with respect to a virtual keyboard projected or printed on the surface
    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06FELECTRIC DIGITAL DATA PROCESSING
    • G06F3/00Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
    • G06F3/01Input arrangements or combined input and output arrangements for interaction between user and computer
    • G06F3/02Input arrangements using manually operated switches, e.g. using keyboards or dials
    • G06F3/023Arrangements for converting discrete items of information into a coded form, e.g. arrangements for interpreting keyboard generated codes as alphanumeric codes, operand codes or instruction codes

Abstract

A gesture recognition method for recognizing the touch position of a finger on a touch plane comprises the following steps: shooting a gesture image; identifying the joint points of each finger in the gesture image; determining the operating finger that performs the touch operation according to the distance between each finger's joint points and the touch plane; and determining the touch position of the operating finger according to its degree of curvature. The disclosure also provides a gesture recognition device for executing the gesture recognition method. The method and device address problems such as the high hardware cost of virtual keyboards, mutual occlusion of fingers, and low recognition rate.

Description

Gesture recognition method and device
Technical Field
The disclosure relates to the field of gesture recognition, in particular to a gesture recognition method and device.
Background
A keyboard is an important component of human-computer interaction. Currently, the main types of keyboards include physical keyboards, touch screen keyboards, and virtual keyboards. Virtual keyboards are mostly implemented with laser projection technology and have the advantages of small size and convenience.
The user experience of a virtual keyboard still falls well short of a physical keyboard, and compared with a touch screen keyboard, a virtual keyboard requires additional hardware such as a laser projector. As shown in fig. 1, a virtual keyboard based on laser projection technology requires hardware such as a wide-angle camera, an infrared filter, a laser projector, and a bracket, so the hardware cost is high. Such a keyboard determines finger position by recognizing the infrared light spot formed when a fingertip blocks the infrared light; when fingers occlude one another, the occluded finger's action can be missed, reducing the recognition rate. Moreover, when the user types quickly, the virtual keyboard is prone to missed operations and false touches, further reducing input accuracy.
Disclosure of Invention
Technical problem to be solved
The present application provides a gesture recognition method and device, aiming to solve problems of existing virtual keyboards such as high hardware cost and low accuracy.
(II) technical scheme
The disclosure provides a gesture recognition method for recognizing a touch position of a finger on a touch plane, the method including: shooting a gesture image; identifying joint points of each finger in the gesture image; judging an operating finger which executes touch operation according to the distance between the joint point of each finger and the touch plane; and judging the touch position of the operation finger according to the finger curvature of the operation finger.
Optionally, the determining, according to the distance between each finger joint point and the touch plane, an operating finger on which a touch operation is performed includes: acquiring the distance between the joint point of each finger close to the fingertip and the touch plane; and taking the finger corresponding to the minimum distance as an operation finger.
Optionally, the taking the finger corresponding to the minimum distance value as the operation finger includes: judging whether the finger corresponding to the minimum distance value executes touch operation or not; if the finger corresponding to the minimum distance value executes touch operation, taking the finger corresponding to the minimum distance value as an operation finger; and if the finger corresponding to the minimum distance value does not execute the touch operation, stopping executing the gesture recognition operation.
Optionally, the determining whether the finger corresponding to the minimum distance value performs a touch operation includes: comparing the minimum distance value with a preset distance threshold value; if the minimum distance value is smaller than or equal to the distance threshold value, the finger corresponding to the minimum distance value executes touch operation; if the minimum distance value is larger than the distance threshold value, the finger corresponding to the minimum distance value does not execute touch operation.
Optionally, the method further comprises: respectively acquiring the distance between each joint point of the operating finger; and calculating the finger curvature of the operating finger according to the distance between the joint points.
Optionally, the touch plane is a virtual keyboard, and the touch position is an operation position of the virtual keyboard.
Optionally, each finger is associated with a plurality of operation positions of the virtual keyboard, and the operation positions of the virtual keyboard are arranged according to the association relationship with each finger.
Optionally, the determining the touch position of the operating finger according to the degree of curvature of the operating finger includes: acquiring a plurality of operation positions associated with the operation finger according to the distance between the operation finger and the middle finger, wherein the operation positions respectively correspond to one finger bending degree interval; and taking the operation position corresponding to the bending degree section to which the finger bending degree of the operation finger belongs as a touch position.
Optionally, the acquiring a plurality of operation positions associated with the operation finger in the virtual keyboard further includes: acquiring the current operation range of the operation finger; acquiring an operation position belonging to the operation range from the plurality of operation positions.
Another aspect of the present disclosure provides a gesture recognition apparatus, including: the image acquisition module is used for shooting a gesture image; the joint point identification module is used for identifying joint points of all fingers in the gesture image; the finger identification module is used for judging the operating finger which executes the touch operation according to the distance between each finger joint point and the touch plane; and the position identification module is used for judging the touch position of the operating finger according to the finger curvature of the operating finger.
(III) advantageous effects
The method identifies the joint points of the fingers in a gesture image, determines the operating finger from those joint points, calculates the curvature of the operating finger from the joint points, and determines, from that curvature, which operation position on the virtual keyboard the operating finger touches. The method can be implemented with a single camera, so the hardware cost is low. Since camera frame rates are generally above 30 fps, the camera will not miss gestures even when the user types quickly. Because the method determines the operating finger from joint-point position information and judges the operation position from finger curvature, and the joint points lie on the back of the fingers where they cannot be occluded, the problem of gesture input going unrecognized due to occlusion is avoided and the accuracy is high.
Drawings
FIG. 1 is a schematic diagram schematically illustrating a virtual keyboard principle based on laser projection technology;
FIG. 2 is a flow chart schematically illustrating a gesture recognition method provided by an embodiment of the present disclosure;
FIG. 3 is a diagram schematically illustrating a finger joint distribution map according to an embodiment of the present disclosure;
FIG. 4 is a schematic diagram schematically illustrating a virtual keyboard provided by an embodiment of the present disclosure;
fig. 5 schematically illustrates a gesture recognition apparatus provided by an embodiment of the present disclosure.
Detailed Description
Hereinafter, embodiments of the present disclosure will be described with reference to the accompanying drawings. It should be understood that the description is illustrative only and is not intended to limit the scope of the present disclosure. Moreover, in the following description, descriptions of well-known structures and techniques are omitted so as to not unnecessarily obscure the concepts of the present disclosure.
The terminology used herein is for the purpose of describing particular embodiments only and is not intended to limit the disclosure. As used herein, the singular forms "a", "an", and "the" are intended to include the plural forms as well, unless the context clearly indicates otherwise. The terms "comprises", "comprising", and the like, as used herein, specify the presence of stated features, steps, operations, and/or components, but do not preclude the presence or addition of one or more other features, steps, operations, or components.
All terms (including technical and scientific terms) used herein have the same meaning as commonly understood by one of ordinary skill in the art unless otherwise defined. It is noted that the terms used herein should be interpreted as having a meaning that is consistent with the context of this specification and should not be interpreted in an idealized or overly formal sense.
Some block diagrams and/or flow diagrams are shown in the figures. It will be understood that some blocks of the block diagrams and/or flowchart illustrations, or combinations thereof, can be implemented by computer program instructions. These computer program instructions may be provided to a processor of a general purpose computer, special purpose computer, or other programmable data processing apparatus to produce a machine, such that the instructions, which execute via the processor, create means for implementing the functions/acts specified in the block diagrams and/or flowchart block or blocks.
Accordingly, the techniques of this disclosure may be implemented in hardware and/or software (including firmware, microcode, etc.). In addition, the techniques of this disclosure may take the form of a computer program product on a computer-readable medium having instructions stored thereon for use by or in connection with an instruction execution system. In the context of this disclosure, a computer-readable medium may be any medium that can contain, store, communicate, propagate, or transport the instructions. For example, the computer readable medium can include, but is not limited to, an electronic, magnetic, optical, electromagnetic, infrared, or semiconductor system, apparatus, device, or propagation medium. Specific examples of the computer readable medium include: magnetic storage devices, such as magnetic tape or Hard Disk Drives (HDDs); optical storage devices, such as compact disks (CD-ROMs); a memory, such as a Random Access Memory (RAM) or a flash memory; and/or wired/wireless communication links.
Fig. 1 is a schematic diagram schematically illustrating a virtual keyboard principle based on a laser projection technology.
As shown in fig. 1, the hardware of the virtual keyboard based on the laser projection technology includes: a camera 1, a keyboard projector 2, and an infrared line laser head 3.
The working principle of the virtual keyboard based on the laser projection technology is as follows: the keyboard projector 2 projects a keyboard pattern 4 on a plane; the infrared line laser head 3 emits a line laser parallel to the plane, so that if a finger touches the keyboard pattern, a laser spot forms on the finger; the camera 1 captures the laser spot, shoots an image, and transmits it to the processor; the processor identifies the position of the keyboard pattern corresponding to the light spot, thereby determining which position of the keyboard the finger touched.
Fig. 2 schematically shows a flow chart of a gesture recognition method according to an embodiment of the present disclosure.
The gesture recognition method provided by the disclosure is used for recognizing the touch position of a finger on a touch plane.
The touch plane may be a solid plane or a virtual plane, and flat or uneven; it may or may not bear an indication pattern, and it may be visible or invisible. For example, the touch plane may be a physical key pad or a virtual, invisible pad.
In the gesture recognition method provided by the embodiment of the disclosure, by recognizing the touch position of the finger on the touch plane, information associated with the touch position can be obtained.
Specifically, as shown in fig. 2, a gesture recognition method according to an embodiment of the present disclosure includes the following operations in steps S1 to S4.
In step S1, a gesture image is captured.
The gesture recognition method is based on an image processing technology, and obtains gesture images in real time or periodically to ensure complete recognition of gestures.
Devices for shooting gesture images include mobile phones, cameras, and the like.
In step S2, the joint point of each finger in the gesture image is recognized.
In the embodiment of the present disclosure, the joint points of each finger may be the anatomical finger joints, or may be defined at other parts of the finger, such as along a knuckle. The joint points of each finger in the gesture image can be identified by image processing techniques, such as a deep learning model, and the position information of each joint point can be acquired.
Specifically, the finger joint point is set as shown in fig. 3.
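The disclosure does not name a particular model for joint-point detection. As one illustrative possibility only, the open-source MediaPipe Hands model returns 21 landmarks per detected hand, from which the three per-finger joint points of fig. 3 can be selected; a minimal Python sketch:

```python
import cv2
import mediapipe as mp

mp_hands = mp.solutions.hands

def detect_joint_points(image_bgr):
    """Return, per detected hand, a list of (x, y, z) joint-point coordinates
    (normalized image coordinates plus relative depth)."""
    with mp_hands.Hands(static_image_mode=True, max_num_hands=2) as hands:
        # MediaPipe expects RGB input; OpenCV captures BGR frames.
        results = hands.process(cv2.cvtColor(image_bgr, cv2.COLOR_BGR2RGB))
        if not results.multi_hand_landmarks:
            return []
        return [[(lm.x, lm.y, lm.z) for lm in hand.landmark]
                for hand in results.multi_hand_landmarks]
```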
In step S3, the operating finger that has performed the touch operation is determined based on the distance between the joint point of each finger and the touch plane.
The fingertip of the finger performing the touch operation is closest to the touch plane, so the operating finger can be determined according to the distance between each finger's fingertip and the touch plane. The specific steps include steps S31 to S32.
Step S31, the distance between the joint point of each finger near the fingertip and the touch plane is obtained.
From the position information of the joint point near each fingertip, the distance between each finger's near-fingertip joint point and the touch plane can be calculated.
In step S32, the finger corresponding to the minimum distance is set as the operation finger.
However, the finger nearest the touch plane may not actually be performing a touch operation. To avoid false recognition of the gesture, after the finger corresponding to the minimum distance is acquired, it is necessary to determine whether that finger actually performs a touch operation. The specific steps include steps S321 to S323.
S321 determines whether the finger corresponding to the minimum distance performs a touch operation.
And comparing the minimum distance value with a preset distance threshold value.
If the minimum distance value is smaller than or equal to the distance threshold value, the finger corresponding to the minimum distance value executes touch operation.
If the minimum distance value is greater than the distance threshold value, the finger corresponding to the minimum distance value does not execute the touch operation.
S322, if the finger corresponding to the minimum distance performs the touch operation, the finger corresponding to the minimum distance is used as the operating finger.
And S323, if the finger corresponding to the minimum distance value does not execute the touch operation, stopping executing the gesture recognition operation.
If the finger corresponding to the minimum distance value does not perform a touch operation, the current gesture is an invalid gesture that can be ignored; gesture recognition can therefore stop, avoiding invalid recognition of gestures and wasted computing resources.
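The following is a minimal Python sketch of steps S31 to S32 together with the validity check of steps S321 to S323. The function and parameter names, the plane-distance function, and the threshold are illustrative assumptions, since the disclosure fixes no coordinate system:

```python
def find_operating_finger(fingertip_joints, plane_distance, threshold):
    """Sketch of steps S31-S32 with the S321-S323 validity check.

    fingertip_joints: dict mapping a finger name to the position of the
      joint point nearest its fingertip (hypothetical structure).
    plane_distance: function giving a point's distance from the touch plane.
    threshold: the preset distance threshold of step S321.
    """
    distances = {name: plane_distance(pt)
                 for name, pt in fingertip_joints.items()}
    finger, min_dist = min(distances.items(), key=lambda item: item[1])
    if min_dist > threshold:
        return None  # S323: no touch operation; stop recognition for this frame
    return finger    # S322: the finger at minimum distance is the operating finger
```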
In step S4, the touch position of the operating finger is determined based on the degree of curvature of the operating finger.
The method for calculating the degree of finger curvature of the operating finger includes steps S41 to S42.
In step S41, the distances between the respective joint points of the operating finger are acquired.
And step S42, calculating the finger curvature of the operating finger according to the distance between the joint points.
In this embodiment, based on the position information of the joint points of the operating finger, the distance between its fingertip joint point and its middle joint point, and the distance between its middle joint point and its root joint point, may each be calculated, and the ratio of these two distances may be used as the degree of finger curvature. There are various ways to calculate the degree of curvature from the distances between the joint points; the calculation is not limited to the above method.
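The ratio described above can be computed directly from the joint-point positions. A sketch follows; the use of Euclidean distance is an assumption, as the disclosure leaves the distance measure open:

```python
import math

def finger_curvature(tip_joint, middle_joint, root_joint):
    """One curvature measure from the description: the ratio of the distance
    between the fingertip joint point and the middle joint point to the
    distance between the middle joint point and the root joint point."""
    return math.dist(tip_joint, middle_joint) / math.dist(middle_joint, root_joint)
```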
The touch plane is a virtual keyboard, and the touch position is an operation position of the virtual keyboard.
Each finger is associated with a plurality of operation positions of the virtual keyboard, and the operation positions of the virtual keyboard are arranged according to the association relation with each finger.
Specifically, referring to the virtual keyboard in fig. 4, a plurality of operation positions associated with respective fingers are arranged in a line in the virtual keyboard.
The step of determining the touched position of the operating finger based on the degree of curvature of the operating finger includes steps S43 to S44.
Step S43, acquiring a plurality of operation positions associated with the operation finger, wherein the plurality of operation positions respectively correspond to one finger curvature section.
According to a preset rule, each finger is associated with a plurality of operation positions; this association can improve the user's efficiency during operation. After the operating finger is determined, the operation positions associated with it can be obtained from the mapping defined by the preset rule.
When only a few operation positions are associated with the operating finger, they can be acquired directly to determine the position touched by the operating finger.
When many operation positions are associated with the operating finger, the finger-curvature sections corresponding to the operation positions may overlap, which makes it harder to determine the operation position accurately.
In order to improve the recognition accuracy and further narrow the range of the operation position controllable by the operation finger, the method further includes steps S431 to S432.
And step S431, acquiring the current operation range of the operation finger according to the distance between the operation finger and the middle finger.
Taking the joint points of the middle finger as reference points, the current operation range of the operating finger is obtained from the current position information of the operating finger's joint points. For example, suppose the operating finger is the left index finger: when the distance between the fingertip joint points of the left index finger and the left middle finger is large, the current operation range of the left index finger is determined to be the operation positions farther from it; when that distance is small, the current operation range is determined to be the operation positions closer to it.
In step S432, an operation position belonging to the operation range is acquired from the plurality of operation positions.
In step S44, the operation position corresponding to the bending section to which the finger bending of the operation finger belongs is set as the touch position.
In the embodiment of the present disclosure, each acquired operation position corresponds to one finger-curvature section, and the sections corresponding to the operation positions acquired in step S43 do not overlap one another. The curvature section to which the operating finger's curvature belongs is determined, and the operation position corresponding to that section is taken as the touch position.
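A sketch of this interval lookup, using the W/S/X keys associated with the (left) ring finger in Example 1 below; the interval boundaries are placeholder calibration values not given in the disclosure:

```python
# Placeholder curvature sections; the disclosure leaves the concrete
# boundaries to calibration, so these numbers are illustrative only.
CURVATURE_SECTIONS = {
    "ring": [("W", 0.90, 1.30), ("S", 0.60, 0.90), ("X", 0.30, 0.60)],
}

def touch_position(finger, curvature):
    """Step S44: return the key whose (non-overlapping) curvature section
    contains the operating finger's curvature, or None if none does."""
    for key, low, high in CURVATURE_SECTIONS.get(finger, []):
        if low <= curvature < high:
            return key
    return None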
Example 1
In the embodiment of the present disclosure, the touch plane is a virtual keyboard, and the touch position is an operation position of the virtual keyboard, i.e. the touch position is a key of the virtual keyboard. The following describes in detail an execution flow of a gesture recognition method proposed by the present disclosure.
After the gesture image is acquired, it is input into a preset deep learning model. The model processes the gesture image in two main steps: first, it delineates the hand contour in the image, i.e., locates the hand; second, it locates the joint points of the fingers of the hand found in the first step. Identifying the joint points includes obtaining their position information.
From the position information of the joint point near each fingertip, the distance between each fingertip and the touch plane is calculated, and the finger corresponding to the minimum distance is taken as the operating finger.
In this embodiment, taking the left ring finger as the operating finger, the ratio of the distance between its fingertip joint point and middle joint point to the distance between its middle joint point and root joint point is calculated to obtain the current curvature of the left ring finger.
Which specific key the left ring finger touches is then determined from its degree of curvature. According to the preset rule, the keys associated with the left ring finger in the virtual keyboard are "W", "S", and "X". After the curvature of the left ring finger is obtained, it is compared against the preset curvature sections for "W", "S", and "X"; the key whose section contains the curvature is the key pressed by the left ring finger on the virtual keyboard.
Example 2
In the embodiment of the present disclosure, the touch plane is a virtual keyboard, and the touch position is an operation position of the virtual keyboard. The following describes in detail how to narrow the operation range and determine the operation position when many operation positions are associated with the operating finger.
In this embodiment, take the right index finger as the operating finger. According to the preset rule, the right index finger is associated with the keys "Y", "H", "N", "U", "J", and "M" in the virtual keyboard, which can be regarded as two columns: "Y", "H", "N" and "U", "J", "M". The distance between the fingertip joint points of the right index finger and the right middle finger is calculated and compared with a preset distance threshold. If the distance is greater than the threshold, the operation range of the right index finger is the operation positions "Y", "H", and "N"; if it is less than or equal to the threshold, the current operation range is the operation positions "U", "J", and "M". Suppose the right index finger currently touches one of the keys "U", "J", and "M": the operation positions of these keys are obtained, and the position touched by the operating finger is then determined among them according to the curvature of the right index finger.
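A sketch of steps S431 to S432 for this example; the threshold value and the group labels are illustrative assumptions:

```python
import math

# The two groups of keys associated with the right index finger in this
# example, selected between by the finger-to-middle-finger distance.
RIGHT_INDEX_GROUPS = {"far": ["Y", "H", "N"], "near": ["U", "J", "M"]}

def narrow_operation_range(index_tip, middle_tip, threshold):
    """Compare the distance between the fingertip joint points of the right
    index and middle fingers with a preset threshold, keeping only the keys
    in the selected group as candidate operation positions."""
    d = math.dist(index_tip, middle_tip)
    return RIGHT_INDEX_GROUPS["far" if d > threshold else "near"]
```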
The method is easy to implement: the operating finger is identified from joint-point position information, and the operation position is judged from the operating finger's curvature. Since the joint points lie on the back of the fingers, they cannot be occluded, so the problem of gesture input going unrecognized due to occlusion is avoided and the accuracy is high.
Fig. 3 is a diagram schematically illustrating a finger joint distribution diagram according to an embodiment of the present disclosure.
As shown in fig. 3, one joint point is set at each of the fingertip, the middle of the finger, and the finger root, on the back of each finger of the right hand. A joint point in the figure may or may not coincide with an anatomical joint; that is, a joint point may lie on a phalanx or on a joint between phalanges. Taking the right thumb as an example: the thumb has one fewer phalanx than the other fingers, but it can still be assigned the same number of joint points, with one joint point on each of its two phalanges and one on the joint between them. The joint points of the left hand are set in the same way as those of the right hand (not shown in the figure).
FIG. 4 is a schematic diagram schematically illustrating a virtual keyboard provided by an embodiment of the present disclosure.
as shown in fig. 4, the operation positions of the virtual keyboard associated with the fingers are respectively shown in the figure in order from left to right, for example, the left little finger is associated with "Q", "a" and "Z", the left ring finger is associated with "W", "S" and "X", the left middle finger is associated with "E", "D" and "C", the left index finger is associated with "R", "F", "V", "T", "G" and "B", and the left thumb and the right thumb are associated with "Space (Space)". The keys on the virtual keyboard are not limited to those shown in the figures.
Fig. 5 schematically illustrates a gesture recognition apparatus provided by an embodiment of the present disclosure.
A gesture recognition apparatus 500 provided by embodiments of the present disclosure may be a camera-equipped notebook, desktop, tablet, workstation, mobile device, retail point of sale device, smart phone, all-in-one (AiO) computer, gaming device, or any other device suitable for performing the functions described below.
As shown in fig. 5, a gesture recognition apparatus 500 provided in an embodiment of the present disclosure includes: an image acquisition module 510, a joint identification module 520, a finger identification module 530, and a location identification module 540.
And an image acquisition module 510 for capturing gesture images.
And the joint point identification module 520 is used for identifying joint points of all fingers in the gesture image.
The finger recognition module 530 is configured to determine an operation finger performing a touch operation according to a distance between each finger joint point and the touch plane.
And the position identification module 540 is used for judging the touch position of the operating finger according to the finger curvature of the operating finger.
The workflow of the gesture recognition apparatus 500 is described in detail below.
When the image acquisition module 510 captures a gesture image, it transmits the image to the joint point recognition module 520. The joint point recognition module 520 recognizes the hand contour in the gesture image according to a preset learning model, further recognizes the joint points of each finger, and obtains their position information. The finger recognition module 530 calculates the distance between each finger and the touch plane from the position information of the joint points and takes the finger corresponding to the minimum distance as the operating finger. The position recognition module 540 calculates the operating finger's curvature from its fingertip, middle, and root joint points, obtains the operation positions associated with the operating finger according to the preset rule, and takes the operation position matching the curvature as the touch position of the operating finger.
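For illustration, the sketches above can be wired together in the shape of fig. 5. The mapping of the three joint points of fig. 3 onto MediaPipe's MCP/PIP/TIP landmark indices is an assumption, the thumb is omitted, and only the first detected hand is handled, for brevity:

```python
# MediaPipe landmark indices (root MCP, middle PIP, fingertip TIP), chosen
# here as one possible realization of the three joint points of fig. 3.
FINGER_JOINTS = {
    "index": (5, 6, 8), "middle": (9, 10, 12),
    "ring": (13, 14, 16), "pinky": (17, 18, 20),
}

class GestureRecognizer:
    """Schematic wiring of the four modules of fig. 5, reusing the earlier
    sketches (detect_joint_points, find_operating_finger, finger_curvature,
    touch_position)."""

    def __init__(self, plane_distance, threshold):
        self.plane_distance = plane_distance  # distance-to-touch-plane function
        self.threshold = threshold            # preset touch distance threshold

    def recognize(self, frame):
        hands = detect_joint_points(frame)    # modules 510 + 520
        if not hands:
            return None
        hand = hands[0]                       # single-hand case for brevity
        tips = {f: hand[idx[2]] for f, idx in FINGER_JOINTS.items()}
        finger = find_operating_finger(tips, self.plane_distance,
                                       self.threshold)      # module 530
        if finger is None:
            return None
        root, middle, tip = (hand[i] for i in FINGER_JOINTS[finger])
        return touch_position(finger,
                              finger_curvature(tip, middle, root))  # module 540
```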
According to embodiments of the present invention, at least one of the image acquisition module 510, the joint point identification module 520, the finger identification module 530, and the position identification module 540 may be implemented at least partially as a hardware circuit, such as a Field Programmable Gate Array (FPGA), a Programmable Logic Array (PLA), a system on a chip, a system on a substrate, a system in a package, an Application Specific Integrated Circuit (ASIC), or any other reasonable means of integrating or packaging a circuit, or in any suitable combination of software, hardware, and firmware. Alternatively, at least one of these modules may be implemented at least partially as a computer program module which, when executed by a computer, performs the corresponding functions.
The above embodiments are provided to further explain the objects, technical solutions, and advantages of the present invention in detail. It should be understood that the above embodiments are only examples of the present invention and are not intended to limit it; any modifications, equivalents, improvements, and the like made within the spirit and principles of the present invention should be included in its protection scope.

Claims (10)

1. A gesture recognition method is used for recognizing the touch position of a finger on a touch plane, and is characterized by comprising the following steps:
shooting a gesture image;
identifying joint points of each finger in the gesture image;
judging an operating finger which executes touch operation according to the distance between the joint point of each finger and the touch plane;
and judging the touch position of the operation finger according to the finger curvature of the operation finger.
2. The gesture recognition method according to claim 1, wherein the determining of the operation finger on which the touch operation is performed based on the distance between each finger joint point and the touch plane includes:
acquiring the distance between the joint point of each finger close to the fingertip and the touch plane;
and taking the finger corresponding to the minimum distance as an operation finger.
3. The gesture recognition method according to claim 2, wherein the step of using the finger corresponding to the minimum distance as the operation finger includes:
judging whether the finger corresponding to the minimum distance value executes touch operation or not;
if the finger corresponding to the minimum distance value executes touch operation, taking the finger corresponding to the minimum distance value as an operation finger;
and if the finger corresponding to the minimum distance value does not execute the touch operation, stopping executing the gesture recognition operation.
4. The gesture recognition method according to claim 3, wherein the determining whether the finger corresponding to the minimum distance value performs a touch operation comprises:
comparing the minimum distance value with a preset distance threshold value;
if the minimum distance value is smaller than or equal to the distance threshold value, the finger corresponding to the minimum distance value executes touch operation;
if the minimum distance value is larger than the distance threshold value, the finger corresponding to the minimum distance value does not execute touch operation.
5. The gesture recognition method according to claim 2, further comprising:
respectively acquiring the distance between each joint point of the operating finger;
and calculating the finger curvature of the operating finger according to the distance between the joint points.
6. The gesture recognition method of claim 1, wherein the touch plane is a virtual keyboard and the touch location is an operation location of the virtual keyboard.
7. The gesture recognition method according to claim 6, wherein the respective fingers are associated with a plurality of operation positions of the virtual keyboard, respectively, the operation positions of the virtual keyboard being arranged in association with the respective fingers.
8. The gesture recognition method according to claim 7, wherein the determining of the touched position of the operation finger based on the degree of finger curvature of the operation finger includes:
acquiring a plurality of operation positions associated with the operation finger, wherein the operation positions respectively correspond to one finger bending degree interval;
and taking the operation position corresponding to the bending degree section to which the finger bending degree of the operation finger belongs as a touch position.
9. The gesture recognition method according to claim 8, wherein the obtaining a plurality of operation positions associated with the operation finger in the virtual keyboard further comprises:
acquiring the current operating range of the operating finger according to the distance between the operating finger and the middle finger;
acquiring an operation position belonging to the operation range from the plurality of operation positions.
10. A gesture recognition apparatus, comprising:
the image acquisition module is used for shooting a gesture image;
the joint point identification module is used for identifying joint points of all fingers in the gesture image;
the finger identification module is used for judging the operating finger which executes the touch operation according to the distance between each finger joint point and the touch plane;
and the position identification module is used for judging the touch position of the operating finger according to the finger curvature of the operating finger.
CN202010234458.9A 2020-03-30 2020-03-30 Gesture recognition method and device Pending CN111443831A (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
CN202010234458.9A CN111443831A (en) 2020-03-30 2020-03-30 Gesture recognition method and device

Applications Claiming Priority (1)

Application Number Priority Date Filing Date Title
CN202010234458.9A CN111443831A (en) 2020-03-30 2020-03-30 Gesture recognition method and device

Publications (1)

Publication Number Publication Date
CN111443831A (en) 2020-07-24

Family

ID=71649342

Family Applications (1)

Application Number Title Priority Date Filing Date
CN202010234458.9A Pending CN111443831A (en) 2020-03-30 2020-03-30 Gesture recognition method and device

Country Status (1)

Country Link
CN (1) CN111443831A (en)



Patent Citations (15)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
TW200945174A (en) * 2008-04-14 2009-11-01 Pointgrab Ltd Vision based pointing device emulation
US20130328769A1 (en) * 2011-02-23 2013-12-12 Lg Innotek Co., Ltd. Apparatus and method for inputting command using gesture
JP2014165660A (en) * 2013-02-25 2014-09-08 Univ Of Tsukuba Method of input with virtual keyboard, program, storage medium, and virtual keyboard system
CN103197767A (en) * 2013-04-10 2013-07-10 周可 Method and device for virtual keyboard input by aid of hand signs
US20150235447A1 (en) * 2013-07-12 2015-08-20 Magic Leap, Inc. Method and system for generating map data from an image
CN105068662A (en) * 2015-09-07 2015-11-18 哈尔滨市一舍科技有限公司 Electronic device used for man-machine interaction
CN105653029A (en) * 2015-12-25 2016-06-08 乐视致新电子科技(天津)有限公司 Method and system for obtaining immersion feel in virtual reality system as well as intelligent glove
WO2017113794A1 (en) * 2015-12-31 2017-07-06 北京体基科技有限公司 Gesture recognition method, control method and apparatus, and wrist-type device
JP2017219942A (en) * 2016-06-06 2017-12-14 株式会社リコー Contact detection device, projector device, electronic blackboard system, digital signage device, projector device, contact detection method, program and recording medium
CN107918507A (en) * 2016-10-10 2018-04-17 广东技术师范学院 A kind of virtual touchpad method based on stereoscopic vision
CN106845335A (en) * 2016-11-29 2017-06-13 歌尔科技有限公司 Gesture identification method, device and virtual reality device for virtual reality device
WO2018098861A1 (en) * 2016-11-29 2018-06-07 歌尔科技有限公司 Gesture recognition method and device for virtual reality apparatus, and virtual reality apparatus
WO2019004686A1 (en) * 2017-06-26 2019-01-03 서울대학교산학협력단 Keyboard input system and keyboard input method using finger gesture recognition
CN109448707A (en) * 2018-12-18 2019-03-08 北京嘉楠捷思信息技术有限公司 Voice recognition method and device, equipment and medium
CN110478860A (en) * 2019-09-02 2019-11-22 燕山大学 The virtual rehabilitation system of hand function obstacle based on hand object natural interaction

Non-Patent Citations (1)

* Cited by examiner, † Cited by third party
Title
ZHAO Lu; LIU Yue; ZHUO Luo: "Research Progress in Tactile Reproduction Technology (触觉再现技术研究进展)", no. 11 *

Cited By (4)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN112839172A (en) * 2020-12-31 2021-05-25 深圳瞬玩科技有限公司 Shooting subject identification method and system based on hand identification
CN113010014A (en) * 2021-03-18 2021-06-22 深圳市科服信息技术有限公司 Virtual keyboard system and method based on intelligent control and computer readable storage medium
CN113253908A (en) * 2021-06-22 2021-08-13 腾讯科技(深圳)有限公司 Key function execution method, device, equipment and storage medium
CN114596582A (en) * 2022-02-28 2022-06-07 北京伊园未来科技有限公司 Augmented reality interaction method and system with vision and force feedback

Similar Documents

Publication Publication Date Title
CN111443831A (en) Gesture recognition method and device
JP5991041B2 (en) Virtual touch screen system and bidirectional mode automatic switching method
KR101872426B1 (en) Depth-based user interface gesture control
US8525876B2 (en) Real-time embedded vision-based human hand detection
US11386717B2 (en) Fingerprint inputting method and related device
EP3526959B1 (en) Method of acquiring biometric data and electronic device therefor
US9857971B2 (en) System and method for receiving user input and program storage medium thereof
KR20150002776A (en) Rapid gesture re-engagement
US20200125176A1 (en) Method of Virtual User Interface Interaction Based on Gesture Recognition and Related Device
US9262012B2 (en) Hover angle
US9898809B2 (en) Systems, methods and techniques for inputting text into mobile devices using a camera-based keyboard
CN104166509A (en) Non-contact screen interaction method and system
CN112068698A (en) Interaction method and device, electronic equipment and computer storage medium
CN108027648A (en) The gesture input method and wearable device of a kind of wearable device
US11886643B2 (en) Information processing apparatus and information processing method
CN106569716B (en) Single-hand control method and control system
US20160034027A1 (en) Optical tracking of a user-guided object for mobile platform user input
WO2019037257A1 (en) Password input control device and method, and computer readable storage medium
WO2016197815A2 (en) Method and apparatus for using fingerprint operation, and terminal
JP2013077180A (en) Recognition device and method for controlling the same
CN110291495B (en) Information processing system, information processing method, and program
JP2014130449A (en) Information processor and control method therefor
US11782594B2 (en) Display apparatus, display system, and display method
KR20190049349A (en) Method for recognizing user's touch on projection image and apparatus for performing the method
JP2013200654A (en) Display control device, display control method, information display system, and program

Legal Events

Date Code Title Description
PB01 Publication
SE01 Entry into force of request for substantive examination
TA01 Transfer of patent application right

Effective date of registration: 20201120

Address after: Room 206, 2 / F, building C, phase I, Zhongguancun Software Park, No. 8, Dongbei Wangxi Road, Haidian District, Beijing 100094

Applicant after: Canaan Bright Sight Co.,Ltd.

Address before: 100094, No. 3, building 23, building 8, northeast Wang Xi Road, Beijing, Haidian District, 307

Applicant before: Canaan Creative Co.,Ltd.
