CN109829368B - Palm feature recognition method and device, computer equipment and storage medium - Google Patents
Palm feature recognition method and device, computer equipment and storage medium
- Publication number
- CN109829368B CN109829368B CN201811583897.XA CN201811583897A CN109829368B CN 109829368 B CN109829368 B CN 109829368B CN 201811583897 A CN201811583897 A CN 201811583897A CN 109829368 B CN109829368 B CN 109829368B
- Authority
- CN
- China
- Prior art keywords
- palm
- key point
- finger
- coordinates
- judging
- Prior art date
- Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
- Active
Links
- 238000000034 method Methods 0.000 title claims abstract description 51
- 210000003811 finger Anatomy 0.000 claims abstract description 85
- 244000060701 Kaempferia pandurata Species 0.000 claims abstract description 12
- 235000016390 Uvaria chamae Nutrition 0.000 claims abstract description 12
- 210000004932 little finger Anatomy 0.000 claims abstract description 11
- 238000012544 monitoring process Methods 0.000 claims abstract description 11
- 230000015654 memory Effects 0.000 claims description 28
- 238000004590 computer program Methods 0.000 claims description 4
- 210000004247 hand Anatomy 0.000 abstract description 4
- 238000001514 detection method Methods 0.000 description 12
- 238000010586 diagram Methods 0.000 description 10
- 238000012549 training Methods 0.000 description 8
- 230000006870 function Effects 0.000 description 5
- 238000013461 design Methods 0.000 description 4
- 230000003287 optical effect Effects 0.000 description 4
- 230000005291 magnetic effect Effects 0.000 description 3
- 230000036632 reaction speed Effects 0.000 description 3
- 238000012937 correction Methods 0.000 description 2
- 238000005516 engineering process Methods 0.000 description 2
- 230000036541 health Effects 0.000 description 2
- 230000011218 segmentation Effects 0.000 description 2
- 238000011895 specific detection Methods 0.000 description 2
- 230000001131 transforming effect Effects 0.000 description 2
- 230000010365 information processing Effects 0.000 description 1
- 239000004973 liquid crystal related substance Substances 0.000 description 1
- 238000012986 modification Methods 0.000 description 1
- 230000004048 modification Effects 0.000 description 1
- 238000003825 pressing Methods 0.000 description 1
- 230000008569 process Effects 0.000 description 1
- 238000012545 processing Methods 0.000 description 1
- 230000037303 wrinkles Effects 0.000 description 1
- 210000000707 wrist Anatomy 0.000 description 1
Landscapes
- Collating Specific Patterns (AREA)
Abstract
The invention provides a palm feature identification method, which comprises the following steps: monitoring a triggering event of palm feature recognition, and acquiring a palm image when the triggering event is detected; extracting coordinates of a first key point at a finger tip and a second key point at a finger root in the palm image in a pre-constructed plane coordinate system, and judging the finger direction of the palm in the palm image according to the coordinates of the first key point and the second key point; and extracting the coordinates of a third key point at the root of the little finger, and judging the palm features of the palm image according to the coordinates of the first key point, the coordinates of the third key point and the finger direction. The palm feature identification method and device can identify the left and right attributes of the palm, solve the prior-art problem of distinguishing the left and right hands of the same person, and help identify palm print information more accurately.
Description
Technical Field
The present invention relates to the field of information processing technologies, and in particular, to a palm feature recognition method, apparatus, computer device, and storage medium.
Background
Palm print recognition is a relatively new biometric technology proposed in recent years. A palmprint is the image of the palm from the base of the fingers to the wrist. It contains many features that can be used for identification, such as principal lines, wrinkles, fine textures, ridge endings and bifurcation points. Palmprint recognition is also a non-invasive recognition method that is easily accepted by users and places low demands on acquisition equipment.
The most important features of a palmprint are its line (ridge) features, and the clearest of these remain essentially unchanged throughout a person's lifetime. Palmprint features include line features, point features and texture features, as well as geometric features such as the width, length and shape of the palm and the distribution of its different regions. A palm print contains far more information than a fingerprint, and a person's identity can be fully determined from the line, point, texture and geometric characteristics of the palm print. The application range of palm print recognition is therefore bound to grow, and improving the precision of palm print recognition algorithms has become a problem worth studying.
At present, palm print recognition algorithms on the market have difficulty accurately judging the direction of the hand and its left or right attribute, which makes it hard to recognise the left and right hands of the same person independently in palm print recognition.
Existing technical schemes therefore cannot identify this specific attribute of the palm, so the precision of palm print identification is not high enough and the recognition algorithms are complex and unreliable.
Disclosure of Invention
The invention provides a palm feature identification method and a corresponding device, which identify the left and right attributes of a palm, solve the prior-art problem of distinguishing the left and right hands of the same person, and help identify palm print information more accurately.
The invention also provides a computer device and a readable storage medium for executing the palm feature identification method of the invention.
In order to solve the above problems, the invention adopts the following technical solutions:
in a first aspect, the present invention provides a method for identifying palm features, the method comprising:
monitoring a triggering event of palm feature recognition, and acquiring a palm image when the triggering event is detected;
extracting coordinates of a first key point at a finger tip and a second key point at a finger root in the palm image in a pre-constructed plane coordinate system, and judging the finger direction of the palm in the palm image according to the coordinates of the first key point and the second key point;
and extracting the coordinates of a third key point at the root of the little finger, and judging the palm characteristics of the palm image according to the coordinates of the first key point, the coordinates of the third key point and the finger direction.
Specifically, the judging of the finger direction of the palm in the palm image according to the coordinates of the first key point and the second key point includes:
reading the coordinates (X1, Y1) of the first keypoint and the coordinates (X2, Y2) of the second keypoint;
if Y1 is greater than Y2, judging that the finger direction of the palm in the palm image is the positive direction of the Y axis;
and if Y1 is less than Y2, judging that the finger direction of the palm in the palm image is the negative direction of the Y axis.
Specifically, the extracting of the coordinates of a third key point at the root of the little finger and the judging of the palm features of the palm image according to the coordinates of the first key point, the coordinates of the third key point and the finger direction include:
reading coordinates (X3, Y3) of the third keypoint;
and when the finger direction is the positive direction of the Y axis and the abscissa X1 of the first key point is smaller than the abscissa X3 of the third key point, or when the finger direction is the negative direction of the Y axis and the abscissa X1 of the first key point is larger than the abscissa X3 of the third key point, judging that the palm feature is a left hand, and otherwise judging that the palm feature is a right hand.
Specifically, the method further comprises the following steps:
and extracting coordinates of a fourth key point at the root of the index finger, judging the inclination angle of the palm in the palm image according to the reference connecting line of the fourth key point and the third key point, and rotating the palm image according to the inclination angle so as to adjust the reference connecting line to a horizontal position.
Specifically, the method further comprises the following steps:
taking the midpoint E of the reference connecting line as a starting point to make a perpendicular bisector of the reference connecting line;
and taking a point C (X, Y) on the perpendicular bisector of the reference connecting line, on the side of the palm center area, so that the length L of CE is equal to half of the reference connecting line, and taking the point C as the palm center point.
Preferably, the method further comprises:
for the rotated palm image, C (X, Y) is taken as a midpoint, and points p1 (X-L, Y-L), p2 (X+L, Y-L), p3 (X-L, Y+L) and p4 (X+L, Y+L) are taken as vertexes to determine a palm center region;
and extracting palm print characteristic points of the palm center area to identify the palm print characteristic points.
Specifically, when the triggering event of palm recognition is a sliding pressing operation acting on a terminal screen, after the palm feature of the palm image is determined according to the coordinates of the first key point, the coordinates of the third key point, and the finger pointing direction, the method further includes:
and adjusting the virtual keys on the terminal screen to one side of the screen corresponding to the left and right attributes of the palm according to the left and right attributes of the identified palm.
In a second aspect, the present invention provides a palm left-right attribute identifying device, the device comprising:
the monitoring module is used for monitoring a triggering event of palm feature recognition, and acquiring a palm image when the triggering event is detected;
the first judging module is used for extracting coordinates of a first key point at a finger tip and a second key point at a finger root in the palm image in a pre-constructed plane coordinate system, and judging the finger direction of the palm in the palm image according to the coordinates of the first key point and the second key point;
and the second judging module is used for extracting the coordinates of a third key point at the root of the little finger and judging the palm characteristics of the palm image according to the coordinates of the first key point, the coordinates of the third key point and the finger direction.
In a third aspect, the present invention provides a computer-readable storage medium having stored thereon a computer program which, when executed by a processor, implements the steps of the method of identifying palm features of any of the first aspects.
In a fourth aspect, the present invention provides a computer device comprising a memory and a processor, the memory having stored therein computer readable instructions which, when executed by the processor, cause the processor to perform the steps of the method of identifying palm features as claimed in any of the first aspects.
Compared with the prior art, the technical scheme of the invention has at least the following advantages:
the invention provides a palm feature recognition method, which comprises the steps of monitoring a triggering event of palm feature recognition, and acquiring a palm image when the triggering event is detected; extracting coordinates of a first key point at a finger tip and a second key point at a finger root in the palm image from a pre-constructed plane coordinate system, and judging finger orientations of the palm in the palm image according to the coordinates of the first key point and the second key point; and extracting the coordinates of a third key point at the root of the little finger, and judging the palm characteristics of the palm image according to the coordinates of the first key point, the coordinates of the third key point and the finger direction. The palm feature identification method and device can realize the palm feature of the left and right attributes of the palm, solve the problem of identification of the attributes of the left and right hands of the same person in the prior art, and can be helpful for accurately identifying palm print information.
2. According to the left and right attributes of the identified palm, the virtual keys on the terminal screen can be adjusted to the side of the screen corresponding to those attributes. For example, when the method is applied to a game-related APP, the position of the virtual keys can be adjusted in time according to the left or right attribute of the palm or finger contacting the mobile terminal. When the user operates with the left hand, the corresponding virtual keys are adjusted to the left area of the mobile terminal screen; when the user operates with the right hand, the corresponding virtual keys are adjusted to the right area of the screen, thereby improving the user's reaction and operation speed and improving the user experience.
Drawings
FIG. 1 is a flow chart of a method of identifying palm features in one embodiment;
FIG. 2 is a schematic diagram of a planar rectangular coordinate system constructed according to the palm image in one embodiment;
FIG. 3 is a schematic diagram of adjusting the horizontal position of the palm in one embodiment;
FIG. 4 is a block diagram of a palm feature recognition device according to another embodiment;
FIG. 5 is a block diagram of the internal architecture of a computer device in one embodiment.
The achievement of the objects, functional features and advantages of the present invention will be further described with reference to the accompanying drawings, in conjunction with the embodiments.
Detailed Description
In order to enable those skilled in the art to better understand the present invention, the following description will make clear and complete descriptions of the technical solutions according to the embodiments of the present invention with reference to the accompanying drawings.
In some of the flows described in the specification and claims of the present invention and in the foregoing figures, a plurality of operations appear in a particular order, but it should be clearly understood that these operations may be performed in a different order from that in which they appear herein, or in parallel. Sequence numbers of operations such as S11 and S12 are merely used to distinguish the various operations and do not themselves represent any order of execution. In addition, the flows may include more or fewer operations, and these operations may be performed sequentially or in parallel. It should be noted that the descriptions of "first" and "second" herein are used to distinguish different messages, devices, modules, etc.; they do not represent a sequence, nor do they require that the "first" and the "second" be of different types.
As used herein, the singular forms "a", "an" and "the" are intended to include the plural forms as well, unless expressly stated otherwise, as understood by one of ordinary skill in the art. It will be further understood that the terms "comprises" and/or "comprising," when used in this specification, specify the presence of stated features, integers, steps, operations, elements, and/or components, but do not preclude the presence or addition of one or more other features, integers, steps, operations, elements, components, and/or groups thereof. It will be understood that when an element is referred to as being "connected" or "coupled" to another element, it can be directly connected or coupled to the other element, or intervening elements may also be present. Further, "connected" or "coupled" as used herein may include wirelessly connected or wirelessly coupled. The term "and/or" as used herein includes all or any element and all combinations of one or more of the associated listed items.
It will be understood by those of ordinary skill in the art that unless otherwise defined, all terms used herein (including technical and scientific terms) have the same meaning as commonly understood by one of ordinary skill in the art to which this invention belongs. It will be further understood that terms, such as those defined in commonly used dictionaries, should be interpreted as having a meaning that is consistent with their meaning in the context of the prior art and will not be interpreted in an idealized or overly formal sense unless expressly so defined herein.
The technical solutions in the embodiments of the present invention will be clearly and completely described below with reference to the drawings in the embodiments of the present invention, wherein the same or similar reference numerals denote the same or similar elements or elements having the same or similar functions throughout. It will be apparent that the described embodiments are only some, but not all, embodiments of the invention. All other embodiments, which can be made by those skilled in the art based on the embodiments of the invention without making any inventive effort, are intended to fall within the scope of the invention.
Referring to fig. 1, an embodiment of the present invention provides a palm feature recognition method, as shown in fig. 1, including the following steps:
s11, monitoring a triggering event of palm feature recognition, and acquiring a palm image when the triggering event is detected.
In the embodiment of the invention, the triggering event of palm recognition includes, but is not limited to, an event that a user clicks a key of palm print recognition, a gesture operation for palm print recognition, and the like.
After the triggering event is detected, the camera is started to acquire an image to be identified, and a palm image is detected in the image to be identified through the SSD (Single Shot MultiBox Detector) target detection algorithm. SSD is a single-shot detector that uses a VGG16 network (modified as described below) as its feature extractor; custom convolution layers are added after the base network, and predictions are made by convolutions over the resulting feature maps.
In one possible design, the target detection algorithm includes the following steps:
firstly, a picture (200 x 200) is input into a pre-trained classification network to obtain feature maps of different sizes, the network being a traditional VGG16 network modified as follows:
wherein the FC6 and FC7 layers of VGG16 are converted into convolution layers;
all Dropout layers and the FC8 layer are removed;
the Atrous algorithm (hole/dilated convolution) is added;
Pool5 is transformed from 2x2-S2 to 3x3-S1;
secondly, extracting the feature maps of the Conv4_3, Conv7, Conv8_2, Conv9_2, Conv10_2 and Conv11_2 layers, constructing 6 bounding boxes (BBs) of different scales at each point of these feature map layers, and then detecting and classifying each to generate a plurality of BBs;
thirdly, the BBs obtained from the different feature maps are combined, and overlapping or incorrect BBs are suppressed by NMS (non-maximum suppression) to generate the final BB set, namely the detection result. The target detection algorithm applied by the invention thus performs detection with a single convolutional pass over multi-scale feature maps, as sketched below.
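As an illustration of this single-shot detection pass, the following minimal sketch uses torchvision's off-the-shelf SSD implementation rather than the exact modified-VGG16 design described above; the weight file name `palm_ssd.pth` and the use of class index 1 for "palm" are assumptions made for the example.

```python
# Minimal sketch of an SSD-style palm detection pass (PyTorch / torchvision).
# Assumption: an SSD model fine-tuned to detect palms has been saved to
# "palm_ssd.pth", with label 1 meaning "palm" (label 0 is background).
import torch
import torchvision
from torchvision.transforms import functional as F
from PIL import Image

def detect_palm(image_path, score_threshold=0.5):
    model = torchvision.models.detection.ssd300_vgg16(num_classes=2)
    model.load_state_dict(torch.load("palm_ssd.pth", map_location="cpu"))
    model.eval()

    image = Image.open(image_path).convert("RGB")
    tensor = F.to_tensor(image)                       # HWC uint8 -> CHW float in [0, 1]

    with torch.no_grad():
        output = model([tensor])[0]                   # dict with boxes, labels, scores

    # NMS has already been applied inside the model; keep the best palm box.
    for box, label, score in zip(output["boxes"], output["labels"], output["scores"]):
        if label.item() == 1 and score.item() >= score_threshold:
            return [int(v) for v in box.tolist()]     # [x1, y1, x2, y2] in pixels
    return None                                       # no palm detected
```

The returned box can then be used to crop the palm image from which the key points of steps S12 and S13 are extracted.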
S12, extracting coordinates of a first key point at a finger tip and a second key point at a finger root in the palm image in a pre-constructed plane coordinate system, and judging the finger direction of the palm in the palm image according to the coordinates of the first key point and the second key point.
Referring to fig. 2, fig. 2 is a schematic diagram of a plane rectangular coordinate system constructed according to the palm image. In the embodiment of the invention, a plane rectangular coordinate system needs to be constructed. Specifically, a point in the palm image is taken as an origin to construct a plane rectangular coordinate system, wherein the origin can be selected according to actual needs, and the method is not specifically limited herein.
In the embodiment of the present invention, after the palm image is acquired, the first key point (T in fig. 2) and the second key point (M in fig. 2) are extracted by a key point detection algorithm. The specific detection mode is as follows: a Cascade CNN model is trained, the training data being pictures annotated with the positions of fingertips and finger roots; after training, newly input hand images are detected using the model. The invention judges the finger direction of the palm in the palm image by comparing the coordinate values of the first key point and the second key point.
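As an illustration of the key point detection step, the following is a much-simplified, single-stage stand-in for the Cascade CNN described above: a small network that regresses the (x, y) positions of the key points from the cropped palm image. The layer sizes and the number of key points are assumptions made for the sketch; the real model would be trained, as described, on images annotated with fingertip and finger-root positions.

```python
# Simplified single-stage key point regressor (the patent uses a Cascade CNN);
# it maps a cropped palm image to num_keypoints (x, y) coordinate pairs.
import torch
from torch import nn

class PalmKeypointNet(nn.Module):
    def __init__(self, num_keypoints=4):               # e.g. T (tip), M (root), A, B
        super().__init__()
        self.features = nn.Sequential(
            nn.Conv2d(3, 16, 3, padding=1), nn.ReLU(), nn.MaxPool2d(2),
            nn.Conv2d(16, 32, 3, padding=1), nn.ReLU(), nn.MaxPool2d(2),
            nn.Conv2d(32, 64, 3, padding=1), nn.ReLU(), nn.AdaptiveAvgPool2d(1),
        )
        self.head = nn.Linear(64, num_keypoints * 2)    # two values (x, y) per key point

    def forward(self, x):                               # x: (N, 3, H, W), values in [0, 1]
        feats = self.features(x).flatten(1)             # (N, 64)
        coords = self.head(feats)                       # (N, num_keypoints * 2)
        return coords.view(x.size(0), -1, 2)            # (N, num_keypoints, 2)
```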
Specifically, continuing to refer to fig. 2, in the rectangular coordinate system shown in fig. 2, the coordinates (X1, Y1) of the first key point T and the coordinates (X2, Y2) of the second key point M are read. If Y1 > Y2, the finger direction of the palm in the palm image is judged to be the positive direction of the Y axis and the is_up flag is set to True; if Y1 < Y2, the finger direction is judged to be the negative direction of the Y axis and the is_up flag is set to False.
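A minimal sketch of this comparison, assuming the key point coordinates are expressed in the plane coordinate system of fig. 2:

```python
# Step S12 as a function: compare the Y coordinates of the finger tip (T) and
# the finger root (M) to obtain the is_up flag described above.
def finger_is_up(first_keypoint, second_keypoint):
    x1, y1 = first_keypoint     # T: first key point, at the finger tip
    x2, y2 = second_keypoint    # M: second key point, at the finger root
    return y1 > y2              # True -> fingers point to +Y, False -> -Y
```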
S13, extracting coordinates of a third key point at the root of the little finger, and judging the palm features of the palm image according to the coordinates of the first key point, the coordinates of the third key point and the finger direction.
In the embodiment of the invention, the palm features are left and right attributes of the palm. The left and right attributes of the palm can be judged according to the coordinates of the first key point, the coordinates of the third key point and the finger pointing direction.
With continued reference to fig. 2, the coordinates (X3, Y3) of the third key point are read. When the finger direction is the positive direction of the Y axis and the abscissa X1 of the first key point is smaller than the abscissa X3 of the third key point, or when the finger direction is the negative direction of the Y axis and the abscissa X1 of the first key point is larger than the abscissa X3 of the third key point, the palm feature is judged to be a left hand; otherwise it is judged to be a right hand.
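A sketch of this decision rule, reusing the is_up flag from the previous step; the key point coordinates are assumed to come from the same plane coordinate system:

```python
# Step S13 as a function: combine the finger direction with the X coordinates
# of the finger tip (X1) and the little-finger root (X3) to judge handedness.
def palm_is_left_hand(first_keypoint, third_keypoint, is_up):
    x1, _ = first_keypoint      # first key point, at the finger tip
    x3, _ = third_keypoint      # third key point, at the root of the little finger
    if is_up:
        return x1 < x3          # fingers point to +Y: left hand when X1 < X3
    return x1 > x3              # fingers point to -Y: the comparison is mirrored
```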
In one possible design, the present invention may also identify the left and right attributes of the finger, for example, by the profile information of each of the left and right hand fingers.
The recognised left and right attributes of the palm or finger can be applied to intelligent adjustment of virtual key positions. Specifically, according to the left and right attributes of the identified palm, the virtual keys on the terminal screen can be adjusted to the side of the screen corresponding to those attributes. For example, in one possible application scenario, the palm left/right attribute identification method of the invention is applied to game software: the palm print of the palm or the fingerprint of the finger of the user is monitored in real time, the left or right attribute is determined from it, and the virtual keys are intelligently adjusted to the corresponding area of the mobile terminal screen. For instance, when the current user touches the screen with a left-hand finger, the corresponding virtual keys are adjusted to the left area of the screen, which facilitates operation and avoids the user having to change fingers frequently to operate the mobile terminal.
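Purely as an illustration of how the detected attribute could drive key placement, the Screen object and its move_button() call below are hypothetical placeholders for whatever UI framework the terminal actually uses:

```python
# Hypothetical sketch: relocate virtual keys to the side matching the detected hand.
def place_virtual_keys(screen, button_ids, is_left_hand):
    side = "left" if is_left_hand else "right"
    for button_id in button_ids:
        screen.move_button(button_id, side)   # screen / move_button are placeholder names
```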
In the embodiment of the invention, after the palm features are judged, the palm can further be adjusted to a horizontal position. Specifically, the coordinates of a fourth key point at the root of the index finger are extracted, the inclination angle of the palm in the palm image is judged according to the reference connecting line between the fourth key point and the third key point, and the palm image is rotated according to the inclination angle so as to adjust the reference connecting line to a horizontal position.
Referring to fig. 3, fig. 3 is a schematic diagram illustrating adjustment of the palm to the horizontal position. As shown in fig. 3, the fourth key point is point A and the third key point is point B. In the embodiment of the invention, the inclination angle of the palm is judged from the connecting line AB between the fourth and third key points, the position of the palm is corrected according to this angle, and the line AB is rotated to the horizontal position, thereby correcting the palm image. The purpose of this correction is to align the palm in the palm image so as to facilitate subsequent palm print recognition.
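A minimal sketch of this tilt correction, assuming OpenCV is available and the key points A and B are given as pixel coordinates in the palm image:

```python
# Rotate the palm image so that the reference line AB (index-finger root A to
# little-finger root B) becomes horizontal.
import math
import cv2

def level_palm(image, fourth_keypoint, third_keypoint):
    ax, ay = fourth_keypoint                              # A: index-finger root
    bx, by = third_keypoint                               # B: little-finger root
    angle = math.degrees(math.atan2(by - ay, bx - ax))    # inclination of AB

    h, w = image.shape[:2]
    rotation = cv2.getRotationMatrix2D((w / 2, h / 2), angle, 1.0)
    return cv2.warpAffine(image, rotation, (w, h))        # AB is horizontal afterwards
```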
With continued reference to fig. 3, the invention may further perform palm segmentation on the adjusted palm image to extract the palm center region. Specifically, the perpendicular bisector of the reference connecting line is drawn from its midpoint E; a point C (X, Y) is taken on this perpendicular bisector, on the side of the palm center area, such that the length L of CE equals half of the reference line, and C is taken as the palm center point. For the rotated palm image, a palm center region is then determined with C (X, Y) as its center and the points p1 (X-L, Y-L), p2 (X+L, Y-L), p3 (X-L, Y+L) and p4 (X+L, Y+L) as its vertices, and palm print feature points are extracted from this region for recognition. Dividing out the palm center area facilitates subsequent palm print recognition.
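A sketch of this segmentation on the already-rotated image; it assumes the key points A and B have themselves been transformed into (or re-detected in) the rotated image, so the reference line AB is horizontal, and the palm_below_line flag encodes which side of AB the palm lies on (it follows from the finger direction found earlier):

```python
# Determine the palm center point C and crop the square palm center region
# with vertices p1..p4 = (Cx ± L, Cy ± L), as described above.
def extract_palm_center(image, fourth_keypoint, third_keypoint, palm_below_line=True):
    ax, ay = fourth_keypoint                    # A: index-finger root (rotated image)
    bx, by = third_keypoint                     # B: little-finger root (rotated image)
    ex, ey = (ax + bx) / 2.0, (ay + by) / 2.0   # E: midpoint of the reference line AB
    half = abs(bx - ax) / 2.0                   # L: half of the reference line length

    # C lies on the (now vertical) perpendicular bisector of AB, offset by L
    # towards the palm center area.
    cx = ex
    cy = ey + half if palm_below_line else ey - half

    x0, x1 = int(cx - half), int(cx + half)
    y0, y1 = int(cy - half), int(cy + half)
    return image[y0:y1, x0:x1], (cx, cy)        # ROI for palm print features, and C
```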
Referring to fig. 4, in another embodiment, the present invention provides a palm feature recognition device, including:
the monitoring module 11 is configured to monitor a triggering event of palm feature recognition, and acquire a palm image when the triggering event is detected.
In the embodiment of the invention, the triggering event of palm recognition includes, but is not limited to, an event that a user clicks a key of palm print recognition, a gesture operation for palm print recognition, and the like.
After the triggering event is detected, the camera is started to acquire an image to be identified, and a palm image is detected in the image to be identified through the SSD (Single Shot MultiBox Detector) target detection algorithm. SSD is a single-shot detector that uses a VGG16 network (modified as described below) as its feature extractor; custom convolution layers are added after the base network, and predictions are made by convolutions over the resulting feature maps.
In one possible design, the target detection algorithm includes the following steps:
firstly, a picture (200 x 200) is input into a pre-trained classification network to obtain feature maps of different sizes, the network being a traditional VGG16 network modified as follows:
wherein the FC6 and FC7 layers of VGG16 are converted into convolution layers;
all Dropout layers and the FC8 layer are removed;
the Atrous algorithm (hole/dilated convolution) is added;
Pool5 is transformed from 2x2-S2 to 3x3-S1;
secondly, extracting the feature maps of the Conv4_3, Conv7, Conv8_2, Conv9_2, Conv10_2 and Conv11_2 layers, constructing 6 bounding boxes (BBs) of different scales at each point of these feature map layers, and then detecting and classifying each to generate a plurality of BBs;
thirdly, the BBs obtained from the different feature maps are combined, and overlapping or incorrect BBs are suppressed by NMS (non-maximum suppression) to generate the final BB set, namely the detection result. The target detection algorithm applied by the invention thus performs detection with a single convolutional pass over multi-scale feature maps.
The first determining module 12 is configured to extract coordinates of a first key point at a finger tip and a second key point at a finger root in the palm image in a pre-constructed planar coordinate system, and determine a finger direction of the palm in the palm image according to the coordinates of the first key point and the second key point.
Referring to fig. 2, fig. 2 is a schematic diagram of a plane rectangular coordinate system constructed according to the palm image. In the embodiment of the invention, a plane rectangular coordinate system needs to be constructed. Specifically, a point in the palm image is taken as an origin to construct a plane rectangular coordinate system, wherein the origin can be selected according to actual needs, and the method is not specifically limited herein.
In the embodiment of the present invention, after the palm image is acquired, the first key point (T in fig. 2) and the second key point (M in fig. 2) are extracted by a key point detection algorithm. The specific detection mode is as follows: a Cascade CNN model is trained, the training data being pictures annotated with the positions of fingertips and finger roots; after training, newly input hand images are detected using the model. The invention judges the finger direction of the palm in the palm image by comparing the coordinate values of the first key point and the second key point.
Specifically, continuing to refer to fig. 2, in the rectangular coordinate system shown in fig. 2, the coordinates (X1, Y1) of the first key point T and the coordinates (X2, Y2) of the second key point M are read. If Y1 > Y2, the finger direction of the palm in the palm image is judged to be the positive direction of the Y axis and the is_up flag is set to True; if Y1 < Y2, the finger direction is judged to be the negative direction of the Y axis and the is_up flag is set to False.
The second judging module 13 is configured to extract coordinates of a third key point at the root of the little finger, and judge palm features of the palm image according to the coordinates of the first key point, the coordinates of the third key point, and the finger pointing direction.
In the embodiment of the invention, the palm features are left and right attributes of the palm. The left and right attributes of the palm can be judged according to the coordinates of the first key point, the coordinates of the third key point and the finger pointing direction.
With continued reference to fig. 2, the coordinates (X3, Y3) of the third key point are read. When the finger direction is the positive direction of the Y axis and the abscissa X1 of the first key point is smaller than the abscissa X3 of the third key point, or when the finger direction is the negative direction of the Y axis and the abscissa X1 of the first key point is larger than the abscissa X3 of the third key point, the palm feature is judged to be a left hand; otherwise it is judged to be a right hand.
In one possible design, the present invention may also identify the left and right attributes of the finger, for example, by the profile information of each of the left and right hand fingers.
The recognised left and right attributes of the palm or finger can be applied to intelligent adjustment of virtual key positions. Specifically, according to the left and right attributes of the identified palm, the virtual keys on the terminal screen can be adjusted to the side of the screen corresponding to those attributes. For example, in one possible application scenario, the palm left/right attribute identification method of the invention is applied to game software: the palm print of the palm or the fingerprint of the finger of the user is monitored in real time, the left or right attribute is determined from it, and the virtual keys are intelligently adjusted to the corresponding area of the mobile terminal screen. For instance, when the current user touches the screen with a left-hand finger, the corresponding virtual keys are adjusted to the left area of the screen, which facilitates operation and avoids the user having to change fingers frequently to operate the mobile terminal.
In the embodiment of the invention, after the palm features are judged, the palm can further be adjusted to a horizontal position. Specifically, the coordinates of a fourth key point at the root of the index finger are extracted, the inclination angle of the palm in the palm image is judged according to the reference connecting line between the fourth key point and the third key point, and the palm image is rotated according to the inclination angle so as to adjust the reference connecting line to a horizontal position.
Referring to fig. 3, fig. 3 is a schematic diagram illustrating adjustment of the palm to the horizontal position. As shown in fig. 3, the fourth key point is point A and the third key point is point B. In the embodiment of the invention, the inclination angle of the palm is judged from the connecting line AB between the fourth and third key points, the position of the palm is corrected according to this angle, and the line AB is rotated to the horizontal position, thereby correcting the palm image. The purpose of this correction is to align the palm in the palm image so as to facilitate subsequent palm print recognition.
With continued reference to fig. 3, the invention may further perform palm segmentation on the adjusted palm image to extract the palm center region. Specifically, the perpendicular bisector of the reference connecting line is drawn from its midpoint E; a point C (X, Y) is taken on this perpendicular bisector, on the side of the palm center area, such that the length L of CE equals half of the reference line, and C is taken as the palm center point. For the rotated palm image, a palm center region is then determined with C (X, Y) as its center and the points p1 (X-L, Y-L), p2 (X+L, Y-L), p3 (X-L, Y+L) and p4 (X+L, Y+L) as its vertices, and palm print feature points are extracted from this region for recognition. Dividing out the palm center area facilitates subsequent palm print recognition.
In another embodiment, the present invention provides a computer-readable storage medium on which a computer program is stored, the program, when executed by a processor, implementing the palm feature identification method described above. The computer-readable storage medium includes, but is not limited to, any type of disk (including floppy disks, hard disks, optical disks, CD-ROMs and magneto-optical disks), ROMs (Read-Only Memories), RAMs (Random Access Memories), EPROMs (Erasable Programmable Read-Only Memories), EEPROMs (Electrically Erasable Programmable Read-Only Memories), flash memories, magnetic cards or optical cards. That is, the storage medium includes any medium that stores or transmits information in a form readable by a device (e.g., a computer or mobile phone), and may be a read-only memory, a magnetic disk, an optical disk, etc.
The computer-readable storage medium provided by the embodiment of the invention can monitor the triggering event of palm feature recognition and acquire a palm image when the triggering event is detected; extract coordinates of a first key point at a finger tip and a second key point at a finger root in the palm image in a pre-constructed plane coordinate system, and judge the finger direction of the palm in the palm image according to the coordinates of the first key point and the second key point; and extract the coordinates of a third key point at the root of the little finger, and judge the palm features of the palm image according to the coordinates of the first key point, the coordinates of the third key point and the finger direction. It can thus identify the left and right attributes of the palm and accurately identify palm print information based on those attributes. In addition, the virtual keys on the terminal screen can be adjusted to the side of the screen corresponding to the left and right attributes of the identified palm. For example, when applied to a game-related APP, the position of the virtual keys can be adjusted in time according to the identified left or right attribute of the palm or finger contacting the mobile terminal. When the user operates with the left hand, the corresponding virtual keys are adjusted to the left area of the mobile terminal screen; when the user operates with the right hand, the corresponding virtual keys are adjusted to the right area of the screen, thereby improving the user's reaction and operation speed and improving the user experience.
Furthermore, in yet another embodiment, the present invention provides a computer apparatus, as shown in fig. 5, which includes a processor 303, a memory 305, an input unit 307, and a display unit 309. Those skilled in the art will appreciate that the structural elements illustrated in FIG. 5 do not constitute a limitation of all computer devices, and may include more or fewer elements than shown, or may combine certain elements. The memory 305 may be used to store the application 301 and various functional modules, and the processor 303 runs the application 301 stored in the memory 305 to perform various functional applications of the device and data processing. The memory 305 may be or include both internal memory and external memory. The internal memory may include read-only memory (ROM), programmable ROM (PROM), electrically Programmable ROM (EPROM), electrically Erasable Programmable ROM (EEPROM), flash memory, or random access memory. The external memory may include a hard disk, floppy disk, ZIP disk, U-disk, tape, etc. The disclosed memory includes, but is not limited to, these types of memory. The memory 305 disclosed herein is by way of example only and not by way of limitation.
The input unit 307 is used for receiving signal input and for receiving keywords input by the user. The input unit 307 may include a touch panel and other input devices. The touch panel may collect touch operations on or near it (e.g., the user's operation on or near the touch panel using a finger, stylus or any other suitable object or accessory) and drive the corresponding connection device according to a preset program; other input devices may include, but are not limited to, one or more of a physical keyboard, function keys (e.g., play control keys, switch keys, etc.), a trackball, a mouse, a joystick, etc. The display unit 309 may be used to display information input by the user or provided to the user and the various menus of the computer device, and may take the form of a liquid crystal display, an organic light-emitting diode display, or the like. The processor 303 is the control center of the computer device; it connects the various parts of the entire computer using various interfaces and lines, and performs various functions and processes data by running or executing software programs and/or modules stored in the memory 305 and invoking data stored in the memory. The one or more processors 303 shown in fig. 5 are capable of performing and implementing the functions of the monitoring module 11, the first judgment module 12 and the second judgment module 13 shown in fig. 4.
In one implementation, the computer device includes a memory 305 and a processor 303, where the memory 305 stores computer readable instructions that, when executed by the processor, cause the processor 303 to perform the steps of a palm feature identification method described in the above embodiment.
The computer device provided by the embodiment of the invention can monitor the triggering event of palm feature recognition and acquire a palm image when the triggering event is detected; extract coordinates of a first key point at a finger tip and a second key point at a finger root in the palm image in a pre-constructed plane coordinate system, and judge the finger direction of the palm in the palm image according to the coordinates of the first key point and the second key point; and extract the coordinates of a third key point at the root of the little finger, and judge the palm features of the palm image according to the coordinates of the first key point, the coordinates of the third key point and the finger direction. It can thus identify the left and right attributes of the palm and accurately identify palm print information based on those attributes. In addition, the virtual keys on the terminal screen can be adjusted to the side of the screen corresponding to the left and right attributes of the identified palm. For example, when applied to a game-related APP, the position of the virtual keys can be adjusted in time according to the identified left or right attribute of the palm or finger contacting the mobile terminal. When the user operates with the left hand, the corresponding virtual keys are adjusted to the left area of the mobile terminal screen; when the user operates with the right hand, the corresponding virtual keys are adjusted to the right area of the screen, thereby improving the user's reaction and operation speed and improving the user experience.
The computer-readable storage medium provided in the embodiments of the present invention can implement the above embodiments of the palm feature identification method; the specific functional implementation is described in the method embodiments and will not be repeated here.
Those skilled in the art will appreciate that implementing all or part of the above-described methods in accordance with the embodiments may be accomplished by way of a computer program stored in a computer-readable storage medium, which when executed, may comprise the steps of the embodiments of the methods described above. The storage medium may be a nonvolatile storage medium such as a magnetic disk, an optical disk, a Read-Only Memory (ROM), or a random access Memory (Random Access Memory, RAM).
The technical features of the above-described embodiments may be combined arbitrarily. For brevity, not all possible combinations of these technical features have been described; however, as long as a combination of technical features contains no contradiction, it should be considered to be within the scope of this description.
The foregoing examples illustrate only a few embodiments of the invention and are described in detail herein without thereby limiting the scope of the invention. It should be noted that it will be apparent to those skilled in the art that several variations and modifications can be made without departing from the spirit of the invention, which are all within the scope of the invention. Accordingly, the scope of protection of the present invention is to be determined by the appended claims.
Claims (8)
1. A method for identifying palm features, the method comprising:
monitoring a triggering event of palm feature recognition, and acquiring a palm image when the triggering event is detected;
extracting coordinates of a first key point at a finger tip and a second key point at a finger root in the palm image in a pre-constructed plane coordinate system, and judging the finger direction of the palm in the palm image according to the coordinates of the first key point and the second key point, wherein the judging comprises:
reading the coordinates (X1, Y1) of the first keypoint and the coordinates (X2, Y2) of the second keypoint;
if Y1 is greater than Y2, judging that the finger direction of the palm in the palm image is the positive direction pointing to the Y axis;
if Y1 is less than Y2, judging that the finger direction of the palm in the palm image is the negative direction pointing to the Y axis;
extracting coordinates of a third key point at the root of the little finger, and judging left and right palm attribute characteristics of the palm image according to the coordinates of the first key point, the coordinates of the third key point and the finger direction, wherein the judging comprises:
reading coordinates (X3, Y3) of the third keypoint;
and when the finger points to a positive direction pointing to the Y axis and the abscissa X1 of the first key point is smaller than the abscissa X3 of the third key point, or when the finger points to a negative direction pointing to the Y axis and the abscissa X1 of the first key point is larger than the abscissa X3 of the third key point, judging that the left and right attribute characteristics of the palm are left hand, otherwise, right hand.
2. The method of identifying palm features of claim 1, further comprising:
and extracting coordinates of a fourth key point at the root of the index finger, judging the inclination angle of the palm in the palm image according to the reference connecting line of the fourth key point and the third key point, and rotating the palm image according to the inclination angle so as to adjust the reference connecting line to a horizontal position.
3. The method of identifying palm features of claim 2, further comprising:
taking the midpoint E of the reference connecting line as a starting point to make a perpendicular bisector of the reference connecting line;
a point C (X, Y) is taken on the perpendicular bisector of the reference connecting line, on the side of the palm center area, so that the length L of CE is equal to half of the reference line, and the point C is taken as the palm center point.
4. A method of identifying palm features as claimed in claim 3, further comprising:
for the rotated palm image, C (X, Y) is taken as a midpoint, and points p1 (X-L, Y-L), p2 (X+L, Y-L), p3 (X-L, Y+L) and p4 (X+L, Y+L) are taken as vertexes to determine a palm center region;
and extracting palm print characteristic points of the palm center area to identify the palm print characteristic points.
5. The method according to claim 1, wherein when the trigger event of palm recognition is a sliding press operation acting on a terminal screen, after the determining of the palm left-right attribute feature of the palm image according to the coordinates of the first key point, the coordinates of the third key point, and the finger orientation, the method further comprises:
and adjusting the virtual keys on the terminal screen to one side of the screen corresponding to the left and right palm attribute characteristics according to the judged left and right palm attribute characteristics.
6. A left-right attribute recognition apparatus of a palm, the apparatus comprising:
the monitoring module is used for monitoring a triggering event of palm feature recognition, and acquiring a palm image when the triggering event is detected;
the first judging module is used for extracting coordinates of a first key point at a finger tip and a second key point at a finger root in the palm image in a pre-constructed plane coordinate system, and judging the finger direction of the palm in the palm image according to the coordinates of the first key point and the second key point, wherein the judging comprises:
reading the coordinates (X1, Y1) of the first keypoint and the coordinates (X2, Y2) of the second keypoint;
if Y1 is greater than Y2, judging that the finger direction of the palm in the palm image is the positive direction pointing to the Y axis;
if Y1 is less than Y2, judging that the finger direction of the palm in the palm image is the negative direction pointing to the Y axis;
the second judging module is configured to extract coordinates of a third key point at a root of the little finger, and judge left and right palm attribute features of the palm image according to the coordinates of the first key point, the coordinates of the third key point, and the finger pointing direction, and includes:
reading coordinates (X3, Y3) of the third keypoint;
and when the finger points to a positive direction pointing to the Y axis and the abscissa X1 of the first key point is smaller than the abscissa X3 of the third key point, or when the finger points to a negative direction pointing to the Y axis and the abscissa X1 of the first key point is larger than the abscissa X3 of the third key point, judging that the left and right attribute characteristics of the palm are left hand, otherwise, right hand.
7. A computer-readable storage medium, characterized in that the computer-readable storage medium has stored thereon a computer program which, when executed by a processor, implements the steps of the palm feature identification method of any one of claims 1 to 5.
8. A computer device comprising a memory and a processor, the memory having stored therein computer readable instructions which, when executed by the processor, cause the processor to perform the steps of the palm feature identification method of any one of claims 1 to 5.
Priority Applications (1)
Application Number | Priority Date | Filing Date | Title |
---|---|---|---|
CN201811583897.XA CN109829368B (en) | 2018-12-24 | 2018-12-24 | Palm feature recognition method and device, computer equipment and storage medium |
Applications Claiming Priority (1)
Application Number | Priority Date | Filing Date | Title |
---|---|---|---|
CN201811583897.XA CN109829368B (en) | 2018-12-24 | 2018-12-24 | Palm feature recognition method and device, computer equipment and storage medium |
Publications (2)
Publication Number | Publication Date |
---|---|
CN109829368A CN109829368A (en) | 2019-05-31 |
CN109829368B true CN109829368B (en) | 2024-02-20 |
Family
ID=66861044
Family Applications (1)
Application Number | Title | Priority Date | Filing Date |
---|---|---|---|
CN201811583897.XA Active CN109829368B (en) | 2018-12-24 | 2018-12-24 | Palm feature recognition method and device, computer equipment and storage medium |
Country Status (1)
Country | Link |
---|---|
CN (1) | CN109829368B (en) |
Families Citing this family (10)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
CN110851048A (en) * | 2019-09-30 | 2020-02-28 | 华为技术有限公司 | Method for adjusting control and electronic equipment |
CN110993107B (en) * | 2019-12-25 | 2023-06-09 | 新绎健康科技有限公司 | Human body five-finger image processing method and device |
CN112132099A (en) * | 2020-09-30 | 2020-12-25 | 腾讯科技(深圳)有限公司 | Identity recognition method, palm print key point detection model training method and device |
CN112749512B (en) * | 2021-01-18 | 2024-01-26 | 杭州易现先进科技有限公司 | Gesture estimation optimization method, system and electronic device |
CN113221891B (en) * | 2021-05-12 | 2022-12-09 | 佛山育脉科技有限公司 | Method and device for adjusting identification angle of palm vein image |
CN113392787A (en) * | 2021-06-22 | 2021-09-14 | 中国工商银行股份有限公司 | Palm image preprocessing method, device, equipment, medium and program product |
CN114155296A (en) * | 2021-11-19 | 2022-03-08 | 宁波芯然科技有限公司 | Method for determining central area of palm image |
CN113963158A (en) * | 2021-11-25 | 2022-01-21 | 佳都科技集团股份有限公司 | Palm vein image region-of-interest extraction method and device |
CN117472189B (en) * | 2023-12-27 | 2024-04-09 | 大连三通科技发展有限公司 | Typing or touch control realization method with physical sense |
CN117475539B (en) * | 2023-12-28 | 2024-04-12 | 深圳市盛思达通讯技术有限公司 | Entrance guard management method and system based on palm print recognition |
Citations (9)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
JP2007052534A (en) * | 2005-08-16 | 2007-03-01 | Kddi Corp | Palm print authentication device, palm print authentication program, palm print authentication method, palm print image extraction method, and portable telephone terminal with palm print authentication device |
JP2008217307A (en) * | 2007-03-02 | 2008-09-18 | Kddi Corp | Palm print authentication device, portable telephone terminal, program and palm print authentication method |
JP2010026658A (en) * | 2008-07-16 | 2010-02-04 | Kddi Corp | Palm position detection device, palm print authentication device, cellphone terminal, program and palm position detection method |
JP2010113530A (en) * | 2008-11-06 | 2010-05-20 | Nippon Hoso Kyokai <Nhk> | Image recognition device and program |
CN102073843A (en) * | 2010-11-05 | 2011-05-25 | 沈阳工业大学 | Non-contact rapid hand multimodal information fusion identification method |
CN102163282A (en) * | 2011-05-05 | 2011-08-24 | 汉王科技股份有限公司 | Method and device for acquiring interested area in palm print image |
CN103955674A (en) * | 2014-04-30 | 2014-07-30 | 广东瑞德智能科技股份有限公司 | Palm print image acquisition device and palm print image positioning and segmenting method |
CN104123537A (en) * | 2014-07-04 | 2014-10-29 | 西安理工大学 | Rapid authentication method based on handshape and palmprint recognition |
CN104580143A (en) * | 2014-11-09 | 2015-04-29 | 李若斌 | Security authentication method based on gesture recognition, terminal, server and system |
Family Cites Families (2)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
US10142836B2 (en) * | 2000-06-09 | 2018-11-27 | Airport America, Llc | Secure mobile device |
JP5747642B2 (en) * | 2011-05-06 | 2015-07-15 | 富士通株式会社 | Biometric authentication device, biometric authentication system, biometric authentication server, biometric authentication client, and biometric authentication device control method |
- 2018-12-24 CN CN201811583897.XA patent/CN109829368B/en active Active
Patent Citations (9)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
JP2007052534A (en) * | 2005-08-16 | 2007-03-01 | Kddi Corp | Palm print authentication device, palm print authentication program, palm print authentication method, palm print image extraction method, and portable telephone terminal with palm print authentication device |
JP2008217307A (en) * | 2007-03-02 | 2008-09-18 | Kddi Corp | Palm print authentication device, portable telephone terminal, program and palm print authentication method |
JP2010026658A (en) * | 2008-07-16 | 2010-02-04 | Kddi Corp | Palm position detection device, palm print authentication device, cellphone terminal, program and palm position detection method |
JP2010113530A (en) * | 2008-11-06 | 2010-05-20 | Nippon Hoso Kyokai <Nhk> | Image recognition device and program |
CN102073843A (en) * | 2010-11-05 | 2011-05-25 | 沈阳工业大学 | Non-contact rapid hand multimodal information fusion identification method |
CN102163282A (en) * | 2011-05-05 | 2011-08-24 | 汉王科技股份有限公司 | Method and device for acquiring interested area in palm print image |
CN103955674A (en) * | 2014-04-30 | 2014-07-30 | 广东瑞德智能科技股份有限公司 | Palm print image acquisition device and palm print image positioning and segmenting method |
CN104123537A (en) * | 2014-07-04 | 2014-10-29 | 西安理工大学 | Rapid authentication method based on handshape and palmprint recognition |
CN104580143A (en) * | 2014-11-09 | 2015-04-29 | 李若斌 | Security authentication method based on gesture recognition, terminal, server and system |
Also Published As
Publication number | Publication date |
---|---|
CN109829368A (en) | 2019-05-31 |
Similar Documents
Publication | Publication Date | Title |
---|---|---|
CN109829368B (en) | Palm feature recognition method and device, computer equipment and storage medium | |
CN110232311B (en) | Method and device for segmenting hand image and computer equipment | |
US20210271340A1 (en) | Gesture recognition devices and methods | |
CN106537305B (en) | Method for classifying touch events and touch sensitive device | |
CN107679446B (en) | Human face posture detection method, device and storage medium | |
US11776322B2 (en) | Pinch gesture detection and recognition method, device and system | |
CN103065134A (en) | Fingerprint identification device and method with prompt information | |
CN110008824B (en) | Palmprint recognition method, palmprint recognition device, palmprint recognition computer device and palmprint recognition storage medium | |
CN111414837A (en) | Gesture recognition method and device, computer equipment and storage medium | |
CN108781252B (en) | Image shooting method and device | |
JP6487642B2 (en) | A method of detecting a finger shape, a program thereof, a storage medium of the program, and a system for detecting a shape of a finger. | |
CN109375833B (en) | Touch instruction generation method and device | |
Wang et al. | A new hand gesture recognition algorithm based on joint color-depth superpixel earth mover's distance | |
CN110008825A (en) | Palm grain identification method, device, computer equipment and storage medium | |
CN104915009B (en) | The method and system of gesture anticipation | |
CN112489084B (en) | Trajectory tracking system and method based on face recognition | |
WO2024093665A1 (en) | Identity recognition image processing method and apparatus, computer device, and storage medium | |
US20170177847A1 (en) | Apparatus and method for verifying an identity of a user | |
CN113282164A (en) | Processing method and device | |
JP2013186698A (en) | Handwriting management program and recording display device | |
WO2018068484A1 (en) | Three-dimensional gesture unlocking method, method for acquiring gesture image, and terminal device | |
Sonoda et al. | A letter input system based on handwriting gestures | |
CN112596603A (en) | Gesture control method, device, equipment and storage medium for nuclear power station control system | |
CN114078258A (en) | Image matching method applied to fingerprint identification and related device | |
CN116863541B (en) | Dynamic gesture recognition method and device, related equipment and handwriting recognition method |
Legal Events
Date | Code | Title | Description |
---|---|---|---|
PB01 | Publication | ||
PB01 | Publication | ||
SE01 | Entry into force of request for substantive examination | ||
SE01 | Entry into force of request for substantive examination | ||
GR01 | Patent grant | ||
GR01 | Patent grant |