CN110147730B - Palm print recognition method and device and terminal equipment - Google Patents

Palm print recognition method and device and terminal equipment

Info

Publication number
CN110147730B
CN110147730B (application CN201910302774.2A)
Authority
CN
China
Prior art keywords
image
palm
palm print
identified
vertex
Prior art date
Legal status
Active
Application number
CN201910302774.2A
Other languages
Chinese (zh)
Other versions
CN110147730A (en)
Inventor
侯丽
王福晴
Current Assignee
Ping An Technology Shenzhen Co Ltd
Original Assignee
Ping An Technology Shenzhen Co Ltd
Priority date
Filing date
Publication date
Application filed by Ping An Technology Shenzhen Co Ltd filed Critical Ping An Technology Shenzhen Co Ltd
Priority to CN201910302774.2A priority Critical patent/CN110147730B/en
Publication of CN110147730A publication Critical patent/CN110147730A/en
Application granted granted Critical
Publication of CN110147730B publication Critical patent/CN110147730B/en
Active legal-status Critical Current
Anticipated expiration legal-status Critical

Links

Classifications

    • G — PHYSICS
    • G06 — COMPUTING; CALCULATING OR COUNTING
    • G06V — IMAGE OR VIDEO RECOGNITION OR UNDERSTANDING
    • G06V40/00 — Recognition of biometric, human-related or animal-related patterns in image or video data
    • G06V40/10 — Human or animal bodies, e.g. vehicle occupants or pedestrians; Body parts, e.g. hands
    • G06V40/12 — Fingerprints or palmprints

Landscapes

  • Engineering & Computer Science (AREA)
  • Human Computer Interaction (AREA)
  • Physics & Mathematics (AREA)
  • General Physics & Mathematics (AREA)
  • Multimedia (AREA)
  • Theoretical Computer Science (AREA)
  • Collating Specific Patterns (AREA)

Abstract

The application is applicable to the technical field of biometric recognition and provides a palm print recognition method, a palm print recognition device and terminal equipment. The method comprises the following steps: acquiring a palm print image to be identified, wherein the palm print image is a rectangular image of a palm print area in a palm; selecting a target vertex from the four vertices of the palm print image to be identified according to a preset vertex selection rule, and determining a triangle image area, taken as the image area associated with the thumb root area, according to the target vertex; cutting off the image area associated with the thumb root area from the palm print image to be identified to obtain a target palm print image; and performing palm print recognition on the target palm print image by using a preset palm print recognition algorithm to obtain a palm print recognition result. The application can solve the problems in existing palm print recognition methods that palm print features become unstable due to posture changes of the thumb, making accurate palm print recognition difficult and reducing recognition accuracy.

Description

Palm print recognition method and device and terminal equipment
Technical Field
The application belongs to the technical field of biological recognition, and particularly relates to a palm print recognition method, a palm print recognition device and terminal equipment.
Background
With the development of science and technology, more and more enterprises adopt biometric methods for personnel identity verification. Palm print recognition is a non-invasive biometric identification technology: features in a palm print image can be used to identify the identity of a user.
A palm print image is an image of the palm from the fingertips to the wrist. When the palm print image is collected, the posture of each finger has a certain influence on the palm print features, and the influence of the thumb is particularly significant: posture changes of the thumb cause the palm print features in part of the palm print image to be unstable, so that palm print recognition is difficult to perform accurately and recognition accuracy is affected.
In summary, in existing palm print recognition methods, palm print features easily become unstable due to posture changes of the thumb, so that palm print recognition is difficult to perform accurately and the accuracy of palm print recognition is affected.
Disclosure of Invention
In view of the above, embodiments of the present application provide a palm print recognition method, apparatus, and terminal device, so as to solve the problem in existing palm print recognition methods that palm print features are not stable, making accurate palm print recognition difficult and affecting recognition accuracy.
A first aspect of an embodiment of the present application provides a palmprint identifying method, including:
acquiring a palmprint image to be identified, wherein the palmprint image is a rectangular image of a palmprint area in a palm;
selecting a target vertex from four vertices of the palm print image to be identified according to a preset vertex selection rule, and determining a triangle image area as an image area associated with the root area of the thumb according to the target vertex;
cutting off an image area associated with the thumb root area in the palm print image to be identified to obtain a target palm print image;
and carrying out palm print recognition on the target palm print image by using a preset palm print recognition algorithm to obtain a palm print recognition result.
A second aspect of an embodiment of the present application provides a palmprint recognition apparatus, including:
the image acquisition module is used for acquiring a palmprint image to be identified, wherein the palmprint image is a rectangular image of a palmprint area in a palm;
the association area module is used for selecting a target vertex from four vertices of the palm print image to be identified according to a preset vertex selection rule, and determining a triangle image area as an image area associated with the root area of the thumb according to the target vertex;
the palm print cutting module is used for cutting off an image area associated with the thumb root area in the palm print image to be identified to obtain a target palm print image;
and the palm print recognition module is used for carrying out palm print recognition on the target palm print image by using a preset palm print recognition algorithm to obtain a palm print recognition result.
A third aspect of the embodiments of the present application provides a terminal device comprising a memory, a processor and a computer program stored in the memory and executable on the processor, the processor implementing the steps of the method as described above when executing the computer program.
A fourth aspect of the embodiments of the present application provides a computer readable storage medium storing a computer program which, when executed by a processor, implements the steps of the method as described above.
Compared with the prior art, the embodiment of the application has the beneficial effects that:
in the palm print identification method, the triangular image area which is related to the thumb root area in the palm print image to be identified is cut off, and then palm print identification is carried out, and as the palm print characteristics of the image area which is related to the thumb root area are not fixed and are easy to change due to the change of the gesture of the thumb, the palm print characteristics of the image area which is related to the thumb root area are not considered in the identification process, and only the image of the area with the palm print characteristics which is relatively fixed is used for carrying out palm print identification, so that the accuracy of palm print identification can be improved, and the problems that the palm print characteristics are not fixed and are difficult to accurately carry out palm print identification and the accuracy of palm print identification is influenced due to the change of the gesture of the thumb in the traditional palm print identification method are solved.
Drawings
In order to more clearly illustrate the technical solutions of the embodiments of the present application, the drawings that are needed in the embodiments or the description of the prior art will be briefly introduced below, and it is obvious that the drawings in the following description are only some embodiments of the present application, and that other drawings can be obtained according to these drawings without inventive effort for a person skilled in the art.
FIG. 1 is a schematic diagram of an implementation flow of a palmprint recognition method according to an embodiment of the present application;
FIG. 2 is a schematic diagram of a palm print recognition device according to an embodiment of the present application;
fig. 3 is a schematic diagram of a terminal device according to an embodiment of the present application;
FIG. 4 is a schematic diagram of a palm provided by an embodiment of the present application;
fig. 5 is a schematic diagram of a palmprint image to be identified according to an embodiment of the present application.
Detailed Description
In the following description, for purposes of explanation and not limitation, specific details are set forth such as the particular system architecture, techniques, etc., in order to provide a thorough understanding of the embodiments of the present application. It will be apparent, however, to one skilled in the art that the present application may be practiced in other embodiments that depart from these specific details. In other instances, detailed descriptions of well-known systems, devices, circuits, and methods are omitted so as not to obscure the description of the present application with unnecessary detail.
In order to illustrate the technical scheme of the application, the following description is made by specific examples.
It should be understood that the terms "comprises" and/or "comprising," when used in this specification and the appended claims, specify the presence of stated features, integers, steps, operations, elements, and/or components, but do not preclude the presence or addition of one or more other features, integers, steps, operations, elements, components, and/or groups thereof.
It is also to be understood that the terminology used in the description of the application herein is for the purpose of describing particular embodiments only and is not intended to be limiting of the application. As used in this specification and the appended claims, the singular forms "a," "an," and "the" are intended to include the plural forms as well, unless the context clearly indicates otherwise.
It should be further understood that the term "and/or" as used in the present specification and the appended claims refers to any and all possible combinations of one or more of the associated listed items, and includes such combinations.
As used in this specification and the appended claims, the term "if" may be interpreted as "when", "once", "in response to a determination" or "in response to detection", depending on the context. Similarly, the phrase "if it is determined" or "if [a described condition or event] is detected" may be interpreted, depending on the context, to mean "upon determining", "in response to determining", "upon detecting [the described condition or event]" or "in response to detecting [the described condition or event]".
In addition, in the description of the present application, the terms "first," "second," "third," etc. are used merely to distinguish between descriptions and should not be construed as indicating or implying relative importance.
Embodiment one:
referring to fig. 1, a method for identifying a palm print according to a first embodiment of the present application includes:
step S101, acquiring a palm print image to be identified, wherein the palm print image is a rectangular image of a palm print area in a palm;
when the palm print identification is required, the palm print identification device can firstly acquire a palm print image to be identified, wherein the palm print image is a rectangular image of a palm print area in a palm. Step S103, selecting a target vertex from four vertices of the palm print image to be identified according to a preset vertex selection rule, and determining a triangle image area as an image area associated with the root area of the thumb according to the target vertex;
when the palm print image to be identified is obtained, a target vertex can be selected from the four vertices of the palm print image to be identified according to a preset vertex selection rule, a triangle image area is determined according to the target vertex, and the triangle image area is used as an image area associated with the thumb root area.
Step S103, cutting off an image area associated with the thumb root area in the palm print image to be identified to obtain a target palm print image;
and after the image area associated with the thumb root area is acquired, cutting off the image area associated with the thumb root area in the palm print image to be identified, and obtaining a target palm print image.
The image cut-out may be performed by setting the pixel value of each pixel point in the image area associated with the thumb root area to 0.
And step S104, carrying out palm print recognition on the target palm print image by using a preset palm print recognition algorithm to obtain a palm print recognition result.
After the target palm print image is obtained, a palm print recognition algorithm can be used to perform palm print recognition on the target palm print image and obtain a palm print recognition result. The preset palm print recognition algorithm may be a preset neural network model; for example, a trained MobileNet model can be selected for palm print recognition. The neural network model can be trained with sample palm print images in which the image area associated with the thumb root area has been cut off.
Further, before the acquiring the palmprint image to be identified, the method further comprises:
a1, receiving a palm print identification instruction, and acquiring a first palm image according to the palm print identification instruction;
when the user needs to conduct palm print recognition, palm print recognition instructions can be sent to the palm print recognition device by touching palm print detection buttons on the palm print recognition device or making corresponding operation gestures and the like, the palm print recognition function of the palm print recognition device is triggered, the palm print recognition device can start the camera according to the palm print recognition instructions, the target detection algorithm is used for detecting palm areas in a picture shot by the camera, and the first palm images are collected.
The specific type of the target detection algorithm may be selected according to the actual situation, for example, the SSD (Single Shot MultiBox Detector) algorithm may be selected.
A2, rotating the first palm image to enable the palm in the first palm image to be aligned along a preset direction, and obtaining a second palm image;
after the first palm image is acquired, the palm direction in the first palm image may be any direction, so that the first palm image can be rotated, the palm in the first palm image is aligned along the preset direction, a second palm image is obtained, and an aligned palm print image to be identified can be obtained by cutting the aligned second palm image.
And A3, cutting the second palm image by a preset palm print image cutting algorithm to obtain a palm print image to be identified.
After the second palm image is obtained, a palm print area in the second palm image can be identified and cut by using a preset palm print image cutting algorithm, and a palm print image to be identified is obtained.
After obtaining the palm print image to be identified through a preset palm print image cutting algorithm, the shape of the palm print image to be identified can be detected, and whether the palm print image to be identified is a rectangular image or not can be judged.
When the palm print image to be identified is not a rectangular image, calculating a rectangular area with the largest area in the palm print image to be identified, wherein the sides of the rectangular area are respectively in the horizontal direction and the vertical direction.
After the rectangular area is determined through calculation, cutting the image of the rectangular area in the palm print image to be identified to obtain a rectangular palm print image, and taking the rectangular palm print image as a new palm print image to be identified.
Further, the rotating the first palm image, so that the palm in the first palm image is aligned along a preset direction, and the obtaining the second palm image specifically includes:
b1, carrying out key point identification on the first palm image to obtain palm key points of the first palm image;
when the first palm image is rotated, key point identification can be performed on the first palm image to obtain palm key points in the first palm image, the palm key points can comprise finger root points and fingertip points corresponding to all fingers, a key point identification algorithm can be selected according to practical conditions, for example, a trained Cascade CNN model can be selected to perform palm key point identification, and before the Cascade CNN model is used, the Cascade CNN model can be trained, wherein the training image is a palm sample image marked with the finger tip points and the finger root points.
And B2, rotating the first palm image by taking the connecting line of the index finger root point and the little finger root point among the palm key points as a first line segment, so that the first line segment is horizontal and the fingers are oriented upwards, to obtain a second palm image.
After the palm key points are obtained, the connecting line of the index finger root point and the little finger root point among the palm key points can be taken as the first line segment, and the first palm image can be rotated so that the first line segment is horizontal and the fingers face upwards.
The finger orientation can be identified in various ways; for example, it can be judged from a specified fingertip point and a specified finger root point. The middle finger root point and the middle fingertip point can be selected and their ordinates compared: if the ordinate of the middle finger root point is greater than the ordinate of the middle fingertip point, the fingers face upwards; if the ordinate of the middle finger root point is less than the ordinate of the middle fingertip point, the fingers face downwards.
By rotating the first palm image, an aligned second palm image can be obtained, and the palm print image to be identified can be conveniently intercepted.
Further, the step of cutting the second palm image by a preset palm print image cutting algorithm to obtain a palm print image to be identified specifically includes:
and C1, determining a palm center point according to the perpendicular bisectors of the first line segments, taking the palm center point as a rectangular midpoint, taking the first line segments as the first sides of the rectangle, and cutting rectangular areas in the second palm images to obtain the palm print images to be identified.
After the second palm image is obtained, the palm print image to be identified can be intercepted in the second palm image, at this time, the palm point can be determined according to the perpendicular bisector of the first line segment, the palm point is taken as the middle point of the rectangle, the determination method of the palm point can be set according to practical situations, for example, when the rectangle is square, the length from the intersection point of the perpendicular bisector and the first line segment to the palm point is half of the length from the intersection point of the perpendicular bisector and the first line segment to the palm point, when the rectangle is not square, the length from the intersection point of the perpendicular bisector and the first line segment to the palm point can be set according to practical situations, and the position of the palm point is determined according to the preset length from the intersection point of the perpendicular bisector and the first line segment to the palm point.
Further, the selecting a target vertex from the four vertices of the palm print image to be identified according to a preset vertex selection rule, and determining a triangle image area as an image area associated with the thumb root area according to the target vertex specifically includes:
d1, selecting a target vertex from four vertices of the palm print image to be identified according to a preset vertex selection rule;
when the image area associated with the thumb root area needs to be determined, a target vertex can be selected from four vertices of the palm print image to be identified according to a preset vertex selection rule.
And D2, taking two sides connected with the target vertex as a first side and a second side of the triangle, determining a triangle image area according to the target vertex, a first preset length corresponding to the first side and a second preset length corresponding to the second side, and taking the triangle image area as an image area associated with the thumb root area.
The two sides of the palm print image that meet at the target vertex are taken as the first side and the second side of the triangle, and the first preset length corresponding to the first side and the second preset length corresponding to the second side can be set in advance. Since the palm print image to be identified is a rectangular image, the first side and the second side are perpendicular to each other; given the right-angle vertex and the lengths of the two right-angle sides, the right-triangle image area can be determined and taken as the image area associated with the thumb root area.
Further, the preset vertex selection rule specifically includes:
e1, acquiring left and right hand attributes corresponding to the palm print image to be identified;
when the palm print image to be identified is the aligned palm print image, the left-right hand attribute corresponding to the palm print image to be identified can be acquired first.
E2, when the left-right hand attribute of the palm print image to be identified is a left-hand attribute and the finger is upward, selecting a left lower corner vertex of the palm print image to be identified as a target vertex;
the palm print image to be identified is generally an aligned image with the fingers facing upwards, and is rectangular in shape, and the rectangle includes four vertices, namely an upper left corner vertex, a lower left corner vertex, an upper right corner vertex and a lower right corner vertex.
When the left-right hand attribute of the palm print image to be identified is left-hand attribute and the finger faces upwards, the left lower corner vertex of the palm print image to be identified is used as the target vertex of the triangle.
And E3, when the left-right hand attribute of the palm print image to be identified is a right-hand attribute and the finger is upward, selecting the right lower corner vertex of the palm print image to be identified as a target vertex.
When the left-right hand attribute of the palm print image to be identified is the right hand attribute and the finger faces upwards, the right lower corner vertex of the palm print image to be identified is used as the target vertex of the triangle.
E4, when the left-right hand attribute of the palm print image to be identified is left-hand attribute and the finger faces downwards, selecting the top right corner vertex of the palm print image to be identified as a target vertex;
when the left-right hand attribute of the palm print image to be identified is left-hand attribute and the finger is downward, the right-upper corner vertex of the palm print image to be identified is used as the target vertex of the triangle.
And E5, when the left-right hand attribute of the palm print image to be identified is a right hand attribute and the finger faces downwards, selecting the top left corner vertex of the palm print image to be identified as a target vertex.
When the left-right hand attribute of the palm print image to be identified is the right-hand attribute and the fingers face downwards, the upper left corner vertex of the palm print image to be identified is used as the target vertex of the triangle.
The left-right hand attribute of the palm print image to be identified can be determined from the abscissas of specified key points among the palm key points. For example, the middle finger root point and the little finger root point can be selected as the specified key points: when the abscissa of the middle finger root point is smaller than the abscissa of the little finger root point, the left-right hand attribute of the palm print image to be identified is the left-hand attribute; when the abscissa of the middle finger root point is greater than the abscissa of the little finger root point, it is the right-hand attribute.
As shown in fig. 4 and fig. 5, the image of the rectangular area is the palm print image to be identified. When the left-right hand attribute of the palm print image to be identified is the left-hand attribute and the fingers face upwards, the lower left corner vertex of the palm print image to be identified is taken as the target vertex of the triangle, the two sides connected with the target vertex are taken as the first side and the second side of the triangle, the triangle area (drawn with thickened lines) is determined according to the first preset length and the second preset length, and the pixel values of the pixel points in the triangle area are set to 0 to obtain the target palm print image.
In the palm print recognition method provided in the first embodiment, the triangular image area associated with the thumb root area in the palm print image to be identified is cut off before palm print recognition is performed. Because the palm print features of the image area associated with the thumb root area are not fixed and change easily with the posture of the thumb, these features are not considered in the recognition process; only the image of the area whose palm print features are relatively fixed is used for palm print recognition. This improves the accuracy of palm print recognition and solves the problem in existing palm print recognition methods that palm print features become unstable due to posture changes of the thumb, making accurate palm print recognition difficult and affecting recognition accuracy.
Before acquiring the palm print image to be identified, the first palm image can be acquired first, the first palm image is rotated to enable the palm to be aligned with the preset direction, and the second palm image is obtained, so that the aligned palm print image to be identified can be acquired conveniently.
In the process of rotating and aligning the first palm image, the connecting line of the index finger root point and the little finger root point can be used as the first line segment, and the rotation and alignment can be carried out according to this line segment to obtain the second palm image. The palm print image to be identified is then cut out according to the first line segment, which simplifies the cutting of the palm print image: the palm print image to be identified can be obtained without complex calculation.
In determining the image area associated with the thumb root area, the image area associated with the thumb root area may be selected by a triangle, and the triangle area may be determined by selecting a target vertex of the triangle from four vertices of the palm print image to be identified by the left-right hand attribute and the finger orientation, with two sides connected to the target vertex as two right-angle sides of the triangle.
When determining the left-right hand attribute of the palm print image to be identified, the attribute can be identified from the palm key points. Compared with the existing approach of judging handedness from the thenar eminence, this simplifies the calculation process and improves the judgment accuracy.
It should be understood that the sequence numbers of the steps in the foregoing embodiments do not imply an order of execution; the execution order of the processes should be determined by their functions and internal logic, and should not constitute any limitation on the implementation process of the embodiments of the present application.
Embodiment two:
a second embodiment of the present application provides a palm print recognition device, for convenience of description, only the portion relevant to the present application is shown, as shown in fig. 2, the palm print recognition device includes,
the image acquisition module 201 is configured to acquire a palmprint image to be identified, where the palmprint image is a rectangular image of a palmprint area in a palm;
the association area module 202 is configured to select a target vertex from four vertices of the palm print image to be identified according to a preset vertex selection rule, and determine a triangle image area as an image area associated with the root area of the thumb according to the target vertex;
the palm print cutting module 203 is configured to cut off an image area associated with the thumb root area in the palm print image to be identified, so as to obtain a target palm print image;
the palm print recognition module 204 is configured to perform palm print recognition on the target palm print image by using a preset palm print recognition algorithm, so as to obtain a palm print recognition result.
Further, the device further comprises:
the image acquisition module is used for receiving a palm print identification instruction and acquiring a first palm image according to the palm print identification instruction;
the image alignment module is used for rotating the first palm image so that the palms in the first palm image are aligned along a preset direction to obtain a second palm image;
and the image cutting module is used for cutting the second palm image by a preset palm print image cutting algorithm to obtain a palm print image to be identified.
Further, the image alignment module specifically includes:
the key point sub-module is used for carrying out key point identification on the first palm image to obtain palm key points of the first palm image;
and the rotating sub-module is used for rotating the first palm image by taking the connecting line of the index finger root point and the little finger root point among the palm key points as a first line segment, so that the first line segment is horizontal and the fingers are oriented upwards, to obtain a second palm image.
Further, the image cutting module is specifically configured to determine a palm center point according to the perpendicular bisector of the first line segment, and to cut a rectangular area from the second palm image with the palm center point as the midpoint of the rectangle and the first line segment as a first side of the rectangle, so as to obtain the palm print image to be identified.
Further, the association zone module 202 specifically includes:
the vertex sub-module is used for selecting a target vertex from four vertices of the palm print image to be identified according to a preset vertex selection rule;
and the region sub-module is used for taking two sides connected with the target vertex as a first side and a second side of a triangle, determining a triangle image region according to the target vertex, a first preset length corresponding to the first side and a second preset length corresponding to the second side, and taking the triangle image region as the image region associated with the thumb root region.
Further, the vertex submodule specifically includes:
the attribute sub-module is used for acquiring left-right hand attributes corresponding to the palm print image to be identified;
the left upper sub-module is used for selecting a left lower corner vertex of the palm print image to be identified as a target vertex when the left and right hand attribute of the palm print image to be identified is a left hand attribute and the finger is upward;
the right upper sub-module is used for selecting a right lower corner vertex of the palm print image to be identified as a target vertex when the left-right hand attribute of the palm print image to be identified is a right-hand attribute and the finger is upward;
the left lower sub-module is used for selecting the top right corner vertex of the palm print image to be identified as a target vertex when the left-right hand attribute of the palm print image to be identified is left-hand attribute and the finger faces downwards;
and the lower right sub-module is used for selecting the top left corner vertex of the palm print image to be identified as a target vertex when the left-right hand attribute of the palm print image to be identified is a right-hand attribute and the finger faces downwards.
It should be noted that, because the content of information interaction and execution process between the above devices/units is based on the same concept as the method embodiment of the present application, specific functions and technical effects thereof may be referred to in the method embodiment section, and will not be described herein.
Embodiment III:
fig. 3 is a schematic diagram of a terminal device according to a third embodiment of the present application. As shown in fig. 3, the terminal device 3 of this embodiment includes: a processor 30, a memory 31 and a computer program 32 stored in said memory 31 and executable on said processor 30. The processor 30, when executing the computer program 32, implements the steps of the above-described palmprint recognition method embodiment, such as steps S101 to S104 shown in fig. 1. Alternatively, the processor 30 may perform the functions of the modules/units of the apparatus embodiments described above, such as the functions of the modules 201-204 shown in fig. 2, when executing the computer program 32.
Illustratively, the computer program 32 may be partitioned into one or more modules/units that are stored in the memory 31 and executed by the processor 30 to complete the present application. The one or more modules/units may be a series of computer program instruction segments capable of performing specific functions for describing the execution of the computer program 32 in the terminal device 3. For example, the computer program 32 may be divided into an image acquisition module, an associated region module, a palmprint excision module, and a palmprint recognition module, each of which functions specifically as follows:
the image acquisition module is used for acquiring a palmprint image to be identified, wherein the palmprint image is a rectangular image of a palmprint area in a palm;
the association area module is used for selecting a target vertex from four vertices of the palm print image to be identified according to a preset vertex selection rule, and determining a triangle image area as an image area associated with the root area of the thumb according to the target vertex;
the palm print cutting module is used for cutting off an image area associated with the thumb root area in the palm print image to be identified to obtain a target palm print image;
and the palm print recognition module is used for carrying out palm print recognition on the target palm print image by using a preset palm print recognition algorithm to obtain a palm print recognition result.
The terminal device 3 may be a computing device such as a desktop computer, a notebook computer, a palm computer, a cloud server, etc. The terminal device may include, but is not limited to, a processor 30, a memory 31. It will be appreciated by those skilled in the art that fig. 3 is merely an example of the terminal device 3 and does not constitute a limitation of the terminal device 3, and may include more or less components than illustrated, or may combine certain components, or different components, e.g., the terminal device may further include an input-output device, a network access device, a bus, etc.
The processor 30 may be a central processing unit (Central Processing Unit, CPU), other general purpose processors, digital signal processors (Digital Signal Processor, DSP), application specific integrated circuits (Application Specific Integrated Circuit, ASIC), field-programmable gate arrays (Field-Programmable Gate Array, FPGA) or other programmable logic devices, discrete gate or transistor logic devices, discrete hardware components, or the like. A general purpose processor may be a microprocessor or the processor may be any conventional processor or the like.
The memory 31 may be an internal storage unit of the terminal device 3, such as a hard disk or a memory of the terminal device 3. The memory 31 may also be an external storage device of the terminal device 3, such as a plug-in hard disk, a Smart Media Card (SMC), a Secure Digital (SD) Card or a Flash Card provided on the terminal device 3. Further, the memory 31 may include both an internal storage unit and an external storage device of the terminal device 3. The memory 31 is used for storing the computer program as well as other programs and data required by the terminal device. The memory 31 may also be used for temporarily storing data that has been output or is to be output.
It will be apparent to those skilled in the art that, for convenience and brevity of description, only the above-described division of the functional units and modules is illustrated, and in practical application, the above-described functional distribution may be performed by different functional units and modules according to needs, i.e. the internal structure of the apparatus is divided into different functional units or modules to perform all or part of the above-described functions. The functional units and modules in the embodiment may be integrated in one processing unit, or each unit may exist alone physically, or two or more units may be integrated in one unit, where the integrated units may be implemented in a form of hardware or a form of a software functional unit. In addition, the specific names of the functional units and modules are only for distinguishing from each other, and are not used for limiting the protection scope of the present application. The specific working process of the units and modules in the above system may refer to the corresponding process in the foregoing method embodiment, which is not described herein again.
In the foregoing embodiments, each embodiment is described with its own emphasis. For parts that are not described or detailed in a particular embodiment, reference may be made to the related descriptions of other embodiments.
Those of ordinary skill in the art will appreciate that the various illustrative elements and algorithm steps described in connection with the embodiments disclosed herein may be implemented as electronic hardware, or combinations of computer software and electronic hardware. Whether such functionality is implemented as hardware or software depends upon the particular application and design constraints imposed on the solution. Skilled artisans may implement the described functionality in varying ways for each particular application, but such implementation decisions should not be interpreted as causing a departure from the scope of the present application.
In the embodiments provided in the present application, it should be understood that the disclosed apparatus/terminal device and method may be implemented in other manners. For example, the apparatus/terminal device embodiments described above are merely illustrative, e.g., the division of the modules or units is merely a logical function division, and there may be additional divisions in actual implementation, e.g., multiple units or components may be combined or integrated into another system, or some features may be omitted or not performed. Alternatively, the coupling or direct coupling or communication connection shown or discussed may be an indirect coupling or communication connection via interfaces, devices or units, which may be in electrical, mechanical or other forms.
The units described as separate units may or may not be physically separate, and units shown as units may or may not be physical units, may be located in one place, or may be distributed on a plurality of network units. Some or all of the units may be selected according to actual needs to achieve the purpose of the solution of this embodiment.
In addition, each functional unit in the embodiments of the present application may be integrated in one processing unit, or each unit may exist alone physically, or two or more units may be integrated in one unit. The integrated units may be implemented in hardware or in software functional units.
The integrated modules/units, if implemented in the form of software functional units and sold or used as stand-alone products, may be stored in a computer readable storage medium. Based on such understanding, the present application may implement all or part of the flow of the methods of the above embodiments by instructing the relevant hardware through a computer program. The computer program may be stored in a computer readable storage medium, and when executed by a processor, it implements the steps of each of the method embodiments described above. The computer program comprises computer program code, which may be in source code form, object code form, an executable file, some intermediate form, or the like. The computer readable medium may include: any entity or device capable of carrying the computer program code, a recording medium, a USB flash drive, a removable hard disk, a magnetic disk, an optical disk, a computer memory, a read-only memory (ROM), a random access memory (RAM), an electrical carrier signal, a telecommunications signal, a software distribution medium, and so on. It should be noted that the content contained in the computer readable medium may be appropriately increased or decreased according to the requirements of legislation and patent practice in the relevant jurisdiction; for example, in some jurisdictions, according to legislation and patent practice, computer readable media do not include electrical carrier signals and telecommunications signals.
The above embodiments are only for illustrating the technical solution of the present application, and not for limiting the same; although the application has been described in detail with reference to the foregoing embodiments, it will be understood by those of ordinary skill in the art that: the technical scheme described in the foregoing embodiments can be modified or some technical features thereof can be replaced by equivalents; such modifications and substitutions do not depart from the spirit and scope of the technical solutions of the embodiments of the present application, and are intended to be included in the scope of the present application.

Claims (9)

1. A method of palmprint recognition, comprising:
acquiring a palmprint image to be identified, wherein the palmprint image is a rectangular image of a palmprint area in a palm;
selecting a target vertex from four vertices of the palm print image to be identified according to a preset vertex selection rule, and determining a triangle image area as an image area associated with the root area of the thumb according to the target vertex;
cutting off an image area associated with the thumb root area in the palm print image to be identified to obtain a target palm print image;
performing palm print recognition on the target palm print image by using a preset palm print recognition algorithm to obtain a palm print recognition result;
wherein the selecting a target vertex from four vertices of the palm print image to be identified according to a preset vertex selection rule, and determining a triangle image area as an image area associated with the thumb root area according to the target vertex specifically comprises:
selecting a target vertex from four vertices of the palm print image to be identified according to a preset vertex selection rule;
and taking two sides connected with the target vertex as a first side and a second side of the triangle, determining a triangle image area according to the target vertex, a first preset length corresponding to the first side and a second preset length corresponding to the second side, and taking the triangle image area as the image area associated with the thumb root area.
2. The palm print identification method of claim 1, further comprising, prior to said acquiring the palm print image to be identified:
receiving a palmprint recognition instruction, and collecting a first palm image according to the palmprint recognition instruction;
rotating the first palm image to enable the palm in the first palm image to be aligned along a preset direction, and obtaining a second palm image;
and cutting the second palm image by a preset palm print image cutting algorithm to obtain a palm print image to be identified.
3. The method of claim 2, wherein rotating the first palm image such that the palm in the first palm image is aligned along a preset direction, and obtaining the second palm image specifically comprises:
performing key point identification on the first palm image to obtain palm key points of the first palm image;
and rotating the first palm image by taking a connecting line of the index finger root point and the little finger root point in the palm key points as a first line segment, so that the first line segment is in a horizontal direction and the fingers are oriented upwards, to obtain a second palm image.
4. The method for recognizing palmprint as recited in claim 3, wherein the step of cutting the second palm image with a preset palmprint image cutting algorithm to obtain the palmprint image to be identified includes:
determining a palm center point according to the perpendicular bisector of the first line segment, taking the palm center point as the midpoint of a rectangle and the first line segment as a first side of the rectangle, and cutting the rectangular area from the second palm image to obtain the palmprint image to be identified.
5. The method for identifying palmprint as recited in any one of claims 1 to 4, wherein the preset vertex selection rules specifically include:
acquiring left and right hand attributes and finger orientations corresponding to the palm print image to be identified;
when the left-right hand attribute of the palm print image to be identified is left-hand attribute and the finger is upward, selecting the left lower corner vertex of the palm print image to be identified as a target vertex;
when the left-right hand attribute of the palm print image to be identified is a right-hand attribute and the finger is upwards oriented, selecting a right lower corner vertex of the palm print image to be identified as a target vertex;
when the left-right hand attribute of the palm print image to be identified is left-hand attribute and the finger faces downwards, selecting the top right corner vertex of the palm print image to be identified as a target vertex;
and when the left-right hand attribute of the palm print image to be identified is the right hand attribute and the finger faces downwards, selecting the top left corner vertex of the palm print image to be identified as the target vertex.
6. A palm print recognition device for implementing the palm print recognition method according to any one of claims 1 to 5, the palm print recognition device comprising:
the image acquisition module is used for acquiring a palmprint image to be identified, wherein the palmprint image is a rectangular image of a palmprint area in a palm;
the association area module is used for selecting a target vertex from four vertices of the palm print image to be identified according to a preset vertex selection rule, and determining a triangle image area as an image area associated with the root area of the thumb according to the target vertex;
the palm print cutting module is used for cutting off an image area associated with the thumb root area in the palm print image to be identified to obtain a target palm print image;
and the palm print recognition module is used for carrying out palm print recognition on the target palm print image by using a preset palm print recognition algorithm to obtain a palm print recognition result.
7. The palm print identification device of claim 6, wherein the device further comprises:
the image acquisition module is used for receiving a palm print identification instruction and acquiring a first palm image according to the palm print identification instruction;
the image alignment module is used for rotating the first palm image so that the palms in the first palm image are aligned along a preset direction to obtain a second palm image;
and the image cutting module is used for cutting the second palm image by a preset palm print image cutting algorithm to obtain a palm print image to be identified.
8. A terminal device comprising a memory, a processor and a computer program stored in the memory and executable on the processor, characterized in that the processor implements the steps of the method according to any of claims 1 to 5 when the computer program is executed.
9. A computer readable storage medium storing a computer program, characterized in that the computer program when executed by a processor implements the steps of the method according to any one of claims 1 to 5.
CN201910302774.2A 2019-04-15 2019-04-15 Palm print recognition method and device and terminal equipment Active CN110147730B (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
CN201910302774.2A CN110147730B (en) 2019-04-15 2019-04-15 Palm print recognition method and device and terminal equipment

Applications Claiming Priority (1)

Application Number Priority Date Filing Date Title
CN201910302774.2A CN110147730B (en) 2019-04-15 2019-04-15 Palm print recognition method and device and terminal equipment

Publications (2)

Publication Number Publication Date
CN110147730A CN110147730A (en) 2019-08-20
CN110147730B true CN110147730B (en) 2023-10-31

Family

ID=67588265

Family Applications (1)

Application Number Title Priority Date Filing Date
CN201910302774.2A Active CN110147730B (en) 2019-04-15 2019-04-15 Palm print recognition method and device and terminal equipment

Country Status (1)

Country Link
CN (1) CN110147730B (en)


Patent Citations (7)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN103198304A (en) * 2013-04-19 2013-07-10 吉林大学 Palm print extraction and identification method
CN103268483A (en) * 2013-05-31 2013-08-28 沈阳工业大学 Method for recognizing palmprint acquired in non-contact mode in open environment
CN104866804A (en) * 2014-02-20 2015-08-26 阿里巴巴集团控股有限公司 Palm print information identification method and palm print information identification device
CN105701513A (en) * 2016-01-14 2016-06-22 深圳市未来媒体技术研究院 Method of rapidly extracting area of interest of palm print
CN107016323A (en) * 2016-01-28 2017-08-04 厦门中控生物识别信息技术有限公司 A kind of localization method and device of palm area-of-interest
CN105938549A (en) * 2016-06-08 2016-09-14 大连民族大学 Palm print ROI segmentation method in palm print identification
CN109190460A (en) * 2018-07-23 2019-01-11 南京航空航天大学 Based on cumulative matches and etc. error rates hand shape arm vein fusion identification method

Non-Patent Citations (1)

* Cited by examiner, † Cited by third party
Title
Palmprint region segmentation algorithm based on geometric features; Li Ziliang; Tian Qichuan; Zhu Yanchun; Li Linsheng; Application Research of Computers; Vol. 27 (No. 06); pp. 2370-2376 *

Also Published As

Publication number Publication date
CN110147730A (en) 2019-08-20

Similar Documents

Publication Publication Date Title
EP3537375B1 (en) Image segmentation methods, image segmentation system and device comprising same, and storage medium
CN111178250B (en) Object identification positioning method and device and terminal equipment
CN108230383B (en) Hand three-dimensional data determination method and device and electronic equipment
CN111815754B (en) Three-dimensional information determining method, three-dimensional information determining device and terminal equipment
WO2018028546A1 (en) Key point positioning method, terminal, and computer storage medium
CN109829368B (en) Palm feature recognition method and device, computer equipment and storage medium
EP3113114A1 (en) Image processing method and device
CN107958230B (en) Facial expression recognition method and device
CN110008824B (en) Palmprint recognition method, palmprint recognition device, palmprint recognition computer device and palmprint recognition storage medium
US10922535B2 (en) Method and device for identifying wrist, method for identifying gesture, electronic equipment and computer-readable storage medium
CN111597910A (en) Face recognition method, face recognition device, terminal equipment and medium
CN116129350B (en) Intelligent monitoring method, device, equipment and medium for safety operation of data center
CN113095292A (en) Gesture recognition method and device, electronic equipment and readable storage medium
CN116071790A (en) Palm vein image quality evaluation method, device, equipment and storage medium
CN113780201B (en) Hand image processing method and device, equipment and medium
CN110032941B (en) Face image detection method, face image detection device and terminal equipment
CN112597978B (en) Fingerprint matching method and device, electronic equipment and storage medium
CN113705344A (en) Palm print recognition method and device based on full palm, terminal equipment and storage medium
CN113934312B (en) Touch object identification method based on infrared touch screen and terminal equipment
CN110147730B (en) Palm print recognition method and device and terminal equipment
CN111783677A (en) Face recognition method, face recognition device, server and computer readable medium
CN108629219B (en) Method and device for identifying one-dimensional code
CN105224957A (en) A kind of method and system of the image recognition based on single sample
CN115565103A (en) Dynamic target detection method and device, computer equipment and storage medium
US20170185831A1 (en) Method and device for distinguishing finger and wrist

Legal Events

Date Code Title Description
PB01 Publication
SE01 Entry into force of request for substantive examination
GR01 Patent grant