CN110008871B - Palm print identification method and system - Google Patents


Info

Publication number
CN110008871B
CN110008871B (application CN201910229458.7A)
Authority
CN
China
Prior art keywords
palm
pattern
positioning
palm print
texture
Prior art date
Legal status
Expired - Fee Related
Application number
CN201910229458.7A
Other languages
Chinese (zh)
Other versions
CN110008871A (en)
Inventor
曲晓峰
郭振华
Current Assignee
Shenzhen Graduate School Tsinghua University
Original Assignee
Shenzhen Graduate School Tsinghua University
Priority date
Filing date
Publication date
Application filed by Shenzhen Graduate School Tsinghua University
Priority to CN201910229458.7A
Publication of CN110008871A
Application granted
Publication of CN110008871B
Status: Expired - Fee Related
Anticipated expiration


Classifications

    • G PHYSICS
    • G06 COMPUTING; CALCULATING OR COUNTING
    • G06V IMAGE OR VIDEO RECOGNITION OR UNDERSTANDING
    • G06V40/00 Recognition of biometric, human-related or animal-related patterns in image or video data
    • G06V40/10 Human or animal bodies, e.g. vehicle occupants or pedestrians; Body parts, e.g. hands
    • G06V40/12 Fingerprints or palmprints
    • G06V40/13 Sensors therefor
    • G06V40/1347 Preprocessing; Feature extraction
    • G06V40/1365 Matching; Classification

Landscapes

  • Engineering & Computer Science (AREA)
  • Human Computer Interaction (AREA)
  • Physics & Mathematics (AREA)
  • General Physics & Mathematics (AREA)
  • Multimedia (AREA)
  • Theoretical Computer Science (AREA)
  • Computer Vision & Pattern Recognition (AREA)
  • Measurement Of The Respiration, Hearing Ability, Form, And Blood Characteristics Of Living Organisms (AREA)
  • Collating Specific Patterns (AREA)

Abstract

The invention provides a palm print recognition method and system. The method comprises the following steps: S1, generating a positioning pattern with a screening property suitable for drawing on the palm of a human hand, and drawing the pattern on the palm; S2, shooting a palm image bearing the positioning pattern; S3, locating the pattern and extracting regions of interest from one or more regions selected in the pattern design; S4, extracting features from the located region(s) of interest; and S5, performing texture recognition with the extracted features and, when the positioning pattern carries other machine-readable information, recognizing the positioning pattern as well. Biometric recognition after transferring a pattern such as a two-dimensional code onto the skin is recognition of the combination of the transferred pattern and the skin texture: the transferred pattern makes the identification features revocable and reusable, while the skin texture keeps identification of the person reliable.

Description

Palm print identification method and system
Technical Field
The invention belongs to the field of biometric identification, and particularly relates to a palm print identification method and a palm print identification system.
Background Art
Biometric recognition technology extracts features of the human body surface to identify a person. Compared with identification based on possessions or memorized information, such as keys and passwords, biometric features cannot be lost or forgotten, making them safer and more convenient. Biometric recognition has developed rapidly amid the recent boom in information technology and artificial intelligence.
Common biometric identification methods include fingerprint, face, iris, and palm print recognition. These techniques share the property that the underlying biometric characteristics do not change over a lifetime, or at least remain stable for a long period. The advantage is that identification stays reliable for a long time. The disadvantage is that once these characteristics are leaked, stolen, or accidentally disclosed, the loss is irreparable: because the features remain valid indefinitely and cannot be revoked, the owner's identity security and privacy can no longer be protected, at least not by technical means. In practice, protection relies mainly on reporting and punishing theft and misuse. Enforcement is accordingly costly and security hard to guarantee, especially for individual consumers, who have no simple, direct way to confirm whether a manufacturer or device has irrevocably destroyed their biometric information.
Existing template protection methods for biometric recognition generally encrypt the template data at the data level, using symmetric or asymmetric keys. However, such template encryption depends on the manufacturer's own encryption means and procedures; from the user's perspective, it is impossible to revoke the template or to reliably confirm that it has indeed been destroyed or revoked.
On the other hand, counterfeiting techniques designed to defeat biometric identification have also emerged. In terms of physical characteristics, common targeted counterfeits include contact lenses for iris recognition (also called cosmetic or patterned lenses), gel masks and various cosmetic or plastic-surgery techniques for faces, and gel replication of fingerprints and palm prints. All of these are forgeries designed to defeat biometric identification, and in some applications they can, to some extent, pass systems that lack corresponding liveness or anti-spoofing detection. Therefore, once an individual's biometric information is leaked, there is a risk of impersonation; and because that information is irrevocable, the risk, once created, cannot be eliminated.
Therefore, there is a need to provide an anti-counterfeit technology for identification and detection.
Furthermore, although palm print recognition can be highly accurate, it is difficult to apply in practice, mainly because the prints of the same person's same palm differ greatly at different positions and can change substantially as the palm performs various actions. Whole-palm recognition in the prior art is therefore unstable.
Disclosure of Invention
The invention aims to provide an improved palm print identification technique whose main improvement is that it provides stable and reliable biometric identification within a limited period of time and can be revoked.
To this end, the invention provides a palm print identification method comprising the following steps: S1, generating a positioning pattern with a screening property suitable for drawing on the palm of a human hand, and drawing the positioning pattern on the palm; S2, shooting a palm image bearing the positioning pattern; S3, locating the pattern and extracting a Region of Interest (ROI) from one or more regions selected in the pattern design; S4, extracting features from the located region(s) of interest; and S5, performing texture recognition with the extracted features, and recognizing the positioning pattern.
In some embodiments of the invention, the following features are also included:
the positioning pattern comprises positioning information and pattern information which can be identified by a computer;
in step S1, the pattern is drawn on the palm in one of the following ways: printing tattoo patches in batches according to the pattern template, sticking a patch onto the palm when needed, and peeling off the backing paper to leave the tattoo pattern; or inputting the pattern into a tattoo printer and printing it onto the palm with the tattoo printing apparatus.
The pattern in step S1 takes one of the following forms: a simple rectangular frame used for basic positioning; a rectangular frame with added directional guide information, where the guide information is specifically a rectangular, square, circular, or circle-in-square area with a notched corner, a rounded corner, a corner marked with a dot, or a corner marked with a five-pointed star; three standard shapes (square, triangle, and circle) used as positioning marks; or a two-dimensional code.
The imaging scheme for capturing the palm image in step S2 is one of the following: placing the palm and taking the picture at the position and within the range required by the device, with the imaging distance and camera field of view chosen when the device is designed and tuned; an incompletely fixed position, where the palm can be placed freely in the horizontal plane, either constraining the hand to a specific posture and locating it later by template matching, leaving the posture unconstrained and locating the palm later by its skin color and texture, or locating the palm position directly with a deep-learning-based object detection method; leaving neither the horizontal nor the depth direction constrained and searching for the palm in three-dimensional space afterwards; or scanning the palm print image with a line-scan image sensor.
In step S3, the scheme for locating according to the pattern is: extracting Haar or SIFT features from the image and then searching with a pre-trained template; or directly using a deep-learning-based object detection method.
In step S3, the human hand is detected in one of the following ways: extracting foreground and background from the difference between successive frames and then extracting the hand area in the foreground according to skin color; or using a deep-learning-based object detection method.
In step S4, features are extracted from the areas within the pattern where the skin is not colored, including: the area inside a rectangular frame pattern, and the uncolored rectangular block areas within a two-dimensional code pattern.
In step S4, for each independent rectangular region, the specific feature extraction method is to filter with n Gabor filters, where n may be chosen between 4 and 8; the filtering result of each filter is extracted, the filter responses at the same position in the image are compared, and the maximum is selected as the code for that position.
The comparison and identification scheme in step S5 is: for each uncolored skin area, compare the feature codes of the corresponding regions and compute the similarity ratio between the sampled feature code and the template feature code; then aggregate the similarity ratios of all regions, and if the overall similarity exceeds a threshold, consider the sampled palm print to match the template palm print.
The invention also provides a palm print recognition system that recognizes palm prints using the above palm print recognition method.
With this scheme, the invention achieves the following beneficial effects: it provides an identity authentication method that the user can physically revoke at will, and a short-term biometric identification method. By superimposing additional information on the biometric feature and using the pattern as an aid, biometric positioning is fast and recognition is stable and high-performing.
Biometric recognition after transferring a pattern such as a two-dimensional code is recognition of the combination of the transferred pattern and the skin texture. The transferred pattern makes the identification features revocable and reusable, while the skin texture keeps identification of the person stable and reliable.
By adding a positioning pattern with a screening property, a suitable readable palm print area can be screened out according to the shape and position of the pattern. This overcomes the instability and difficulty of whole-palm recognition, so the accuracy of palm print identification can be raised to the order of iris recognition, far exceeding the precision of fingerprint recognition.
Drawings
FIG. 1 is a schematic diagram of a basic flow of an embodiment of the present invention.
Fig. 2A, 2B, and 2C are schematic diagrams of the texture of the Gabor filter at different angles.
Fig. 3 is a schematic diagram of an actual flow chart in the application of the embodiment of the present invention.
Fig. 4 is a schematic diagram of a two-dimensional palm print code in an embodiment of the invention.
Detailed Description
The basic flow of the scheme of the embodiment is shown in fig. 1, and comprises the following steps:
S1, generating a positioning pattern with a screening property suitable for drawing on the palm of a human hand, and drawing the pattern on the palm. The screening property here means that the pattern has black and transparent parts: the black parts cover the skin, while the transparent parts reveal the skin texture. That is, the skin can be selectively covered and exposed.
S2, shooting a palm image with a positioning pattern;
s3, positioning according to the positioning pattern, extracting a Region of Interest (ROI) from one or more selected regions in the pattern design;
s4, extracting features from the positioned one or more regions of interest;
S5, performing recognition with the extracted features, including palm print texture recognition and positioning pattern recognition.
The positioning pattern comprises positioning information and computer-recognizable pattern information. The positioning information, such as the shape and position of the positioning pattern itself, is used to determine the palm print area; the computer-recognizable pattern information includes, beyond the pattern's own shape and position, additional machine-readable content such as a two-dimensional code.
The step "S1, drawing the pattern on the palm" can be implemented in two ways:
1. Tattoo patches are made in batches according to the pattern template. When used, the patch is stuck on the palm and the backing paper is peeled off, leaving the tattoo pattern;
2. the pattern is input to a tattoo printer and printed onto the palm using tattoo printing.
Comparing the two methods: tattoo patches, made in batches, are cheap, simple to use, and quick to apply, but must be prepared in advance and can carry only uniform preset information; they cannot be modified on the spot or extended with more personal user information. A tattoo printer is expensive but easy to modify; patterns can be designed and printed at any time, making it convenient and fast to use.
Drawing with a tattoo patch or a printed tattoo pattern has the benefit that the print is stable in the short term and usable for five to seven days; when needed, it can be washed off with alcohol at any time without leaving any trace; and the whole process is safe and harmless. This is the physical basis on which the user can revoke the identification scheme at any time.
The pattern in "S1, drawing the pattern on the palm" can be realized in several ways according to the design requirements:
1. the simplest form is a plain rectangular frame, carrying only basic positioning information;
2. directional guide information can be added, such as a rectangular, square, circular, or circle-in-square frame with a notched corner, a rounded corner, a corner marked with a dot, or a corner marked with a five-pointed star;
3. three standard shapes (square, triangle, circle) can be used as positioning marks;
4. a two-dimensional code can be drawn directly.
Different patterns can carry additional information of different complexity. A simple rectangular frame provides only basic positioning information. A directional guide also indicates the upright orientation, which markedly reduces the running time of feature extraction (extraction within the ROI range) and of the matching algorithm (which otherwise scales and shifts repeatedly to absorb displacement). Positioning with the three standard shapes additionally gives scale information beyond range and orientation, providing a basis for normalizing image scale. Drawing a two-dimensional code directly can, on one hand, exploit the mature and reliable two-dimensional code ecosystem (prior public technology) to supply range, orientation, scale, and other positioning information; on the other hand, it allows customized information to be added. For example, as the identification badge of an international conference, conference information can be embedded in the code for easy search and lookup; the code's error-correction mechanism tolerates small-scale tampering, so the code can be tastefully decorated, for instance with the conference LOGO; and the code lets users or third parties without full biometric authentication rights read part of the information, for example letting attendees enjoy discounts at hotels that have an agreement with the conference organizing committee.
The beneficial effects of "S1, drawing the pattern on the palm" are as follows:
1. A pattern drawn on the palm assists positioning, in two scenarios. a. When the palm is photographed, the drawn pattern can serve as positioning information to determine the range of the image to be captured, preventing incomplete shots, and as a reference when imaging parameters such as field of view, focal distance, and zoom magnification need adjustment. b. When locating the region of interest (ROI), the pattern can be used directly as the ROI. Traditional biometric recognition, lacking such pattern assistance, mostly uses natural features (palm edges, homogeneous texture regions, skin-color regions) as the basis for ROI positioning, but these features require complex extraction methods and look very different under different viewing angles, so they are not stable or reliable enough. With the pattern as the positional basis for ROI extraction, the extraction method can be extremely simple and reliable because the pattern is designed and defined in advance, and the position is very stable because the pattern is drawn on the surface of the palm skin.
2. The pattern drawn on the palm has colored (black or tinted) areas and uncolored (bare-skin) areas. The colored areas are used primarily for positioning or for carrying information; the uncolored areas are used for feature extraction. The combination of the two gives the application its flexibility.
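The division into colored and uncolored areas can be sketched as follows. This is an illustrative Python sketch (the patent prescribes no language); `split_regions` and the binary ink-mask convention are hypothetical, assuming a mask aligned with the image where 1 marks ink.

```python
def split_regions(image, mask):
    """Split palm pixels into the colored part of the pattern (ink,
    used for positioning / carrying information) and the uncolored
    part (bare skin, used for feature extraction).

    image: 2-D list of grayscale values
    mask:  2-D list of the same shape; mask[y][x] == 1 marks ink
    Returns two lists of (y, x, value) tuples.
    """
    colored, uncolored = [], []
    for y, row in enumerate(mask):
        for x, ink in enumerate(row):
            (colored if ink else uncolored).append((y, x, image[y][x]))
    return colored, uncolored
```

Only the uncolored pixel list would then be passed on to the S4 feature extraction stage.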
The imaging scheme of "S2, shooting the palm image" may be:
1. Most basically, the imaging distance and camera field of view are fixed, and the palm is placed at a designed position (fixed distance, with focal length, illumination, depth of field, and aperture adjusted accordingly) to take the picture. To further improve imaging quality, the imaging environment can be enclosed: a housing of opaque material shields the device, the imaging equipment is fixed, and the palm position is indicated, or positioning posts are installed directly so that the palm must be braced against them.
2. The position is not completely fixed and the palm can be placed freely in the horizontal plane. There are two schemes: constrain the hand to a specific posture and search for it later directly by template matching; or leave the posture unconstrained and locate the palm later by its skin color and texture. Alternatively, a deep-learning-based object detection method can locate the palm position directly.
3. Neither the horizontal nor the depth direction is constrained; the palm must then be retrieved in three-dimensional space.
4. A line-scan image sensor may also be used to scan the palm print image. Since it images one line at a time, the line-scan sensor must be combined with a roller and a photoelectric encoder to scan a complete palm print image.
The scheme of "S3, positioning according to the pattern" is as follows:
1. For locating within the image, two implementation schemes exist: a. detect the pattern directly; b. detect the hand first, then the pattern.
2. The specific pattern detection method depends on the pattern. One approach extracts Haar or SIFT (scale-invariant feature transform) features from the image and then searches with a pre-trained template. A deep-learning-based object detection method (for example, Mask R-CNN) can also be used directly.
3. There are two schemes for detecting the human hand: a. extract foreground and background from the difference between successive frames, then extract the hand area in the foreground according to skin color; b. use a deep-learning-based object detection method.
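The template-search side of scheme a. (detecting the drawn pattern directly) can be sketched with a minimal pure-Python normalized cross-correlation, shown below. All names are hypothetical illustrations; a practical system would use trained Haar/SIFT templates or a learned detector as described above.

```python
def ncc(patch, template):
    """Normalized cross-correlation between two equal-size 2-D lists."""
    flat_p = [v for row in patch for v in row]
    flat_t = [v for row in template for v in row]
    mp = sum(flat_p) / len(flat_p)
    mt = sum(flat_t) / len(flat_t)
    num = sum((p - mp) * (t - mt) for p, t in zip(flat_p, flat_t))
    dp = sum((p - mp) ** 2 for p in flat_p) ** 0.5
    dt = sum((t - mt) ** 2 for t in flat_t) ** 0.5
    return num / (dp * dt) if dp and dt else 0.0

def find_pattern(image, template):
    """Slide the template over the image and return the top-left
    corner (y, x) of the best-matching window."""
    th, tw = len(template), len(template[0])
    best, best_pos = -2.0, (0, 0)
    for y in range(len(image) - th + 1):
        for x in range(len(image[0]) - tw + 1):
            window = [row[x:x + tw] for row in image[y:y + th]]
            score = ncc(window, template)
            if score > best:
                best, best_pos = score, (y, x)
    return best_pos
```

Because the pattern is designed in advance, the template is known exactly, which is why this localization can be simple and reliable compared with locating natural palm features.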
The scheme of "S4, extracting features" is as follows:
1. Features are extracted from the uncolored skin areas within the pattern: for example, the area inside a rectangular frame pattern, or the uncolored rectangular block regions within a two-dimensional code pattern.
2. For each independent rectangular region, the specific feature extraction method is to filter with n Gabor filters (each oriented in a different direction), where n may be chosen between 4 and 8, and to extract the filtering result of each filter.
A Gabor filter is, simply put, the product of a sinusoid and a Gaussian, as shown in figs. 2A-2C. Different angles extract texture features in different directions; the angles are typically sampled uniformly over [0, π). When n = 4 in this scheme, the angles are [0, π/4, π/2, 3π/4].
3. The filter responses at the same position in the image are compared, and the maximum is selected as the code for that position, representing the dominant texture direction at that position.
Various feature extraction schemes are available, mainly drawn from texture extraction methods. The key point here is that, because the palm print is partly occluded by the pattern, it must be divided into several rectangular image blocks, each of which is feature-extracted and coded separately; the results are finally aggregated for identification.
The scheme of "S5, comparison and identification" is as follows:
1. For each uncolored skin area, compare the feature codes of the corresponding regions and compute the similarity ratio between the sampled feature code and the template feature code.
2. Aggregate the similarity ratios of all regions; if the overall similarity exceeds a threshold (e.g., 80%), the sampled palm print is considered to match the template palm print.
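The S5 comparison can be sketched as follows, assuming each uncolored region has already been coded as a 2-D array of orientation indices. `match_ratio`, `is_match`, and the averaging of per-region ratios are illustrative choices; the 80% threshold is taken from the example above.

```python
def match_ratio(sample_code, template_code):
    """Fraction of positions whose orientation codes agree."""
    total = agree = 0
    for srow, trow in zip(sample_code, template_code):
        for s, t in zip(srow, trow):
            total += 1
            agree += (s == t)
    return agree / total if total else 0.0

def is_match(sample_regions, template_regions, threshold=0.8):
    """Compare codes region by region (one entry per uncolored area)
    and accept when the overall similarity reaches the threshold.
    Returns (decision, overall_similarity)."""
    ratios = [match_ratio(s, t)
              for s, t in zip(sample_regions, template_regions)]
    overall = sum(ratios) / len(ratios)
    return overall >= threshold, overall
```

Since the two-dimensional code differs between occasions, the uncolored regions, and hence the regions being compared, differ each time as well.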
The actual flow when applied is shown in fig. 3.
The technical scheme of the embodiment of the invention has the following beneficial effects:
1. an identity authentication method that the user can physically revoke autonomously;
2. a short-term biometric identification method;
3. additional information superimposed on the biometric feature;
4. with pattern assistance, biometric positioning is fast and recognition is stable and high-performing;
5. because a positioning pattern with a screening property is added, a suitable readable palm print area can be screened out according to the pattern's shape and position, overcoming the instability and difficulty of whole-palm recognition; the accuracy of palm print identification can thus be raised to the order of iris recognition.
The core of the embodiment of the invention is biometric recognition after transfer of the two-dimensional code, that is, recognition of the combination of the transferred pattern and the skin texture. The transferred pattern makes the identification features revocable and reusable, while the skin texture keeps identification of the person stable.
Examples of applications are:
the following description will be given to the application of the technology of the embodiment of the present invention in a practical situation, taking an example of a king going to a mountain house to participate in a three-day conference:
After Mr. Wang registers and signs in, the organizing committee generates a two-dimensional code carrying his identity and conference information, prints it on the palm (inner side) of his left (or right) hand, and registers his biometric information in the system, as shown in fig. 4.
When attending a session, Mr. Wang only needs to sweep his outstretched hand gently over the scanner; the scanner captures the palm image bearing the two-dimensional code, and from that image his identity and conference information are accurately identified. Even if someone in the next seat printed an identical, indistinguishable two-dimensional code on his own hand, the palm texture in the blank regions of the code would differ, and the counterfeit would be detected by a simple comparison.
During identification, when the security scanner of the conference staff scans the left (or right) hand, the region to be identified is located quickly and accurately from the two-dimensional code in the acquired image; Mr. Wang's identity is biometrically confirmed from the texture of the unpainted areas inside the code; and at the same time the code's content confirms the specific session and other information, such as whether he holds a VIP ticket.
The conference's contracted restaurant can simply scan the two-dimensional code to confirm Mr. Wang's attendance and serve lunch, without leaking his biometric information to outside suppliers, even locally.
Outside the conference, Mr. Wang can carry on his daily activities normally; the pattern does not wear off during swimming or bathing.
After the conference, Mr. Wang can wash the pattern off completely, without residue, simply by washing his hands with an alcohol solution. The staff's scanners can no longer identify him. Meanwhile, he can actively confirm that the authentication information related to the conference has been completely revoked.
When he attends another conference, a new two-dimensional code is printed. Even at exactly the same position, different conferences carry different information, so the codes differ, the black and exposed areas differ, and the regions from which features are extracted differ each time. Moreover, a re-applied or re-printed code will, with high probability, sit at a slightly different position. With different codes and different biometric features each time, there is no concern about information leakage or repetition.

Claims (10)

1. A palm print recognition method is characterized by comprising the following steps:
s1, generating a positioning pattern with screening property suitable for drawing on the palm of human hand, so as to draw the positioning pattern on the palm;
s2, shooting a palm image with the positioning pattern;
s3, positioning according to the positioning pattern, and extracting the interested area from one or more selected areas in the pattern design;
s4, extracting characteristics from the positioned one or more regions of interest;
s5, identifying the palm print texture by using the extracted features, and identifying the positioning pattern;
the positioning pattern covers one part of the palm texture and exposes another part, so that the exposed palm texture is screened from the whole palm texture according to the shape and position of the positioning pattern to form a readable palm texture area; the positioning pattern and the readable palm texture area are identified in combination, realizing revocable biometric identification.
2. The palm print recognition method of claim 1, characterized in that: the positioning pattern comprises positioning information and pattern information recognizable by a computer;
in step S1, the method of drawing the pattern on the palm is one of the following:
tattoo stickers are printed in batches on adhesive paper according to the pattern template; in use, the tattoo sticker is pressed onto the palm and the backing paper is then peeled off, leaving the tattoo pattern on the skin; or
the pattern is input to a tattoo printer and printed on the palm using tattoo printing equipment.
3. The palm print recognition method of claim 1, characterized in that: the positioning pattern in step S1 is one of the following:
a simple rectangular frame, usable as a basic positioning mark; or
a rectangular frame with added directional positioning guide information, the positioning guide information specifically being: a rectangular frame, square frame, circular frame, or circle-inside-square region having one corner cut off, rounded, marked with a dot, or marked with a five-pointed star; or
three standard shapes, namely square, triangular and circular positioning marks; or
a two-dimensional code.
4. The palm print recognition method according to claim 1, wherein the imaging scheme for capturing the palm image in step S2 is one of:
placing and photographing the palm at the position and within the range required by the device, the imaging distance and camera field of view being selected when the device is designed and debugged; or
a partially fixed position in which the palm may be placed freely in the horizontal plane, including: restricting the hand to a specific posture and later searching and positioning directly with a template matching method; or not restricting the posture and later positioning by the color and texture of the palm skin; or directly positioning the palm with a deep-learning-based object detection method; or
leaving both the horizontal and depth directions unconstrained and later searching for the palm in three-dimensional space; or
scanning the palm print image with a line-scan image sensor.
5. The palm print recognition method of claim 1, characterized in that: in step S3, the scheme for positioning according to the pattern is: extracting Haar or SIFT features from the image and then searching with a pre-trained template; or directly using a deep-learning-based object detection method.
6. The palm print recognition method of claim 1, characterized in that: in step S3, the method of detecting the human hand is one of the following: extracting the foreground and background from the difference between successive image frames; extracting the hand region in the foreground according to skin color; or using a deep-learning-based object detection method.
7. The palm print recognition method of claim 1, characterized in that: in step S4, features are extracted from the areas where the skin is not colored within the pattern range, including: the area enclosed by a rectangular-frame pattern, the interior of a rectangular frame, and the uncolored areas among the plurality of rectangular blocks of a two-dimensional-code pattern.
8. The palm print recognition method of claim 1, characterized in that: in step S4, for each independent rectangular region, the specific feature extraction method is: filtering with n Gabor filters, where n is selected from 4 to 8; extracting the filtering result of each filter; and comparing the filter responses at the same position in the image and selecting the filter with the maximum response as the code for the current position.
9. The palm print recognition method of claim 1, characterized in that: the comparison and recognition scheme in step S5 is: for each uncolored skin area, comparing the feature codes of the area and calculating the similarity ratio between the sampled feature code and the template feature code; and aggregating the similarity ratios of all areas, and if the overall similarity ratio exceeds a threshold, judging that the currently sampled palm print matches the template palm print.
10. A palm print recognition system, characterized by comprising means for performing the palm print recognition method of any one of claims 1 to 9.
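The screening idea of claims 1 and 3 — the drawn pattern covers part of the palm texture and exposes the rest, and only the exposed pixels form the readable palm texture area — can be sketched as follows. This is a minimal illustration, not the patent's implementation: the binary mask, the `frame_mask` helper, and the border thickness are all assumptions made for the example.

```python
import numpy as np

def readable_regions(palm_img, pattern_mask):
    """Screening per claim 1: the positioning pattern covers part of the palm
    (mask == 1) and exposes the rest (mask == 0). Only exposed pixels are kept
    as the readable palm-texture area; covered pixels are zeroed out."""
    exposed = (pattern_mask == 0)
    readable = np.where(exposed, palm_img, 0)
    return readable, exposed

def frame_mask(h, w, thickness=2):
    """Hypothetical example pattern: the simple rectangular frame of claim 3,
    drawn as a border of the given thickness around the image edge."""
    mask = np.zeros((h, w), dtype=np.uint8)
    mask[:thickness, :] = 1   # top edge of the frame
    mask[-thickness:, :] = 1  # bottom edge
    mask[:, :thickness] = 1   # left edge
    mask[:, -thickness:] = 1  # right edge
    return mask
```

In practice the mask would come from locating the drawn pattern in the photographed image (step S3), e.g. via template matching, rather than being constructed synthetically as here.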
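The feature extraction of claim 8 (n Gabor filters, keeping the maximally responding filter per position, in the style of competitive coding) and the matching of claim 9 (per-region similarity ratios aggregated against a threshold) can be sketched in NumPy. Filter parameters (`sigma`, `lambd`, kernel size) and the 0.6 threshold are illustrative assumptions, not values from the patent.

```python
import numpy as np

def gabor_kernel(ksize, sigma, theta, lambd, gamma=0.5):
    """Real part of a Gabor filter at orientation theta."""
    half = ksize // 2
    y, x = np.mgrid[-half:half + 1, -half:half + 1]
    xr = x * np.cos(theta) + y * np.sin(theta)
    yr = -x * np.sin(theta) + y * np.cos(theta)
    return np.exp(-(xr**2 + (gamma * yr)**2) / (2 * sigma**2)) * np.cos(2 * np.pi * xr / lambd)

def filter2d(img, kern):
    """Same-size 2-D correlation with edge padding (naive loop version)."""
    kh, kw = kern.shape
    ph, pw = kh // 2, kw // 2
    padded = np.pad(img, ((ph, ph), (pw, pw)), mode="edge")
    out = np.empty(img.shape, dtype=float)
    for i in range(img.shape[0]):
        for j in range(img.shape[1]):
            out[i, j] = np.sum(padded[i:i + kh, j:j + kw] * kern)
    return out

def competitive_code(roi, n_filters=6, ksize=9):
    """Claim 8: filter each region with n Gabor filters (n in 4..8) at evenly
    spaced orientations and keep, per pixel, the index of the filter with the
    maximal response as the code for that position."""
    responses = [filter2d(roi, gabor_kernel(ksize, sigma=3.0,
                                            theta=i * np.pi / n_filters,
                                            lambd=8.0))
                 for i in range(n_filters)]
    return np.argmax(np.stack(responses), axis=0)

def region_similarity(code_a, code_b):
    """Fraction of positions at which the two codes agree."""
    return float(np.mean(code_a == code_b))

def match(sample_codes, template_codes, threshold=0.6):
    """Claim 9: average the per-region similarity ratios and accept the palm
    print if the overall ratio exceeds a threshold (0.6 assumed here)."""
    ratios = [region_similarity(a, b) for a, b in zip(sample_codes, template_codes)]
    return sum(ratios) / len(ratios) >= threshold
```

The codes are discrete orientation indices, so the similarity ratio is a simple agreement rate; a production system would typically vectorize the filtering (e.g. FFT-based convolution) rather than use the naive loop shown here.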
CN201910229458.7A 2019-03-25 2019-03-25 Palm print identification method and system Expired - Fee Related CN110008871B (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
CN201910229458.7A CN110008871B (en) 2019-03-25 2019-03-25 Palm print identification method and system

Applications Claiming Priority (1)

Application Number Priority Date Filing Date Title
CN201910229458.7A CN110008871B (en) 2019-03-25 2019-03-25 Palm print identification method and system

Publications (2)

Publication Number Publication Date
CN110008871A CN110008871A (en) 2019-07-12
CN110008871B true CN110008871B (en) 2020-12-11

Family

ID=67168028

Family Applications (1)

Application Number Title Priority Date Filing Date
CN201910229458.7A Expired - Fee Related CN110008871B (en) 2019-03-25 2019-03-25 Palm print identification method and system

Country Status (1)

Country Link
CN (1) CN110008871B (en)

Families Citing this family (1)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN110866464A (en) * 2019-10-29 2020-03-06 上海躲猫猫信息技术有限公司 Commodity traceability anti-counterfeiting method and system based on texture partition

Family Cites Families (11)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US6920236B2 (en) * 2001-03-26 2005-07-19 Mikos, Ltd. Dual band biometric identification system
CN101133959A (en) * 2006-08-30 2008-03-05 上海雷硕医疗器械有限公司 Palmprint image collecting device
CN102254188B (en) * 2011-08-04 2013-03-13 汉王科技股份有限公司 Palmprint recognizing method and device
CN103617441A (en) * 2013-11-27 2014-03-05 上海电机学院 Generating system and pattern recognition method of seal based on two-dimension code
CN104866804B (en) * 2014-02-20 2019-10-11 阿里巴巴集团控股有限公司 A kind of method and apparatus of palm print information identification
KR102183873B1 (en) * 2014-07-04 2020-11-30 매그나칩 반도체 유한회사 Recognizing apparatus a moving direction of gesture and method thereof
CN104123537B (en) * 2014-07-04 2017-06-20 西安理工大学 A kind of quick auth method based on hand and personal recognition
CN104182764A (en) * 2014-08-19 2014-12-03 田文胜 Pattern recognition system
CN104361353B (en) * 2014-11-17 2017-11-10 山东大学 A kind of application of localization method of area-of-interest in instrument monitoring identification
CN106250958A (en) * 2016-03-25 2016-12-21 立德高科(昆山)数码科技有限责任公司 With hiding mark combination tag and generate method with generate system
CN108416338B (en) * 2018-04-28 2021-12-14 深圳信息职业技术学院 Non-contact palm print identity authentication method

Also Published As

Publication number Publication date
CN110008871A (en) 2019-07-12


Legal Events

Date Code Title Description
PB01 Publication
SE01 Entry into force of request for substantive examination
GR01 Patent grant
CF01 Termination of patent right due to non-payment of annual fee

Granted publication date: 20201211