CN110309699B - Automatic extraction method of subcutaneous sweat pore map based on OCT - Google Patents


Info

Publication number
CN110309699B
CN110309699B (grant); application CN201910219874.9A
Authority
CN
China
Prior art keywords
image
point set
papillary layer
oct
stratum corneum
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Active
Application number
CN201910219874.9A
Other languages
Chinese (zh)
Other versions
CN110309699A (en
Inventor
梁荣华
丁宝进
陈朋
王海霞
张怡龙
刘义鹏
蒋莉
潘栋
Current Assignee
Zhejiang University of Technology ZJUT
Original Assignee
Zhejiang University of Technology ZJUT
Priority date
Filing date
Publication date
Application filed by Zhejiang University of Technology ZJUT filed Critical Zhejiang University of Technology ZJUT
Priority to CN201910219874.9A
Publication of CN110309699A
Application granted
Publication of CN110309699B
Legal status: Active
Anticipated expiration

Classifications

    • G — PHYSICS
    • G06 — COMPUTING; CALCULATING OR COUNTING
    • G06V — IMAGE OR VIDEO RECOGNITION OR UNDERSTANDING
    • G06V40/00 — Recognition of biometric, human-related or animal-related patterns in image or video data
    • G06V40/10 — Human or animal bodies, e.g. vehicle occupants or pedestrians; Body parts, e.g. hands
    • G06V40/12 — Fingerprints or palmprints
    • G06V40/1347 — Preprocessing; Feature extraction
    • G06V40/1353 — Extracting features related to minutiae or pores

Landscapes

  • Engineering & Computer Science (AREA)
  • Computer Vision & Pattern Recognition (AREA)
  • Human Computer Interaction (AREA)
  • Physics & Mathematics (AREA)
  • General Physics & Mathematics (AREA)
  • Multimedia (AREA)
  • Theoretical Computer Science (AREA)
  • Image Analysis (AREA)

Abstract

An OCT-based subcutaneous sweat pore map extraction method comprises the following steps: 1) performing a gray-value difference operation on each pixel of each OCT image and selecting the points whose result exceeds a threshold as the initial feature point set; 2) separating the stratum corneum feature point set from the initial feature point set by Hough transform, fitting a quadratic polynomial to it to obtain the stratum corneum contour, and removing the feature points on or near that contour; 3) removing, from far to near, the feature points outside the papillary layer contour to obtain an accurate papillary layer feature point set, and fitting it with cubic interpolation to obtain the papillary layer contour; 4) obtaining sweat gland tangents from the positions of the two contours, stitching all tangents obtained from the OCT images into a subcutaneous sweat pore image of size W × N, and enhancing the image to obtain the final result. The invention obtains correct sweat gland tangents and finally produces a clear subcutaneous sweat pore map.

Description

Automatic extraction method of subcutaneous sweat pore map based on OCT
Technical Field
The invention relates to the field of fingerprint identification, in particular to an automatic extraction method of a subcutaneous sweat pore map based on an OCT system.
Background
Owing to its uniqueness, permanence, and ease of collection, the fingerprint has become the most widely used biometric feature in individual identification and authentication. Sweat pores distributed between fingerprint ridges, the level-3 fingerprint features, are increasingly applied in fingerprint identification. However, when the finger surface suffers irreparable damage from dirt, perspiration, scars, or cuts, the sweat pores are destroyed and identification fails. In addition, the sweat pores in the epidermis have two states, open and closed, which also affects recognition accuracy.
Studies have shown that the sweat pores in the finger epidermis are formed by the subcutaneous sweat glands. Sweat glands grow in the papillary layer (dermal papillae) at the junction of the dermis and epidermis and extend all the way to the skin surface, forming sweat pores. Therefore, compared with superficial sweat pores, the sweat gland cross-section obtained by transversely cutting the subcutaneous tissue of the finger (i.e., the subcutaneous sweat pore map) is hard to damage, complete, and stable. Meanwhile, Optical Coherence Tomography (OCT), a non-invasive imaging technique, can acquire information 1-3 mm below the surface of human skin and yield 3D volume data of the fingerprint, making a high-resolution subcutaneous sweat pore map attainable.
Disclosure of Invention
In order to overcome the defect of poor accuracy of the existing subcutaneous sweat pore map extraction mode, the invention provides an automatic extraction method of the subcutaneous sweat pore map based on OCT.
In order to achieve the purpose, the invention adopts the technical scheme that:
an automatic extraction method of a subcutaneous sweat pore map based on OCT comprises the following steps:
1) setting the size of the OCT fingerprint volume data to W × H × N, i.e., the volume consists of N OCT images with resolution W × H, representing N spatially continuous vertical sections of the fingerprint; a gray-value difference operation is performed on each pixel of each OCT image, and the points whose result exceeds the threshold are selected as the initial feature point set;
2) because the stratum corneum feature points form an almost straight, slightly curved line, the Hough transform is applied to separate the stratum corneum feature point set from the initial feature point set; a quadratic polynomial is fitted to the stratum corneum feature point set to obtain the stratum corneum contour, and the feature points on or near the stratum corneum contour are removed from the point set;
3) removing characteristic points outside the papillary layer contour from far to near to obtain an accurate papillary layer characteristic point set, and carrying out cubic interpolation fitting on the papillary layer characteristic point set to obtain a papillary layer contour;
4) obtaining sweat gland tangents (each consisting of W pixels) according to the positions of the two contours, stitching all the sweat gland tangents obtained from the OCT images into a subcutaneous sweat pore image of size W × N, and obtaining the final result through image enhancement.
Further, in step 1), since the variation of the gray scale values in the stratum corneum and the papillary layer is large, the characteristic points of the OCT image are extracted using the gray scale value difference operation:
d(x, y) = g(x, y + 1) − g(x, y), with points kept when d(x, y) > 5 (1)
where g(x, y) is the gray value at coordinate (x, y), 0 ≤ x < W, 0 ≤ y < H − 1; a larger y means a deeper position, i.e., lower in the image. All points satisfying formula (1) are taken as the initial feature point set P0.
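As an illustration of step 1, a minimal NumPy sketch of the gray-value difference operation of formula (1); the function name and the toy image are hypothetical, and the threshold of 5 follows the text.

```python
import numpy as np

def initial_feature_points(img, thresh=5):
    """Vertical gray-value difference d(x, y) = g(x, y + 1) - g(x, y);
    keep the points whose difference exceeds the threshold."""
    g = img.astype(np.int32)               # avoid uint8 wrap-around
    d = g[1:, :] - g[:-1, :]               # rows are y (depth), columns are x
    ys, xs = np.nonzero(d > thresh)        # P0: all points with d(x, y) > 5
    return list(zip(xs.tolist(), ys.tolist()))

# Toy 5 x 4 "OCT slice": one bright band produces a strong gray jump
img = np.zeros((5, 4), dtype=np.uint8)
img[3, :] = 200                            # jump between y = 2 and y = 3
p0 = initial_feature_points(img)
print(p0)
```

On this toy slice the only positive jump sits between rows y = 2 and y = 3, so P0 contains one point per column at y = 2.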
Still further, the step 2) comprises the following steps:
2.1) because the image gray value changes greatly at the stratum corneum and the papillary layer, the initial feature points are concentrated on these two layers; meanwhile, the stratum corneum feature points form an almost straight, slightly curved line, so the Hough transform is used to extract the stratum corneum. First, the initial feature point image is converted into a binary image and the Hough transform is applied to it; the Hn = 6 largest maxima are searched in the result matrix, yielding 6 line segments distributed on the stratum corneum feature points; the gaps between the segments are connected, and the leftmost and rightmost segments are extended to the left and right image edges respectively, forming a continuous segment L;
2.2) because the line segment L is substantially coincident with the stratum corneum, for any point j, it is assigned to a stratum corneum feature point as long as the following equation is satisfied:
dis(j, L) ≤ v, j ∈ P0 (2)
that is, the distance from point j to the line L is no greater than the threshold v = 1, thus obtaining the stratum corneum feature point set PSC; a quadratic polynomial is then fitted to PSC to obtain the stratum corneum contour LSC.
2.3) removing the feature points on LSC and all feature points whose distance to LSC is less than 10, obtaining the feature point set P'PL used to extract the papillary layer feature points.
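Steps 2.2 and 2.3 can be sketched as follows, assuming the Hough-derived segment L is approximated by a straight line y = a·x + b (the line coefficients and toy points are illustrative; a real implementation would use the connected Hough segments directly):

```python
import numpy as np

def split_and_fit(p0, a, b, v=1.0):
    """Assign points within distance v of the line y = a*x + b to the
    stratum corneum set, then fit a quadratic contour to that set."""
    pts = np.asarray(p0, dtype=float)
    # distance from each point (x, y) to the line a*x - y + b = 0
    dist = np.abs(a * pts[:, 0] - pts[:, 1] + b) / np.hypot(a, 1.0)
    sc = pts[dist <= v]                         # stratum corneum set P_SC
    rest = pts[dist > v]                        # left for papillary extraction
    coeffs = np.polyfit(sc[:, 0], sc[:, 1], 2)  # quadratic contour L_SC
    return sc, rest, coeffs

# Corneum-like points near y = 10, papillary-like points near y = 40
p0 = [(x, 10 + 0.5 * (x % 2)) for x in range(10)] + [(x, 40.0) for x in range(10)]
sc, rest, c = split_and_fit(p0, a=0.0, b=10.0)
print(sc.shape[0], rest.shape[0])
```

With v = 1 the ten shallow points fall within the threshold of the horizontal test line and the ten deep points are left for step 3.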
Further, the step 3) comprises the following steps:
3.1) first remove the feature points farther from the papillary layer: all points i of the point set P'PL satisfying the following formula are removed:
|yi − Q(xi)| > T1, 0 ≤ xi < W (3)
where W is the width of the image and (xi, yi) are the coordinates of point i; Q is the approximate quadratic curve obtained by least-squares fitting of the point set P'PL, and Q(xi) is the ordinate of Q at the abscissa of point i; the distance threshold T1 is taken as 25.
3.2) the feature points still to be removed, those closer to the papillary layer, are approximately horizontal straight sweat gland segments located above the papillary layer. Since the papillary layer contour is a continuous, smoothly varying curve, a sweat gland segment is characterized by two endpoints with large slopes of opposite sign, a small distance between them, and an approximately horizontal distribution. The two endpoints A and B of a sweat gland close to the papillary layer are therefore defined by:
|G(A)| > T2, |G(B)| > T2, G(A)·G(B) < 0, |XA − XB| < t1, |YA − YB| < t2 (4)
where (XA, YA) and (XB, YB) are the coordinates of endpoints A and B, and G(A) and G(B) are the slopes at the two endpoints; T2 = 10; the sweat gland model parameter t1 is taken slightly larger than the sweat gland diameter, here t1 = 20, and t2 takes a very small value. The feature points between endpoints A and B satisfying formula (4) are removed; the retained papillary layer feature point set PPL is fitted by cubic spline interpolation into a smooth, continuous papillary layer contour LPL.
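The endpoint test of formula (4) can be sketched as a predicate; in practice the slopes G(A) and G(B) would come from finite differences along the candidate contour, and all numeric endpoints below are illustrative:

```python
def is_gland_segment(xa, ya, ga, xb, yb, gb, t2_slope=10, t1=20, t2=2):
    """Formula (4): a sweat-gland segment has endpoints A and B with
    large slopes of opposite sign, close together and nearly level.
    Thresholds follow the text (T2 = 10, t1 = 20); t2 is a small
    assumed value, since the source leaves it unspecified."""
    return (abs(ga) > t2_slope and abs(gb) > t2_slope  # both slopes large
            and ga * gb < 0                            # opposite signs
            and abs(xb - xa) < t1                      # endpoints close
            and abs(yb - ya) < t2)                     # nearly horizontal

print(is_gland_segment(100, 50.0, +15, 110, 50.5, -14))  # gland-like pair
print(is_gland_segment(100, 50.0, +2, 160, 52.0, +3))    # smooth contour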
The step 4) comprises the following steps:
4.1) obtaining a sweat gland tangent L by the following formulaSG
(LSG)x = 0.3·(LSC)x + 0.7·(LPL)x (5)
that is, LSG lies between the two contour lines but closer to the papillary layer contour. The pixels at the LSG positions are taken out to obtain a line of length W, and the lines are stitched, in image order, into a complete subcutaneous sweat pore image;
4.2) obtaining a final subcutaneous sweat pore map using image enhancement on the obtained subcutaneous sweat pore image.
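Step 4.1 can be sketched as follows; the toy volume and flat contours are hypothetical, and the image enhancement of step 4.2 is omitted:

```python
import numpy as np

def sweat_pore_map(volume, sc_contours, pl_contours):
    """Per image, place the sweat-gland tangent at the weighted depth
    (L_SG)_x = 0.3*(L_SC)_x + 0.7*(L_PL)_x of formula (5), sample the
    W pixels along it, and stack the N lines into a W x N pore image."""
    n, h, w = volume.shape                        # N images of H x W
    lines = []
    for k in range(n):
        y = np.rint(0.3 * sc_contours[k] + 0.7 * pl_contours[k]).astype(int)
        lines.append(volume[k, y, np.arange(w)])  # one W-pixel tangent line
    return np.stack(lines, axis=1)                # W x N subcutaneous map

# Toy volume: 3 slices of 8 rows x 5 columns; contours flat for simplicity
vol = np.arange(3 * 8 * 5).reshape(3, 8, 5)
sc = [np.full(5, 2.0)] * 3                        # stratum corneum at y = 2
pl = [np.full(5, 6.0)] * 3                        # papillary layer at y = 6
m = sweat_pore_map(vol, sc, pl)
print(m.shape)
```

With flat contours at y = 2 and y = 6, the tangent lands at y = 0.3·2 + 0.7·6 ≈ 5, and the three sampled lines stack into a 5 × 3 map.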
Compared with the prior art, the invention has the beneficial effects that: the stratum corneum and papillary layer feature points can be stably classified and extracted for different papillary layer depths, so that correct sweat gland tangents are obtained and a clear subcutaneous sweat pore map is finally produced.
Drawings
FIG. 1 is a flow chart of the algorithm of the present invention;
FIG. 2 is an OCT image;
FIG. 3 is an initial feature point diagram;
fig. 4 is the result of a hough transform;
fig. 5 is a result of connecting line segments obtained by hough transform;
FIG. 6 is a stratum corneum contour;
FIG. 7 is the remaining papillary layer feature point set P'PL;
FIG. 8 is the papillary layer contour;
the middle line in fig. 9 represents the sweat gland tangent;
fig. 10 is the resulting subcutaneous sweat pore pattern.
Detailed Description
The invention will be further described with reference to the following figures and embodiments:
referring to fig. 1 to 10, an OCT-based subcutaneous sweat pore map automatic extraction method includes the following steps:
1) setting the size of the OCT fingerprint volume data to W × H × N, i.e., the volume consists of N OCT images with resolution W × H, representing N spatially continuous vertical sections of the fingerprint; a gray-value difference operation is performed on each pixel of each OCT image, and the points whose result exceeds the threshold are selected as the initial feature point set;
because the gray value changes greatly in the stratum corneum and papillary layer, the characteristic points of the OCT image are extracted using gray value difference operation:
d(x, y) = g(x, y + 1) − g(x, y)
d(x, y) > 5
where g(x, y) is the gray value at coordinate (x, y), 0 ≤ x < W, 0 ≤ y < H − 1; a larger y means a deeper position, i.e., lower in the image. All points satisfying the above formula are taken as the initial feature point set P0.
2) Because the stratum corneum feature points form an almost straight, slightly curved line, the Hough transform is applied to separate the stratum corneum feature point set from the initial feature point set; a quadratic polynomial is fitted to the stratum corneum feature point set to obtain the stratum corneum contour, and the feature points on or near the stratum corneum contour are removed from the point set; the method comprises the following steps:
2.1) because the image gray value changes greatly at the stratum corneum and the papillary layer, the initial feature points are concentrated on these two layers; meanwhile, the stratum corneum feature points form an almost straight, slightly curved line, so the Hough transform is used to extract the stratum corneum. First, the initial feature point image is converted into a binary image and the Hough transform is applied to it; the Hn = 6 largest maxima are searched in the result matrix, yielding 6 line segments distributed on the stratum corneum feature points; the gaps between the segments are connected, and the leftmost and rightmost segments are extended to the left and right image edges respectively, forming a continuous segment L;
2.2) because the line segment L is substantially coincident with the stratum corneum, for any point j, it is assigned to a stratum corneum feature point as long as the following equation is satisfied:
dis(j, L) ≤ v, j ∈ P0 (2)
that is, the distance from point j to the line L is no greater than the threshold v = 1, thus obtaining the stratum corneum feature point set PSC; a quadratic polynomial is then fitted to PSC to obtain the stratum corneum contour LSC.
2.3) removing the feature points on LSC and all feature points whose distance to LSC is less than 10, obtaining the feature point set P'PL used to extract the papillary layer feature points.
3) Removing characteristic points outside the papillary layer contour from far to near to obtain an accurate papillary layer characteristic point set, and carrying out cubic interpolation fitting on the papillary layer characteristic point set to obtain a papillary layer contour; the method comprises the following steps:
3.1) first remove the feature points farther from the papillary layer: all points i of the point set P'PL satisfying the following formula are removed:
|yi − Q(xi)| > T1, 0 ≤ xi < W (3)
where W is the width of the image and (xi, yi) are the coordinates of point i; Q is the approximate quadratic curve obtained by least-squares fitting of the point set P'PL, and Q(xi) is the ordinate of Q at the abscissa of point i; the distance threshold T1 is taken as 25.
3.2) the feature points still to be removed, those closer to the papillary layer, are approximately horizontal straight sweat gland segments located above the papillary layer. Since the papillary layer contour is a continuous, smoothly varying curve, a sweat gland segment is characterized by two endpoints with large slopes of opposite sign, a small distance between them, and an approximately horizontal distribution. The two endpoints A and B of a sweat gland close to the papillary layer are thus defined by:
|G(A)| > T2, |G(B)| > T2, G(A)·G(B) < 0, |XA − XB| < t1, |YA − YB| < t2 (4)
where (XA, YA) and (XB, YB) are the coordinates of endpoints A and B, and G(A) and G(B) are the slopes at the two endpoints; T2 = 10; the sweat gland model parameter t1 is taken slightly larger than the sweat gland diameter, here t1 = 20, and t2 takes a very small value. The feature points between endpoints A and B satisfying formula (4) are removed; the retained papillary layer feature point set PPL is fitted by cubic spline interpolation into a smooth, continuous papillary layer contour LPL.
4) Obtaining the sweat gland tangent (consisting of W pixels) according to the positions of the two contours, stitching all the sweat gland tangents obtained from the OCT images into a subcutaneous sweat pore image of size W × N, and obtaining the final result by image enhancement; the method comprises the following steps:
4.1) obtaining a sweat gland tangent L by the following formulaSG
(LSG)x = 0.3·(LSC)x + 0.7·(LPL)x
that is, LSG lies between the two contour lines but closer to the papillary layer contour. The pixels at the LSG positions are taken out to obtain a line of length W, and the lines are stitched, in image order, into a complete subcutaneous sweat pore image;
4.2) obtaining a final subcutaneous sweat pore map using image enhancement on the obtained subcutaneous sweat pore image.

Claims (5)

1. An automatic extraction method of a subcutaneous sweat pore map based on OCT (optical coherence tomography), which is characterized by comprising the following steps:
1) setting the size of the OCT fingerprint volume data to W × H × N, i.e., the volume consists of N OCT images with resolution W × H, representing N spatially continuous vertical sections of the fingerprint; a gray-value difference operation is performed on each pixel of each OCT image, and the points whose result exceeds the threshold are selected as the initial feature point set;
2) separating the stratum corneum feature point set from the initial feature point set by applying Hough transform, performing quadratic polynomial fitting on the stratum corneum feature point set to obtain a stratum corneum contour, and removing feature points positioned near and above the periphery of the stratum corneum contour;
3) removing characteristic points outside the papillary layer contour from far to near to obtain an accurate papillary layer characteristic point set, and carrying out cubic interpolation fitting on the papillary layer characteristic point set to obtain a papillary layer contour;
4) obtaining sweat gland tangents, each consisting of W pixels, according to the positions of the two contours, stitching all the sweat gland tangents obtained from the OCT images into a subcutaneous sweat pore image of size W × N, and obtaining the final result through image enhancement.
2. The automatic extraction method of the subcutaneous sweat pore map based on the OCT as claimed in claim 1, wherein: in the step 1), because the gray values of the stratum corneum and the papillary layer have large changes, the characteristic points of the OCT image are extracted by using gray value difference operation:
d(x, y) = g(x, y + 1) − g(x, y), with points kept when d(x, y) > 5 (1)
wherein g(x, y) is the gray value at coordinate (x, y), 0 ≤ x < W, 0 ≤ y < H − 1; a larger y means a deeper position, i.e., lower in the image; all points satisfying formula (1) are taken as the initial feature point set P0.
3. The automatic extraction method of the subcutaneous sweat pore map based on the OCT as claimed in claim 1 or 2, wherein: the step 2) comprises the following steps:
2.1) because the image gray value changes greatly at the stratum corneum and the papillary layer, the initial feature points are concentrated on these two layers; the Hough transform is used to extract the stratum corneum: the initial feature point image is converted into a binary image, the Hough transform is applied to it, the Hn = 6 largest maxima are searched in the result matrix to obtain 6 line segments distributed on the stratum corneum feature points, the gaps between the segments are connected, and the leftmost and rightmost segments are extended to the left and right image edges respectively, forming a continuous segment L;
2.2) for any point j, it is assigned to the stratum corneum feature point as long as the following equation is satisfied:
dis(j,L)≤v(j∈P0) (2)
that is, the distance from point j to the line L is no greater than the threshold v = 1, thus obtaining the stratum corneum feature point set PSC and the remaining feature point set P'PL used to extract the papillary layer feature points; a quadratic polynomial is fitted to PSC to obtain the stratum corneum contour LSC;
2.3) removing the feature points on LSC and all feature points whose distance to LSC is less than 10.
4. The automatic extraction method of the subcutaneous sweat pore map based on the OCT as claimed in claim 1 or 2, wherein: the step 3) comprises the following steps:
3.1) first removing the feature points farther from the papillary layer: all points i of the point set P'PL satisfying the following formula are removed:
|yi − Q(xi)| > T1, 0 ≤ xi < W (3)
where W is the width of the image and (xi, yi) are the coordinates of point i; Q is the approximate quadratic curve obtained by least-squares fitting of the point set P'PL, and Q(xi) is the ordinate of Q at the abscissa of point i; the distance threshold T1 is taken as 25.
3.2) the feature points still to be removed, those closer to the papillary layer, are approximately horizontal straight sweat gland segments located above the papillary layer; the two endpoints A and B of a sweat gland close to the papillary layer are defined by:
|G(A)| > T2, |G(B)| > T2, G(A)·G(B) < 0, |XA − XB| < t1, |YA − YB| < t2 (4)
wherein (XA, YA) and (XB, YB) are the coordinates of endpoints A and B, G(A) and G(B) are the slopes at the two endpoints, T2 = 10, the sweat gland model parameter t1 is taken larger than the sweat gland diameter, here t1 = 20, and t2 takes a very small value; the feature points between endpoints A and B satisfying formula (4) are removed, and the retained papillary layer feature point set PPL is fitted by cubic spline interpolation into a smooth, continuous papillary layer contour LPL.
5. The automatic extraction method of the subcutaneous sweat pore map based on the OCT as claimed in claim 1 or 2, wherein: the step 4) comprises the following steps:
4.1) obtaining a sweat gland tangent L by the following formulaSG
(LSG)x = 0.3·(LSC)x + 0.7·(LPL)x (6)
that is, LSG lies between the two contour lines but closer to the papillary layer contour; the pixels at the LSG positions are taken out to obtain a line of length W, and the lines are stitched, in image order, into a complete subcutaneous sweat pore image;
4.2) obtaining a final subcutaneous sweat pore map using image enhancement on the obtained subcutaneous sweat pore image.
CN201910219874.9A 2019-03-22 2019-03-22 Automatic extraction method of subcutaneous sweat pore map based on OCT Active CN110309699B (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
CN201910219874.9A CN110309699B (en) 2019-03-22 2019-03-22 Automatic extraction method of subcutaneous sweat pore map based on OCT

Applications Claiming Priority (1)

Application Number Priority Date Filing Date Title
CN201910219874.9A CN110309699B (en) 2019-03-22 2019-03-22 Automatic extraction method of subcutaneous sweat pore map based on OCT

Publications (2)

Publication Number Publication Date
CN110309699A CN110309699A (en) 2019-10-08
CN110309699B true CN110309699B (en) 2021-06-18

Family

ID=68074343

Family Applications (1)

Application Number Title Priority Date Filing Date
CN201910219874.9A Active CN110309699B (en) 2019-03-22 2019-03-22 Automatic extraction method of subcutaneous sweat pore map based on OCT

Country Status (1)

Country Link
CN (1) CN110309699B (en)

Families Citing this family (1)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN112528728B (en) * 2020-10-16 2024-03-29 深圳银星智能集团股份有限公司 Image processing method and device for visual navigation and mobile robot

Citations (7)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN102846309A (en) * 2011-05-23 2013-01-02 索尼公司 Image processing device, image processing method, program, and recording medium
CN107563364A (en) * 2017-10-23 2018-01-09 清华大学深圳研究生院 The discriminating conduct of the fingerprint true and false and fingerprint identification method based on sweat gland
CN108446633A (en) * 2018-03-20 2018-08-24 深圳大学 A kind of method, system and device of novel finger print automatic anti-fake and In vivo detection
CN108932507A (en) * 2018-08-06 2018-12-04 深圳大学 A kind of automatic anti-fake method and system based on OCT fingerprint image
CN109247911A (en) * 2018-07-25 2019-01-22 浙江工业大学 A kind of multi-modal characteristic synchronization acquisition system of finger
CN109310337A (en) * 2016-06-20 2019-02-05 公立大学法人大阪市立大学 Skin diagnosis device, skin condition output method, program and recording medium
CN109377549A (en) * 2018-09-29 2019-02-22 浙江工业大学 A kind of real-time processing of OCT finger tip data and three-dimensional visualization method

Family Cites Families (1)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US7116806B2 (en) * 2003-10-23 2006-10-03 Lumeniq, Inc. Systems and methods relating to AFIS recognition, extraction, and 3-D analysis strategies

Patent Citations (7)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN102846309A (en) * 2011-05-23 2013-01-02 索尼公司 Image processing device, image processing method, program, and recording medium
CN109310337A (en) * 2016-06-20 2019-02-05 公立大学法人大阪市立大学 Skin diagnosis device, skin condition output method, program and recording medium
CN107563364A (en) * 2017-10-23 2018-01-09 清华大学深圳研究生院 The discriminating conduct of the fingerprint true and false and fingerprint identification method based on sweat gland
CN108446633A (en) * 2018-03-20 2018-08-24 深圳大学 A kind of method, system and device of novel finger print automatic anti-fake and In vivo detection
CN109247911A (en) * 2018-07-25 2019-01-22 浙江工业大学 A kind of multi-modal characteristic synchronization acquisition system of finger
CN108932507A (en) * 2018-08-06 2018-12-04 深圳大学 A kind of automatic anti-fake method and system based on OCT fingerprint image
CN109377549A (en) * 2018-09-29 2019-02-22 浙江工业大学 A kind of real-time processing of OCT finger tip data and three-dimensional visualization method

Also Published As

Publication number Publication date
CN110309699A (en) 2019-10-08

Similar Documents

Publication Publication Date Title
CN107862282B (en) Finger vein identification and security authentication method, terminal and system
US10762366B2 (en) Finger vein identification method and device
Wu et al. Fingerprint image enhancement method using directional median filter
CN108875629B (en) Palm vein identification method based on multi-sample feature fusion
CN103646398A (en) Demoscopy focus automatic segmentation method
CN109376708B (en) Method for extracting ROI
CN107749049B (en) Vein distribution display method and device
CN109934118A (en) A kind of hand back vein personal identification method
Khutlang et al. Novelty detection-based internal fingerprint segmentation in optical coherence tomography images
CN112330561B (en) Medical image segmentation method based on interactive foreground extraction and information entropy watershed
CN110309699B (en) Automatic extraction method of subcutaneous sweat pore map based on OCT
CN111105427B (en) Lung image segmentation method and system based on connected region analysis
CN107516083B (en) Recognition-oriented remote face image enhancement method
CN110136139B (en) Dental nerve segmentation method in facial CT image based on shape feature
CN113256673A (en) Intelligent wrinkle removing system based on infrared laser
Lim et al. Enhancing fingerprint recognition using minutiae-based and image-based matching techniques
Rew et al. Hybrid Segmentation Scheme for Skin Features Extraction Using Dermoscopy Images.
CN112070684A (en) Method for repairing nail carving words based on morphological prior characteristics
Costa et al. Towards biometric identification using 3D epidermal and dermal fingerprints
Khutlang et al. High resolution feature extraction from optical coherence tomography acquired internal fingerprint
CN107122710B (en) Finger vein feature extraction method based on scattering convolution network
CN112330704A (en) Plantar contour expression method
Kumar et al. Finger vein based human identification and recognition using gabor filter
DR et al. Fingerprint verification based on fusion of minutiae and ridges using strength factors
CN113011361B (en) OCT fingerprint-based internal maximum intensity projection imaging method

Legal Events

Date Code Title Description
PB01 Publication
SE01 Entry into force of request for substantive examination
GR01 Patent grant