WO2008054396A1 - Method and Apparatus for Extraction and Matching of Biometric Detail - Google Patents


Publication number
WO2008054396A1
WO2008054396A1 · PCT/US2006/043091
Authority
WO
WIPO (PCT)
Application number
PCT/US2006/043091
Other languages
English (en)
Inventor
Peter M. Meenen
Original Assignee
Snowflake Technologies Corporation
Application filed by Snowflake Technologies Corporation filed Critical Snowflake Technologies Corporation
Priority to KR1020097011232A priority Critical patent/KR20090087895A/ko
Priority to MX2009004749A priority patent/MX2009004749A/es
Priority to CNA2006800568598A priority patent/CN101595493A/zh
Priority to PCT/US2006/043091 priority patent/WO2008054396A1/fr
Priority to JP2009536206A priority patent/JP2010509672A/ja
Priority to CA002671561A priority patent/CA2671561A1/fr
Priority to EP06836936A priority patent/EP2092460A1/fr
Priority to AU2006350242A priority patent/AU2006350242A1/en
Publication of WO2008054396A1 publication Critical patent/WO2008054396A1/fr


Classifications

    • G PHYSICS
    • G06 COMPUTING; CALCULATING OR COUNTING
    • G06V IMAGE OR VIDEO RECOGNITION OR UNDERSTANDING
    • G06V40/00Recognition of biometric, human-related or animal-related patterns in image or video data
    • G06V40/10Human or animal bodies, e.g. vehicle occupants or pedestrians; Body parts, e.g. hands
    • G06V40/14Vascular patterns

Definitions

  • the present invention relates, in general, to identification of individuals using biometric information and, in particular, to identification and authentication of individuals using subcutaneous vein images. From a technical point of view, the invention addresses a method of identifying a person by extracting and matching biometric detail from a subcutaneous vein infrared image of a person.
  • Biometrics, which refers to identification or authentication based on physical or behavioral characteristics, is being increasingly adopted to provide positive identification with a high degree of confidence. It is often desired to identify and/or authenticate the identity of individuals using biometric information, whether by 1:1 (one-to-one) authentication or 1:n (one-to-many) matching/identification.
  • "Identity" and "identifying", as used herein, refer both to authentication (verification that a person is who he or she purports to be) and to identification (determining which of a set of possible individuals a person is).
  • Prior art solutions are known that use biometric information from iris images, images of palm print creases, and fingerprint images.
  • U.S. Patent 5,291,560 discloses performing biometric identification using analysis of oriented textures in iris images with a Hamming Distance metric, and it is known to use fixed-length keys when performing biometric identification based upon iris images.
  • Zhang et al U.S. Patent Application Publication No. 2005/0281438 (published December 22, 2005), discloses biometric identification using analysis of images of palm print creases with a neurophysiology-based Gabor Filter and an angular distance metric.
  • U.S. Patent 6,556,858 (issued April 29, 2003), fully incorporated herein by reference, discloses using infrared light to view subcutaneous veins, with subsequent re-projection of the vein image onto the surface of the skin, but does not disclose identification or authentication of individuals using the vein images.
  • Cross, J.M.; and Smith, C. L., "Thermographic Imaging of the Subcutaneous Vascular Network of the Back of the Hand for Biometric Identification", Proc. IEEE 1995 Int'l Carnahan Conference on Security Technology, pp. 20-35 (Oct. 18-20, 1995) discloses making an infrared image of subcutaneous veins on the back of the hand and then segmenting the vein pattern to obtain a medial axis representation of the vein pattern.
  • Contrast enhancement, filtering to remove hair and artifacts, and separation of the hand from a background is disclosed.
  • the medial axis representations are compared against stored signatures in a database.
  • Other prior art approaches, such as U.S. Patent 6,301,375 (issued October 9, 2001), fully incorporated herein by reference, utilize information such as points where veins intersect or cross, or, as disclosed in Clayden, U.S. Patent 5,787,185 (issued July 28, 1998), fully incorporated herein by reference, utilize directionally-weighted vector representations of the veins, or other so-called "point-based" techniques well-known in the prior art.
  • A point-based vein biometric system can be defined as a system that performs biometric identification based on a selected series of critical points from a vein structure, for example, where the veins branch or where veins have maximal points of curvature. The typical approach to finding these points involves first segmenting the vein structure from the rest of the image.
  • The segmented vein structure is then typically reduced to a binary image and subsequently thinned to a series of single-pixel lines. From this thinned version of the vein structure, vein intersection points can be easily identified. Other features, such as line curvature and line orientation, are also easily determined. The positions of these critical points, along with other measures describing them (for example, orientation angle or curvature value), are arranged into a vector and stored. Because these systems often miss some points or detect new points when processing different images of the same vein structure, the vectors that are constructed are of variable length, which makes quick database searches difficult. When performing point-based matching, the input point set is first compared to a reference point set during an alignment phase. This typically occurs through the use of an affine transform or similar method.
  • the present invention uses a series of filters to extract useful information from an image containing subcutaneous vein patterns.
  • a region of interest (“ROI") of an image containing subcutaneous vein structures, obtained from a vein imaging device, is processed using a plurality of filters that are selective in both orientation and spatial frequency.
  • statistical measures are taken from a plurality of regions within each of the resulting filtered images. These statistical measures are then arranged in a specific order and used as a uniquely-identifying code that can be quickly and easily matched against other codes that were previously acquired. Due to the uniform key size and constant ordering of the values, a metric as simple as a Euclidean Distance or preferably a Pearson Correlation Distance may be used to determine key similarity.
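As a sketch of this comparison step, two fixed-length keys of ordered statistical measures can be compared with either metric. The key values below are illustrative only, not taken from the patent:

```python
import numpy as np

def pearson_distance(key_a, key_b):
    """Pearson Correlation Distance between two fixed-length keys.

    Returns 1 - r, so identical keys score 0.0 and
    unrelated keys score near (or above) 1.0.
    """
    a = np.asarray(key_a, dtype=float)
    b = np.asarray(key_b, dtype=float)
    r = np.corrcoef(a, b)[0, 1]
    return 1.0 - r

def euclidean_distance(key_a, key_b):
    """Euclidean distance, normalized by the key length."""
    a = np.asarray(key_a, dtype=float)
    b = np.asarray(key_b, dtype=float)
    return np.linalg.norm(a - b) / len(a)

key1 = [10.0, 200.0, 35.0, 90.0]
key2 = [12.0, 190.0, 40.0, 85.0]   # similar key -> small distance
key3 = [200.0, 10.0, 90.0, 35.0]   # reordered values -> large distance

print(pearson_distance(key1, key2))  # close to 0
print(pearson_distance(key1, key3))  # well above the 0.5 threshold
```

Because both keys are in the same constant order, the comparison is a single vector operation with no alignment step.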
  • the present invention extracts detail from images of subcutaneous veins. This extracted detail is used to form a fixed-length key of statistical values that are then ordered in a preselected manner.
  • the present invention enables rapid matching and searching of databases of fixed-length biometric keys generated by the present invention.
  • One use for the present invention is for one to one and one to many biometric comparisons of subcutaneous vein patterns.
  • the present invention has numerous advantages.
  • The method of the invention produces a fixed-length biometric key based on biometric detail extracted from subcutaneous vein images. The key, being of fixed length and in a constant order, permits rapid 1:1 (one-to-one) matching/authentication and makes the process of 1:n (one-to-many) matching/identification extremely simple.
  • The present invention also has the advantage of being able to simultaneously capture detail information relating to not only the position of veins, but also information relating to their size and orientation. This is due primarily to the fact that the filters applied to the image can be tuned in size, spatial frequency, and orientation. In any biometric system, the more information that can be captured relating to the feature in question, the better chance the system has of performing accurate matching for identification and authentication. In the case of 1:n matching implementations, the present invention provides many benefits. First, since the key is of fixed size and in a constant order, the matching process is simpler, and as a result, matches can be performed more quickly. This allows a brute-force comparison with an entire database of keys to execute more quickly than would be possible under prior art approaches.
  • The present invention also allows for a more refined searching approach through a quick reduction of the size of the database that must be searched by matching on a subset of the key rather than on the full key. For example, to quickly narrow the search field down to a smaller subset of records, a comparison can be performed using a smaller key generated from a subset of the filtered images such as, for example, a few strategically chosen filters. This results in fewer calculations than would have to be performed against all the records in the database. The full key can then be compared against the remaining records. In addition, by indexing the database of keys based upon specific features of the various sub-keys, comparisons against key values known to be substantially different from the key in question can be skipped. It is an object of the present invention to provide an apparatus and method for identifying a person by extracting and matching biometric detail from a subcutaneous vein image of the person. It is a further object of the present invention that the identification be rapid and efficient.
  • FIG. 1 is a schematic diagram of a preferred embodiment of the apparatus of the present invention, showing imaging of veins on a hand.
  • ""' [1010]
  • Fig. 2 is a view of a region of interest ("ROI") of veins in an image.
  • Fig. 3 is a flowchart showing steps in the preferred embodiment of the method of the present invention.
  • Fig. 4 is a flowchart showing steps in the image preprocessing of Fig. 3.
  • Fig. 5 is a flowchart showing steps in the contrast enhancement of Fig. 4.
  • Fig. 6 shows a one-dimensional Mexican Hat Wavelet, and Fig. 7 is a two-dimensional directional (oriented) filter constructed from the wavelet shown in Fig. 6.
  • Fig. 8 shows a representative pre-processed image before filtering.
  • Fig. 9 shows the image of Fig. 8 after filtering with an Even-Symmetric Gabor Filter.
  • Fig. 10 shows the image of Fig. 8 after filtering with a two-dimensional Oriented Mexican Hat Wavelet Filter.
  • Fig. 11 shows how an image is processed into a key using the method of the present invention.
  • Fig. 12 is a flowchart showing steps in the key matching/verification.
  • Fig. 13A shows an image, and Fig. 13B shows the resulting key produced from the image of Fig. 13A using the method of the present invention.
  • Fig. 14A shows another image from the same person as Fig. 13A but taken from a slightly different view
  • Fig. 14B shows the resulting key produced by the image of Fig. 14A, showing how similar images (Figs. 13A and 14A) produce similar keys (Figs. 13B and 14B).
  • Figs. 15A and 15B are the same as Figs. 13A and 13B, and are for comparison purposes with the different image of Fig. 16A that produces the different key of Fig. 16B, showing how dissimilar vein patterns generate dissimilar keys.
  • Fig. 16A shows an image of a vein pattern different from that of Fig. 15A, and Fig. 16B shows the resulting key produced by the image of Fig. 16A, showing how dissimilar images (Figs. 15A and 16A) produce dissimilar keys (Figs. 15B and 16B).
  • Figs. 17A, 18A, 19A, and 20A are different images with respective keys 17B, 18B, 19B, and 20B, for purposes of showing how similar images generate similar keys.
  • The images of Figs. 17A and 18A are somewhat similar, while the images of Figs. 19A and 20A are very different from each other and from the images of Figs. 17A and 18A.
  • Figs. 21, 22, 23, and 24 show the match scores for the keys of Figs. 17B, 18B, 19B, and 20B using various distance metrics. All match scores except those for the Pearson Correlation (Fig. 24) are normalized (divided) by the length of the key.
  • FIG. 25 shows comparison of two key subsets ("sub-keys") using the keys shown in Figs. 17B and 18B.
  • Fig. 26 shows comparison of one key subset against key subsets in a database of stored keys.
  • Fig. 27 shows rapid evaluation of one key subset against a database of stored keys indexed by subset key value counts to determine eligible keys for subsequent distance comparison.
  • Fig. 28 shows the combination of the techniques of Figs. 26 and 27 for rapid evaluation of one key subset against a database of stored keys indexed by subset key value counts, which determines subsets eligible for subsequent key subset distance comparison, which determines keys eligible for full key distance comparison.
  • the method of the preferred embodiment of the invention has the steps shown in Fig. 3, and the apparatus of the present invention also operates according to the flow chart shown in Fig. 3.
  • The steps include an input image 24 for which a region of interest ("ROI") 30, preferably 400 x 400 pixels, has been identified and cropped; subdivision of the ROI into a tessellation pattern (such as preferably a 20 pixel x 20 pixel square grid region, but polar sector regions, triangular regions, rectangular regions, or hexagonal regions, etc., may also be used) so as to form a plurality of regions; a bank of filters 40 (which can be selected from a wide variety of compatible filter types including, as described herein, Symmetric Gabor Filters, Complex Gabor Filters, Log Gabor Filters, Oriented Gaussian Functions, Adapted Wavelets (as that term is defined and used herein), etc.); and a statistical measure formed for each region in the tessellation pattern (preferably, the statistical variance of the pixel intensities within the region).
  • apparatus 20 for identifying a person is seen to include means 22, such as a well-known infrared camera, for capturing a subcutaneous vein infrared image 24 of the person.
  • "Veins", as used herein, refers to blood vessels such as capillaries, veins, and arteries.
  • a portion of the person, such as a hand 26, is illuminated by infrared light and then the image is captured by a camera 22 or other well-known sensing device.
  • If broad-spectrum light, such as room light, is used, a suitable infrared filter could be placed in front of camera 22 to cause only the infrared image to be seen by camera 22.
  • Camera 22 may include, as is well-known to those skilled in the art, charge-coupled device (“CCD”) elements, as are often found in well-known CCD / CMOS cameras, to capture the image and pass it on to a well-known computer 28.
  • the present disclosure uses the example of vein patterns on the back of the hand for purposes of illustration, it should be understood that the present invention is easily adapted to work on other parts of the body where subcutaneous veins may be viewed.
  • preprocessing 35 is preferably performed on the image, and the preferred preprocessing steps of Fig. 3 are shown in greater detail in the flowchart of Fig. 4.
  • a region of interest ("ROI") 30 of the image 24 is identified for which processing will be performed.
  • this is done by segmenting the person's body part, e.g., hand 26, from any background image.
  • certain landmarks or feature locations such as fingertip points 32 or preferably points 34 on the inter-finger webbing, are located.
  • Fingertip points 32 are less desirable for landmarks than are inter-finger webbing points 34 because finger spread will cause greater variability of the location of fingertip points 32 with respect to ROI 30 than of the location of inter-finger points 34.
  • the image is adjusted to a pre-defined location and orientation, preferably with adjustments for scaling, horizontal and vertical translation, and rotation, so that the ROI 30 will present a uniform image area of interest for evaluation.
  • the ROI 30 is identified as follows: First, the raw image 24 is received from the image capture means 22. Preferably a dark background is placed below the imaged hand 26 so that background pixels will be darker than foreground pixels of the hand.
  • a histogram of the image 24 is taken, and the histogram is analyzed to find two peaks, one peak near zero (the background) and one peak in the higher-intensity range (the hand), and an appropriate threshold is determined that will discriminate the two peaks. If more than two peaks are found, the bin count is reduced by 5 until a two-peak histogram is achieved. If two peaks still cannot be found, a default threshold of 10 greater than the minimum pixel intensity (i.e., somewhat above the background intensity) is used. When two peaks are found, the threshold is set to the minimum value between these two peaks, and a binary mask is created with pixels having an intensity above the threshold being set to 1 and those equal or below the threshold set to 0.
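The two-peak thresholding rule just described can be sketched as follows. The patent does not fix the starting bin count, so 64 here is an assumption, and the peak detector is a simple local-maximum test:

```python
import numpy as np

def count_peaks(hist):
    """Indices of strict local maxima in a histogram (ends included)."""
    h = np.concatenate(([-1], hist, [-1]))
    return [i - 1 for i in range(1, len(h) - 1)
            if h[i] > h[i - 1] and h[i] > h[i + 1]]

def segment_hand(image, start_bins=64):
    """Binary hand mask via the two-peak histogram rule described above.

    The bin count is reduced by 5 until exactly two peaks appear; if that
    never happens, a default threshold of (min intensity + 10) is used.
    """
    bins = start_bins
    threshold = image.min() + 10          # fallback threshold
    while bins >= 5:
        hist, edges = np.histogram(image, bins=bins)
        peaks = count_peaks(hist)
        if len(peaks) == 2:
            lo, hi = peaks
            valley = lo + int(np.argmin(hist[lo:hi + 1]))
            threshold = edges[valley]     # minimum between the two peaks
            break
        bins -= 5
    return (image > threshold).astype(np.uint8)

# Dark background (~5) with a brighter "hand" patch (~180).
img = np.full((100, 100), 5.0)
img[20:80, 30:90] = 180.0
mask = segment_hand(img)
print(mask.sum())  # 60 x 60 = 3600 foreground pixels
```

Pixels above the threshold become 1 (foreground hand) and the rest 0, matching the binary mask construction in the text.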
  • the inter-finger points 34 are then located by tracing along the edge of the binary hand outline while noting local maximum and minimum elevations, where a "high" elevation is defined as being closer to the top of the image (using an orientation for the image capture means 22 such that the fingertips 32 are generally oriented toward the "top” of the image). A minimum elevation (closer to the "bottom” of the image) between two maximum elevations thus indicates an inter-finger point 34 on the inter-finger webbing.
  • an affine transform is used to rotate and scale the image 24 to match a set of pre-determined standardized points, with transform coefficients being determined using a least- squares mapping between the points on the imaged hand and the pre-determined standardized points.
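A minimal sketch of that least-squares affine fit follows. The landmark coordinates and the standardized target points are hypothetical; the test transform (a rotation plus translation) is one the fit should recover exactly:

```python
import numpy as np

def fit_affine(src, dst):
    """Least-squares affine transform A (2x3) with [x', y'] = A @ [x, y, 1],
    mapping landmark points `src` onto standardized points `dst`."""
    src = np.asarray(src, float)
    dst = np.asarray(dst, float)
    ones = np.ones((len(src), 1))
    M = np.hstack([src, ones])                 # n x 3 design matrix
    coef, *_ = np.linalg.lstsq(M, dst, rcond=None)
    return coef.T                              # 2 x 3 coefficient matrix

def apply_affine(A, pts):
    pts = np.asarray(pts, float)
    return pts @ A[:, :2].T + A[:, 2]

# Hypothetical inter-finger webbing points and their standardized targets:
# a 30-degree rotation plus a translation, which the fit should recover.
theta = np.radians(30)
R = np.array([[np.cos(theta), -np.sin(theta)],
              [np.sin(theta),  np.cos(theta)]])
src = np.array([[10., 40.], [60., 45.], [110., 35.], [150., 80.]])
dst = src @ R.T + np.array([5., -12.])
A = fit_affine(src, dst)
print(np.allclose(apply_affine(A, src), dst))  # True
```

With more landmark points than the three needed to determine an affine transform, the least-squares solution averages out small landmark-detection errors.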
  • The region of interest (ROI) 30 is then determined to be a 400 x 400 pixel region anchored at the inter-finger point 34 furthest from the thumb, noting that the thumb's inter-finger point has the lowest elevation in the y direction.
  • Apparatus 20 is thus seen to include means 36 for identifying a region-of-interest.
  • the ROI image 30 may also be preprocessed (filtered) to remove artifacts in the image such as hair, as by well-known artifact removal means 38, to create an artifact-removed image 24'.
  • The purpose of artifact removal is not only to provide a more stable image for comparison, but also to prevent attempts by individuals to avoid identification by shaving the hair off of parts of their body.
  • an adequate approach has been found to be by use of a simple and well-known 10 x 10 median filter. While this causes loss of some vein detail, the veins on the back of the hand (where hair removal is most needed) are large enough to easily survive a filter of size 10 x 10.
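A pure-NumPy sketch of that 10 x 10 median filter follows (a library routine such as a SciPy median filter would serve equally well; the synthetic "hair" and "vein" widths below are illustrative):

```python
import numpy as np
from numpy.lib.stride_tricks import sliding_window_view

def median_filter(image, size=10):
    """Simple size x size median filter (edge-padded), used here to
    suppress thin hair while leaving the larger veins intact."""
    pad = size // 2
    padded = np.pad(image, pad, mode='edge')
    windows = sliding_window_view(padded, (size, size))
    # For even sizes the padded view is one row/column larger than needed.
    windows = windows[:image.shape[0], :image.shape[1]]
    return np.median(windows, axis=(2, 3))

img = np.zeros((40, 40))
img[:, 20] = 255.0          # 1-pixel-wide bright "hair" line
img[10:30, 5:17] = 128.0    # 12-pixel-wide "vein" region
out = median_filter(img)
print(out[:, 20].max())     # 0.0 -- the thin hair line is removed
print(out[20, 10])          # 128.0 -- the wide vein survives
```

The thin structure occupies too few pixels of any 10 x 10 window to shift the median, while the wide vein fills whole windows and passes through unchanged, which is exactly the loss/survival trade-off described above.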
  • adaptive contrast enhancement 39 is then preferably performed on the image 24' using steps as shown in detail in Fig. 5.
  • An algorithm is used that first applies a blur filter 41 to the image to create a blurred version of the image, and then this blurred image is subtracted from the original image to create an unsharp version 24a of the image.
  • the absolute value 43 is then taken of this unsharp image 24a, the absolute value processed image is then blurred 46, and the original unsharp image is divided 48 (point by point) by this blurred absolute value image, producing a contrast-enhanced image 24".
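The blur/subtract/normalize sequence of Figs. 4-5 can be sketched as below. The patent does not specify the blur filter type or size, so the box blur and its 15-pixel width are assumptions, and a small epsilon is added to avoid division by zero in flat regions:

```python
import numpy as np
from numpy.lib.stride_tricks import sliding_window_view

def box_blur(image, size=15):
    """Edge-padded mean (box) blur, standing in for the unspecified
    blur filter 41 / 46 of the algorithm."""
    pad = size // 2
    padded = np.pad(image, pad, mode='edge')
    return sliding_window_view(padded, (size, size)).mean(axis=(2, 3))

def enhance_contrast(image, size=15, eps=1e-6):
    """Adaptive contrast enhancement as described above:
    unsharp = image - blur(image); result = unsharp / blur(|unsharp|)."""
    unsharp = image - box_blur(image, size)
    norm = box_blur(np.abs(unsharp), size) + eps   # eps avoids divide-by-zero
    return unsharp / norm

# Faint dark "vein" on a strong illumination gradient.
x = np.linspace(0, 1, 64)
background = np.tile(100 + 80 * x, (64, 1))
img = background.copy()
img[30:34, :] -= 10                            # faint vein rows
out = enhance_contrast(img)
print(out[32, 32] < out[10, 32])  # vein now clearly darker: True
```

Dividing by the local mean of the absolute deviation makes the enhancement adaptive: faint veins under dim illumination are amplified as strongly as bold veins under bright illumination.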
  • Additional image smoothing may also be used to clean up image artifacts produced by the hair removal.
  • the "Pre-Processed Image” shown in Fig. 11 is an example of an initial image, shown at the top of Fig. 11 , to which only contrast enhancement preprocessing has been done without removal of hair artifacts.
  • Image pre-processing, while preferred, is not essential to the present invention.
  • the approach of the present invention is sensitive to changes in the positions of vein detail in the image. In other words, the images that are compared by the present invention must be aligned very closely in order to allow for optimal matching. A good alignment method in the pre-processing stage is essential so that the ROI 30 location will be consistent.
  • the ROI portion 30 of the image is ready for the application of a first plurality of enhancement filters 40.
  • this plurality of filters 40 may be implemented serially, at the expense of greater elapsed filtering time, or in parallel, at the expense of greater concurrent processing requirements.
  • The preferred embodiment uses, for each of the filters 40, an Even Symmetric Gabor filter, which is of the form: g(x, y; f, θ) = exp{−(1/2)[x′²/σx² + y′²/σy²]} · cos(2πf·x′), where x′ = x·cos θ + y·sin θ and y′ = −x·sin θ + y·cos θ, f is the spatial frequency, θ is the orientation angle, and σx and σy are the standard deviations of the Gaussian envelope in the x and y directions.
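A minimal sketch of generating an even-symmetric Gabor filter bank follows; the kernel size, frequency, and sigma values are illustrative assumptions, not the patent's tuned parameters:

```python
import numpy as np

def even_gabor(size, theta, freq, sigma_x, sigma_y):
    """Even-symmetric Gabor kernel: a cosine carrier at angle `theta`
    and spatial frequency `freq`, under an oriented Gaussian envelope."""
    half = size // 2
    y, x = np.mgrid[-half:half + 1, -half:half + 1]
    xr = x * np.cos(theta) + y * np.sin(theta)    # rotated coordinates
    yr = -x * np.sin(theta) + y * np.cos(theta)
    envelope = np.exp(-0.5 * (xr**2 / sigma_x**2 + yr**2 / sigma_y**2))
    return envelope * np.cos(2 * np.pi * freq * xr)

# A bank of 8 orientations; size/frequency/sigma values are illustrative.
bank = [even_gabor(31, np.radians(a), 0.1, 5.0, 5.0)
        for a in np.arange(0, 180, 22.5)]
print(len(bank), bank[0].shape)
```

Each kernel responds strongly to vein-like ridges running along its own orientation and weakly to ridges at other angles, which is what makes the per-region variance an orientation-selective feature.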
  • Filters other than Even Symmetric Gabor filters may be substituted for one or all of the filters 40, such as, for example, Complex Gabor Filters, Log Gabor filters, Oriented Gaussian Filters, and, as hereinafter explained in greater detail, filters constructed from Wavelets.
  • the filters 40 are of the same class of filters but differing in orientation and/or frequency, and it shall be understood that other filters may be substituted that enhance image features for a given orientation angle.
  • the well-known equations for several exemplary filters other than Even Symmetric Gabor filters will now be given.
  • The Complex Gabor Filter has the form: g(x, y) = exp{−(1/2)[x′²/σx² + y′²/σy²]} · exp(i·2πf·x′), where x′ = x·cos θ + y·sin θ and y′ = −x·sin θ + y·cos θ, and where:
  • g(x, y) is the spatial representation of the Gabor filter;
  • θ is the orientation angle of the desired filter;
  • f is the desired spatial frequency; and
  • σx and σy represent the standard deviations of the Gaussian envelope in the x and y directions, respectively.
  • The Log-Gabor filter is typically defined in the frequency domain. If spatial filtering is to be performed, the spatial filter is determined via an inverse FFT.
  • The frequency-domain representation of a 2-D Log-Gabor filter is: LG(u, v) = exp{−[ln(r(u, v)/f₀)]² / (2·[ln(σr/f₀)]²)} · exp{−[θ(u, v) − θ₀]² / (2·σθ²)}, where:
  • LG(u, v) is the frequency-domain representation of the Log-Gabor filter;
  • r(u, v) represents the radius of a given point in the filter from the filter's center;
  • θ(u, v) represents the orientation angle of a given point, and θ₀ is the desired orientation angle of the filter;
  • f₀ is the desired center spatial frequency;
  • σr represents the spatial frequency bandwidth of the filter; and
  • σθ represents the orientation bandwidth of the filter.
  • a Gabor filter is a preferable implementation of the key generation filter for the present invention because it is very effective at providing directionally selective enhancement.
  • Most common two-dimensional wavelets are not extremely useful as directionally-enhancing filters because they typically provide little to no ability to easily select an orientation direction. There are, however, several one-dimensional wavelets that can, as hereinafter described, be adapted in such a way as to make them useful as directional key generation filters for the present invention.
  • Adapted Wavelets shall be understood to refer to one-dimensional wavelets adapted in accordance with the present invention, in a manner that can now be described in detail.
  • a wavelet filter must be capable of being oriented to a specific angle and must be scalable so that it can detect objects of varying size.
  • wavelets are scalable and thus readily adaptable for detecting objects of varying size. Adaptation of a wavelet to be angularly selective for use with the present invention can be done in a manner that will now be described.
  • a one-dimensional wavelet is selected that has desired properties.
  • A Mexican Hat Wavelet has the following equation: ψ(t) = (2 / (√(3σ)·π^(1/4))) · (1 − t²/σ²) · e^(−t²/(2σ²)), where t is time and σ is the standard deviation.
  • a two dimensional directional filter is created for an angle of zero degrees by repeating the one dimensional wavelet on every column of the two dimensional filter matrix as shown in Fig. 7.
  • a rotation operator is applied to the filter to orient it in the desired angle.
  • An example of such a rotation operator utilizes a basic rotation matrix, which is defined as: R(θ) = [[cos θ, −sin θ], [sin θ, cos θ]].
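Assuming the Mexican Hat wavelet and the rotation step described above, the adapted filter can be sketched as follows. Rather than rotating a sampled filter matrix (which requires interpolation), this sketch evaluates the 1-D wavelet on rotated coordinates, which is equivalent for this column-repeated construction:

```python
import numpy as np

def mexican_hat(t, sigma=1.0):
    """1-D Mexican Hat wavelet (second derivative of a Gaussian,
    normalization constant omitted for simplicity)."""
    return (1 - t**2 / sigma**2) * np.exp(-t**2 / (2 * sigma**2))

def oriented_mexican_hat(size, theta, sigma=2.0):
    """2-D directional filter: the 1-D wavelet repeated along every
    column gives the zero-degree filter (Fig. 7); other angles are
    realized by evaluating the wavelet on rotated coordinates."""
    half = size // 2
    y, x = np.mgrid[-half:half + 1, -half:half + 1]
    # Rotated "across-the-ridge" coordinate, via the basic rotation
    # matrix [[cos t, -sin t], [sin t, cos t]] applied to (x, y).
    yr = x * np.sin(theta) + y * np.cos(theta)
    return mexican_hat(yr, sigma)

f0 = oriented_mexican_hat(15, 0.0)
f90 = oriented_mexican_hat(15, np.pi / 2)
# At 0 degrees every column is the same 1-D wavelet;
# at 90 degrees the filter is simply its transpose.
print(np.allclose(f90, f0.T))  # True
```

Scaling σ changes the ridge width the filter responds to, so the same construction detects veins of varying size, as required of a key generation filter.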
  • Hermitian Wavelets are a family of wavelets of which the Mexican Hat is a member. The nth Hermitian wavelet is simply the nth derivative of a Gaussian, and has an equation of the form: ψₙ(t) ∝ (dⁿ/dtⁿ) e^(−t²/2).
  • Discrete one-dimensional wavelets, such as the well-known Haar, Daubechies, Coiflet, and Symmlet wavelets, may also and equivalently be used. These wavelets are typically defined as a series of discrete values for which tables are well known to those skilled in the art.
  • The adapted wavelets heretofore described are intended to be examples of one-dimensional wavelets that can be adapted in accordance with the present invention to perform directional filtering enhancement of images in the manner heretofore described. Other one-dimensional wavelets having similar characteristics could be used in the manner heretofore described without departing from the spirit and scope of the present invention.
  • Each filtered image is divided into a plurality of regions (20 x 20 pixels in the preferred embodiment), with each region having at least one pixel therewithin, and with each pixel having a pixel intensity. It should be understood that this region block size will change depending on the size of the features to be extracted.
  • For each region, a statistical measure, preferably the statistical variance, of the pixel intensity values within the region is calculated. Note that, while the statistical measure used in the preferred embodiment is the statistical variance, many other statistical measures can be used, including standard deviation, mean, absolute average deviation, etc.
  • An enrollment key may also be formed by using several of these measures together, yielding several statistical measures for each region.
  • the important feature of the statistical measure is that areas of the image with high variance represent areas that were enhanced by the filter while areas of low variance were not. Thus, areas of high variance are statistically likely to represent the presence of a vein in an orientation similar to that of the filter.
  • The magnitude of the variance is also an indicator of how closely the angle in which the vein is running matches the angle of the filter. Veins that run at an angle reasonably close to that of the filter will still show some response, but veins running at exactly the same angle will show a much larger response.
  • the statistical measures of the regions are then ordered so as to define an enrollment key vector 42, as by storing them in an array in a pre-set order.
  • the ordering of the statistical measures of the enrollment key match the ordering of the statistical measures in the stored verification keys.
  • the variance values may be scaled so that the largest is equal to 255 and the smallest is equal to zero, thereby allowing each value to occupy only one byte of storage space.
  • These regions are visually represented in the drawings as patches of varying intensity, with black patches having a value of zero and white patches having a value of 255, within the eight key subsets that together comprise the key vector shown in Fig. 11.
  • the keys can be reduced to a binary representation by applying a threshold value. In this binary version of the key, each block representation occupies only one bit, and thus, a large reduction in storage space is achieved.
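The byte-scaling and the one-bit reduction just described can be sketched as follows; the variance values and the binarization threshold of 128 are illustrative:

```python
import numpy as np

def quantize_key(variances):
    """Scale so the largest value maps to 255 and the smallest to 0,
    letting each key entry occupy a single byte."""
    v = np.asarray(variances, dtype=float)
    scaled = (v - v.min()) / (v.max() - v.min()) * 255.0
    return scaled.astype(np.uint8)

def binarize_key(key, threshold=128):
    """Reduce a byte key to one bit per block via a threshold,
    packing 8 blocks into each stored byte."""
    return np.packbits(np.asarray(key) >= threshold)

key = quantize_key([3.0, 250.0, 40.0, 120.0, 7.5, 199.0, 61.0, 90.0])
print(key.min(), key.max())   # 0 255
print(binarize_key(key))      # 8 blocks packed into a single byte
```

The byte form trades a little precision for an 8x storage reduction over floats, and the binary form trades more precision for a further 8x reduction, which matters when millions of keys are stored.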
  • the enrollment key 42 may then be stored to a disk 44 (joining a database of verification or second keys) or matched against an existing verification key to perform a verification or identification function.
  • The process of matching two keys (i.e., enrollment and verification keys) will now be described.
  • A simple Euclidean distance calculation is performed on the keys as a whole.
  • The first (or enrollment) key may be represented as a vector K₁ = [k₁,₁, k₁,₂, …, k₁,N], and the second (or verification) key as K₂ = [k₂,₁, k₂,₂, …, k₂,N], where N is the fixed key length.
  • A flowchart of the matching steps performed by the present invention is shown in Fig. 12.
  • the generated enrollment, or first, key is compared to a stored verification, or second, key using the chosen distance metric 52 as described above, and the determination of whether the two keys match is made using a preselected threshold distance comparison 54. If the distance between the two keys is larger than the preselected threshold distance, they do not match. If the calculated distance is below the threshold, the keys do match.
  • Threshold values are set after running a tuning subset of vein images through the apparatus/method of the present invention and evaluating the resulting scores. A threshold score is then chosen to best reflect the chosen security goals for false positives (acceptance/match) and false negatives (missed match).
  • A preferred implementation using a Pearson Correlation utilizes a threshold score of 0.5. Anything below this distance (score) is a match, and anything above this distance is a non-match.
  • Typical scores for matching keys have been found to range from about 0.15 to 0.3 and typical scores for non-matching keys have been found to range from about 0.6 to 1.0.
  • Each of the distance metrics (scoring methods) described herein produces an output with a slightly different numerical range, and it is necessary that a particular implementation of the present invention determine acceptable match and non-match score thresholds that reflect the desired security goals of the implementation.
  • the matching step can be repeated across the database for a one to many match, or for more sophisticated matching for one to many approaches, some of the methods of database indexing using properties of the key could be employed.
  • An example of keys generated from similar but non-identical vein images can be seen by comparison of Fig. 13A with Fig. 14A and of Fig. 13B with Fig. 14B.
  • Fig. 13A shows an image
  • Fig. 13B shows the resulting key produced by the image of Fig. 13A using the method of the present invention. It shall be understood that all of the keys shown in the drawings are pictorial representations of the key vectors themselves, normalized to range from 0 (black) to 255 (white) for ease of visual comparison between keys.
  • FIG. 14A shows another image from the same person as Fig. 13A but taken from a slightly different view
  • Fig. 14B shows the resulting key produced by the image of Fig. 14A, showing how similar images (Figs. 13A and 14A) produce similar keys (Figs. 13B and 14B).
  • Figs. 15A and 15B are the same as Figs. 13A and 13B, and are for comparison purposes with the different image of Fig. 16A that produces the different key of Fig. 16B, showing how dissimilar vein patterns generate dissimilar keys.
  • The Pearson Squared Correlation Distance has the form: d = 1 − r², where r is the Pearson correlation coefficient between the two keys.
  • The Chebychev Distance (or Maximum Single-Dimensional Distance) has the form: d(K₁, K₂) = maxᵢ |k₁,ᵢ − k₂,ᵢ|.
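Both of these alternative metrics are one-liners over the fixed-length keys; the sample values below are illustrative:

```python
import numpy as np

def pearson_squared_distance(a, b):
    """Pearson Squared Correlation Distance: 1 - r^2, where r is the
    Pearson correlation coefficient between the two keys."""
    r = np.corrcoef(a, b)[0, 1]
    return 1.0 - r * r

def chebychev_distance(a, b):
    """Chebychev (maximum single-dimensional) distance."""
    return np.max(np.abs(np.asarray(a, float) - np.asarray(b, float)))

a = np.array([1.0, 4.0, 2.0, 8.0])
b = np.array([1.5, 4.0, 2.0, 5.0])
print(chebychev_distance(a, b))        # 3.0
print(pearson_squared_distance(a, a))  # ~0.0
```

Note that the squared form treats strong negative correlation the same as strong positive correlation, so its match threshold must be chosen accordingly.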
  • These eight filters were independently applied to each respective image (Figs. 17A, 18A, 19A, and 20A) to yield eight respective separate filter outputs, producing the keys shown in Figs. 17B, 18B, 19B, and 20B.
  • Figs. 17A and 18A are similar views from the same hand, and Figs. 19A and 20A are from two different hands.
  • Figs. 21-24 list the scores generated by matching different combinations of these images with different distance metrics, and the threshold for each comparison is shown below each respective table. All match scores in Figs. 21-23 are normalized (divided) by the length of the key; it is not necessary to normalize the scores for the Pearson Correlation (Fig. 24) because the Pearson Correlation produces normalized scores.
  • Figs. 25-28 show preferred embodiments of how partial key matching may be used in accordance with the present invention in order to reduce the computational burden.
  • the first key 70, used simply for purposes of explanation, is the same as the key shown in Fig. 17B, and the second key 72 is the same as the key shown in Fig. 18B.
  • Partial key matching is performed as follows. First, a key subset portion 74 of the input key is selected. In the example shown in Figs. 25 and 26, the key subset ("sub-key") 74 generated from the seventh filter output (the filter whose angular orientation is 135 degrees) is selected.
  • In this example, sub-key 74 contains 400 values, but, in practice, any subset portion of key 70 of a reasonable size can be used.
  • this subset portion 74 of the key 70 is compared 54 to the corresponding portion 76 of the other keys, e.g., key 72, in the key database by using a distance metric, as heretofore described, and the resulting distance scores are noted.
  • the example of Figs. 25 and 26 uses the preferred distance metric of a Pearson Correlation to compare the key subsets 74, 76, 80.
  • if the resulting distance score falls within a second threshold distance (in this case, 0.5), a match is declared; otherwise the next set of sub-keys is compared.
  • a list of candidates may be formed and either further reduced by additional partial key matching or processed for full key matching to find the best match. If desired, a list of close matches can be provided to a human operator for further investigation.
  • the use of sub-keys to filter down the set of keys on which to perform full matching provides a substantial savings in computation when employed in large database environments. This form of preliminary comparison can also be used with other final comparison systems.
  • the sub-key matching of the present invention can be used to limit the search space and then a point-based matching algorithm could perform the final comparisons, as explained more fully hereinafter, for greater accuracy.
  • a major benefit of the present invention is an approach to indexing large databases using a fixed-length key.
  • the key subset values ("sub-keys") may be pre-indexed to permit keys that are known to have substantially different properties than the current enrollment key to be ignored, thereby avoiding the computational expense of comparison with these ineligible keys.
  • the pre-indexing of the database to permit rapid ignoring of keys that have substantially different properties than the current enrollment key is equally applicable when the full key is used for the key subset ("sub-key"), such that the database indexing is done based on features of the full key rather than on a proper subset of the full key.
  • the examples of indexing are shown using key subsets ("sub-keys") that are proper subsets of the full keys rather than the full keys themselves.
  • the examples shown in Figs. 27 and 28 are for a key subset database indexed by decreasing non-zero counts within the key subset cells. As heretofore described, the statistical measure counted for each element of the key vector in Figs. 27 and 28 may preferably be a statistical variance.
  • the key indexing examples shown in Figs. 27 and 28 show that this indexing may be based upon the statistical measures of individual key subsets or even parts of those key subsets. For example, if the maximum variance for a key in question is large in the sub-key related to the filter taken at 45 degrees, comparisons with keys that have little to no variance in that sub-key can be ignored.
  • other key features can also be used to build indexes to help limit computations such as, for example, the total number of key or sub-key values above or below a given threshold value (the "feature threshold value"), the distance from the zero point, and the areas of the key containing high or low response values, such that it may be determined whether a group of keys or sub-keys have similar features.
  • for example, the upper right quadrant of a key associated with a filter angle of 45 degrees might contain no normalized key value larger than 25 (out of a range of 0 to 255). This would indicate that there is little to no vein presence at that filter orientation in the upper right quadrant of the image.
  • Providing a series of indices constructed from several key features of the keys makes it possible to quickly focus the key matching on the correct subset of keys for full comparison (i.e. those having similar features), thereby drastically reducing search times for large databases of key values.
  • a subset of the full key is examined and features of this key are used to form an index.
  • the sub-key portion associated with the seventh filter output is once again used in the examples of Figs. 27 and 28.
  • key matching can be limited to a subset of the database with keys that have similar features.
  • the number of non-zero key values was used as an indexing measurement and the distance metric was chosen as the Pearson Correlation, which is the preferred distance metric, for comparison with the example of Figs. 25 and 26.
  • the number of non-zero key values is representative of the magnitude of the filter response which, in this example using the seventh filter output, is an indicator of the amount of vein detail that runs at an angle of 135 degrees.
  • the example shows that the input image's sub-key 74 has 56 non-zero values. Because the example database is indexed by the number of non-zero sub-key values, it is only necessary to compare the input sub-key 74 to sub-keys with similar properties.
  • the range of keys to inspect is limited to those with a non-zero key value count that is within 20 of the input sub-key's count of 56 non-zero values (i.e., within the range 36 to 76)
  • only the key subset 76 for Key 2 and the key subset 78 for Key 3 need to be considered, and the key subset 80 for Key 4 can be ignored because it is not within the selected count distance of 20 of key subset 74 for Key 1.
  • although this example uses a threshold of zero (i.e., a tally of non-zero statistical measure key value counts), any value between the maximum and minimum statistical measure key values could be used; for example, the database could be indexed by statistical measure key value counts greater than a threshold of 10 or 20, so as to require significant filter response before a key cell region is considered meaningful.
  • by building indexes on multiple sub-keys within the full key, or by using multiple measures (for example, non-zero value count and mean value, or values above a given threshold and variance), these techniques can be employed to further limit the search space.
  • any applicable statistical measure such as, for example, max value, min value, mean value, median value, variance, standard deviation, etc., may be used on a sub-key either alone or in combination as a metric of similar features to limit the search space in a database.
  • the field can either be further narrowed using sub-key matching or the remaining keys can be fully compared for a final result. Both of these options are illustrated in the examples shown in Figs. 25-28.
  • it should be noted that any number of sub-keys may be extracted and used to narrow the set of candidate keys for final matching.
  • the group of candidates for a full key match could be first narrowed by comparing the portions of the keys that are generated by the 135 degree orientation filter output (as heretofore explained in connection with the example of Fig. 25).
  • the match scores generated from this set of comparisons are then compared to a threshold value, and keys that score outside the threshold are excluded from further consideration.
  • This second-level sub-key comparison and subsequent score threshold will result in a further reduction of the candidate set eligible for full-key comparison.
  • indexes generated from sub-keys can be combined to better limit the candidate subset of the database used for full-key matching.
  • an initial first-level index (index one) may be based upon the non-zero element count in a specific sub-key, denoted as sub-key one.
  • by filtering the database to only look at keys with an index value within a certain range of the index generated for a specific candidate key, the search space may be limited. However, if additional indexes are available within the database (index two, three, etc.), these additional indexes may be used to further limit the candidate search set.
  • these additional indexes may be based upon the same feature of a different sub-key (for example, where index two is the non-zero element count for sub-key two), different features for the same sub-key (for example, where index two is the mean value of sub-key one), or a combination of the two (for example, where index two is the non-zero element count of sub-key two and index three is the mean value of sub-key one, etc.).
  • search methods can also be combined. For example, a number of different index values may first be compared to provide a quick candidate search space limitation. Then several partial key matches may be performed to further limit the candidate space eligible for full-key matching.
  • the remaining candidates are then compared with full-key matching.
  • the order in which the index comparisons and sub-key matches occur may be mixed (as, for example, first an index search, then a sub-key match, then another index search, etc.).
  • because the index searches involve single-value comparisons, they tend to be faster and less computationally involved than the sub-key matches, and thus it is usually advantageous to perform the index search comparisons first, before the sub-key matches (distance calculations) are done.
  • Another benefit of the present invention is greater accuracy and speed than heretofore possible in the prior art, when the present invention is used in conjunction with a prior art point-based vein matching system.
  • the fixed length key generated using the present invention is used to quickly limit the search space (as heretofore described, by key subset matching and/or partial key indexing) while the point-based information is used to match the remaining eligible candidates.
  • the present invention adds additional information to the information available to a point-based approach, leading to more accurate matching. When used as a pure index, the number of filters used can also be reduced to lessen computation time.
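The distance metrics named in the description, and the normalization of match scores by key length discussed in connection with Figs. 21-24, can be sketched in Python as follows. This is an illustrative sketch, not the patent's implementation; the 8-value keys are toy data, and the "Pearson squared" form shown is the common d = 1 - r^2 convention.

```python
import math

def euclidean(x, y):
    # Square root of the sum of squared element-wise differences.
    return math.sqrt(sum((a - b) ** 2 for a, b in zip(x, y)))

def manhattan(x, y):
    # Sum of absolute element-wise differences.
    return sum(abs(a - b) for a, b in zip(x, y))

def chebychev(x, y):
    # Maximum single-dimensional distance: the largest gap in any one cell.
    return max(abs(a - b) for a, b in zip(x, y))

def pearson(x, y):
    # Pearson correlation coefficient r between two key vectors.
    n = len(x)
    mx, my = sum(x) / n, sum(y) / n
    cov = sum((a - mx) * (b - my) for a, b in zip(x, y))
    vx = sum((a - mx) ** 2 for a in x)
    vy = sum((b - my) ** 2 for b in y)
    return cov / math.sqrt(vx * vy)

def pearson_distance(x, y):
    # Distance form of the correlation: identical keys score 0.
    return 1.0 - pearson(x, y)

def pearson_squared_distance(x, y):
    # One common "Pearson squared" form (an assumption here): d = 1 - r**2.
    return 1.0 - pearson(x, y) ** 2

# Two toy 8-value keys standing in for real key vectors.
key1 = [0, 10, 30, 255, 40, 0, 5, 120]
key2 = [0, 12, 28, 250, 38, 0, 6, 118]

# Length-dependent scores (Euclidean, Manhattan, etc.) are normalized
# (divided) by the key length before thresholding; the Pearson scores
# are already normalized and need no division.
print(manhattan(key1, key2) / len(key1))   # 1.75
print(chebychev(key1, key2))               # 5
```

Similar keys produce small distances under every metric here, which is the property the match thresholds of Figs. 21-24 rely on.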
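The partial key matching of Figs. 25 and 26 might be sketched as below. The 8-filter-by-400-cell key layout, the function names, and the synthetic keys are illustrative assumptions; only the use of the seventh (135 degree) filter's sub-key, the Pearson correlation, and a threshold come from the text.

```python
import random

def pearson(x, y):
    # Pearson correlation coefficient between two equal-length vectors.
    n = len(x)
    mx, my = sum(x) / n, sum(y) / n
    cov = sum((a - mx) * (b - my) for a, b in zip(x, y))
    vx = sum((a - mx) ** 2 for a in x)
    vy = sum((b - my) ** 2 for b in y)
    return cov / (vx * vy) ** 0.5

# Hypothetical layout: 8 filter outputs x 400 cells each = 3200-value key.
# The sub-key is the slice produced by the seventh filter (135 degrees).
CELLS_PER_FILTER = 400
SEVENTH = slice(6 * CELLS_PER_FILTER, 7 * CELLS_PER_FILTER)

def subkey_candidates(input_key, database, threshold=0.5):
    # Compare only the sub-key slice against each stored key; keys whose
    # correlation clears the threshold remain eligible for full matching.
    sub = input_key[SEVENTH]
    return [name for name, key in database.items()
            if pearson(sub, key[SEVENTH]) >= threshold]

rng = random.Random(0)
input_key = [rng.randint(0, 255) for _ in range(8 * CELLS_PER_FILTER)]
# A key from the same hand: the input plus small noise, clamped to 0..255.
same_hand = [min(255, max(0, v + rng.randint(-5, 5))) for v in input_key]
# A key from a different hand: statistically unrelated values.
other_hand = [rng.randint(0, 255) for _ in range(8 * CELLS_PER_FILTER)]

db = {"same_hand": same_hand, "other_hand": other_hand}
print(subkey_candidates(input_key, db))   # only the similar key survives
```

Comparing 400 values instead of 3200 is where the computational savings described above comes from; the survivors then go on to full-key comparison.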
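The pre-indexing by non-zero value counts described for Figs. 27 and 28 can be sketched as follows. The toy sub-keys are constructed so the input has exactly 56 non-zero values, matching the example in the text, while the data layout and function names are assumptions for illustration.

```python
def nonzero_count(subkey):
    # The indexing measurement: how many cells show any filter response.
    return sum(1 for v in subkey if v != 0)

def build_index(database):
    # Pre-compute the count for every stored sub-key once, at enrollment.
    return {name: nonzero_count(sub) for name, sub in database.items()}

def eligible(input_subkey, index, count_distance=20):
    # Only sub-keys whose count lies within +/- count_distance of the
    # input's count need to be compared with a distance metric at all.
    c = nonzero_count(input_subkey)
    return [name for name, n in index.items() if abs(n - c) <= count_distance]

# Toy 400-cell sub-keys echoing the example: the input has 56 non-zero
# values, so only stored keys with counts in the range 36..76 are inspected.
input_sub = [1] * 56 + [0] * 344
db = {
    "key2": [1] * 60 + [0] * 340,   # count 60  -> inside the range
    "key3": [1] * 40 + [0] * 360,   # count 40  -> inside the range
    "key4": [1] * 150 + [0] * 250,  # count 150 -> ignored without comparison
}
index = build_index(db)
print(eligible(input_sub, index))
```

Because each lookup compares a single pre-computed integer per stored key, this filter is far cheaper than any distance calculation, which is why the text recommends running index searches before sub-key matches.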

Landscapes

  • Engineering & Computer Science (AREA)
  • Health & Medical Sciences (AREA)
  • General Health & Medical Sciences (AREA)
  • Vascular Medicine (AREA)
  • Human Computer Interaction (AREA)
  • Physics & Mathematics (AREA)
  • General Physics & Mathematics (AREA)
  • Multimedia (AREA)
  • Theoretical Computer Science (AREA)
  • Measurement Of The Respiration, Hearing Ability, Form, And Blood Characteristics Of Living Organisms (AREA)
  • Collating Specific Patterns (AREA)
  • Image Analysis (AREA)

Abstract

The invention concerns a method and apparatus for identification by extracting and matching biometric detail from an infrared image of a subcutaneous vein. The area of interest of the image is identified, and artifacts are removed. A bank of filters, such as symmetric Gabor filters, complex Gabor filters, Log-Gabor filters, oriented Gaussian functions, or wavelets, filters the image into a set of key-value images that are subdivided into regions. An enrollment key, defined by ordered statistical measures of pixel intensities in the regions, is compared, using a distance metric, to a stored verification key. Various statistical measures may be used, such as variance, standard deviation, mean, mean absolute deviation, maximum value, minimum value, maximum absolute value, median value, or a combination of these statistical measures. Various distance metrics may be used, such as Euclidean, Hamming, squared Euclidean, Manhattan, Pearson correlation, squared Pearson correlation, Chebychev, or Spearman rank correlation.
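The pipeline described in the abstract (a filter bank whose outputs are subdivided into regions, with an ordered statistical measure per region forming the key) can be sketched as follows. The toy 8x8 "filter outputs", the 2x2 region grid, and the choice of variance as the statistical measure are illustrative assumptions rather than the patent's fixed parameters.

```python
import statistics

def region_grid(image, rows, cols):
    # Split a 2-D filtered image (a list of pixel rows) into rows x cols
    # rectangular regions, yielding each region's pixels as a flat list.
    h, w = len(image), len(image[0])
    for r in range(rows):
        for c in range(cols):
            yield [image[y][x]
                   for y in range(r * h // rows, (r + 1) * h // rows)
                   for x in range(c * w // cols, (c + 1) * w // cols)]

def build_key(filtered_images, rows=2, cols=2):
    # Enrollment key: the ordered statistical measures (here, population
    # variance) of the pixel intensities in each region of each output.
    key = []
    for img in filtered_images:
        for region in region_grid(img, rows, cols):
            key.append(statistics.pvariance(region))
    return key

# Two fake 8x8 "filter outputs" standing in for a real Gabor filter bank:
flat = [[10] * 8 for _ in range(8)]          # no response: zero variance
striped = [[0, 255] * 4 for _ in range(8)]   # strong response everywhere

key = build_key([flat, striped])
print(len(key))   # 2 filters x 4 regions = 8 key values
```

The resulting fixed-length vector is what gets compared against a stored verification key with one of the distance metrics listed in the abstract.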
PCT/US2006/043091 2006-11-03 2006-11-03 Procédé et appareil pour extraction et correspondance d'un détail biométrique WO2008054396A1 (fr)

Priority Applications (8)

Application Number Priority Date Filing Date Title
KR1020097011232A KR20090087895A (ko) 2006-11-03 2006-11-03 생체인식정보의 추출과 대조를 위한 방법 및 장치
MX2009004749A MX2009004749A (es) 2006-11-03 2006-11-03 Metodo y aparato para extraccion y correspondencia de detalle biometrico.
CNA2006800568598A CN101595493A (zh) 2006-11-03 2006-11-03 用于提取和匹配生物测定细节的方法和设备
PCT/US2006/043091 WO2008054396A1 (fr) 2006-11-03 2006-11-03 Procédé et appareil pour extraction et correspondance d'un détail biométrique
JP2009536206A JP2010509672A (ja) 2006-11-03 2006-11-03 生体認識情報の抽出と対照のための方法及び装置
CA002671561A CA2671561A1 (fr) 2006-11-03 2006-11-03 Procede et appareil pour extraction et correspondance d'un detail biometrique
EP06836936A EP2092460A1 (fr) 2006-11-03 2006-11-03 Procédé et appareil pour extraction et correspondance d'un détail biométrique
AU2006350242A AU2006350242A1 (en) 2006-11-03 2006-11-03 Method and apparatus for extraction and matching of biometric detail

Applications Claiming Priority (1)

Application Number Priority Date Filing Date Title
PCT/US2006/043091 WO2008054396A1 (fr) 2006-11-03 2006-11-03 Procédé et appareil pour extraction et correspondance d'un détail biométrique

Publications (1)

Publication Number Publication Date
WO2008054396A1 true WO2008054396A1 (fr) 2008-05-08

Family

ID=39344577

Family Applications (1)

Application Number Title Priority Date Filing Date
PCT/US2006/043091 WO2008054396A1 (fr) 2006-11-03 2006-11-03 Procédé et appareil pour extraction et correspondance d'un détail biométrique

Country Status (8)

Country Link
EP (1) EP2092460A1 (fr)
JP (1) JP2010509672A (fr)
KR (1) KR20090087895A (fr)
CN (1) CN101595493A (fr)
AU (1) AU2006350242A1 (fr)
CA (1) CA2671561A1 (fr)
MX (1) MX2009004749A (fr)
WO (1) WO2008054396A1 (fr)

Cited By (18)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
WO2010062883A1 (fr) * 2008-11-26 2010-06-03 Bioptigen, Inc. Procédés, systèmes et produits de programme informatique d'identification biométrique par imagerie de tissus par tomographie à cohérence optique (oct)
WO2012078114A1 (fr) * 2010-12-09 2012-06-14 Nanyang Technological University Procédé et appareil servant à déterminer des tracés de veines à partir d'une image en couleur
WO2012112788A2 (fr) * 2011-02-17 2012-08-23 Eyelock Inc. Procédé et système efficaces pour l'acquisition d'images d'un lieu et d'images d'un iris à l'aide d'un seul capteur
WO2012157835A1 (fr) * 2011-05-16 2012-11-22 동국대학교 산학협력단 Procédé de gestion d'une image vasculaire médicale en utilisant une technique de fusion d'images
CN102890772A (zh) * 2011-07-21 2013-01-23 常熟安智生物识别技术有限公司 掌静脉识别技术方案
CN103136522A (zh) * 2011-11-28 2013-06-05 常熟安智生物识别技术有限公司 指静脉识别技术方案
US8553948B2 (en) 2007-09-01 2013-10-08 Eyelock, Inc. System and method for iris data acquisition for biometric identification
US8787623B2 (en) 2008-11-26 2014-07-22 Bioptigen, Inc. Methods, systems and computer program products for diagnosing conditions using unique codes generated from a multidimensional image of a sample
US8803963B2 (en) 2008-09-22 2014-08-12 Kranthi Kiran Pulluru Vein pattern recognition based biometric system and methods thereof
US8958606B2 (en) 2007-09-01 2015-02-17 Eyelock, Inc. Mirror system and method for acquiring biometric data
US9002073B2 (en) 2007-09-01 2015-04-07 Eyelock, Inc. Mobile identity platform
US9036871B2 (en) 2007-09-01 2015-05-19 Eyelock, Inc. Mobility identity platform
US9117119B2 (en) 2007-09-01 2015-08-25 Eyelock, Inc. Mobile identity platform
US9152867B2 (en) 2011-04-27 2015-10-06 Los Angeles Biomedical Research Institute At Harbor-Ucla Medical Center Use of relatively permanent pigmented or vascular skin mark patterns in images for personal identification
US10019619B2 (en) 2014-03-25 2018-07-10 Fujitsu Frontech Limited Biometrics authentication device and biometrics authentication method
US10095912B2 (en) 2016-03-03 2018-10-09 Fujitsu Limited Biological image processing device, biological image processing method, and computer-readable non-transitory medium
CN110944118A (zh) * 2018-09-25 2020-03-31 富士施乐株式会社 存储介质、图像处理装置及图像处理方法
US20200234309A1 (en) * 2017-10-03 2020-07-23 Visa International Service Association System, Method, and Computer Program Product for Authenticating Identification Documents

Families Citing this family (10)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US8631053B2 (en) * 2009-08-31 2014-01-14 Mitsubishi Electric Research Laboratories, Inc. Method for securely determining Manhattan distances
KR101352769B1 (ko) * 2012-05-09 2014-01-22 서강대학교산학협력단 배경과 관심조직을 구별하는 방법 및 장치
KR101413853B1 (ko) * 2012-06-20 2014-07-01 고려대학교 산학협력단 적외선 영상을 이용한 생체 신호 측정 방법 및 장치
DE102014017004A1 (de) 2014-11-12 2016-05-12 Frank Nitschke Verfahren zur biometrischen personenbeziehbaren Überwachung der Händedesinfektionshäufigkeit
CN108021912B (zh) * 2015-10-19 2021-06-29 Oppo广东移动通信有限公司 一种指纹识别的方法和装置
JP6658188B2 (ja) * 2016-03-24 2020-03-04 富士通株式会社 画像処理装置、画像処理方法および画像処理プログラム
EP3413235A1 (fr) * 2017-06-08 2018-12-12 Moqi Inc. Système et procédé de reconnaissance d'empreintes digitales
CN108090336B (zh) * 2017-12-19 2021-06-11 西安易朴通讯技术有限公司 一种应用在电子设备中的解锁方法及电子设备
KR102145041B1 (ko) * 2018-10-30 2020-08-14 박지원 어류 혈관 인식 시스템
CN111199527B (zh) * 2020-01-04 2021-02-02 圣点世纪科技股份有限公司 一种基于多方向自适应阈值的指静脉图像噪声检测方法

Citations (2)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US5291560A (en) * 1991-07-15 1994-03-01 Iri Scan Incorporated Biometric personal identification system based on iris analysis
US6301375B1 (en) * 1997-04-14 2001-10-09 Bk Systems Apparatus and method for identifying individuals through their subcutaneous vein patterns and integrated system using said apparatus and method

Family Cites Families (1)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
JP3586431B2 (ja) * 2001-02-28 2004-11-10 松下電器産業株式会社 個人認証方法および装置

Patent Citations (2)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US5291560A (en) * 1991-07-15 1994-03-01 Iri Scan Incorporated Biometric personal identification system based on iris analysis
US6301375B1 (en) * 1997-04-14 2001-10-09 Bk Systems Apparatus and method for identifying individuals through their subcutaneous vein patterns and integrated system using said apparatus and method

Cited By (35)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US9792498B2 (en) 2007-09-01 2017-10-17 Eyelock Llc Mobile identity platform
US9036871B2 (en) 2007-09-01 2015-05-19 Eyelock, Inc. Mobility identity platform
US9117119B2 (en) 2007-09-01 2015-08-25 Eyelock, Inc. Mobile identity platform
US10296791B2 (en) 2007-09-01 2019-05-21 Eyelock Llc Mobile identity platform
US9946928B2 (en) 2007-09-01 2018-04-17 Eyelock Llc System and method for iris data acquisition for biometric identification
US9192297B2 (en) 2007-09-01 2015-11-24 Eyelock Llc System and method for iris data acquisition for biometric identification
US9626563B2 (en) 2007-09-01 2017-04-18 Eyelock Llc Mobile identity platform
US8553948B2 (en) 2007-09-01 2013-10-08 Eyelock, Inc. System and method for iris data acquisition for biometric identification
US9095287B2 (en) 2007-09-01 2015-08-04 Eyelock, Inc. System and method for iris data acquisition for biometric identification
US9055198B2 (en) 2007-09-01 2015-06-09 Eyelock, Inc. Mirror system and method for acquiring biometric data
US8958606B2 (en) 2007-09-01 2015-02-17 Eyelock, Inc. Mirror system and method for acquiring biometric data
US9002073B2 (en) 2007-09-01 2015-04-07 Eyelock, Inc. Mobile identity platform
US8803963B2 (en) 2008-09-22 2014-08-12 Kranthi Kiran Pulluru Vein pattern recognition based biometric system and methods thereof
US8787623B2 (en) 2008-11-26 2014-07-22 Bioptigen, Inc. Methods, systems and computer program products for diagnosing conditions using unique codes generated from a multidimensional image of a sample
US8687856B2 (en) 2008-11-26 2014-04-01 Bioptigen, Inc. Methods, systems and computer program products for biometric identification by tissue imaging using optical coherence tomography (OCT)
WO2010062883A1 (fr) * 2008-11-26 2010-06-03 Bioptigen, Inc. Procédés, systèmes et produits de programme informatique d'identification biométrique par imagerie de tissus par tomographie à cohérence optique (oct)
US9361518B2 (en) 2008-11-26 2016-06-07 Bioptigen, Inc. Methods, systems and computer program products for diagnosing conditions using unique codes generated from a multidimensional image of a sample
WO2012078114A1 (fr) * 2010-12-09 2012-06-14 Nanyang Technological University Procédé et appareil servant à déterminer des tracés de veines à partir d'une image en couleur
US9317761B2 (en) 2010-12-09 2016-04-19 Nanyang Technological University Method and an apparatus for determining vein patterns from a colour image
US9280706B2 (en) 2011-02-17 2016-03-08 Eyelock Llc Efficient method and system for the acquisition of scene imagery and iris imagery using a single sensor
US10116888B2 (en) 2011-02-17 2018-10-30 Eyelock Llc Efficient method and system for the acquisition of scene imagery and iris imagery using a single sensor
WO2012112788A2 (fr) * 2011-02-17 2012-08-23 Eyelock Inc. Procédé et système efficaces pour l'acquisition d'images d'un lieu et d'images d'un iris à l'aide d'un seul capteur
WO2012112788A3 (fr) * 2011-02-17 2013-01-03 Eyelock Inc. Procédé et système efficaces pour l'acquisition d'images d'un lieu et d'images d'un iris à l'aide d'un seul capteur
US9607231B2 (en) 2011-04-27 2017-03-28 Los Angeles Biomedical Research Institute At Harbor-Ucla Medical Center Use of relatively permanent pigmented or vascular skin mark patterns in images for personal identification
US9152867B2 (en) 2011-04-27 2015-10-06 Los Angeles Biomedical Research Institute At Harbor-Ucla Medical Center Use of relatively permanent pigmented or vascular skin mark patterns in images for personal identification
WO2012157835A1 (fr) * 2011-05-16 2012-11-22 동국대학교 산학협력단 Procédé de gestion d'une image vasculaire médicale en utilisant une technique de fusion d'images
CN102890772A (zh) * 2011-07-21 2013-01-23 常熟安智生物识别技术有限公司 掌静脉识别技术方案
CN103136522A (zh) * 2011-11-28 2013-06-05 常熟安智生物识别技术有限公司 指静脉识别技术方案
US10019619B2 (en) 2014-03-25 2018-07-10 Fujitsu Frontech Limited Biometrics authentication device and biometrics authentication method
US10095912B2 (en) 2016-03-03 2018-10-09 Fujitsu Limited Biological image processing device, biological image processing method, and computer-readable non-transitory medium
US20200234309A1 (en) * 2017-10-03 2020-07-23 Visa International Service Association System, Method, and Computer Program Product for Authenticating Identification Documents
US11620657B2 (en) * 2017-10-03 2023-04-04 Visa International Service Association System, method, and computer program product for authenticating identification documents
US11978299B2 (en) 2017-10-03 2024-05-07 Visa International Service Association System, method, and computer program product for authenticating identification documents
CN110944118A (zh) * 2018-09-25 2020-03-31 富士施乐株式会社 存储介质、图像处理装置及图像处理方法
CN110944118B (zh) * 2018-09-25 2024-01-26 富士胶片商业创新有限公司 计算机可读存储介质、图像处理装置及图像处理方法

Also Published As

Publication number Publication date
JP2010509672A (ja) 2010-03-25
CA2671561A1 (fr) 2008-05-08
EP2092460A1 (fr) 2009-08-26
KR20090087895A (ko) 2009-08-18
MX2009004749A (es) 2009-09-28
AU2006350242A1 (en) 2008-05-08
CN101595493A (zh) 2009-12-02

Similar Documents

Publication Publication Date Title
US20080298642A1 (en) Method and apparatus for extraction and matching of biometric detail
WO2008054396A1 (fr) Procédé et appareil pour extraction et correspondance d'un détail biométrique
Syarif et al. Enhanced maximum curvature descriptors for finger vein verification
JP5107045B2 (ja) 目に関して取得される画像中の虹彩を表す画素を特定する方法
Ng et al. A review of iris recognition algorithms
JP2009523265A (ja) 画像中の虹彩の特徴を抽出する方法
Chen et al. Iris recognition based on bidimensional empirical mode decomposition and fractal dimension
Hartung et al. Spectral minutiae for vein pattern recognition
JP2007188504A (ja) 画像中の画素強度をフィルタリングする方法
WO2008100329A2 (fr) Dispositif d'imagerie multispectral multibiométrique
Chuang Vein recognition based on minutiae features in the dorsal venous network of the hand
Nagwanshi et al. Biometric authentication using human footprint
Al-Nima et al. Finger texture biometric characteristic: a survey
Uriarte-Antonio et al. Vascular biometrics based on a minutiae extraction approach
Donida Labati et al. A scheme for fingerphoto recognition in smartphones
Hiew et al. Preprocessing of fingerprint images captured with a digital camera
Kushwaha et al. Person identification using footprint minutiae
Kumar et al. Biometric authentication based on infrared thermal hand vein patterns
Gupta et al. A vein biometric based authentication system
BENzIANE et al. Biometric technology based on hand vein
Chopra et al. Finger print and finger vein recognition using repeated line tracking and minutiae
Kovac et al. Multimodal biometric system based on fingerprint and finger vein pattern
Poornima et al. Versatile and economical acquisition setup for dorsa palm vein authentication
Linsangan et al. Comparing local invariant algorithms for dorsal hand vein recognition system
Nivas et al. Real-time finger-vein recognition system

Legal Events

Date Code Title Description
WWE Wipo information: entry into national phase

Ref document number: 200680056859.8

Country of ref document: CN

121 Ep: the epo has been informed by wipo that ep was designated in this application

Ref document number: 06836936

Country of ref document: EP

Kind code of ref document: A1

NENP Non-entry into the national phase

Ref country code: DE

WWE Wipo information: entry into national phase

Ref document number: MX/A/2009/004749

Country of ref document: MX

ENP Entry into the national phase

Ref document number: 2009536206

Country of ref document: JP

Kind code of ref document: A

WWE Wipo information: entry into national phase

Ref document number: 2006350242

Country of ref document: AU

Ref document number: 2006836936

Country of ref document: EP

WWE Wipo information: entry into national phase

Ref document number: 1020097011232

Country of ref document: KR

WWE Wipo information: entry into national phase

Ref document number: 3134/CHENP/2009

Country of ref document: IN

WWE Wipo information: entry into national phase

Ref document number: 2671561

Country of ref document: CA

ENP Entry into the national phase

Ref document number: 2006350242

Country of ref document: AU

Date of ref document: 20061103

Kind code of ref document: A

ENPW Started to enter national phase and was withdrawn or failed for other reasons

Ref document number: PI0622110

Country of ref document: BR

Kind code of ref document: A2

Free format text: APPLICATION CONSIDERED WITHDRAWN WITH RESPECT TO BRAZIL AND SHELVED FOR FAILURE TO COMPLY WITH THE REQUIREMENT PUBLISHED IN RPI NO. 2047 OF 30/03/2010 CONCERNING THE DETERMINATIONS REGARDING THE APPLICATION'S ENTRY INTO THE NATIONAL PHASE.