US20020150281A1 - Method of recognizing human iris using daubechies wavelet transform - Google Patents
- Publication number: US20020150281A1 (application US09/946,714)
- Authority
- US
- United States
- Prior art keywords
- iris
- characteristic vector
- image
- characteristic
- values
- Prior art date
- Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
- Abandoned
Classifications
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06V—IMAGE OR VIDEO RECOGNITION OR UNDERSTANDING
- G06V40/00—Recognition of biometric, human-related or animal-related patterns in image or video data
- G06V40/10—Human or animal bodies, e.g. vehicle occupants or pedestrians; Body parts, e.g. hands
- G06V40/18—Eye characteristics, e.g. of the iris
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06F—ELECTRIC DIGITAL DATA PROCESSING
- G06F18/00—Pattern recognition
Definitions
- FIG. 1 is a view illustrating the constitution of the image acquisition equipment used for performing an iris recognition method according to the present invention.
- FIG. 2 is a flowchart illustrating the process of verifying an iris image according to the present invention.
- FIG. 3 is a flowchart illustrating the process of multi-dividing the iris image using the Daubechies wavelet transform according to the present invention.
- FIG. 4 shows an example of multi-dividing the iris image using the Daubechies wavelet transform.
- FIG. 5 is a flowchart illustrating the process of forming the characteristic vector of an iris image based on the data acquired from the multi-dividing operation according to the present invention.
- FIG. 6a shows a distribution example of the characteristic values of the extracted iris image.
- FIG. 6b shows the quantization function for generating a binary characteristic vector from the distribution example of FIG. 6a.
- FIG. 7 is a flowchart showing the procedures for determining user authenticity through a similarity test between the characteristic vectors.
- FIG. 1 shows the exemplary embodiment of the image acquisition equipment for use in recognizing a human iris according to the present invention.
- The image acquisition equipment includes a halogen lamp 11 for illuminating the iris in order to acquire clear iris patterns, a CCD camera 13 for photographing the eye 10 of a user through a lens 12, a frame grabber 14 connected to the CCD camera 13 for acquiring the iris image, and a monitor 15 that displays the image so that the user can be positioned correctly while the images are acquired.
- The CCD camera 13 is used to acquire the eye image, and iris recognition is performed through pattern analysis of the iridial folds.
- When the iris image is acquired indoors under ordinary illumination, it is difficult to extract the desired pattern information because the iris image is generally too dark. Additional illuminators should therefore be used so that information on the iris image is not lost and a clear iris pattern can be obtained.
- The halogen lamp 11, with its strong floodlighting effect, is preferably used as the main illuminator so that the iris pattern is clearly shown.
- However, other light sources known to those skilled in this art can be successfully used.
- The loss of iris image information and eye fatigue of the user can be avoided by placing the halogen lamp illuminators on the left and right sides of the eye, so that the light reflected from the lamps falls on the outer portions of the iris region.
- FIG. 2 is a flowchart showing the operation steps for verifying the iris image for identification purposes according to the present invention.
- The eye image is acquired in step 200 through the image acquisition equipment shown in FIG. 1.
- The images of the iris regions are extracted from the acquired eye image through pre-processing and transformed into a polar coordinate system; the transformed iris pattern is then inputted to a module for extracting the features.
- Acquiring the iris image and transforming it into a polar coordinate system are well known in the art and can be performed in a variety of ways.
- In step 220, the Daubechies wavelet transform of the inputted iris pattern, now in polar coordinates, is performed, and the features of the iris regions are extracted.
- The extracted features are real-valued.
- In step 230, a binary characteristic vector is generated by applying a K-level quantization function to the extracted features.
- In step 240, the similarity between the generated characteristic vector and the previously registered data of the user is measured; through this similarity measurement, user authenticity is determined and the verification results are obtained.
- The Daubechies wavelet function with eight, sixteen, or more coefficients can extract more delicate characteristic values than the four-coefficient function, although the former is more complicated than the latter.
- When the Daubechies wavelet function with eight or more coefficients was tested in the present invention, no significant performance improvement was obtained, while the arithmetic quantity and processing time increased compared with the four-coefficient case.
- Accordingly, the Daubechies wavelet function with four coefficients may be used for extracting the characteristic values indicative of the iris patterns.
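As an illustrative aside (not part of the original disclosure), the four scaling coefficients of the four-coefficient Daubechies (D4) wavelet have a standard closed form; the sketch below computes them and checks the usual normalization conditions.

```python
import math

# The four scaling (low-pass) coefficients of the Daubechies-4 wavelet.
s3 = math.sqrt(3.0)
h = [(1 + s3) / (4 * math.sqrt(2)),
     (3 + s3) / (4 * math.sqrt(2)),
     (3 - s3) / (4 * math.sqrt(2)),
     (1 - s3) / (4 * math.sqrt(2))]

# Quadrature-mirror high-pass coefficients: g[k] = (-1)^k * h[3 - k].
g = [((-1) ** k) * h[3 - k] for k in range(4)]

# Standard sanity checks: low-pass taps sum to sqrt(2) and are orthonormal,
# and the high-pass taps sum to zero.
assert abs(sum(h) - math.sqrt(2)) < 1e-12
assert abs(sum(c * c for c in h) - 1.0) < 1e-12
assert abs(sum(g)) < 1e-12
```

Orthonormality of this filter pair is what makes the sub-band division below energy-preserving.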
- FIG. 3 is a flowchart showing the process of multi-dividing the iris image by performing the Daubechies wavelet transform according to the present invention.
- FIG. 4 shows an image divided using the Daubechies wavelet transform.
- LPF: low-pass filter
- HPF: high-pass filter
- The subscript numerals signify image-dividing stages. For example, "LH2" means that the image has passed through the low-pass filter in the x direction and through the high-pass filter in the y direction during the two-stage wavelet division.
- The inputted iris image is multi-divided using the Daubechies wavelet transform.
- Because the iris image is a two-dimensional signal in which one-dimensional signals are arrayed in the x and y directions, four sub-band components of the image are extracted by passing it through the LPF and HPF in both the x and y directions. That is, the two-dimensional image signal is wavelet-transformed in the vertical and horizontal directions, and after the transform has been performed once the image is divided into four regions: LL, LH, HL, and HH. At each stage of the Daubechies wavelet transform, the signal is divided into a differential component, which has passed through the high-pass filter, and an average component, which has passed through the low-pass filter.
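A minimal sketch of the four-band division described above, assuming a separable one-level Daubechies-4 transform with periodic boundary handling (the patent does not specify the boundary treatment); the filters are built from the standard D4 coefficients:

```python
import numpy as np

S3 = np.sqrt(3.0)
H = np.array([1 + S3, 3 + S3, 3 - S3, 1 - S3]) / (4 * np.sqrt(2))  # low-pass
G = np.array([(-1) ** k * H[3 - k] for k in range(4)])              # high-pass

def analyze_1d(x, taps):
    """Filter a 1-D signal (periodic extension) and downsample by two."""
    n = len(x)
    return np.array([sum(taps[k] * x[(2 * i + k) % n] for k in range(4))
                     for i in range(n // 2)])

def dwt2_level(img):
    """One level of a separable 2-D Daubechies-4 transform: x, then y."""
    lo = np.array([analyze_1d(row, H) for row in img])   # low-pass along x
    hi = np.array([analyze_1d(row, G) for row in img])   # high-pass along x
    LL = np.array([analyze_1d(col, H) for col in lo.T]).T
    LH = np.array([analyze_1d(col, G) for col in lo.T]).T
    HL = np.array([analyze_1d(col, H) for col in hi.T]).T
    HH = np.array([analyze_1d(col, G) for col in hi.T]).T
    return LL, LH, HL, HH

img = np.random.rand(60, 64)          # stand-in for a normalized iris strip
LL, LH, HL, HH = dwt2_level(img)
print(LL.shape)                       # (30, 32): half size along each axis
```

Because the D4 filter pair is orthonormal, the four sub-bands together conserve the energy of the input exactly under periodic extension.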
- The performance of the iris recognition system is evaluated in view of two factors: the false acceptance rate (FAR) and the false rejection rate (FRR).
- FAR is the probability that an unregistered person (impostor) is accepted because of being falsely recognized as a registered person.
- FRR is the probability that a registered person (enrollee) is rejected because of being falsely recognized as an unregistered person.
- With the present method, the FAR was reduced from 5.5% to 3.07% and the FRR from 5.0% to 2.25%, as compared with the method of recognizing the human iris using the conventional Haar wavelet transform.
- In step 320, the region HH, which includes only the high-frequency components in both the x and y directions, is extracted from the divided iris image.
- In step 330, the iteration count for dividing the iris image is incremented; the process is completed when the count exceeds a predetermined number. Otherwise, the information on the region HH is stored in step 340 for later use in extracting the iris features.
- In step 350, the region LL, comprising only the low-frequency components in the x and y directions, is extracted from the multi-divided iris image.
- Because the extracted region LL (an image reduced to a quarter of the size of the previous image) includes the major information on the iris image, it is provided as the image to be processed next, so that the wavelet transform can be applied again to that region. The Daubechies wavelet transform is then repeated from step 310.
- The region between the inner and outer boundaries of the iris is divided into 60 segments in the r direction and 450 segments in the θ direction, varying the angle in 0.8-degree steps.
- The information on the iris image is thereby acquired and normalized as 450×60 (θ×r) data.
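A hedged sketch of the polar normalization described above, assuming the iris boundary circles (center and inner/outer radii) have already been detected; the function and parameter names here are illustrative, not from the patent:

```python
import numpy as np

def normalize_iris(eye, cx, cy, r_in, r_out, n_theta=450, n_r=60):
    """Unwrap the iris ring between radii r_in and r_out into an
    n_r x n_theta polar strip (0.8-degree angular steps for n_theta=450)."""
    thetas = np.deg2rad(np.arange(n_theta) * (360.0 / n_theta))
    radii = np.linspace(r_in, r_out, n_r)
    strip = np.zeros((n_r, n_theta))
    for j, t in enumerate(thetas):
        for i, r in enumerate(radii):
            x = int(round(cx + r * np.cos(t)))   # nearest-neighbour sampling
            y = int(round(cy + r * np.sin(t)))
            strip[i, j] = eye[y % eye.shape[0], x % eye.shape[1]]
    return strip

eye = np.random.rand(480, 640)                   # stand-in eye image
strip = normalize_iris(eye, cx=320, cy=240, r_in=60, r_out=120)
print(strip.shape)                               # (60, 450)
```

A production implementation would interpolate bilinearly and mask eyelid occlusions; nearest-neighbour sampling keeps the sketch short.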
- The characteristics of the 225×30 region HH1, whose size is halved in each direction, are then obtained; that is, this 225×30 information could itself be used as a characteristic vector.
- This information may be used as it is, but the dividing process is performed repeatedly in order to reduce the information size. Since the region LL includes the major information on the iris image, the characteristic values of the further reduced regions HH2, HH3, and HH4 are obtained by successively applying the wavelet transform to each successive LL region.
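The repeated division that keeps each level's HH band might be sketched as follows; a 448-column stand-in strip is used because the patent's 450-column strip produces odd intermediate widths that a real implementation would have to pad or truncate:

```python
import numpy as np

S3 = np.sqrt(3.0)
H = np.array([1 + S3, 3 + S3, 3 - S3, 1 - S3]) / (4 * np.sqrt(2))  # low-pass
G = np.array([(-1) ** k * H[3 - k] for k in range(4)])              # high-pass

def dwt_axis(a, taps, axis):
    """Periodic Daubechies-4 filtering plus dyadic downsampling on one axis."""
    out = sum(taps[k] * np.roll(a, -k, axis=axis) for k in range(4))
    idx = [slice(None)] * a.ndim
    idx[axis] = slice(0, None, 2)
    return out[tuple(idx)]

def hh_pyramid(strip, levels=4):
    """Keep the HH (diagonal detail) band of each level; recurse on LL."""
    hh_bands, ll = [], strip
    for _ in range(levels):
        lo = dwt_axis(ll, H, axis=1)              # low-pass in the x direction
        hi = dwt_axis(ll, G, axis=1)              # high-pass in the x direction
        hh_bands.append(dwt_axis(hi, G, axis=0))  # HH: high-pass in x and y
        ll = dwt_axis(lo, H, axis=0)              # LL feeds the next level
    return hh_bands, ll

strip = np.random.rand(60, 448)   # stand-in strip with an even-halving width
bands, ll = hh_pyramid(strip)
print([b.shape for b in bands])   # [(30, 224), (15, 112), (8, 56), (4, 28)]
```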
- FIG. 5 is a flowchart showing the process of forming the characteristic vector of the iris image using the data acquired from the multi-divided iris image according to the present invention.
- The characteristic information extracted by the above process, i.e., the information on the regions HH1, HH2, HH3, and HH4, is inputted in step 510.
- In step 520, in order to acquire the characteristic information on the regions HH1, HH2, and HH3 (excluding the region HH4 obtained through the last wavelet transform), the average value of each of the regions HH1, HH2, and HH3 is calculated and assigned one dimension.
- In step 530, all the values of the finally obtained region HH4 are extracted as characteristic values.
- The characteristic vector is generated based on these characteristic values.
- A module for generating the characteristic vector mainly performs the processes of extracting the characteristic values in the form of real numbers and then transforming them into binary codes consisting of 0s and 1s.
- In step 550, the values of the previously obtained characteristic vector, i.e., the respective component values expressed as real numbers, are quantized into the binary values 0 or 1.
- In step 560, the resultant (M+N−1)-bit characteristic vector is generated from the quantized values; according to the present invention, an 87-bit characteristic vector is generated.
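The 87-dimension assembly of steps 520 through 560 can be sketched as below, using mock HH bands whose sizes follow from halving a 450×60 strip with integer truncation (225×30, 112×15, 56×7, 28×3, so 3 averages plus 84 HH4 values gives 87); the sizes beyond HH1 are an inference from that halving, not dimensions stated in the text:

```python
import numpy as np

# Mock HH bands with (r, theta) sizes implied by halving a 60 x 450 strip
# with integer truncation: 30x225, 15x112, 7x56, 3x28 (3 * 28 = 84 values).
rng = np.random.default_rng(0)
hh = [rng.standard_normal(s) for s in [(30, 225), (15, 112), (7, 56), (3, 28)]]

# Three dimensions hold the averages of HH1..HH3; HH4 contributes all 84 of
# its values, giving 3 + 84 = 87 real-valued feature dimensions in total.
features = np.concatenate([[band.mean() for band in hh[:3]], hh[3].ravel()])
print(features.size)   # 87
```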
- FIG. 6a shows a distribution example of the characteristic values of the extracted iris image.
- The distribution of the extracted characteristic values roughly takes the shape shown in FIG. 6a.
- The binary vector including all the dimensions is generated by the following Equation 1, where f(n) is the characteristic value of the n-th dimension and f_n is the n-th component of the resulting binary characteristic vector. Consistent with the sign-based weighting described below, f_n = 1 if f(n) ≥ 0 and f_n = 0 otherwise.
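Under the sign-based reading of Equation 1 (an assumption, since the equation body is not reproduced here), the quantization step reduces to:

```python
import numpy as np

rng = np.random.default_rng(1)
features = rng.standard_normal(87)      # real-valued characteristic vector

# Two-level (binary) quantization: bit n is 1 when the n-th characteristic
# value is non-negative and 0 otherwise.
bits = (features >= 0).astype(np.uint8)
print(bits.size)    # 87
```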
- In Equation 2, f_n represents the n-th dimension of the previously registered characteristic vector f_R of the user or of the characteristic vector f_T generated from the iris image of the user's eye.
- The value of the i-th dimension f_Ri or f_Ti is converted to and assigned "4" if the value is "11."
- The value of the i-th dimension f_Ri or f_Ti is converted to and assigned "1" if the value is "10."
- The value of the i-th dimension f_Ri or f_Ti is converted to and assigned "−1" if the value is "01."
- The value of the i-th dimension f_Ri or f_Ti is converted to and assigned "−4" if the value is "00."
- FIG. 7 is a flowchart showing the procedures for discriminating user authenticity through the similarity measurement test between the characteristic vectors.
- First, the characteristic vector f_T of the user is generated from the iris image of the user's eye.
- In step 720, the previously registered characteristic vector f_R of the user is retrieved.
- In step 730, weights are assigned to the characteristic vectors f_R and f_T depending on the values of the binary characteristic vectors, based on Equation 2.
- In step 740, the inner product (scalar product) S of the two characteristic vectors is calculated, and the similarity is thereby measured.
- Among the measures generally used for determining the correlation between the registered characteristic vector f_R and the characteristic vector f_T of the user, it is the inner product S of the two characteristic vectors that indicates the most direct association. That is, after the weights have been assigned to the respective components of the characteristic vectors in step 730, the inner product S of the two vectors is used to measure the similarity between them.
- Equation 3 is used for calculating the inner product of the two characteristic vectors.
- f_R is the characteristic vector of the user that has already been registered.
- f_T is the characteristic vector generated from the iris image of the user's eye.
- One effect of quantizing according to the sign of the characteristic values, as in the method in which the binary vector is generated dimension by dimension from the values extracted from the iris image, is that, like the Hamming distance, the difference between 0 and 1 can be expressed.
- If the two characteristic vectors have same-signed values in a given dimension, a positive value is added to the inner product S of the two vectors; otherwise, a negative value is added. Consequently, the inner product S increases if the two data belong to the same person and decreases if they do not.
- In step 750, user authenticity is determined according to the measured similarity obtained from the inner product S of the two characteristic vectors. This determination is based on the following Equation 4.
- In Equation 4, C is a reference value for verifying the similarity between the two characteristic vectors.
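Putting Equations 2 through 4 together, the weighted inner-product similarity test might be sketched as follows; the threshold value C = 0 is an illustrative assumption, not a value given in the text:

```python
import numpy as np

# Weight table from Equation 2: each dimension's 2-bit code maps to a
# signed weight before the inner product is taken.
WEIGHTS = {"11": 4, "10": 1, "01": -1, "00": -4}

def similarity(codes_r, codes_t):
    """Equation 3 (as read here): inner product of the two weighted vectors."""
    wr = np.array([WEIGHTS[c] for c in codes_r])
    wt = np.array([WEIGHTS[c] for c in codes_t])
    return int(wr @ wt)

def accept(codes_r, codes_t, C=0):
    """Equation 4 (as read here): accept the user when S reaches threshold C."""
    return similarity(codes_r, codes_t) >= C

same = ["11", "00", "10", "01"]
print(similarity(same, same))                   # 16 + 16 + 1 + 1 = 34
print(accept(same, ["00", "11", "01", "10"]))   # all signs differ -> False
```

Note how agreement on a high-magnitude code ("11" vs "11") contributes 16 to S, while agreement on a low-magnitude code contributes only 1, matching the text's description that same-signed dimensions add positive values.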
- As described above, the method of recognizing the human iris using the Daubechies wavelet transform according to the present invention has the advantage that the FAR and FRR can be remarkably reduced compared with the method using the conventional Haar wavelet transform, because the iris features are extracted from the inputted iris image signals through the Daubechies wavelet transform.
- In addition, the inner product S of the two characteristic vectors is calculated, and user authenticity is determined from the similarity measured by that inner product. There is thus provided a method of measuring the similarity between characteristic vectors in which the loss of information that may result from forming low-capacity characteristic vectors is minimized.
Abstract
The present invention relates to a method of recognizing the human iris using the Daubechies wavelet transform. The dimensions of the characteristic vectors are first reduced by extracting iris features from the inputted iris image signals through the Daubechies wavelet transform. Binary characteristic vectors are then generated by applying quantization functions to the extracted characteristic values, so that the utility of human iris recognition is improved: generating low-capacity characteristic vectors reduces both the storage capacity and the processing time required. User authenticity is then determined by measuring the similarity between the generated characteristic vectors and the previously registered characteristic vectors.
Description
- This application makes reference to, incorporates the same herein, and claims all benefits accruing under 35 U.S.C. Section 119 from an application for “Method of Recognizing Human Iris Using Daubechies Wavelet Transform,” filed earlier in the Korean Industrial Property Office on Mar. 6, 2001, and there duly assigned Serial No. 2001-11440.
- 1. Field of Invention
- The present invention relates to a method of recognizing the human iris and, more particularly, to a method of recognizing the human iris using the Daubechies wavelet transform to reduce the dimensions of characteristic vectors to improve the processing time.
- 2. Description of the Related Art
- An iris recognition system is used for performing the identification of an individual based on the information obtained from the analysis of the iris patterns, which are different for each individual. The iris recognition system has superior identification accuracy and thus provides excellent security when compared to other biometric methods that use voice and fingerprints for identification.
- A wavelet transform is typically used to extract the characteristics of iris images and involves analyzing signals in a multi-resolution mode. The wavelet transform is a mathematical theory used for formulating models of systems, signals, and processes using selected signals based on the Fourier transform; these signals are referred to as little waves, or wavelets. The wavelet transform has recently been widely employed in the field of signal and image processing because it is faster than traditional signal processing algorithms and can efficiently achieve signal localization in the time and frequency domains. The iris patterns are extracted from an iris image acquired by an image acquisition device, and patterns normalized to a 450×60 size are then used to extract the characteristic values via the wavelet transform.
- Other types of wavelet transform are known in the art. For example, the Haar wavelet transform has also been widely used in conventional iris recognition systems, image processing, and the like. However, the Haar wavelet transform has the disadvantage that its characteristic values change irregularly and rapidly. In addition, high-resolution images cannot be recovered when the images are decompressed after compression. In contrast, the Daubechies wavelet is a continuous function, so these disadvantages of the Haar wavelet functions can be avoided in certain instances, allowing more accurate and delicate characteristic values to be extracted. If images compressed using the Daubechies wavelet transform are decompressed again, they can be restored with high-resolution quality close to the original images, unlike with the Haar wavelet transform. However, since the Daubechies wavelet functions are generally more complicated than the Haar wavelet functions, a larger arithmetic quantity may be needed. A main advantage of the Daubechies wavelet transform is that it provides fine characteristic values when the wavelet transform is performed to extract the characteristic values. That is, if the Daubechies wavelet transform is used, the iris features can be identified with a smaller amount of data, and the extraction of the iris features can be made accurately.
- Another method of extracting the characteristic values indicative of the iris patterns and forming the characteristic vectors uses the Gabor transform. However, the characteristic vectors generated by this method require 256 or more dimensions and at least 256 bytes, where one byte is assigned to one dimension. Thus, there is a problem in that practicability and efficiency are undermined when the Gabor transform is used in the field if low capacity information is required.
- The Hamming distance (HD) is used to verify two characteristic vectors generated in the form of binary vectors. A method of measuring a distance such as the Hamming distance between two characteristic vectors (i.e., the characteristic vector of the input pattern and the stored reference characteristic vector) for pattern classification is disclosed in U.S. Pat. No. 5,291,560, the teachings of which are incorporated herein by reference. The bit values assigned to each dimension are compared with each other: if they are identical, 0 is given; if they differ, 1 is given. The sum divided by the total number of dimensions is then obtained as the final result. Hence, this method is simple and useful for discriminating the degree of similarity between characteristic vectors consisting of binary codes. The comparison result of all the bits is 0 if identical data are compared, so a result approaching 0 implies that the data belong to the person themselves; if the data belong to different persons, the expected degree of similarity will be about 0.5. Accordingly, a proper limit set between 0 and 0.5 serves as a boundary for differentiating between people. The Hamming distance is also excellent for application to the extracted iris features when the data are finely subdivided, but it is not suitable when low-capacity data are to be used. If the total number of bits of the characteristic vectors with 256-byte information is 2048, considerably high acceptance rates are realized when the Hamming distance is applied. However, there are disadvantages in that reference characteristic vectors cannot easily be formed by generalizing the pattern information, and one cannot rely upon the information characteristics of each dimension of the characteristic vectors.
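The normalized Hamming distance described in this paragraph can be written directly:

```python
# Normalized Hamming distance between two equal-length binary vectors: the
# fraction of positions whose bits differ (0 for identical vectors, about
# 0.5 for statistically independent ones).
def hamming_distance(a, b):
    assert len(a) == len(b)
    return sum(x != y for x, y in zip(a, b)) / len(a)

print(hamming_distance([1, 0, 1, 1], [1, 0, 1, 1]))   # 0.0
print(hamming_distance([1, 0, 1, 1], [0, 1, 0, 0]))   # 1.0
```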
- Accordingly, if low-capacity characteristic vectors are used, the accuracy of differentiating characteristic vectors is poor because more information is lost. A method of preventing information loss while keeping the capacity of the characteristic vectors to a minimum is therefore needed when generating the characteristic vectors. There is thus a need for a method of forming low-capacity characteristic vectors so that the processing, storage, transfer, search, and the like of the pattern information can be achieved efficiently.
- The present invention is directed to a method of forming low-capacity characteristic vectors, so that the false acceptance rate (FAR) and the false rejection rate (FRR) can be remarkably reduced as compared to the conventional Haar wavelet transform. To this end, the iris features are extracted from the inputted iris image signals using the Daubechies wavelet transform.
- One aspect of the present invention provides a method for measuring the similarity between the characteristic vectors, wherein the low capacity characteristic vectors can be properly used for the similarity measurement while the loss of information can be minimized.
- Another aspect of the present invention provides a method for recognizing the human iris using the Daubechies wavelet transform, wherein the iris image is acquired from an eye using an image acquisition device with a halogen lamp illuminator. The method includes the steps of: (a) repeatedly performing the Daubechies wavelet transform of the iris image a predetermined number of times to multi-divide the iris image, and extracting an image including the high frequency components from the multi-divided image to extract iris features; (b) extracting the characteristic values of a characteristic vector from the extracted image with the high frequency components, and generating a binary characteristic vector by quantizing the relevant characteristic values; and (c) determining whether the user is an enrollee based on the similarity between the generated characteristic vector and a previously registered characteristic vector.
- According to another aspect of the present invention, the iris image is acquired through an image acquisition device utilizing a halogen lamp as an illuminator. By repeatedly performing the Daubechies wavelet transform on the inputted iris image, the iris image is multi-divided, and iris features with optimized sizes are extracted. The characteristic vector, which is effective for displaying and processing the image, is then formed by quantizing the extracted characteristic values. Furthermore, the dimension of the characteristic vector is reduced by quantizing the extracted characteristic values into binary values. That is, when a low capacity characteristic vector is formed, a method of measuring the similarity between the weighted registered and inputted characteristic vectors is used to prevent the reduction of the acceptance rate that would otherwise result from forming the low capacity characteristic vector. The user authenticity is, therefore, determined by the foregoing method.
- FIG. 1 is a view illustrating the constitution of the image acquisition equipment used for performing an iris recognition method according to the present invention.
- FIG. 2 is a flowchart illustrating the process of verifying an iris image according to the present invention.
- FIG. 3 is a flowchart illustrating the process of multi-dividing the iris image using the Daubechies wavelet transform according to the present invention.
- FIG. 4 shows an example of multi-dividing the iris image using the Daubechies wavelet transform.
- FIG. 5 is a flowchart illustrating the process of forming the characteristic vector of an iris image based on the data acquired from the multi-dividing operation according to the present invention.
- FIG. 6a shows a distribution example of the characteristic values of the extracted iris image.
- FIG. 6b shows the quantization function for generating a binary characteristic vector from the distribution example of FIG. 6a.
- FIG. 7 is a flowchart showing the procedures for determining user authenticity through a similarity test between the characteristic vectors.
- Hereinafter, a method of recognizing a human iris using the Daubechies wavelet transform according to the present invention will be explained in detail with reference to the accompanying drawings.
- FIG. 1 shows the exemplary embodiment of the image acquisition equipment for use in recognizing a human iris according to the present invention. The image acquisition equipment includes a
halogen lamp 11 for illuminating the iris in order to acquire clear iris patterns, a CCD camera 13 for photographing the eye 10 of a user through a lens 12, a frame grabber 14 connected to the CCD camera 13 for acquiring the iris image, and a monitor 15 for showing the image to the user so that correct images and the proper position of the user can be obtained as the images are acquired. - In the embodiment, the
CCD camera 13 is used to acquire the eye image, and the iris recognition is made through the pattern analysis of iridial folds. However, where the iris image is acquired indoors using an ordinary illuminator, it is difficult to extract the desired pattern information as the iris image is generally too dark. Additional illuminators should therefore be used so that the information on the iris image is not lost and a clear iris pattern can be obtained. In the present invention, the halogen lamp 11 with strong floodlighting effects is preferably used as the main illuminator so that the iris pattern can be clearly shown. However, it should be noted that other light sources known to those skilled in this art can be successfully used. Furthermore, as shown in FIG. 1, the loss of the iris image information and eye fatigue of the user can be avoided by placing the halogen lamp illuminators on the left and right sides of the eye so that the light reflected from the lamps falls on the outer portions of the iris region. - FIG. 2 is a flowchart showing the operation steps for verifying the iris image for identification purposes according to the present invention. Referring to FIG. 2, the eye image is acquired through the image acquisition equipment shown in FIG. 1 in
step 200. In step 210, the images of the iris regions are extracted from the acquired eye image through pre-processing and transformed into a polar coordinate system, and the transformed iris pattern is then inputted to a module for extracting the features. Acquiring the iris image and transforming the image into a polar coordinate system are well known in the art and can be performed in a variety of ways. In step 220, the Daubechies wavelet transform of the inputted iris pattern transformed into the polar coordinate system is performed, and the features of the iris regions are then extracted. The extracted features are real-valued. In step 230, a binary characteristic vector is generated by applying a K-level quantization function to the extracted features. In step 240, the similarity between the generated characteristic vector and the previously registered data of the user is measured. Through the similarity measurement, user authenticity is determined and the verification results are obtained. - In a case where the features of the iris regions are extracted by performing the Daubechies wavelet transform as described above, the Daubechies wavelet function with eight, sixteen, or more coefficients can extract more delicate characteristic values than the Daubechies wavelet function with four coefficients, even though the former method is more complicated than the latter. Although the Daubechies wavelet function with eight or more coefficients was used and tested in the present invention, no significant performance improvement was obtained, while the arithmetic quantity and processing time increased, as compared with the case where the Daubechies wavelet function with four coefficients was tested. Hence, the Daubechies wavelet function with four coefficients may be used for extracting the characteristic values indicative of the iris patterns.
- FIG. 3 is a flowchart showing the process of multi-dividing the iris image by performing the Daubechies wavelet transform according to the present invention. FIG. 4 shows an image divided using the Daubechies wavelet transform. As shown in FIG. 4, when “L” and “H” are respectively used to indicate low frequency and high frequency components, the term “LL” indicates the component that has passed through a low-pass filter (LPF) in both the x and y directions, whereas the term “HH” indicates the component that has passed through a high-pass filter (HPF) in both the x and y directions. The subscript numerals signify image-dividing stages. For example, “LH2” means that the image has passed through the low-pass filter in the x direction and through the high-pass filter in the y direction during the 2-stage wavelet division.
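- For illustration, one level of the separable two-dimensional transform with the four-coefficient Daubechies (D4) filters can be sketched directly in NumPy. This is a sketch using periodic boundary extension, not asserted to be the patent's exact implementation, and the function names are illustrative:

```python
import numpy as np

# Daubechies-4 analysis filters (the four-coefficient wavelet of the text)
_s3 = np.sqrt(3.0)
H = np.array([1 + _s3, 3 + _s3, 3 - _s3, 1 - _s3]) / (4 * np.sqrt(2.0))  # low-pass
G = np.array([H[3], -H[2], H[1], -H[0]])                                  # high-pass

def _analyze(x, f):
    """Filter a 1-D signal with f (periodic wrap) and downsample by 2."""
    n = len(x)
    return np.array([sum(f[k] * x[(2 * i + k) % n] for k in range(4))
                     for i in range(n // 2)])

def dwt2_db4(img):
    """Return the LL, LH, HL, HH quarter-bands of one wavelet division,
    following the FIG. 4 convention (first letter: x direction)."""
    img = np.asarray(img, dtype=float)
    lo = np.array([_analyze(r, H) for r in img])   # low-pass along x
    hi = np.array([_analyze(r, G) for r in img])   # high-pass along x
    LL = np.array([_analyze(c, H) for c in lo.T]).T
    LH = np.array([_analyze(c, G) for c in lo.T]).T
    HL = np.array([_analyze(c, H) for c in hi.T]).T
    HH = np.array([_analyze(c, G) for c in hi.T]).T
    return LL, LH, HL, HH
```

Applied to a 2M×2N image, each returned quarter-band has size M×N, matching the four-region division of FIG. 4.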
- Referring back to FIG. 3, in
step 310, the inputted iris image is multi-divided using the Daubechies wavelet transform. As the iris image is considered a two-dimensional signal in which one-dimensional signals are arrayed in the x and y directions, the four quarter-band components of one image should be extracted by passing it through the LPF and HPF in both the x and y directions in order to analyze the iris image. That is, one two-dimensional image signal is wavelet-transformed in the vertical and horizontal directions, and the image is divided into four regions, LL, LH, HL, and HH, after the wavelet transform has been performed once. At this time, using the Daubechies wavelet transform, the signal is divided into a differential component that has passed through the high-pass filter and an average component that has passed through the low-pass filter. - The performance of the iris recognition system is evaluated in view of two factors: a false acceptance rate (FAR) and a false rejection rate (FRR). Here, the FAR means the probability that the entrance of unregistered persons (imposters) may be accepted due to the false recognition of unregistered persons as registered persons, and the FRR means the probability that the entrance of registered persons (enrollees) is rejected due to false recognition of the registered persons as unregistered ones. In simulation, when the method of recognizing the human iris using the Daubechies wavelet transform according to the present invention was employed, the FAR was reduced from 5.5% to 3.07% and the FRR was also reduced from 5.0% to 2.25%, as compared with the method of recognizing the human iris using the conventional Haar wavelet transform.
- In
step 320, a region HH including only the high frequency components in the x and y directions is extracted from the divided iris image. - In
step 330, the iteration count of dividing the iris image is increased, and the processing is completed when the count is greater than a predetermined number. Otherwise, if the count is lower than the predetermined number, the information on the region HH is stored for use in extracting the iris features in step 340. - In
step 350, the region LL comprising only low frequency components in the x and y directions is extracted from the multi-divided iris image. As the extracted region LL (corresponding to an image reduced to a quarter of the size of the previous image) includes the major information on the iris image, it is provided as the image to be newly processed so that the wavelet transform can be applied again to the relevant region. Thereafter, the Daubechies wavelet transform is repeated again from step 310. - In a case where the iris image is transformed from the Cartesian coordinate system to the polar coordinate system, in order to avoid changes in the iris features according to variations in the size of the pupil, the region between the inner and outer boundaries of the iris is divided into 60 segments in the r direction and 450 segments in the θ direction by varying the angle in steps of 0.8 degrees. Finally, the information on the iris image is acquired and normalized as 450×60 (θ×r) data. Then, if the acquired iris image is wavelet-transformed once, the 225×30 region HH1, whose size is reduced by half in each direction, is obtained, and this 225×30 information can be used as a characteristic vector. This information may be used as it is, but the process of dividing the signals is repeatedly performed in order to reduce the information size. Since the region LL includes the major information on the iris image, the characteristic values of further reduced regions, such as HH2, HH3, and HH4, are obtained by successively applying the wavelet transform to the respective relevant regions.
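- The Cartesian-to-polar normalization just described can be sketched as follows. The iris boundaries are assumed to have already been located by the pre-processing step, and the names (cx, cy, r_pupil, r_iris, unwrap_iris) are illustrative, not the patent's:

```python
import numpy as np

def unwrap_iris(image, cx, cy, r_pupil, r_iris, n_theta=450, n_r=60):
    """Sample the iris annulus on a 450 x 60 (theta x r) polar grid,
    i.e. 0.8-degree steps in theta and 60 steps between the inner and
    outer iris boundaries, using nearest-neighbour sampling."""
    out = np.zeros((n_theta, n_r))
    for i in range(n_theta):
        theta = 2.0 * np.pi * i / n_theta            # 0.8 degrees per step
        for j in range(n_r):
            r = r_pupil + (r_iris - r_pupil) * j / (n_r - 1)
            x = int(round(cx + r * np.cos(theta)))
            y = int(round(cy + r * np.sin(theta)))
            out[i, j] = image[y, x]
    return out
```

The result is the normalized 450×60 (θ×r) array to which the wavelet transform is then applied.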
- The iterative number, which is provided as a discriminating criterion for repeatedly performing the wavelet transform, should be set as an optimal value in consideration of the loss of the information and the size of the characteristic vector. Therefore, in the present invention, the region HH4 obtained by performing the wavelet transform four times becomes a major characteristic region, and the values thereof are selected as the components of the characteristic vector. At this time, the region HH4 contains the information having 84 (=28×3) data.
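- The multi-division loop of FIG. 3 (steps 310 through 350) can then be sketched as follows, where dwt2 stands for any single-level two-dimensional wavelet routine returning the four quarter-bands; the function name is illustrative:

```python
def multi_divide(iris_polar, dwt2, levels=4):
    """Repeat the wavelet division, keeping the HH band at each level."""
    hh_regions = []
    current = iris_polar
    for _ in range(levels):
        LL, LH, HL, HH = dwt2(current)   # step 310: divide into quarter-bands
        hh_regions.append(HH)            # steps 320/340: keep HH as iris features
        current = LL                     # step 350: recurse on the LL quarter
    return hh_regions                    # HH1 ... HH4 when levels == 4
```

With levels set to four, the last element corresponds to the major characteristic region HH4 described above.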
- FIG. 5 is a flowchart showing the process of forming the characteristic vector of the iris image using the data acquired from the multi-divided iris image according to the present invention. Referring to FIG. 5, the information on the N characteristic regions extracted from the above process, i.e., the information on the regions HH1, HH2, HH3, and HH4, is inputted in
step 510. In step 520, in order to acquire the characteristic information on the regions HH1, HH2, and HH3, excluding the information on the region HH4 obtained through the last wavelet transform, the average value of each of the regions HH1, HH2, and HH3 is calculated and assigned to one dimension. In step 530, all values of the finally obtained region HH4 are extracted as its characteristic values. After the extraction of the characteristics of the iris image signals has been completed, the characteristic vector is generated based on these characteristics. A module for generating the characteristic vector mainly performs the processes of extracting the characteristic values in the form of real numbers and then transforming them into binary codes consisting of 0 and 1. - Then, in
step 540, the N−1 characteristic values extracted in step 520 and the M characteristic values (M being the size of the finally obtained region HH) extracted in step 530 are combined, and an (M+N−1)-dimensional characteristic vector is generated. That is, a total of 87 data, in which the 84 data of the region HH4 and the 3 average data of the regions HH1, HH2, and HH3 are combined, are used as the characteristic vector in the present invention. - In
step 550, the values of the previously obtained characteristic vector, i.e., the respective component values of the characteristic vector expressed in the form of real numbers, are quantized into binary values. In step 560, the resultant (M+N−1)-bit characteristic vector is generated from the quantized values. That is, according to the present invention, the resultant 87-bit characteristic vector is generated. - FIG. 6a shows a distribution example of the characteristic values of the extracted iris image. When the values of the 87-dimensional characteristic vector are distributed according to the respective dimensions, the distribution roughly takes the shape of FIG. 6a. The binary vector including all the dimensions is generated by the following
Equation 1.
- fn = 0 if f(n) < 0
- fn = 1 if f(n) > 0 (1),
- where f(n) is the characteristic value of the n-th dimension, and fn is the n-th component of the binary characteristic vector.
- When the 87-bit characteristic vector obtained by assigning one bit to each of the total 87 dimensions is generated in order to use a low capacity characteristic vector, the improvement of the recognition rate is limited to some extent, as the loss of information on the iris image is increased. Therefore, when generating the characteristic vector, it is necessary to prevent information loss while maintaining the minimum capacity of the characteristic vector.
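- The vector assembly of FIG. 5 combined with the sign binarization of Equation 1 can be sketched as follows (a minimal sketch with illustrative names, assuming NumPy):

```python
import numpy as np

def characteristic_vector(hh_regions):
    """Steps 520-560 in one pass: one average per earlier HH region (the
    N-1 values), every value of the last region (the M values), then the
    sign binarization of Equation 1."""
    values = [float(np.mean(hh)) for hh in hh_regions[:-1]]             # step 520
    values += np.asarray(hh_regions[-1], dtype=float).ravel().tolist()  # step 530
    return [0 if v < 0 else 1 for v in values]                          # Equation 1
```

With the three averages of HH1 through HH3 and the 84 values of HH4, this yields the 87-bit characteristic vector discussed above.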
- FIG. 6b shows a quantization function for generating a binary characteristic vector from the distribution example of the characteristic values shown in FIG. 6a. The extracted (M+N−1)-dimensional characteristic vector shown in FIG. 6a is mostly distributed between −1 and 1 in view of its magnitude. The binary vector is then generated by applying the K-level quantization function shown in FIG. 6b to the characteristic vector. Since only the signs of the characteristic values are obtained through the process of
Equation 1, it is understood that information on the magnitude has been discarded. Thus, in order to take the magnitude of the characteristic values into account, a 4-level quantization process was utilized in the present invention. - As described above, in order to efficiently compare the characteristic vector generated through the 4-level quantization with the registered characteristic vector, the quantization levels are given the weights expressed in the following
Equation 2.
- fn = 4 if f(n) ≥ 0.5 (level 4)
- fn = 1 if 0.5 > f(n) ≥ 0 (level 3)
- fn = −1 if 0 > f(n) > −0.5 (level 2)
- fn = −4 if f(n) ≤ −0.5 (level 1) (2),
- where fn represents the n-th dimension of the previously registered characteristic vector fR of the user, or of the characteristic vector fT generated from the iris image of the user's eye. An explanation of how to use the weights expressed in
Equation 2 is as follows. - In a case where the n-th dimensional characteristic value f(n) is equal to or more than 0.5 (level 4), the n-th dimension fRn or fTn, whose binary code is “11,” is converted to and assigned the value “4.” In a case where f(n) is equal to or more than 0 and less than 0.5 (level 3), the n-th dimension fRn or fTn, whose binary code is “10,” is converted to and assigned the value “1.” In a case where f(n) is more than −0.5 and less than 0 (level 2), the n-th dimension fRn or fTn, whose binary code is “01,” is converted to and assigned the value “−1.” In a case where f(n) is equal to or less than −0.5 (level 1), the n-th dimension fRn or fTn, whose binary code is “00,” is converted to and assigned the value “−4.” The weights are applied to the respective values as expressed in
Equation 2, as these weights are suitable for the following verification method of the present invention. - FIG. 7 is a flowchart showing the procedure for discriminating user authenticity through the similarity measurement test between the characteristic vectors. Referring to FIG. 7, in
step 710, the characteristic vector fT of the user is generated from the iris image of the user's eye. In step 720, the previously registered characteristic vector fR of the user is retrieved. In step 730, in order to measure the similarity between the two characteristic vectors, the weights are assigned to the characteristic vectors fR and fT depending on the values of the binary characteristic vectors, based on Equation 2. - In
step 740, the inner product, or scalar product, S of the two characteristic vectors is calculated, and the similarity is finally measured. Among the measures generally used for determining the correlation between the registered characteristic vector fR and the characteristic vector fT of the user, it is the inner product S of the two characteristic vectors that indicates the most direct association. That is, after the weights have been assigned to the respective data of the characteristic vectors in step 730, the inner product S of the two characteristic vectors is used to measure the similarity between the two vectors. - S = fR·fT = Σ fRn·fTn, summed over all (M+N−1) dimensions (3),
- where fR is the characteristic vector of the user that has been already registered, and fT is the characteristic vector of the user that is generated from the iris image of the eye of the user.
- According to the above processes, the quantization according to the sign of the characteristic values retains the effect obtained when a binary vector is generated from the values of the characteristic vector extracted from the iris image for the respective dimensions. That is, as with the Hamming distance, the difference between 0 and 1 can still be expressed. In a case where the two characteristic vectors have same-signed values in a given dimension, a positive value is added to the inner product S of the two characteristic vectors. Otherwise, a negative value is added to the inner product S of the two vectors. Consequently, the inner product S of the two characteristic vectors increases if the two data belong to an identical person, while the inner product S of the two characteristic vectors decreases if the two data do not belong to an identical person.
- In
step 750, the user authenticity is determined according to the measured similarity obtained from the inner product S of the two characteristic vectors. At this time, the determination of the user authenticity based on the measured similarity depends on the following Equation 4. - If S>C, then TRUE or else FALSE (4),
- where C is a reference value for verifying the similarity between the two characteristic vectors.
- That is, if the inner product S of the two characteristic vectors is more than the verification reference value C, the user is determined to be an enrollee. Otherwise, the user is determined to be an imposter.
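- Taken together, the weighting of Equation 2 and the verification of Equations 3 and 4 can be sketched as follows; the reference value C is application-dependent, and the function names are illustrative:

```python
def quantize(values):
    """Map each real characteristic value to a weighted level (Equation 2)."""
    levels = []
    for v in values:
        if v >= 0.5:
            levels.append(4)      # level 4, binary code "11"
        elif v >= 0:
            levels.append(1)      # level 3, binary code "10"
        elif v > -0.5:
            levels.append(-1)     # level 2, binary code "01"
        else:
            levels.append(-4)     # level 1, binary code "00"
    return levels

def verify(f_R, f_T, C):
    """Inner product of the two weighted vectors (Equation 3), accepted
    when S exceeds the reference value C (Equation 4)."""
    S = sum(r * t for r, t in zip(f_R, f_T))
    return S > C
```

Same-signed dimensions contribute positively to S and opposite-signed dimensions negatively, so identical persons yield large S and imposters small or negative S.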
- As described above, the method of recognizing the human iris using the Daubechies wavelet transform according to the present invention has an advantage in that the FAR and FRR can be remarkably reduced as compared with the method using the conventional Haar wavelet transform, as the iris features are extracted from the inputted iris image signals through the Daubechies wavelet transform.
- Furthermore, in order to verify the similarity between the registered and extracted characteristic vectors fR and fT, the inner product S of the two characteristic vectors is calculated, and the user authenticity is determined based on the measured similarity obtained from the calculated inner product S of the two vectors. Therefore, there is provided a method of measuring the similarity between the characteristic vectors wherein the loss of information, which may be produced by forming low capacity characteristic vectors, can be minimized.
- The foregoing is merely an embodiment for practicing the method of recognizing the human iris using the Daubechies wavelet transform according to the present invention, and the present invention is not limited to the embodiment described above. A person skilled in the art can make various modifications and changes to the present invention without departing from the technical spirit and the scope of the present invention defined by the appended claims.
Claims (8)
1. A method of recognizing a human iris using the Daubechies wavelet transform, the method comprising the steps of:
(a) obtaining an iris image from a user's eye using an image acquisition device;
(b) repeatedly performing said Daubechies wavelet transform on said iris image a predetermined number of times so as to multi-divide said iris image;
(c) extracting an image with high frequency components from said multi-divided image so as to extract iris features;
(d) extracting characteristic values of a characteristic vector from said extracted image with said high frequency components;
(e) generating a binary characteristic vector by quantizing said extracted characteristic values; and,
(f) determining whether said user is an enrollee by measuring a similarity between said generated characteristic vector and a previously registered characteristic vector.
2. The method of claim 1 , further comprising the step of illuminating said user's eye.
3. The method of claim 2 , wherein the step of illuminating said user's eye comprises the step of placing a halogen lamp on each side of said user's eye.
4. The method of claim 1 , wherein said step (b) comprises the steps of: extracting a region HH from said multi-divided image having said high frequency components in both x and y directions; storing information of said region HH for use in extracting iris features; and performing multi-division of a region LL from said multi-divided image having low frequency components in both x and y directions.
5. The method of claim 2 , wherein said predetermined number of times is set at four.
6. The method of claim 1 , wherein said step (c) comprises the steps of: receiving multi-divided images of a plurality of high frequency regions HHi formed by said multi-division in said step (b); calculating the average values of regions HH1 to HHN−1, excluding the last region HHN; assigning said calculated average values to the components of said characteristic vector, respectively; assigning the M values of said last region HHN to the components of said characteristic vector; combining said N−1 average values and said M values so as to generate an (M+N−1)-dimensional characteristic vector; and quantizing all values of said generated characteristic vector into binary values so as to generate a final (M+N−1)-dimensional binary characteristic vector.
7. The method of claim 1 , wherein said step (f) comprises the steps of: applying predetermined weights to the i-th dimensions of said characteristic vector generated in said step (c) and of said previously registered characteristic vector; calculating the inner product S of said two weighted characteristic vectors; and determining said user as an enrollee if said inner product S is more than a verification reference value C.
8. The method of claim 1 , wherein said image acquisition device comprises a halogen lamp.
Applications Claiming Priority (2)
Application Number | Priority Date | Filing Date | Title |
---|---|---|---|
KR10-2001-0011440A KR100374707B1 (en) | 2001-03-06 | 2001-03-06 | Method of recognizing human iris using daubechies wavelet transform |
KR2001-11440 | 2001-03-06 |
Publications (1)
Publication Number | Publication Date |
---|---|
US20020150281A1 true US20020150281A1 (en) | 2002-10-17 |
Family
ID=19706518
Family Applications (3)
Application Number | Title | Priority Date | Filing Date |
---|---|---|---|
US09/946,714 Abandoned US20020150281A1 (en) | 2001-03-06 | 2001-09-05 | Method of recognizing human iris using daubechies wavelet transform |
US10/656,885 Expired - Fee Related US7302087B2 (en) | 2001-03-06 | 2003-09-05 | Daubechies wavelet transform of iris image data for use with iris recognition system |
US11/941,019 Abandoned US20100290676A1 (en) | 2001-03-06 | 2007-11-15 | Daubechies wavelet transform of iris image data for use with iris recognition system |
Family Applications After (2)
Application Number | Title | Priority Date | Filing Date |
---|---|---|---|
US10/656,885 Expired - Fee Related US7302087B2 (en) | 2001-03-06 | 2003-09-05 | Daubechies wavelet transform of iris image data for use with iris recognition system |
US11/941,019 Abandoned US20100290676A1 (en) | 2001-03-06 | 2007-11-15 | Daubechies wavelet transform of iris image data for use with iris recognition system |
Country Status (6)
Country | Link |
---|---|
US (3) | US20020150281A1 (en) |
EP (1) | EP1374145A4 (en) |
JP (2) | JP2004527832A (en) |
KR (1) | KR100374707B1 (en) |
CN (1) | CN1258733C (en) |
WO (1) | WO2002071317A1 (en) |
Cited By (32)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
US20040071300A1 (en) * | 2002-10-10 | 2004-04-15 | Texas Instruments Incorporated | Sharing wavelet domain components among encoded signals |
WO2004084726A1 (en) * | 2003-03-25 | 2004-10-07 | Bausch & Lomb Incorporated | Positive patient identification |
ES2224838A1 (en) * | 2003-02-21 | 2005-03-01 | Universidad Politecnica De Madrid | System for biometric identification of people, by analyzing iris, for use in access control in e.g. buildings, has determination unit determining signature of iris, and comparison unit comparing captured image with stored image |
US20060222212A1 (en) * | 2005-04-05 | 2006-10-05 | Yingzi Du | One-dimensional iris signature generation system and method |
CN100351852C (en) * | 2006-07-11 | 2007-11-28 | 电子科技大学 | Iris recognition method based on wavelet transform and maximum detection |
CN100373396C (en) * | 2006-06-27 | 2008-03-05 | 电子科技大学 | Iris identification method based on image segmentation and two-dimensional wavelet transformation |
US20080159600A1 (en) * | 2001-03-06 | 2008-07-03 | Senga Advisors, Llc. | Iris image data processing for use with iris recognition system |
US20090324064A1 (en) * | 2006-08-02 | 2009-12-31 | Japan Science And Technology Agency | Image feature extraction method and image compression method |
US20100002913A1 (en) * | 2005-01-26 | 2010-01-07 | Honeywell International Inc. | distance iris recognition |
US7761453B2 (en) | 2005-01-26 | 2010-07-20 | Honeywell International Inc. | Method and system for indexing and searching an iris image database |
US20100260390A1 (en) * | 2005-11-30 | 2010-10-14 | The Research Foundation Of State University Of New York | System and method for reduction of false positives during computer aided polyp detection |
US20100290676A1 (en) * | 2001-03-06 | 2010-11-18 | Senga Advisors, Llc | Daubechies wavelet transform of iris image data for use with iris recognition system |
US7933507B2 (en) | 2006-03-03 | 2011-04-26 | Honeywell International Inc. | Single lens splitter camera |
US8045764B2 (en) | 2005-01-26 | 2011-10-25 | Honeywell International Inc. | Expedient encoding system |
US8049812B2 (en) | 2006-03-03 | 2011-11-01 | Honeywell International Inc. | Camera with auto focus capability |
US8050463B2 (en) | 2005-01-26 | 2011-11-01 | Honeywell International Inc. | Iris recognition system having image quality metrics |
US8063889B2 (en) | 2007-04-25 | 2011-11-22 | Honeywell International Inc. | Biometric data collection system |
US8064647B2 (en) | 2006-03-03 | 2011-11-22 | Honeywell International Inc. | System for iris detection tracking and recognition at a distance |
US8085993B2 (en) | 2006-03-03 | 2011-12-27 | Honeywell International Inc. | Modular biometrics collection system architecture |
US8090246B2 (en) | 2008-08-08 | 2012-01-03 | Honeywell International Inc. | Image acquisition system |
US8090157B2 (en) | 2005-01-26 | 2012-01-03 | Honeywell International Inc. | Approaches and apparatus for eye detection in a digital image |
US8098901B2 (en) | 2005-01-26 | 2012-01-17 | Honeywell International Inc. | Standoff iris recognition system |
US8213782B2 (en) | 2008-08-07 | 2012-07-03 | Honeywell International Inc. | Predictive autofocusing system |
US8280119B2 (en) | 2008-12-05 | 2012-10-02 | Honeywell International Inc. | Iris recognition system using quality metrics |
US8436907B2 (en) | 2008-05-09 | 2013-05-07 | Honeywell International Inc. | Heterogeneous video capturing system |
US8442276B2 (en) | 2006-03-03 | 2013-05-14 | Honeywell International Inc. | Invariant radial iris segmentation |
US8472681B2 (en) | 2009-06-15 | 2013-06-25 | Honeywell International Inc. | Iris and ocular recognition system using trace transforms |
US8630464B2 (en) | 2009-06-15 | 2014-01-14 | Honeywell International Inc. | Adaptive iris matching using database indexing |
US8705808B2 (en) | 2003-09-05 | 2014-04-22 | Honeywell International Inc. | Combined face and iris recognition system |
US8742887B2 (en) | 2010-09-03 | 2014-06-03 | Honeywell International Inc. | Biometric visitor check system |
US10523668B2 (en) | 2016-04-04 | 2019-12-31 | Nhn Payco Corporation | Authentication method with enhanced security based on eye recognition and authentication system thereof |
US20220004758A1 (en) * | 2015-10-16 | 2022-01-06 | Magic Leap, Inc. | Eye pose identification using eye features |
Families Citing this family (26)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
KR100476406B1 (en) * | 2002-12-03 | 2005-03-17 | 이일병 | Iris identification system and method using wavelet packet transformation, and storage media having program thereof |
WO2005008590A1 (en) * | 2003-07-17 | 2005-01-27 | Matsushita Electric Industrial Co.,Ltd. | Iris code generation method, individual authentication method, iris code entry device, individual authentication device, and individual certification program |
US7336806B2 (en) * | 2004-03-22 | 2008-02-26 | Microsoft Corporation | Iris-based biometric identification |
GB0412175D0 (en) * | 2004-06-01 | 2004-06-30 | Smart Sensors Ltd | Identification of image characteristics |
GB0427737D0 (en) * | 2004-12-17 | 2005-01-19 | Univ Cambridge Tech | Method of identifying features within a dataset |
CN1324518C (en) * | 2005-04-07 | 2007-07-04 | 上海邦震科技发展有限公司 | Iris geometrical property extracting method based on property edge distribution |
KR20060111812A (en) * | 2005-04-25 | 2006-10-30 | 자바정보기술 주식회사 | Iris cognition method using comparison area embossed peculiar characteristic of iris pattern |
JP4664147B2 (en) * | 2005-07-29 | 2011-04-06 | 株式会社山武 | Iris authentication device |
KR100734857B1 (en) | 2005-12-07 | 2007-07-03 | 한국전자통신연구원 | Method for verifying iris using CPA (Change Point Analysis) based on cumulative sum and apparatus thereof |
CN101093538B (en) * | 2006-06-19 | 2011-03-30 | 电子科技大学 | Method for identifying iris based on zero crossing indication of wavelet transforms |
JP2008090483A (en) * | 2006-09-29 | 2008-04-17 | Oki Electric Ind Co Ltd | Personal identification system and personal identification method |
US9846739B2 (en) | 2006-10-23 | 2017-12-19 | Fotonation Limited | Fast database matching |
US7809747B2 (en) * | 2006-10-23 | 2010-10-05 | Donald Martin Monro | Fuzzy database matching |
US20100202669A1 (en) * | 2007-09-24 | 2010-08-12 | University Of Notre Dame Du Lac | Iris recognition using consistency information |
JP2009080522A (en) * | 2007-09-25 | 2009-04-16 | Mitsubishi Electric Corp | Object image recognition device |
US20100278394A1 (en) * | 2008-10-29 | 2010-11-04 | Raguin Daniel H | Apparatus for Iris Capture |
US8317325B2 (en) | 2008-10-31 | 2012-11-27 | Cross Match Technologies, Inc. | Apparatus and method for two eye imaging for iris identification |
US8577094B2 (en) | 2010-04-09 | 2013-11-05 | Donald Martin Monro | Image template masking |
CN102314731A (en) * | 2010-07-06 | 2012-01-11 | 中国银联股份有限公司 | Mobile payment method and equipment for implementing same |
US9412022B2 (en) * | 2012-09-06 | 2016-08-09 | Leonard Flom | Iris identification system and method |
CN102902967B (en) * | 2012-10-16 | 2015-03-11 | 第三眼(天津)生物识别科技有限公司 | Method for positioning iris and pupil based on eye structure classification |
KR20150003573A (en) * | 2013-07-01 | 2015-01-09 | 한국전자통신연구원 | Method and apparatus for extracting pattern of image |
US10698918B2 (en) * | 2013-11-20 | 2020-06-30 | Qliktech International Ab | Methods and systems for wavelet based representation |
KR101476173B1 (en) * | 2013-11-28 | 2014-12-24 | 서강대학교산학협력단 | User authentication method and system using iris characteristic |
US9928422B2 (en) * | 2014-10-15 | 2018-03-27 | Samsung Electronics Co., Ltd. | User terminal apparatus and IRIS recognition method thereof |
CN109034206A (en) * | 2018-06-29 | 2018-12-18 | 泰康保险集团股份有限公司 | Image classification recognition methods, device, electronic equipment and computer-readable medium |
Citations (5)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
US5291560A (en) * | 1991-07-15 | 1994-03-01 | Iri Scan Incorporated | Biometric personal identification system based on iris analysis |
US6090051A (en) * | 1999-03-03 | 2000-07-18 | Marshall; Sandra P. | Method and apparatus for eye tracking and monitoring pupil dilation to evaluate cognitive activity |
US6247813B1 (en) * | 1999-04-09 | 2001-06-19 | Iritech, Inc. | Iris identification system and method of identifying a person through iris recognition |
US6424727B1 (en) * | 1998-11-25 | 2002-07-23 | Iridian Technologies, Inc. | System and method of animal identification and animal transaction authorization using iris patterns |
US6643406B1 (en) * | 1999-07-28 | 2003-11-04 | Polaroid Corporation | Method and apparatus for performing linear filtering in wavelet based domain |
Family Cites Families (7)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
US5572596A (en) * | 1994-09-02 | 1996-11-05 | David Sarnoff Research Center, Inc. | Automated, non-invasive iris recognition system and method |
JP3610234B2 (en) * | 1998-07-17 | 2005-01-12 | 株式会社メディア・テクノロジー | Iris information acquisition device and iris identification device |
KR20010006975A (en) * | 1999-04-09 | 2001-01-26 | 김대훈 | A method for identifying the iris of persons based on the reaction of the pupil and autonomous nervous wreath |
KR20020065249A (en) * | 2001-02-06 | 2002-08-13 | 이승재 | Human Iris Verification Using Similarity between Feature Vectors |
KR100374707B1 (en) * | 2001-03-06 | 2003-03-04 | 에버미디어 주식회사 | Method of recognizing human iris using daubechies wavelet transform |
KR100453943B1 (en) * | 2001-12-03 | 2004-10-20 | 주식회사 세넥스테크놀로지 | Iris image processing recognizing method and system for personal identification |
US8023699B2 (en) * | 2007-03-09 | 2011-09-20 | Jiris Co., Ltd. | Iris recognition system, a method thereof, and an encryption system using the same |
- 2001
  - 2001-03-06 KR KR10-2001-0011440A patent/KR100374707B1/en not_active IP Right Cessation
  - 2001-07-31 WO PCT/KR2001/001303 patent/WO2002071317A1/en not_active Application Discontinuation
  - 2001-07-31 CN CNB01822993XA patent/CN1258733C/en not_active Expired - Fee Related
  - 2001-07-31 EP EP01953361A patent/EP1374145A4/en not_active Withdrawn
  - 2001-07-31 JP JP2002570166A patent/JP2004527832A/en active Pending
  - 2001-08-10 JP JP2001243653A patent/JP2002269564A/en active Pending
  - 2001-09-05 US US09/946,714 patent/US20020150281A1/en not_active Abandoned
- 2003
  - 2003-09-05 US US10/656,885 patent/US7302087B2/en not_active Expired - Fee Related
- 2007
  - 2007-11-15 US US11/941,019 patent/US20100290676A1/en not_active Abandoned
Cited By (41)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
US20100290676A1 (en) * | 2001-03-06 | 2010-11-18 | Senga Advisors, Llc | Daubechies wavelet transform of iris image data for use with iris recognition system |
US20080159600A1 (en) * | 2001-03-06 | 2008-07-03 | Senga Advisors, Llc. | Iris image data processing for use with iris recognition system |
US20040071300A1 (en) * | 2002-10-10 | 2004-04-15 | Texas Instruments Incorporated | Sharing wavelet domain components among encoded signals |
US7890335B2 (en) * | 2002-10-10 | 2011-02-15 | Texas Instruments Incorporated | Sharing wavelet domain components among encoded signals |
ES2224838A1 (en) * | 2003-02-21 | 2005-03-01 | Universidad Politecnica De Madrid | System for biometric identification of people by iris analysis, for use in access control (e.g., in buildings); a determination unit computes the iris signature, and a comparison unit matches the captured image against a stored image |
WO2004084726A1 (en) * | 2003-03-25 | 2004-10-07 | Bausch & Lomb Incorporated | Positive patient identification |
KR101126017B1 (en) * | 2003-03-25 | 2012-03-19 | 보오슈 앤드 롬 인코포레이팃드 | Positive patient identification |
CN100403981C (en) * | 2003-03-25 | 2008-07-23 | 博士伦公司 | Positive patient identification |
US7436986B2 (en) | 2003-03-25 | 2008-10-14 | Bausch & Lomb Incorporated | Positive patient identification |
US8705808B2 (en) | 2003-09-05 | 2014-04-22 | Honeywell International Inc. | Combined face and iris recognition system |
US8488846B2 (en) | 2005-01-26 | 2013-07-16 | Honeywell International Inc. | Expedient encoding system |
US7761453B2 (en) | 2005-01-26 | 2010-07-20 | Honeywell International Inc. | Method and system for indexing and searching an iris image database |
US20100002913A1 (en) * | 2005-01-26 | 2010-01-07 | Honeywell International Inc. | distance iris recognition |
US8285005B2 (en) * | 2005-01-26 | 2012-10-09 | Honeywell International Inc. | Distance iris recognition |
US8045764B2 (en) | 2005-01-26 | 2011-10-25 | Honeywell International Inc. | Expedient encoding system |
US8098901B2 (en) | 2005-01-26 | 2012-01-17 | Honeywell International Inc. | Standoff iris recognition system |
US8050463B2 (en) | 2005-01-26 | 2011-11-01 | Honeywell International Inc. | Iris recognition system having image quality metrics |
US8090157B2 (en) | 2005-01-26 | 2012-01-03 | Honeywell International Inc. | Approaches and apparatus for eye detection in a digital image |
US20060222212A1 (en) * | 2005-04-05 | 2006-10-05 | Yingzi Du | One-dimensional iris signature generation system and method |
US20100260390A1 (en) * | 2005-11-30 | 2010-10-14 | The Research Foundation Of State University Of New York | System and method for reduction of false positives during computer aided polyp detection |
US8064647B2 (en) | 2006-03-03 | 2011-11-22 | Honeywell International Inc. | System for iris detection tracking and recognition at a distance |
US8442276B2 (en) | 2006-03-03 | 2013-05-14 | Honeywell International Inc. | Invariant radial iris segmentation |
US8761458B2 (en) | 2006-03-03 | 2014-06-24 | Honeywell International Inc. | System for iris detection, tracking and recognition at a distance |
US8049812B2 (en) | 2006-03-03 | 2011-11-01 | Honeywell International Inc. | Camera with auto focus capability |
US7933507B2 (en) | 2006-03-03 | 2011-04-26 | Honeywell International Inc. | Single lens splitter camera |
US8085993B2 (en) | 2006-03-03 | 2011-12-27 | Honeywell International Inc. | Modular biometrics collection system architecture |
CN100373396C (en) * | 2006-06-27 | 2008-03-05 | 电子科技大学 | Iris identification method based on image segmentation and two-dimensional wavelet transformation |
CN100351852C (en) * | 2006-07-11 | 2007-11-28 | 电子科技大学 | Iris recognition method based on wavelet transform and maximum detection |
US8160368B2 (en) * | 2006-08-02 | 2012-04-17 | Japan Science And Technology Agency | Image feature extraction method and image compression method |
US20090324064A1 (en) * | 2006-08-02 | 2009-12-31 | Japan Science And Technology Agency | Image feature extraction method and image compression method |
US8063889B2 (en) | 2007-04-25 | 2011-11-22 | Honeywell International Inc. | Biometric data collection system |
US8436907B2 (en) | 2008-05-09 | 2013-05-07 | Honeywell International Inc. | Heterogeneous video capturing system |
US8213782B2 (en) | 2008-08-07 | 2012-07-03 | Honeywell International Inc. | Predictive autofocusing system |
US8090246B2 (en) | 2008-08-08 | 2012-01-03 | Honeywell International Inc. | Image acquisition system |
US8280119B2 (en) | 2008-12-05 | 2012-10-02 | Honeywell International Inc. | Iris recognition system using quality metrics |
US8472681B2 (en) | 2009-06-15 | 2013-06-25 | Honeywell International Inc. | Iris and ocular recognition system using trace transforms |
US8630464B2 (en) | 2009-06-15 | 2014-01-14 | Honeywell International Inc. | Adaptive iris matching using database indexing |
US8742887B2 (en) | 2010-09-03 | 2014-06-03 | Honeywell International Inc. | Biometric visitor check system |
US20220004758A1 (en) * | 2015-10-16 | 2022-01-06 | Magic Leap, Inc. | Eye pose identification using eye features |
US11749025B2 (en) * | 2015-10-16 | 2023-09-05 | Magic Leap, Inc. | Eye pose identification using eye features |
US10523668B2 (en) | 2016-04-04 | 2019-12-31 | Nhn Payco Corporation | Authentication method with enhanced security based on eye recognition and authentication system thereof |
Also Published As
Publication number | Publication date |
---|---|
JP2004527832A (en) | 2004-09-09 |
JP2002269564A (en) | 2002-09-20 |
CN1258733C (en) | 2006-06-07 |
CN1493056A (en) | 2004-04-28 |
US20100290676A1 (en) | 2010-11-18 |
EP1374145A1 (en) | 2004-01-02 |
KR100374707B1 (en) | 2003-03-04 |
WO2002071317A1 (en) | 2002-09-12 |
EP1374145A4 (en) | 2006-10-18 |
US7302087B2 (en) | 2007-11-27 |
KR20020071329A (en) | 2002-09-12 |
US20040114781A1 (en) | 2004-06-17 |
Similar Documents
Publication | Publication Date | Title |
---|---|---|
US20020150281A1 (en) | Method of recognizing human iris using daubechies wavelet transform | |
Sanchez-Avila et al. | Two different approaches for iris recognition using Gabor filters and multiscale zero-crossing representation | |
US6243492B1 (en) | Image feature extractor, an image feature analyzer and an image matching system | |
JPH1055444A (en) | Recognition of face using feature vector with dct as base | |
Dey et al. | Improved feature processing for iris biometric authentication system | |
CN107798308B (en) | Face recognition method based on short video training method | |
CN100369047C (en) | Image identifying method based on Gabor phase mode | |
Zuobin et al. | Feature regrouping for cca-based feature fusion and extraction through normalized cut | |
Chirchi et al. | Feature extraction and pupil detection algorithm used for iris biometric authentication system | |
CN106845445A (en) | A kind of personal identification method based on wireless network and iris recognition | |
Monadjemi et al. | Experiments on high resolution images towards outdoor scene classification | |
Murty et al. | Iris recognition system using fractal dimensions of Haar patterns | |
Lee et al. | Fingerprint recognition using principal Gabor basis function | |
Fathee et al. | Efficient Unconstrained Iris Recognition System Based on CCT‐Like Mask Filter Bank | |
Ren et al. | A novel method of score level fusion using multiple impressions for fingerprint verification | |
KR20020023011A (en) | Human iris recognition method using harr wavelet transform and lvq | |
Triantafyllou et al. | Iris authentication utilizing co-occurrence matrices and textile features | |
Radouane et al. | Fusion of Gabor filter and steerable pyramid to improve iris recognition system | |
Ali et al. | Development of a enhanced ear recognition system for personal identification | |
Climent et al. | Approximate string matching for iris recognition by means of boosted Gabor wavelets | |
KR20020065249A (en) | Human Iris Verification Using Similarity between Feature Vectors | |
Supriya et al. | Efficient iris recognition by fusion of matching scores obtained by lifting DWT and Log-Gabor methods of feature extraction | |
Mandal et al. | A small scale fingerprint matching scheme using digital curvelet transform | |
Dutta et al. | TR-LBP: A modified Local Binary Pattern-based technique for 3D face recognition | |
RU2310910C1 (en) | Method for verification and identification of imprints of papillary patterns |
Legal Events
Date | Code | Title | Description |
---|---|---|---|
AS | Assignment |
Owner name: EVERMEDIA CO., LTD., KOREA, REPUBLIC OF Free format text: ASSIGNMENT OF ASSIGNORS INTEREST;ASSIGNOR:CHO, SEONG-WON;REEL/FRAME:012258/0232 Effective date: 20010810 |
STCB | Information on status: application discontinuation |
Free format text: ABANDONED -- FAILURE TO PAY ISSUE FEE |