WO2024095462A1 - Comparison score calculation method, comparison score calculation device, comparison score calculation system, and comparison score calculation program

Info

Publication number
WO2024095462A1
Authority
WO
WIPO (PCT)
Prior art keywords
feature point
matching
registered
feature
matching score
Application number
PCT/JP2022/041196
Other languages
French (fr)
Japanese (ja)
Inventor
Takahiro Aoki (青木 隆浩)
Original Assignee
Fujitsu Limited (富士通株式会社)
Application filed by Fujitsu Limited (富士通株式会社)

Classifications

    • G PHYSICS
    • G06 COMPUTING; CALCULATING OR COUNTING
    • G06T IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
    • G06T 7/00 Image analysis

Description

  • the present invention relates to a matching score calculation method, a matching score calculation device, a matching score calculation system, and a matching score calculation program.
  • In biometric authentication and other fields, matching methods using feature points are commonly used.
  • characteristic points are extracted from a biometric image as feature points, and matching is performed using feature amounts calculated from pixel values near the feature points.
  • Biometric images include fingerprint images and palm vein images.
  • FIG. 1 shows examples of feature points and feature amounts in a biometric image.
  • FIG. 1(a) shows examples of feature points.
  • Curve 101 represents, for example, a fingerprint in a fingerprint image, or veins in a palm vein image.
  • branch point 111, end point 112, and end point 113 of curve 101 are extracted as feature point A1, feature point A2, and feature point A3, respectively.
  • FIG. 1(b) shows an example of feature data generated from the biometric image of FIG. 1(a).
  • the feature data includes coordinates (X, Y) indicating the position of each feature point and the feature amount of each feature point.
  • the feature amount is represented by, for example, a vector.
  • matching processing is performed using target data and registered data.
  • the target data is feature data generated from a biometric image of the target person
  • the registered data is feature data generated from a biometric image of the enrolled person.
  • the coordinates (X, Y) of the feature points are used to compare the feature amounts of the matching feature points contained in the target data with the feature amounts of the registered feature points contained in the registered data, and a matching score indicating the comparison result is calculated.
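As an illustrative sketch of this comparison (the data layout, the nearest-neighbour pairing rule, and all names here are assumptions for illustration, not taken from the patent):

```python
def nearest(point, candidates):
    """Return the candidate feature point closest in (X, Y) coordinates."""
    return min(candidates,
               key=lambda c: (c["x"] - point["x"]) ** 2 + (c["y"] - point["y"]) ** 2)

def matching_score(target_data, registered_data):
    """Average inner product between each matching feature vector and the
    feature vector of the positionally closest registered feature point."""
    scores = []
    for t in target_data:
        r = nearest(t, registered_data)
        scores.append(sum(a * b for a, b in zip(t["vec"], r["vec"])))
    return sum(scores) / len(scores)

# one matching feature point, two registered feature points (toy values)
target = [{"x": 10, "y": 20, "vec": [0.6, 0.8]}]
registered = [{"x": 11, "y": 19, "vec": [0.6, 0.8]},
              {"x": 90, "y": 5, "vec": [1.0, 0.0]}]
print(matching_score(target, registered))  # close to 1.0 for matching unit vectors
```

The inner product is used here as the similarity; a distance-based dissimilarity works symmetrically.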
  • an authentication device that achieves highly accurate biometric authentication while reducing memory usage is known (see, for example, Patent Document 1).
  • a data generation method that suppresses accuracy degradation in biometric authentication using partial data is also known (see, for example, Patent Document 2).
  • a biometric image processing device that improves authentication accuracy in biometric authentication that is performed by binarizing feature amounts acquired from a biometric image is also known (see, for example, Patent Document 3).
  • Patent Document 1: JP 2006-39951 A; Patent Document 2: International Publication No. WO 2022/074786; Patent Document 3: JP 2019-45969 A
  • However, when a rotation error remains in the biometric image after normalization, the accuracy of the matching score based on the comparison between the feature values of matching feature points and the feature values of registered feature points may decrease.
  • this problem is not limited to feature values of feature points extracted from biometric images, but occurs when comparing feature values of various feature points.
  • the present invention aims to improve the accuracy of the matching score between matching features and registered features.
  • In one aspect, when a computer acquires the coordinate information and feature amount information of a matching feature point, it performs a coordinate transformation of either the matching feature point or a registered feature point based on an angle difference calculated using the coordinate information of the matching feature point and the coordinate information of the registered feature point stored in the storage unit. Using the angle difference and the acquired feature amount information of the matching feature point, the computer then calculates a matching score between the feature point obtained by the coordinate transformation of either the matching feature point or the registered feature point and the other of the matching feature point and the registered feature point.
  • FIG. 1 is a diagram showing feature points and feature amounts in a biometric image.
  • FIG. 2 is a diagram illustrating a normalization process for a fingerprint image.
  • FIG. 3 is a diagram illustrating a normalization process for a palm vein image.
  • FIG. 4 is a diagram illustrating a matching process using alignment search.
  • FIG. 5 is a diagram showing a correspondence relationship between positions of feature points.
  • FIG. 6 is a diagram illustrating a comparison of feature amounts.
  • FIG. 7 is a functional configuration diagram of a matching score calculation device according to an embodiment.
  • FIG. 8 is a flowchart of a matching score calculation process.
  • FIG. 9 is a functional configuration diagram of an entrance/exit management system.
  • FIG. 10 is a flowchart of a first biometric authentication process.
  • FIG. 11 is a flowchart of a search process.
  • FIG. 12 is a flowchart of a calculation process.
  • FIG. 13 is a diagram showing a matching score using the Hamming distance.
  • FIG. 14 is a diagram illustrating encryption of binary data.
  • FIG. 15 is a diagram showing the relationship between α(θ, n, n) and the rotation angle θ.
  • FIG. 16 is a functional configuration diagram of a flapper gate management system.
  • FIG. 17 is a hardware configuration diagram of an information processing device.
  • Biometric images input for biometric authentication often contain posture variations. Posture variations such as rotation or positional shift occur because the posture of the person to be matched or the enrollee is unstable when the biometric image is captured. For this reason, normalization processing is performed in biometric authentication to correct rotation, positional shift, and the like.
  • FIG. 2 shows an example of normalization processing for a fingerprint image.
  • Fingerprint image 201 is converted into fingerprint image 202 by performing corrections such as rotation or positional shift based on information such as the outline of the finger or the center coordinates of the fingerprint in fingerprint image 201.
  • FIG. 3 shows an example of normalization processing for a palm vein image.
  • palm vein image 301 is converted into palm vein image 302.
  • When a rotation error remains after normalization, an alignment search is performed to absorb the rotation error.
  • a coordinate transformation T is applied to the data to be matched, and the data to be matched after the coordinate transformation is compared with the registered data.
  • Figure 4 shows an example of matching processing using alignment search.
  • Matching target data 402 is generated by applying coordinate transformation T to matching target data 401, and matching target data 402 is compared with registered data 403.
  • Coordinate transformation T includes rotation, translation, etc.
  • Coordinate transformation T can be calculated by various methods. For example, a brute force search that tries multiple coordinate transformations in sequence may be used, or coordinate transformation T may be calculated from the correspondence between multiple feature points.
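A brute-force search of this kind can be sketched as follows (the candidate grid, helper names, and point values are illustrative assumptions, not taken from the patent):

```python
import math

def transform(points, theta, dx, dy):
    """Rotate each (x, y) point by theta (radians) and translate by (dx, dy)."""
    c, s = math.cos(theta), math.sin(theta)
    return [(x * c - y * s + dx, x * s + y * c + dy) for x, y in points]

def brute_force_T(target_pts, registered_pts, angles, shifts):
    """Try each candidate (theta, dx, dy) in sequence and keep the one that
    minimizes the summed squared distance between corresponding points."""
    best, best_err = None, float("inf")
    for theta in angles:
        for dx, dy in shifts:
            moved = transform(target_pts, theta, dx, dy)
            err = sum((mx - rx) ** 2 + (my - ry) ** 2
                      for (mx, my), (rx, ry) in zip(moved, registered_pts))
            if err < best_err:
                best, best_err = (theta, dx, dy), err
    return best

# registered points are the target points rotated by 0.1 rad (toy example)
pts = [(10.0, 0.0), (0.0, 10.0), (5.0, 5.0)]
reg = transform(pts, 0.1, 0.0, 0.0)
angles = [i * 0.05 for i in range(-4, 5)]        # candidates -0.2 .. 0.2 rad
shifts = [(0.0, 0.0)]
print(brute_force_T(pts, reg, angles, shifts))   # theta ≈ 0.1
```

The alternative, calculating T directly from feature point correspondences, is illustrated later in the least-squares discussion.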
  • Figure 5 shows an example of the correspondence relationship between the positions of feature points in a biometric image.
  • the person being matched is the same person as the registrant.
  • the feature amount at position 512-i in palm vein image 502 is compared with the feature amount at position 511-i in palm vein image 501.
  • FIG. 6 shows an example of a comparison of features.
  • Image 601 is an image of a region corresponding to position 511-i in palm vein image 501 in FIG. 5, and image 602 is an image of a region corresponding to position 512-i in palm vein image 502.
  • Image 601 has no rotation error, and image 602 has a rotation error.
  • Feature point 621 is extracted from region 611 of image 601, and feature vector 641 representing the feature amount of feature point 621 is generated using image data 631 of region 611. Also, feature point 622 is extracted from region 612 of image 602, and feature vector 642 representing the feature amount of feature point 622 is generated using image data 632 of region 612.
  • feature vector 642 differs from feature vector 641, and the accuracy of the matching score decreases.
  • the person being matched is more likely to be determined to be a person different from the registrant, and the authentication accuracy decreases.
  • If the biometric image is rotated by the angle difference θ and the features are extracted again, the rotation error can be absorbed, but the processing time increases because the features must be extracted again.
  • the features are generally extracted by the client, but the angle difference θ is calculated by a matching process on the server. Therefore, after the angle difference θ is sent from the server to the client, the features are extracted again by the client. This adds communication time, further increasing the processing time. Also, the authentication protocol becomes very complicated.
  • the client will retain the biometric images in memory from the time it acquires the biometric image of each person to be matched until it receives the angle difference θ sent by the server, which will increase memory consumption.
  • an operational system that stores biometric images of subjects for matching in the client's memory for long periods of time is not a desirable configuration.
  • FIG. 7 shows an example of the functional configuration of a matching score calculation device according to an embodiment.
  • the matching score calculation device 701 in FIG. 7 includes a storage unit 711, a coordinate transformation unit 712, and a matching score calculation unit 713.
  • the storage unit 711 stores coordinate information of registered feature points.
  • the coordinate conversion unit 712 and the matching score calculation unit 713 perform a matching score calculation process.
  • FIG. 8 is a flowchart showing an example of the matching score calculation process performed by the matching score calculation device 701 in FIG. 7.
  • the coordinate transformation unit 712 performs coordinate transformation of either the matching feature point or the registered feature point based on the angle difference calculated using the coordinate information of the matching feature point and the coordinate information of the registered feature point (step 801).
  • the matching score calculation unit 713 uses the angle difference and the feature amount information of the acquired matching feature point to calculate a feature point obtained by coordinate transformation of either the matching feature point or the registered feature point, and a matching score between the other of the matching feature point and the registered feature point (step 802).
  • the matching score calculation device 701 in FIG. 7 can improve the accuracy of the matching score between the matching feature points and the registered feature points.
  • FIG. 9 shows an example of the functional configuration of an entrance/exit management system including the matching score calculation device 701 of FIG. 7.
  • the entrance/exit management system of FIG. 9 includes a matching score calculation device 901 and a control device 902.
  • the matching score calculation device 901 corresponds to the matching score calculation device 701 of FIG. 7.
  • the matching score calculation device 901 calculates a matching score using matching target data generated from a biometric image of the person to be matched, performs biometric authentication on the person to be matched using the matching score, and transmits the authentication result to the control device 902.
  • the control device 902 controls the opening and closing of the door based on the authentication result. For example, if the authentication result indicates success, the control device 902 controls the door to open, and if the authentication result indicates failure, the control device 902 controls the door to close.
  • the matching score calculation device 901 includes a biometric image acquisition unit 911, a feature extraction unit 912, a coordinate transformation calculation unit 913, a coordinate transformation unit 914, a matching score calculation unit 915, a communication unit 916, and a storage unit 917.
  • the coordinate transformation unit 914, the matching score calculation unit 915, and the storage unit 917 correspond to the coordinate transformation unit 712, the matching score calculation unit 713, and the storage unit 711 in FIG. 7, respectively.
  • the storage unit 917 stores a registration template 921 and fitting information 922.
  • the registration template 921 includes registered data for each of a plurality of registrants, and the registered data for each registrant includes the coordinates and feature vector of each of a plurality of feature points.
  • Each feature point included in the registered data is an example of a registered feature point, and is a feature point extracted from a biometric image of each registrant.
  • the coordinates of each feature point included in the registered data are an example of coordinate information of the registered feature point.
  • the fitting information 922 is information for calculating correction parameters for the feature amounts from the angle difference θ between the data to be matched and the registered data.
  • the correction parameters will be described later.
  • the biometric image acquisition unit 911 acquires a biometric image 923 of the person to be matched and stores it in the storage unit 917.
  • the biometric image 923 is, for example, a fingerprint image, a palm vein image, a palm print image, or a face image.
  • the biometric image acquisition unit 911 is, for example, a fingerprint sensor, a vein sensor, or an image sensor. By using a palm vein image as the biometric image 923, palm vein authentication can be performed.
  • the feature extraction unit 912 extracts multiple feature points from the biometric image 923 and calculates a feature vector from pixel values near each feature point.
  • the feature extraction unit 912 then generates matching target data 924 including the coordinates and feature vectors of each of the multiple feature points and stores it in the storage unit 917.
  • Each feature point extracted from the biometric image 923 is an example of a matching feature point
  • the coordinates of each feature point are an example of coordinate information of a matching feature point
  • the feature vector of each feature point is an example of feature amount information of a matching feature point.
  • the feature extraction unit 912 can calculate feature vectors using, for example, a transformation based on an orthonormal basis.
  • transformations include Principal Component Analysis (PCA), discrete cosine transform (DCT), and fast Fourier transform (FFT).
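As one concrete example of such an orthonormal transformation, here is a minimal pure-Python 2-D DCT-II sketch (illustrative only; a real implementation would use an optimized library, and the basis size and patch values are assumptions):

```python
import math

def dct2(patch):
    """2-D DCT-II with orthonormal scaling; `patch` is a square grid of
    pixel values. With an orthonormal basis, the transform preserves
    inner products and energy (Parseval's relation)."""
    n = len(patch)
    def alpha(k):
        return math.sqrt(1.0 / n) if k == 0 else math.sqrt(2.0 / n)
    out = [[0.0] * n for _ in range(n)]
    for u in range(n):
        for v in range(n):
            s = sum(patch[x][y]
                    * math.cos((2 * x + 1) * u * math.pi / (2 * n))
                    * math.cos((2 * y + 1) * v * math.pi / (2 * n))
                    for x in range(n) for y in range(n))
            out[u][v] = alpha(u) * alpha(v) * s
    return out

patch = [[10.0, 12.0], [14.0, 16.0]]      # toy 2x2 pixel patch
coeffs = dct2(patch)
print(coeffs[0][0])  # DC coefficient: patch sum / n for this orthonormal scaling
```

The low-order coefficients can then be collected into a feature vector; because the basis is orthonormal, vector distances between such feature vectors reflect distances between the underlying patches.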
  • the coordinate transformation calculation unit 913 calculates a coordinate transformation T for matching the matching target data 924 with the registered data of each registered person in the registered template 921.
  • the coordinate transformation T includes, for example, a rotation angle θ and a translation amount (ΔX, ΔY) in the X direction and Y direction.
  • the rotation angle θ corresponds to the angle difference calculated using the coordinate information of the matching feature point and the coordinate information of the registered feature point.
  • the coordinate transformation unit 914 uses coordinate transformation T to transform the coordinates of each feature point contained in the matching target data 924.
  • the matching score calculation unit 915 corrects the feature vector of each feature point included in the matching target data 924 using the rotation angle θ included in the coordinate transformation T and the fitting information 922. Next, the matching score calculation unit 915 calculates a matching score 925 between the matching target data 924 and the registered data using the coordinates of each feature point after the coordinate transformation, the feature vector of each feature point after the correction, and the registered data of each registered person included in the registration template 921. Then, the matching score calculation unit 915 stores the matching score 925 in the storage unit 917.
  • the matching score calculation unit 915 performs biometric authentication on the person to be matched using the matching score 925 and generates an authentication result.
  • the communication unit 916 transmits the authentication result to the control device 902.
  • the coordinate transformation calculation unit 913 calculates the coordinate transformation T from the correspondence between multiple feature points included in the matching target data 924 and multiple feature points included in the registered data of each registered person.
  • the coordinate transformation calculation unit 913 generates feature point pairs that represent combinations of each feature point included in the matching target data 924 and each feature point included in the registered data. Next, for each feature point pair, the coordinate transformation calculation unit 913 calculates a feature point score between the feature vector of the feature point included in the matching target data 924 and the feature vector of the feature point included in the registered data. Then, the coordinate transformation calculation unit 913 selects the top X (X is an integer of 2 or more) feature point pairs in order from the best feature point score.
  • the feature point score can be the similarity or dissimilarity of two feature vectors.
  • the similarity can be the inner product of two feature vectors
  • the dissimilarity can be the inter-vector distance of two feature vectors.
  • Using two feature point pairs (P1, Q1) and (P2, Q2) selected from the top X feature point pairs, the coordinate transformation calculation unit 913 uses the least squares method to find the coordinate transformation Ti that minimizes the distance between P1 and Q1 and the distance between P2 and Q2 after the coordinate transformation.
  • the coordinate transformation Ti, which includes the rotation angle θ and the translation amount (ΔX, ΔY), is expressed by the following formula:
  • the coordinate transformation calculation unit 913 calculates θ, ΔX, and ΔY that satisfy formula (5) by the least squares method. To find the matrix A that minimizes the error by the least squares method, the pseudo-inverse matrix B⁻¹ of the matrix B is found, and the matrix A is calculated by the following formula.
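The exact matrix forms of formulas (5) and (6) are not reproduced in this excerpt, so the following is an illustrative reconstruction of the least-squares step under a small-angle linearization assumption (cos θ ≈ 1, sin θ ≈ θ); the helper names and toy point pairs are assumptions:

```python
def estimate_transform(pairs):
    """Least-squares estimate of A = (theta, dX, dY) from feature point pairs
    [((x, y), (x2, y2)), ...] using the small-angle model:
        x2 ≈ x - y*theta + dX
        y2 ≈ x*theta + y + dY
    Solves the normal equations (B^T B) A = B^T C, which is equivalent to
    applying the pseudo-inverse of the design matrix B."""
    BtB = [[0.0] * 3 for _ in range(3)]   # accumulate B^T B (3x3)
    BtC = [0.0] * 3                       # accumulate B^T C (3x1)
    for (x, y), (x2, y2) in pairs:
        rows = [([-y, 1.0, 0.0], x2 - x),   # x-equation row of B and C
                ([x, 0.0, 1.0], y2 - y)]    # y-equation row of B and C
        for b, c in rows:
            for i in range(3):
                BtC[i] += b[i] * c
                for j in range(3):
                    BtB[i][j] += b[i] * b[j]
    # solve the 3x3 normal equations by Gauss-Jordan elimination with pivoting
    m = [BtB[i] + [BtC[i]] for i in range(3)]
    for col in range(3):
        piv = max(range(col, 3), key=lambda r: abs(m[r][col]))
        m[col], m[piv] = m[piv], m[col]
        for r in range(3):
            if r != col:
                f = m[r][col] / m[col][col]
                m[r] = [a - f * b for a, b in zip(m[r], m[col])]
    return [m[i][3] / m[i][i] for i in range(3)]

# two pairs consistent with theta=0.05 rad, dX=2, dY=-1 under the linear model
pairs = [((10.0, 0.0), (12.0, -0.5)), ((0.0, 10.0), (1.5, 9.0))]
theta, dX, dY = estimate_transform(pairs)
```

Two point pairs give four equations for the three unknowns, which is why the least-squares (pseudo-inverse) formulation is used rather than a direct solve.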
  • the image represented by image vector I corresponds to the image represented by image vector I(θ) rotated by -θ.
  • P(i) represents the i-th basis vector (principal component vector).
  • N is set to an appropriate integer.
  • the right-hand side of equation (10) represents a linear combination of the N basis vectors P(i), and x'(i) represents a coefficient.
  • x'(i) is the i-th element of feature vector x' extracted using P(θ, i).
  • the rotation error contained in x(i) can be cancelled.
  • feature vector x' may be referred to as corrected feature vector x'.
  • Corrected feature vector x' is an example of corrected feature amount information.
  • α(θ, n, i) is the i-th element of the feature vector α(θ, n) obtained by performing feature extraction on P(θ, n).
  • α(θ, n, i) is a parameter that indicates how P(i) changes when it is rotated, and has a unique value for each P(i).
  • α(θ, n, i) corresponds to the correction parameter for the feature amount.
  • x'(n) is expressed as the inner product of feature vector x and feature vector α(θ, n).
  • α(θ, n) is the feature vector of P(θ, n) and can be calculated in advance.
  • α(θ, n) can be obtained numerically by fitting it as a function of θ.
  • α(θ, n, i) is expressed, for example, by the following equation.
  • F(n, i, θ) is a fitting function.
  • parameters such as coefficients that define the functional form of F(n, i, θ) are stored in the storage unit 917 as fitting information 922.
  • When F(n, i, θ) is a quadratic function of θ, α(θ, n, i) is expressed by the following equation.
  • α(θ, n, i) ≈ p1(n,i)θ² + p2(n,i)θ + p3(n,i) (22)
  • p1(n,i), p2(n,i), and p3(n,i) are fitting coefficients and are used as fitting information 922. Substituting equation (22) into equation (20), the following equation is obtained.
  • x'(n) ≈ Σi x(i)(p1(n,i)θ² + p2(n,i)θ + p3(n,i)) (23)
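A minimal sketch of the correction in equations (20), (22), and (23); the fitting tables p1, p2, p3 below are toy illustrative values, not real fitting information:

```python
def corrected_feature(x, p1, p2, p3, theta):
    """Corrected feature vector x' per the quadratic fitting model:
        alpha(theta, n, i) ≈ p1[n][i]*theta**2 + p2[n][i]*theta + p3[n][i]
        x'(n) = sum_i x[i] * alpha(theta, n, i)
    p1, p2, p3 play the role of the stored fitting information."""
    out = []
    for n in range(len(p1)):
        alpha = [p1[n][i] * theta ** 2 + p2[n][i] * theta + p3[n][i]
                 for i in range(len(x))]
        out.append(sum(xi * ai for xi, ai in zip(x, alpha)))
    return out

# toy fitting tables for a 2-element feature vector:
# at theta = 0, alpha is the identity, so x' == x (no correction needed)
p3 = [[1.0, 0.0], [0.0, 1.0]]
p2 = [[0.0, -1.0], [1.0, 0.0]]   # first-order mixing between elements
p1 = [[0.0, 0.0], [0.0, 0.0]]
x = [0.5, 0.25]
print(corrected_feature(x, p1, p2, p3, 0.0))   # [0.5, 0.25]
```

Because the p1/p2/p3 tables are precomputed, the correction at matching time is just a few multiply-adds per feature vector element, with no re-extraction of features from the image.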
  • the corrected feature vector x' is a feature vector for the image vector I that represents an unrotated image, so the accuracy of the matching score can be improved by calculating the matching score using x'(n).
  • the matching score calculation unit 915 calculates α(θ, n, i) using the rotation angle θ included in the coordinate transformation T and the fitting information 922. The matching score calculation unit 915 then calculates a corrected feature vector x' using the feature vector x of each feature point included in the matching target data 924 and α(θ, n, i) according to formula (20).
  • the matching score calculation unit 915 calculates a matching score using the corrected feature vector x' of each feature point included in the matching target data 924 and the feature vector of each feature point included in the registered data of each registrant.
  • the similarity or dissimilarity of two feature vectors can be used.
  • the similarity may be the inner product of two feature vectors, and the dissimilarity may be the vector distance between two feature vectors.
  • the matching score calculated using the corrected feature vector x' corresponds to the feature point obtained by transforming the coordinates of either the matching feature point or the registered feature point, and the matching score between the other of the matching feature point and the registered feature point.
  • FIG. 10 is a flowchart showing an example of the first biometric authentication process performed by the matching score calculation device 901 of FIG. 9.
  • the biometric image acquisition unit 911 acquires a biometric image 923 of the person to be matched (step 1001).
  • the feature extraction unit 912 extracts a plurality of feature points from the biometric image 923, calculates a feature vector for each feature point, and generates matching target data 924 including the coordinates and feature vector for each feature point (step 1002).
  • the coordinate transformation calculation unit 913 generates feature point pairs representing combinations of feature points included in the matching target data 924 and feature points included in the registered data of each registered person, and generates M (M is an integer equal to or greater than 1) combinations of two feature point pairs.
  • the coordinate transformation calculation unit 913 sets a control variable i to 1 (step 1003).
  • the coordinate transformation calculation unit 913 selects the i-th combination from the M combinations, and calculates the i-th coordinate transformation Ti by the least squares method using the coordinates of each feature point included in the two feature point pairs corresponding to the selected combination (step 1004).
  • the coordinate transformation calculation unit 913 then checks the validity of the coordinate transformation Ti (step 1005).
  • If the least squares residual is less than a predetermined value, the coordinate transformation Ti is determined to be valid, and if the least squares residual is equal to or greater than the predetermined value, the coordinate transformation Ti is determined to be invalid.
  • If the coordinate transformation Ti is not valid (step 1005, NO), the coordinate transformation calculation unit 913 increments i by 1 (step 1014), and the matching score calculation device 901 repeats the processing from step 1004 onward.
  • If the coordinate transformation Ti is valid (step 1005, YES), the coordinate transformation unit 914 uses the coordinate transformation Ti to transform the coordinates of each feature point included in the matching target data 924 (step 1006).
  • the matching score calculation unit 915 uses the rotation angle θ included in the coordinate transformation Ti and the fitting information 922 to calculate the corrected feature vector of each feature point included in the matching target data 924 (step 1007).
  • the matching score calculation unit 915 performs a search process to match feature points included in the matching target data 924 with feature points included in the registered data (step 1008). Then, the matching score calculation unit 915 performs a calculation process to calculate a matching score Si for the coordinate transformation Ti from the matching scores of the matched feature point pairs (step 1009). The matching score Si is calculated for each registered person.
  • the matching score calculation unit 915 updates the matching score 925 between the matching target data 924 and the registered data of each registered person (step 1010).
  • At the start of the process, the matching score 925 of each registered person is set to an initial value. If the matching score Si calculated in step 1009 is better than the matching score 925, the matching score calculation unit 915 updates the matching score 925 by setting it to the matching score Si.
  • If the matching score represents the similarity between two feature vectors, a larger matching score is a better value; if the matching score represents the dissimilarity between two feature vectors, a smaller matching score is a better value.
  • the coordinate transformation calculation unit 913 increments i by 1 (step 1011) and compares i with M (step 1012). If i is equal to or smaller than M (step 1012, NO), the matching score calculation device 901 repeats the processes from step 1004 onwards.
  • If i is greater than M (step 1012, YES), the matching score calculation unit 915 performs biometric authentication on the person to be matched using the matching score 925 and generates an authentication result (step 1013).
  • the matching score calculation unit 915 identifies registrants who satisfy a predetermined condition from among multiple registrants registered in the registration template 921.
  • When the matching score represents similarity, the predetermined condition may be that the matching score 925 is greater than a judgment threshold.
  • When the matching score represents dissimilarity, the predetermined condition may be that the matching score 925 is less than a judgment threshold.
  • If a registered person who satisfies the predetermined condition is identified, the matching score calculation unit 915 generates an authentication result indicating successful authentication, and if no registered person who satisfies the predetermined condition is identified, it generates an authentication result indicating unsuccessful authentication. This makes it possible to determine whether or not the person to be matched is registered in the registration template 921.
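The threshold decision can be sketched as follows (the names, score values, and first-match rule are illustrative assumptions):

```python
def authenticate(scores, threshold, higher_is_better=True):
    """Return (success, registrant_id) using the predetermined condition:
    score greater than the judgment threshold for similarity scores, or
    less than the threshold for dissimilarity scores."""
    for registrant_id, score in scores.items():
        ok = score > threshold if higher_is_better else score < threshold
        if ok:
            return True, registrant_id
    return False, None

scores = {"alice": 0.92, "bob": 0.40}     # per-registrant matching scores
print(authenticate(scores, 0.8))          # similarity: alice passes
print(authenticate(scores, 0.5, False))   # dissimilarity: bob passes
```

A production system would typically pick the single best-scoring registrant rather than the first one that passes; the simple loop above is only meant to show the two threshold directions.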
  • the communication unit 916 transmits the generated authentication result to the control device 902.
  • FIG. 11 is a flowchart showing an example of the search process in step 1008 of FIG. 10.
  • the matching score calculation unit 915 initializes the feature point pair list (step 1101), sets the control variable i to 1 (step 1102), and sets the control variable j to 1 (step 1103).
  • N1 represents the number of feature points included in the registered data
  • N2 represents the number of feature points included in the matching target data 924.
  • the matching score calculation unit 915 increments j by 1 (step 1105) and compares j with N2 (step 1106). If j is less than or equal to N2 (step 1106, NO), the matching score calculation unit 915 repeats the processing from step 1104 onwards.
  • the matching score calculation unit 915 increments i by 1 (step 1107) and compares i with N1 (step 1108). If i is less than or equal to N1 (step 1108, NO), the matching score calculation unit 915 repeats the processing from step 1103 onwards.
  • If the condition of step 1104 is satisfied (step 1104, YES), the matching score calculation unit 915 performs the process of step 1109.
  • In step 1109, the matching score calculation unit 915 adds the combination of the i-th feature point included in the registered data and the j-th feature point included in the matching target data 924 to the feature point pair list as a corresponding feature point pair.
  • If i is greater than N1 (step 1108, YES), the matching score calculation unit 915 ends the process.
  • FIG. 12 is a flowchart showing an example of the calculation process in step 1009 of FIG. 10.
  • the matching score calculation unit 915 sets the matching score Si for the coordinate transformation Ti to 0, which indicates that the matching is invalid (step 1201).
  • the matching score calculation unit 915 compares the number of feature point pairs included in the feature point pair list with a threshold value TH2 (step 1202). If the number of feature point pairs is equal to or greater than TH2 (step 1202, NO), the matching score calculation unit 915 determines that the matching is valid.
  • In this case, the matching score calculation unit 915 calculates a matching score for each feature point pair using the corrected feature vector of the feature point included in the matching target data 924 and the feature vector of the feature point included in the registered data. Then, the matching score calculation unit 915 selects the top K (K is an integer equal to or greater than 1) matching scores in order from the best matching score, and sets a statistical value of these K matching scores as Si (step 1203). As the statistical value, the average, median, maximum value, minimum value, or the like is used.
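A sketch of the validity check in step 1202 and the top-K statistic in step 1203, assuming similarity scores where larger is better (the function name and toy scores are illustrative):

```python
def score_for_transform(pair_scores, k, th2, stat=lambda v: sum(v) / len(v)):
    """Matching score Si for one coordinate transformation Ti:
    0 (matching invalid) when fewer than TH2 feature point pairs matched,
    otherwise a statistic (average by default) of the top-K pair scores."""
    if len(pair_scores) < th2:
        return 0.0                                  # matching judged invalid
    top_k = sorted(pair_scores, reverse=True)[:k]   # "best" = largest here
    return stat(top_k)

print(score_for_transform([0.9, 0.7, 0.95, 0.2], k=2, th2=3))  # avg of 0.95, 0.9
```

Swapping the `stat` argument for `max`, `min`, or a median gives the other statistical values mentioned above; for dissimilarity scores the sort direction would be reversed.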
  • If the number of feature point pairs is less than TH2 (step 1202, YES), the matching score calculation unit 915 determines that the matching is invalid and ends the process.
  • FIG. 13 shows an example of a matching score using the Hamming distance.
  • FIG. 13(a) shows an example of binary data representing a feature.
  • the feature 1301 is binarized in advance and converted into binary data 1302 consisting of 0s and 1s.
  • FIG. 13(b) shows an example of a matching score based on binary data.
  • bits of the same digit in binary data 1302 and binary data 1303 are compared.
  • the number of bits that differ between binary data 1302 and binary data 1303 is calculated as the Hamming distance and used as the matching score.
  • the exclusive OR (XOR) of binary data 1302 and binary data 1303 is calculated for each bit, and the number of logical values "1" contained in the resulting binary data 1304, "9", is found as the Hamming distance.
  • the Hamming distance can be calculated quickly by using the bit operation function of the CPU (Central Processing Unit).
  • the Hamming distance is particularly fast on CPUs for embedded devices that have weak floating-point arithmetic capability.
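A minimal sketch of the XOR-and-popcount computation (the bit patterns below are illustrative, not the values shown in FIG. 13):

```python
def hamming(a: int, b: int) -> int:
    """Hamming distance between two equal-length bit strings packed as ints:
    XOR the values, then count the 1 bits. On most CPUs the bit count maps
    to a single popcount instruction, which is why this is fast."""
    return bin(a ^ b).count("1")

b_registered = 0b1011001110100101    # toy 16-bit binarized feature
b_target     = 0b0010011010111100
print(hamming(b_registered, b_target))
```

Since only integer XOR and bit counting are involved, no floating-point unit is needed, which matches the embedded-CPU observation above.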
  • features may be encrypted to improve security.
  • FIG. 14 shows an example of binary data encryption.
  • Encrypted data 1411 is random number data used to encrypt the binary data.
  • Encrypted binary data 1421 is generated by calculating the XOR of binary data 1401 included in the registration data and encrypted data 1411 for each bit.
  • Encrypted binary data 1422 is generated by calculating the XOR of binary data 1402 included in the data to be matched and encrypted data 1411 for each bit.
  • Encrypted data 1411 is binary data of the same length as binary data 1401 and binary data 1402, and is generated based on AES (Advanced Encryption Standard) or the like.
  • if the Hamming distance between binary data 1401 and binary data 1402 is HD, the Hamming distance between encrypted binary data 1421 and encrypted binary data 1422 is also HD, because XOR with the same encrypted data 1411 cancels out bit by bit.
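This masking property can be checked with a short sketch. Here a random integer stands in for the AES-generated encrypted data 1411, and the registered and matching-target feature values are illustrative.

```python
import secrets

def hamming(a: int, b: int) -> int:
    # Hamming distance: XOR then count the "1" bits.
    return bin(a ^ b).count("1")

BITS = 16
mask = secrets.randbits(BITS)    # stand-in for encrypted data 1411
reg = 0b1010110011010001         # registered binary data (example)
probe = 0b1010010011110101       # matching-target binary data (example)

enc_reg = reg ^ mask             # corresponds to encrypted binary data 1421
enc_probe = probe ^ mask         # corresponds to encrypted binary data 1422

# (reg ^ mask) ^ (probe ^ mask) == reg ^ probe, so XOR encryption with a
# shared mask leaves the Hamming distance HD unchanged.
assert hamming(enc_reg, enc_probe) == hamming(reg, probe)
```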
  • Equation (32) indicates that when n ≠ i, α(θ, n, i) is close to 0.
  • ⁇ ( ⁇ ,n,n) represents the extent to which the basis vector P( ⁇ ,n) rotated by ⁇ contains the original basis vector P(n).
  • ⁇ ( ⁇ ,n,i) represents the extent to which the basis vector P( ⁇ ,n) rotated by ⁇ contains basis vectors P(i) other than the original basis vector P(n).
  • ⁇ ( ⁇ ,n,n) can also be considered as a coefficient that indicates the stability of x'(n) against rotation. For example, when ⁇ ( ⁇ ,n,n) is a large value (close to 1), x'(n) is less susceptible to the effects of rotation. Conversely, when ⁇ ( ⁇ ,n,n) is a small value (smaller than 1), x'(n) is more susceptible to the effects of rotation.
  • Figure 15 shows an example of the relationship between ⁇ ( ⁇ ,n,n) and the rotation angle ⁇ .
  • curve 1501 represents α(θ,n,n) that is not easily affected by rotation, and curve 1502 represents α(θ,n,n) that is easily affected by rotation. Whether α(θ,n,n) is easily affected by rotation varies depending on the value of n, and is determined by the characteristics of P(i).
  • the matching score calculation unit 915 calculates the matching score SD between the feature vector x and the feature vector included in the registered data using the following formula.
  • d(i) represents the XOR of the i-th element x(i) of the feature vector x and the i-th element of the feature vector included in the registered data.
  • w(i) represents the weight for d(i)
  • the right-hand side of equation (34) represents the weighted sum of d(i).
  • ⁇ ( ⁇ ,i,i) is used as w(i).
  • ⁇ ( ⁇ ,i,i) is considered to be a coefficient that indicates the stability of x'(i) against rotation, so by using ⁇ ( ⁇ ,i,i) as a weight for each d(i), the accuracy of the matching score SD can be improved.
  • the method of calculating the matching score using x'(n) in formula (20) may be referred to as the first method, and the method of calculating the matching score SD using formula (34) may be referred to as the second method.
  • the second method has a limited effect on improving accuracy, but has the advantage of being applicable in a wider range of applications.
  • the matching score SD can be calculated even if the feature vectors contained in the matching target data and registered data remain encrypted.
  • Normalizing w(i) allows for a stable calculation of the matching score SD.
  • the weight w'(i) after normalization can be calculated, for example, by the following formula.
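The weighted score of equation (34) with normalized weights can be sketched as follows. Dividing each w(i) by the sum of the weights is one plausible normalization, since the patent's exact normalization formula is not reproduced here, and the sample bit vectors and weights are illustrative.

```python
def weighted_matching_score(x, y, weights):
    """SD = sum_i w'(i) * d(i), where d(i) is the XOR of the i-th elements
    and w'(i) is w(i) normalized so that the weights sum to 1."""
    total = sum(weights)
    w_norm = [w / total for w in weights]
    d = [xi ^ yi for xi, yi in zip(x, y)]   # d(i): per-element XOR
    return sum(w * di for w, di in zip(w_norm, d))

x = [1, 0, 1, 1, 0]                # bits of the matching-target feature vector
y = [1, 1, 0, 1, 0]                # bits of the registered feature vector
w = [0.9, 0.5, 0.2, 0.8, 0.6]      # e.g. alpha(theta, i, i) per element
sd = weighted_matching_score(x, y, w)
```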
  • FIG. 16 is a flowchart showing an example of the second biometric authentication process performed by the matching score calculation device 901 in FIG. 9.
  • the second method is used, and the feature vectors included in the matching target data and the registered data are encrypted binary data.
  • the processing in steps 1601 to 1606, 1608, and 1610 to 1614 is similar to the processing in steps 1001 to 1006, 1008, and 1010 to 1014 in FIG. 10.
  • the matching score calculation unit 915 calculates ⁇ ( ⁇ , i, i) as the weight w(i) in equation (34).
  • the matching score calculation unit 915 calculates ⁇ ( ⁇ , i, i) using the rotation angle ⁇ included in the coordinate transformation Ti and the fitting information 922.
  • the matching score calculation unit 915 performs the calculation process of FIG. 12. In this case, in step 1203, the matching score calculation unit 915 calculates the matching score for each feature point pair using equation (34). Then, the matching score calculation unit 915 selects the top K matching scores in order from the best, and sets the statistical value of these K matching scores to Si.
  • FIG. 17 shows an example of the functional configuration of a flapper gate management system including the matching score calculation device 701 of FIG. 7.
  • the flapper gate management system of FIG. 17 includes a feature acquisition device 1701, a control device 1702, and a matching score calculation device 1703.
  • the feature acquisition device 1701 is, for example, a client
  • the matching score calculation device 1703 is, for example, a server.
  • because the feature vector of the person to be matched travels over a communication network, the second method is used and the feature vector is encrypted.
  • the matching score calculation device 1703 corresponds to the matching score calculation device 701 in FIG. 7.
  • the flapper gate management system is an example of a matching score calculation system.
  • the feature acquisition device 1701 is installed near the flapper gate, and includes a biometric image acquisition unit 1711, a feature extraction unit 1712, a communication unit 1713, and a storage unit 1714.
  • the feature extraction unit 1712 is an example of an acquisition unit
  • the communication unit 1713 is an example of a transmission unit.
  • the biometric image acquisition unit 1711 acquires a biometric image 1721 of the person to be matched and stores it in the storage unit 1714.
  • the biometric image 1721 is, for example, a fingerprint image, a palm vein image, a palm print image, or a face image.
  • the biometric image acquisition unit 1711 is, for example, a fingerprint sensor, a vein sensor, or an image sensor.
  • the feature extraction unit 1712 extracts multiple feature points from the biometric image 1721 and calculates a feature vector from pixel values in the vicinity of each feature point. For example, encrypted binary data is calculated as the feature vector.
  • the feature extraction unit 1712 then generates matching target data 1722 including the coordinates and feature vector of each of the multiple feature points, and stores the matching target data 1722 in the storage unit 1714.
  • the communication unit 1713 transmits the matching target data 1722 to the matching score calculation device 1703 via a communication network.
  • the matching score calculation device 1703 calculates a matching score using the matching target data 1722, performs biometric authentication on the person to be matched using the matching score, and transmits the authentication result to the control device 1702.
  • the control device 1702 controls the opening and closing of the flapper gate based on the authentication result. For example, if the authentication result indicates success, the control device 1702 controls the opening of the flapper gate, and if the authentication result indicates failure, the control device 1702 controls the closing of the flapper gate.
  • the matching score calculation device 1703 includes a coordinate transformation calculation unit 1731, a coordinate transformation unit 1732, a matching score calculation unit 1733, a communication unit 1734, and a storage unit 1735.
  • the coordinate transformation unit 1732, the matching score calculation unit 1733, and the storage unit 1735 correspond to the coordinate transformation unit 712, the matching score calculation unit 713, and the storage unit 711 in FIG. 7, respectively.
  • the communication unit 1734 is an example of a receiving unit.
  • the storage unit 1735 stores a registration template 1741 and fitting information 1742.
  • the registration template 1741 includes registration data for each of a plurality of registered persons, and the registration data for each of the registered persons includes the coordinates and feature vectors of each of a plurality of feature points. For example, encrypted binary data is used as the feature vector.
  • the fitting information 1742 is similar to the fitting information 922 in FIG. 9.
  • the communication unit 1734 receives the matching target data 1722 from the feature acquisition device 1701, and the storage unit 1735 stores the matching target data 1722.
  • the coordinate transformation calculation unit 1731 calculates a coordinate transformation T for matching the matching target data 1722 with the registered data of each registered person in the registered template 1741.
  • the coordinate transformation T includes, for example, a rotation angle θ and a translation amount (ΔX, ΔY) in the X and Y directions.
  • the coordinate transformation unit 1732 uses coordinate transformation T to transform the coordinates of each feature point contained in the matching target data 1722.
  • the matching score calculation unit 1733 calculates ⁇ ( ⁇ , i, i) using the rotation angle ⁇ included in the coordinate transformation T and the fitting information 1742. The matching score calculation unit 1733 then calculates the matching score 1743 between the matching target data 1722 and the registered data using ⁇ ( ⁇ , i, i) as the weight w(i) in equation (34).
  • the matching score calculation unit 1733 calculates the matching score 1743 using the coordinates after coordinate transformation of each feature point included in the matching target data 1722, the feature vectors of the feature points, and the registered data of each registered person included in the registration template 1741. Then, the matching score calculation unit 1733 stores the matching score 1743 in the storage unit 1735.
  • the matching score calculation unit 1733 performs biometric authentication on the person to be matched using the matching score 1743 and generates an authentication result.
  • the communication unit 1734 transmits the authentication result to the control device 1702 via the communication network.
  • the processing performed by the feature acquisition device 1701 is similar to the processing in steps 1601 and 1602 in FIG. 16, and the processing performed by the matching score calculation device 1703 is similar to the processing in steps 1603 to 1614 in FIG. 16.
  • F(n, i, θ) in formula (21) does not have to be a quadratic function of θ, and may be a function of the absolute value |θ|.
  • when F(n, i, θ) is a function of |θ|, α(θ, n, i) is expressed, for example, by the following formula.
  • the second method may be used to calculate the matching score with sufficiently high accuracy. Therefore, by using the second method, which is simple to calculate, it is possible to speed up the processing.
  • the first method may be used to calculate the matching score with higher accuracy.
  • the matching score calculation device 901 in FIG. 9 or the matching score calculation device 1703 in FIG. 17 may convert the coordinates of the feature points included in the registered data of each registered person instead of the feature points included in the data to be matched. In this case, the matching score calculation device 901 or the matching score calculation device 1703 calculates the matching score using the coordinates after coordinate conversion of each feature point included in the registered data, the feature vectors of each feature point, and the data to be matched, in a calculation method similar to the first method or the second method.
  • the configuration of the matching score calculation device 701 in FIG. 7 is merely an example, and some of the components may be omitted or changed depending on the use or conditions of the matching score calculation device 701.
  • the configurations of the entrance/exit management system in FIG. 9 and the flapper gate management system in FIG. 17 are merely examples, and some of the components may be omitted or changed depending on the use or conditions of the entrance/exit management system or the flapper gate management system.
  • the feature points and feature amounts shown in FIG. 1, the feature points shown in FIG. 5, and the feature amounts shown in FIG. 6, FIG. 13, and FIG. 14 are merely examples, and the feature points and feature amounts change depending on the biometric image.
  • the normalization process shown in FIG. 2 and FIG. 3 is merely an example, and the normalization process changes depending on the biometric image.
  • the matching process shown in FIG. 4 is merely an example, and the matching process changes depending on the data to be matched and the registered data.
  • Equations (1) to (36) are merely examples, and the entrance/exit management system and the flapper gate management system may use other calculation formulas to perform biometric authentication processing.
  • FIG. 18 shows an example of the hardware configuration of an information processing device used as the matching score calculation device 701 in FIG. 7, the matching score calculation device 901 in FIG. 9, the feature acquisition device 1701 in FIG. 17, and the matching score calculation device 1703 in FIG. 17.
  • the information processing device of FIG. 18 includes a CPU 1801, a memory 1802, an input device 1803, an output device 1804, an auxiliary storage device 1805, a medium drive device 1806, and a network connection device 1807. These components are hardware and are connected to each other via a bus 1808.
  • the biometric image acquisition unit 911 of FIG. 9 and the biometric image acquisition unit 1711 of FIG. 17 may be hardware sensors connected to the bus 1808.
  • Memory 1802 is, for example, a semiconductor memory such as a ROM (Read Only Memory), a RAM (Random Access Memory), or a flash memory, and stores programs and data used in processing. Memory 1802 may operate as the storage unit 711 in FIG. 7, the storage unit 917 in FIG. 9, the storage unit 1714 in FIG. 17, or the storage unit 1735 in FIG. 17.
  • the CPU 1801 (processor) operates as the coordinate transformation unit 712 and the matching score calculation unit 713 in FIG. 7 by, for example, executing a program using the memory 1802.
  • the CPU 1801 also operates as the feature extraction unit 912, the coordinate transformation calculation unit 913, the coordinate transformation unit 914, and the matching score calculation unit 915 in FIG. 9 by executing a program using the memory 1802.
  • the CPU 1801 also operates as the feature extraction unit 1712, the coordinate transformation calculation unit 1731, the coordinate transformation unit 1732, and the matching score calculation unit 1733 in FIG. 17 by executing a program using the memory 1802.
  • the input device 1803 is, for example, a keyboard, a pointing device, etc., and is used to input instructions or information from the operator.
  • the output device 1804 is, for example, a display device, a printer, a speaker, etc., and is used to output inquiries to the operator or processing results.
  • the processing results may be the matching score 925, the matching score 1743, or an authentication result.
  • the auxiliary storage device 1805 is, for example, a magnetic disk device, an optical disk device, a magneto-optical disk device, a tape device, etc.
  • the auxiliary storage device 1805 may be a hard disk drive or an SSD (Solid State Drive).
  • the information processing device can store programs and data in the auxiliary storage device 1805 and load them into the memory 1802 for use.
  • the auxiliary storage device 1805 may operate as the storage unit 711 in FIG. 7, the storage unit 917 in FIG. 9, the storage unit 1714 in FIG. 17, or the storage unit 1735 in FIG. 17.
  • the medium drive device 1806 drives the portable recording medium 1809 and accesses the recorded contents.
  • the portable recording medium 1809 is a memory device, a flexible disk, an optical disk, a magneto-optical disk, etc.
  • the portable recording medium 1809 may be a CD-ROM (Compact Disk Read Only Memory), a DVD (Digital Versatile Disk), a USB (Universal Serial Bus) memory, etc.
  • the operator can store programs and data in the portable recording medium 1809 and load them into the memory 1802 for use.
  • the computer-readable recording medium that stores the programs and data used in the processing is a physical (non-transitory) recording medium such as memory 1802, auxiliary storage device 1805, or portable recording medium 1809.
  • the network connection device 1807 is a communication interface circuit that is connected to a communication network such as a WAN (Wide Area Network) or a LAN (Local Area Network) and performs data conversion associated with communication.
  • the information processing device receives programs and data from an external device via the network connection device 1807 and can load them into the memory 1802 for use.
  • the network connection device 1807 may operate as the communication unit 916 in FIG. 9, the communication unit 1713 in FIG. 17, or the communication unit 1734 in FIG. 17.
  • the information processing device does not need to include all of the components in FIG. 18, and some components may be omitted or modified depending on the application or conditions. For example, if an interface with an operator is not required, the input device 1803 and the output device 1804 may be omitted. If the information processing device does not use the portable recording medium 1809 or a communication network, the medium drive device 1806 or the network connection device 1807 may be omitted.


Abstract

In the present invention, after a computer acquires coordinate information and feature value information of a comparison feature point, the computer converts the coordinates of one of the comparison feature point and a registered feature point stored in a storage unit, on the basis of an angular difference calculated using the coordinate information of the comparison feature point and the coordinate information of the registered feature point. The computer then uses the angular difference and the acquired feature value information of the comparison feature point to calculate a comparison score between the feature point resulting from the coordinate conversion and the other of the comparison feature point and the registered feature point.

Description

Matching score calculation method, matching score calculation device, matching score calculation system, and matching score calculation program
 The present invention relates to a matching score calculation method, a matching score calculation device, a matching score calculation system, and a matching score calculation program.
 In biometric authentication and other fields, matching methods using feature points are commonly used. In such matching methods, characteristic points are extracted from a biometric image as feature points, and matching is performed using feature amounts calculated from pixel values near the feature points. Biometric images include fingerprint images and palm vein images.
 FIG. 1 shows examples of feature points and feature amounts in a biometric image. FIG. 1(a) shows examples of feature points. Curve 101 represents, for example, a fingerprint in a fingerprint image, or veins in a palm vein image. In this case, branch point 111, end point 112, and end point 113 of curve 101 are extracted as feature point A1, feature point A2, and feature point A3, respectively.
 FIG. 1(b) shows an example of feature data generated from the biometric image of FIG. 1(a). The feature data includes coordinates (X, Y) indicating the position of each feature point and the feature amount of each feature point. The feature amount is represented by, for example, a vector.
 In matching methods using feature points, matching is performed using matching target data and registered data. The matching target data is feature data generated from a biometric image of the person to be matched, and the registered data is feature data generated from a biometric image of a registered person. In the matching process, the coordinates (X, Y) of the feature points are used to compare the feature amounts of the matching feature points included in the matching target data with the feature amounts of the registered feature points included in the registered data, and a matching score indicating the comparison result is calculated.
 In relation to biometric authentication, an authentication device that achieves highly accurate biometric authentication while reducing memory usage is known (see, for example, Patent Document 1). A data generation method that suppresses a decrease in accuracy of biometric authentication using partial data is also known (see, for example, Patent Document 2). A biometric image processing device that improves authentication accuracy in biometric authentication performed by binarizing feature amounts acquired from a biometric image is also known (see, for example, Patent Document 3).
Patent Document 1: JP 2006-39951 A
Patent Document 2: International Publication No. WO 2022/074786
Patent Document 3: JP 2019-45969 A
 If a biometric image input in biometric authentication contains posture variation, the accuracy of a matching score based on the comparison between the feature amounts of matching feature points and the feature amounts of registered feature points may decrease.
 This problem is not limited to feature amounts of feature points extracted from biometric images; it arises whenever feature amounts of various kinds of feature points are compared.
 In one aspect, an object of the present invention is to improve the accuracy of a matching score between a matching feature point and a registered feature point.
 In one proposal, when a computer acquires coordinate information and feature amount information of a matching feature point, the computer performs a coordinate transformation on either the matching feature point or a registered feature point stored in a storage unit, based on an angle difference calculated using the coordinate information of the matching feature point and the coordinate information of the registered feature point. Using the angle difference and the acquired feature amount information of the matching feature point, the computer calculates a matching score between the feature point obtained by the coordinate transformation and the other of the matching feature point and the registered feature point.
 According to one aspect, the accuracy of the matching score between a matching feature point and a registered feature point can be improved.
FIG. 1 is a diagram showing feature points and feature amounts in a biometric image.
FIG. 2 is a diagram illustrating a normalization process for a fingerprint image.
FIG. 3 is a diagram illustrating a normalization process for a palm vein image.
FIG. 4 is a diagram illustrating a matching process using an alignment search.
FIG. 5 is a diagram showing a correspondence relationship between positions of feature points.
FIG. 6 is a diagram illustrating a comparison of feature amounts.
FIG. 7 is a functional configuration diagram of a matching score calculation device according to an embodiment.
FIG. 8 is a flowchart of a matching score calculation process.
FIG. 9 is a functional configuration diagram of an entrance/exit management system.
FIG. 10 is a flowchart of a first biometric authentication process.
FIG. 11 is a flowchart of a search process.
FIG. 12 is a flowchart of a calculation process.
FIG. 13 is a diagram showing a matching score using the Hamming distance.
FIG. 14 is a diagram illustrating encryption of binary data.
FIG. 15 is a diagram illustrating the relationship between α(θ, n, n) and the rotation angle θ.
FIG. 16 is a flowchart of a second biometric authentication process.
FIG. 17 is a functional configuration diagram of a flapper gate management system.
FIG. 18 is a hardware configuration diagram of an information processing device.
 The following describes embodiments in detail with reference to the drawings.
 Biometric images input in biometric authentication often contain posture variation. Posture variation, such as rotation or positional shift, arises because the posture of the biometric image input by a user, that is, the person to be matched or a registered person, is unstable. For this reason, in biometric authentication, normalization processing is performed to correct rotation, positional shift, and the like.
 FIG. 2 shows an example of normalization processing for a fingerprint image. Fingerprint image 201 is converted into fingerprint image 202 by correcting rotation, positional shift, and the like based on information such as the outline of the finger or the center coordinates of the fingerprint in fingerprint image 201.
 FIG. 3 shows an example of normalization processing for a palm vein image. By applying normalization processing to palm vein image 301, palm vein image 301 is converted into palm vein image 302.
 However, normalization processing can only perform approximate correction, not complete correction, so posture variation such as rotation error often remains. In biometric authentication, rotation correction is applied to both the registered data and the matching target data, so the sum of the rotation errors of both affects the matching score. For example, if the registered data contains a rotation error of +3° and the matching target data contains a rotation error of -3°, the matching process compares two pieces of data having an angle error of 6°.
 Therefore, in the matching process, an alignment search is performed to absorb the rotation error. In the alignment search, a coordinate transformation T is applied to the matching target data, and the matching target data after the coordinate transformation is compared with the registered data.
 FIG. 4 shows an example of matching processing using an alignment search. Matching target data 402 is generated by applying coordinate transformation T to matching target data 401, and matching target data 402 is compared with registered data 403. Coordinate transformation T includes rotation, translation, and the like. Coordinate transformation T can be obtained by various methods. For example, a brute-force search that tries multiple coordinate transformations in sequence may be used, or coordinate transformation T may be calculated from the correspondence between multiple feature points.
 As for the coordinates of the feature points included in the matching target data, applying such a coordinate transformation T can absorb the effect of the rotation error. On the other hand, if the feature amounts included in the matching target data are extracted from a biometric image having a rotation error, the corresponding error is included in the feature amounts, and the accuracy of the matching score decreases.
 FIG. 5 shows an example of the correspondence between positions of feature points in biometric images. Positions 512-i (i = 1 to 3) in palm vein image 502 of the person to be matched correspond to positions 511-i in palm vein image 501 of a registered person. The person to be matched is the same person as the registered person. In this case, in biometric authentication, the feature amount at position 512-i in palm vein image 502 is compared with the feature amount at position 511-i in palm vein image 501.
 FIG. 6 shows an example of a comparison of feature amounts. Image 601 is an image of a region corresponding to one of the positions 511-i in palm vein image 501 of FIG. 5, and image 602 is an image of a region corresponding to position 512-i in palm vein image 502. Image 601 has no rotation error, and image 602 has a rotation error.
 Feature point 621 is extracted from region 611 of image 601, and feature vector 641 representing the feature amount of feature point 621 is generated using image data 631 of region 611. Similarly, feature point 622 is extracted from region 612 of image 602, and feature vector 642 representing the feature amount of feature point 622 is generated using image data 632 of region 612.
 In this case, even though the person to be matched is the same person as the registered person and feature point 622 corresponds to feature point 621, feature vector 642 differs from feature vector 641, so the accuracy of the matching score decreases. As a result, the person to be matched is more likely to be determined to be a person different from the registered person, and the authentication accuracy decreases.
 It is also possible to correct the angle difference θ by calculating the angle difference θ from the correspondence between multiple feature points included in the matching target data and multiple feature points included in the registered data, then rotating the biometric image of the person to be matched by -θ and extracting the feature amounts again. However, this method has many problems in terms of processing time, memory consumption, and security.
 First, extracting the feature amounts again increases the processing time. In particular, when a biometric authentication system is built with a client-server configuration, the feature amounts are generally extracted by the client, but the angle difference θ is calculated in the matching process on the server. The angle difference θ is therefore sent from the server to the client, after which the client extracts the feature amounts again. Communication time is thus added, further increasing the processing time. The authentication protocol also becomes very complicated.
 The client must hold the biometric image of each person to be matched in memory from the time the image is acquired until the angle difference θ sent by the server is received, which increases memory consumption. In particular, in an operational system that opens and closes a flapper gate based on the authentication result, it is desirable for multiple users to be able to pass through the flapper gate quickly, so memory consumption increases further with the number of users passing through.
 Furthermore, from a security standpoint, an operational system that holds biometric images of persons to be matched in the client's memory for a long time is not a desirable configuration.
 FIG. 7 shows an example of the functional configuration of a matching score calculation device according to an embodiment. The matching score calculation device 701 of FIG. 7 includes a storage unit 711, a coordinate transformation unit 712, and a matching score calculation unit 713. The storage unit 711 stores coordinate information of registered feature points. The coordinate transformation unit 712 and the matching score calculation unit 713 perform a matching score calculation process.
 図8は、図7の照合スコア算出装置701が行う照合スコア算出処理の例を示すフローチャートである。まず、座標変換部712は、照合特徴点の座標情報と特徴量情報とを取得した場合、照合特徴点の座標情報と登録特徴点の座標情報とを用いて算出される角度差に基づいて、照合特徴点と登録特徴点との何れか一方を座標変換する(ステップ801)。 FIG. 8 is a flowchart showing an example of the matching score calculation process performed by the matching score calculation device 701 in FIG. 7. First, when the coordinate information and feature amount information of the matching feature point are acquired, the coordinate transformation unit 712 performs coordinate transformation of either the matching feature point or the registered feature point based on the angle difference calculated using the coordinate information of the matching feature point and the coordinate information of the registered feature point (step 801).
 次に、照合スコア算出部713は、角度差と取得した照合特徴点の特徴量情報とを用いて、照合特徴点と登録特徴点との何れか一方を座標変換した特徴点と、照合特徴点と登録特徴点との他方との照合スコアを算出する(ステップ802)。 Next, the matching score calculation unit 713 uses the angle difference and the feature amount information of the acquired matching feature point to calculate a feature point obtained by coordinate transformation of either the matching feature point or the registered feature point, and a matching score between the other of the matching feature point and the registered feature point (step 802).
 図7の照合スコア算出装置701によれば、照合特徴点と登録特徴点との照合スコアの精度を向上させることができる。 The matching score calculation device 701 in FIG. 7 can improve the accuracy of the matching score between the matching feature points and the registered feature points.
 図9は、図7の照合スコア算出装置701を含む入退室管理システムの機能的構成例を示している。図9の入退室管理システムは、照合スコア算出装置901及び制御装置902を含む。照合スコア算出装置901は、図7の照合スコア算出装置701に対応する。 FIG. 9 shows an example of the functional configuration of an entrance/exit management system including the matching score calculation device 701 of FIG. 7. The entrance/exit management system of FIG. 9 includes a matching score calculation device 901 and a control device 902. The matching score calculation device 901 corresponds to the matching score calculation device 701 of FIG. 7.
 照合スコア算出装置901は、照合対象者の生体画像から生成された照合対象データを用いて照合スコアを算出し、照合スコアを用いて照合対象者に対する生体認証を行って、認証結果を制御装置902へ送信する。 The matching score calculation device 901 calculates a matching score using matching target data generated from a biometric image of the person to be matched, performs biometric authentication on the person to be matched using the matching score, and transmits the authentication result to the control device 902.
 制御装置902は、認証結果に基づいてドアの開閉を制御する。制御装置902は、例えば、認証結果が成功を示す場合、ドアを開く制御を行い、認証結果が失敗を示す場合、ドアを閉じる制御を行う。 The control device 902 controls the opening and closing of the door based on the authentication result. For example, if the authentication result indicates success, the control device 902 controls the door to open, and if the authentication result indicates failure, the control device 902 controls the door to close.
 照合スコア算出装置901は、生体画像取得部911、特徴抽出部912、座標変換算出部913、座標変換部914、照合スコア算出部915、通信部916、及び記憶部917を含む。座標変換部914、照合スコア算出部915、及び記憶部917は、図7の座標変換部712、照合スコア算出部713、及び記憶部711にそれぞれ対応する。 The matching score calculation device 901 includes a biometric image acquisition unit 911, a feature extraction unit 912, a coordinate transformation calculation unit 913, a coordinate transformation unit 914, a matching score calculation unit 915, a communication unit 916, and a memory unit 917. The coordinate transformation unit 914, the matching score calculation unit 915, and the memory unit 917 correspond to the coordinate transformation unit 712, the matching score calculation unit 713, and the memory unit 711 in FIG. 7, respectively.
 記憶部917は、登録テンプレート921及びフィッティング情報922を記憶する。登録テンプレート921は、複数の登録者それぞれの登録データを含み、各登録者の登録データは、複数の特徴点それぞれの座標及び特徴ベクトルを含む。登録データに含まれる各特徴点は、登録特徴点の一例であり、各登録者の生体画像から抽出された特徴点である。登録データに含まれる各特徴点の座標は、登録特徴点の座標情報の一例である。 The memory unit 917 stores a registration template 921 and fitting information 922. The registration template 921 includes enrollment data for each of a plurality of enrollees, and the enrollment data for each enrollee includes the coordinates and feature vectors of each of a plurality of feature points. Each feature point included in the enrollment data is an example of a registered feature point, and is a feature point extracted from a biometric image of each enrollee. The coordinates of each feature point included in the enrollment data are an example of coordinate information of a registered feature point.
 フィッティング情報922は、照合対象データと登録データとの間の角度差θから、特徴量の補正パラメータを計算するための情報である。補正パラメータについては、後述する。 The fitting information 922 is information for calculating correction parameters for the feature quantities from the angle difference θ between the data to be matched and the registered data. The correction parameters will be described later.
 生体画像取得部911は、照合対象者の生体画像923を取得して、記憶部917に格納する。生体画像923は、例えば、指紋画像、手のひら静脈画像、掌紋画像、又は顔画像である。生体画像取得部911は、例えば、指紋センサ、静脈センサ、又は画像センサである。生体画像923として手のひら静脈画像を用いることで、手のひら静脈認証を行うことができる。 The biometric image acquisition unit 911 acquires a biometric image 923 of the person to be matched and stores it in the memory unit 917. The biometric image 923 is, for example, a fingerprint image, a palm vein image, a palm print image, or a face image. The biometric image acquisition unit 911 is, for example, a fingerprint sensor, a vein sensor, or an image sensor. By using a palm vein image as the biometric image 923, palm vein authentication can be performed.
 特徴抽出部912は、生体画像923から複数の特徴点を抽出し、各特徴点の近傍の画素値から特徴ベクトルを算出する。そして、特徴抽出部912は、複数の特徴点それぞれの座標及び特徴ベクトルを含む照合対象データ924を生成して、記憶部917に格納する。生体画像923から抽出された各特徴点は、照合特徴点の一例であり、各特徴点の座標は、照合特徴点の座標情報の一例であり、各特徴点の特徴ベクトルは、照合特徴点の特徴量情報の一例である。 The feature extraction unit 912 extracts multiple feature points from the biometric image 923 and calculates a feature vector from pixel values near each feature point. The feature extraction unit 912 then generates matching target data 924 including the coordinates and feature vectors of each of the multiple feature points and stores it in the storage unit 917. Each feature point extracted from the biometric image 923 is an example of a matching feature point, the coordinates of each feature point are an example of coordinate information of a matching feature point, and the feature vector of each feature point is an example of feature amount information of a matching feature point.
 特徴抽出部912は、例えば、正規直交性が成り立つ基底による変換を用いて、特徴ベクトルを算出することができる。このような変換としては、主成分分析(Principal Component Analysis,PCA)、離散コサイン変換(discrete cosine transform,DCT)、高速フーリエ変換(Fast Fourier Transform,FFT)等が挙げられる。 The feature extraction unit 912 can calculate feature vectors, for example, using a transformation based on a basis for which orthonormality holds. Examples of such transformations include Principal Component Analysis (PCA), discrete cosine transform (DCT), and fast Fourier transform (FFT).
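The projection onto an orthonormal basis described above can be sketched as follows. This is a minimal illustration in Python (the document prescribes no language); `make_orthonormal_basis` is a hypothetical stand-in that fabricates an orthonormal basis via QR decomposition of a fixed random matrix, whereas a real system would learn PCA components from training patches.

```python
import numpy as np

def make_orthonormal_basis(dim, n_basis, seed=0):
    # Stand-in for PCA principal component vectors: QR of a fixed random
    # matrix yields an orthonormal set of vectors.
    rng = np.random.default_rng(seed)
    q, _ = np.linalg.qr(rng.standard_normal((dim, n_basis)))
    return q.T  # rows play the role of the basis vectors P(i)

def extract_feature_vector(patch, basis):
    # Project the flattened patch onto each basis vector; because the
    # basis is orthonormal, the projections are the expansion coefficients.
    return basis @ patch.ravel()

# toy 8x8 patch around a feature point
patch = np.arange(64, dtype=float).reshape(8, 8)
P = make_orthonormal_basis(64, 16)
x = extract_feature_vector(patch, P)
print(x.shape)  # (16,)
```

The same structure would apply with a DCT or FFT basis, since those transforms are also built on orthonormal (or unitary) bases.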
 座標変換算出部913は、照合対象データ924を登録テンプレート921内の各登録者の登録データに合わせるための座標変換Tを算出する。座標変換Tは、例えば、回転角θと、X方向及びY方向の平行移動量(ΔX,ΔY)とを含む。回転角θは、照合特徴点の座標情報と登録特徴点の座標情報とを用いて算出される角度差に対応する。 The coordinate transformation calculation unit 913 calculates a coordinate transformation T for matching the matching target data 924 with the registered data of each registered person in the registered template 921. The coordinate transformation T includes, for example, a rotation angle θ and a parallel translation amount (ΔX, ΔY) in the X direction and Y direction. The rotation angle θ corresponds to an angle difference calculated using the coordinate information of the matching feature point and the coordinate information of the registered feature point.
 座標変換部914は、座標変換Tを用いて、照合対象データ924に含まれる各特徴点の座標を座標変換する。 The coordinate transformation unit 914 uses coordinate transformation T to transform the coordinates of each feature point contained in the matching target data 924.
 照合スコア算出部915は、座標変換Tに含まれる回転角θとフィッティング情報922とを用いて、照合対象データ924に含まれる各特徴点の特徴ベクトルを補正する。次に、照合スコア算出部915は、各特徴点の座標変換後の座標と、各特徴点の補正後の特徴ベクトルと、登録テンプレート921に含まれる各登録者の登録データとを用いて、照合対象データ924と登録データとの照合スコア925を算出する。そして、照合スコア算出部915は、照合スコア925を記憶部917に格納する。 The matching score calculation unit 915 corrects the feature vector of each feature point included in the matching target data 924 using the rotation angle θ included in the coordinate transformation T and the fitting information 922. Next, the matching score calculation unit 915 calculates a matching score 925 between the matching target data 924 and the registered data using the coordinates of each feature point after the coordinate transformation, the feature vector of each feature point after the correction, and the registered data of each registered person included in the registered template 921. Then, the matching score calculation unit 915 stores the matching score 925 in the memory unit 917.
 次に、照合スコア算出部915は、照合スコア925を用いて照合対象者に対する生体認証を行い、認証結果を生成する。通信部916は、認証結果を制御装置902へ送信する。 Next, the matching score calculation unit 915 performs biometric authentication on the person to be matched using the matching score 925 and generates an authentication result. The communication unit 916 transmits the authentication result to the control device 902.
 次に、座標変換Tの算出方法について説明する。座標変換算出部913は、照合対象データ924に含まれる複数の特徴点と、各登録者の登録データに含まれる複数の特徴点との間の対応関係から、座標変換Tを求める。 Next, a method for calculating the coordinate transformation T will be described. The coordinate transformation calculation unit 913 calculates the coordinate transformation T from the correspondence between multiple feature points included in the matching target data 924 and multiple feature points included in the registered data of each registered person.
 まず、座標変換算出部913は、照合対象データ924に含まれる各特徴点と、登録データに含まれる各特徴点との組み合わせを表す特徴点ペアを生成する。次に、座標変換算出部913は、各特徴点ペアについて、照合対象データ924に含まれる特徴点の特徴ベクトルと、登録データに含まれる特徴点の特徴ベクトルとの間の特徴点スコアを算出する。そして、座標変換算出部913は、最良の特徴点スコアから順に上位X個(Xは2以上の整数)の特徴点ペアを選択する。 First, the coordinate transformation calculation unit 913 generates feature point pairs, each representing a combination of a feature point included in the matching target data 924 and a feature point included in the registered data. Next, for each feature point pair, the coordinate transformation calculation unit 913 calculates a feature point score between the feature vector of the feature point included in the matching target data 924 and the feature vector of the feature point included in the registered data. Then, the coordinate transformation calculation unit 913 selects the top X (X is an integer of 2 or more) feature point pairs in order from the best feature point score.
 特徴点スコアとしては、2つの特徴ベクトルの類似度又は相違度を用いることができる。類似度は、2つの特徴ベクトルの内積であってもよく、相違度は、2つの特徴ベクトルのベクトル間距離であってもよい。特徴点スコアが2つの特徴ベクトルの類似度を表す場合、特徴点スコアは大きいほど良好な値となる。一方、特徴点スコアが2つの特徴ベクトルの相違度を表す場合、特徴点スコアは小さいほど良好な値となる。 The feature point score can be the similarity or dissimilarity of two feature vectors. The similarity can be the inner product of two feature vectors, and the dissimilarity can be the inter-vector distance of two feature vectors. When the feature point score represents the similarity of two feature vectors, the larger the feature point score, the better the value. On the other hand, when the feature point score represents the dissimilarity of two feature vectors, the smaller the feature point score, the better the value.
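The two score variants described above can be sketched as follows (Python assumed; the function names are illustrative, not from the document):

```python
import math

def similarity_score(a, b):
    # Similarity as the inner product of two feature vectors: larger is better.
    return sum(x * y for x, y in zip(a, b))

def dissimilarity_score(a, b):
    # Dissimilarity as the Euclidean distance between two feature vectors:
    # smaller is better.
    return math.sqrt(sum((x - y) ** 2 for x, y in zip(a, b)))

print(similarity_score([1.0, 0.0], [0.8, 0.6]))     # 0.8
print(dissimilarity_score([0.0, 0.0], [3.0, 4.0]))  # 5.0
```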
 次に、座標変換算出部913は、X個の特徴点ペアの中から2つの特徴点ペアを選択し、選択された2つの特徴点ペアに含まれる各特徴点の座標を用いて、座標変換Tを算出する。X個の特徴点ペアの中から選択可能な2つの特徴点ペアの組み合わせの総数はXC2個であるため、XC2個の座標変換Ti(i=1~XC2)が求められる。 Next, the coordinate transformation calculation unit 913 selects two feature point pairs from the X feature point pairs, and uses the coordinates of each feature point included in the two selected pairs to calculate a coordinate transformation T. Since the total number of combinations of two feature point pairs that can be selected from the X feature point pairs is XC2, XC2 coordinate transformations Ti (i = 1 to XC2) are found.
 座標変換Tiの計算において、選択された2つの特徴点ペアの一方が登録特徴点P1と照合特徴点Q1の組み合わせを含み、他方が登録特徴点P2と照合特徴点Q2の組み合わせを含む場合を想定する。各特徴点の座標(X,Y)は、以下のように記述される。 In calculating the coordinate transformation Ti, it is assumed that one of the two selected feature point pairs includes a combination of registered feature point P1 and matching feature point Q1, and the other includes a combination of registered feature point P2 and matching feature point Q2. The coordinates (X, Y) of each feature point are written as follows:
P1 (x1,y1)
Q1 (u1,v1)
P2 (x2,y2)
Q2 (u2,v2)
 座標変換算出部913は、座標変換後のP1-Q1間距離とP2-Q2間距離とが最小となるような座標変換Tiを、最小二乗法により求める。 The coordinate transformation calculation unit 913 uses the least squares method to find the coordinate transformation Ti that minimizes the distance between P1 and Q1 and the distance between P2 and Q2 after the coordinate transformation.
 回転角θ及び平行移動量(ΔX,ΔY)を含む座標変換Tiは、次式により表される。 The coordinate transformation Ti, which includes the rotation angle θ and the translation amount (ΔX, ΔY), is expressed by the following formula:
x’ = x cosθ - y sinθ + ΔX
y’ = x sinθ + y cosθ + ΔY   (1)
 (x’,y’)は、(x,y)を座標変換した後の座標を表す。式(1)を同次形式で表現すると、次式のようになる。 (x', y') represents the coordinates after (x, y) is transformed. When equation (1) is expressed in homogeneous form, it becomes the following equation.
[x’]   [cosθ  -sinθ  ΔX][x]
[y’] = [sinθ   cosθ  ΔY][y]   (2)
[1 ]   [  0      0    1][1]
 座標変換後のQ1及びQ2がそれぞれP1及びP2に合致するという条件を課した場合、次式が得られる。 If we impose the condition that Q1 and Q2 after coordinate transformation match P1 and P2, respectively, we obtain the following equation.
[x1]   [cosθ  -sinθ  ΔX][u1]
[y1] = [sinθ   cosθ  ΔY][v1]   (3)
[1 ]   [  0      0    1][1 ]

[x2]   [cosθ  -sinθ  ΔX][u2]
[y2] = [sinθ   cosθ  ΔY][v2]   (4)
[1 ]   [  0      0    1][1 ]
 式(3)及び式(4)をまとめると、次式が得られる。 Combining equations (3) and (4), we obtain the following equation.
AB=C   (5)
A = [cosθ  -sinθ  ΔX]
    [sinθ   cosθ  ΔY]   (6)
    [  0      0    1]

B = [u1  u2]
    [v1  v2]   (7)
    [ 1   1]

C = [x1  x2]
    [y1  y2]   (8)
    [ 1   1]
 座標変換算出部913は、式(5)を満たすθ、ΔX、及びΔYを最小二乗法により算出する。最小二乗法により誤差が最小となる行列Aを求めるには、行列Bの疑似逆行列B⁻¹を求めて、次式により行列Aを計算すればよい。 The coordinate transformation calculation unit 913 calculates θ, ΔX, and ΔY that satisfy equation (5) by the least squares method. To find the matrix A that minimizes the error in the least-squares sense, the pseudo-inverse B⁻¹ of the matrix B is computed, and the matrix A is obtained by the following formula.
A=CB⁻¹   (9)
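As a hedged illustration of recovering θ and (ΔX, ΔY) from two feature point pairs: instead of forming the matrix pseudo-inverse of equation (9), the sketch below uses the equivalent closed form for exactly two pairs, rotating the segment Q1-Q2 onto P1-P2 (Python assumed; function names are illustrative, and a production system would follow the matrix formulation directly).

```python
import math

def apply_transform(theta, dx, dy, pt):
    # Equation (1): rotate pt by theta, then translate by (dx, dy).
    c, s = math.cos(theta), math.sin(theta)
    return (c * pt[0] - s * pt[1] + dx, s * pt[0] + c * pt[1] + dy)

def fit_rigid_transform(p1, q1, p2, q2):
    # Angle that rotates the segment Q1->Q2 onto the segment P1->P2,
    # obtained from the cross and dot products of the two segments.
    dqx, dqy = q2[0] - q1[0], q2[1] - q1[1]
    dpx, dpy = p2[0] - p1[0], p2[1] - p1[1]
    theta = math.atan2(dqx * dpy - dqy * dpx, dqx * dpx + dqy * dpy)
    c, s = math.cos(theta), math.sin(theta)
    # Translation chosen so that the rotated Q1 lands exactly on P1.
    dx = p1[0] - (c * q1[0] - s * q1[1])
    dy = p1[1] - (s * q1[0] + c * q1[1])
    return theta, dx, dy

# round-trip demo: build P from Q with a known transform, then recover it
q1, q2 = (10.0, 5.0), (-2.0, 7.0)
p1 = apply_transform(0.25, 4.0, -3.0, q1)
p2 = apply_transform(0.25, 4.0, -3.0, q2)
theta, dx, dy = fit_rigid_transform(p1, q1, p2, q2)
print(round(theta, 6), round(dx, 6), round(dy, 6))  # 0.25 4.0 -3.0
```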
 次に、各特徴点の補正後の特徴ベクトルの算出方法について説明する。以下の説明において、回転角θだけ回転した生体画像を表す画像ベクトルI(θ)に対する特徴ベクトルは、x=(x(1),x(2),...,x(N))(Nは2以上の整数)により表される。一方、回転していない画像を表す画像ベクトルIに対する特徴ベクトルは、x’=(x’(1),x’(2),...,x’(N))により表される。画像ベクトルIが表す画像は、画像ベクトルI(θ)が表す画像を-θだけ回転させた画像に対応する。 Next, a method for calculating the feature vector after correction of each feature point will be described. In the following description, the feature vector for image vector I(θ) representing a biometric image rotated by a rotation angle θ is represented by x = (x(1), x(2), ..., x(N)) (N is an integer equal to or greater than 2). On the other hand, the feature vector for image vector I representing an unrotated image is represented by x' = (x'(1), x'(2), ..., x'(N)). The image represented by image vector I corresponds to the image represented by image vector I(θ) rotated by -θ.
 PCAを用いて特徴ベクトルが算出される場合、Iに対する特徴抽出は、次式により表される。 When feature vectors are calculated using PCA, feature extraction for I is expressed by the following formula:
I=Σx’(i)P(i)   (10) I = Σx'(i)P(i) (10)
 x’(i)(i=1~N)は、特徴ベクトルx’のi番目の要素を表し、P(i)は、i番目の基底ベクトル(主成分ベクトル)を表す。Σは、i=1~Nについての総和を表す。入退室管理システムの運用において、Nは、適当な整数に設定される。式(10)の右辺は、N個の基底ベクトルP(i)の一次結合を表し、x’(i)は係数を表す。 x'(i) (i = 1 to N) represents the i-th element of feature vector x', and P(i) represents the i-th basis vector (principal component vector). Σ represents the sum for i = 1 to N. In the operation of the access control system, N is set to an appropriate integer. The right-hand side of equation (10) represents a linear combination of the N basis vectors P(i), and x'(i) represents a coefficient.
 次に、回転角θの回転処理を行列R(θ)で表すと、I(θ)は、次式により表される。 Next, if the rotation process with a rotation angle θ is expressed as a matrix R(θ), I(θ) is expressed by the following formula.
I(θ)=R(θ)I   (11) I(θ) = R(θ)I (11)
 一方、I(θ)に対する特徴抽出は、次式により表される。 On the other hand, feature extraction for I(θ) is expressed by the following formula:
I(θ)=Σx(i)P(i)   (12) I(θ)=Σx(i)P(i)   (12)
 I(θ)は、θだけ回転しているのに対して、P(i)は回転していない。このため、x(i)は、回転誤差を含んでいる。しかし、I(θ)と同じθだけP(i)を回転させた基底ベクトルP(θ,i)を用いて特徴抽出を行うことができれば、回転していないIに対する特徴抽出と同等の結果が得られる。P(θ,i)を用いたI(θ)に対する特徴抽出は、次式により表される。 I(θ) is rotated by θ, while P(i) is not. For this reason, x(i) contains a rotation error. However, if feature extraction can be performed using basis vector P(θ, i) obtained by rotating P(i) by the same θ as I(θ), then the same results can be obtained as with feature extraction for unrotated I. Feature extraction for I(θ) using P(θ, i) is expressed by the following equation.
I(θ)=Σx(i)P(i)=Σx’(i)P(θ,i)   (13) I(θ)=Σx(i)P(i)=Σx'(i)P(θ,i)  (13)
 式(13)のx’(i)は、P(θ,i)を用いて抽出される特徴ベクトルx’のi番目の要素である。x’(i)を用いて照合スコアを算出することで、x(i)に含まれる回転誤差をキャンセルすることができる。以下では、特徴ベクトルx’を、補正後の特徴ベクトルx’と記載することがある。補正後の特徴ベクトルx’は、補正後の特徴量情報の一例である。 In equation (13), x'(i) is the i-th element of feature vector x' extracted using P(θ, i). By calculating the matching score using x'(i), the rotation error contained in x(i) can be cancelled. Hereinafter, feature vector x' may be referred to as corrected feature vector x'. Corrected feature vector x' is an example of corrected feature amount information.
 式(13)の両辺とP(θ,n)との内積をとると、次式が得られる。 Taking the inner product of both sides of equation (13) with P(θ,n) gives the following equation.
(Σx(i)P(i))P(θ,n)
=(Σx’(i)P(θ,i))P(θ,n)   (14)
 P(θ,i)は正規直交系の基底ベクトルであるため、次式が成り立つ。 Since P(θ, i) is an orthonormal basis vector, the following equation holds:
P(θ,i)P(θ,n)=δ(i,n)   (15) P(θ,i)P(θ,n)=δ(i,n) (15)
 δ(i,n)は、クロネッカーのδである。したがって、i=nのときδ(i,n)=1となり、i≠nのときδ(i,n)=0となる。式(14)に対して式(15)を適用すると、次式が得られる。 δ(i,n) is the Kronecker δ. Therefore, when i=n, δ(i,n)=1, and when i≠n, δ(i,n)=0. Applying equation (15) to equation (14), we obtain the following equation.
x’(n)=(Σx(i)P(i))P(θ,n)   (16) x'(n) = (Σx(i)P(i))P(θ,n) (16)
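Equation (16) can be checked numerically. The sketch below (Python with NumPy assumed) models the rotation R(θ) and the basis P(i) as generic orthogonal matrices, which is a simplification of actual image rotation but preserves the orthonormality the derivation relies on.

```python
import numpy as np

rng = np.random.default_rng(1)
N = 6  # dimension of a toy image space

# Orthonormal basis P(i) as rows, and an orthogonal operator standing in
# for the image rotation R(theta); both obtained via QR decomposition.
P = np.linalg.qr(rng.standard_normal((N, N)))[0].T
R = np.linalg.qr(rng.standard_normal((N, N)))[0]

I = rng.standard_normal(N)   # unrotated image vector
I_theta = R @ I              # equation (11): rotated image
P_theta = P @ R.T            # rotated basis: P(theta, i) = R P(i)

x = P @ I_theta              # equation (12): features of the rotated image
x_true = P @ I               # features of the unrotated image

# Equation (16): x'(n) = (sum_i x(i) P(i)) . P(theta, n)
x_corrected = P_theta @ (P.T @ x)
print(np.allclose(x_corrected, x_true))  # True
```

The corrected coefficients match the features of the unrotated image, confirming that projecting onto the rotated basis cancels the rotation error in x(i).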
 次に、P(i)を用いたP(θ,n)に対する特徴抽出は、次式により表される。 Next, feature extraction for P(θ,n) using P(i) is expressed by the following formula:
P(θ,n)=Σα(θ,n,i)P(i)   (17) P(θ,n)=Σα(θ,n,i)P(i)   (17)
 α(θ,n,i)は、P(θ,n)に対して特徴抽出を行うことで得られる特徴ベクトルα(θ,n)のi番目の要素である。α(θ,n,i)は、P(i)を回転させたときにどのように変化するかを表すパラメータであり、P(i)毎に固有の値を持つ。α(θ,n,i)は、特徴量の補正パラメータに対応する。式(16)に式(17)を代入すると、次式が得られる。 α(θ,n,i) is the i-th element of the feature vector α(θ,n) obtained by performing feature extraction on P(θ,n). α(θ,n,i) is a parameter that indicates how P(i) changes when it is rotated, and has a unique value for each P(i). α(θ,n,i) corresponds to the correction parameter for the feature. Substituting equation (17) into equation (16), the following equation is obtained.
x’(n)
=(Σx(i)P(i))(Σα(θ,n,i)P(i))   (18)
 P(i)は正規直交系の基底ベクトルであるため、次式が成り立つ。 Since P(i) is an orthonormal basis vector, the following equation holds:
P(i)P(n)=δ(i,n)   (19) P(i)P(n) = δ(i,n) (19)
 式(18)に対して式(19)を適用すると、次式が得られる。 By applying equation (19) to equation (18), we obtain the following equation.
x’(n)=Σx(i)α(θ,n,i)   (20) x'(n) = Σx(i)α(θ,n,i) (20)
 したがって、x’(n)は、特徴ベクトルxと特徴ベクトルα(θ,n)の内積により表される。α(θ,n)は、P(θ,n)の特徴ベクトルであり、事前に算出することができる。α(θ,n)は、θの関数としてフィッティングすることで、数値的に求めることができる。α(θ,n,i)は、例えば、次式により表される。 Therefore, x'(n) is expressed as the inner product of feature vector x and feature vector α(θ,n). α(θ,n) is the feature vector of P(θ,n) and can be calculated in advance. α(θ,n) can be obtained numerically by fitting it as a function of θ. α(θ,n,i) is expressed, for example, by the following equation.
α(θ,n,i)=F(n,i,θ)   (21) α(θ,n,i)=F(n,i,θ) (21)
 F(n,i,θ)は、フィッティング関数である。この場合、F(n,i,θ)の関数形を規定する係数等のパラメータが、フィッティング情報922として記憶部917に格納される。一例として、F(n,i,θ)がθの2次関数である場合、α(θ,n,i)は、次式により表される。 F(n, i, θ) is a fitting function. In this case, parameters such as coefficients that define the function form of F(n, i, θ) are stored in the storage unit 917 as fitting information 922. As an example, if F(n, i, θ) is a quadratic function of θ, α(θ, n, i) is expressed by the following equation.
α(θ,n,i)
=p1(n,i)θ²+p2(n,i)θ+p3(n,i)   (22)
 p1(n,i)、p2(n,i)、及びp3(n,i)は、フィッティング係数であり、フィッティング情報922として用いられる。式(20)に式(22)を代入すると、次式が得られる。 p1(n,i), p2(n,i), and p3(n,i) are fitting coefficients and are used as fitting information 922. Substituting equation (22) into equation (20), the following equation is obtained.
x’(n)
=Σx(i)(p1(n,i)θ²+p2(n,i)θ+p3(n,i))   (23)
 補正後の特徴ベクトルx’は、回転していない画像を表す画像ベクトルIに対する特徴ベクトルであるため、x’(n)を用いて照合スコアを算出することで、照合スコアの精度を向上させることができる。 The corrected feature vector x' is a feature vector for the image vector I that represents an unrotated image, so the accuracy of the matching score can be improved by calculating the matching score using x'(n).
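The two-stage use of the fitting information can be sketched as follows: fit equation (22) offline, then evaluate equation (23) at match time. Python with NumPy assumed; the synthetic α samples here stand in for values that would actually be precomputed from rotated basis vectors.

```python
import numpy as np

def fit_alpha_quadratic(thetas, alpha_samples):
    # Offline: fit equation (22), alpha ≈ p1*theta^2 + p2*theta + p3,
    # independently for every (n, i) entry. alpha_samples has shape
    # (len(thetas), N, N); the result has shape (3, N, N).
    flat = alpha_samples.reshape(len(thetas), -1)
    coeffs = np.polyfit(np.asarray(thetas), flat, deg=2)  # highest power first
    return coeffs.reshape(3, *alpha_samples.shape[1:])

def correct_features(x, theta, p):
    # Online: equation (23), x'(n) = sum_i x(i) * alpha(theta, n, i).
    p1, p2, p3 = p
    alpha = p1 * theta ** 2 + p2 * theta + p3
    return alpha @ x

# demo with synthetic, exactly quadratic alpha samples
rng = np.random.default_rng(0)
N = 4
p_true = rng.standard_normal((3, N, N))
thetas = np.linspace(-0.5, 0.5, 9)
samples = np.stack([p_true[0] * t ** 2 + p_true[1] * t + p_true[2]
                    for t in thetas])
p_fit = fit_alpha_quadratic(thetas, samples)
print(np.allclose(p_fit, p_true, atol=1e-6))  # True
```

Only the fitted coefficients (the fitting information 922) need to be stored; α(θ,n,i) is reconstructed for any θ at match time.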
 照合スコア算出部915は、座標変換Tに含まれる回転角θとフィッティング情報922とを用いて、α(θ,n,i)を算出する。そして、照合スコア算出部915は、照合対象データ924に含まれる各特徴点の特徴ベクトルxとα(θ,n,i)とを用いて、式(20)により、補正後の特徴ベクトルx’を算出する。 The matching score calculation unit 915 calculates α(θ, n, i) using the rotation angle θ included in the coordinate transformation T and the fitting information 922. The matching score calculation unit 915 then calculates a corrected feature vector x' using the feature vector x of each feature point included in the matching target data 924 and α(θ, n, i) according to formula (20).
 次に、照合スコア算出部915は、照合対象データ924に含まれる各特徴点の補正後の特徴ベクトルx’と、各登録者の登録データに含まれる各特徴点の特徴ベクトルとを用いて、照合スコアを算出する。照合スコアとしては、2つの特徴ベクトルの類似度又は相違度を用いることができる。類似度は、2つの特徴ベクトルの内積であってもよく、相違度は、2つの特徴ベクトルのベクトル間距離であってもよい。 Next, the matching score calculation unit 915 calculates a matching score using the corrected feature vector x' of each feature point included in the matching target data 924 and the feature vector of each feature point included in the registered data of each registrant. As the matching score, the similarity or dissimilarity of two feature vectors can be used. The similarity may be the inner product of two feature vectors, and the dissimilarity may be the vector distance between two feature vectors.
 補正後の特徴ベクトルx’を用いて算出される照合スコアは、照合特徴点と登録特徴点との何れか一方を座標変換した特徴点と、照合特徴点と登録特徴点との他方との照合スコアに対応する。 The matching score calculated using the corrected feature vector x' corresponds to the feature point obtained by transforming the coordinates of either the matching feature point or the registered feature point, and the matching score between the other of the matching feature point and the registered feature point.
 図10は、図9の照合スコア算出装置901が行う第1の生体認証処理の例を示すフローチャートである。まず、生体画像取得部911は、照合対象者の生体画像923を取得する(ステップ1001)。そして、特徴抽出部912は、生体画像923から複数の特徴点を抽出し、各特徴点の特徴ベクトルを算出し、各特徴点の座標及び特徴ベクトルを含む照合対象データ924を生成する(ステップ1002)。 FIG. 10 is a flowchart showing an example of the first biometric authentication process performed by the matching score calculation device 901 of FIG. 9. First, the biometric image acquisition unit 911 acquires a biometric image 923 of the person to be matched (step 1001). Then, the feature extraction unit 912 extracts a plurality of feature points from the biometric image 923, calculates a feature vector for each feature point, and generates matching target data 924 including the coordinates and feature vector for each feature point (step 1002).
 次に、座標変換算出部913は、照合対象データ924に含まれる特徴点と、各登録者の登録データに含まれる特徴点との組み合わせを表す特徴点ペアを生成し、2つの特徴点ペアのM個(Mは1以上の整数)の組み合わせを生成する。特徴点スコアが良好な順に選択されたX個の特徴点ペアから、選択可能な2つの特徴点ペアの組み合わせを生成する場合、M=XC2となる。そして、座標変換算出部913は、制御変数iに1を設定する(ステップ1003)。 Next, the coordinate transformation calculation unit 913 generates feature point pairs representing combinations of feature points included in the matching target data 924 and feature points included in the enrollment data of each enrolled person, and generates M (M is an integer equal to or greater than 1) combinations of two feature point pairs. When generating all selectable combinations of two feature point pairs from the X feature point pairs selected in order of best feature point score, M = XC2. Then, the coordinate transformation calculation unit 913 sets a control variable i to 1 (step 1003).
 次に、座標変換算出部913は、M個の組み合わせの中からi番目の組み合わせを選択し、選択された組み合わせに対応する2つの特徴点ペアに含まれる各特徴点の座標を用いて、最小二乗法により、i番目の座標変換Tiを算出する(ステップ1004)。そして、座標変換算出部913は、座標変換Tiの妥当性をチェックする(ステップ1005)。 The coordinate transformation calculation unit 913 then selects the i-th combination from the M combinations, and calculates the i-th coordinate transformation Ti by the least squares method using the coordinates of each feature point included in the two feature point pairs corresponding to the selected combination (step 1004).The coordinate transformation calculation unit 913 then checks the validity of the coordinate transformation Ti (step 1005).
 座標変換Tiの計算における最小二乗残差が所定値よりも小さい場合、座標変換Tiは妥当と判定され、最小二乗残差が所定値以上である場合、座標変換Tiは妥当ではないと判定される。 If the least squares residual in the calculation of the coordinate transformation Ti is smaller than a predetermined value, the coordinate transformation Ti is determined to be valid, and if the least squares residual is equal to or greater than the predetermined value, the coordinate transformation Ti is determined to be invalid.
 座標変換Tiが妥当ではない場合(ステップ1005,NO)、座標変換算出部913は、iを1だけインクリメントし(ステップ1014)、照合スコア算出装置901は、ステップ1004以降の処理を繰り返す。 If the coordinate transformation Ti is not valid (step 1005, NO), the coordinate transformation calculation unit 913 increments i by 1 (step 1014), and the matching score calculation device 901 repeats the processing from step 1004 onwards.
 座標変換Tiが妥当である場合(ステップ1005,YES)、座標変換部914は、座標変換Tiを用いて、照合対象データ924に含まれる各特徴点の座標を座標変換する(ステップ1006)。 If the coordinate transformation Ti is valid (step 1005, YES), the coordinate transformation unit 914 uses the coordinate transformation Ti to transform the coordinates of each feature point included in the matching target data 924 (step 1006).
 次に、照合スコア算出部915は、座標変換Tiに含まれる回転角θとフィッティング情報922とを用いて、照合対象データ924に含まれる各特徴点の補正後の特徴ベクトルを算出する(ステップ1007)。 Next, the matching score calculation unit 915 uses the rotation angle θ included in the coordinate transformation Ti and the fitting information 922 to calculate the corrected feature vectors of each feature point included in the matching target data 924 (step 1007).
 次に、照合スコア算出部915は、照合対象データ924に含まれる特徴点と、登録データに含まれる特徴点とを対応付ける探索処理を行う(ステップ1008)。そして、照合スコア算出部915は、対応付けられた2つの特徴点の照合スコアから、座標変換Tiに対する照合スコアSiを算出する計算処理を行う(ステップ1009)。照合スコアSiは、登録者毎に算出される。 Next, the matching score calculation unit 915 performs a search process to match feature points included in the matching target data 924 with feature points included in the registered data (step 1008). Then, the matching score calculation unit 915 performs a calculation process to calculate a matching score Si for the coordinate transformation Ti from the matching scores of the two matched feature points (step 1009). The matching score Si is calculated for each registered user.
 次に、照合スコア算出部915は、照合対象データ924と各登録者の登録データとの照合スコア925を更新する(ステップ1010)。生体認証処理の開始時に、各登録者の照合スコア925は初期値に設定されている。ステップ1009において算出された照合スコアSiが照合スコア925よりも良好である場合、照合スコア算出部915は、照合スコア925に照合スコアSiを設定することで、照合スコア925を更新する。 Next, the matching score calculation unit 915 updates the matching score 925 between the matching target data 924 and the registered data of each registered person (step 1010). At the start of the biometric authentication process, the matching score 925 of each registered person is set to an initial value. If the matching score Si calculated in step 1009 is better than the matching score 925, the matching score calculation unit 915 updates the matching score 925 by setting the matching score Si to the matching score 925.
 照合スコアが2つの特徴ベクトルの類似度を表す場合、照合スコアは大きいほど良好な値となる。一方、照合スコアが2つの特徴ベクトルの相違度を表す場合、照合スコアは小さいほど良好な値となる。 If the matching score represents the similarity between two feature vectors, the larger the matching score, the better the value. On the other hand, if the matching score represents the dissimilarity between two feature vectors, the smaller the matching score, the better the value.
 次に、座標変換算出部913は、iを1だけインクリメントし(ステップ1011)、iをMと比較する(ステップ1012)。iがM以下である場合(ステップ1012,NO)、照合スコア算出装置901は、ステップ1004以降の処理を繰り返す。 Next, the coordinate transformation calculation unit 913 increments i by 1 (step 1011) and compares i with M (step 1012). If i is equal to or smaller than M (step 1012, NO), the matching score calculation device 901 repeats the processes from step 1004 onwards.
 iがMよりも大きい場合(ステップ1012,YES)、照合スコア算出部915は、照合スコア925を用いて照合対象者に対する生体認証を行い、認証結果を生成する(ステップ1013)。 If i is greater than M (step 1012, YES), the matching score calculation unit 915 performs biometric authentication on the person to be matched using the matching score 925 and generates an authentication result (step 1013).
 生体認証において、照合スコア算出部915は、例えば、登録テンプレート921に登録された複数の登録者のうち、所定の条件を満たす登録者を特定する。照合スコアが類似度を表す場合、所定の条件は、照合スコア925が判定閾値よりも大きいことであってもよい。照合スコアが相違度を表す場合、所定の条件は、照合スコア925が判定閾値よりも小さいことであってもよい。 In biometric authentication, the matching score calculation unit 915, for example, identifies registrants who satisfy a predetermined condition from among multiple registrants registered in the registration template 921. When the matching score represents the similarity, the predetermined condition may be that the matching score 925 is greater than a judgment threshold. When the matching score represents the dissimilarity, the predetermined condition may be that the matching score 925 is less than a judgment threshold.
 照合スコア算出部915は、所定の条件を満たす登録者が特定された場合、認証成功を示す認証結果を生成し、所定の条件を満たす登録者が特定されなかった場合、認証失敗を示す認証結果を生成する。これにより、照合対象者が登録テンプレート921に登録されているか否かを判定することができる。通信部916は、生成された認証結果を制御装置902へ送信する。 If a registered person who satisfies the specified conditions is identified, the matching score calculation unit 915 generates an authentication result indicating successful authentication, and if a registered person who satisfies the specified conditions is not identified, it generates an authentication result indicating unsuccessful authentication. This makes it possible to determine whether or not the person to be matched is registered in the registration template 921. The communication unit 916 transmits the generated authentication result to the control device 902.
FIG. 11 is a flowchart showing an example of the search process in step 1008 of FIG. 10. First, the matching score calculation unit 915 initializes the feature point pair list (step 1101), sets a control variable i to 1 (step 1102), and sets a control variable j to 1 (step 1103).
Next, the matching score calculation unit 915 compares the distance D(i,j) between the i-th (i=1 to N1) feature point included in the registered data and the j-th (j=1 to N2) feature point included in the matching target data 924 with a threshold TH1 (step 1104). N1 represents the number of feature points included in the registered data, and N2 represents the number of feature points included in the matching target data 924.
If D(i,j) is equal to or greater than TH1 (step 1104, NO), the matching score calculation unit 915 increments j by 1 (step 1105) and compares j with N2 (step 1106). If j is equal to or smaller than N2 (step 1106, NO), the matching score calculation unit 915 repeats the processing from step 1104 onwards.
If j is greater than N2 (step 1106, YES), the matching score calculation unit 915 increments i by 1 (step 1107) and compares i with N1 (step 1108). If i is equal to or smaller than N1 (step 1108, NO), the matching score calculation unit 915 repeats the processing from step 1103 onwards.
If D(i,j) is smaller than TH1 (step 1104, YES), the matching score calculation unit 915 performs the process of step 1109. In step 1109, the matching score calculation unit 915 adds the combination of the i-th feature point included in the registered data and the j-th feature point included in the matching target data 924 to the feature point pair list as an associated feature point pair.
If i is greater than N1 (step 1108, YES), the matching score calculation unit 915 ends the process.
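The nested-loop search of steps 1101 to 1109 can be sketched as follows. This is an illustrative sketch: the function name `search_feature_point_pairs` and the use of Euclidean distance for D(i,j) are assumptions introduced here, and the indices are 0-based rather than the flowchart's 1-based i and j.

```python
import math

def search_feature_point_pairs(registered_pts, target_pts, th1):
    """Pair up registered and matching-target feature points whose
    distance D(i, j) is below the threshold TH1.

    registered_pts, target_pts: lists of (x, y) coordinate tuples.
    Returns a feature point pair list of (i, j) index pairs.
    """
    pair_list = []                                  # step 1101
    for i, (xr, yr) in enumerate(registered_pts):   # loop over i (steps 1102, 1107, 1108)
        for j, (xt, yt) in enumerate(target_pts):   # loop over j (steps 1103, 1105, 1106)
            d = math.hypot(xr - xt, yr - yt)        # D(i, j), Euclidean distance assumed
            if d < th1:                             # step 1104, YES
                pair_list.append((i, j))            # step 1109
    return pair_list
```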
FIG. 12 is a flowchart showing an example of the calculation process in step 1009 of FIG. 10. First, the matching score calculation unit 915 sets the matching score Si for the coordinate transformation Ti to 0, which indicates that the matching is invalid (step 1201).
Next, the matching score calculation unit 915 compares the number of feature point pairs included in the feature point pair list with a threshold TH2 (step 1202). If the number of feature point pairs is equal to or greater than TH2 (step 1202, NO), the matching score calculation unit 915 determines that the matching is valid.
In this case, the matching score calculation unit 915 calculates a matching score for each feature point pair, using the corrected feature vector of the feature point included in the matching target data 924 and the feature vector of the feature point included in the registered data. The matching score calculation unit 915 then selects the top K matching scores (K is an integer of 1 or more), starting from the best matching score, and sets a statistic of those K matching scores as Si (step 1203). As the statistic, an average, a median, a maximum value, a minimum value, or the like is used.
If the number of feature point pairs is smaller than TH2 (step 1202, YES), the matching score calculation unit 915 determines that the matching is invalid and ends the process.
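The computation of Si in steps 1201 to 1203 can be sketched as follows; the function name `score_for_transform` and its parameters are assumptions introduced for illustration.

```python
import statistics

def score_for_transform(pair_scores, th2, k, stat="mean", higher_is_better=True):
    """Compute Si for one coordinate transformation Ti.

    pair_scores: per-feature-point-pair matching scores.
    Returns 0 (matching invalid) when fewer than TH2 pairs exist;
    otherwise the chosen statistic of the K best scores.
    """
    if len(pair_scores) < th2:                      # step 1202, YES: invalid
        return 0
    # Select the top K scores, starting from the best (step 1203).
    best = sorted(pair_scores, reverse=higher_is_better)[:k]
    funcs = {"mean": statistics.mean, "median": statistics.median,
             "max": max, "min": min}
    return funcs[stat](best)
```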
According to the entrance/exit management system of FIG. 9, even if the matching target data 924 to which the normalization process has been applied contains a rotation error, a matching score 925 equivalent to that obtained when no rotation error is contained can be obtained, so that the accuracy of the matching score 925 improves.
In addition, after the angle difference θ is calculated, there is no need to rotate the biometric image 923 of the person to be matched by -θ and extract the feature vectors again, so that increases in processing time and memory consumption can be suppressed, and biometric authentication that is also desirable from a security standpoint is achieved.
Incidentally, in biometric authentication, as described in Patent Document 3, the Hamming distance is sometimes used to speed up the matching process.
FIG. 13 shows an example of a matching score using the Hamming distance. FIG. 13(a) shows an example of binary data representing a feature amount. When the Hamming distance is used, a feature amount 1301 is binarized in advance and converted into binary data 1302 consisting of 0s and 1s.
FIG. 13(b) shows an example of a matching score based on binary data. When binary data 1302 included in the registered data is compared with binary data 1303 included in the matching target data, bits in the same digit positions of binary data 1302 and binary data 1303 are compared with each other. In this case, the number of bits that differ between binary data 1302 and binary data 1303 is obtained as the Hamming distance and used as the matching score.
Specifically, the exclusive OR (XOR) of binary data 1302 and binary data 1303 is calculated bit by bit, and the number of logical "1" values included in the resulting binary data 1304, namely "9", is obtained as the Hamming distance.
The Hamming distance can be calculated at high speed by using the bit operation functions of a CPU (Central Processing Unit). The Hamming distance is particularly advantageous on CPUs for embedded devices and the like that have weak floating-point arithmetic.
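A minimal sketch of the XOR-and-popcount computation of FIG. 13(b); the function name is an assumption introduced here.

```python
def hamming_distance(a: bytes, b: bytes) -> int:
    """Hamming distance between two equal-length binary features:
    XOR the bit strings and count the logical '1' bits (popcount)."""
    assert len(a) == len(b), "features must have the same bit length"
    x = int.from_bytes(a, "big") ^ int.from_bytes(b, "big")
    return bin(x).count("1")  # int.bit_count() on Python 3.10+ is faster
```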
Furthermore, as described in Patent Document 2, feature amounts may be encrypted to improve security.
FIG. 14 shows an example of encryption of binary data. Encryption data 1411 is random number data used to encrypt the binary data. Encrypted binary data 1421 is generated by calculating the XOR of binary data 1401 included in the registered data and the encryption data 1411 bit by bit. Similarly, encrypted binary data 1422 is generated by calculating the XOR of binary data 1402 included in the matching target data and the encryption data 1411 bit by bit.
The encryption data 1411 is binary data of the same length as binary data 1401 and binary data 1402, and is generated based on AES (Advanced Encryption Standard) or the like.
Due to the properties of the XOR operation, if the Hamming distance between binary data 1401 and binary data 1402 is HD, the Hamming distance between encrypted binary data 1421 and encrypted binary data 1422 is also HD. By using such an encryption method, it becomes possible to calculate the matching score while the feature amounts remain encrypted.
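The distance-preserving property can be checked with a small sketch. Note the patent generates the mask from AES or the like; here random bytes stand in for the encryption data 1411, and the function names are assumptions introduced for illustration.

```python
import secrets

def xor_mask(data: bytes, key: bytes) -> bytes:
    """XOR-encrypt a binary feature with a same-length mask."""
    return bytes(d ^ k for d, k in zip(data, key))

def hd(a: bytes, b: bytes) -> int:
    """Plain Hamming distance via XOR and popcount."""
    x = int.from_bytes(a, "big") ^ int.from_bytes(b, "big")
    return bin(x).count("1")

key = secrets.token_bytes(4)                 # stand-in for encryption data 1411
reg, probe = b"\x12\x34\x56\x78", b"\x12\x35\x46\x78"
# The mask cancels out in the XOR, so the Hamming distance is preserved.
assert hd(xor_mask(reg, key), xor_mask(probe, key)) == hd(reg, probe)
```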
Next, a calculation method for calculating the matching score when the feature vectors are encrypted will be described. In this case, since x(i) is encrypted, it is not appropriate to calculate x'(n) using equation (20). Therefore, the following approximations are applied to α(θ,n,i).
α(θ,n,n) ≈ 1   (31)
α(θ,n,i) ≈ 0 (n≠i)   (32)
Equation (31) indicates that α(θ,n,i) is a value close to 1 when n=i. Equation (32) indicates that α(θ,n,i) is a value close to 0 when n≠i.
α(θ,n,n) represents the extent to which the basis vector P(θ,n) rotated by θ contains the original basis vector P(n). On the other hand, α(θ,n,i) represents the extent to which the basis vector P(θ,n) rotated by θ contains a basis vector P(i) other than the original basis vector P(n). When θ is sufficiently small, as in biometric authentication to which the normalization process is applied, P(θ,n) is considered not to deviate greatly from P(n), so these approximations hold.
Applying the approximation of equation (32) to equation (20) and regarding α(θ,n,i) as 0 when n≠i yields the following equation.
x'(n) = x(n)α(θ,n,n)   (33)
α(θ,n,n) can also be regarded as a coefficient representing the stability of x'(n) against rotation. For example, when α(θ,n,n) is a large value (close to 1), x'(n) is less susceptible to the effects of rotation. Conversely, when α(θ,n,n) is a small value (smaller than 1), x'(n) is more susceptible to the effects of rotation.
FIG. 15 shows an example of the relationship between α(θ,n,n) and the rotation angle θ. A curve 1501 represents α(θ,n,n) that is not easily affected by rotation, and a curve 1502 represents α(θ,n,n) that is easily affected by rotation. Whether α(θ,n,n) is easily affected by rotation varies depending on the value of n and is determined by the characteristics of P(i).
Even when x(n) is encrypted, x'(n) can be calculated by using equation (33) instead of equation (20).
As an example, assume that the feature vector x included in the matching target data and the feature vectors included in the registered data are encrypted binary data, and that a Hamming distance that takes weights into account is used as the matching score. In this case, the matching score calculation unit 915 calculates the matching score SD between the feature vector x and a feature vector included in the registered data using the following equation.
SD = Σw(i)d(i)   (34)
d(i) represents the XOR of the i-th element x(i) of the feature vector x and the i-th element of the feature vector included in the registered data. w(i) represents the weight for d(i), and the right-hand side of equation (34) represents the weighted sum of d(i).
α(θ,i,i) is used as w(i). Since α(θ,i,i) is regarded as a coefficient representing the stability of x'(i) against rotation, using α(θ,i,i) as the weight for each d(i) can improve the accuracy of the matching score SD.
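Equation (34) can be sketched as follows; the function name `weighted_hamming` and the bit-list representation are assumptions introduced for illustration.

```python
def weighted_hamming(x_bits, y_bits, weights):
    """Matching score SD = sum_i w(i) * d(i) of equation (34).

    x_bits, y_bits: sequences of 0/1 integers (encrypted binary features).
    weights: w(i) = alpha(theta, i, i), each bit's stability against rotation.
    d(i) is the per-bit XOR, so stable bits contribute more to SD.
    """
    assert len(x_bits) == len(y_bits) == len(weights)
    return sum(w * (xb ^ yb) for w, xb, yb in zip(weights, x_bits, y_bits))
```

With all weights equal to 1, SD reduces to the plain Hamming distance.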
In the following, the method of calculating the matching score using x'(n) of equation (20) may be referred to as the first method, and the method of calculating the matching score SD using equation (34) may be referred to as the second method.
Compared with the first method, the second method has a limited accuracy improvement effect but has the advantage of a wider range of applicability. According to the second method, the matching score SD can be calculated even while the feature vectors included in the matching target data and the registered data remain encrypted.
It is also possible to calculate the matching score SD using normalized w(i). Normalizing w(i) realizes stable calculation of the matching score SD. The normalized weight w'(i) can be calculated, for example, by the following equation.
w'(i) = w(i)/(Σw(i)/N)   (35)
According to w'(i) of equation (35), the sum of w'(1) to w'(N) becomes the constant value N, so the matching score SD can be prevented from taking an abnormal value.
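The normalization of equation (35) can be sketched as follows (the function name is an assumption introduced here): dividing each weight by the mean weight keeps the relative weighting but forces the total to the constant value N.

```python
def normalize_weights(w):
    """Equation (35): w'(i) = w(i) / (sum(w) / N).

    The normalized weights always sum to N, keeping the matching
    score SD on a stable scale regardless of the raw weight magnitudes.
    """
    mean = sum(w) / len(w)          # Σw(i)/N
    return [wi / mean for wi in w]
```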
FIG. 16 is a flowchart showing an example of the second biometric authentication process performed by the matching score calculation device 901 of FIG. 9. In the biometric authentication process of FIG. 16, the second method is used, and the feature vectors included in the matching target data and the registered data are encrypted binary data.
The processing in steps 1601 to 1606, step 1608, and steps 1610 to 1614 is similar to the processing in steps 1001 to 1006, step 1008, and steps 1010 to 1014 in FIG. 10.
In step 1607, the matching score calculation unit 915 calculates α(θ,i,i) as the weight w(i) of equation (34). The matching score calculation unit 915 calculates α(θ,i,i) using the rotation angle θ included in the coordinate transformation Ti and the fitting information 922.
In step 1609, the matching score calculation unit 915 performs the calculation process of FIG. 12. In this case, in step 1203, the matching score calculation unit 915 calculates the matching score for each feature point pair using equation (34). The matching score calculation unit 915 then selects the top K matching scores, starting from the best matching score, and sets the statistic of those K matching scores as Si.
In equation (34), the XOR of each element of the encrypted binary data included in the matching target data and the registered data is used as d(i). However, instead of the XOR, another dissimilarity or similarity that can be calculated from the encrypted binary data may be used as d(i).
FIG. 17 shows an example of the functional configuration of a flapper gate management system including the matching score calculation device 701 of FIG. 7. The flapper gate management system of FIG. 17 includes a feature acquisition device 1701, a control device 1702, and a matching score calculation device 1703. The feature acquisition device 1701 is, for example, a client, and the matching score calculation device 1703 is, for example, a server. In the flapper gate management system, the feature vectors of the person to be matched flow over a communication network, so the second method is used and the feature vectors are encrypted.
The matching score calculation device 1703 corresponds to the matching score calculation device 701 of FIG. 7. The flapper gate management system is an example of a matching score calculation system.
The feature acquisition device 1701 is installed near the flapper gate and includes a biometric image acquisition unit 1711, a feature extraction unit 1712, a communication unit 1713, and a storage unit 1714. The feature extraction unit 1712 is an example of an acquisition unit, and the communication unit 1713 is an example of a transmission unit.
The biometric image acquisition unit 1711 acquires a biometric image 1721 of the person to be matched and stores it in the storage unit 1714. The biometric image 1721 is, for example, a fingerprint image, a palm vein image, a palm print image, or a face image. The biometric image acquisition unit 1711 is, for example, a fingerprint sensor, a vein sensor, or an image sensor.
The feature extraction unit 1712 extracts multiple feature points from the biometric image 1721 and calculates a feature vector from pixel values in the vicinity of each feature point. As the feature vector, for example, encrypted binary data is calculated. The feature extraction unit 1712 then generates matching target data 1722 including the coordinates and feature vector of each of the multiple feature points, and stores the data in the storage unit 1714. The communication unit 1713 transmits the matching target data 1722 to the matching score calculation device 1703 via a communication network.
The matching score calculation device 1703 calculates a matching score using the matching target data 1722, performs biometric authentication on the person to be matched using the matching score, and transmits the authentication result to the control device 1702. The control device 1702 controls opening and closing of the flapper gate based on the authentication result. For example, when the authentication result indicates success, the control device 1702 performs control to open the flapper gate, and when the authentication result indicates failure, the control device 1702 performs control to close the flapper gate.
The matching score calculation device 1703 includes a coordinate transformation calculation unit 1731, a coordinate transformation unit 1732, a matching score calculation unit 1733, a communication unit 1734, and a storage unit 1735. The coordinate transformation unit 1732, the matching score calculation unit 1733, and the storage unit 1735 correspond to the coordinate transformation unit 712, the matching score calculation unit 713, and the storage unit 711 of FIG. 7, respectively. The communication unit 1734 is an example of a receiving unit.
The storage unit 1735 stores a registration template 1741 and fitting information 1742. The registration template 1741 includes registered data for each of multiple registrants, and the registered data of each registrant includes the coordinates and feature vector of each of multiple feature points. As the feature vectors, for example, encrypted binary data is used. The fitting information 1742 is similar to the fitting information 922 of FIG. 9.
The communication unit 1734 receives the matching target data 1722 from the feature acquisition device 1701, and the storage unit 1735 stores the matching target data 1722. The coordinate transformation calculation unit 1731 calculates a coordinate transformation T for aligning the matching target data 1722 with the registered data of each registrant in the registration template 1741. The coordinate transformation T includes, for example, a rotation angle θ and translation amounts (ΔX, ΔY) in the X and Y directions.
The coordinate transformation unit 1732 transforms the coordinates of each feature point included in the matching target data 1722 using the coordinate transformation T.
The matching score calculation unit 1733 calculates α(θ,i,i) using the rotation angle θ included in the coordinate transformation T and the fitting information 1742. The matching score calculation unit 1733 then calculates a matching score 1743 between the matching target data 1722 and the registered data, using α(θ,i,i) as the weight w(i) of equation (34).
At this time, the matching score calculation unit 1733 calculates the matching score 1743 using the transformed coordinates of each feature point included in the matching target data 1722, the feature vector of each feature point, and the registered data of each registrant included in the registration template 1741. The matching score calculation unit 1733 then stores the matching score 1743 in the storage unit 1735.
Next, the matching score calculation unit 1733 performs biometric authentication on the person to be matched using the matching score 1743 and generates an authentication result. The communication unit 1734 transmits the authentication result to the control device 1702 via the communication network.
The processing performed by the feature acquisition device 1701 is similar to the processing in steps 1601 and 1602 of FIG. 16, and the processing performed by the matching score calculation device 1703 is similar to the processing in steps 1603 to 1614 of FIG. 16.
Next, modified examples of the matching score calculation process will be described. F(n,i,θ) in equation (21) does not have to be a quadratic function of θ, and may be a function of the absolute value |θ| of θ. When F(n,i,θ) is a function of |θ|, α(θ,n,i) is expressed, for example, by the following equation.
α(θ,n,i) = p4(n,i)|θ| + p5(n,i)   (36)
p4(n,i) and p5(n,i) are fitting coefficients. Using equation (36) allows the rotation of P(i) to be expressed more appropriately, so higher accuracy of the matching score can be expected.
Note that whether equation (22) or equation (36) is used may be switched for each P(i). For example, fitting may be performed using equation (22) when i=1 and using equation (36) when i=2. For each P(i), fitting may be performed using whichever of equation (22) and equation (36) has the smaller fitting error.
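As a sketch of how the fitting coefficients p4(n,i) and p5(n,i) of equation (36) might be obtained, the following performs a closed-form least-squares fit of α against |θ|. The function name and the sample points are assumptions introduced here, not taken from the patent.

```python
def fit_abs_theta(thetas, alphas):
    """Least-squares fit of equation (36): alpha ≈ p4*|theta| + p5.

    thetas: sample rotation angles; alphas: measured alpha values.
    Returns (p4, p5), i.e. the slope and intercept over |theta|.
    """
    xs = [abs(t) for t in thetas]         # fit in |theta|, not theta
    n = len(xs)
    mx = sum(xs) / n
    my = sum(alphas) / n
    sxx = sum((x - mx) ** 2 for x in xs)
    sxy = sum((x - mx) * (y - my) for x, y in zip(xs, alphas))
    p4 = sxy / sxx                        # slope
    p5 = my - p4 * mx                     # intercept
    return p4, p5
```

The fitting error of this model and of the quadratic model of equation (22) can then be compared per P(i) to pick the better of the two, as described above.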
Furthermore, which of the first method and the second method is used may be switched as appropriate. For example, when the angle difference θ is small, the matching score may be calculated with sufficiently high accuracy even by the second method, so using the second method, whose calculation is simple, can speed up the processing. On the other hand, when the angle difference θ is large, using the first method allows the matching score to be calculated with higher accuracy.
The matching score calculation device 901 of FIG. 9 or the matching score calculation device 1703 of FIG. 17 may transform the coordinates of the feature points included in the registered data of each registrant instead of the feature points included in the matching target data. In this case, the matching score calculation device 901 or the matching score calculation device 1703 calculates the matching score by a calculation method similar to the first method or the second method, using the transformed coordinates of each feature point included in the registered data, the feature vector of each feature point, and the matching target data.
The configuration of the matching score calculation device 701 of FIG. 7 is merely an example, and some of the components may be omitted or changed depending on the use or conditions of the matching score calculation device 701. The configurations of the entrance/exit management system of FIG. 9 and the flapper gate management system of FIG. 17 are merely examples, and some of the components may be omitted or changed depending on the use or conditions of the entrance/exit management system or the flapper gate management system.
The flowcharts of FIG. 8, FIG. 10 to FIG. 12, and FIG. 16 are merely examples, and some of the processing may be omitted or changed depending on the configuration or conditions of the matching score calculation device 701, the entrance/exit management system, or the flapper gate management system.
The feature points and feature amounts shown in FIG. 1, the feature points shown in FIG. 5, and the feature amounts shown in FIG. 6, FIG. 13, and FIG. 14 are merely examples, and the feature points and feature amounts change depending on the biometric image. The normalization processes shown in FIG. 2 and FIG. 3 are merely examples, and the normalization process changes depending on the biometric image. The matching process shown in FIG. 4 is merely an example, and the matching process changes depending on the matching target data and the registered data.
The relationship between α(θ,n,n) and the rotation angle θ shown in FIG. 15 is merely an example, and the relationship between α(θ,n,n) and the rotation angle θ changes depending on the characteristics of the basis vector P(i).
Equations (1) to (36) are merely examples, and the entrance/exit management system and the flapper gate management system may perform the biometric authentication process using other calculation formulas.
FIG. 18 shows an example of the hardware configuration of an information processing device used as the matching score calculation device 701 of FIG. 7, the matching score calculation device 901 of FIG. 9, the feature acquisition device 1701 of FIG. 17, and the matching score calculation device 1703 of FIG. 17.
The information processing device of FIG. 18 includes a CPU 1801, a memory 1802, an input device 1803, an output device 1804, an auxiliary storage device 1805, a medium drive device 1806, and a network connection device 1807. These components are hardware and are connected to one another via a bus 1808. The biometric image acquisition unit 911 of FIG. 9 and the biometric image acquisition unit 1711 of FIG. 17 may be hardware sensors connected to the bus 1808.
The memory 1802 is, for example, a semiconductor memory such as a ROM (Read Only Memory), a RAM (Random Access Memory), or a flash memory, and stores programs and data used for processing. The memory 1802 may operate as the storage unit 711 of FIG. 7, the storage unit 917 of FIG. 9, the storage unit 1714 of FIG. 17, or the storage unit 1735 of FIG. 17.
The CPU 1801 (processor) operates as the coordinate transformation unit 712 and the matching score calculation unit 713 of FIG. 7 by, for example, executing a program using the memory 1802.
The CPU 1801 also operates as the feature extraction unit 912, the coordinate transformation calculation unit 913, the coordinate transformation unit 914, and the matching score calculation unit 915 of FIG. 9 by executing a program using the memory 1802.
The CPU 1801 also operates as the feature extraction unit 1712, the coordinate transformation calculation unit 1731, the coordinate transformation unit 1732, and the matching score calculation unit 1733 of FIG. 17 by executing a program using the memory 1802.
The input device 1803 is, for example, a keyboard, a pointing device, or the like, and is used for inputting instructions or information from an operator. The output device 1804 is, for example, a display device, a printer, a speaker, or the like, and is used for outputting inquiries to the operator or processing results. The processing result may be the matching score 925, the matching score 1743, or an authentication result.
The auxiliary storage device 1805 is, for example, a magnetic disk device, an optical disk device, a magneto-optical disk device, a tape device, or the like. The auxiliary storage device 1805 may be a hard disk drive or an SSD (Solid State Drive). The information processing device can store programs and data in the auxiliary storage device 1805 and load them into the memory 1802 for use. The auxiliary storage device 1805 may operate as the storage unit 711 of FIG. 7, the storage unit 917 of FIG. 9, the storage unit 1714 of FIG. 17, or the storage unit 1735 of FIG. 17.
The medium drive device 1806 drives a portable recording medium 1809 and accesses its recorded contents. The portable recording medium 1809 is a memory device, a flexible disk, an optical disk, a magneto-optical disk, or the like. The portable recording medium 1809 may be a CD-ROM (Compact Disk Read Only Memory), a DVD (Digital Versatile Disk), a USB (Universal Serial Bus) memory, or the like. The operator can store programs and data in the portable recording medium 1809 and load them into the memory 1802 for use.
 このように、処理に用いられるプログラム及びデータを格納するコンピュータ読み取り可能な記録媒体は、メモリ1802、補助記憶装置1805、又は可搬型記録媒体1809のような、物理的な(非一時的な)記録媒体である。 In this way, the computer-readable recording medium that stores the programs and data used in the processing is a physical (non-transitory) recording medium such as memory 1802, auxiliary storage device 1805, or portable recording medium 1809.
 ネットワーク接続装置1807は、WAN(Wide Area Network)、LAN(Local Area Network)等の通信ネットワークに接続され、通信に伴うデータ変換を行う通信インタフェース回路である。情報処理装置は、プログラム及びデータを外部の装置からネットワーク接続装置1807を介して受信し、それらをメモリ1802にロードして使用することができる。ネットワーク接続装置1807は、図9の通信部916、図17の通信部1713、又は図17の通信部1734として動作してもよい。 The network connection device 1807 is a communication interface circuit that is connected to a communication network such as a WAN (Wide Area Network) or a LAN (Local Area Network) and performs data conversion associated with communication. The information processing device receives programs and data from an external device via the network connection device 1807 and can load them into the memory 1802 for use. The network connection device 1807 may operate as the communication unit 916 in FIG. 9, the communication unit 1713 in FIG. 17, or the communication unit 1734 in FIG. 17.
 なお、情報処理装置が図18のすべての構成要素を含む必要はなく、用途又は条件に応じて一部の構成要素を省略又は変更することも可能である。例えば、オペレータとのインタフェースが不要である場合は、入力装置1803及び出力装置1804を省略してもよい。情報処理装置が可搬型記録媒体1809又は通信ネットワークを利用しない場合は、媒体駆動装置1806又はネットワーク接続装置1807を省略してもよい。 Note that the information processing device does not need to include all of the components in FIG. 18, and some components may be omitted or modified depending on the application or conditions. For example, if an interface with an operator is not required, the input device 1803 and the output device 1804 may be omitted. If the information processing device does not use the portable recording medium 1809 or a communication network, the medium drive device 1806 or the network connection device 1807 may be omitted.
 開示の実施形態とその利点について詳しく説明したが、当業者は、特許請求の範囲に明確に記載した本発明の範囲から逸脱することなく、様々な変更、追加、省略をすることができるであろう。 Although the disclosed embodiments and their advantages have been described in detail, those skilled in the art may make various modifications, additions, and omissions without departing from the scope of the present invention as expressly set forth in the claims.

Claims (16)

  1.  A matching score calculation method, wherein a computer executes a process of:
     when coordinate information and feature amount information of a matching feature point are acquired, performing a coordinate transformation on either the matching feature point or a registered feature point, based on an angle difference calculated using the coordinate information of the matching feature point and coordinate information of the registered feature point stored in a storage unit; and
     calculating, using the angle difference and the feature amount information of the acquired matching feature point, a matching score between the feature point obtained by the coordinate transformation of one of the matching feature point and the registered feature point, and the other of the matching feature point and the registered feature point.
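As a rough illustration of the method of claim 1 above, the following minimal Python sketch estimates an angle difference from the coordinate information of the two point sets, coordinate-transforms (rotates) the matching feature points, and then scores the aligned points against the registered ones using their feature amounts. The orientation estimate, the nearest-neighbour pairing, and the scalar features are illustrative assumptions, not the patent's implementation; translation alignment is omitted for brevity.

```python
import math

def angle_of(points):
    """Principal orientation of a point set via its central second moments."""
    cx = sum(x for x, y in points) / len(points)
    cy = sum(y for x, y in points) / len(points)
    sxx = sum((x - cx) ** 2 for x, y in points)
    syy = sum((y - cy) ** 2 for x, y in points)
    sxy = sum((x - cx) * (y - cy) for x, y in points)
    return 0.5 * math.atan2(2.0 * sxy, sxx - syy)

def rotate(points, theta):
    """Coordinate-transform (rotate about the origin) a list of (x, y) points."""
    c, s = math.cos(theta), math.sin(theta)
    return [(c * x - s * y, s * x + c * y) for x, y in points]

def matching_score(query_pts, query_feats, reg_pts, reg_feats):
    # Angle difference derived from the coordinate information of both sets.
    dtheta = angle_of(reg_pts) - angle_of(query_pts)
    aligned = rotate(query_pts, dtheta)  # transform one side only
    # Score: feature similarity between each aligned point and its nearest
    # registered point (0 is a perfect match, more negative is worse).
    score = 0.0
    for (qx, qy), qf in zip(aligned, query_feats):
        best = min(range(len(reg_pts)),
                   key=lambda i: (reg_pts[i][0] - qx) ** 2
                               + (reg_pts[i][1] - qy) ** 2)
        score += -abs(reg_feats[best] - qf)
    return score / len(aligned)
```

Because the matching side is rotated into the registered frame before pairing, a query that is merely a rotated copy of the registered set scores as a perfect match.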
  2.  The matching score calculation method according to claim 1, wherein the process of calculating the matching score includes:
      a process of correcting feature amount information of either the matching feature point or the registered feature point using the angle difference; and
      a process of calculating the matching score using the corrected feature amount information.
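The correction in claim 2 above can be sketched for one plausible case: a rotation-sensitive component of the feature amount information (an orientation angle, as in many minutiae-style features) is compensated by the angle difference before comparison. The choice of an angular feature component is an assumption for illustration, not something the claim fixes.

```python
import math

def correct_feature_orientation(feature_angle, dtheta):
    """Compensate an orientation-valued feature component by the angle
    difference, wrapping the result back into (-pi, pi]."""
    corrected = feature_angle + dtheta
    return math.atan2(math.sin(corrected), math.cos(corrected))
```

After this correction, the two orientation components can be compared directly, without the global rotation between the images inflating their distance.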
  3.  The matching score calculation method according to claim 1, wherein the process of calculating the matching score includes:
      a process of calculating a weight using the angle difference; and
      a process of calculating the matching score using the weight.
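One natural reading of claim 3 above is that larger angle differences indicate a less reliable alignment, so the raw similarity is down-weighted accordingly. The Gaussian fall-off and its 15-degree scale below are illustrative assumptions, not values taken from the patent.

```python
import math

def angle_weight(dtheta, sigma=math.radians(15.0)):
    """Weight derived from the angle difference: confidence in the alignment
    decays as the rotation between the two point sets grows."""
    return math.exp(-(dtheta ** 2) / (2.0 * sigma ** 2))

def weighted_matching_score(raw_similarity, dtheta):
    # The final matching score is the raw feature similarity scaled by the weight.
    return raw_similarity * angle_weight(dtheta)
```

With this shape, a perfectly aligned pair keeps its full similarity, while a pair requiring a large rotation contributes almost nothing to the score.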
  4.  The matching score calculation method according to any one of claims 1 to 3, wherein:
     the matching feature point is a feature point extracted from a biometric image of a person to be matched;
     the registered feature points are feature points extracted from a biometric image of each of a plurality of enrolled persons;
     the process of coordinate-transforming either the matching feature point or the registered feature point includes a process of coordinate-transforming, for each of the plurality of enrolled persons, either the matching feature point or that enrolled person's registered feature point, based on an angle difference calculated using the coordinate information of the matching feature point and coordinate information of that enrolled person's registered feature point stored in the storage unit;
     the process of calculating the matching score includes a process of calculating, for each of the plurality of enrolled persons, a matching score between the feature point obtained by the coordinate transformation of one of the matching feature point and that enrolled person's registered feature point, and the other of the two; and
     the computer further executes a process of identifying, among the plurality of enrolled persons, an enrolled person who satisfies a predetermined condition, based on the matching scores calculated for the plurality of enrolled persons.
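The 1:N identification step of claim 4 above reduces to scoring the query against every enrolled person and keeping the best identity that meets the predetermined condition. The sketch below uses a placeholder similarity function and a hypothetical acceptance threshold as that condition; both are assumptions for illustration.

```python
def simple_similarity(query_feats, reg_feats):
    """Placeholder pairwise similarity: higher is better (0 = identical)."""
    n = min(len(query_feats), len(reg_feats))
    return -sum(abs(a - b) for a, b in zip(query_feats, reg_feats)) / n

def identify(query_feats, enrolled, threshold=-0.1):
    """Score the query against each enrolled person's registered features and
    return the best identity whose score satisfies the condition, else None."""
    best_id, best_score = None, float("-inf")
    for person_id, reg_feats in enrolled.items():
        score = simple_similarity(query_feats, reg_feats)
        if score > best_score:
            best_id, best_score = person_id, score
    return best_id if best_score >= threshold else None
```

Returning None when no enrolled person clears the threshold models the rejection of an unenrolled query.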
  5.  The matching score calculation method according to claim 4, wherein the biometric image of the person to be matched is a palm vein image of the person to be matched, and the biometric image of each of the plurality of enrolled persons is a palm vein image of that enrolled person.
  6.  A matching score calculation device comprising:
     a storage unit that stores coordinate information of a registered feature point;
     a coordinate transformation unit that, when coordinate information and feature amount information of a matching feature point are acquired, coordinate-transforms either the matching feature point or the registered feature point, based on an angle difference calculated using the coordinate information of the matching feature point and the coordinate information of the registered feature point; and
     a matching score calculation unit that calculates, using the angle difference and the feature amount information of the acquired matching feature point, a matching score between the feature point obtained by the coordinate transformation of one of the matching feature point and the registered feature point, and the other of the matching feature point and the registered feature point.
  7.  The matching score calculation device according to claim 6, wherein the matching score calculation unit corrects feature amount information of either the matching feature point or the registered feature point using the angle difference, and calculates the matching score using the corrected feature amount information.
  8.  The matching score calculation device according to claim 6, wherein the matching score calculation unit calculates a weight using the angle difference, and calculates the matching score using the weight.
  9.  The matching score calculation device according to any one of claims 6 to 8, wherein:
     the matching feature point is a feature point extracted from a biometric image of a person to be matched;
     the registered feature points are feature points extracted from a biometric image of each of a plurality of enrolled persons;
     the coordinate transformation unit coordinate-transforms, for each of the plurality of enrolled persons, either the matching feature point or that enrolled person's registered feature point, based on an angle difference calculated using the coordinate information of the matching feature point and coordinate information of that enrolled person's registered feature point stored in the storage unit; and
     the matching score calculation unit calculates, for each of the plurality of enrolled persons, a matching score between the feature point obtained by the coordinate transformation of one of the matching feature point and that enrolled person's registered feature point and the other of the two, and identifies, among the plurality of enrolled persons, an enrolled person who satisfies a predetermined condition, based on the calculated matching scores.
  10.  The matching score calculation device according to claim 9, wherein the biometric image of the person to be matched is a palm vein image of the person to be matched, and the biometric image of each of the plurality of enrolled persons is a palm vein image of that enrolled person.
  11.  A matching score calculation system comprising a feature acquisition device and a matching score calculation device, wherein:
     the feature acquisition device includes
      an acquisition unit that acquires coordinate information and feature amount information of a matching feature point, and
      a transmission unit that transmits the coordinate information and the feature amount information of the matching feature point to the matching score calculation device; and
     the matching score calculation device includes
      a reception unit that receives the coordinate information and the feature amount information of the matching feature point from the feature acquisition device,
      a storage unit that stores coordinate information of a registered feature point,
      a coordinate transformation unit that coordinate-transforms either the matching feature point or the registered feature point, based on an angle difference calculated using the coordinate information of the matching feature point and the coordinate information of the registered feature point, and
      a matching score calculation unit that calculates, using the angle difference and the acquired feature amount information of the matching feature point, a matching score between the feature point obtained by the coordinate transformation of one of the matching feature point and the registered feature point, and the other of the matching feature point and the registered feature point.
  12.  A matching score calculation program for causing a computer to execute a process of:
     when coordinate information and feature amount information of a matching feature point are acquired, performing a coordinate transformation on either the matching feature point or a registered feature point, based on an angle difference calculated using the coordinate information of the matching feature point and coordinate information of the registered feature point stored in a storage unit; and
     calculating, using the angle difference and the feature amount information of the acquired matching feature point, a matching score between the feature point obtained by the coordinate transformation of one of the matching feature point and the registered feature point, and the other of the matching feature point and the registered feature point.
  13.  The matching score calculation program according to claim 12, wherein the process of calculating the matching score includes:
      a process of correcting feature amount information of either the matching feature point or the registered feature point using the angle difference; and
      a process of calculating the matching score using the corrected feature amount information.
  14.  The matching score calculation program according to claim 12, wherein the process of calculating the matching score includes:
      a process of calculating a weight using the angle difference; and
      a process of calculating the matching score using the weight.
  15.  The matching score calculation program according to any one of claims 12 to 14, wherein:
     the matching feature point is a feature point extracted from a biometric image of a person to be matched;
     the registered feature points are feature points extracted from a biometric image of each of a plurality of enrolled persons;
     the process of coordinate-transforming either the matching feature point or the registered feature point includes a process of coordinate-transforming, for each of the plurality of enrolled persons, either the matching feature point or that enrolled person's registered feature point, based on an angle difference calculated using the coordinate information of the matching feature point and coordinate information of that enrolled person's registered feature point stored in the storage unit;
     the process of calculating the matching score includes a process of calculating, for each of the plurality of enrolled persons, a matching score between the feature point obtained by the coordinate transformation of one of the matching feature point and that enrolled person's registered feature point, and the other of the two; and
     the matching score calculation program further causes the computer to execute a process of identifying, among the plurality of enrolled persons, an enrolled person who satisfies a predetermined condition, based on the matching scores calculated for the plurality of enrolled persons.
  16.  The matching score calculation program according to claim 15, wherein the biometric image of the person to be matched is a palm vein image of the person to be matched, and the biometric image of each of the plurality of enrolled persons is a palm vein image of that enrolled person.
PCT/JP2022/041196 2022-11-04 2022-11-04 Comparison score calculation method, comparison score calculation device, comparison score calculation system, and comparison score calculation program WO2024095462A1 (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
PCT/JP2022/041196 WO2024095462A1 (en) 2022-11-04 2022-11-04 Comparison score calculation method, comparison score calculation device, comparison score calculation system, and comparison score calculation program


Publications (1)

Publication Number Publication Date
WO2024095462A1 (en)

Family

ID=90929990



Citations (3)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
JP2002083298A (en) * 2000-09-06 2002-03-22 Hitachi Ltd Personal identification device and method
JP2016081116A (en) * 2014-10-10 2016-05-16 富士通株式会社 Biological information correction apparatus, biological information correction method and biological information correction computer program
JP2018185730A (en) * 2017-04-27 2018-11-22 富士通株式会社 Verification device, verification method, and verification program

