WO2012029150A1 - Biometric authentication system, biometric authentication method and program - Google Patents


Info

Publication number
WO2012029150A1
Authority
WO
WIPO (PCT)
Prior art keywords
evaluation
unit
biometric
image
evaluation value
Application number
PCT/JP2010/064970
Other languages
French (fr)
Japanese (ja)
Inventor
Mitsuaki Fukuda (福田充昭)
Original Assignee
Fujitsu Limited (富士通株式会社)
Application filed by Fujitsu Limited (富士通株式会社)
Priority to PCT/JP2010/064970
Publication of WO2012029150A1

Classifications

    • A - HUMAN NECESSITIES
    • A61 - MEDICAL OR VETERINARY SCIENCE; HYGIENE
    • A61B - DIAGNOSIS; SURGERY; IDENTIFICATION
    • A61B 5/00 - Measuring for diagnostic purposes; Identification of persons
    • A61B 5/117 - Identification of persons
    • A61B 5/1171 - Identification of persons based on the shapes or appearances of their bodies or parts thereof
    • G - PHYSICS
    • G06 - COMPUTING; CALCULATING OR COUNTING
    • G06F - ELECTRIC DIGITAL DATA PROCESSING
    • G06F 21/00 - Security arrangements for protecting computers, components thereof, programs or data against unauthorised activity
    • G06F 21/30 - Authentication, i.e. establishing the identity or authorisation of security principals
    • G06F 21/31 - User authentication
    • G06F 21/32 - User authentication using biometric data, e.g. fingerprints, iris scans or voiceprints
    • G - PHYSICS
    • G06 - COMPUTING; CALCULATING OR COUNTING
    • G06V - IMAGE OR VIDEO RECOGNITION OR UNDERSTANDING
    • G06V 40/00 - Recognition of biometric, human-related or animal-related patterns in image or video data
    • G06V 40/20 - Movements or behaviour, e.g. gesture recognition
    • G06V 40/28 - Recognition of hand or arm movements, e.g. recognition of deaf sign language
    • A - HUMAN NECESSITIES
    • A61 - MEDICAL OR VETERINARY SCIENCE; HYGIENE
    • A61B - DIAGNOSIS; SURGERY; IDENTIFICATION
    • A61B 2576/00 - Medical imaging apparatus involving image processing or analysis
    • G - PHYSICS
    • G16 - INFORMATION AND COMMUNICATION TECHNOLOGY [ICT] SPECIALLY ADAPTED FOR SPECIFIC APPLICATION FIELDS
    • G16H - HEALTHCARE INFORMATICS, i.e. INFORMATION AND COMMUNICATION TECHNOLOGY [ICT] SPECIALLY ADAPTED FOR THE HANDLING OR PROCESSING OF MEDICAL OR HEALTHCARE DATA
    • G16H 30/00 - ICT specially adapted for the handling or processing of medical images
    • G16H 30/40 - ICT specially adapted for the handling or processing of medical images for processing medical images, e.g. editing

Definitions

  • The embodiments discussed in this specification relate to a biometric authentication technique that performs personal authentication using characteristics of the human body, such as fingerprints, palm veins, and faces.
  • Biometric authentication technology, which performs personal authentication using the characteristics of parts of the human body such as fingerprints, veins, and faces, has become widespread as a method for accurately verifying identity.
  • Biometric authentication is used, for example, to decide whether to admit a person to a restricted area that only authorized people may enter, whether to permit login to a personal computer, and whether to grant access to various services such as online transactions.
  • In order to perform such biometric authentication, an image of the user's biometric information, for example a fingerprint, vein pattern, or face, is first acquired in advance and recorded on a recording medium as the user's registered biometric information. Thereafter, when identity verification becomes necessary in order to use a service as described above, the user's biometric information is acquired again in the same manner as at the time of registration. The biometric information acquired at this time (the biometric information for authentication) is then compared with the registered biometric information recorded on the recording medium, and the degree of similarity between the two is determined. If the similarity is higher than a predetermined threshold, an authentication result indicating that the user is the registered person is obtained.
  • It is preferable that an image of the biometric part used as the biometric information for authentication (a biometric image) can be captured at the same position and in the same posture at registration and at verification. In practice, however, such capturing cannot always be achieved, and this is one of the factors that reduce authentication accuracy.
  • Another approach is to capture and record, at the time of registration, multiple biometric images of the same user in various positions, postures, and shapes, and to use those multiple registered images as the matching targets at the time of verification.
  • Several methods have been proposed for matching a single captured image against a plurality of registered images in this way. These methods can be classified into several types according to which biometric images of the same user are used as the registered biometric information.
  • the first of these is to record a plurality of captured biometric images as registered biometric information about the user without any special consideration.
  • The second of these methods evaluates the similarity between the captured biometric images and records, as the user's registered biometric information, a combination of images that are not similar to one another (that is, whose mutual similarity is low).
  • The third method presents guidance messages to the user at the time of registration so as to guide the biometric part to appropriate positions and postures, captures a plurality of images of the user in this way, and records them as registered biometric information.
  • Because the first method described above relies entirely on chance, the registered biometric information is not always appropriate for biometric authentication.
  • With the second method described above, it is not always possible to comprehensively acquire registered biometric information in the positions and orientations suited to matching against the user's biometric image captured at verification time. In other words, this second method also relies on chance, and appropriate registered biometric information is not always obtained.
  • The third method described above is effective in that there is a high possibility of obtaining appropriate registered biometric information.
  • However, it separately requires a means for performing the guidance, for example an assistant (operator), a visual means such as a display device, or an output device that emits guidance sound.
  • The biometric authentication system described later in this specification verifies the identity of the person to be authenticated based on the similarity between that person's registered biometric information, registered in advance at registration time, and the biometric information for authentication acquired from that person at authentication time.
  • This biometric authentication system includes a biometric image acquisition unit, an evaluation unit, an evaluation value extraction unit, a registered biometric information storage unit, and an authentication unit.
  • the biometric image acquisition unit acquires a plurality of biometric images of the person to be authenticated.
  • The evaluation unit evaluates each of the plurality of biometric images acquired by the biometric image acquisition unit against each of a plurality of predetermined evaluation conditions, and outputs, for each evaluation condition, an evaluation value indicating the degree to which the biometric image matches that condition as the evaluation result for the image.
  • The evaluation value extraction unit extracts, for each evaluation condition, the evaluation values that are equal to or higher than a predetermined evaluation threshold, that is, those indicating a high degree of match, from the evaluation values output by the evaluation unit for the plurality of biometric images.
  • When a plurality of evaluation values are extracted for the same evaluation condition, the evaluation value extraction unit further extracts, from the plurality of extracted values, the evaluation value indicating the highest degree of match with that condition.
  • the registered biometric information storage unit stores a plurality of biometric images corresponding to the evaluation values extracted by the evaluation value extraction unit as registered biometric information of the person to be authenticated.
  • the authentication unit determines whether the similarity between the registered biometric information stored in the registered biometric information storage unit and the authenticated biometric information is high or low, and verifies the identity of the person to be authenticated based on the determination result.
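  • As an illustration of how the evaluation unit and the evaluation value extraction unit described above could interact, the following Python sketch keeps, for each evaluation condition, only the biometric image whose evaluation value both exceeds that condition's threshold and is the highest seen so far. The condition functions, thresholds, and data structures are assumptions made for illustration only, not details taken from this publication.

      from typing import Callable, Dict, List, Tuple

      # An evaluation condition is modeled here as (name, evaluation function,
      # evaluation threshold); all three are hypothetical placeholders.
      EvalCondition = Tuple[str, Callable[[object], float], float]

      def extract_registration_candidates(
          images: List[object],
          conditions: List[EvalCondition],
      ) -> Dict[str, Tuple[object, float]]:
          """Keep, per condition, the image with the highest evaluation value
          above the threshold (cf. the evaluation and extraction units)."""
          best: Dict[str, Tuple[object, float]] = {}
          for image in images:                      # plurality of biometric images
              for name, evaluate, threshold in conditions:
                  value = evaluate(image)           # evaluation value for this condition
                  if value <= threshold:            # drop images unusable for matching
                      continue
                  if name not in best or value > best[name][1]:
                      best[name] = (image, value)   # highest degree of match per condition
          return best                               # candidates for registered biometric info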
  • The biometric authentication method described later in this specification likewise verifies the identity of the person to be authenticated based on the degree of similarity between the registered biometric information registered in advance at registration time and the biometric information for authentication acquired at authentication time.
  • In this method, the evaluation unit evaluates each of the plurality of biometric images of the person to be authenticated against each of a plurality of predetermined evaluation conditions, and outputs, for each evaluation condition, an evaluation value indicating the degree to which the image matches that condition as the evaluation result for the image.
  • The evaluation value extraction unit extracts, for each evaluation condition, the evaluation values that are equal to or higher than a predetermined evaluation threshold from the evaluation values output by the evaluation unit for the plurality of biometric images.
  • When a plurality of evaluation values are extracted for the same evaluation condition, the evaluation value extraction unit further extracts, from the plurality of extracted values, the evaluation value indicating the highest degree of match with that condition.
  • the registered biometric information storage unit stores a plurality of biometric images corresponding to the evaluation values extracted by the evaluation value extraction unit as registered biometric information of the person to be authenticated.
  • the authentication unit determines whether the similarity between the registered biometric information stored in the registered biometric information storage unit and the authenticated biometric information is high or low, and verifies the identity of the person to be authenticated based on the determination result.
  • The program described later in this specification causes a computer to verify the identity of the person to be authenticated based on the similarity between the registered biometric information registered in advance at registration time and the biometric information for authentication acquired at authentication time.
  • This program causes a computer to perform an evaluation process, an evaluation value extraction process, a registered biometric information storage process, and an authentication process.
  • The evaluation process evaluates each of the plurality of biometric images of the person to be authenticated against each of a plurality of predetermined evaluation conditions and outputs, for each evaluation condition, an evaluation value indicating the degree to which the image matches that condition as the evaluation result for the image.
  • The evaluation value extraction process extracts, for each evaluation condition, the evaluation values that are equal to or higher than a predetermined evaluation threshold from the evaluation values output by the evaluation process for the plurality of biometric images. When a plurality of evaluation values are extracted for the same evaluation condition, a further process is performed to extract, from the plurality of values extracted by the evaluation value extraction process, the evaluation value indicating the highest degree of match with that condition.
  • the registered biometric information storage process is a process of storing a plurality of biometric images corresponding to the evaluation values extracted by the evaluation value extraction process in the storage unit as registered biometric information of the person to be authenticated. In the authentication process, the degree of similarity between the registered biometric information stored in the storage unit and the authentication biometric information is determined, and the identity of the person to be authenticated is verified based on the determination result.
  • the biometric authentication device described later in this specification can acquire registered biometric information having a position, posture, and shape suitable for collation without imposing a great burden on the user.
  • The drawings further include: a flowchart illustrating an example of the processing content of the biometric information registration process performed by the biometric information registration unit of FIG. 9; first and second modifications of that flowchart; a diagram illustrating an example of the specific configuration of the authentication unit in the biometric authentication system of FIG. 1; a flowchart illustrating an example of the processing content of the matching process performed by that authentication unit; further modifications of these flowcharts; and a modification of the specific configuration of the biometric information registration unit of FIG. 9.
  • FIG. 1 is a functional configuration diagram of an embodiment of a biometric authentication system.
  • The biometric authentication system 1 verifies the identity of the person to be authenticated based on the similarity between that person's registered biometric information, registered in advance at registration time, and the biometric information for authentication acquired at authentication time.
  • The biometric authentication system 1 includes a biometric information registration unit 10 that registers the biometric information of the person to be authenticated, and an authentication unit 20 that verifies the identity of the person to be authenticated using the biometric information registered by the biometric information registration unit 10.
  • the biometric information registration unit 10 includes a biometric image acquisition unit 11, an evaluation unit 12, an evaluation value extraction unit 13, and a registered biometric information storage unit 14.
  • the biometric image acquisition unit 11 acquires a plurality of biometric images of the person to be authenticated.
  • The evaluation unit 12 evaluates each of the plurality of biometric images acquired by the biometric image acquisition unit 11 against each of a plurality of predetermined evaluation conditions, and outputs, for each evaluation condition, an evaluation value indicating the degree to which the image matches that condition as the evaluation result for the image.
  • For each of the evaluation conditions described above, the evaluation value extraction unit 13 extracts, from the evaluation values output by the evaluation unit 12 for the plurality of biometric images, the evaluation values that are equal to or higher than a predetermined evaluation threshold. This extraction is performed in order to exclude from the registered biometric information candidates any biometric image that cannot be used for the later-described authentication of the person to be authenticated by the authentication unit 20. In addition, when a plurality of evaluation values are extracted for the same evaluation condition, the evaluation value extraction unit 13 further extracts, from the plurality of extracted values, the evaluation value indicating the highest degree of match with that condition.
  • the registered biometric information storage unit 14 stores a plurality of biometric images corresponding to the evaluation values extracted by the evaluation value extraction unit 13 as registered biometric information of the person to be authenticated.
  • The authentication unit 20 determines the degree of similarity between the registered biometric information stored in the registered biometric information storage unit 14 and the above-described biometric information for authentication, and verifies the identity of the person to be authenticated based on the determination result.
  • the biometric information registration unit 10 of the biometric authentication system 1 may further include an evaluation result check unit 15 and a guidance output unit 16 as illustrated in FIG.
  • The evaluation result check unit 15 determines whether there is any evaluation condition for which no evaluation value was extracted in the per-condition extraction of evaluation values by the evaluation value extraction unit 13. If it determines that such an evaluation condition exists, the evaluation result check unit 15 performs control so that the biometric image acquisition unit 11 acquires a biometric image again, the evaluation unit 12 evaluates it, and the evaluation value extraction unit 13 extracts evaluation values again.
  • The guidance output unit 16 outputs guidance to the person to be authenticated for guiding the biometric part, so that the biometric image acquisition unit 11 can obtain a biometric image that satisfies the evaluation condition.
  • The biometric images 101 are a plurality of images of the person to be authenticated acquired by the biometric image acquisition unit 11.
  • each of the plurality of biometric images 101 includes images of hands having various positions, postures, and shapes that are biometric parts of the person to be authenticated.
  • The evaluation unit 12 evaluates each of the biometric images 101 against each of a plurality of predetermined evaluation conditions 102 and outputs, for each evaluation condition 102, an evaluation value indicating the degree to which the image matches that condition as the evaluation result for the image.
  • the evaluation condition 102 is preferably such that various image patterns that are typical in the usage scene of the biometric authentication system 1 are arranged as registered biometric information.
  • In this example, seven evaluation conditions 102 are defined: a first evaluation condition 102a, a second evaluation condition 102b, a third evaluation condition 102c, a fourth evaluation condition 102d, a fifth evaluation condition 102e, a sixth evaluation condition 102f, and a seventh evaluation condition 102g.
  • The first evaluation condition 102a is that the size of the hand image shown in the image is a predetermined first size, and the second evaluation condition 102b is that the size of the hand image shown in the image is a predetermined second size.
  • The third evaluation condition 102c is that the inclination of the hand image shown in the image is a predetermined first inclination, and the fourth evaluation condition 102d is that the inclination of the hand image shown in the image is a predetermined second inclination.
  • The fifth evaluation condition 102e is that the hand shown in the image has a shape with all fingers closed, and the sixth evaluation condition 102f is that the hand shown in the image has a shape with all fingers opened to the maximum.
  • The seventh evaluation condition 102g is that each finger of the hand shown in the image is bent to a predetermined angle.
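  • Purely for illustration, the seven evaluation conditions could be encoded as a table that pairs each condition with the image property it constrains; the labels and numeric targets below are placeholders, not values disclosed here.

      # Hypothetical encoding of the seven evaluation conditions 102a-102g.
      # Every numeric target is a placeholder chosen only for illustration.
      EVALUATION_CONDITIONS = [
          ("102a", "hand size",      {"target_area_px": 40_000}),          # first size
          ("102b", "hand size",      {"target_area_px": 60_000}),          # second size
          ("102c", "hand tilt",      {"target_angle_deg": 0.0}),           # first inclination
          ("102d", "hand tilt",      {"target_angle_deg": 15.0}),          # second inclination
          ("102e", "finger opening", {"posture": "all fingers closed"}),
          ("102f", "finger opening", {"posture": "all fingers fully open"}),
          ("102g", "finger bending", {"target_bend_deg": 30.0}),
      ]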
  • For each evaluation condition 102, the evaluation value extraction unit 13 extracts, from the evaluation values output by the evaluation unit 12 for the plurality of biometric images 101, the evaluation values that are equal to or higher than a predetermined evaluation threshold. Further, when a plurality of evaluation values are extracted for the same evaluation condition 102, the evaluation value extraction unit 13 extracts, from the plurality of extracted values, the evaluation value indicating the highest degree of match with that condition.
  • the extracted biological image 103 in FIG. 2 is a biological image 101 for each evaluation condition 102 corresponding to the evaluation value extracted by the evaluation value extraction unit 13 in this way.
  • Next, the evaluation result check 104 in FIG. 2 is performed.
  • In this evaluation result check 104, it is determined whether there is any evaluation condition 102 for which no evaluation value was extracted in the per-condition extraction of evaluation values by the evaluation value extraction unit 13.
  • If such an evaluation condition exists, the biometric image 101 is acquired again by the biometric image acquisition unit 11, and evaluation by the evaluation unit 12 and evaluation value extraction by the evaluation value extraction unit 13 are performed again.
  • Through the evaluation result check 104, evaluation values are eventually extracted for all the evaluation conditions 102, and extracted biometric images 103 are obtained for all the evaluation conditions 102.
  • When the biometric authentication system 1 further includes the guidance output unit 16, the guidance output unit 16 outputs guidance to the person to be authenticated in the case described above, so that the biometric part is guided and the biometric image acquisition unit 11 can obtain a biometric image 101 that satisfies the evaluation condition 102.
  • the extracted biometric image 103 obtained for each evaluation condition 102 as described above is stored in the registered biometric information storage unit 14 as the registered biometric information 105 of the person to be authenticated.
  • The authentication unit 20 determines whether the degree of similarity between the registered biometric information of the person to be authenticated stored in the registered biometric information storage unit 14 and the biometric information for authentication acquired at authentication time is high or low, and verifies the identity of the person to be authenticated based on the determination result.
  • FIG. 3 is a first example of the correspondence between the biological image 101 and the evaluation value.
  • This first example represents the correspondence between the size of the hand image shown in the biological image 101 and the evaluation value.
  • This correspondence can be used when the evaluation condition 102 is the first evaluation condition 102a or the second evaluation condition 102b described above, that is, when the evaluation condition 102 relates information on the size of the image of the person's hand in the biometric image 101 to the evaluation value.
  • the horizontal axis represents information on the size of the hand image shown in the biological image 101, more specifically, the area of the hand image, and the vertical axis represents the evaluation value for the area.
  • the evaluation unit 12 includes a table corresponding to the graph of FIG. 3 in which the correspondence between the area value of the hand image and the evaluation value is represented.
  • The area of the hand image shown in the biometric image 101 can be estimated, for example, by counting the number of pixels, among the pixels constituting the biometric image 101, whose luminance values are estimated to represent the hand.
  • the evaluation unit 12 refers to the above-described table, acquires an evaluation value associated with the area value estimated as described above from the table, and outputs the acquired evaluation value.
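  • A minimal sketch of this size evaluation, assuming a grayscale image in which hand pixels are brighter than a luminance threshold and using a small sampled table with linear interpolation in place of the graph of FIG. 3 (the threshold and table values are assumptions):

      import numpy as np

      def estimate_hand_area(image: np.ndarray, luminance_threshold: int = 60) -> int:
          """Count the pixels whose luminance suggests they belong to the hand."""
          return int(np.count_nonzero(image >= luminance_threshold))

      # Hypothetical stand-in for the table corresponding to the graph of FIG. 3:
      # sampled (area, evaluation value) pairs, interpolated linearly between samples.
      AREA_TABLE = np.array([[10_000, 0.0], [30_000, 0.4], [50_000, 1.0], [80_000, 0.3]])

      def evaluate_size(image: np.ndarray) -> float:
          area = estimate_hand_area(image)
          return float(np.interp(area, AREA_TABLE[:, 0], AREA_TABLE[:, 1]))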
  • FIG. 4 is a second example of the correspondence between the biological image 101 and the evaluation value.
  • This second example represents the correspondence between the inclination of the hand image shown in the biological image 101 and the evaluation value.
  • This correspondence can be used when the evaluation condition 102 relates information on the inclination of the image of the person's hand in the biometric image 101 to the evaluation value.
  • the horizontal axis represents information on the inclination of the hand image shown in the biological image 101, more specifically, the inclination angle of the hand image, and the vertical axis represents the evaluation value for the angle.
  • the evaluation unit 12 is provided with a table corresponding to the graph of FIG. 4 in which the correspondence between the inclination angle of the hand image and the evaluation value is represented.
  • the inclination angle of the hand image shown in the biological image 101 can be estimated, for example, as shown in FIG.
  • In this method, feature points in the hand image shown in the biometric image 101 are first extracted (step 1), and a plane approximating the palm is then formed based on the positional relationship of the feature points (step 2). Then, for this plane, the rotation angles around the X axis (the horizontal axis passing through the center position of the biometric image 101) and the Y axis (the vertical axis passing through the center position of the biometric image 101) with respect to the image plane of the biometric image 101 are calculated (step 3).
  • The evaluation unit 12 refers to the above-described table, acquires from it the two evaluation values associated with the two rotation angles estimated in this way, and outputs, for example, the average of those evaluation values.
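  • The tilt evaluation could be sketched as follows, assuming the palm plane has already been fitted from the feature points and is given by its normal vector; the angle-to-value table is a placeholder for the table corresponding to the graph of FIG. 4.

      import numpy as np

      # Placeholder table from rotation angle (degrees) to evaluation value.
      ANGLE_TABLE = np.array([[0.0, 1.0], [10.0, 0.8], [20.0, 0.4], [40.0, 0.0]])

      def tilt_evaluation(palm_normal: np.ndarray) -> float:
          """Average the evaluation values for the rotations about the X and Y axes
          of the image plane (steps 2-3 of FIG. 5), given the fitted palm-plane
          normal.  Fitting the plane from the feature points is omitted here."""
          nx, ny, nz = palm_normal / np.linalg.norm(palm_normal)
          rot_about_x = np.degrees(np.arctan2(abs(ny), abs(nz)))  # tilt around the X axis
          rot_about_y = np.degrees(np.arctan2(abs(nx), abs(nz)))  # tilt around the Y axis
          e_x = np.interp(rot_about_x, ANGLE_TABLE[:, 0], ANGLE_TABLE[:, 1])
          e_y = np.interp(rot_about_y, ANGLE_TABLE[:, 0], ANGLE_TABLE[:, 1])
          return float((e_x + e_y) / 2.0)           # average of the two evaluation values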
  • FIG. 6 is a third example of the correspondence relationship between the biological image 101 and the evaluation value.
  • This third example represents the correspondence between the shape of the hand image shown in the biological image 101 and the evaluation value.
  • This correspondence can be used when the evaluation condition 102 relates information on the shape of the image of the person's hand in the biometric image 101 to the evaluation value.
  • In FIG. 6, the horizontal axis represents information on the shape of the hand image shown in the biometric image 101, more specifically an index value indicating the degree of finger opening in the hand image, and the vertical axis represents the evaluation value for that index value.
  • the evaluation unit 12 includes a table corresponding to the graph of FIG. 6 in which a correspondence relationship between the index value and the evaluation value in the hand image is represented.
  • the index value representing the degree of finger opening in the hand image shown in the biological image 101 can be estimated as shown in FIG. 7, for example.
  • In this method, the direction of each finger is first extracted from the biometric image 101 as the direction from the base of that finger in the hand image, and the angle differences between the directions of adjacent fingers are calculated as the opening angles θ1, θ2, θ3, and θ4 (step 1). Next, index values for evaluating the degree of finger opening are calculated from these opening angles (step 2). The evaluation unit 12 evaluates the degree of finger opening based on the finger opening angles, for example, as follows.
  • As an index value representing the magnitude of the finger opening angles, the evaluation unit 12 calculates the sum of squares θ1² + θ2² + θ3² + θ4² as a first index value I1.
  • As an index value representing the degree of opening of the four fingers other than the thumb, the evaluation unit 12 calculates (θ2 - θ3)² + (θ3 - θ4)² + (θ4 - θ2)² as a second index value I2. Note that the evaluation unit 12 is provided in advance with two tables representing the correspondence between the two index values I1 and I2 and the evaluation values for the hand image.
  • The evaluation unit 12 refers to these tables, obtains the two evaluation values E1 and E2 associated with the index values I1 and I2, performs the subtraction E1 - E2, and outputs the result as the evaluation value for the degree of finger opening.
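  • In code, the two index values and the resulting evaluation value could be computed as below; the two lookup tables stand in for the tables the evaluation unit 12 is described as holding, and their values are assumptions.

      import numpy as np

      # Placeholder tables mapping the index values I1 and I2 to the evaluation
      # values E1 and E2 (the real tables are held by the evaluation unit 12).
      I1_TABLE = np.array([[0.0, 0.0], [2000.0, 0.5], [8000.0, 1.0]])
      I2_TABLE = np.array([[0.0, 0.0], [200.0, 0.3], [800.0, 0.8]])

      def finger_opening_evaluation(theta):
          """theta = [θ1, θ2, θ3, θ4]: opening angles (degrees) between adjacent fingers."""
          t1, t2, t3, t4 = theta
          i1 = t1**2 + t2**2 + t3**2 + t4**2               # overall magnitude of the opening
          i2 = (t2 - t3)**2 + (t3 - t4)**2 + (t4 - t2)**2  # opening of the four fingers
                                                           # other than the thumb
          e1 = np.interp(i1, I1_TABLE[:, 0], I1_TABLE[:, 1])
          e2 = np.interp(i2, I2_TABLE[:, 0], I2_TABLE[:, 1])
          return float(e1 - e2)                            # E1 - E2, as described above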
  • When the evaluation unit 12 performs the evaluation based on the seventh evaluation condition 102g, it outputs, for example, an evaluation value corresponding to how close the length from the base of each finger in the hand image shown in the biometric image 101 is to a value set in advance in the evaluation condition 102.
  • The evaluation unit 12 may also evaluate the biometric image 101 from other viewpoints. For example, it may evaluate whether the image of the hand, which is the biometric part of the person to be authenticated, deviates from the center position of the biometric image 101, based on the positional relationship between that center position and the center of gravity of the group of pixels whose luminance values are estimated to represent the hand.
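  • A sketch of such an additional check, measuring how far the centroid of the hand pixels deviates from the image center; the luminance threshold and the linear normalization are assumptions.

      import numpy as np

      def center_offset_evaluation(image: np.ndarray, luminance_threshold: int = 60) -> float:
          """Return 1.0 when the hand centroid coincides with the image center and
          fall off linearly toward 0.0 as the centroid approaches the image border."""
          ys, xs = np.nonzero(image >= luminance_threshold)   # pixels assumed to show the hand
          if xs.size == 0:
              return 0.0                                      # no hand detected at all
          cy, cx = (image.shape[0] - 1) / 2.0, (image.shape[1] - 1) / 2.0
          offset = np.hypot(xs.mean() - cx, ys.mean() - cy)
          max_offset = np.hypot(cx, cy)
          return float(max(0.0, 1.0 - offset / max_offset))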
  • The registered biometric information 105 arranged in the lower part of FIG. 8 is extracted from the biometric images 101 by the operation described above and stored in the registered biometric information storage unit 14.
  • In contrast, the registered biometric information 100 according to the conventional method, arranged in the upper part of FIG. 8, is selected from the biometric images 101 by the second method described above, that is, by evaluating the similarity between the biometric images 101 and choosing a combination of images that are not similar to each other.
  • The biometric information for authentication 106 is acquired at the time of authentication. Here, it is assumed that the same biometric information for authentication 106 is obtained in both the conventional technique and the biometric authentication system 1 of FIG. 1.
  • The probability values attached to the biometric images 101 in FIG. 8 are examples of probability values obtained by such an experiment. There are images whose position, posture, and shape are obtained with relatively high probability, and there are also images whose position, posture, and shape can be obtained only with relatively low probability.
  • With the above-described second method, that is, the method of selecting from the biometric images 101 a combination of images that are not similar to each other, it is possible that an image showing the biometric part in a position, posture, and shape that can be obtained only with low probability is selected as the registered biometric information 100.
  • With the registered biometric information 100 according to such a conventional method, the position, posture, and shape of the image of the biometric part are naturally likely to differ from those of the biometric information for authentication 106, so a deterioration of authentication accuracy is a concern.
  • In the upper example of FIG. 8, with the conventional method, the registered biometric information 100 and the biometric information for authentication 106 are similar in the position, posture, and shape of the image of the biometric part of the person to be authenticated in 15% of cases and not similar in 80% of cases.
  • On the other hand, the evaluation conditions 102 can be set so that images of the biometric part in positions, postures, and shapes obtained with relatively high probability in the above-described experiment are extracted. If the biometric images 101 extracted in this way are used as the registered biometric information 105, there is a high possibility that the position, posture, and shape of the image of the biometric part will be similar to those of the biometric information for authentication 106, and authentication against the biometric information for authentication 106 can therefore be expected to be performed with high accuracy. In the lower example of FIG. 8, the registered biometric information 105 and the biometric information for authentication 106 are similar in the position, posture, and shape of the image of the biometric part of the person to be authenticated in 70% of cases and not similar in 25% of cases.
  • In the biometric authentication system 1 of FIG. 1, by setting evaluation conditions 102 that yield higher evaluation values the closer the position, posture, and shape of the biometric part are to those the person to be authenticated is likely to present at verification time, the possibility of performing biometric authentication with high accuracy can be increased. Therefore, by using the biometric authentication system 1 of FIG. 1, authentication can be performed with high accuracy in accordance with the actual usage scene.
  • Alternatively, the evaluation conditions 102 may be set so that images of the biometric part are extracted in positions, postures, and shapes that the algorithm used in the authentication unit 20 for matching the registered biometric information against the biometric information for authentication handles well.
  • Conversely, evaluation conditions 102 may also be set so that images of the biometric part are extracted in positions, postures, and shapes that this matching algorithm does not handle well.
  • FIG. 9 illustrates an example of a specific configuration of the biometric information registration unit 10 in the biometric authentication system 1 of FIG.
  • The biometric information registration unit 10 includes an imaging unit 31, an imaging control unit 32, a biometric image temporary storage unit 33, an image evaluation unit 34, a registration candidate storage unit 35, an evaluation result check unit 36, a feature extraction unit 37, and a registered biometric information storage unit 38.
  • the imaging unit 31 is a camera that captures a biological part of the person to be authenticated and acquires a biological image 101.
  • the imaging control unit 32 controls the imaging unit 31 to acquire a plurality of biometric images 101 of the person to be authenticated.
  • The imaging unit 31 and the imaging control unit 32 provide the function of the biometric image acquisition unit 11 in FIG. 1.
  • the biological image temporary storage unit 33 is a storage unit that temporarily stores the biological image 101 acquired by the imaging unit 31.
  • The image evaluation unit 34 reads the biometric images 101 stored in the biometric image temporary storage unit 33, evaluates each of them against each of the plurality of predetermined evaluation conditions 102, and obtains an evaluation value for each evaluation condition 102 for each biometric image 101.
  • For each evaluation condition 102, the image evaluation unit 34 extracts, from the acquired evaluation values, the evaluation values 107 that are equal to or higher than a predetermined evaluation threshold, that is, those indicating a high degree of match.
  • When a plurality of evaluation values are extracted for the same evaluation condition 102, the image evaluation unit 34 further extracts, from the plurality of extracted values, the evaluation value 107 indicating the highest degree of match with that condition.
  • The image evaluation unit 34 reads from the biometric image temporary storage unit 33 the biometric image 101 for which each evaluation value 107 extracted in this way was obtained, associates it with that evaluation value 107, and temporarily stores the pair in the registration candidate storage unit 35 for each evaluation condition 102.
  • The image evaluation unit 34 provides the functions of the evaluation unit 12 and the evaluation value extraction unit 13 in FIG. 1.
  • The registration candidate storage unit 35 is a storage unit that temporarily stores, for each evaluation condition 102, the evaluation value 107 extracted by the image evaluation unit 34 and the biometric image 101 from which that evaluation value 107 was obtained, in association with each other.
  • the evaluation result check unit 36 determines whether there is an evaluation condition 102 in which the evaluation value 107 is not stored in the registration candidate storage unit 35.
  • If such an evaluation condition 102 exists, the evaluation result check unit 36 instructs the imaging control unit 32 to control the imaging unit 31 so that a further biometric image 101 is acquired.
  • the biometric image 101 acquired again in this way is stored in the biometric image temporary storage unit 33, and then the above-described evaluation is performed by the image evaluation unit 34 to obtain the evaluation value 107 for each evaluation condition 102.
  • the evaluation result check unit 36 repeats the above-described operation until there is no evaluation condition 102 in which the evaluation value 107 is not stored in the registration candidate storage unit 35.
  • The evaluation result check unit 36 provides the function of the evaluation result check unit 15 in FIG. 1.
  • After the above-described operation by the evaluation result check unit 36 is completed, the feature extraction unit 37 reads from the registration candidate storage unit 35 all the combinations of the biometric image 101 and the evaluation value 107 for each evaluation condition 102, and stores these combinations in the registered biometric information storage unit 38 as the biometric features 108.
  • The registered biometric information storage unit 38 is a storage unit that stores, as registered biometric information, the biometric features 108, each of which is a combination of an evaluation value 107 extracted by the image evaluation unit 34 and the biometric image 101 from which that evaluation value 107 was obtained.
  • The registered biometric information storage unit 38 corresponds to the registered biometric information storage unit 14 in FIG. 1.
  • the biometric information registration unit 10 in FIG. 9 is configured as described above. Note that some components of the biometric information registration unit 10 illustrated in FIG. 9 can be configured using a computer having a standard configuration.
  • FIG. 10 illustrates an example of the configuration of a computer.
  • the computer 40 includes an MPU 41, ROM 42, RAM 43, hard disk device 44, input device 45, display device 46, interface device 47, and recording medium drive device 48. These components are connected via a bus line 49, and various data can be exchanged under the management of the MPU 41.
  • An MPU (Micro Processing Unit) 41 is an arithmetic processing unit that controls the operation of the entire computer 40.
  • a ROM (Read Only Memory) 42 is a read-only semiconductor memory in which a predetermined basic control program is recorded in advance. The MPU 41 reads out and executes this basic control program when the computer 40 is activated, thereby enabling operation control of each component of the computer 40.
  • a RAM (Random Access Memory) 43 is a semiconductor memory that can be written and read at any time and used as a working storage area as necessary when the MPU 41 executes various control programs.
  • the hard disk device 44 is a storage device that stores various control programs executed by the MPU 41 and various data.
  • the MPU 41 can perform various control processes by reading and executing a predetermined control program stored in the hard disk device 44.
  • the input device 45 is, for example, a keyboard device or a mouse device.
  • the input device 45 acquires various information input from the administrator associated with the operation content. Then, the acquired input information is sent to the MPU 41.
  • the display device 46 is a liquid crystal display, for example, and displays various texts and images according to display data sent from the MPU 41.
  • the interface device 47 manages the exchange of various information with various devices connected to the computer 40.
  • the recording medium driving device 48 is a device that reads various control programs and data recorded in the portable recording medium 50.
  • the MPU 41 can read out and execute a predetermined control program recorded on the portable recording medium 50 via the recording medium driving device 48 to perform various control processes described later.
  • Examples of the portable recording medium 50 include a CD-ROM (Compact Disc Read Only Memory), a DVD-ROM (Digital Versatile Disc Read Only Memory), and a flash memory equipped with a USB (Universal Serial Bus) standard connector.
  • a control program for causing the MPU 41 to perform biometric information registration processing described below is created.
  • the created control program is stored in advance in the hard disk device 44 or the portable recording medium 50.
  • the hard disk device 44 is caused to function as the biological image temporary storage unit 33, the registration candidate storage unit 35, and the registered biological information storage unit 38.
  • In addition, a camera, which is an example of the imaging unit 31, is connected to the interface device 47 of the computer 40 so that the computer 40 can control the imaging unit 31 to acquire the biometric image 101 and take it into the computer 40. Then, a predetermined instruction is given to the MPU 41 to read and execute this control program.
  • In this way, the computer 40 can provide the functions of the imaging control unit 32, the biometric image temporary storage unit 33, the image evaluation unit 34, the registration candidate storage unit 35, the evaluation result check unit 36, the feature extraction unit 37, and the registered biometric information storage unit 38.
  • FIG. 11 is a flowchart illustrating an example of the contents of the biometric information registration process performed by the biometric information registration unit 10 in FIG.
  • In S101, the imaging control unit 32 performs processing to control the imaging unit 31 so that the biometric part of the person to be authenticated is captured, a biometric image 101 is acquired, and the acquired image is stored in the biometric image temporary storage unit 33.
  • the biological image 101 stored in the biological image temporary storage unit 33 at this time is expressed as “CaptImage”. This notation is used in the following description.
  • In S102, the image evaluation unit 34 performs a process of substituting the initial value "0" into the variable i.
  • In S103, the image evaluation unit 34 performs a process of evaluating "CaptImage" with the evaluation condition 102 corresponding to the current value of the variable i (denoted "Func[i]") and obtaining the resulting evaluation value (denoted "Result[i]").
  • In S104, the image evaluation unit 34 performs a process of determining whether "Result[i]" acquired in the most recently executed S103 is larger than the evaluation threshold "Thresh[i]" predetermined for "Func[i]", that is, whether the evaluation value indicates a sufficiently high degree of match.
  • When the image evaluation unit 34 determines that "Result[i]" is larger than "Thresh[i]" (when the determination result is Yes), the process proceeds to S105. When the image evaluation unit 34 determines that "Result[i]" acquired in the most recently executed S103 is equal to or less than "Thresh[i]" (when the determination result is No), the process proceeds to S108.
  • In the following, the biometric image 101 having the highest evaluation value so far for "Func[i]" is denoted "TempImage[i]", and its evaluation value is denoted "TempResult[i]".
  • In S105, the image evaluation unit 34 determines whether "TempImage[i]" is stored at its predetermined storage position in the registration candidate storage unit 35. When the image evaluation unit 34 determines that "TempImage[i]" is not stored at that position (the position is empty; when the determination result is Yes), the process proceeds to S107. When the image evaluation unit 34 determines that "TempImage[i]" is stored at that position (when the determination result is No), the process proceeds to S106.
  • In S106, the image evaluation unit 34 determines whether "Result[i]" acquired in the most recently executed S103 is larger than "TempResult[i]". When it is larger (when the determination result is Yes), the process proceeds to S107. When the image evaluation unit 34 determines that "Result[i]" is equal to or less than "TempResult[i]" (when the determination result is No), the process proceeds to S108.
  • In S107, the image evaluation unit 34 stores "CaptImage" as "TempImage[i]" and "Result[i]" as "TempResult[i]" at the corresponding storage positions in the registration candidate storage unit 35.
  • In S108, the image evaluation unit 34 performs a process of adding "1" to the current value of the variable i and substituting the result back into the variable i.
  • In S109, the image evaluation unit 34 performs a process of determining whether the value of the variable i at this point is equal to or greater than the number FuncNum of the evaluation conditions 102. When the determination result is Yes, the process proceeds to S110.
  • In S110, the evaluation result check unit 36 performs the following determination. When the evaluation result check unit 36 determines that a biometric image 101 is stored at all of the storage positions of "TempImage[i]" in the registration candidate storage unit 35 (when the determination result is Yes), the process proceeds to S111. When some storage positions of "TempImage[i]" in the registration candidate storage unit 35 still have no biometric image 101 stored (when the determination result is No), the process returns to S101. In this case, the processing from S101 onward is repeated until a biometric image 101 is stored at all of the storage positions of "TempImage[i]" in the registration candidate storage unit 35.
  • In S111, the evaluation values extracted by the image evaluation unit 34, the evaluation conditions 102 under which those evaluation values were output, and the biometric images 101 that were the evaluation targets are associated with one another and stored in the registered biometric information storage unit 38 as the biometric features 108. The processing up to this point constitutes the biometric information registration process.
  • The function of the biometric image acquisition unit 11 in FIG. 1 is provided by the imaging control unit 32 executing the process of S101 in FIG. 11, and the functions of the evaluation unit 12 and the evaluation value extraction unit 13 in FIG. 1 are provided by the image evaluation unit 34 executing the processing from S102 to S109 in FIG. 11.
  • The function of the evaluation result check unit 15 in FIG. 1 is provided by the evaluation result check unit 36 executing the process of S110 in FIG. 11, and the function of the registered biometric information storage unit 14 in FIG. 1 is provided by the feature extraction unit 37 executing the process of S111.
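  • Taken together, the flow of S101 to S111 can be sketched as the loop below, reusing the variable names CaptImage, Result[i], Thresh[i], TempImage[i], TempResult[i], and FuncNum; the capture, evaluation, and storage callables are placeholders, so this is only a reading of the flowchart, not the actual implementation.

      def biometric_registration(capture_image, Func, Thresh, store_registered):
          """Sketch of the registration flow of FIG. 11.  capture_image(), the
          evaluation functions Func[i], and store_registered() are placeholders."""
          FuncNum = len(Func)
          TempImage = [None] * FuncNum    # registration candidate per evaluation condition
          TempResult = [None] * FuncNum   # best evaluation value per evaluation condition

          while any(img is None for img in TempImage):              # S110: repeat until filled
              CaptImage = capture_image()                           # S101: acquire an image
              for i in range(FuncNum):                              # S102/S108/S109: loop over i
                  Result_i = Func[i](CaptImage)                     # S103: evaluate with Func[i]
                  if Result_i <= Thresh[i]:                         # S104: discard low values
                      continue
                  if TempImage[i] is None or Result_i > TempResult[i]:   # S105/S106
                      TempImage[i] = CaptImage                      # S107: keep the best image
                      TempResult[i] = Result_i

          for i in range(FuncNum):                                  # S111: store the biometric
              store_registered(i, TempImage[i], TempResult[i])      # features 108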
  • In FIG. 11, each time a biometric image 101 acquired by the imaging unit 31 is temporarily stored in the biometric image temporary storage unit 33, that biometric image 101 may be evaluated by the image evaluation unit 34.
  • Alternatively, the imaging unit 31 may first acquire a plurality of biometric images 101, and the image evaluation unit 34 may then evaluate the plurality of biometric images 101 collectively.
  • Also, in FIG. 11, the value of the variable i is incremented by "1" in S108, so that the evaluation of "CaptImage" by each "Func[i]" is performed sequentially.
  • Instead, the evaluation of "CaptImage" by the different "Func[i]" may be executed together as parallel processing. For example, when a processing environment such as a multi-core or multi-processor machine or a cloud system is used, the processing time can be shortened by performing such parallel processing.
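  • For example, the per-condition evaluation of one captured image could be parallelized with a standard-library thread pool as sketched below; this is one possible realization under the stated assumption of a multi-core or distributed environment, not the arrangement disclosed here.

      from concurrent.futures import ThreadPoolExecutor

      def evaluate_all_conditions(CaptImage, Func):
          """Run every evaluation function Func[i] on the same captured image
          concurrently and return the evaluation values Result[0..FuncNum-1]."""
          with ThreadPoolExecutor() as pool:
              return list(pool.map(lambda f: f(CaptImage), Func))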
  • The evaluation result check unit 36 in FIG. 9 has the imaging unit 31 acquire further biometric images 101 until evaluation values 107 have been extracted for all the evaluation conditions 102 in the per-condition extraction of evaluation values 107 by the image evaluation unit 34. Instead, even if there remain evaluation conditions 102 for which no evaluation value 107 has been extracted, the evaluation result check unit 36 may terminate the control for acquiring biometric images 101 with the imaging unit 31 once the number of extracted evaluation values 107 reaches a predetermined number RegistNum.
  • FIG. 12 is a first modified example of the flowchart of FIG. 11, in which the above-described operation is performed by the evaluation result check unit 36.
  • This determination process of S120 replaces the determination process of S110 in the flowchart of FIG. 11.
  • the determination process in S120 is a process executed by the evaluation result check unit 36 when the result of the determination process in S109 in FIG. 11 is Yes.
  • In S120, when the evaluation result check unit 36 determines that at least RegistNum biometric images 101 are stored in the registration candidate storage unit 35 (when the determination result is Yes), the process proceeds to S111 in FIG. 11.
  • When the evaluation result check unit 36 determines that the number of stored biometric images 101 is less than RegistNum (when the determination result is No), the process returns to S101 in FIG. 11. In this case, the processing from S101 onward is repeated until at least RegistNum biometric images 101 are stored in the registration candidate storage unit 35.
  • Furthermore, the feature extraction unit 37 may store in the registered biometric information storage unit 38 only a predetermined number RegistNum of sets, selected in descending order of the evaluation value 107 (that is, in descending order of the degree of match with the evaluation condition 102), from among the combinations of the biometric image 101 and the evaluation value 107.
  • FIG. 13 is a second modification of the flowchart of FIG. 11 and is a case where the feature extraction unit 37 performs the above-described operation.
  • the process of S121 replaces the process of S111 in the flowchart of FIG.
  • the process of S121 is a process executed by the feature extraction unit 37 when the result of the determination process of S110 of FIG. 11 is Yes.
  • In S121, the feature extraction unit 37 causes the registered biometric information storage unit 38 to store, from among these combinations, only the RegistNum sets with the largest values of "TempResult[i]".
  • the feature extraction unit 37 also stores information for specifying “Func [i]” in the registered biometric information storage unit 38 in association with the above combination. After that, the biometric information registration process in FIG. 11 ends.
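  • The selection of S121 amounts to sorting the accumulated combinations by TempResult[i] in descending order and keeping RegistNum of them, as in the sketch below; the storage callable is a placeholder.

      def store_top_candidates(candidates, RegistNum, store_registered):
          """candidates: (i, TempImage_i, TempResult_i) tuples gathered in the
          registration candidate storage unit 35.  Only the RegistNum sets with the
          highest evaluation values are stored, together with the index i that
          identifies Func[i]."""
          ranked = sorted(candidates, key=lambda c: c[2], reverse=True)  # descending TempResult
          for i, image, value in ranked[:RegistNum]:
              store_registered(i, image, value)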
  • FIG. 14 illustrates an example of a specific configuration of the authentication unit 20 in the biometric authentication system 1 of FIG.
  • The authentication unit 20 includes an imaging unit 51, an imaging control unit 52, a verification biometric image temporary storage unit 53, an image evaluation unit 54, a feature extraction unit 55, a verification biometric information storage unit 56, an evaluation order determination unit 57, and a feature matching unit 58.
  • The registered biometric information storage unit 38 in FIG. 14 is the one provided in the biometric information registration unit 10 of FIG. 9, and the biometric features 108 described above are stored in it as registered biometric information.
  • the imaging unit 51 is a camera that captures a biometric part of a person to be authenticated and acquires a biometric image 111 for verification during authentication.
  • the imaging control unit 52 controls the imaging unit 51 to acquire a plurality of verification subject biometric images 111.
  • the verification biometric image temporary storage unit 53 is a storage unit that temporarily stores the verification biometric image 111 acquired by the imaging unit 51.
  • The image evaluation unit 54 reads the verification biometric image 111 stored in the verification biometric image temporary storage unit 53, evaluates it against the same evaluation conditions 102 as those used by the image evaluation unit 34 in the biometric information registration unit 10 of FIG. 9, and obtains an evaluation value 112 for each evaluation condition 102.
  • The feature extraction unit 55 reads the verification biometric image 111 stored in the verification biometric image temporary storage unit 53, extracts the verification biometric feature 113, which is information on the features of the biometric part shown in the verification biometric image 111, and stores it in the verification biometric information storage unit 56.
  • The verification biometric information storage unit 56 is a storage unit that stores the verification biometric feature 113 extracted from the verification biometric image 111 by the feature extraction unit 55.
  • The evaluation order determination unit 57 determines the order in which the feature matching unit 58 reads the biometric features 108 from the registered biometric information storage unit 38 and performs the later-described matching operation, based on the evaluation values 112 obtained by the image evaluation unit 54 for the verification biometric image 111.
  • More specifically, the evaluation order determination unit 57 first arranges the evaluation values 112 for the individual evaluation conditions 102 in descending order of the degree of match between the verification biometric image 111 and the evaluation condition 102. It then obtains an order of the evaluation conditions 102 by replacing each evaluation value 112 in this arrangement with the evaluation condition 102 that produced it, and determines this order of the evaluation conditions 102 as the order in which the biometric images 101 associated with those evaluation conditions 102 as the biometric features 108 are read from the registered biometric information storage unit 38.
  • The feature matching unit 58 first reads the verification biometric feature 113 from the verification biometric information storage unit 56. Next, the feature matching unit 58 reads the biometric images 101 in the biometric features 108 stored in the registered biometric information storage unit 38 in the order determined by the evaluation order determination unit 57 and matches them against the verification biometric feature 113. In this way, the feature matching unit 58 determines the degree of similarity between the verification biometric image 111 and each biometric image 101 read from the registered biometric information storage unit 38, verifies the identity of the person to be authenticated based on the determination result, and outputs the verification result.
  • Thus, the image evaluation unit 54 evaluates the verification biometric image 111 by applying the same evaluation conditions 102 as the image evaluation unit 34 in the biometric information registration unit 10 of FIG. 9. Then, according to the order determined by the evaluation order determination unit 57, the feature matching unit 58 gives priority to matching against the biometric images 101 that are stored in the registered biometric information storage unit 38 in association with the evaluation conditions 102 whose evaluation values 112 indicate a high degree of match with the verification biometric image 111.
  • As a result, the time required for authentication can be shortened. When the images of the biometric part have the same position, posture, and shape, high authentication accuracy can be expected, so when the genuine person authenticates, the probability of correctly identifying him or her as the person to be authenticated increases, and a successful matching result is often obtained after only a single matching operation.
  • Conversely, when the positions, postures, and shapes differ, the authentication accuracy is low, so even when the genuine person authenticates, there are many cases where he or she cannot be correctly identified, and as a result the matching has to be repeated many times.
  • The authentication unit 20 in FIG. 14 is configured as described above. Note that some components of the authentication unit 20 illustrated in FIG. 14 can be configured using a computer having a standard configuration as illustrated in FIG. 10. For this purpose, for example, a control program for causing the MPU 41 to perform the collation process described below is created.
  • the created control program is stored in advance in the hard disk device 44 or the portable recording medium 50.
  • the hard disk device 44 is caused to function as the verification biometric image temporary storage unit 53, the verification biometric information storage unit 56, and the registered biometric information storage unit 38.
  • A camera, which is an example of the imaging unit 51, is connected to the interface device 47 of the computer 40 so that the computer 40 can control the imaging unit 51 and capture the collation biometric image 111. A predetermined instruction is then given to the MPU 41 to read and execute this control program.
  • In this way, the functions of the imaging control unit 52, the collation biometric image temporary storage unit 53, the image evaluation unit 54, the feature extraction unit 55, the collation biometric information storage unit 56, the evaluation order determination unit 57, the feature matching unit 58, and the registered biometric information storage unit 38 can be provided by the computer 40.
  • FIG. 15 is a flowchart illustrating an example of the processing content of the collation process performed by the authentication unit 20 of FIG. 14.
  • First, the imaging control unit 52 controls the imaging unit 51 to capture the biological part of the person to be authenticated, acquires the collation biometric image 111, and stores it in the collation biometric image temporary storage unit 53. The collation biometric image 111 stored at this point in the collation biometric image temporary storage unit 53 is denoted "SampleImage"; this notation is used in the following description.
  • In step S202, the image evaluation unit 54 performs a process of substituting the initial value "0" into the variable i.
  • In step S203, the image evaluation unit 54 evaluates "SampleImage" against the evaluation condition 102 that corresponds to the current value of the variable i, among the same plurality of evaluation conditions 102 used by the image evaluation unit 34 in FIG. 9, and performs a process of obtaining the evaluation value that is the result of this evaluation.
  • Next, the image evaluation unit 54 performs a process of substituting the current value of the variable i into "EvalResult[i].index".
  • "EvalResult[i].index" is information for identifying "Func[i]", the evaluation condition used when "EvalResult[i].value" was obtained.
  • the image evaluation unit 54 performs a process of adding “1” to the current value of the variable i and substituting the addition result into the variable i.
  • the image evaluation unit 54 performs a process of determining whether or not the value of the variable i at the time of this process is equal to or greater than the number FuncNum of the evaluation conditions 102.
  • When the value of the variable i is equal to or greater than FuncNum, the process proceeds to S207, where the feature extraction unit 55 reads "SampleImage" from the collation biometric image temporary storage unit 53, extracts information on the characteristics of the biological part shown in the image as the collation biometric feature 113, and stores it in the collation biometric information storage unit 56.
  • the biometric feature 113 for matching is represented as “SampleFeature”, and this notation is used in the following description.
  • the feature matching unit 58 performs a process of substituting the initial value “0” for the variable j.
  • the feature matching unit 58 performs a process of substituting the value of "EvalResult[j].index" into the variable k.
  • the feature matching unit 58 performs a process of matching the matching biometric feature 113 extracted from the matching biometric image 111 with the biometric image 101 read from the registered biometric information storage unit 38.
  • Specifically, "SampleFeature" is read from the collation biometric information storage unit 56, and the biometric image 101 associated with "Func[k]" among the biometric features 108 stored in the registered biometric information storage unit 38 is read out. The read "SampleFeature" is then collated with the feature extracted from that biometric image 101, and a process of obtaining the similarity between the collation biometric image 111 and the biometric image 101 is performed.
  • The feature extracted from the biometric image 101 associated with "Func[k]" among the biometric features 108 stored in the registered biometric information storage unit 38 is denoted "Template[k]".
  • Next, the feature matching unit 58 performs a process of determining whether the person to be authenticated has been confirmed as the genuine person. This determination is made by checking whether the similarity between the collation biometric image 111 and the biometric image 101, obtained by the collation process in S211, is equal to or greater than a predetermined threshold value.
  • If it is, the feature matching unit 58 performs a process of outputting information indicating "person" as the matching result, and the collation process then ends.
  • the feature matching unit 58 performs a process of adding “1” to the current value of the variable j and substituting the addition result into the variable j.
  • the feature matching unit 58 performs a process of determining whether or not the value of the variable j at the time of this process is equal to or greater than the number FuncNum of the evaluation conditions 102.
  • If the value of the variable j is equal to or greater than FuncNum, the process proceeds to S216.
  • Otherwise, the process returns to S210, and the processes from S210 onward are performed again with the updated value of the variable j.
  • In S216, the feature matching unit 58 performs a process of outputting information indicating "other" as the matching result. The collation process then ends.
  • The processing described so far constitutes the collation process.
  • Through this process, the level of similarity between the registered biometric information stored in the registered biometric information storage unit 38 and the authentication biometric information is determined, and the identity of the person to be authenticated is verified based on the determination result.
  • As described above, since collation is performed preferentially between images in which the living body part appears in the same position, posture, and shape at registration and at authentication, the time required for verification can be shortened.
  • In the above description, the feature matching unit 58 reads the biometric features 108 corresponding to the evaluation conditions 102 from the registered biometric information storage unit 38 in descending order of the evaluation values 112 obtained when the collation biometric image 111 was evaluated, and performs the collation operation in that order. Instead, the biometric features 108 may be arranged in descending order of the evaluation values 107 and stored in the registered biometric information storage unit 38 in that order, and the feature matching unit 58 may then read the biometric features 108 from the registered biometric information storage unit 38 in this storage order and perform the collation operation; a sketch of the ordered collation loop and of this variant is given after this list.
  • FIG. 16A is a third modification of the flowchart of FIG. 11, in which the feature extraction unit 37 performs an operation for storing the biometric features 108 in the registered biometric information storage unit 38 in the order described above.
  • FIG. 16B is a first modification of the flowchart of FIG. 15, and is a case where the above-described operation is performed by the feature matching unit 58.
  • The process of S131 in FIG. 16A replaces the process of S111 in the flowchart of FIG. 11.
  • The process of S131 is executed by the feature extraction unit 37 when the result of the determination process of S110 in FIG. 11 is Yes.
  • the feature matching unit 58 performs a process of substituting the initial value “0” for the variable j.
  • the processing content of S222 is the same as that of S209 in FIG. 15.
  • the feature matching unit 58 performs a process of matching the matching biometric feature 113 extracted from the matching biometric image 111 with the biometric image 101 read from the registered biometric information storage unit 38.
  • Specifically, "SampleFeature" is read from the collation biometric information storage unit 56, and the biometric image 101 associated with "Func[j]" among the biometric features 108 stored in the registered biometric information storage unit 38 is read out.
  • FIG. 17 is a modification of the specific configuration of the biometric information registration unit 10 illustrated in FIG. 9.
  • the same components as those in FIG. 9 are denoted by the same reference numerals, and detailed description thereof will be omitted.
  • the configuration example illustrated in FIG. 17 is different from the configuration example illustrated in FIG. 9 only in that a guidance message display unit 39 is added.
  • the evaluation result check unit 36 determines whether there is an evaluation condition 102 in which the evaluation value 107 is not stored in the registration candidate storage unit 35, as in the configuration example of FIG. 9.
  • If such an evaluation condition 102 exists, the evaluation result check unit 36 instructs the imaging control unit 32 to control the imaging unit 31 so that further biometric images 101 are acquired.
  • the evaluation result check unit 36 gives a predetermined instruction to the guidance message display unit 39 in parallel with the instruction to the imaging control unit 32.
  • the guidance message display unit 39 is a display device that displays a predetermined message when it receives a predetermined instruction from the evaluation result check unit 36. This message guides the body part of the person to be authenticated so that the imaging unit 31 can acquire a biometric image 101 that satisfies the evaluation condition 102 for which no evaluation value 107 is stored in the registration candidate storage unit 35.
  • the guidance message display unit 39 provides the function of the guidance output unit 16 of FIG. 1.
  • When the guidance message display unit 39 displays such a message, the person to be authenticated follows its content, so a biometric image 101 satisfying the evaluation condition 102 that is still missing in the registration candidate storage unit 35 can be expected to be acquired quickly.
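To make the collation flow of FIG. 15, and the pre-sorted variant of FIGS. 16A and 16B, easier to follow, here is a minimal Python sketch. It reuses the notation above (SampleImage, SampleFeature, EvalResult, Func[i], Template[k], FuncNum), but the scoring function match_score and the thresholds are placeholders introduced here for illustration, not part of the original disclosure.

```python
# Minimal sketch of the ordered collation loop of FIG. 15 (steps S202-S216).
# match_score() and the thresholds are placeholders for illustration.

def evaluate_sample(sample_image, eval_conditions):
    """S202-S206: evaluate SampleImage against every evaluation condition Func[i]."""
    eval_result = []
    for i, func in enumerate(eval_conditions):            # Func[i]
        eval_result.append({"value": func(sample_image),  # EvalResult[i].value
                            "index": i})                  # EvalResult[i].index
    # Collate registered templates in descending order of the evaluation value,
    # i.e. starting with the condition the sample matches best.
    eval_result.sort(key=lambda r: r["value"], reverse=True)
    return eval_result

def collate(sample_feature, templates, eval_result, match_score, similarity_threshold):
    """S208-S216: try the registered templates in the determined order."""
    for j in range(len(eval_result)):                     # j = 0 .. FuncNum - 1
        k = eval_result[j]["index"]                       # S210: k = EvalResult[j].index
        similarity = match_score(sample_feature, templates[k])  # S211: SampleFeature vs Template[k]
        if similarity >= similarity_threshold:            # S212: identity confirmed?
            return "person"                               # S213: genuine match found
    return "other"                                        # S216: no registered template matched
```

In the variant of FIGS. 16A and 16B, the sort is instead performed once at registration time (the templates are stored in descending order of the evaluation values 107), so collate() could simply iterate over the templates in storage order without consulting eval_result.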


Abstract

The disclosed biometric authentication system acquires registered biometric information in positions, orientations, and forms suitable for authentication without placing a large burden on the user. To this end, the biometric authentication system is provided with a biometric image acquisition unit, an evaluation unit, an evaluation value extraction unit, and a registered biometric information storage unit. The evaluation unit evaluates each of the multiple biometric images acquired by the biometric image acquisition unit against each of multiple evaluation criteria and outputs, as the evaluation result for each evaluation criterion, an evaluation value indicating the degree to which the biometric image matches that criterion. From the evaluation values of the biometric images output by the evaluation unit, the evaluation value extraction unit extracts, for each evaluation criterion, the evaluation values that indicate a degree of matching greater than or equal to a prescribed evaluation threshold. Furthermore, if multiple evaluation values are extracted for the same evaluation criterion, the evaluation value with the highest degree of matching for that criterion is selected. The registered biometric information storage unit stores, as the registered biometric information of the authenticated subject, the biometric images corresponding to the evaluation values extracted by the evaluation value extraction unit.

Description

Biometric authentication system, biometric authentication method, and program
 The embodiments discussed in this specification relate to biometric authentication techniques for performing personal authentication using human body characteristics such as fingerprints, palm veins, and faces.
 In modern society, identity verification is required in various situations. As one technique for performing such identity verification accurately, biometric authentication, which performs personal authentication using the characteristics of parts of the human body such as fingerprints, veins, and faces, has become widespread in recent years. For example, biometric authentication is used to determine whether a person may enter or leave a restricted area that only authorized people may enter, whether a person may log in to a personal computer, and whether a person may access various services such as online transactions.
 To perform such biometric authentication, images of, for example, fingerprints, veins, or the face are first acquired in advance as the user's biometric information and recorded on a recording medium as that user's registered biometric information. Thereafter, when identity verification becomes necessary in order to use a service such as those described above, the user's biometric information is acquired again in the same way as at registration. The biometric information acquired at this time (the authentication biometric information) is then collated with the registered biometric information recorded on the recording medium, and the level of similarity between the two is determined. If the similarity is higher than a predetermined threshold, an authentication result indicating that the user is the genuine person is obtained.
 To perform such biometric authentication with high accuracy, it is preferable that the image of the biological part (the biometric image) used as the biometric information be captured in the same position and posture at registration and at verification. In practice, however, such capture is often impossible, and this is one of the factors that lower authentication accuracy.
 This is because it is usually extremely difficult to hold a biological part over the sensor (camera) that captures the biometric image in exactly the same position and posture every time. In many cases the user also forgets in what position and posture the biological part was captured at registration, which makes such capture even harder. Furthermore, because the human body has a large degree of freedom in posture and shape depending on the part, the posture and shape in the obtained image differ slightly every time, even when the same part is captured. A palm, for example, is a biological part with an extremely high degree of freedom in posture and shape, so the shape of the palm in the obtained image differs slightly from one capture to the next.
 Several techniques are known for avoiding the drop in authentication accuracy caused by such differences in position, posture, and shape between registration and verification. One of them is to capture the biological part while firmly fixing it in the same position, posture, and shape. This technique, however, places a great burden on the user and is generally not preferred. Performing such fixing can also take considerable time.
 Another approach is to capture and record multiple biometric images of the same user in various positions, postures, and shapes at registration, and at verification to collate a single image against the multiple images acquired at registration. Several techniques of this kind have been proposed. They can be classified into several types according to how the multiple biometric images of the same user, which serve as the registered biometric information, are chosen.
 The first of these techniques records the captured biometric images as the user's registered biometric information as they are, without any special consideration. The second technique evaluates the similarity between the captured biometric images and records a combination of dissimilar images (that is, images with low mutual similarity) as the user's registered biometric information. The third technique presents a guidance message to the user at registration to guide the biological part into an appropriate position and posture, and then captures multiple images and records them as the user's registered biometric information.
Japanese National Publication of International Patent Application No. 2007-521577; Japanese Laid-open Patent Publication No. 2007-156936
 Because the first technique described above relies entirely on chance, the registered biometric information is not necessarily suitable for biometric authentication.
 The second technique, likewise, does not guarantee that registered biometric information is obtained exhaustively in positions and postures suitable for collation with the user's biometric image captured at verification. In other words, the second technique also ultimately relies on chance, and appropriate registered biometric information is not always obtained.
 In contrast, the third technique described above is effective in that it is highly likely to yield appropriate registered biometric information, but having to follow guidance of the biological part to an appropriate position and posture places a burden on the user. In addition, separate means for providing such guidance (for example, an assistant (operator), visual means such as a display device, or an output device for emitting guidance sounds) is required.
 As described above, it is desired to obtain registered biometric information in positions, postures, and shapes suitable for collation without placing a great burden on the user.
 A biometric authentication system described later in this specification verifies the identity of a person to be authenticated based on the similarity between the person's registered biometric information, registered in advance at registration, and the person's authentication biometric information, acquired at authentication. One such biometric authentication system includes a biometric image acquisition unit, an evaluation unit, an evaluation value extraction unit, a registered biometric information storage unit, and an authentication unit. The biometric image acquisition unit acquires multiple biometric images of the person to be authenticated. The evaluation unit evaluates each of the acquired biometric images against each of a plurality of predetermined evaluation conditions and outputs, for each evaluation condition, an evaluation value indicating how closely the biometric image matches that condition, as the evaluation result for the image. The evaluation value extraction unit extracts, for each evaluation condition, the evaluation values that indicate a degree of matching equal to or higher than a predetermined evaluation threshold from the evaluation values output for the biometric images. When multiple evaluation values are extracted for the same evaluation condition, the evaluation value extraction unit further extracts, from those values, the one with the highest degree of matching with that condition. The registered biometric information storage unit stores the biometric images corresponding to the extracted evaluation values as the registered biometric information of the person to be authenticated. The authentication unit then determines the level of similarity between the registered biometric information stored in the registered biometric information storage unit and the authentication biometric information, and verifies the identity of the person to be authenticated based on the determination result.
 A biometric authentication method described later in this specification likewise verifies the identity of a person to be authenticated based on the similarity between registered biometric information registered in advance at registration and authentication biometric information acquired at authentication. In this method, an evaluation unit first evaluates each of multiple biometric images of the person against each of a plurality of predetermined evaluation conditions and outputs, for each condition, an evaluation value indicating how closely the image matches that condition. An evaluation value extraction unit then extracts, for each condition, the evaluation values that are equal to or higher than a predetermined evaluation threshold; when multiple values are extracted for the same condition, it further extracts the one with the highest degree of matching. A registered biometric information storage unit stores the biometric images corresponding to the extracted evaluation values as the person's registered biometric information. An authentication unit then determines the level of similarity between the stored registered biometric information and the authentication biometric information, and verifies the person's identity based on the determination result.
 A program described later in this specification causes a computer to verify the identity of a person to be authenticated based on the similarity between registered biometric information registered in advance at registration and authentication biometric information acquired at authentication. The program causes the computer to perform an evaluation process, an evaluation value extraction process, a registered biometric information storage process, and an authentication process. The evaluation process evaluates each of multiple biometric images of the person against each of a plurality of predetermined evaluation conditions and outputs, for each condition, an evaluation value indicating how closely the image matches that condition. The evaluation value extraction process extracts, for each condition, the evaluation values that are equal to or higher than a predetermined evaluation threshold; when multiple values are extracted for the same condition, it further extracts the one with the highest degree of matching. The registered biometric information storage process stores the biometric images corresponding to the extracted evaluation values in a storage unit as the person's registered biometric information. The authentication process determines the level of similarity between the registered biometric information stored in the storage unit and the authentication biometric information, and verifies the person's identity based on the determination result.
 The biometric authentication device described later in this specification can obtain registered biometric information in positions, postures, and shapes suitable for collation without placing a great burden on the user.
FIG. 1 is a functional configuration diagram of an embodiment of a biometric authentication system.
FIG. 2 is a diagram explaining the operation of the embodiment of the biometric authentication system.
FIG. 3 illustrates a first example of the correspondence between a biometric image and an evaluation value.
FIG. 4 illustrates a second example of the correspondence between a biometric image and an evaluation value.
FIG. 5 explains an example of a method of estimating the tilt angle of the hand image appearing in a biometric image.
FIG. 6 illustrates a third example of the correspondence between a biometric image and an evaluation value.
FIG. 7 explains an example of a method of estimating an index value representing the degree of finger opening in the hand image appearing in a biometric image.
FIG. 8 explains the utility of the biometric authentication system of FIG. 1.
FIG. 9 illustrates an example of the specific configuration of the biometric information registration unit in the biometric authentication system of FIG. 1.
FIG. 10 illustrates a configuration example of a computer.
FIG. 11 is a flowchart illustrating an example of the biometric information registration process performed by the biometric information registration unit of FIG. 9.
FIG. 12 is a first modification of the flowchart of FIG. 11.
FIG. 13 is a second modification of the flowchart of FIG. 11.
FIG. 14 illustrates an example of the specific configuration of the authentication unit in the biometric authentication system of FIG. 1.
FIG. 15 is a flowchart illustrating an example of the collation process performed by the authentication unit of FIG. 14.
FIG. 16A is a third modification of the flowchart of FIG. 11, and FIG. 16B is a first modification of the flowchart of FIG. 15.
FIG. 17 is a modification of the specific configuration of the biometric information registration unit of FIG. 9.
 First, FIG. 1 will be described. FIG. 1 is a functional configuration diagram of an embodiment of a biometric authentication system.
 This biometric authentication system 1 verifies the identity of a person to be authenticated based on the similarity between the person's registered biometric information, registered in advance at registration, and the person's authentication biometric information, acquired at authentication.
 The biometric authentication system 1 includes a biometric information registration unit 10 that registers the biometric information of the person to be authenticated, and an authentication unit 20 that authenticates the person using the biometric information registered by the biometric information registration unit 10 and thereby verifies the person's identity.
 The biometric information registration unit 10 includes a biometric image acquisition unit 11, an evaluation unit 12, an evaluation value extraction unit 13, and a registered biometric information storage unit 14.
 The biometric image acquisition unit 11 acquires multiple biometric images of the person to be authenticated.
 The evaluation unit 12 evaluates each of the biometric images acquired by the biometric image acquisition unit 11 against each of a plurality of predetermined evaluation conditions and outputs, for each evaluation condition, an evaluation value indicating how closely the biometric image matches that condition, as the evaluation result for the image.
 From the evaluation values output by the evaluation unit 12 for the biometric images, the evaluation value extraction unit 13 extracts, for each evaluation condition, the evaluation values that indicate a degree of matching equal to or higher than a predetermined evaluation threshold. This extraction is performed to weed out, from the candidates for the registered biometric information, biometric images that cannot be used for the identity verification performed later by the authentication unit 20. When multiple evaluation values are extracted for the same evaluation condition, the evaluation value extraction unit 13 further extracts, from those values, the one with the highest degree of matching with that condition.
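As a rough illustration of the behaviour of the evaluation value extraction unit 13 described above, the following sketch assumes the evaluation values are numbers for which larger means a closer match, and that they are held as a per-image list of per-condition scores; both assumptions are illustrative only.

```python
def extract_best_per_condition(eval_values, eval_threshold):
    """eval_values[i][c] is the evaluation value of biometric image i for condition c.
    Returns {condition index: image index} for every condition for which at least
    one image reached the evaluation threshold."""
    best = {}
    for i, per_condition in enumerate(eval_values):
        for c, value in enumerate(per_condition):
            if value < eval_threshold:                 # image unusable for condition c
                continue
            if c not in best or value > eval_values[best[c]][c]:
                best[c] = i                            # keep only the closest match per condition
    return best
```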
 The registered biometric information storage unit 14 stores the biometric images corresponding to the evaluation values extracted by the evaluation value extraction unit 13 as the registered biometric information of the person to be authenticated.
 The authentication unit 20 determines the level of similarity between the registered biometric information stored in the registered biometric information storage unit 14 and the authentication biometric information described above, and verifies the identity of the person to be authenticated based on the determination result.
 As depicted in FIG. 1, the biometric information registration unit 10 of the biometric authentication system 1 may further include an evaluation result check unit 15 and a guidance output unit 16.
 The evaluation result check unit 15 determines whether there is an evaluation condition for which no evaluation value was extracted in the per-condition extraction performed by the evaluation value extraction unit 13. If it determines that such an evaluation condition exists, the evaluation result check unit 15 causes the biometric image acquisition unit 11 to acquire biometric images again, the evaluation unit 12 to evaluate them again, and the evaluation value extraction unit 13 to extract evaluation values again.
 When the evaluation result check unit 15 determines that there is an evaluation condition for which no evaluation value was extracted, the guidance output unit 16 outputs guidance to the person to be authenticated for guiding the person's biological part. This guidance is intended to enable the biometric image acquisition unit 11 to obtain a biometric image that satisfies that evaluation condition.
 Next, the operation of the biometric authentication system 1 will be further described with reference to FIG. 2.
 In FIG. 2, the biometric images 101 are multiple images of the person to be authenticated acquired by the biometric image acquisition unit 11. In the example of FIG. 2, each of the biometric images 101 shows the hand of the person to be authenticated, which is the biological part used for authentication, in various positions, postures, and shapes.
 The evaluation unit 12 evaluates each of the biometric images 101 against a plurality of predetermined evaluation conditions 102 and outputs, for each evaluation condition 102, an evaluation value indicating how closely the image matches that condition, as the evaluation result for each biometric image 101.
 The evaluation conditions 102 are preferably chosen so that the various image patterns typical of the usage scenes of the biometric authentication system 1 are covered by the registered biometric information. In the example of FIG. 2, seven evaluation conditions 102 are defined: a first evaluation condition 102a, a second evaluation condition 102b, a third evaluation condition 102c, a fourth evaluation condition 102d, a fifth evaluation condition 102e, a sixth evaluation condition 102f, and a seventh evaluation condition 102g. The first evaluation condition 102a is that the hand image in the picture has a predetermined first size, and the second evaluation condition 102b is that it has a predetermined second size. The third evaluation condition 102c is that the hand image has a predetermined first tilt, and the fourth evaluation condition 102d is that it has a predetermined second tilt. The fifth evaluation condition 102e is that the hand image has a shape with all fingers closed, and the sixth evaluation condition 102f is that it has a shape with all fingers spread to the maximum. Finally, the seventh evaluation condition 102g is that each finger of the hand image is bent to a predetermined angle.
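For illustration only, these seven conditions can be written down as a small registry; the identifiers and descriptions below merely restate conditions 102a to 102g, and any scoring logic would live elsewhere.

```python
# Hypothetical registry of the evaluation conditions 102a-102g; each entry pairs an
# identifier with the property the captured hand image should match.
EVAL_CONDITIONS = [
    ("102a", "hand image has the predetermined first size"),
    ("102b", "hand image has the predetermined second size"),
    ("102c", "hand image has the predetermined first tilt"),
    ("102d", "hand image has the predetermined second tilt"),
    ("102e", "hand shape with all fingers closed"),
    ("102f", "hand shape with all fingers spread to the maximum"),
    ("102g", "hand shape with each finger bent to a predetermined angle"),
]
FuncNum = len(EVAL_CONDITIONS)   # number of evaluation conditions (7 in this example)
```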
 From the evaluation values output by the evaluation unit 12 for the biometric images 101, the evaluation value extraction unit 13 extracts, for each evaluation condition 102, the evaluation values that indicate a degree of matching equal to or higher than a predetermined evaluation threshold. When multiple evaluation values are extracted for the same evaluation condition 102, the evaluation value extraction unit 13 further extracts, from those values, the one with the highest degree of matching with that condition. The extracted biometric images 103 in FIG. 2 are the biometric images 101, one per evaluation condition 102, that correspond to the evaluation values extracted in this way.
 When the biometric authentication system 1 includes the evaluation result check unit 15, the evaluation result check 104 in FIG. 2 is performed. In this check, it is determined whether there is an evaluation condition 102 for which no evaluation value was extracted by the evaluation value extraction unit 13. If such an evaluation condition 102 exists, the biometric image acquisition unit 11 acquires biometric images 101 again, and the evaluation by the evaluation unit 12 and the extraction by the evaluation value extraction unit 13 are also performed again. Through this evaluation result check 104, evaluation values are eventually extracted for all the evaluation conditions 102, so that extracted biometric images 103 are obtained for all of them. When the biometric authentication system 1 further includes the guidance output unit 16, the guidance output unit 16 outputs, in the above case, guidance for guiding the biological part of the person to be authenticated so that the biometric image acquisition unit 11 can obtain a biometric image 101 that satisfies the missing evaluation condition 102.
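A hedged sketch of this registration loop with the evaluation result check 104 follows. The helpers capture_images, evaluate, extract_best, and show_guidance are hypothetical stand-ins for the units 11, 12, 13, and 16; only the control flow, repeating acquisition until every evaluation condition has an extracted image, follows the description.

```python
def register(capture_images, evaluate, extract_best, conditions, show_guidance=None):
    """Repeat acquisition until an image has been extracted for every condition."""
    registered = {}                                  # condition index -> selected biometric image
    while True:
        images = capture_images()                    # biometric image acquisition unit 11
        scores = [evaluate(img, conditions) for img in images]    # evaluation unit 12
        registered.update(extract_best(scores, images))           # evaluation value extraction unit 13
        missing = [c for c in range(len(conditions)) if c not in registered]
        if not missing:
            return registered                        # kept as registered biometric information 105
        if show_guidance is not None:                # optional guidance output unit 16
            show_guidance(missing)                   # ask for the still-missing poses
```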
 The extracted biometric images 103 obtained for each evaluation condition 102 in this way are stored in the registered biometric information storage unit 14 as the registered biometric information 105 of the person to be authenticated. The authentication unit 20 determines the level of similarity between the registered biometric information stored in this way and the authentication biometric information acquired at authentication, and verifies the identity of the person to be authenticated based on the determination result.
 Next, the operation of the evaluation unit 12 for each of the evaluation conditions 102 illustrated in FIG. 2 will be described.
 First, FIG. 3 will be described. FIG. 3 is a first example of the correspondence between a biometric image 101 and an evaluation value, namely the correspondence between the size of the hand image appearing in the biometric image 101 and the evaluation value. This first example can be used for the first evaluation condition 102a and the second evaluation condition 102b described above, that is, when the evaluation condition 102 includes, as the condition associated with the evaluation value, information on the size of the hand image of the person to be authenticated in the biometric image 101.
 In FIG. 3, the horizontal axis represents information on the size of the hand image in the biometric image 101, more specifically the area of the hand image, and the vertical axis represents the evaluation value for that area. In this graph, the closer the area of the hand image is to a predetermined value set in advance as the evaluation condition 102, the larger the evaluation value associated with it; the evaluation value therefore indicates a high degree of matching with the size information included in the evaluation condition 102. The evaluation unit 12 is provided with a table, corresponding to the graph of FIG. 3, that represents this correspondence between the area of the hand image and the evaluation value.
 The area of the hand image in the biometric image 101 can be estimated, for example, by counting the number of pixels of the biometric image 101 whose luminance values are judged to represent the hand. The evaluation unit 12 refers to the above table, obtains from it the evaluation value associated with the area estimated in this way, and outputs that evaluation value.
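A minimal sketch of this size evaluation, assuming the biometric image is a 2-D NumPy array of grey levels, that a simple brightness threshold separates hand pixels from the background, and that a triangular score profile stands in for the table of FIG. 3.

```python
import numpy as np

def hand_area(image, brightness_threshold=128):
    """Estimate the hand area by counting pixels judged to belong to the hand."""
    return int(np.count_nonzero(image >= brightness_threshold))

def size_evaluation(image, target_area, tolerance):
    """Largest when the hand area equals the predetermined target area,
    falling off linearly (a stand-in for the table of FIG. 3)."""
    deviation = abs(hand_area(image) - target_area)
    return max(0.0, 1.0 - deviation / tolerance)
```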
 Next, FIG. 4 will be described. FIG. 4 is a second example of the correspondence between a biometric image 101 and an evaluation value, namely the correspondence between the tilt of the hand image in the biometric image 101 and the evaluation value. This second example can be used for the third evaluation condition 102c and the fourth evaluation condition 102d described above, that is, when the evaluation condition 102 includes, as the condition associated with the evaluation value, information on the tilt of the hand image of the person to be authenticated in the biometric image 101.
 In FIG. 4, the horizontal axis represents information on the tilt of the hand image in the biometric image 101, more specifically its tilt angle, and the vertical axis represents the evaluation value for that angle. In this graph, the closer the tilt angle of the hand image is to a predetermined value set in advance as the evaluation condition 102, the larger the evaluation value associated with it; the evaluation value therefore indicates a high degree of matching with the tilt information included in the evaluation condition 102. The evaluation unit 12 is provided with a table, corresponding to the graph of FIG. 4, that represents this correspondence between the tilt angle of the hand image and the evaluation value.
 The tilt angle of the hand image in the biometric image 101 can be estimated, for example, as shown in FIG. 5. In this method, feature points of the hand image are first extracted from the biometric image 101 (step 1), and a plane imitating the palm is then formed based on the positional relationship of those feature points (step 2). The rotation angles of this plane with respect to the image plane of the biometric image 101 are then calculated about the X axis (the horizontal axis through the center of the biometric image 101) and the Y axis (the vertical axis through the center of the biometric image 101) (step 3). The evaluation unit 12 refers to the above table, obtains from it the two evaluation values associated with the two estimated rotation angles, and outputs, for example, their average as the evaluation value.
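A hedged sketch of the tilt evaluation of FIGS. 4 and 5: a plane is fitted to the extracted palm feature points by least squares, its slopes are converted into rotations about the image X and Y axes, and the two per-axis scores are averaged. The feature-point extraction itself is assumed to exist elsewhere, and the linear score profile again stands in for the lookup table.

```python
import numpy as np

def palm_plane_angles(points):
    """Fit z = a*x + b*y + c to 3-D palm feature points (an N x 3 array) by least
    squares and return the plane's rotations about the image X and Y axes."""
    design = np.c_[points[:, 0], points[:, 1], np.ones(len(points))]
    (a, b, _), *_ = np.linalg.lstsq(design, points[:, 2], rcond=None)
    angle_y = np.degrees(np.arctan(a))   # slope along x -> rotation about the Y axis
    angle_x = np.degrees(np.arctan(b))   # slope along y -> rotation about the X axis
    return angle_x, angle_y

def tilt_evaluation(points, target_x, target_y, tolerance):
    """Average of the two per-axis scores, each largest at its target angle."""
    angle_x, angle_y = palm_plane_angles(points)
    def score(angle, target):
        return max(0.0, 1.0 - abs(angle - target) / tolerance)
    return 0.5 * (score(angle_x, target_x) + score(angle_y, target_y))
```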
 Next, FIG. 6 will be described. FIG. 6 is a third example of the correspondence between a biometric image 101 and an evaluation value, namely the correspondence between the shape of the hand image in the biometric image 101 and the evaluation value. This third example can be used for the fifth evaluation condition 102e and the sixth evaluation condition 102f described above, that is, when the evaluation condition 102 includes, as the condition associated with the evaluation value, information on the shape of the hand image of the person to be authenticated in the biometric image 101.
 In FIG. 6, the horizontal axis represents information on the shape of the hand image in the biometric image 101, more specifically an index value representing the degree of finger opening in the hand image, and the vertical axis represents the evaluation value for that index value. In this graph, the closer the index value of the hand image is to a predetermined value set in advance as the evaluation condition 102, the larger the evaluation value associated with it; the evaluation value therefore indicates a high degree of matching with the shape information included in the evaluation condition 102. The evaluation unit 12 is provided with a table, corresponding to the graph of FIG. 6, that represents this correspondence between the index value and the evaluation value.
 The index value representing the degree of finger opening in the hand image in the biometric image 101 can be estimated, for example, as shown in FIG. 7. In this method, the direction of each finger from its base in the hand image is first extracted from the biometric image 101, and the angular differences between the directions of adjacent fingers are calculated as the finger opening angles θ1, θ2, θ3, and θ4 (step 1). Next, index values for evaluating the degree of finger opening are calculated based on these opening angles (step 2). This evaluation based on the finger opening angles is performed by the evaluation unit 12, for example, as follows.
 First, as an index value representing the overall magnitude of the finger opening angles, the evaluation unit 12 calculates the sum of squares θ1² + θ2² + θ3² + θ4² as a first index value I1. Next, as an index value representing how evenly the four fingers other than the thumb are spread, the evaluation unit 12 calculates (θ2 − θ3)² + (θ3 − θ4)² + (θ4 − θ2)² as a second index value I2. The evaluation unit 12 is provided in advance with two tables representing the correspondence between the two index values I1 and I2 of the hand image and their evaluation values. Referring to these tables, the evaluation unit 12 obtains the two evaluation values E1 and E2 associated with the index values I1 and I2, computes E1 − E2, and outputs the result as the evaluation value of the degree of finger opening.
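The index values I1 and I2 and the output E1 − E2 can be written down directly from the formulas above. The sketch assumes the four opening angles θ1 to θ4 have already been measured, and uses simple linear score profiles in place of the two lookup tables held by the evaluation unit 12.

```python
def finger_opening_evaluation(theta, target_i1, target_i2, tol_i1, tol_i2):
    """theta = (θ1, θ2, θ3, θ4): opening angles between adjacent fingers."""
    t1, t2, t3, t4 = theta
    i1 = t1**2 + t2**2 + t3**2 + t4**2               # overall spread: θ1²+θ2²+θ3²+θ4²
    i2 = (t2 - t3)**2 + (t3 - t4)**2 + (t4 - t2)**2  # evenness of the non-thumb gaps
    def score(value, target, tol):                   # stand-in for the lookup tables
        return max(0.0, 1.0 - abs(value - target) / tol)
    e1 = score(i1, target_i1, tol_i1)                # E1 from the I1 table
    e2 = score(i2, target_i2, tol_i2)                # E2 from the I2 table
    return e1 - e2                                   # E1 - E2, output as the evaluation value
```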
 When the evaluation unit 12 performs an evaluation based on the seventh evaluation condition 102g, it outputs, for example, an evaluation value according to how close the length of each finger from its base in the hand image of the biometric image 101 is to a predetermined value set in advance as the evaluation condition 102.
 The evaluation unit 12 may also evaluate the biometric image 101 from other viewpoints. For example, whether the hand image, which is the biological part of the person to be authenticated, is offset from the center of the biometric image 101 may be evaluated based on the positional relationship between that center and the centroid of the group of pixels whose luminance values are judged to represent the hand.
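For this additional viewpoint, a hedged sketch of a center-offset evaluation: the brightness-threshold test for hand pixels and the linear score profile are the same illustrative assumptions used above.

```python
import numpy as np

def center_offset_evaluation(image, brightness_threshold=128, tolerance=50.0):
    """Score how close the centroid of the hand pixels is to the image center."""
    ys, xs = np.nonzero(image >= brightness_threshold)   # pixels assumed to be the hand
    if len(xs) == 0:
        return 0.0                                       # no hand detected at all
    centroid = np.array([xs.mean(), ys.mean()])
    center = np.array([image.shape[1] / 2.0, image.shape[0] / 2.0])
    offset = float(np.linalg.norm(centroid - center))
    return max(0.0, 1.0 - offset / tolerance)
```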
Next, the benefit of storing the registered biometric information of the person to be authenticated in the registered biometric information storage unit 14 as described above, in the biometric authentication system 1 of FIG. 1, will be described with reference to FIG. 8.
Among the various example images illustrated in FIG. 8, each showing a hand image as the biometric information of the person to be authenticated, the registered biometric information 105 arranged in the lower part of FIG. 8 is what is extracted from the biometric images 101 by the operation of the biometric authentication system 1 of FIG. 1 and stored in the registered biometric information storage unit 14. On the other hand, the registered biometric information 100 of the conventional method, arranged in the upper part of FIG. 8, is what is selected from the biometric images 101 by the second method described in the background section, that is, a combination of mutually dissimilar images obtained by evaluating the similarity between the biometric images 101. The authentication biometric information 106 is what is acquired at the time of authentication; here it is assumed that the same authentication biometric information 106 is obtained with the conventional method and with the biometric authentication system 1 of FIG. 1.
The position, posture, and shape of the body part shown in the biometric image 101 vary. However, if the biometric image 101 is captured without giving the person to be authenticated any particular guidance, the probability with which a biometric image 101 of a given position, posture, and shape is obtained can be roughly determined by experiments on a large number of subjects. The probability values attached to the biometric images 101 in FIG. 8 are examples of values obtained by such experiments; images of some positions, postures, and shapes are obtained with relatively high probability, while images of other positions, postures, and shapes are obtained only with relatively low probability.
Under these circumstances, if the second method described above, that is, the method of selecting a combination of mutually dissimilar images from the biometric images 101, is adopted, an image showing the body part in a position, posture, and shape obtained only with low probability may end up being selected as the registered biometric information 100. With the registered biometric information 100 of such a conventional method, the position, posture, and shape of the body-part image are naturally likely to differ from those of the authentication biometric information 106, so degradation of the accuracy of authentication between the registered biometric information 100 and the authentication biometric information 106 is a concern. The example in the upper part of FIG. 8 indicates that, with the conventional method, the position, posture, and shape of the body-part image are similar between the registered biometric information 100 and the authentication biometric information 106 in 15% of cases and dissimilar in 80% of cases.
In contrast, in the biometric authentication system 1 of FIG. 1, the evaluation condition 102 can be set so that images showing the body part in a position, posture, and shape obtained with relatively high probability in the above-mentioned experiments are extracted. If the biometric images 101 extracted in this way are used as the registered biometric information 105, the position, posture, and shape of the body-part image are likely to be similar to those of the authentication biometric information 106, so accurate authentication between the registered biometric information 105 and the authentication biometric information 106 can be expected. The example in the lower part of FIG. 8 indicates that the position, posture, and shape of the body-part image are similar between the registered biometric information 105 and the authentication biometric information 106 in 70% of cases and dissimilar in 25% of cases.
As described above, in the biometric authentication system 1 of FIG. 1, setting an evaluation condition 102 that yields a higher evaluation value for positions, postures, and shapes of the body part that the person to be authenticated is more likely to present at the time of verification increases the likelihood of performing biometric authentication with high accuracy. Therefore, by using the biometric authentication system 1 of FIG. 1, authentication suited to the actual usage scene can be performed with high accuracy.
Instead of setting the evaluation condition 102 as described above, the evaluation condition 102 may be set so that images of the body part are extracted in positions, postures, and shapes that the algorithm used by the authentication unit 20 to match the registered biometric information against the authentication biometric information handles well. Furthermore, the evaluation condition 102 may be set so that images of the body part in positions, postures, and shapes that this matching algorithm handles poorly are not extracted.
Next, FIG. 9 will be described. FIG. 9 illustrates an example of a specific configuration of the biometric information registration unit 10 in the biometric authentication system 1 of FIG. 1. In this example, the biometric information registration unit 10 includes an imaging unit 31, an imaging control unit 32, a biometric image temporary storage unit 33, an image evaluation unit 34, a registration candidate storage unit 35, an evaluation result check unit 36, a feature extraction unit 37, and a registered biometric information storage unit 38.
The imaging unit 31 is a camera that captures the body part of the person to be authenticated and acquires the biometric image 101. The imaging control unit 32 controls the imaging unit 31 so that a plurality of biometric images 101 of the person to be authenticated are acquired. The imaging unit 31 and the imaging control unit 32 provide the function of the biometric image acquisition unit 11 in FIG. 1.
The biometric image temporary storage unit 33 is a storage unit that temporarily stores the biometric images 101 acquired by the imaging unit 31.
The image evaluation unit 34 first reads the biometric image 101 stored in the biometric image temporary storage unit 33, evaluates it against each of a plurality of predetermined evaluation conditions 102, and obtains an evaluation value of the biometric image 101 for each evaluation condition 102. Next, from the obtained evaluation values 107, the image evaluation unit 34 extracts, for each evaluation condition 102, the evaluation values 107 that indicate a degree of match equal to or higher than a predetermined evaluation threshold. When this extraction yields a plurality of evaluation values for the same evaluation condition 102, the image evaluation unit 34 further extracts, from the plurality of extracted evaluation values, the evaluation value 107 indicating the highest degree of match with that evaluation condition 102. The image evaluation unit 34 then reads from the biometric image temporary storage unit 33 the biometric image 101 for which the extracted evaluation value 107 was obtained, and temporarily stores it in the registration candidate storage unit 35 in association with the evaluation value 107, for each evaluation condition 102. The image evaluation unit 34 provides the functions of the evaluation unit 12 and the evaluation value extraction unit 13 in FIG. 1.
The registration candidate storage unit 35 is a storage unit that temporarily stores, for each evaluation condition 102, the evaluation value 107 extracted by the image evaluation unit 34 in association with the biometric image 101 for which that evaluation value 107 was obtained.
The evaluation result check unit 36 determines whether there is any evaluation condition 102 for which no evaluation value 107 is stored in the registration candidate storage unit 35. If it determines that such an evaluation condition 102 exists, the evaluation result check unit 36 instructs the imaging control unit 32 to control the imaging unit 31 so that a further biometric image 101 is acquired. The biometric image 101 acquired again in this way is stored in the biometric image temporary storage unit 33, is then evaluated by the image evaluation unit 34 as described above to obtain an evaluation value 107 for each evaluation condition 102, and is stored in the registration candidate storage unit 35 on the basis of the evaluation value 107. The evaluation result check unit 36 repeats this operation until no evaluation condition 102 remains for which no evaluation value 107 is stored in the registration candidate storage unit 35. The evaluation result check unit 36 provides the function of the evaluation result check unit 15 in FIG. 1.
After the above operation of the evaluation result check unit 36 is completed, the feature extraction unit 37 first reads from the registration candidate storage unit 35 all the combinations of the biometric image 101 and the evaluation value 107 for the respective evaluation conditions 102. It then stores these combinations in the registered biometric information storage unit 38 as biometric features 108.
The registered biometric information storage unit 38 is a storage unit that stores, as registered biometric information, the biometric features 108, each of which is a combination of an evaluation value 107 extracted by the image evaluation unit 34 and the biometric image 101 for which that evaluation value 107 was obtained. The registered biometric information storage unit 38 corresponds to the registered biometric information storage unit 14 in FIG. 1.
The biometric information registration unit 10 of FIG. 9 is configured as described above. Some of the components of the biometric information registration unit 10 illustrated in FIG. 9 can be implemented using a computer of standard configuration.
FIG. 10 will now be described. FIG. 10 illustrates an example of the configuration of such a computer.
The computer 40 includes an MPU 41, a ROM 42, a RAM 43, a hard disk device 44, an input device 45, a display device 46, an interface device 47, and a recording medium drive device 48. These components are connected via a bus line 49 and can exchange various data with one another under the management of the MPU 41.
The MPU (Micro Processing Unit) 41 is an arithmetic processing unit that controls the operation of the entire computer 40.
The ROM (Read Only Memory) 42 is a read-only semiconductor memory in which a predetermined basic control program is recorded in advance. By reading out and executing this basic control program when the computer 40 starts up, the MPU 41 becomes able to control the operation of each component of the computer 40.
The RAM (Random Access Memory) 43 is a semiconductor memory, writable and readable at any time, that the MPU 41 uses as a working storage area as needed when executing various control programs.
The hard disk device 44 is a storage device that stores the various control programs executed by the MPU 41 and various data. The MPU 41 performs various control processes by reading out and executing a predetermined control program stored in the hard disk device 44.
The input device 45 is, for example, a keyboard or a mouse. When operated by, for example, the administrator of the biometric authentication system 1 of FIG. 1, it acquires the input of various information from the administrator associated with that operation and sends the acquired input information to the MPU 41.
The display device 46 is, for example, a liquid crystal display, and displays various texts and images according to display data sent from the MPU 41.
The interface device 47 manages the exchange of various information with the various devices connected to the computer 40.
The recording medium drive device 48 is a device that reads the various control programs and data recorded on a portable recording medium 50. The MPU 41 can also perform the various control processes described later by reading out, via the recording medium drive device 48, a predetermined control program recorded on the portable recording medium 50 and executing it. Examples of the portable recording medium 50 include a CD-ROM (Compact Disc Read Only Memory), a DVD-ROM (Digital Versatile Disc Read Only Memory), and a flash memory provided with a USB (Universal Serial Bus) connector.
To configure the biometric information registration unit 10 of FIG. 9 using such a computer 40, for example, a control program for causing the MPU 41 to perform the biometric information registration process described next is created. The created control program is stored in advance in the hard disk device 44 or on the portable recording medium 50. In this program, for example, the hard disk device 44 is made to function as the biometric image temporary storage unit 33, the registration candidate storage unit 35, and the registered biometric information storage unit 38. In addition, a camera, which is an example of the imaging unit 31, is connected to the interface device 47 of the computer 40 so that the computer 40 can control the imaging unit 31 to acquire the biometric image 101 and take it in. Then, a predetermined instruction is given to the MPU 41 to read out and execute this control program. In this way, the computer 40 can provide the functions of the imaging control unit 32, the biometric image temporary storage unit 33, the image evaluation unit 34, the registration candidate storage unit 35, the evaluation result check unit 36, the feature extraction unit 37, and the registered biometric information storage unit 38.
Next, FIG. 11 will be described. FIG. 11 is a flowchart illustrating an example of the biometric information registration process performed by the biometric information registration unit 10 of FIG. 9.
When this biometric information registration process starts, first, in S101 of FIG. 11, the imaging control unit 32 controls the imaging unit 31 to capture the body part of the person to be authenticated and acquire the biometric image 101, and stores the image in the biometric image temporary storage unit 33. In FIG. 11, the biometric image 101 stored in the biometric image temporary storage unit 33 at this point is denoted "CaptImage". This notation is also used in the following description.
Next, in S102, the image evaluation unit 34 assigns the initial value "0" to a variable i.
Next, in S103, the image evaluation unit 34 evaluates "CaptImage" against the evaluation condition 102 that corresponds to the current value of the variable i among the plurality of predetermined evaluation conditions 102, and obtains the evaluation value that is the result of this evaluation.
In FIG. 11, the evaluation condition 102 corresponding to the variable i among the plurality of evaluation conditions 102 is denoted "Func[i]", and the evaluation value obtained by evaluating "CaptImage" with this "Func[i]" is denoted "Result[i]". This notation is also used in the following description.
Next, in S104, the image evaluation unit 34 determines whether "Result[i]" obtained by the most recent execution of S103 is larger than the evaluation threshold "Thresh[i]" predetermined for "Func[i]", that is, whether the evaluation value is higher than the predetermined value. If the image evaluation unit 34 determines that "Result[i]" obtained in the most recent S103 is larger than "Thresh[i]" (the determination result is Yes), the process proceeds to S105. On the other hand, if the image evaluation unit 34 determines that "Result[i]" obtained in the most recent S103 is equal to or smaller than "Thresh[i]" (the determination result is No), the process proceeds to S108.
Next, in S105, the image evaluation unit 34 determines whether a biometric image 101 is already stored in the predetermined storage location of the registration candidate storage unit 35 that holds the biometric image 101 with the highest evaluation value for "Func[i]".
In FIG. 11, the biometric image 101 with the highest evaluation value for "Func[i]" is denoted "TempImage[i]", and this notation is also used in the following description.
In the determination of S105, if the image evaluation unit 34 determines that no "TempImage[i]" is stored at the predetermined storage location in the registration candidate storage unit 35 (the location is empty; the determination result is Yes), the process proceeds to S107. On the other hand, if the image evaluation unit 34 determines that "TempImage[i]" is stored at the predetermined storage location in the registration candidate storage unit 35 (the determination result is No), the process proceeds to S106.
In S106, the image evaluation unit 34 determines whether "Result[i]" obtained by the most recent execution of S103 is larger than the evaluation value for "Func[i]" stored in the registration candidate storage unit 35 at this point.
In FIG. 11, the evaluation value for "Func[i]" stored in the registration candidate storage unit 35 at this point is denoted "TempResult[i]", and this notation is also used in the following description.
In the determination of S106, if the image evaluation unit 34 determines that "Result[i]" obtained in the most recent S103 is larger than "TempResult[i]" (the determination result is Yes), the process proceeds to S107. On the other hand, if the image evaluation unit 34 determines that "Result[i]" obtained in the most recent S103 is equal to or smaller than "TempResult[i]" (the determination result is No), the process proceeds to S108.
In S107, the image evaluation unit 34 stores "CaptImage" in the registration candidate storage unit 35 as the new "TempImage[i]", and stores "Result[i]" obtained by the most recent execution of S103 in the registration candidate storage unit 35 as the new "TempResult[i]".
In S108, the image evaluation unit 34 adds "1" to the current value of the variable i and assigns the result back to the variable i.
In S109, the image evaluation unit 34 determines whether the value of the variable i at this point is equal to or greater than FuncNum, the number of evaluation conditions 102. If the image evaluation unit 34 determines that the value of the variable i is equal to or greater than FuncNum (the determination result is Yes), the process proceeds to S110. On the other hand, if the image evaluation unit 34 determines that the value of the variable i is less than FuncNum (the determination result is No), the process returns to S103, and the processing from S103 onward is performed again under the current value of the variable i.
In S110, the evaluation result check unit 36 determines whether a biometric image 101 is stored at every storage location of "TempImage[i]" (where i = 0, ..., (FuncNum-1)) in the registration candidate storage unit 35. If the evaluation result check unit 36 determines that a biometric image 101 is stored at every storage location of "TempImage[i]" in the registration candidate storage unit 35 (the determination result is Yes), the process proceeds to S111. On the other hand, if any storage location of "TempImage[i]" in the registration candidate storage unit 35 remains without a stored biometric image 101 (the determination result is No), the process returns to S101. In this case, the processing from S101 onward is repeated until a biometric image 101 is stored at every storage location of "TempImage[i]" in the registration candidate storage unit 35.
In S111, the feature extraction unit 37 reads all the combinations of "TempImage[i]" and "TempResult[i]" (where i = 0, ..., (FuncNum-1)) stored in the registration candidate storage unit 35 by the processing of S107, and stores them in the registered biometric information storage unit 38. In this step, the feature extraction unit 37 also stores information identifying "Func[i]" in the registered biometric information storage unit 38 in association with each combination. The biometric information registration process then ends.
The processing described above is the biometric information registration process. By performing this process, the evaluation value extracted by the image evaluation unit 34, the evaluation condition 102 under which the image evaluation unit 34 output that evaluation value, and the biometric image 101 that was evaluated are associated with one another and stored in the registered biometric information storage unit 38 as a biometric feature 108.
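To make the flow of S101 to S111 easier to follow, a minimal Python sketch of the registration loop is given below; capture_image, the evaluation functions, and the thresholds are hypothetical stand-ins for the imaging unit 31, "Func[i]", and "Thresh[i]", and the concrete values are illustrative only.

# Minimal sketch of the registration flow of FIG. 11 (S101-S111).
import random

def register(capture_image, funcs, thresholds):
    # funcs[i] plays the role of Func[i]; thresholds[i] plays Thresh[i].
    func_num = len(funcs)
    temp_image = [None] * func_num                      # TempImage[i]
    temp_result = [None] * func_num                     # TempResult[i]
    while any(img is None for img in temp_image):       # S110
        capt_image = capture_image()                    # S101
        for i in range(func_num):                       # S102-S109
            result = funcs[i](capt_image)               # S103
            if result <= thresholds[i]:                 # S104
                continue
            if temp_image[i] is None or result > temp_result[i]:  # S105/S106
                temp_image[i] = capt_image              # S107
                temp_result[i] = result
    # S111: store (condition id, image, evaluation value) as registered data.
    return [(i, temp_image[i], temp_result[i]) for i in range(func_num)]

# Toy example: "images" are random numbers and each condition scores
# closeness to a different target value.
dummy_capture = lambda: random.random()
dummy_funcs = [lambda img, c=c: 1.0 - abs(img - c) for c in (0.2, 0.5, 0.8)]
print(register(dummy_capture, dummy_funcs, [0.6, 0.6, 0.6]))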
The function of the biometric image acquisition unit 11 in FIG. 1 is provided by the imaging control unit 32 executing the processing of S101 in FIG. 11. The functions of the evaluation unit 12 and the evaluation value extraction unit 13 in FIG. 1 are provided by the image evaluation unit 34 executing the processing of S102 to S109 in FIG. 11. Furthermore, the function of the evaluation result check unit 15 in FIG. 1 is provided by the evaluation result check unit 36 executing the processing of S110 in FIG. 11, and the function of the registered biometric information storage unit 14 in FIG. 1 is provided by the feature extraction unit 37 executing the processing of S111.
In the biometric information registration unit 10 of FIG. 9, the biometric images 101 acquired by the imaging unit 31 are temporarily stored in the biometric image temporary storage unit 33. The image evaluation unit 34 may evaluate each biometric image 101 every time the imaging unit 31 acquires one, or the imaging unit 31 may acquire a plurality of biometric images 101 and the image evaluation unit 34 may then evaluate the plurality of biometric images 101 together.
In the flowchart illustrated in FIG. 11, the evaluation of "CaptImage" by each "Func[i]" is performed sequentially by incrementing the value of the variable i by "1" in the processing of S108. Instead, the evaluations of "CaptImage" by the individual "Func[i]" may be executed together as parallel processing. For example, when an execution environment such as a multi-core or multi-processor machine or a cloud system is used, such parallel processing can be expected to shorten the processing time.
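The parallel variant mentioned above could look roughly like the following Python sketch, which evaluates all conditions concurrently with a standard thread pool; the evaluation functions are hypothetical stand-ins for "Func[i]".

# Minimal sketch of evaluating all conditions for one captured image in
# parallel instead of looping over i sequentially.
from concurrent.futures import ThreadPoolExecutor

def evaluate_all_parallel(capt_image, funcs):
    # Returns [Result[0], Result[1], ...] computed concurrently.
    with ThreadPoolExecutor() as pool:
        return list(pool.map(lambda f: f(capt_image), funcs))

funcs = [lambda img, c=c: 1.0 - abs(img - c) for c in (0.2, 0.5, 0.8)]
print(evaluate_all_parallel(0.45, funcs))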
 また、図9の評価結果チェック部36は、画像評価部34による評価条件102ごとの評価値107の抽出において、全ての評価条件102についての評価値107の抽出がされるまで、更なる生体画像101の取得を撮像部31に行わせている。この代わりに、評価値107の抽出がなかった評価条件102が存在していても、その評価値107の個数が所定数RegistNum に達した場合には、評価結果チェック部36が撮像部31による生体画像101の取得のための制御を終了させるようにしてもよい。 Moreover, the evaluation result check unit 36 in FIG. 9 further extracts the biological image until the evaluation values 107 for all the evaluation conditions 102 are extracted in the extraction of the evaluation values 107 for each evaluation condition 102 by the image evaluation unit 34. 101 is acquired by the imaging unit 31. Instead, even if the evaluation condition 102 for which the evaluation value 107 has not been extracted exists, if the number of the evaluation values 107 reaches a predetermined number RegistNum, the evaluation result check unit 36 uses the imaging unit 31 The control for acquiring the image 101 may be terminated.
 ここで図12について説明する。図12は、図11のフローチャートの第一変形例であり、上述した動作を評価結果チェック部36に行わせる場合のものである。このS120の判定処理は、図11のフローチャートにおけるS110の判定処理を置き換えるものである。 Here, FIG. 12 will be described. FIG. 12 is a first modified example of the flowchart of FIG. 11, in which the above-described operation is performed by the evaluation result check unit 36. This determination process of S120 replaces the determination process of S110 in the flowchart of FIG.
 このS120の判定処理は、図11のS109の判定処理の結果がYesのときに評価結果チェック部36により実行される処理である。評価結果チェック部36は、この処理において、登録候補記憶部35における『TempImage[i]』(但し、i=0、…、(FuncNum-1))の各格納位置に格納されている生体画像101がRegistNum枚以上存在するか否かを判定する処理を行う。ここで、生体画像101がRegistNum枚以上格納されていると評価結果チェック部36が判定したとき(判定結果がYesのとき)には図11のS111に処理が進む。一方、格納されている生体画像101がRegistNum個よりも少ないと評価結果チェック部36が判定したとき(判定結果がNoのとき)には図11のS101に処理が戻る。この場合には、登録候補記憶部35に生体画像101がRegistNum枚以上格納されるまで、上述したS101以降の処理が繰り返される。 The determination process in S120 is a process executed by the evaluation result check unit 36 when the result of the determination process in S109 in FIG. 11 is Yes. In this process, the evaluation result check unit 36 stores the biological image 101 stored in each storage position of “TempImage [i]” (where i = 0,..., (FuncNum−1)) in the registration candidate storage unit 35. To determine whether there are more than RegistNum copies. Here, when the evaluation result check unit 36 determines that at least RegistNum biometric images 101 are stored (when the determination result is Yes), the process proceeds to S111 in FIG. On the other hand, when the evaluation result check unit 36 determines that the number of stored biological images 101 is less than RegistNum (when the determination result is No), the process returns to S101 in FIG. In this case, the processing from S101 described above is repeated until at least RegistNum biometric images 101 are stored in the registration candidate storage unit 35.
 なお、このS120の処理の実行時には、続くS111では、特徴抽出部37が、その時点で登録候補記憶部35に格納されている全ての『TempImage[i]』と『TempResult[i] 』との組み合わせを読み出して、登録生体情報記憶部38に記憶させる処理を行う。但し、i=0,…,(RegistNum-1)である。 When the process of S120 is executed, in subsequent S111, the feature extraction unit 37 stores all “TempImage [i]” and “TempResult [i]” stored in the registration candidate storage unit 35 at that time. The combination is read out and stored in the registered biometric information storage unit 38. However, i = 0,..., (RegistNum-1).
The feature extraction unit 37 of FIG. 9 reads all the combinations of the biometric image 101 and the evaluation value 107 for the respective evaluation conditions 102 from the registration candidate storage unit 35 and stores them in the registered biometric information storage unit 38. Instead, the feature extraction unit 37 may store in the registered biometric information storage unit 38 only a predetermined number RegistNum of these combinations of the biometric image 101 and the evaluation value 107, taken in descending order of the evaluation value 107 (that is, in descending order of the degree of match with the evaluation condition 102).
FIG. 13 will now be described. FIG. 13 shows a second modification of the flowchart of FIG. 11 for the case where the feature extraction unit 37 performs the operation described above. The process of S121 replaces the process of S111 in the flowchart of FIG. 11.
The process of S121 is executed by the feature extraction unit 37 when the result of the determination process of S110 in FIG. 11 is Yes. In this process, the feature extraction unit 37 first reads all the combinations of "TempImage[i]" and "TempResult[i]" (where i = 0, ..., (FuncNum-1)) stored in the registration candidate storage unit 35 by the processing of S107. The feature extraction unit 37 then stores in the registered biometric information storage unit 38 only the RegistNum combinations with the largest values of "TempResult[i]". In this step, the feature extraction unit 37 also stores information identifying "Func[i]" in the registered biometric information storage unit 38 in association with each combination. The biometric information registration process of FIG. 11 then ends.
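The selection performed in S121 amounts to keeping the top RegistNum candidates by evaluation value; a minimal Python sketch follows, where the (condition id, image, value) tuples are a hypothetical representation of the registration candidate storage.

# Minimal sketch of S121: keep only the RegistNum candidates with the
# highest evaluation values (TempResult[i]).

def select_top_candidates(candidates, regist_num):
    # candidates: list of (func_id, image, temp_result) tuples.
    ranked = sorted(candidates, key=lambda c: c[2], reverse=True)
    return ranked[:regist_num]

candidates = [(0, "img_a", 0.91), (1, "img_b", 0.72), (2, "img_c", 0.85)]
print(select_top_candidates(candidates, 2))   # keeps the entries for Func[0] and Func[2]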
Next, FIG. 14 will be described. FIG. 14 illustrates an example of a specific configuration of the authentication unit 20 in the biometric authentication system 1 of FIG. 1. In this example, the authentication unit 20 includes an imaging unit 51, an imaging control unit 52, a verification biometric image temporary storage unit 53, an image evaluation unit 54, a feature extraction unit 55, a verification biometric information storage unit 56, an evaluation order determination unit 57, and a feature matching unit 58. The registered biometric information storage unit 38 in FIG. 14 is the one provided in the biometric information registration unit 10 of FIG. 9, and the biometric features 108 described above are stored in it as registered biometric information.
The imaging unit 51 is a camera that captures the body part of the person to be authenticated at the time of authentication and acquires the verification biometric image 111. The imaging control unit 52 controls the imaging unit 51 so that a plurality of verification biometric images 111 of the person to be authenticated are acquired.
The verification biometric image temporary storage unit 53 is a storage unit that temporarily stores the verification biometric images 111 acquired by the imaging unit 51.
The image evaluation unit 54 reads the verification biometric image 111 stored in the verification biometric image temporary storage unit 53, evaluates it against the same evaluation conditions 102 as those used by the image evaluation unit 34 in the biometric information registration unit 10 of FIG. 9, and obtains an evaluation value 112 for each evaluation condition 102. The image evaluation unit 54 then extracts, for each evaluation condition 102, from the obtained evaluation values 112, those evaluation values indicating a degree of match equal to or higher than a predetermined evaluation threshold.
The feature extraction unit 55 reads the verification biometric image 111 stored in the verification biometric image temporary storage unit 53, extracts the verification biometric feature 113, which is information on the features of the body part shown in the verification biometric image 111, and stores it in the verification biometric information storage unit 56.
The verification biometric information storage unit 56 is a storage unit that stores the verification biometric feature 113 extracted from the verification biometric image 111 by the feature extraction unit 55.
The evaluation order determination unit 57 determines the order in which the feature matching unit 58 reads the biometric features 108 from the registered biometric information storage unit 38 and performs the matching operation described later, on the basis of the evaluation values 112 obtained by the image evaluation unit 54 evaluating the verification biometric image 111. More specifically, the evaluation order determination unit 57 first arranges the evaluation values 112 for the respective evaluation conditions 102 in descending order of the degree of match between the verification biometric image 111 and the evaluation condition 102. Next, it replaces the evaluation values 112 arranged in this way with the evaluation conditions 102 corresponding to them, thereby obtaining an ordering of the evaluation conditions 102. It then sets this ordering of the evaluation conditions 102 as the order in which the biometric images 101 associated with those evaluation conditions 102 as biometric features 108 are read from the registered biometric information storage unit 38.
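The order determination just described is essentially a sort by evaluation value; the short Python sketch below illustrates it, assuming the evaluation values 112 for the verification image are held in a hypothetical dictionary keyed by evaluation condition.

# Minimal sketch of the evaluation order determination: conditions whose
# evaluation value for the verification image is higher are matched first.

def determine_matching_order(eval_values):
    # eval_values: {condition_id: evaluation value 112 for that condition}
    return sorted(eval_values, key=eval_values.get, reverse=True)

eval_values = {0: 0.40, 1: 0.93, 2: 0.67}
print(determine_matching_order(eval_values))   # -> [1, 2, 0]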
The feature matching unit 58 first reads the verification biometric feature 113 from the verification biometric information storage unit 56. Next, the feature matching unit 58 reads the biometric images 101 in the biometric features 108 stored in the registered biometric information storage unit 38 in the order determined by the evaluation order determination unit 57, and matches them against the verification biometric feature 113. In this way, the feature matching unit 58 determines whether the similarity between the verification biometric image 111 and the biometric image 101 read from the registered biometric information storage unit 38 is high or low, performs identity verification of the person to be authenticated on the basis of this determination, and outputs the verification result.
Thus, in the authentication unit 20 of FIG. 14, the image evaluation unit 54 evaluates the verification biometric image 111 by applying the same evaluation conditions 102 as the image evaluation unit 34 in the biometric information registration unit 10 of FIG. 9. Then, following the determination by the evaluation order determination unit 57, the feature matching unit 58 gives priority to matching against the biometric images 101 stored in the registered biometric information storage unit 38 in association with the evaluation conditions 102 whose evaluation values 112 indicate a high degree of match with the verification biometric image 111.
In this way, a reduction in the time required for authentication can be expected. This is because high authentication accuracy can be expected between images of a body part in similar positions, postures, and shapes, so that when the genuine person undergoes authentication, the probability of correctly determining that person to be genuine is high and a matching result is in many cases obtained immediately, with a single matching. In matching between images of a body part whose positions, postures, and shapes differ greatly, on the other hand, the authentication accuracy is low, so even when the genuine person undergoes authentication there are many cases where that person cannot be correctly determined to be genuine, and as a result the matching has to be repeated many times.
The authentication unit 20 of FIG. 14 is configured as described above. Some of the components of the authentication unit 20 illustrated in FIG. 14 can be implemented using a computer of standard configuration as illustrated in FIG. 10. To do so, for example, a control program for causing the MPU 41 to perform the matching process described next is created. The created control program is stored in advance in the hard disk device 44 or on the portable recording medium 50. In this program, for example, the hard disk device 44 is made to function as the verification biometric image temporary storage unit 53, the verification biometric information storage unit 56, and the registered biometric information storage unit 38. In addition, a camera, which is an example of the imaging unit 51, is connected to the interface device 47 of the computer 40 so that the computer 40 can control the imaging unit 51 to acquire the verification biometric image 111 and take it in. Then, a predetermined instruction is given to the MPU 41 to read out and execute this control program. In this way, the computer 40 can provide the functions of the imaging control unit 52, the verification biometric image temporary storage unit 53, the image evaluation unit 54, the feature extraction unit 55, the verification biometric information storage unit 56, the evaluation order determination unit 57, the feature matching unit 58, and the registered biometric information storage unit 38.
Next, FIG. 15 will be described. FIG. 15 is a flowchart illustrating an example of the matching process performed by the authentication unit 20 of FIG. 14.
When this matching process starts at the time of authentication, first, in S201 of FIG. 15, the imaging control unit 52 controls the imaging unit 51 to capture the body part of the person to be authenticated and acquire the verification biometric image 111, and stores the image in the verification biometric image temporary storage unit 53. In FIG. 15, the verification biometric image 111 stored in the verification biometric image temporary storage unit 53 at this point is denoted "SampleImage". This notation is also used in the following description.
Next, in S202, the image evaluation unit 54 assigns the initial value "0" to a variable i.
Next, in S203, the image evaluation unit 54 evaluates "SampleImage" against the evaluation condition 102 that corresponds to the current value of the variable i among the same plurality of evaluation conditions 102 used by the image evaluation unit 34 in FIG. 9, and obtains the evaluation value that is the result of this evaluation.
In FIG. 15, the evaluation condition 102 corresponding to the variable i among the plurality of evaluation conditions 102 is denoted "Func[i]", as in FIG. 11, and the evaluation value obtained by evaluating "SampleImage" with this "Func[i]" is denoted "EvalResult[i].value". This notation is also used in the following description.
Next, in S204, the image evaluation unit 54 assigns the value of the variable i at the time this step is executed to "EvalResult[i].index". Note that "EvalResult[i].index" is information identifying the "Func[i]" used when "EvalResult[i].value" was obtained.
In S205, the image evaluation unit 54 adds "1" to the current value of the variable i and assigns the result back to the variable i.
In S206, the image evaluation unit 54 determines whether the value of the variable i at this point is equal to or greater than FuncNum, the number of evaluation conditions 102. If the image evaluation unit 54 determines that the value of the variable i is equal to or greater than FuncNum (the determination result is Yes), the process proceeds to S207. On the other hand, if the image evaluation unit 54 determines that the value of the variable i is less than FuncNum (the determination result is No), the process returns to S203, and the processing from S203 onward is performed again under the current value of the variable i.
In S207, the evaluation order determination unit 57 sorts the combinations of "EvalResult[i].index" and "EvalResult[i].value" (where i = 0, ..., (FuncNum-1)) in descending order of the value of "EvalResult[i].value". In FIG. 15, the combination of "EvalResult[i].index" and "EvalResult[i].value" is denoted "EvalResult[i]". After this sorting, the relation
EvalResult[0].value ≧ EvalResult[1].value ≧ EvalResult[2].value ≧ ... ≧ EvalResult[FuncNum-1].value
holds.
In S208, the feature extraction unit 55 reads "SampleImage" stored in the verification biometric image temporary storage unit 53, extracts the information on the features of the body part shown in that image as the verification biometric feature 113, and stores it in the verification biometric information storage unit 56. In FIG. 15, the verification biometric feature 113 is denoted "SampleFeature", and this notation is also used in the following description.
In S209, the feature matching unit 58 assigns the initial value "0" to a variable j.
In S210, the feature matching unit 58 assigns the value of "EvalResult[j].index" to a variable k.
In S211, the feature matching unit 58 matches the verification biometric feature 113 extracted from the verification biometric image 111 against the biometric image 101 read from the registered biometric information storage unit 38. In this step, first, "SampleFeature" is read from the verification biometric information storage unit 56, and the biometric image 101 associated with "Func[k]" in the biometric features 108 stored in the registered biometric information storage unit 38 is read out. Then, the read "SampleFeature" is matched against the features extracted from that biometric image 101 to obtain the similarity between the verification biometric image 111 and the biometric image 101.
In FIG. 15, the features extracted from the biometric image 101 associated with "Func[k]" in the biometric features 108 stored in the registered biometric information storage unit 38 are denoted "Template[k]".
Next, in S212, the feature matching unit 58 determines whether the person to be authenticated has been confirmed to be the genuine person. This determination is made by determining whether the similarity between the verification biometric image 111 and the biometric image 101 obtained by the matching process of S211 is equal to or greater than a predetermined threshold.
In this determination, if the feature matching unit 58 determines that the person to be authenticated has been confirmed to be the genuine person (the determination result is Yes), the process proceeds to S213. On the other hand, if the feature matching unit 58 determines that the person to be authenticated could not be confirmed to be the genuine person (the determination result is No), the process proceeds to S214.
In S213, the feature matching unit 58 outputs information indicating "genuine person" as the matching result. The matching process then ends.
In S214, the feature matching unit 58 adds "1" to the current value of the variable j and assigns the result back to the variable j.
In S215, the feature matching unit 58 determines whether the value of the variable j at this point is equal to or greater than FuncNum, the number of evaluation conditions 102. If the feature matching unit 58 determines that the value of the variable j is equal to or greater than FuncNum (the determination result is Yes), the process proceeds to S216. On the other hand, if the feature matching unit 58 determines that the value of the variable j is less than FuncNum (the determination result is No), the process returns to S210, and the processing from S210 onward is performed again under the current value of the variable j.
 In S216, the feature matching unit 58 performs a process of outputting information indicating "another person" as the matching result. The matching process then ends.
 The processing described so far constitutes the matching process. By performing this process, the level of similarity between the registered biometric information stored in the registered biometric information storage unit 38 and the authentication biometric information is determined, and the identity of the person to be authenticated is verified based on the determination result. Furthermore, as described above, this process preferentially matches pairs of images in which the biometric part appears in a similar position, posture, and shape at registration and at authentication, so a reduction in the time required for identity verification can be expected.
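 To illustrate the overall control flow from S209 through S216, the following sketch assumes the registered templates have already been ordered by decreasing evaluation value 112 and reuses the hypothetical is_genuine helper from the sketch above; it is a simplified reading of the flowchart, not the patent's implementation.

# Sketch of the matching loop of FIG. 15 (S209-S216); names are illustrative.
def verify(sample_feature, ordered_templates, is_genuine):
    # ordered_templates: Template[0..FuncNum-1], sorted by descending
    # evaluation value 112 of the corresponding evaluation condition.
    j = 0                                        # S209
    func_num = len(ordered_templates)
    while j < func_num:                          # S215 loops back to S210
        genuine, _similarity = is_genuine(sample_feature, ordered_templates[j])  # S211-S212
        if genuine:
            return "genuine person"              # S213
        j += 1                                   # S214
    return "another person"                      # S216

# Example with hypothetical data: verify([1, 2, 3], [[9, 9, 9], [1, 2, 3]], is_genuine)
# rejects the first template and accepts the second, returning "genuine person".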
 Note that in the authentication unit 20 of FIG. 14, the feature matching unit 58 reads the biometric features 108 corresponding to the evaluation conditions 102 from the registered biometric information storage unit 38 and performs the matching operation in descending order of the evaluation values 112 obtained when the matching biometric image 111 was evaluated against those evaluation conditions 102. Alternatively, the biometric features 108 may be arranged in descending order of their evaluation values 107 and stored in the registered biometric information storage unit 38, and the feature matching unit 58 may read the biometric features 108 from the registered biometric information storage unit 38 in this storage order and perform the matching operation against the matching biometric image 111. This makes it possible to give priority, in the matching against the matching biometric image 111, to the biometric images 101 that are more appropriate as registered biometric information, so a reduction in the time required for identity verification can be expected.
 FIGS. 16A and 16B will now be described. FIG. 16A is a third modification of the flowchart of FIG. 11, for the case where the feature extraction unit 37 performs an operation for storing the biometric features 108 in the registered biometric information storage unit 38 in the order described above. FIG. 16B is a first modification of the flowchart of FIG. 15, for the case where the feature matching unit 58 is caused to perform the above-described operation.
 The process of S131 in FIG. 16A replaces the process of S111 in the flowchart of FIG. 11. The process of S131 is executed by the feature extraction unit 37 when the result of the determination process of S110 in FIG. 11 is Yes. In this process, the feature extraction unit 37 first reads all of the combinations of "TempImage[i]" and "TempResult[i]" (where i = 0, ..., (FuncNum-1)) stored in the registration candidate storage unit 35 by the process of S107. The feature extraction unit 37 then stores these combinations in the registered biometric information storage unit 38 in descending order of the value of "TempResult[i]". In this process, the feature extraction unit 37 also stores information specifying "Func[i]" in the registered biometric information storage unit 38 in association with each combination. The biometric information registration process of FIG. 11 then ends.
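 A minimal sketch of S131, assuming the registration candidates are held as parallel Python lists corresponding to TempImage and TempResult and that "storing" simply means appending to an ordered in-memory list standing in for the registered biometric information storage unit 38 (all names and data shapes are assumptions):

# Sketch of S131: keep (Func index, image, evaluation result) together and
# store them in descending order of TempResult so that later matching can
# simply follow the storage order. Illustrative only.
def store_sorted_candidates(temp_images, temp_results):
    candidates = [
        (i, temp_images[i], temp_results[i])       # i = 0 .. FuncNum-1
        for i in range(len(temp_images))
    ]
    candidates.sort(key=lambda c: c[2], reverse=True)  # descending TempResult[i]
    registered_store = []                           # stands in for storage unit 38
    for func_index, image, result in candidates:
        registered_store.append({"func": func_index, "image": image, "result": result})
    return registered_store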
 The processes from S221 to S223 in FIG. 16B replace the processes from S207 through S212 in the flowchart of FIG. 15. However, when the result of the determination process of S215 in FIG. 15 is No, the process returns to S223 in FIG. 16B.
 In FIG. 16B, in S221, the feature extraction unit 55 performs a process of reading "SampleImage" stored in the temporary storage unit 53 for matching biometric images, extracting the matching biometric feature 113 appearing in that image, and storing it in the matching biometric information storage unit 56. The processing content of S221 is the same as that of S208 in FIG. 15.
 Next, in S222, the feature matching unit 58 performs a process of assigning the initial value "0" to the variable j. The processing content of S222 is the same as that of S209 in FIG. 15.
 Next, in S223, the feature matching unit 58 performs a process of matching the matching biometric feature 113 extracted from the matching biometric image 111 against the biometric image 101 read from the registered biometric information storage unit 38. In this process, "SampleFeature" is first read from the matching biometric information storage unit 56, and the biometric image 101 associated with "Func[j]" in the biometric features 108 stored in the registered biometric information storage unit 38 is read out. The read "SampleFeature" is then matched against the feature extracted from that biometric image 101 (that is, "Template[j]"), and the similarity between the matching biometric image 111 and the biometric image 101 is obtained. The process then proceeds to S212 in FIG. 15.
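 Under the same assumptions, the matching side of FIG. 16B reduces to walking the registered entries in their storage order; the sketch below reuses the hypothetical helpers introduced earlier and is not the actual implementation.

# Sketch of S221-S223 combined with the loop of FIG. 15: because the entries
# were stored in descending order of evaluation value 107, matching simply
# follows the storage order. Illustrative only.
def verify_in_storage_order(sample_feature, registered_store, extract_feature, is_genuine):
    for entry in registered_store:                  # j = 0, 1, ... in storage order
        template = extract_feature(entry["image"])  # Template[j]
        genuine, _similarity = is_genuine(sample_feature, template)  # S223, then S212
        if genuine:
            return "genuine person"                 # S213
    return "another person"                         # S216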
 FIG. 17 will now be described. FIG. 17 is a modification of the specific configuration of the biometric information registration unit 10 illustrated in FIG. 9.
 In FIG. 17, components identical to those in FIG. 9 are denoted by the same reference numerals, and detailed description of them is omitted.
 The configuration example illustrated in FIG. 17 differs from the configuration example illustrated in FIG. 9 only in that a guidance message display unit 39 is added.
 In FIG. 17, the evaluation result check unit 36 determines, as in the configuration example of FIG. 9, whether there is an evaluation condition 102 for which no evaluation value 107 is stored in the registration candidate storage unit 35. When it determines that such an evaluation condition 102 exists, the evaluation result check unit 36 instructs the imaging control unit 32 to control the imaging unit 31 so that a further biometric image 101 is acquired. In the configuration example of FIG. 17, however, the evaluation result check unit 36 also gives a predetermined instruction to the guidance message display unit 39 in parallel with the instruction to the imaging control unit 32.
 The guidance message display unit 39 is a display device that displays a predetermined message upon receiving the predetermined instruction from the evaluation result check unit 36. This message has content that guides the biometric part of the person to be authenticated so that the imaging unit 31 can acquire a biometric image 101 satisfying the evaluation condition 102 for which no evaluation value 107 is stored in the registration candidate storage unit 35. The guidance message display unit 39 provides the function of the guidance output unit 16 in FIG. 1.
 When the guidance message display unit 39 displays such a message and the person to be authenticated follows its content, quick acquisition of a biometric image 101 that satisfies the evaluation condition 102 still missing in the registration candidate storage unit 35 can be expected.
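 One way to picture the cooperation between the evaluation result check unit 36 and the guidance message display unit 39 is the sketch below, which looks for the first evaluation condition that still lacks an evaluation value and returns a guidance message for it; the condition hints and message wording are invented for illustration and are not taken from the specification.

# Sketch of the guidance behavior of FIG. 17 (illustrative only).
def guidance_for_missing_condition(stored_values, condition_hints):
    # stored_values: {condition index: evaluation value 107}, holding entries
    # only for conditions already satisfied in the registration candidate storage.
    # condition_hints: per-condition guidance text shown to the user.
    for i, hint in enumerate(condition_hints):
        if i not in stored_values:                 # no evaluation value 107 yet
            return "Please place your hand again: " + hint
    return None                                    # all conditions satisfied

# Example with hypothetical hints:
hints = ["hold your palm a little closer", "tilt your palm slightly to the left"]
print(guidance_for_missing_condition({0: 0.92}, hints))
# prints the hint for condition 1, the only one still lacking an evaluation value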
DESCRIPTION OF SYMBOLS
    1   biometric authentication system
   10   biometric information registration unit
   11   biometric image acquisition unit
   12   evaluation unit
   13   evaluation value extraction unit
   14, 38   registered biometric information storage unit
   15, 36   evaluation result check unit
   16   guidance output unit
   20   authentication unit
   31   imaging unit
   32   imaging control unit
   33   biometric image temporary storage unit
   34, 54   image evaluation unit
   35   registration candidate storage unit
   37, 55   feature extraction unit
   39   guidance message display unit
   40   computer
   41   MPU
   42   ROM
   43   RAM
   44   hard disk device
   45   input device
   46   display device
   47   interface device
   48   recording medium drive device
   49   bus line
   50   portable recording medium
   51   imaging unit
   52   imaging control unit
   53   temporary storage unit for matching biometric images
   56   matching biometric information storage unit
   57   evaluation order determination unit
   58   feature matching unit
  100   registered biometric information according to the conventional method
  101   biometric image
  102   evaluation condition
  103   extracted biometric image
  104   evaluation result check
  105   registered biometric information
  106   authentication biometric information
  107, 112   evaluation value
  108   biometric feature
  111   biometric image for matching
  113   biometric feature for matching

Claims (12)

  1.  A biometric authentication system that verifies the identity of a person to be authenticated based on a similarity between registered biometric information of the person to be authenticated, registered in advance at the time of registration, and authentication biometric information of the person to be authenticated, acquired at the time of authentication, the system comprising:
     a biometric image acquisition unit that acquires a plurality of biometric images of the person to be authenticated;
     an evaluation unit that evaluates each of the plurality of biometric images acquired by the biometric image acquisition unit with respect to each of a plurality of predetermined evaluation conditions, and outputs, for each of the plurality of evaluation conditions, an evaluation value indicating a degree of coincidence of the biometric image with the evaluation condition as an evaluation result of the biometric image;
     an evaluation value extraction unit that extracts, for each evaluation condition, from the evaluation values output by the evaluation unit for each of the plurality of biometric images, an evaluation value indicating a degree of coincidence equal to or greater than a predetermined evaluation threshold, and, when a plurality of evaluation values are extracted for the same evaluation condition, further extracts, from the plurality of extracted evaluation values, the evaluation value having the highest degree of coincidence with that evaluation condition;
     a registered biometric information storage unit that stores a plurality of biometric images corresponding to the evaluation values extracted by the evaluation value extraction unit as the registered biometric information of the person to be authenticated; and
     an authentication unit that determines the level of similarity between the registered biometric information stored in the registered biometric information storage unit and the authentication biometric information, and verifies the identity of the person to be authenticated based on the determination result.
  2.  The biometric authentication system according to claim 1, wherein
     the evaluation conditions include, as a condition corresponding to an evaluation value, information on the size of the image of the biometric part of the person to be authenticated on the biometric image, and
     the evaluation unit outputs an evaluation value indicating the degree of coincidence between the size, on the biometric image, of the image of the biometric part of the person to be authenticated in the biometric image acquired by the biometric image acquisition unit and the size information included in the evaluation condition.
  3.  The biometric authentication system according to claim 1, wherein
     the evaluation conditions include, as a condition corresponding to the evaluation value, information on the inclination of the image of the biometric part of the person to be authenticated on the biometric image, and
     the evaluation unit outputs an evaluation value indicating the degree of coincidence between the inclination, on the biometric image, of the image of the biometric part of the person to be authenticated in the biometric image acquired by the biometric image acquisition unit and the inclination information included in the evaluation condition.
  4.  The biometric authentication system according to claim 1, wherein
     the evaluation conditions include, as a condition corresponding to the evaluation value, information on the shape of the image of the biometric part of the person to be authenticated on the biometric image, and
     the evaluation unit outputs an evaluation value indicating the degree of coincidence between the shape, on the biometric image, of the image of the biometric part of the person to be authenticated in the biometric image acquired by the biometric image acquisition unit and the shape information included in the evaluation condition.
  5.  The biometric authentication system according to claim 1, further comprising an evaluation result check unit that determines whether there was an evaluation condition for which no evaluation value was extracted in the extraction of the evaluation values for each evaluation condition by the evaluation value extraction unit, and, when determining that such an evaluation condition existed, performs control to cause the biometric image acquisition unit to acquire a biometric image, the evaluation unit to evaluate the biometric image, and the evaluation value extraction unit to extract evaluation values, again.
  6.  The biometric authentication system according to claim 5, wherein the evaluation result check unit terminates the control when the number of evaluation values extracted by the evaluation value extraction unit reaches a predetermined number, even if there remains an evaluation condition for which no evaluation value was extracted in the extraction of the evaluation values for each evaluation condition by the evaluation value extraction unit.
  7.  The biometric authentication system according to claim 1, wherein the registered biometric information storage unit stores a predetermined number of biometric images, among the biometric images whose evaluation results by the evaluation unit are the evaluation values extracted by the evaluation value extraction unit, in descending order of the degree of coincidence with the evaluation condition indicated by the evaluation value.
  8.  The biometric authentication system according to claim 1, wherein
     the registered biometric information storage unit stores, as the registered biometric information, the biometric image corresponding to the evaluation value extracted by the evaluation value extraction unit in association with the evaluation condition under which the evaluation unit output that evaluation value, and
     the authentication unit performs, on the biometric image of the person to be authenticated acquired at the time of authentication as the authentication biometric information, the same evaluation as the evaluation unit to obtain an evaluation value of the biometric image at the time of authentication for each of the plurality of evaluation conditions, reads the biometric images stored in the registered biometric information storage unit in association with the evaluation conditions under which those evaluation values were obtained, in descending order of the degree of coincidence indicated by the evaluation values, and determines the level of similarity with the biometric image at the time of authentication.
  9.  The biometric authentication system according to claim 1, wherein
     the registered biometric information storage unit stores the biometric images whose evaluation results by the evaluation unit are the evaluation values extracted by the evaluation value extraction unit, arranged in descending order of the degree of coincidence with the evaluation condition indicated by the evaluation value, and
     the authentication unit reads the biometric images stored in the registered biometric information storage unit in the order in which they are arranged in the registered biometric information storage unit and determines the level of similarity with the authentication biometric information.
  10.  The biometric authentication system according to claim 1, further comprising a guidance output unit that, when the evaluation result check unit determines that there was an evaluation condition for which no evaluation value was extracted in the extraction of the evaluation values for each evaluation condition by the evaluation value extraction unit, outputs guidance to the person to be authenticated for guiding the biometric part of the person to be authenticated so that the biometric image acquisition unit can obtain a biometric image satisfying that evaluation condition.
  11.  A biometric authentication method for verifying the identity of a person to be authenticated based on a similarity between registered biometric information of the person to be authenticated, registered in advance at the time of registration, and authentication biometric information of the person to be authenticated, acquired at the time of authentication, the method comprising:
     evaluating, by an evaluation unit, each of a plurality of biometric images of the person to be authenticated with respect to each of a plurality of predetermined evaluation conditions, and outputting, for each of the plurality of evaluation conditions, an evaluation value indicating a degree of coincidence of the biometric image with the evaluation condition as an evaluation result of the biometric image;
     extracting, by an evaluation value extraction unit, for each evaluation condition, from the evaluation values output by the evaluation unit for each of the plurality of biometric images, an evaluation value indicating a degree of coincidence equal to or greater than a predetermined evaluation threshold, and, when a plurality of evaluation values are extracted for the same evaluation condition, further extracting, from the plurality of extracted evaluation values, the evaluation value having the highest degree of coincidence with that evaluation condition;
     storing, in a registered biometric information storage unit, a plurality of biometric images corresponding to the evaluation values extracted by the evaluation value extraction unit as the registered biometric information of the person to be authenticated; and
     determining, by an authentication unit, the level of similarity between the registered biometric information stored in the registered biometric information storage unit and the authentication biometric information, and verifying the identity of the person to be authenticated based on the determination result.
  12.  A program for causing a computer to verify the identity of a person to be authenticated based on a similarity between registered biometric information of the person to be authenticated, registered in advance at the time of registration, and authentication biometric information of the person to be authenticated, acquired at the time of authentication, the program causing the computer to execute:
     an evaluation process of evaluating each of a plurality of biometric images of the person to be authenticated with respect to each of a plurality of predetermined evaluation conditions, and outputting, for each of the plurality of evaluation conditions, an evaluation value indicating a degree of coincidence of the biometric image with the evaluation condition as an evaluation result of the biometric image;
     an evaluation value extraction process of extracting, for each evaluation condition, from the evaluation values output by the evaluation process for each of the plurality of biometric images, an evaluation value indicating a degree of coincidence equal to or greater than a predetermined evaluation threshold, and, when a plurality of evaluation values are extracted for the same evaluation condition, further extracting, from the plurality of extracted evaluation values, the evaluation value having the highest degree of coincidence with that evaluation condition;
     a registered biometric information storage process of storing, in a storage unit, a plurality of biometric images corresponding to the evaluation values extracted by the evaluation value extraction process as the registered biometric information of the person to be authenticated; and
     an authentication process of determining the level of similarity between the registered biometric information stored in the storage unit and the authentication biometric information, and verifying the identity of the person to be authenticated based on the determination result.
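 To make the registration-side flow recited in claim 1 above concrete, the following sketch evaluates each captured image against each evaluation condition, discards evaluation values below the evaluation threshold, and keeps, per condition, the image with the highest remaining value; the evaluation functions, threshold, and data shapes are assumptions for illustration only.

# Sketch of the evaluation value extraction of claim 1 (illustrative only).
def select_registration_images(images, evaluation_functions, evaluation_threshold):
    # evaluation_functions: one callable per evaluation condition; each returns
    # an evaluation value (higher means closer agreement with the condition).
    best = {}  # condition index -> (evaluation value, image)
    for image in images:
        for k, evaluate in enumerate(evaluation_functions):
            value = evaluate(image)
            if value < evaluation_threshold:
                continue                            # below the evaluation threshold
            if k not in best or value > best[k][0]:
                best[k] = (value, image)            # highest value per condition
    # The retained images become the registered biometric information.
    return {k: image for k, (value, image) in best.items()}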
PCT/JP2010/064970 2010-09-01 2010-09-01 Biometric authentication system, biometric authentication method and program WO2012029150A1 (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
PCT/JP2010/064970 WO2012029150A1 (en) 2010-09-01 2010-09-01 Biometric authentication system, biometric authentication method and program

Applications Claiming Priority (1)

Application Number Priority Date Filing Date Title
PCT/JP2010/064970 WO2012029150A1 (en) 2010-09-01 2010-09-01 Biometric authentication system, biometric authentication method and program

Publications (1)

Publication Number Publication Date
WO2012029150A1 true WO2012029150A1 (en) 2012-03-08

Family

ID=45772286

Family Applications (1)

Application Number Title Priority Date Filing Date
PCT/JP2010/064970 WO2012029150A1 (en) 2010-09-01 2010-09-01 Biometric authentication system, biometric authentication method and program

Country Status (1)

Country Link
WO (1) WO2012029150A1 (en)

Cited By (5)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
WO2013145280A1 (en) * 2012-03-30 2013-10-03 富士通株式会社 Biometric authentication device, biometric authentication method, and biometric authentication computer program
JP2017049955A (en) * 2015-09-04 2017-03-09 富士通株式会社 Biometric authentication apparatus, biometric authentication method, and biometric authentication program
CN109388926A (en) * 2017-08-14 2019-02-26 三星电子株式会社 Handle the method for biometric image and the electronic equipment including this method
CN109923581A (en) * 2016-10-31 2019-06-21 株式会社Dds Skin information processing routine and skin information processing unit
CN110603568A (en) * 2017-05-09 2019-12-20 株式会社Dds Authentication information processing program and authentication information processing device

Patent Citations (5)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
JP2006085268A (en) * 2004-09-14 2006-03-30 Fuji Photo Film Co Ltd Biometrics system and biometrics method
JP2007159610A (en) * 2005-12-09 2007-06-28 Matsushita Electric Ind Co Ltd Registration device, authentication device, registration authentication device, registration method, authentication method, registration program, and authentication program
JP2007249588A (en) * 2006-03-15 2007-09-27 Omron Corp Face image registration device, method, program, and storage medium
JP2009003866A (en) * 2007-06-25 2009-01-08 Panasonic Corp Face authentication device
JP2009258991A (en) * 2008-04-16 2009-11-05 Panasonic Electric Works Co Ltd Face image registration device

Cited By (11)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
WO2013145280A1 (en) * 2012-03-30 2013-10-03 富士通株式会社 Biometric authentication device, biometric authentication method, and biometric authentication computer program
EP2833319A4 (en) * 2012-03-30 2015-06-24 Fujitsu Ltd Biometric authentication device, biometric authentication method, and biometric authentication computer program
JPWO2013145280A1 (en) * 2012-03-30 2015-08-03 富士通株式会社 Biometric authentication apparatus, biometric authentication method, and biometric authentication computer program
US9305209B2 (en) 2012-03-30 2016-04-05 Fujitsu Limited Biometric authentication apparatus, biometric authentication method, and computer program for biometric authentication
JP2017049955A (en) * 2015-09-04 2017-03-09 富士通株式会社 Biometric authentication apparatus, biometric authentication method, and biometric authentication program
CN109923581A (en) * 2016-10-31 2019-06-21 株式会社Dds Skin information processing routine and skin information processing unit
CN109923581B (en) * 2016-10-31 2022-12-02 株式会社Dds Skin information processing method and skin information processing device
CN110603568A (en) * 2017-05-09 2019-12-20 株式会社Dds Authentication information processing program and authentication information processing device
CN110603568B (en) * 2017-05-09 2023-05-02 株式会社Dds Authentication information processing method and authentication information processing apparatus
CN109388926A (en) * 2017-08-14 2019-02-26 三星电子株式会社 Handle the method for biometric image and the electronic equipment including this method
CN109388926B (en) * 2017-08-14 2024-02-09 三星电子株式会社 Method of processing biometric image and electronic device including the same

Similar Documents

Publication Publication Date Title
JP7318691B2 (en) Image processing device, image processing method, face authentication system and program
JP4340618B2 (en) Biometric information authentication apparatus and method, biometric information authentication program, and computer-readable recording medium recording the biometric information authentication program
US9298996B2 (en) Biometric authentication device and method
JP6167733B2 (en) Biometric feature vector extraction device, biometric feature vector extraction method, and biometric feature vector extraction program
JP5831193B2 (en) User detection device, method and program
JP5363587B2 (en) Biometric information registration method, biometric authentication method, and biometric authentication device
JP5919944B2 (en) Non-contact biometric authentication device
JP5812109B2 (en) Biological information processing apparatus, biological information processing method, and computer program for biological information processing
WO2014112081A1 (en) Biometric authentication device, biometric authentication system, and biometric authentication method
WO2013161077A1 (en) Biometric authentication device, biometric authentication program, and biometric authentication method
JP2011159035A (en) Biometric authentication apparatus, biometric authentication method and program
WO2012144105A1 (en) Biometric authentication system
WO2012029150A1 (en) Biometric authentication system, biometric authentication method and program
JP2023063314A (en) Information processing device, information processing method, and recording medium
JP5915336B2 (en) Biometric authentication apparatus, biometric authentication method, and biometric authentication computer program
JP5480532B2 (en) Image processing apparatus, image processing method, and program for causing computer to execute the method
JPWO2020050413A1 (en) Face image candidate determination device for authentication, face image candidate determination method for authentication, program, and recording medium
JP6908843B2 (en) Image processing equipment, image processing method, and image processing program
KR101656212B1 (en) system for access control using hand gesture cognition, method thereof and computer recordable medium storing the method
CN108875549A (en) Image-recognizing method, device, system and computer storage medium
JP6187262B2 (en) Biological information processing apparatus, biological information processing method, and computer program for biological information processing
JP2010240215A (en) Vein depth determination apparatus, vein depth determination method and program
JP2006277146A (en) Collating method and collating device
JP2018092272A (en) Biometric authentication apparatus, biometric authentication method and program
JP6364828B2 (en) Biometric authentication device and portable electronic device

Legal Events

Date Code Title Description
121 Ep: the epo has been informed by wipo that ep was designated in this application

Ref document number: 10856701

Country of ref document: EP

Kind code of ref document: A1

NENP Non-entry into the national phase

Ref country code: DE

122 Ep: pct application non-entry in european phase

Ref document number: 10856701

Country of ref document: EP

Kind code of ref document: A1

NENP Non-entry into the national phase

Ref country code: JP