WO2011052085A1 - Biometric information registration method, biometric authentication method, and biometric authentication device - Google Patents

Biometric information registration method, biometric authentication method, and biometric authentication device

Info

Publication number
WO2011052085A1
WO2011052085A1 (PCT/JP2009/068728)
Authority
WO
WIPO (PCT)
Prior art keywords
vein
data
feature amount
biometric authentication
image
Prior art date
Application number
PCT/JP2009/068728
Other languages
English (en)
Japanese (ja)
Inventor
一博 古村
裕之 高松
Original Assignee
富士通フロンテック株式会社
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by 富士通フロンテック株式会社 filed Critical 富士通フロンテック株式会社
Priority to EP09850864.1A priority Critical patent/EP2495699B1/fr
Priority to JP2011538182A priority patent/JP5363587B2/ja
Priority to PCT/JP2009/068728 priority patent/WO2011052085A1/fr
Publication of WO2011052085A1 publication Critical patent/WO2011052085A1/fr
Priority to US13/448,872 priority patent/US8660318B2/en

Classifications

    • G - PHYSICS
    • G06 - COMPUTING; CALCULATING OR COUNTING
    • G06V - IMAGE OR VIDEO RECOGNITION OR UNDERSTANDING
    • G06V40/00 - Recognition of biometric, human-related or animal-related patterns in image or video data
    • G06V40/10 - Human or animal bodies, e.g. vehicle occupants or pedestrians; Body parts, e.g. hands
    • G06V40/12 - Fingerprints or palmprints
    • G - PHYSICS
    • G06 - COMPUTING; CALCULATING OR COUNTING
    • G06V - IMAGE OR VIDEO RECOGNITION OR UNDERSTANDING
    • G06V10/00 - Arrangements for image or video recognition or understanding
    • G06V10/70 - Arrangements for image or video recognition or understanding using pattern recognition or machine learning
    • G06V10/74 - Image or video pattern matching; Proximity measures in feature spaces
    • G06V10/75 - Organisation of the matching processes, e.g. simultaneous or sequential comparisons of image or video features; Coarse-fine approaches, e.g. multi-scale approaches; using context analysis; Selection of dictionaries
    • G - PHYSICS
    • G06 - COMPUTING; CALCULATING OR COUNTING
    • G06V - IMAGE OR VIDEO RECOGNITION OR UNDERSTANDING
    • G06V40/00 - Recognition of biometric, human-related or animal-related patterns in image or video data
    • G06V40/60 - Static or dynamic means for assisting the user to position a body part for biometric acquisition
    • G06V40/67 - Static or dynamic means for assisting the user to position a body part for biometric acquisition by interactive indications to the user
    • G - PHYSICS
    • G06 - COMPUTING; CALCULATING OR COUNTING
    • G06V - IMAGE OR VIDEO RECOGNITION OR UNDERSTANDING
    • G06V40/00 - Recognition of biometric, human-related or animal-related patterns in image or video data
    • G06V40/10 - Human or animal bodies, e.g. vehicle occupants or pedestrians; Body parts, e.g. hands
    • G06V40/14 - Vascular patterns

Definitions

  • The present invention relates to a biometric information registration method, a biometric authentication method, and a biometric authentication device, and in particular to a biometric information registration method, a biometric authentication method, and a biometric authentication device that extract a person's physical features and collate them against a large number of preregistered pieces of biometric information for personal authentication.
  • Biometric authentication devices that perform personal authentication using human physical characteristics are widely used.
  • As physical characteristics, fingerprints, the irises of the eyes, the veins of palms and fingers, voice prints, facial shapes, and the like are used, and authentication based on these has been put to practical use.
  • Among them, the veins of the palms and fingers are physical features well suited to personal authentication because they are numerous, form complex patterns, change little over a lifetime, and are difficult to counterfeit.
  • Vein authentication uses a characteristic of the hemoglobin in the red blood cells of the blood. The hemoglobin in the red blood cells flowing in the veins has changed from the oxygenated hemoglobin, bound to oxygen, found in the arteries to reduced hemoglobin that is not bound to oxygen, and this reduced hemoglobin has the property of absorbing near-infrared light with a wavelength of around 760 nanometers. Therefore, when near-infrared light is directed at the palm and the reflected light is captured as an image, a captured image is obtained in which the vein portions, which reflect only weakly, appear black.
  • The biometric authentication device registers the captured image obtained in this way in a biometric information database as biometric information.
  • At authentication time, a newly acquired captured image is collated with the registered biometric information to determine whether the user is the registered person.
  • Specifically, the captured image is collated with the registered biometric information one record at a time, and after collation with all the registered biometric information is completed, it is judged whether the similarity between the two is equal to or greater than a predetermined value, that is, whether the user is the registered person.
  • Biometric information such as a captured image has a very large amount of data, so each verification takes time, and the processing time required for the determination increases as the number of registered records grows. To keep the waiting time until authentication completes acceptable, the number of registrations must be limited; to increase the number of registrations, multiple sets of biometric authentication devices, each with a limited number of registrations, must be provided.
  • For such so-called 1-to-N authentication processing, a biometric authentication device has been proposed in which the verification time can be shortened and the number of registrations can be increased (see, for example, Patent Document 1).
  • In this device, a vein feature amount with a small data amount generated from the captured image and the vein data of the captured image are registered in advance.
  • At authentication time, a score representing a rough similarity is calculated from the vein feature amount generated from the captured image and each registered vein feature amount, and the collation order is determined according to the result of sorting the scores. Then, only the vein data ranked high in the collation order is subjected to detailed collation processing.
  • As the vein feature amounts for narrowing down the verification targets, a frequency component representing the sparseness of the vein pattern, an angle component representing the direction of the vein pattern, and a curve component representing the direction of the curved portions of the vein pattern are used.
  • The collation order is determined based on these three vein feature amounts.
  • However, because the accuracy of these vein feature amounts is low, the probability that vein data with high similarity is ranked high in the collation order is low. There is therefore a problem in that the maximum number of registrations cannot be increased.
  • the present invention has been made in view of these points, and an object thereof is to provide a biometric information registration method, biometric authentication method, and biometric authentication device capable of increasing the speed of collation processing and expanding the maximum number of registrations.
  • According to one aspect, there is provided a biometric information registration method in which a computer extracts vein data and feature amounts from a captured image, which is the biometric information captured by an imaging device, and registers them in a vein database, the feature amounts comprising: a first vein feature amount, which is a frequency component that characterizes the periodicity of the veins and is obtained by Fourier transforming the vein image; a second vein feature amount, which is an angle component that characterizes the directionality of the veins and is obtained by Fourier transforming the vein image; a third vein feature amount, which is a curve direction component that characterizes the direction of vein curvature in the vein image; a fourth vein feature amount, which is a segment direction component that characterizes the direction of the vein segments into which the veins are divided; and a fifth vein feature amount, which is a vein amount that characterizes the amount of veins included in the vein image.
  • According to another aspect, there is provided a biometric authentication method in which a computer extracts vein data and feature amounts from a captured image, which is the biometric information captured by an imaging device; acquires the registered vein data and registered feature amounts of each record registered in the vein database; obtains a score representing a rough similarity from the extracted feature amounts and the acquired registered feature amounts; and determines the collation order based on the result of sorting the scores. The feature amounts comprise the first vein feature amount, which is a frequency component that characterizes the periodicity of the veins and is obtained by Fourier transforming the vein image; the second vein feature amount, which is an angle component that characterizes the directionality of the veins and is obtained by Fourier transforming the vein image; the third vein feature amount, which is a curvature direction component that characterizes the direction of vein curvature in the vein image; the fourth vein feature amount, which is a segment direction component that characterizes the direction of the vein segments into which the veins of the vein image are divided; and the fifth vein feature amount, which is a vein amount that characterizes the amount of veins included in the vein image.
  • According to still another aspect, there is provided a biometric authentication device that performs biometric authentication of a user from biometric information, comprising: an imaging device that captures the user's veins; a data extraction unit that extracts vein data and feature amounts from a captured image, which is the biometric information captured by the imaging device; a vein database that stores the vein data and the feature amounts extracted by the data extraction unit as registered vein data and registered feature amounts; a data registration unit that registers the vein data and the feature amounts extracted by the data extraction unit in the vein database as the registered vein data and the registered feature amounts; a data acquisition unit that acquires the registered vein data and the registered feature amounts from the vein database; a collation order determination unit that obtains a score representing a rough similarity from the feature amounts extracted by the data extraction unit and the registered feature amounts acquired by the data acquisition unit and determines a collation order according to the result of sorting the scores; and a collation processing unit that collates the vein data extracted by the data extraction unit with the registered vein data acquired by the data acquisition unit according to the collation order determined by the collation order determination unit.
  • The feature amounts comprise a first vein feature amount, which is a frequency component that characterizes the periodicity of the veins and is obtained by Fourier transforming the vein image; a second vein feature amount, which is an angle component that characterizes the directionality of the veins and is obtained by Fourier transforming the vein image; a third vein feature amount, which is a curve direction component that characterizes the direction of vein curvature in the vein image; a fourth vein feature amount, which is a segment direction component that characterizes the direction of the vein segments into which the veins of the vein image are divided; and a fifth vein feature amount, which is a vein amount that characterizes the amount of veins included in the vein image.
  • With these, a biometric authentication device is provided.
  • With this configuration, determining the collation order using five types of feature amounts improves the accuracy of the feature amounts.
  • When the collation order is determined, the probability that the record of the user being collated is ranked high is therefore increased. As a result, the proportion of records that must be collated can be reduced and the number of collations decreases, which speeds up the collation processing and allows the maximum number of registrations to be expanded.
  • With the biometric information registration method, biometric authentication method, and biometric authentication device configured as described above, increasing the accuracy of the feature amounts makes it possible to calculate stable vein feature amounts from the inherently unstable vein data of a living body. By classifying the registration data in this way, a biometric authentication system such as a palm vein authentication device can expect faster 1-to-N verification times and an increase in the maximum number of registered persons.
  • FIG. 1 is a block diagram showing the configuration of an entrance / exit management system to which the present invention is applied.
  • This entrance/exit management system includes an imaging device 1 that captures palm veins, an authentication processing device 2 that performs authentication processing based on the captured image, a vein database 3 in which a large amount of vein data is registered, and a locking/unlocking unit 4 that operates according to the processing result of the authentication processing device 2.
  • the imaging device 1 has a guide 6 that supports a hand 5 to be imaged, and a sensor unit 7 is installed below the guide 6.
  • the sensor unit 7 is provided with an infrared sensor 8 which can be a CMOS (Complementary Metal Oxide Semiconductor) image sensor, for example, at the center bottom, and a distance sensor 9 is provided beside the sensor.
  • a plurality of (for example, eight) near-infrared light emitting elements 10 that irradiate near-infrared rays toward the upper side of the figure are installed around the infrared sensor 8.
  • The authentication processing device 2 includes a distance/hand contour detection unit 11, connected to the outputs of the distance sensor 9 and the infrared sensor 8 of the imaging device 1, which detects the distance from the infrared sensor 8 to the hand 5 and the contour of the hand 5, and a guidance message output unit 12 that outputs guidance messages.
  • The distance/hand contour detection unit 11 receives the measured distance from the distance sensor 9 of the imaging device 1, determines whether an object such as a palm is within a predetermined range of the sensor unit 7, detects the contour of the hand from the captured image taken by the infrared sensor 8, and determines from the contour whether the image can be used for registration and verification processing.
  • When the palm is not correctly positioned, the guidance message output unit 12 outputs a message guiding the palm to the predetermined position to a display (not shown).
  • The authentication processing device 2 also includes a data extraction unit 13 that extracts, from the captured image received via the distance/hand contour detection unit 11, vein data, which is data related to the veins, and vein feature amounts representing the characteristics of the vein pattern.
  • the output of the data extraction unit 13 is connected to a data registration unit 14 that registers the extracted vein data and vein feature amount in the vein database 3 and a collation order determination unit 15 that determines the collation order.
  • the output of the collation order determination unit 15 is connected to a collation processing unit 16 that performs collation processing and outputs the result to the locking / unlocking unit 4.
  • the collation order determination unit 15 and the collation processing unit 16 are connected to the data acquisition unit 17 so as to acquire the vein feature amount and vein data registered from the vein database 3.
  • FIG. 2 is a flowchart showing a flow of vein data registration processing
  • FIG. 3 is a diagram showing an example of a captured image of a palm vein
  • FIG. 4 is a conceptual diagram showing a state in which vein data is extracted from the captured image
  • FIG. 6 is a diagram illustrating the data structure of data stored in the vein database.
  • the palm vein is imaged by the imaging device 1 (step S1).
  • In the captured image of the palm photographed by the imaging device 1, the near-infrared light emitted by the near-infrared light emitting elements 10 is absorbed in the vein portions, so the vein blood vessels appear as a black pattern in the image.
  • the captured image obtained in this way is input to the data extraction unit 13, where vein data is generated (step S2).
  • the vein data can be classified into vein data including a trunk R1, a thick branch R2, and a thin branch R3 connected to the thick branch R2.
  • the classification from the vein data into the trunk R1, the thick branch R2, and the thin branch R3 is executed by the data extraction unit 13.
  • step S1 and step S2 are repeated a plurality of times, in this embodiment, three times. Therefore, the vein data for three times is obtained in the processes of steps S1 and S2.
  • the number of times of photographing can be changed by setting.
  • the vein data generated by the data extraction unit 13 becomes data to be registered in the vein database 3.
  • Next, from the vein image obtained by removing part of the thin branches R3 and the thick trunk R1 from the vein data, the data extraction unit 13 calculates five types of vein feature amounts that are not easily affected by the photographing conditions, that is, the first to fifth vein feature amounts V1-V5 conceptually shown in FIG. 5 (step S3).
  • the first vein feature amount V1 is a frequency component representing the interval and the number of the trunk R1 and the thick branch R2 in the vein image.
  • the second vein feature amount V2 is a distribution of direction components of the trunk R1 and the thick branch R2 of the vein image.
  • the third vein feature amount V3 is the distribution of the bending direction components of the trunk R1 and the thick branch R2 of the vein image.
  • the fourth vein feature amount V4 is a distribution of direction components of segments obtained by dividing the trunk R1 and the thick branch R2 of the vein image in the length direction.
  • the fifth vein feature amount V5 is a distribution of amounts of the trunk R1 and the thick branch R2 of the vein image in the area into which the vein image is divided.
  • the first to fifth vein feature amounts V1-V5 calculated by the data extraction unit 13 in this way are registered in the vein database 3 by the data registration unit 14 (step S4).
  • In the data structure of the data registered in the vein database 3, one record is assigned to each user.
  • Each record stores a user identification number (ID), the five types of vein feature amounts, and the vein data of the captured images.
  • For the third to fifth vein feature amounts V3-V5, two patterns of vein feature amounts with different division positions are calculated and stored in order to prevent loss of data on the boundary lines of the divided areas.
  • As the vein data, all the data of the captured images taken three times are stored.
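  • As a rough illustration of the record layout described above, the following is a minimal Python sketch of one vein database record; the class and field names are hypothetical and only mirror the described structure (user ID, five feature indexes with two division patterns for V3-V5, and the vein data of the three captures), not the actual storage format of the device.

        from dataclasses import dataclass, field
        from typing import List
        import numpy as np

        @dataclass
        class VeinRecord:
            # One record per user, as registered in the vein database (illustrative only).
            user_id: str
            index1: List[float]            # V1: 32 frequency components
            index2: List[float]            # V2: 12 angle components
            index3_p1: List[List[float]]   # V3: 6 areas x 36 curve directions (division pattern P1)
            index3_p2: List[List[float]]   # V3: same, division pattern P2
            index4_p1: List[List[float]]   # V4: 6 areas x 18 segment directions (P1)
            index4_p2: List[List[float]]   # V4: same, P2
            index5_p1a: List[float]        # V5: 49 vein amounts (7x7 pattern P1a)
            index5_p2a: List[float]        # V5: 64 vein amounts (8x8 pattern P2a)
            vein_images: List[np.ndarray] = field(default_factory=list)  # vein data of the three captures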
  • FIG. 7 is a flowchart showing the flow of vein data identification processing
  • FIG. 8 is a diagram for explaining the collation processing, in which (A) shows vein data for collation and (B) shows an example of registered vein data.
  • the imaging device 1 captures the palm vein of the user (step S11), and the data extraction unit 13 generates vein data (step S12). Then, the vein feature amount is calculated (step S13).
  • the processes in steps S11 to S13 are the same as in the registration process except that the number of times of shooting is one.
  • Next, the registered vein feature amounts are acquired from the vein database 3 by the data acquisition unit 17 (step S14).
  • the acquired vein feature amount is sent to the collation order determination unit 15.
  • the collation order determination unit 15 calculates a score that is the sum of the differences based on the vein feature amount calculated by the data extraction unit 13 and the vein feature amount acquired from the vein database 3 (step S15). Based on the score, the records in the registered data group are sorted to determine the collation order (step S16).
  • the collation processing unit 16 follows the collation order determined by the collation order determination unit 15 and the vein data generated by the data extraction unit 13 and the registered vein data acquired by the data acquisition unit 17. Are collated (step S17).
  • In the collation processing, the similarity is calculated sequentially for the top m% records in the collation order, and if there is a record whose similarity is equal to or greater than a predetermined value, the authentication is judged to be OK and the identity of the user is confirmed.
  • the determination result is sent to the locking / unlocking unit 4, and the locking / unlocking unit 4 unlocks the electric lock installed on the door and locks it after a predetermined time.
  • the records are sorted by similarity, and the record with the highest similarity is determined as the user's record, and the user is specified. If there is no record whose similarity is equal to or greater than a predetermined value, the user is notified that the authentication has failed by, for example, sounding a buzzer.
  • The vein data D1 and D2 shown in FIG. 8 are digitized values of the vein data of a captured image taken during the identification processing and of registered vein data.
  • For the sake of explanation, each of the vein data D1 and D2 is extremely simplified to 25 pixels, divided into 5 parts vertically and horizontally, and each pixel holds gradation data that depends on the intensity of the reflected near-infrared light.
  • Each pixel can hold a value from 0 to 255, but the binarization processing of the imaging device 1 assigns each pixel one of the two values 0 and 255.
  • A pixel with the value "0" represents the black of the vein image, and a pixel with the value "255" represents the other (white) portions.
  • The similarity is calculated by comparing the 25 pixel values of the vein data D1 for verification with the 25 pixel values of the registered vein data D2 one by one: the number of pixels whose values match at the same coordinates in both vein data D1 and D2 is counted, and the resulting count is the similarity value.
  • If all 25 pixels match, the similarity is "25"; if no pixels match, the similarity is "0".
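  • A minimal sketch of this pixel-matching similarity, assuming the vein data are already binarized numpy arrays of 0 (vein) and 255 (background); the function name and the example pattern are made up for illustration.

        import numpy as np

        def similarity(d1: np.ndarray, d2: np.ndarray) -> int:
            # Count the pixels whose values match at the same coordinates.
            return int(np.count_nonzero(d1 == d2))

        # Simplified 5x5 example: one horizontal vein line in both images.
        d1 = np.full((5, 5), 255); d1[2, :] = 0   # vein data for verification
        d2 = np.full((5, 5), 255); d2[2, :] = 0   # registered vein data
        print(similarity(d1, d2))                 # prints 25 (all pixels match)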
  • Next, the vein feature amount calculation, the vein feature amount score calculation, and the collation order determination processing, which are performed as pre-processing for the above collation processing and are the techniques for narrowing down the records to be collated, will be described in detail.
  • FIG. 9 is a conceptual diagram for explaining the frequency component of the first vein feature amount.
  • To obtain the first vein feature amount V1, the vein data excluding part of the thin branches R3 and the thick trunk R1 is developed at the center of an image of, for example, 256 × 256 pixels, the image size is reduced, and the result is taken as the vein image f(x, y).
  • The vein image f(x, y) is subjected to a two-dimensional fast Fourier transform by the following equation (1) to obtain the spatial frequency component F(u, v).
  • In this two-dimensional fast Fourier transform, the pixels of each line are first Fourier transformed in the x direction of the vein image f(x, y), and the transform results of the lines are then Fourier transformed in the y direction.
  • The power spectrum obtained from F(u, v) is converted into the polar-coordinate power spectrum, in which the angle is in the range of 0 to π.
  • As shown in FIG. 9, the first vein feature amount V1 is the sum of the energy in a donut-shaped region centered on the origin of the polar-coordinate power spectrum space.
  • The 32 frequency components p(r) obtained when the radius r is changed from 1 to 32 are used as the index Index1 of the first vein feature amount V1, and this index Index1 is calculated by the following equation (5).
  • Index1[32] = {p(1), p(2), ..., p(32)}   (5)
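  • The following is a minimal sketch of the first vein feature amount under the description above (2-D FFT, power spectrum, ring-wise energy sums); it assumes numpy, and the binning details are assumptions rather than the patented implementation.

        import numpy as np

        def index1(vein_image: np.ndarray, n_rings: int = 32) -> np.ndarray:
            # vein_image: e.g. a 256x256 array with the vein pattern developed at the centre.
            f = np.fft.fftshift(np.fft.fft2(vein_image))        # spatial frequency component F(u, v)
            power = np.abs(f) ** 2                              # power spectrum P(u, v)
            cy, cx = np.array(power.shape) // 2
            y, x = np.indices(power.shape)
            r = np.hypot(x - cx, y - cy).astype(int)            # radius from the origin for each (u, v)
            # p(r): sum of the energy in the ring ("donut") of radius r, for r = 1..n_rings (eq. (5)).
            return np.array([power[r == k].sum() for k in range(1, n_rings + 1)])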
  • FIG. 10 is a conceptual diagram for explaining the angle component of the second vein feature amount.
  • As with the first vein feature amount, the vein image f(x, y) is Fourier transformed to calculate the spatial frequency component F(u, v), and the power spectrum P(u, v) is calculated from F(u, v).
  • The power spectrum P(u, v) is converted into the polar-coordinate power spectrum P(r, θ), and the angle energy q′(θ) is obtained by the following equation (6).
  • Here, w is the size of the definition area of P(u, v), and θ indicates a direction obtained by dividing 180 degrees into 12 equal parts.
  • From q′(θ), the energy ratio of each angle is obtained by the following equation (7); that is, the energy ratio in each of the 12 equal angular ranges is calculated.
  • q(θ) = 10000 × q′(θ) / Σ q′(θ)   (7)
  • Here, 10000 is a correction value for conversion to integer type.
  • The second vein feature amount V2 is thus the sum of the energy in each 15-degree angular range.
  • The twelve angle components obtained when the angle θ is changed from 0 to 180 degrees are used as the index Index2 of the second vein feature amount V2, and the index Index2 is calculated by the following equation (8).
  • Index2[12] = {q(0), q(1), ..., q(11)}   (8)
  • The first angle component q(0), for example, is the energy of the angles from 0 to 14 degrees and is calculated by the following equation (9).
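  • A companion sketch for the second vein feature amount, reusing the same power spectrum; the 12 bins of 15 degrees and the factor of 10000 follow the text, while the names and the handling of the half-plane are assumptions.

        import numpy as np

        def index2(vein_image: np.ndarray, n_bins: int = 12) -> np.ndarray:
            f = np.fft.fftshift(np.fft.fft2(vein_image))
            power = np.abs(f) ** 2                                    # P(u, v)
            cy, cx = np.array(power.shape) // 2
            y, x = np.indices(power.shape)
            theta = np.degrees(np.arctan2(y - cy, x - cx)) % 180.0    # fold directions into 0..180 degrees
            q_prime = np.array([power[(theta >= 15 * k) & (theta < 15 * (k + 1))].sum()
                                for k in range(n_bins)])              # angle energy q'(theta), eq. (6)
            return 10000 * q_prime / q_prime.sum()                    # energy ratio q(theta), eq. (7)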
  • FIGS. 11A and 11B are diagrams showing the calculation areas of the third vein feature amount, where FIG. 11A shows the first division pattern and FIG. 11B shows the second division pattern. FIG. 12 is a diagram for explaining the definition of the bending direction, and FIG. 13 is a diagram for explaining the vein segments for which the bending direction is calculated, where (A) shows a vectorized vein segment, (B) shows the vein segment to be calculated first, and (C) shows the vein segment to be calculated next.
  • The third vein feature amount V3 is a feature amount representing the direction in which the veins bend (the curvature direction) in the vein image, and the curve direction component is calculated for each divided area obtained by dividing the vein image into several parts.
  • Here, the vein image is divided into, for example, six areas: in the first division pattern P1 shown in FIG. 11A, the vein image is divided into 2 rows and 3 columns, and the second division pattern P2 shown in FIG. 11B divides the image at different positions. As a result, a vein lying on a boundary line between the areas of the first division pattern P1 may not be recognized correctly, but in the second division pattern P2 it does not lie on a boundary line and is recognized correctly. By using the first division pattern P1 and the second division pattern P2, whose division positions differ in this way, missing data can be complemented.
  • First, the vein data is vectorized: the curved veins are converted into continuous thin lines called segments, and components that cause noise are removed.
  • Noise components include independent short vein segments, thin vein segments, and vein segments that form a circle.
  • Next, the bending direction is defined. As shown in FIG. 12, for vectorized vein segments, when the end-point coordinates of two adjacent vein segments are A, B, and C, the bending direction is defined as the direction of the perpendicular drawn from the vector point (bending point) B to the straight line AC connecting the two outer end points.
  • Next, the connected vein segments are divided, the vein segments to be calculated are determined from them, and the bending direction is calculated for each determined vein segment.
  • First, the coordinates of the vector points are obtained for the vein segment shown in FIG. 13(A).
  • Next, a divided segment is obtained whose end points are the start coordinate A1 of the vein segment, a coordinate B1 separated from it by a certain length, and a coordinate C1 separated from B1 by the same length.
  • This divided segment is the target for calculating the bending direction at the coordinate B1.
  • In other words, vector points reached by jumping ahead from the start coordinate A1 of the vein segment are taken as the coordinates B1 and C1.
  • Next, the coordinates are shifted by one from each of the coordinates A1, B1, and C1, and these become the coordinates A2, B2, and C2 of the vein segment to be calculated next.
  • The determination of the vein segments to be calculated proceeds in this way, shifting the vector points one coordinate at a time, and continues to the end coordinate of the segment.
  • The angle of the direction B is then calculated by the following equation (12).
  • Direction B = atan2(Hy, Hx) × (180/π)   (12)
  • Then, for each area of the first and second division patterns P1 and P2, a histogram (curv[θ]) of the bending directions of the vector points of all the divided segments is created.
  • Specifically, 36 angle regions obtained by dividing 360 degrees in units of 10 degrees are prepared, and each time a bending direction is calculated, the value of the angle region containing that direction is incremented, so that the bending-direction histogram (curv[θ]) is generated. This histogram (curv[θ]) is used as the index preIndex3.
  • The index of the 36 curve-direction components for the first area of the first division pattern P1 is expressed by the following equation (13).
  • preIndex3_1[0][36] = {curv(0), curv(1), ..., curv(35)}   (13)
  • Here, curv(0) is the value obtained by accumulating the bending directions included in the range from 0 to 9 degrees, and is calculated by the following equation (14).
  • The same histogram is created for the other areas (preIndex3_1[1] to preIndex3_1[5]).
  • The normalized index of the 36 curve-direction components for the first area of the first division pattern P1 is obtained by the following equation (15).
  • Index3_P1[0][36] = {preIndex3_1[0][0]/ALLcnt1, preIndex3_1[0][1]/ALLcnt1, ..., preIndex3_1[0][35]/ALLcnt1}   (15)
  • The index of the 36 curve-direction components for the second area of the first division pattern P1 is obtained by the following equation (16).
  • Index3_P1[1][36] = {preIndex3_1[1][0]/ALLcnt1, preIndex3_1[1][1]/ALLcnt1, ..., preIndex3_1[1][35]/ALLcnt1}   (16)
  • The index of the 36 curve-direction components for the first area of the second division pattern P2 is obtained by the following equation (18).
  • Index3_P2[0][36] = {preIndex3_2[0][0]/ALLcnt2, preIndex3_2[0][1]/ALLcnt2, ..., preIndex3_2[0][35]/ALLcnt2}   (18)
  • The index of the 36 curve-direction components for the second area of the second division pattern P2 is obtained by the following equation (19).
  • Index3_P2[1][36] = {preIndex3_2[1][0]/ALLcnt2, preIndex3_2[1][1]/ALLcnt2, ..., preIndex3_2[1][35]/ALLcnt2}   (19)
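  • As a sketch of how the bend-direction histogram for one area might be computed, the code below follows the definition above (perpendicular dropped from the bending point B onto the line AC, binned into 36 regions of 10 degrees); the point spacing, the orientation of the perpendicular vector H, and the normalization are assumptions.

        import numpy as np

        def bend_directions(points: np.ndarray, step: int = 3) -> list:
            # points: (n, 2) array of vector-point coordinates (x, y) along one vein segment.
            dirs = []
            for i in range(len(points) - 2 * step):
                a, b, c = points[i], points[i + step], points[i + 2 * step]
                ac = c - a
                if not ac.any():
                    continue
                foot = a + ac * np.dot(b - a, ac) / np.dot(ac, ac)     # foot of the perpendicular on AC
                h = b - foot                                           # perpendicular vector H
                dirs.append(np.degrees(np.arctan2(h[1], h[0])) % 360)  # eq. (12), folded into 0..360
            return dirs

        def curv_histogram(segments: list) -> np.ndarray:
            # curv[theta]: 36 angle regions of 10 degrees each, for all divided segments of one area.
            hist = np.zeros(36)
            for seg in segments:
                for d in bend_directions(np.asarray(seg, dtype=float)):
                    hist[int(d // 10) % 36] += 1
            return hist / hist.sum() if hist.sum() else hist           # normalization as in eqs. (15)-(19)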
  • FIG. 14 is a diagram for explaining the definition of the segment direction used for the fourth vein feature amount, and FIG. 15 is a diagram for explaining the vein segments for which the segment direction is calculated, where (A) shows a vectorized vein segment, (B) shows the vein segment to be calculated first, and (C) shows the vein segment to be calculated next.
  • the fourth vein feature quantity V4 uses the direction (inclination) of the vein segment divided from the vein data as the feature quantity.
  • the calculation of the segment direction component is performed for each divided area by dividing the vein image into several parts as in the calculation of the third vein feature amount.
  • the division pattern is the same as that shown in FIG. 11. Two patterns are prepared by equally dividing a vein image into six areas, and a histogram is calculated in each area.
  • Next, the segment direction is defined.
  • As shown in FIG. 14, for vectorized vein segments, the segment direction is defined as the inclination of the straight line AB connecting the coordinates of two vector points on a segment.
  • Next, the connected vein segments are divided, the target vein segments are determined from them, and the segment direction is calculated for each determined vein segment.
  • First, the coordinates of the vector points are obtained for the vein segment shown in FIG. 15(A).
  • Next, a divided segment is obtained whose end points are the start coordinate A1 of the vein segment and a coordinate B1 separated from it by a certain length.
  • In other words, a vector point reached by jumping ahead from the start coordinate A1 of the vein segment is taken as the coordinate B1.
  • The length of the divided segment used to obtain the inclination is determined based on experimental results.
  • Next, the coordinates A2 and B2, obtained by shifting one coordinate from the coordinates A1 and B1, become the end coordinates of the next divided segment. In this way, the determination of the segments to be calculated proceeds while shifting the coordinates one at a time, and continues to the end coordinate of the segment.
  • The segment direction is calculated by the following equation (21).
  • Segment direction = atan2(yb − ya, xb − xa) × (180/π)   (21)
  • Here, 180 degrees is added when the angle is negative, and 180 degrees is subtracted when the angle is 180 degrees or more. This is because, for example, a straight line with an inclination of 270 degrees or of minus 90 degrees is treated as having the same inclination as one of 90 degrees.
  • Next, a histogram (segdir[θ]) of all the divided segments is created for each area of the first and second division patterns P1 and P2.
  • Here, 18 angle regions obtained by dividing 180 degrees in units of 10 degrees are prepared, and each time a segment direction is calculated, the value of the angle region containing that direction is incremented, so that the segment-direction histogram (segdir[θ]) is generated.
  • This histogram (segdir[θ]) is used as the index preIndex4.
  • The index of the 18 segment-direction components for the first area of the first division pattern P1 is expressed by the following equation (22).
  • preIndex4_1[0][18] = {segdir(0), segdir(1), ..., segdir(17)}   (22)
  • Here, segdir(0) is the value obtained by accumulating the segment directions from 0 to 9 degrees, and is calculated by the following equation (23).
  • The same histogram is created for the other areas (preIndex4_1[1] to preIndex4_1[5]), giving preIndex4_1[m][n] (m = 0 to 5).
  • The sum (ALLcnt1) of these values is then obtained, and each element is divided by the sum; that is, normalization is performed by calculating the ratio.
  • The normalized index of the 18 segment-direction components for the first area of the first division pattern P1 is obtained by the following equation (24).
  • Index4_P1[0][18] = {preIndex4_1[0][0]/ALLcnt1, preIndex4_1[0][1]/ALLcnt1, ..., preIndex4_1[0][17]/ALLcnt1}   (24)
  • The index of the 18 segment-direction components for the second area of the first division pattern P1 is obtained by the following equation (25).
  • Index4_P1[1][18] = {preIndex4_1[1][0]/ALLcnt1, preIndex4_1[1][1]/ALLcnt1, ..., preIndex4_1[1][17]/ALLcnt1}   (25)
  • The index of the 18 segment-direction components for the sixth area of the first division pattern P1 is obtained by the following equation (26).
  • Index4_P1[5][18] = {preIndex4_1[5][0]/ALLcnt1, preIndex4_1[5][1]/ALLcnt1, ..., preIndex4_1[5][17]/ALLcnt1}   (26)
  • For the second division pattern P2 as well, vein segment division, calculation of the segment gradients, and creation of histograms are performed, followed by the same normalization.
  • The index of the 18 segment-direction components for the first area of the second division pattern P2 is obtained by the following equation (27).
  • Index4_P2[0][18] = {preIndex4_2[0][0]/ALLcnt2, preIndex4_2[0][1]/ALLcnt2, ..., preIndex4_2[0][17]/ALLcnt2}   (27)
  • Index4_P2[1][18] = {preIndex4_2[1][0]/ALLcnt2, preIndex4_2[1][1]/ALLcnt2, ..., preIndex4_2[1][17]/ALLcnt2}
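  • In the same spirit, a sketch of the segment-direction histogram: the slope of the chord AB folded into 0-180 degrees as in equation (21), binned into 18 regions of 10 degrees; the point spacing and the normalization are again assumptions.

        import numpy as np

        def segment_directions(points: np.ndarray, step: int = 3) -> list:
            # Inclination of the line AB between vector points 'step' apart, eq. (21),
            # folded into 0..180 degrees so that e.g. 270 or -90 degrees become 90 degrees.
            dirs = []
            for i in range(len(points) - step):
                (xa, ya), (xb, yb) = points[i], points[i + step]
                dirs.append(np.degrees(np.arctan2(yb - ya, xb - xa)) % 180.0)
            return dirs

        def segdir_histogram(segments: list) -> np.ndarray:
            # segdir[theta]: 18 angle regions of 10 degrees each for one area, normalized as in eqs. (24)-(27).
            hist = np.zeros(18)
            for seg in segments:
                for d in segment_directions(np.asarray(seg, dtype=float)):
                    hist[int(d // 10) % 18] += 1
            return hist / hist.sum() if hist.sum() else hist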
  • FIGS. 16A and 16B are diagrams showing the fifth vein feature amount calculation area, where FIG. 16A shows a first division pattern and FIG. 16B shows a second division pattern.
  • The fifth vein feature amount V5 uses the amount of veins included in the vein data as a feature amount and turns it into an index.
  • As with the third and fourth vein feature amounts, the vein amount is calculated for each divided area obtained by dividing the vein image into several parts.
  • In this case, a pattern divided into 7 × 7 = 49 areas is prepared as the first division pattern P1a, and a pattern divided into 8 × 8 = 64 areas is prepared as the second division pattern P2a.
  • The amount of veins means the density of veins in each area, and is therefore calculated by counting how many vein pixels are contained in each area.
  • The vein amount is calculated for all areas, that is, for the 49 areas of the first division pattern P1a and the 64 areas of the second division pattern P2a, and a histogram (segist1[n]) is created. This histogram (segist1[n]) is used as the index preIndex5.
  • preIndex5_1[49] = {segist1[0], segist1[1], ..., segist1[48]}
  • The vein amount index for the 49 areas of the first division pattern P1a is normalized by the following equation (31).
  • Index5_P1a[49] = {preIndex5_1[0]/ALLcnt1, preIndex5_1[1]/ALLcnt1, ..., preIndex5_1[48]/ALLcnt1}   (31)
  • Histograms of all areas are likewise created for the other, second division pattern P2a and normalized by the following equation (32).
  • Index5_P2a[64] = {preIndex5_2[0]/ALLcnt2, preIndex5_2[1]/ALLcnt2, ..., preIndex5_2[63]/ALLcnt2}   (32)
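  • A sketch of the vein-amount index for one division pattern, counting the black (0) vein pixels per grid cell as described above; grid = 7 corresponds to the 49-area pattern P1a and grid = 8 to the 64-area pattern P2a, and normalization by the total count is an assumption.

        import numpy as np

        def index5(vein_image: np.ndarray, grid: int = 7) -> np.ndarray:
            # vein_image: binarized image in which vein pixels are 0 and background pixels are 255.
            h, w = vein_image.shape
            counts = np.zeros(grid * grid)
            for i in range(grid):
                for j in range(grid):
                    cell = vein_image[i * h // grid:(i + 1) * h // grid,
                                      j * w // grid:(j + 1) * w // grid]
                    counts[i * grid + j] = np.count_nonzero(cell == 0)   # vein pixels in this area
            total = counts.sum()
            return counts / total if total else counts                   # normalization as in eqs. (31)-(32)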
  • The five types of vein feature amounts obtained as described above are stored in the vein database 3 during the vein data registration processing, and serve as collation data that is compared against the registered data during the vein data identification processing.
  • the vein feature amount score calculation and collation order determination processing performed in the vein data identification processing will be described.
  • When the first to fifth vein feature amounts V1-V5 of the collation data have been calculated, the feature amounts of all registered records are acquired from the vein database 3, and a score indicating the rough similarity between the collation feature amounts and the registered feature amounts is calculated.
  • For example, the score of the second vein feature amount V2 is calculated by the following equation (34).
  • In the same way, the score score3_P1 of the third vein feature amount V3 (first division pattern P1), the score score3_P2 of the third vein feature amount V3 (second division pattern P2), the score score4_P1 of the fourth vein feature amount V4 (first division pattern P1), the score score4_P2 of the fourth vein feature amount V4 (second division pattern P2), the score score5_P1a of the fifth vein feature amount V5 (first division pattern P1a), and the score score5_P2a of the fifth vein feature amount V5 (second division pattern P2a) are calculated. The vein feature amount scores are calculated for all registered records.
  • The vein feature amount scores calculated in this way are used to calculate a total score for each record.
  • The total score total[N] (where N is the record number) is calculated by the following equation (37).
  • total[N] = α·score1 + β·score2 + γ·(score3_P1 + score3_P2) + δ·(score4_P1 + score4_P2) + ε·(score5_P1a + score5_P2a)   (37)
  • Here, α, β, γ, δ, and ε are weighting factors.
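  • A sketch of the score and total-score calculation: each feature score is taken here as the sum of absolute differences between the collation index and the registered index (consistent with "a score that is the sum of the differences" earlier in the text, but still an assumption), and the weighted total follows equation (37); the dictionary keys and default weights are hypothetical.

        import numpy as np

        def feature_score(a, b) -> float:
            # Smaller score = the rough indexes agree more closely.
            return float(np.abs(np.asarray(a, dtype=float) - np.asarray(b, dtype=float)).sum())

        def total_score(probe: dict, record: dict, weights=(1.0, 1.0, 1.0, 1.0, 1.0)) -> float:
            # Weighted total score of eq. (37); alpha..epsilon are the weighting factors.
            alpha, beta, gamma, delta, eps = weights
            return (alpha * feature_score(probe["index1"], record["index1"])
                    + beta * feature_score(probe["index2"], record["index2"])
                    + gamma * (feature_score(probe["index3_p1"], record["index3_p1"])
                               + feature_score(probe["index3_p2"], record["index3_p2"]))
                    + delta * (feature_score(probe["index4_p1"], record["index4_p1"])
                               + feature_score(probe["index4_p2"], record["index4_p2"]))
                    + eps * (feature_score(probe["index5_p1a"], record["index5_p1a"])
                             + feature_score(probe["index5_p2a"], record["index5_p2a"])))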
  • FIG. 17 is a diagram for explaining the flow of 1-to-N identification processing.
  • In the collation processing, the collation data is not collated against all of the registered data; instead, the records are sorted by total score and the collation target is narrowed down to only the top m% records (S23).
  • collation between the vein data of the collation data and the vein data of the top three selected records is performed (S24).
  • the similarity between the collation data and the registered data group is calculated, and a record whose similarity exceeds a predetermined value is authenticated.
  • For example, "record 7" has a similarity of "0" and therefore fails authentication (authentication NG).
  • the records that have been authenticated are sorted in descending order by similarity (S25).
  • the user identification number (ID) of the record with the highest similarity is output as the result of this collation processing, and is recorded together with the date and time, for example, in the log file of entrance / exit.
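  • Putting the pieces together, a sketch of the 1-to-N identification flow described above (sort by total score, keep the top m%, collate in detail, accept the best record above the threshold); it reuses the total_score and similarity sketches from earlier, and the threshold value and dictionary keys are purely illustrative.

        def identify(probe: dict, records: list, m_percent: int = 6, threshold: int = 20):
            # Sort ascending by total score, since a smaller sum of differences means a closer match.
            ranked = sorted(records, key=lambda rec: total_score(probe["features"], rec["features"]))
            shortlist = ranked[:max(1, len(ranked) * m_percent // 100)]   # top m% of the collation order
            best_id, best_sim = None, -1
            for rec in shortlist:
                sim = similarity(probe["vein_image"], rec["vein_image"])  # detailed vein data collation
                if sim >= threshold and sim > best_sim:
                    best_id, best_sim = rec["user_id"], sim
            return best_id   # None means authentication failed (e.g. sound a buzzer)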
  • As described above, using five types of vein feature amounts in the vein feature amount calculation, which is the pre-processing of the vein data collation processing, improves the accuracy of the feature amounts, so that when the records are sorted, the probability that a record with high similarity is ranked high increases. For this reason, the proportion of records that must be collated after narrowing down can be reduced.
  • Whereas narrowing down to only the top 30% was previously possible, in this embodiment the target can be narrowed down to the top 6%. Because there are fewer target records, the collation processing can be speeded up, and because the collation processing is faster, the maximum number of registered vein data records can be expanded significantly.
  • Although eight index data sets are used as the vein feature amounts (V3, V4, and V5 each have two division patterns), all of these calculations are much lighter than the collation processing, so the time taken from the vein feature amount calculation to the collation order determination processing does not become long.
  • FIG. 18 is a diagram illustrating an example of a hardware configuration of a computer that realizes vein data registration processing and vein data identification processing.
  • the computer 20 includes a CPU (Central Processing Unit) 21 that controls the entire apparatus.
  • the CPU 21 is connected to a RAM (Random Access Memory) 23, a hard disk drive (HDD: Hard Disk Drive) 24, an image processing unit 25, and an input / output interface 26 via a bus 22.
  • the RAM 23 temporarily stores at least a part of an OS (Operating System) program and application programs to be executed by the CPU 21.
  • the RAM 23 stores various data and parameters necessary for processing by the CPU 21.
  • the hard disk drive 24 stores an OS and an application program for performing authentication processing.
  • a monitor 27 is connected to the image processing unit 25, and an image is displayed on the screen of the monitor 27 in accordance with a command from the CPU 21.
  • a keyboard 28, a mouse 29, an electric lock 30, and the imaging device 1 are connected to the input / output interface 26. Signals output from the keyboard 28 and mouse 29 are received by the input / output interface 26 and transmitted to the CPU 21 via the bus 22.
  • a signal of a palm vein captured image output from the imaging apparatus 1 is received by the input / output interface 26 and transferred to the RAM 23 or the hard disk drive 24 via the bus 22.
  • the electric lock 30 is installed on the door, the execution result of the application program for performing the authentication process is received via the input / output interface 26, and the locking or unlocking operation is performed depending on whether or not the personal authentication is OK. .
  • With the computer 20 configured with the hardware described above, the 1-to-N identification processing function of the present embodiment can be realized.
  • the above merely illustrates the principle of the present invention.
  • Many modifications and changes can be made by those skilled in the art; the present invention is not limited to the precise configuration and applications shown and described above, and all corresponding modifications and equivalents are regarded as falling within the scope of the invention.

Abstract

The invention relates to a biometric authentication device that achieves fast collation and an increase in the maximum number of registered data. In a vein authentication device that registers data on veins together with vein feature values and identifies veins, the collation order is determined by sorting the records according to their degree of similarity based on the registered vein feature values, and vein collation is performed in that order. As the feature values, the components at 32 frequencies obtained by Fourier transform (first vein feature values), the angle components in 12 directions obtained by Fourier transform (second vein feature values), the curve components in 36 directions (third vein feature values), the segment components in 18 directions (fourth vein feature values), and the vein amount (fifth vein feature values) are used. As a result, the accuracy with which the collation order is determined is higher, and collation against records with a lower degree of similarity can be omitted. Consequently, fast collation and an increase in the maximum number of registered data are achieved.
PCT/JP2009/068728 2009-10-30 2009-10-30 Procédé d'enregistrement d'informations biométriques, procédé d'authentification biométrique, et dispositif d'authentification biométrique WO2011052085A1 (fr)

Priority Applications (4)

Application Number Priority Date Filing Date Title
EP09850864.1A EP2495699B1 (fr) 2009-10-30 2009-10-30 Procédé d'enregistrement d'informations biométriques, procédé d'authentification biométrique, et dispositif d'authentification biométrique
JP2011538182A JP5363587B2 (ja) 2009-10-30 2009-10-30 生体情報登録方法、生体認証方法および生体認証装置
PCT/JP2009/068728 WO2011052085A1 (fr) 2009-10-30 2009-10-30 Procédé d'enregistrement d'informations biométriques, procédé d'authentification biométrique, et dispositif d'authentification biométrique
US13/448,872 US8660318B2 (en) 2009-10-30 2012-04-17 Living body information registration method, biometrics authentication method, and biometrics authentication apparatus

Applications Claiming Priority (1)

Application Number Priority Date Filing Date Title
PCT/JP2009/068728 WO2011052085A1 (fr) 2009-10-30 2009-10-30 Procédé d'enregistrement d'informations biométriques, procédé d'authentification biométrique, et dispositif d'authentification biométrique

Related Child Applications (1)

Application Number Title Priority Date Filing Date
US13/448,872 Continuation US8660318B2 (en) 2009-10-30 2012-04-17 Living body information registration method, biometrics authentication method, and biometrics authentication apparatus

Publications (1)

Publication Number Publication Date
WO2011052085A1 true WO2011052085A1 (fr) 2011-05-05

Family

ID=43921527

Family Applications (1)

Application Number Title Priority Date Filing Date
PCT/JP2009/068728 WO2011052085A1 (fr) 2009-10-30 2009-10-30 Procédé d'enregistrement d'informations biométriques, procédé d'authentification biométrique, et dispositif d'authentification biométrique

Country Status (4)

Country Link
US (1) US8660318B2 (fr)
EP (1) EP2495699B1 (fr)
JP (1) JP5363587B2 (fr)
WO (1) WO2011052085A1 (fr)

Cited By (7)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
JP2015028723A (ja) * 2013-07-30 2015-02-12 富士通株式会社 生体特徴ベクトル抽出装置、生体特徴ベクトル抽出方法、および生体特徴ベクトル抽出プログラム
WO2015147088A1 (fr) * 2014-03-25 2015-10-01 富士通フロンテック株式会社 Méthode d'enregistrement d'informations biométriques, méthode d'authentification biométrique, dispositif d'enregistrement d'informations biométriques, dispositif d'authentification biométrique, et programme
US9898673B2 (en) 2014-03-25 2018-02-20 Fujitsu Frontech Limited Biometrics authentication device and biometrics authentication method
CN108256456A (zh) * 2018-01-08 2018-07-06 杭州电子科技大学 一种基于多特征阈值融合的手指静脉识别方法
US10019617B2 (en) 2014-03-25 2018-07-10 Fujitsu Frontech Limited Biometrics authentication device and biometrics authentication method
US10019616B2 (en) 2014-03-25 2018-07-10 Fujitsu Frontech Limited Biometrics authentication device and biometrics authentication method
WO2020070590A1 (fr) * 2018-10-05 2020-04-09 株式会社半導体エネルギー研究所 Système d'authentification, et procédé pour l'enregistrement d'un historique de déverrouillage à l'aide d'un système d'authentification

Families Citing this family (8)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
JP5821964B2 (ja) * 2011-09-28 2015-11-24 富士通株式会社 情報処理装置、制御方法及びプログラム
US10043279B1 (en) * 2015-12-07 2018-08-07 Apple Inc. Robust detection and classification of body parts in a depth map
KR102515831B1 (ko) 2016-07-21 2023-03-29 삼성전자주식회사 스펙트럼 획득 장치 및 방법
US10366278B2 (en) 2016-09-20 2019-07-30 Apple Inc. Curvature-based face detector
CN106570469A (zh) * 2016-11-01 2017-04-19 北京无线电计量测试研究所 自助式双目虹膜注册方法及利用该方法的虹膜注册装置
US10635884B2 (en) * 2017-06-08 2020-04-28 Moqi Inc. System and method for fingerprint recognition
CN108875621B (zh) * 2018-06-08 2023-04-18 平安科技(深圳)有限公司 图像处理方法、装置、计算机设备及存储介质
CN110192843B (zh) * 2019-05-31 2022-04-15 Oppo广东移动通信有限公司 信息推送方法及相关产品

Citations (4)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
JPS54105929A (en) * 1978-02-07 1979-08-20 Nippon Telegr & Teleph Corp <Ntt> Hierarchical discrimination process system for character pattern
JPS5866176A (ja) * 1981-10-16 1983-04-20 Toshiba Corp 図形認識装置
JP2007206991A (ja) * 2006-02-02 2007-08-16 Hitachi Ltd 生体情報処理装置及び生体情報処理プログラム
JP2007249339A (ja) 2006-03-14 2007-09-27 Fujitsu Ltd 生体認証方法及び生体認証装置

Family Cites Families (8)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
GB2108306B (en) 1981-10-16 1985-05-15 Tokyo Shibaura Electric Co Pattern recognition apparatus and method
JP3695899B2 (ja) 1997-06-04 2005-09-14 三菱電機株式会社 指紋照合装置
JP2001243465A (ja) 2000-03-01 2001-09-07 Nippon Telegr & Teleph Corp <Ntt> 指紋画像照合方法および指紋画像照合装置
US6950536B2 (en) * 2002-01-25 2005-09-27 Houvener Robert C High volume mobile identity verification system and method using tiered biometric analysis
JP2006277407A (ja) 2005-03-29 2006-10-12 Sanyo Electric Co Ltd 照合方法および照合装置
JP4864632B2 (ja) * 2006-10-12 2012-02-01 株式会社リコー 画像入力装置、画像入力方法、個人認証装置及び電子機器
WO2008120317A1 (fr) * 2007-03-28 2008-10-09 Fujitsu Limited Dispositif de vérification, dispositif d'authentification, procédé de vérification, procédé d'authentification, programme de vérification et programme d'authentification
JP5045344B2 (ja) 2007-09-28 2012-10-10 ソニー株式会社 登録装置、登録方法、認証装置及び認証方法

Patent Citations (4)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
JPS54105929A (en) * 1978-02-07 1979-08-20 Nippon Telegr & Teleph Corp <Ntt> Hierarchical discrimination process system for character pattern
JPS5866176A (ja) * 1981-10-16 1983-04-20 Toshiba Corp 図形認識装置
JP2007206991A (ja) * 2006-02-02 2007-08-16 Hitachi Ltd 生体情報処理装置及び生体情報処理プログラム
JP2007249339A (ja) 2006-03-14 2007-09-27 Fujitsu Ltd 生体認証方法及び生体認証装置

Non-Patent Citations (1)

* Cited by examiner, † Cited by third party
Title
See also references of EP2495699A4

Cited By (11)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
JP2015028723A (ja) * 2013-07-30 2015-02-12 富士通株式会社 生体特徴ベクトル抽出装置、生体特徴ベクトル抽出方法、および生体特徴ベクトル抽出プログラム
WO2015147088A1 (fr) * 2014-03-25 2015-10-01 富士通フロンテック株式会社 Méthode d'enregistrement d'informations biométriques, méthode d'authentification biométrique, dispositif d'enregistrement d'informations biométriques, dispositif d'authentification biométrique, et programme
JP2015185046A (ja) * 2014-03-25 2015-10-22 富士通フロンテック株式会社 生体情報登録方法、生体認証方法、生体情報登録装置、生体認証装置及びプログラム
US9898673B2 (en) 2014-03-25 2018-02-20 Fujitsu Frontech Limited Biometrics authentication device and biometrics authentication method
US10019617B2 (en) 2014-03-25 2018-07-10 Fujitsu Frontech Limited Biometrics authentication device and biometrics authentication method
US10019616B2 (en) 2014-03-25 2018-07-10 Fujitsu Frontech Limited Biometrics authentication device and biometrics authentication method
CN108256456A (zh) * 2018-01-08 2018-07-06 杭州电子科技大学 一种基于多特征阈值融合的手指静脉识别方法
CN108256456B (zh) * 2018-01-08 2020-04-07 杭州电子科技大学 一种基于多特征阈值融合的手指静脉识别方法
WO2020070590A1 (fr) * 2018-10-05 2020-04-09 株式会社半導体エネルギー研究所 Système d'authentification, et procédé pour l'enregistrement d'un historique de déverrouillage à l'aide d'un système d'authentification
US11847195B2 (en) 2018-10-05 2023-12-19 Semiconductor Energy Laboratory Co., Ltd. Authentication system and method for recording unlocking history using authentication system
JP7472029B2 (ja) 2018-10-05 2024-04-22 株式会社半導体エネルギー研究所 認証システム

Also Published As

Publication number Publication date
US8660318B2 (en) 2014-02-25
EP2495699B1 (fr) 2019-07-10
EP2495699A4 (fr) 2018-01-03
JPWO2011052085A1 (ja) 2013-03-14
JP5363587B2 (ja) 2013-12-11
US20120201431A1 (en) 2012-08-09
EP2495699A1 (fr) 2012-09-05

Similar Documents

Publication Publication Date Title
JP5363587B2 (ja) 生体情報登録方法、生体認証方法および生体認証装置
JP4937607B2 (ja) 生体認証方法及び生体認証装置
Ribarić et al. Multimodal biometric user-identification system for network-based applications
KR100940902B1 (ko) 손가락 기하학 정보를 이용한 바이오 인식 방법
Liliana et al. The combination of palm print and hand geometry for biometrics palm recognition
CN105760841A (zh) 一种身份识别方法及系统
Shawkat et al. The new hand geometry system and automatic identification
EP3859663A1 (fr) Dispositif de reconnaissance d&#39;iris, procédé de reconnaissance d&#39;iris et support de stockage
Chaudhari et al. Implementation of minutiae based fingerprint identification system using crossing number concept
Aleem et al. Fast and accurate retinal identification system: Using retinal blood vasculature landmarks
Mukhaiyar Cancellable biometric using matrix approaches
Doroz et al. An accurate fingerprint reference point determination method based on curvature estimation of separated ridges
Sarfraz Introductory chapter: On fingerprint recognition
Bhattacharyya et al. Vascular Pattern Analysis towards Pervasive Palm Vein Authentication.
Malik et al. Personal authentication using palmprint with Sobel code, Canny edge and phase congruency feature extraction method
EP4167179A1 (fr) Procédé d&#39;authentification, programme d&#39;authentification et dispositif de traitement d&#39;informations
CN105701411A (zh) 一种信息安全传输方法
Li Fingerprint identification by improved method of minutiae matching
Sharma et al. Fingerprint matching Using Minutiae Extraction Techniques
Chopra et al. Finger print and finger vein recognition using repeated line tracking and minutiae
Nestorovic et al. Extracting unique personal identification number from iris
Bahmed et al. A survey on hand biometry
Olajide et al. A ear and tongue based multimodal access control system
Patil et al. Multimodal biometric identification system: Fusion of Iris and fingerprint
Al-khassaweneh et al. A hybrid system of iris and fingerprint recognition for security applications

Legal Events

Date Code Title Description
121 Ep: the epo has been informed by wipo that ep was designated in this application

Ref document number: 09850864

Country of ref document: EP

Kind code of ref document: A1

ENP Entry into the national phase

Ref document number: 2011538182

Country of ref document: JP

Kind code of ref document: A

WWE Wipo information: entry into national phase

Ref document number: 2009850864

Country of ref document: EP

NENP Non-entry into the national phase

Ref country code: DE