WO2019142479A1 - Biometric authentication device - Google Patents

Biometric authentication device

Info

Publication number
WO2019142479A1
Authority
WO
WIPO (PCT)
Prior art keywords
image
finger
biometric
biometric authentication
authentication device
Prior art date
Application number
PCT/JP2018/042458
Other languages
French (fr)
Japanese (ja)
Inventor
渓一郎 中崎
三浦 直人
友輔 松田
洋 野々村
長坂 晃朗
宮武 孝文
Original Assignee
株式会社日立産業制御ソリューションズ
Application filed by 株式会社日立産業制御ソリューションズ
Publication of WO2019142479A1 publication Critical patent/WO2019142479A1/en

Classifications

    • G PHYSICS
    • G06 COMPUTING; CALCULATING OR COUNTING
    • G06T IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
    • G06T1/00 General purpose image data processing
    • G06T7/00 Image analysis

Definitions

  • the present invention relates to a biometric authentication apparatus that authenticates an individual using a living body.
  • finger vein authentication is known as one that can realize highly accurate authentication.
  • Finger vein authentication uses the blood vessel patterns inside the finger to achieve excellent authentication accuracy. Since finger vein patterns are more difficult to forge and tamper with than fingerprints, high security can be realized.
  • in recent years, there have been increasing cases of securing devices such as mobile phones, laptop PCs (Personal Computers), smartphones and tablet terminals, lockers, safes, and printers by installing a biometric authentication device in them. The fields to which biometric authentication is applied have also expanded in recent years beyond entry/exit management, attendance management, login to a computer, and the like. In particular, it is important for biometric authentication devices used by the general public to realize reliable personal authentication. Furthermore, in view of the recent spread of tablet-type portable terminals and the trend toward wearable computing, realizing miniaturization of the device while securing the convenience described above is also one of the important requirements.
  • Patent Document 1 discloses a biometric authentication technology that extracts a plurality of features from a narrow-area biometric image acquired by a small device and uses them for authentication.
  • Patent Document 2 discloses a biometric authentication technology that acquires a wide-area biological image with a compact device and realizes authentication robust against posture change.
  • in Patent Document 1, a plurality of visible light sources having different wavelengths illuminate a finger, a vein pattern and a fat pattern are extracted as biological features from an image obtained by photographing the reflected light, and high-accuracy authentication is performed by collating those features with each other in an efficient manner.
  • however, the characteristics of each feature are not considered.
  • in Patent Document 2, near-infrared and green light sources illuminate a finger, patterns such as the vein pattern, joint pattern, fat pattern, and finger outline are extracted from an image obtained by photographing the reflected light, and those patterns are combined and matched. However, no study has been made of a specific method for combining those patterns.
  • the object of the present invention is to solve the above-mentioned problems and to provide a biometric authentication device that realizes stable, high-accuracy authentication by efficiently combining and collating features in consideration of the characteristics of a plurality of biometric features.
  • an image input unit for capturing a living body to acquire a living body image
  • an authentication processing unit for processing the acquired living body image to perform biometric authentication
  • a storage unit for storing registration information about the feature
  • the authentication processing unit combines a plurality of biological features having different spatial characteristics obtained by the processing, performs alignment, and performs biometric authentication using the aligned biological features and the registration information.
  • according to the present invention, it is possible to provide an authentication device that performs stable, high-accuracy authentication by efficiently combining a plurality of biometric features in consideration of their characteristics.
  • FIG. 1 is a diagram illustrating an example of the entire configuration of a biometric authentication system according to a first embodiment.
  • FIG. 2 is a diagram illustrating an example of the configuration of an input device according to a first embodiment.
  • FIG. 7 is a diagram illustrating an example of a processing flow at the time of registration of the first embodiment.
  • FIG. 8 is a diagram showing an example of a process flow at the time of authentication in the first embodiment.
  • FIG. 8 is a diagram showing an example of a process flow of verification in the process flow at the time of authentication in the first embodiment.
  • FIG. 3 is a view showing an example of a photographed image of Example 1 and an image obtained as a result of feature extraction.
  • FIG. 16 is a schematic view showing an example of a configuration for photographing and authenticating a living body with a built-in camera of a mobile terminal according to a second embodiment.
  • FIG. 18 is a diagram illustrating an example of a process flow at the time of registration of the second embodiment.
  • FIG. 16 is a view showing an example of a photographed image of Example 2 and an image obtained as a result of feature extraction.
  • FIG. 18 is a diagram illustrating an example of a process flow at the time of authentication of the second embodiment.
  • FIG. 18 is a diagram showing an example of a process flow of verification in the process flow at the time of authentication in the second embodiment.
  • FIG. 18 is a diagram showing a configuration of a transmitted light system of an input device used for data collection for optimization of feature extraction processing in the third embodiment.
  • FIG. 18 is a diagram showing an example of a configuration of an input device used at the time of registration and authentication in the third embodiment.
  • FIG. 18 is a diagram showing a configuration of a reflected light scheme of an input device used for data collection for optimization of feature extraction processing in the third embodiment.
  • FIG. 18 is a diagram illustrating an example of a process flow of optimization of extraction of biological features in the third embodiment.
  • FIG. 18 is a diagram illustrating an example of a process flow of optimization of the image converter A in the third embodiment.
  • FIG. 18 is a diagram illustrating an example of image conversion by an image converter B in Embodiment 3.
  • the first embodiment includes an image input unit that captures a living body and acquires a biometric image, an authentication processing unit that processes the acquired biometric image to perform biometric authentication, and a storage unit that stores registration information on biometric features obtained from the biometric image.
  • the authentication processing unit combines a plurality of biological features with different spatial characteristics obtained by the processing, performs alignment, and performs biometric authentication using the aligned biological features and the registration information.
  • here, a spatial characteristic is a characteristic of the spatial distribution of the dye concentration in the image of the living body; the present embodiment is an example of a biometric authentication apparatus that performs alignment by combining a plurality of biological features having different spatial characteristics and then performs collation against the registration information.
  • FIG. 1 is a diagram illustrating an example of an entire configuration of a biometric authentication system using a blood vessel of a finger according to a first embodiment.
  • the configuration of the present embodiment is not limited to the system configuration shown in FIG. 1; it goes without saying that it may be a device in which all or part of the components are mounted in a single housing.
  • the apparatus may be a biometric authentication apparatus including an authentication process, or the authentication process may be performed outside the apparatus and may be a blood vessel image acquisition apparatus specialized for acquiring a blood vessel image or a blood vessel image extraction apparatus.
  • the embodiment may be a terminal such as a smartphone.
  • all the embodiments including the biometric authentication system may be collectively referred to as a biometric authentication device.
  • the biometric authentication system includes an input device 2, an authentication processing unit 3, a storage unit 4, a display unit 5, an input unit 6, an audio output unit 7, and an image input unit 8.
  • the input device 2 includes the light source 9 installed in the housing and the imaging device 10 installed inside the housing, and inputs a living body image to the authentication processing unit 3 through the image input unit 8.
  • the image input unit 8 acquires a living body image captured by the imaging device 10 of the input device 2, and inputs the acquired living body image to the authentication processing unit 3. Therefore, in the present specification, the input device 2 and the image input unit 8 may be collectively referred to as an image input unit.
  • the authentication processing unit 3 is a generic name for processing units that execute processing related to biometric authentication. As functional processing units, it includes a determination unit that determines, from an image, the distance between the living body (finger) and the system or the posture of the living body (finger); a state control unit that instructs the display unit and the like to prompt correction of that distance or posture; an unnecessary-information removal unit that removes unnecessary information (wrinkles, background, etc.) from the captured image; a feature extraction unit that extracts biometric feature information from the captured image; and a matching unit that matches the extracted biometric feature information against registration information stored in advance in the storage unit.
  • the light source 9 disposed in the input device 2 is, for example, a light emitting element such as a light emitting diode (LED), and emits light to the finger 1 presented on the upper portion of the input device 2.
  • the imaging device 10 captures an image of the finger 1 presented to the input device 2.
  • the finger 1 to be presented may be not only one but a plurality.
  • the authentication processing unit 3 includes, as its hardware configuration, a central processing unit (CPU: Central Processing Unit) 11, a memory 12 and various interfaces (IF) 13.
  • the interface 13 connects the authentication processing unit 3 to an external device. Specifically, the interface 13 connects the input device 2, the storage unit 4, the display unit 5, the input unit 6, the audio output unit 7, the image input unit 8 and the like to the CPU 11, the memory 12 and the like.
  • the storage unit 4 stores in advance registration data of the user.
  • the registration data is information for matching the user, and is, for example, an image of a finger vein pattern.
  • the image of the finger vein pattern is an image obtained by imaging blood vessels (finger veins) mainly distributed subcutaneously on the palm side of the finger as a dark shadow pattern.
  • the authentication processing unit 3 has a function of extracting, from the image of the finger vein pattern, a plurality of biological features having different spatial characteristics, i.e., characteristics of the spatial distribution of the dye concentration in the image of the finger, and collating them with the registration information.
  • the display unit 5 is, for example, a liquid crystal display (LCD), and is an output device that displays various information received from the authentication processing unit 3.
  • the input unit 6 is, for example, a keyboard, and transmits information input by the user to the authentication processing unit 3.
  • the voice output unit 7 is an output device that transmits information received from the authentication processing unit 3 as an acoustic signal such as voice.
  • the display unit 5 and the voice output unit 7 are instruction units for instructing the user who uses the biometric authentication system to correct the distance between the living body (finger) and the system and the posture of the living body (finger).
  • This embodiment is an example, and the present embodiment is not limited to this device configuration.
  • the authentication processing unit described above may perform all processing by one CPU, or may use a CPU for each function processing unit.
  • FIG. 2 is a diagram for explaining an example of a specific structure of the input device 2 of the biometric authentication system of the first embodiment.
  • the input device 2 captures biological features such as blood vessels (finger veins) distributed on the surface of the finger or under the skin.
  • the input device 2 is enclosed by a device housing 14, and one imaging device 10 is disposed in the inside thereof.
  • the plurality of infrared light sources 9 are annularly arranged around the imaging device 10, and can uniformly illuminate the finger 1 through the opening.
  • the infrared light source 9 emits light of an infrared wavelength.
  • the infrared light source 9 can emit light with an arbitrary intensity.
  • as an example of a specific wavelength, 850 nm is selected for the infrared light source 9.
  • An acrylic material 15 is inserted into the opening to prevent dust and the like from intruding into the inside of the apparatus, and has an effect of physically protecting members inside the apparatus.
  • a polarizing plate A16 and a polarizing plate B17 are inserted between the imaging device 10 and the acrylic material 15 and between the light source 9 and the acrylic material 15, respectively.
  • the polarizing plate A is a polarizing plate that polarizes the P wave
  • the polarizing plate B is a polarizing plate that polarizes the S wave.
  • a polarizing plate that polarizes the S wave may be used as the polarizing plate A
  • a polarizing plate that polarizes the P wave may be used as the polarizing plate B.
  • by inserting the polarizing plates (a generic name for the polarizing plate A16 and the polarizing plate B17), it is possible to suppress the light reflected at the surface of the finger 1, among the light irradiated by the infrared light source 9, from being received by the imaging device 10.
  • the imaging device 10 of the present embodiment is a monochrome camera, and has a light receiving element having sensitivity in only the wavelength band of infrared light.
  • the imaging device 10 may use a monochrome camera or a color camera having a light receiving element having sensitivity to the wavelength band of infrared light and visible light.
  • in that case, an optical filter, for example a band-pass filter or a low-pass filter that blocks visible light, is inserted in front of or inside the camera lens so that only the wavelength band of infrared light is received by the light receiving element.
  • FIG. 3 shows an example of the processing flow at the time of registration in the biometric authentication system of the present embodiment, in which authentication is performed by efficiently combining a plurality of biometric features based on their spatial characteristics.
  • the user presents a finger to the input device 2 of the system, and the system shoots the finger with the camera of the imaging device 10 while emitting infrared light (S11).
  • if luminance saturation occurs, the exposure time or the irradiation light amount of the light source may be adjusted, setting an exposure time or an irradiation light amount at which the luminance saturation disappears.
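The saturation-driven exposure adjustment described above can be sketched as follows. This is an illustrative reading rather than the patent's implementation: the `capture` callable, the saturation level of 250, and the allowed saturated-pixel fraction of 1% are all assumptions for the sketch.

```python
import numpy as np

def adjust_exposure(capture, exposure_ms=8.0, sat_level=250,
                    max_frac=0.01, min_exposure_ms=0.5):
    """Halve the exposure time until (almost) no pixels are saturated.

    `capture` is a hypothetical callable: capture(exposure_ms) -> uint8 image.
    Returns the chosen exposure time and the last captured image.
    """
    img = capture(exposure_ms)
    # keep halving while too many pixels sit at/above the saturation level
    while np.mean(img >= sat_level) > max_frac and exposure_ms / 2.0 >= min_exposure_ms:
        exposure_ms /= 2.0
        img = capture(exposure_ms)
    return exposure_ms, img
```

The same loop could equally drive the light-source intensity instead of the exposure time, as the text allows either knob.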
  • the position and posture of the finger are detected from the image captured by the camera (S12).
  • the posture information of the finger includes the position of the fingertip or the root of the finger, and the image of each finger is cut out using the positional information of one or more fingers to be authenticated.
  • a plurality of biological features having different spatial characteristics are extracted from the finger image (S13).
  • whether or not the predetermined registration quality is satisfied may be judged, for example, by whether the density of the extracted vein pattern or the amount of change in the pattern falls within a predetermined range. If the extracted biometric feature does not meet the predetermined quality, the finger is imaged again. If the extracted biometric feature satisfies the predetermined quality, it is stored as registration data, i.e., the registration information (S15).
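The density-based quality gate mentioned above might look like the following sketch; the density bounds are invented for illustration, not taken from the patent.

```python
import numpy as np

def meets_registration_quality(vein_mask, min_density=0.02, max_density=0.30):
    """Accept a binary vein-pattern image only if the fraction of pattern
    pixels lies in a plausible range: too sparse suggests a failed
    extraction, too dense suggests noise mistaken for pattern."""
    density = float(np.mean(vein_mask > 0))
    return min_density <= density <= max_density
```

If this returns False, the flow loops back to re-imaging the finger, as the text describes.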
  • FIG. 4A is an example of a process flow at the time of authentication of the biometric authentication system of the present embodiment, in which authentication is performed by efficiently combining a plurality of biometric features, focusing on spatial characteristics of biometric features.
  • the processing flow from the photographing of the finger (S11), the detection of the finger position / posture from the photographed image (S12), and the extraction of the biometric feature from the finger image (S13) is the same as at the time of registration.
  • after extraction of the biometric features, matching of the registration data with the biometric features is performed (S16).
  • verification is carried out for all combinations of registration data and authentication data, and the authentication decision is made for the combination with the smallest dissimilarity score obtained (S17). That is, if the calculated dissimilarity score is below the threshold set in advance, the authentication succeeds (Yes) and the process ends. If the dissimilarity score exceeds the threshold, the authentication fails (No) and the process returns to photographing of the finger.
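The 1:N decision rule in S17 reduces to a few lines; this sketch assumes the dissimilarity scores have already been computed per registered ID.

```python
def authenticate(dissimilarities, threshold):
    """1:N decision: pick the registered ID with the smallest
    dissimilarity score and accept only if it is below the threshold.
    `dissimilarities` maps registered ID -> dissimilarity score."""
    best_id = min(dissimilarities, key=dissimilarities.get)
    return best_id, dissimilarities[best_id] < threshold
```

The threshold trades false acceptance against false rejection and would be fixed in advance on enrollment data.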
  • FIG. 4B shows an example of the process flow of verification (S16) in the process flow at the time of authentication of the system of this embodiment.
  • examples of a plurality of biological features having different spatial characteristics include vein patterns, joint patterns, wrinkles of the epidermis, and fat patterns.
  • the vein pattern and the joint pattern appear relatively clearly in the photographed image acquired by the system configuration of the present embodiment.
  • the vein pattern has a line pattern flowing in the long axis direction of the finger
  • the joint pattern has a line pattern flowing in the short-axis direction of the finger. Therefore, by applying a filter that extracts line patterns in a specific direction, such as a Gabor filter, to the captured image, the vein pattern and the joint pattern can be separated from each other and extracted.
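The directional separation with Gabor filters can be sketched in plain numpy; in practice a library kernel such as OpenCV's `getGaborKernel` would be used, and the kernel size, sigma, and wavelength here are assumed values, not the patent's.

```python
import numpy as np

def gabor_kernel(theta, size=9, sigma=2.0, lam=4.0):
    """Real Gabor kernel whose carrier varies along direction `theta`,
    so it responds to stripes perpendicular to that direction."""
    half = size // 2
    y, x = np.mgrid[-half:half + 1, -half:half + 1]
    xr = x * np.cos(theta) + y * np.sin(theta)
    yr = -x * np.sin(theta) + y * np.cos(theta)
    g = np.exp(-(xr**2 + yr**2) / (2 * sigma**2)) * np.cos(2 * np.pi * xr / lam)
    return g - g.mean()  # zero mean: flat regions give no response

def filter_response(img, kern):
    """Mean absolute 'valid' correlation response (plain nested loops)."""
    kh, kw = kern.shape
    h, w = img.shape
    out = np.zeros((h - kh + 1, w - kw + 1))
    for i in range(out.shape[0]):
        for j in range(out.shape[1]):
            out[i, j] = np.sum(img[i:i + kh, j:j + kw] * kern)
    return np.mean(np.abs(out))
```

Applying kernels at two orientations and keeping the stronger response per pixel would split an image into its long-axis (vein-like) and short-axis (joint-like) line components.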
  • the difference between the feature of the authentication data obtained in the extraction of the biometric feature (S13) and the biometric feature of the registered data stored inside or outside the device is calculated.
  • the calculation of the degree of difference is performed for all registered IDs (0 ≤ i < N) (S161).
  • the degree of difference is calculated using one or more biometric features (S162).
  • part of the finger position/posture information between the registration data and the authentication data is normalized with the first biological feature (S163), then the rest of the finger position/posture information is normalized with the second biological feature, and the degree of difference is calculated with the second biological feature (S164).
  • in the present embodiment, the biological features to be handled are two-dimensional images, and the joint pattern 19 and the vein pattern 20 extracted from the finger image 18 are used.
  • first, the amount of positional deviation between the registration data and the authentication data in the long-axis direction of the finger is determined using the joint pattern 19, which is the first biological feature.
  • next, the positional deviation is corrected based on the positional deviation amount in the long-axis direction of the finger, and then the positional deviation amount in the short-axis direction of the finger is determined using the vein pattern 20, which is the second biological feature.
  • the positional deviation is corrected based on the positional deviation amount in the short-axis direction of the finger, and then the degree of difference between the data of the vein pattern 20, which is the second biological feature, is calculated; this is taken as the first degree of difference.
  • next, the roles of the joint pattern 19 and the vein pattern 20 are exchanged: position correction of the finger in the short-axis direction is performed with the vein pattern 20 as the second biological feature, and position correction in the long-axis direction is performed with the joint pattern 19 as the first biological feature.
  • after the position correction, the degree of difference between the data of the joint pattern 19, which is the first biological feature, is calculated; this is taken as the second degree of difference.
  • finally, the first degree of difference and the second degree of difference determined for the vein pattern 20 and joint pattern 19 data are synthesized, the synthesized degree of difference is determined, and biometric authentication is executed. That is, in the collation processing of the present embodiment, in consideration of the spatial characteristics of each of the plurality of biological features such as the joint pattern and the vein pattern, position correction with the joint pattern 19 is performed only in the long-axis direction of the finger, position correction with the vein pattern 20 is performed only in the short-axis direction, the synthesized degree of difference is obtained, and comparison with the registration information is performed.
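The axis-wise, two-stage alignment just described can be sketched as follows. This is a simplified reading: mean absolute difference stands in for whatever dissimilarity the patent intends, `np.roll` stands in for proper image translation, and the shift range is an assumed value.

```python
import numpy as np

def best_shift_1d(reg, probe, axis, max_shift=4):
    """Shift `probe` along one axis and return the offset minimising
    the mean absolute difference against `reg`."""
    best, best_d = 0, np.inf
    for s in range(-max_shift, max_shift + 1):
        d = np.mean(np.abs(reg - np.roll(probe, s, axis=axis)))
        if d < best_d:
            best, best_d = s, d
    return best

def two_stage_dissimilarity(joint_reg, joint_probe, vein_reg, vein_probe):
    """Long-axis (column) alignment from the joint pattern, then
    short-axis (row) alignment from the vein pattern, then the vein
    dissimilarity at the aligned position."""
    sx = best_shift_1d(joint_reg, joint_probe, axis=1)   # long axis
    vein_probe = np.roll(vein_probe, sx, axis=1)
    sy = best_shift_1d(vein_reg, vein_probe, axis=0)     # short axis
    vein_probe = np.roll(vein_probe, sy, axis=0)
    return np.mean(np.abs(vein_reg - vein_probe))
```

Each feature is only trusted for the axis along which it actually constrains the shift, which is the point of the patent's combination.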
  • suppose, by contrast, that position correction in both the short-axis and long-axis directions of the finger were performed only with the vein pattern 20, and that the degree of difference of the vein pattern 20 were calculated based on that position-correction result.
  • if the vein pattern 20 had patterns flowing in both the short-axis and long-axis directions of the finger, the amount of positional deviation between the registration data and the authentication data could be determined accurately by template matching.
  • in template matching, the degree of difference is repeatedly calculated while shifting the positions of the two images, and the position at which the degree of difference is minimum is determined.
  • ideally, the position at which the degree of difference between the data is smallest is the position at which the position and posture of the finger substantially match between the data.
  • in practice, however, since the vein pattern flows mainly in the long-axis direction, the position at which the degree of difference between the data is smallest is not necessarily the position at which the position and posture of the finger roughly match; the degree of difference computed at such a mismatched position becomes the final dissimilarity score, which is a factor in increasing false acceptance.
  • the second embodiment uses a camera, i.e., an imaging device having sensitivity to visible light that is standard equipment on a smartphone or the like, and is an example of a biometric authentication device that performs authentication under ambient light.
  • the system configuration of the second embodiment is the same as that of the first embodiment, but as shown in FIG. 6, the structure of the input device is different.
  • the input device 2 of the present embodiment includes a color camera 21 as an imaging device of the smartphone 23, and has a plurality of light receiving elements having sensitivity in the wavelength band of visible light.
  • the color camera 21 has solid-state imaging elements, for example three types of CMOS or CCD elements having sensitivity to blue (B), green (G), and red (R), and these are arranged in a grid for each pixel of the image.
  • the color camera 21 of this embodiment has a plurality of three light receiving sensors having different peak wavelengths of light receiving sensitivity.
  • each light receiving sensor is, for example, a sensor having a peak of light receiving sensitivity in the vicinity of 480 nm in blue, in the vicinity of 550 nm in green, and in the vicinity of 620 nm in red.
  • the spatial color distribution of light, i.e., the spatial characteristics that are characteristics of the spatial distribution of the dye concentration, can thereby be obtained.
  • the smartphone 23 incorporates a white light source 22 as a light source.
  • FIG. 7 is a process flow at the time of registration of a biometric authentication apparatus that performs authentication by combining a plurality of biometric features efficiently based on the spatial characteristics of biometric features in the second embodiment.
  • the processing flow at the time of registration of the second embodiment is the same as that of the first embodiment, but the extraction of the biological feature (S13) is different.
  • vein pattern, joint pattern, and wrinkles of the epidermis are extracted as features.
  • in a living body, a plurality of biological tissues such as the vein pattern, joint pattern, wrinkles of the epidermis, and fat pattern are distributed in an overlapping manner; in the present embodiment, as shown in FIG., the vein pattern 20 and the joint pattern 19 as well as the wrinkles 24 of the epidermis are extracted as biological features from the photographed image 18 of the finger.
  • veins are less reddish than other living tissues.
  • the color image is an RGB image
  • the vein pattern 20 has a pattern flowing in the long-axis direction of the finger. Therefore, by applying an edge enhancement filter that emphasizes the long-axis direction of the finger, such as an unsharp mask or a Gabor filter, to the image in which the vein color is enhanced, the vein pattern can be separated from the other biological tissues and extracted.
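One plausible reading of "enhancing the vein color" followed by an unsharp mask is sketched below; the red-share measure and the 3-tap mask are assumptions chosen for illustration, not the patent's filters.

```python
import numpy as np

def enhance_veins(rgb):
    """Emphasise the less-reddish vein regions: compute how far each
    pixel's red share falls below the image average, then sharpen
    across rows with a 1-D unsharp mask so line patterns running along
    the long axis (rows) stand out."""
    rgb = rgb.astype(np.float64)
    redness = rgb[..., 0] / (rgb.sum(axis=-1) + 1e-9)  # red share per pixel
    vein = redness.mean() - redness                    # high where less red
    # unsharp mask across rows: original plus (original minus 3-tap blur)
    blur = (np.roll(vein, 1, 0) + vein + np.roll(vein, -1, 0)) / 3.0
    return vein + (vein - blur)
```

A thresholded version of this map would then feed the binary vein-pattern feature used in matching.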
  • although the wrinkles of the epidermis are largely unstable due to dryness or lifestyle compared with the vein pattern 20 and the joint pattern 19, the deep wrinkle patterns are relatively stable and useful for authentication.
  • the above-described joint pattern extraction is performed by taking out the single-band R image, which has few mottled patterns due to capillary blood vessels and fat patterns, from the acquired color image of the finger.
  • the joint pattern is then removed from the single-band R image to obtain a joint-removed image. Wrinkles of the epidermis are the predominant living tissue shown in the joint-removed image.
  • noise may be suppressed with a noise removal filter such as a Gaussian filter or an average filter.
  • a skin pattern 25 obtained by combining the joint pattern 19 and the wrinkles 24 of the epidermis is used as a new biological feature.
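A sketch of forming the combined skin pattern from the R-band image and an already-extracted joint pattern follows; the high-pass via an average filter and the fixed darkness threshold are assumptions standing in for the noise-removal step the text leaves open.

```python
import numpy as np

def average_filter(img, k=3):
    """Simple k x k mean filter via shifted sums (wrap-around borders)."""
    acc = np.zeros_like(img, dtype=np.float64)
    r = k // 2
    for dy in range(-r, r + 1):
        for dx in range(-r, r + 1):
            acc += np.roll(np.roll(img, dy, 0), dx, 1)
    return acc / (k * k)

def skin_pattern(joint_mask, r_band):
    """High-pass the R-band image to keep fine epidermal wrinkles,
    threshold the dark fine lines, and merge them with the joint
    pattern into one combined 'skin pattern' mask."""
    detail = r_band.astype(np.float64) - average_filter(r_band)
    wrinkles = detail < -1.0  # darker than local average: a fine line
    return np.logical_or(joint_mask > 0, wrinkles)
```

The merged mask is richer in short-axis and oblique structure than the joint pattern alone, which is what makes the position/posture normalization more accurate.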
  • this is because, as described in the first embodiment, normalization of the finger position/posture information between registration data and authentication data cannot be performed accurately with the joint pattern alone; based on the spatial characteristics of the biological features, using not only the joint pattern but also the wrinkles of the epidermis as a new biological feature makes it possible to normalize the finger posture information with higher accuracy.
  • FIG. 9A is a process flow at the time of authentication of the biometric apparatus according to the second embodiment, in which authentication is performed by efficiently combining a plurality of biometric features, focusing on the spatial characteristics of the biometric features described above.
  • although the process flow at the time of authentication in the second embodiment is basically the same as that of the first embodiment, the processing content of the collation (S16) is different.
  • FIG. 9B shows the details of the process flow of verification (S16).
  • the degree of difference between the biometric feature of the authentication data obtained in the biometric feature extraction (S13) and the biometric feature of the registered data stored inside or outside the apparatus is calculated, but the calculation of the degree of difference is the same as in the first embodiment.
  • the calculation is performed for all registered IDs (0 ≤ i < N) (S161).
  • the degree of difference is calculated using one or more features (S162).
  • the position / posture information of the finger between the registration data and the authentication data is normalized by the first biometric feature (S165), and the difference between the two is calculated by the second biometric feature. (S166).
  • in the present embodiment, the biometric features to be handled are two-dimensional images.
  • an example of collation by the authentication processing unit of the apparatus of the second embodiment, using the skin pattern 25 and the vein pattern 20 extracted from the finger image 18, is as follows. First, the amount of positional deviation of the finger between the skin pattern 25 of the registration data and that of the authentication data, the skin pattern being the first biometric feature, is determined. Next, after the positional deviation is corrected based on this positional deviation amount, the degree of difference between the data of the vein pattern 20, which is the second biological feature, is calculated; this is taken as the first degree of difference.
  • then the roles of the skin pattern 25 and the vein pattern 20 are exchanged: the positional deviation of the finger is corrected with the vein pattern 20 as the second biometric feature, and the second degree of difference is calculated with the skin pattern 25 as the first biometric feature. Finally, the first and second degrees of difference determined for the vein pattern 20 and skin pattern 25 data are combined, the final combined degree of difference is determined, and biometric authentication is performed based on it.
  • in this way, by combining a plurality of biometric features so that one feature is used to align another and the features efficiently complement each other, it becomes possible to provide a biometric authentication apparatus that suppresses both acceptance of another person and rejection of the enrolled person.
  • the third embodiment is an embodiment of the biometric authentication device in which the light source and the camera are disposed on opposite sides of the finger.
  • the system configuration of this embodiment is the same as that of Embodiments 1 and 2, but the structure of the input device 2 is different as shown in FIGS. 10A and 10B.
  • in the configuration of FIG. 10A, the infrared light source 9 and the red light source 26 are disposed on the opposite side of the imaging device 10 with respect to the finger.
  • in the configuration of FIG. 10B, the infrared light source 9 is disposed on the opposite side of the imaging device 10 with respect to the finger.
  • the processing flow at the time of registration of biometric information in the biometric authentication device of the third embodiment and the processing flow at the time of authentication are basically the same as in the first embodiment, but the processing content of extraction of the biometric feature (S13) is different.
  • FIG. 12 shows the processing flow of the optimization of the biometric feature extraction (S13) of the present embodiment.
  • image reading (S131)
  • two types of finger images, in which the living tissue appears differently, can be acquired by alternately lighting the infrared light source 9 and the red light source 26 and photographing the finger.
  • in an image (infrared image) obtained by photographing the finger with the infrared light source 9, the blood vessel image appears relatively clearly, whereas in an image (red image) obtained by photographing the finger with the red light source 26, the blood vessel image is relatively unclear. A large number of these infrared and red images are acquired, and the two images taken in the same trial are treated as one pair.
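The pairing step above can be sketched as follows. The `(trial_id, wavelength, image)` record format is a hypothetical representation introduced for illustration, not one specified in the patent.

```python
def make_pairs(captures):
    # captures: list of (trial_id, wavelength, image) records; group the red
    # and infrared image captured in the same trial into one training pair
    by_trial = {}
    for trial_id, wavelength, image in captures:
        by_trial.setdefault(trial_id, {})[wavelength] = image
    # keep only trials where both wavelengths were actually captured
    return [(d["red"], d["ir"]) for d in by_trial.values()
            if "red" in d and "ir" in d]

captures = [
    (1, "ir",  "ir_img_1"), (1, "red", "red_img_1"),
    (2, "red", "red_img_2"), (2, "ir",  "ir_img_2"),
    (3, "red", "red_img_3"),           # infrared capture missing: dropped
]
pairs = make_pairs(captures)
```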
  • the image converter A is optimized (S132) using the pairs of infrared and red images read in the image reading (S131).
  • the image converter A is based on machine learning such as deep learning, and is optimized so that, receiving a red image as input, it outputs the infrared image paired with that red image.
  • the specific processing flow of the optimization (S132) of the image converter A is shown in FIG. 13.
  • updating (S1321) of image conversion parameters is repeated so that a desired output image can be obtained from the input image.
  • the image conversion parameter update (S1321) is repeated N times; here, the processing in the s-th update (S1321) is described.
  • a red image is input to the image converter A, and image conversion (S1322) is performed.
  • the distance between the output image produced by the image converter A in the image conversion (S1322) and the correct-answer infrared image paired with the input red image is calculated (S1323).
  • the measure of the distance may be the Manhattan distance, the Euclidean distance, or a distance determined by adversarial learning.
  • the update amount of the image conversion parameter is calculated (S1324).
  • the update amount of the parameters is determined by the distance measure obtained above and the image conversion method.
  • the image conversion parameters are updated (S1325).
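The S1321–S1325 loop can be sketched in miniature. The real converter would be a deep network; here it is reduced to a single gain parameter `w` applied to each pixel, with the Manhattan (L1) distance as the measure, so that the parameter-update mechanics can be shown in a few lines. All numbers are illustrative assumptions.

```python
def sign(v):
    return (v > 0) - (v < 0)

# one training pair: the "infrared" target is a brighter version of the "red" input
red_image = [1.0, 2.0, 3.0]
ir_image  = [2.0, 4.0, 6.0]

w, lr = 0.5, 0.01                      # initial parameter, learning rate
for _ in range(200):                   # repeat the parameter update N times
    output = [w * p for p in red_image]            # image conversion (S1322)
    # Manhattan distance to the paired correct infrared image (S1323)
    dist = sum(abs(o - t) for o, t in zip(output, ir_image))
    # update amount from the distance measure (S1324): subgradient of L1
    grad = sum(sign(o - t) * p for o, t, p in zip(output, ir_image, red_image))
    w -= lr * grad                                 # parameter update (S1325)

# w converges near 2.0, the gain that maps red_image onto ir_image
```

In the patent's setting the same loop runs over many image pairs and many network parameters, but the structure of one iteration is the same: convert, measure the distance, derive an update amount, apply it.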
  • by iteratively updating the parameters of the image converter A using a large number of image pairs, it is possible to construct an image converter A that can be applied universally to any hand image. As a result, the image converter A receives a transmitted-light image of the finger and outputs a transmitted-light image of the finger in which the blood vessel image is sharpened.
  • the authentication processing unit 3 first pairs a red image of a finger with an infrared image, inputs the red image to the image converter A, and obtains the paired infrared image as output.
  • the image conversion parameters are updated to optimize the image converter A.
  • the infrared image collected by the apparatus shown in FIG. 10A is input to the image converter A, and an infrared image in which the blood vessel image is sharpened is obtained as the output of the image conversion.
  • generation of a vein pattern image from an infrared image is then performed.
  • the image converter B is based on machine learning such as deep learning, and is optimized so that, when an infrared image collected by the apparatus shown in FIG. 10A is input, it outputs the vein pattern image generated from the sharpened infrared image obtained in the image conversion (S133) by the image converter A.
  • the specific processing flow of the optimization (S134) of the image converter B may be the same as that of the optimization of the image converter A shown in FIG. 13.
  • the image converter B outputs a vein pattern image in which the veins are sharpened when an infrared image is input.
  • the image conversion (S135) by the image converter B shown in FIG. 14 is performed.
  • finger detection (S12) is performed on the image obtained by photographing (S11) the transmitted light of the infrared light irradiating the finger from the infrared light source 9 shown in FIG. 10B, and a clear vein pattern image is obtained by inputting the resulting infrared image into the image converter B.
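A sketch of how the two converters relate: converter A is used only to prepare training targets for converter B, while at registration and authentication time only converter B runs on the raw infrared image. Both converters below are hypothetical stand-ins (a fixed gain and a threshold), not the learned networks of the patent.

```python
def converter_a(ir_image):
    # stands in for the learned sharpening of the blood-vessel image
    return [2.0 * p for p in ir_image]

def vein_pattern(sharpened):
    # stands in for generating a binary vein pattern image
    return [1 if p > 1.0 else 0 for p in sharpened]

def converter_b(ir_image):
    # after optimization (S134), approximates vein_pattern(converter_a(...))
    return vein_pattern(converter_a(ir_image))

raw_ir = [0.3, 0.8, 0.2]
target = vein_pattern(converter_a(raw_ir))  # training target for converter B
deployed = converter_b(raw_ir)              # image conversion (S135) at run time
```

The design point this illustrates is that the red light source and converter A are needed only during data collection and training; the deployed device of FIG. 10B needs only the infrared source and converter B.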
  • as shown in FIG. 11, the configuration of the imaging apparatus may instead be a reflected-light method in which the camera and the light source are on the same side with respect to the finger, as in FIG. 2 of Embodiment 1.
  • the present invention is not limited to the embodiments described above, but includes various modifications.
  • the embodiments described above have been described in detail for better understanding of the present invention, and the present invention is not necessarily limited to those having all the configurations described.
  • part of the configuration of one embodiment can be replaced with the configuration of another embodiment, and the configuration of another embodiment can be added to the configuration of one embodiment.
  • each configuration, function, authentication processing unit, etc. described above may be realized in part or in whole by creating a program, or, needless to say, may be realized by hardware, for example by designing part or all of them as an integrated circuit. That is, all or part of the functions of the processing unit may be realized by an integrated circuit such as an application specific integrated circuit (ASIC) or a field programmable gate array (FPGA) instead of a program.


Abstract

In order to achieve stable and highly precise biometric authentication by efficiently combining and collating biometric features while taking into consideration the characteristics of a plurality of biometric features, a biometric authentication device comprises: an image input unit that acquires a biometric image by photographing a living body; an authentication processing unit that processes the acquired biometric image and carries out biometric authentication; and a storage unit that stores registration information pertaining to a plurality of biometric features obtained from the biometric image. The authentication processing unit carries out the biometric authentication by combining a plurality of biometric features having different spatial characteristics, obtained by the processing, to perform alignment, and by using the aligned biometric features and the registration information to carry out a degree-of-difference calculation (S161–S164).

Description

Biometric authentication device
The present invention relates to a biometric authentication apparatus that authenticates an individual using a living body.
Among various biometric authentication techniques, finger vein authentication is known as one that can realize highly accurate authentication. Finger vein authentication uses the blood vessel patterns inside the finger and achieves excellent authentication accuracy. Since finger vein authentication is more difficult to forge and tamper with than fingerprint authentication, it can realize a high level of security.
In recent years, cases have been increasing in which the security of devices such as mobile phones, laptop PCs (Personal Computers), portable terminals such as smartphones and tablet terminals, lockers, safes, and printers is secured by installing a biometric authentication device on each device. In addition to entry and exit management, attendance management, and login to computers, biometric authentication has in recent years also come to be used in fields such as payment. In particular, for biometric authentication devices used in public, it is important to realize reliable personal authentication. Furthermore, in view of the recent spread of tablet-type portable terminals and the trend toward wearable computing, realizing miniaturization of the device while securing the above-mentioned convenience is also one of the important requirements.
Patent Document 1 discloses a biometric authentication technology that extracts a plurality of features from a biometric image of a narrow area acquired by a small device and uses them for authentication.
Patent Document 2 discloses a biometric authentication technology that acquires a biometric image of a wide area with a compact device and realizes authentication that is robust against posture variation.
Patent Document 1: JP 2016-96987 A
Patent Document 2: JP 2017-91186 A
In order to realize a compact, easy-to-use, and highly accurate personal authentication device, it is important to combine a plurality of biometric features efficiently and use them for authentication. Patent Document 1 proposes a technique in which a finger is illuminated with a plurality of visible light sources having different wavelengths, vein-pattern and fat-pattern features are extracted as biometric features from an image obtained by photographing the reflected light, and high-accuracy authentication is performed by collating these features in a mutually complementary manner. However, with regard to the combined matching of a plurality of biometric features, the characteristics of each feature are not taken into consideration.
Patent Document 2 mentions illuminating a finger with near-infrared and green light sources, extracting patterns such as the vein pattern, joint pattern, fat pattern, and finger outline from an image obtained by photographing the reflected light, and combining those patterns for matching. However, no study has been made of a specific method of combining those patterns.
An object of the present invention is to solve the above problems and to provide a biometric authentication device that realizes stable and highly accurate authentication by taking the characteristics of a plurality of biometric features into consideration and efficiently combining and collating those features.
To achieve the above object, the present invention provides a biometric authentication device comprising: an image input unit that photographs a living body to acquire a biometric image; an authentication processing unit that processes the acquired biometric image to perform biometric authentication; and a storage unit that stores registration information on biometric features obtained from the biometric image, wherein the authentication processing unit performs alignment by combining a plurality of biometric features having different spatial characteristics obtained by the processing, and performs biometric authentication using the aligned biometric features and the registration information.
According to the present invention, it is possible to provide a biometric authentication device that performs stable and highly accurate authentication by efficiently combining a plurality of biometric features in consideration of their characteristics.
FIG. 1 is a diagram showing an example of the overall configuration of the biometric authentication system of Embodiment 1.
FIG. 2 is a diagram showing an example of the configuration of the input device of Embodiment 1.
FIG. 3 is a diagram showing an example of the processing flow at the time of registration in Embodiment 1.
FIG. 4A is a diagram showing an example of the processing flow at the time of authentication in Embodiment 1.
FIG. 4B is a diagram showing an example of the processing flow of matching within the authentication processing flow of Embodiment 1.
FIG. 5 is a diagram showing an example of a photographed image of Embodiment 1 and the images obtained as a result of feature extraction.
FIG. 6 is a schematic diagram showing an example of a configuration of Embodiment 2 in which a living body is photographed and authenticated with the built-in camera of a portable terminal.
FIG. 7 is a diagram showing an example of the processing flow at the time of registration in Embodiment 2.
FIG. 8 is a diagram showing an example of a photographed image of Embodiment 2 and the images obtained as a result of feature extraction.
FIG. 9A is a diagram showing an example of the processing flow at the time of authentication in Embodiment 2.
FIG. 9B is a diagram showing an example of the processing flow of matching within the authentication processing flow of Embodiment 2.
FIG. 10A is a diagram showing the transmitted-light configuration of the input device used for collecting data for optimizing the feature extraction processing in Embodiment 3.
FIG. 10B is a diagram showing an example of the configuration of the input device used at the time of registration and authentication in Embodiment 3.
FIG. 11 is a diagram showing the reflected-light configuration of the input device used for collecting data for optimizing the feature extraction processing in Embodiment 3.
FIG. 12 is a diagram showing an example of the processing flow of the optimization of biometric feature extraction in Embodiment 3.
FIG. 13 is a diagram showing an example of the processing flow of the optimization of the image converter A in Embodiment 3.
FIG. 14 is a diagram showing an example of image conversion by the image converter B in Embodiment 3.
Hereinafter, embodiments of the present invention will be described with reference to the accompanying drawings. Although the accompanying drawings show specific embodiments in accordance with the principles of the present invention, they are provided for understanding of the present invention and should never be used to interpret the present invention in a limiting manner. In each drawing, common components are denoted by the same reference numerals.
Embodiment 1 is an embodiment of a biometric authentication device comprising an image input unit that photographs a living body to acquire a biometric image, an authentication processing unit that processes the acquired biometric image to perform biometric authentication, and a storage unit that stores registration information on biometric features obtained from the biometric image, in which the authentication processing unit performs alignment by combining a plurality of biometric features having different spatial characteristics obtained by the processing, and performs biometric authentication using the aligned biometric features and the registration information. Here, a spatial characteristic is a characteristic of the spatial distribution of pigment density in an image of the living body, and the present embodiment is an embodiment of a biometric authentication device that performs alignment by combining a plurality of biometric features having different spatial characteristics and then executes matching against the registration information.
FIG. 1 is a diagram showing an example of the overall configuration of the biometric authentication system using blood vessels of a finger according to Embodiment 1. Needless to say, the configuration of the present embodiment need not be the system configuration shown in FIG. 1, and may instead be a device in which all or part of the configuration is mounted in one housing. The device may be a biometric authentication device including the authentication processing, or the authentication processing may be performed outside the device, in which case it may be a blood vessel image acquisition device or a blood vessel image extraction device specialized for acquiring blood vessel images. As described later, an embodiment as a terminal such as a smartphone is also possible. In this specification, all embodiments, including the biometric authentication system, may be collectively referred to as a biometric authentication device.
The biometric authentication system of Embodiment 1 includes an input device 2, an authentication processing unit 3, a storage unit 4, a display unit 5, an input unit 6, an audio output unit 7, and an image input unit 8.
The input device 2 includes a light source 9 installed in its housing and an imaging device 10 installed inside the housing, and inputs a biometric image to the authentication processing unit 3 via the image input unit 8. In other words, the image input unit 8 acquires the biometric image captured by the imaging device 10 of the input device 2 and inputs the acquired biometric image to the authentication processing unit 3. Therefore, in this specification, the input device 2 and the image input unit 8 may be collectively referred to as the image input unit.
The authentication processing unit 3 is a generic name for the processing units that execute processing related to biometric authentication. It includes, as its functional processing units, a determination unit that determines the distance between the living body (finger) and the system or the posture of the living body (finger) from the image, a state control unit that instructs the display unit or the like to correct the distance to the living body (finger) or the posture of the living body (finger), an unnecessary-information removal unit that removes unnecessary information (wrinkles, background, etc.) from the captured image, a feature extraction unit that extracts biometric feature information from the captured image, and a matching unit that collates the extracted biometric feature information with registration information stored in advance in the storage unit.
The light source 9 installed in the input device 2 is, for example, a light-emitting element such as an LED (Light Emitting Diode), and irradiates the finger 1 presented above the input device 2 with light. The imaging device 10 captures an image of the finger 1 presented to the input device 2. Note that not only one finger 1 but a plurality of fingers may be presented.
As shown in FIG. 1, the authentication processing unit 3 includes, as its hardware configuration, a central processing unit (CPU) 11, a memory 12, and various interfaces (IF) 13.
The interface 13 connects the authentication processing unit 3 to external devices. Specifically, the interface 13 connects the input device 2, the storage unit 4, the display unit 5, the input unit 6, the audio output unit 7, the image input unit 8, and the like to the CPU 11, the memory 12, and the like.
The storage unit 4 stores users' registration data in advance. The registration data is information for matching users, and is, for example, an image of a finger vein pattern. Usually, the image of a finger vein pattern is an image in which the blood vessels (finger veins) distributed mainly under the skin on the palm side of the finger are captured as a dark shadow pattern. The authentication processing unit 3 has extraction and matching functions for extracting, from this finger vein pattern image, a plurality of biometric features having different spatial characteristics, i.e., characteristics of the spatial distribution of pigment density in the finger image, and collating them with the registration information.
The display unit 5 is, for example, a liquid crystal display (LCD), and is an output device that displays various information received from the authentication processing unit 3. The input unit 6 is, for example, a keyboard, and transmits information input by the user to the authentication processing unit 3. The audio output unit 7 is an output device that transmits information received from the authentication processing unit 3 as an acoustic signal such as voice.
Here, the display unit 5 and the audio output unit 7 are examples of an instruction unit for instructing the user of this biometric authentication system to correct the distance between the living body (finger) and the system or the posture of the living body (finger), and the present embodiment is not limited to this device configuration. In addition, the authentication processing unit described above may perform all the processing with one CPU, or may use a separate CPU for each functional processing unit.
FIG. 2 is a diagram for explaining an example of the specific structure of the input device 2 of the biometric authentication system of Embodiment 1. The input device 2 photographs biometric features such as blood vessels (finger veins) distributed on the surface of the finger or under the skin. The input device 2 is enclosed by a device housing 14, inside which one imaging device 10 is disposed. A plurality of infrared light sources 9 are arranged in a ring around the imaging device 10 and can uniformly illuminate the finger 1 through an opening. The infrared light sources 9 emit light of an infrared wavelength and can do so at an arbitrary intensity. As a specific example of the wavelength, 850 nm is selected for the infrared light sources 9. An acrylic plate 15 is fitted into the opening, which prevents dust and the like from entering the inside of the device and physically protects the members inside the device.
A polarizing plate A16 is inserted between the imaging device 10 and the acrylic plate 15, and a polarizing plate B17 between the light sources 9 and the acrylic plate 15. The polarizing plate A polarizes the P wave, and the polarizing plate B polarizes the S wave. Alternatively, a polarizing plate that polarizes the S wave may be used as the polarizing plate A, and one that polarizes the P wave as the polarizing plate B. By using the polarizing plates (a generic name for the polarizing plate A16 and the polarizing plate B17), it is possible to suppress the reception by the imaging device 10 of the light that was emitted by the infrared light sources 9 and reflected at the surface of the finger 1. As a result, patterns such as fingerprints and wrinkles of the epidermis, which are tissues outside the living body that change easily due to dry air, lifestyle habits, and the like, are not observed, and only the vein pattern, a tissue inside the living body that is relatively resistant to change, is observed.
The imaging device 10 of the present embodiment is a monochrome camera having light-receiving elements sensitive only to the wavelength band of infrared light. A monochrome camera or a color camera having light-receiving elements sensitive to the wavelength bands of both infrared and visible light may also be used as the imaging device 10. In that case, an optical filter that blocks visible light (for example, a band-pass filter or a low-pass filter) is inserted in front of or inside the camera lens so that only the wavelength band of infrared light is received by the light-receiving elements.
FIG. 3 shows an example of the processing flow at the time of registration in the biometric authentication system of the present embodiment, which performs authentication by efficiently combining a plurality of biometric features based on their spatial characteristics.
First, the user presents a finger to the input device 2 of the system, and the system photographs the finger with the camera of the imaging device 10 while emitting infrared light (S11). At this time, if luminance saturation is observed due to external light or the like, the exposure time or the amount of light emitted by the light source may be adjusted and set to values at which the luminance saturation disappears. Next, the position and posture of the fingers are detected from the video captured by the camera (S12). The finger posture information includes the positions of the fingertip and the base of the finger, and the image of each of the one or more fingers to be authenticated is cut out using this position information. Furthermore, after correcting the magnification of the cut-out finger image according to the position and posture of the finger, a plurality of biometric features having different spatial characteristics are extracted from the finger image (S13).
Next, it is determined whether the plurality of extracted biometric features satisfy a predetermined registration quality (S14). Whether the predetermined registration quality is satisfied may be judged, for example, by whether the density of the extracted vein pattern or the amount of change in the pattern falls within a predetermined range. If the extracted biometric features do not satisfy the predetermined quality, the finger is photographed again. If they do, the biometric features are stored as registration data, i.e., the registration information (S15).
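The quality check (S14) can be sketched using vein-pattern density, which the text gives as one possible criterion. The thresholds and the binary-image representation are illustrative placeholders, not values from the patent.

```python
def vein_density(binary_pattern):
    # fraction of pixels marked as vein in a binary vein pattern image
    pixels = [p for row in binary_pattern for p in row]
    return sum(pixels) / len(pixels)

def meets_registration_quality(binary_pattern, lo=0.05, hi=0.40):
    # require the density to fall within a predetermined range (placeholders)
    return lo <= vein_density(binary_pattern) <= hi

sparse = [[0, 0, 0, 0] for _ in range(4)]          # density 0.00: re-capture
normal = [[1, 0, 0, 0], [0, 1, 0, 0],
          [0, 0, 1, 0], [0, 0, 0, 1]]              # density 0.25: register
```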
FIG. 4A is an example of the processing flow at the time of authentication in the biometric authentication system of the present embodiment, which performs authentication by efficiently combining a plurality of biometric features with attention to their spatial characteristics.
The processing flow from photographing the finger (S11), through detecting the finger position and posture from the photographed image (S12), to extracting the biometric features from the finger image (S13) is the same as at registration. At authentication, after the biometric features are extracted, they are matched against the biometric features of the registration data (S16). In the matching, the degree of difference between the extracted biometric features and the biometric features of every piece of registration data is calculated. Matching is carried out for every combination of registration data and authentication data, and the authentication decision is made for the combination with the smallest resulting dissimilarity score (S17). That is, if the calculated dissimilarity score is below a preset threshold, the authentication succeeds (Yes) and the process ends. If the dissimilarity score is above the threshold, the authentication fails (No) and the process returns to photographing the finger.
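The decision logic just described (S16 to S17) can be sketched as follows: score the presented features against every piece of registration data, take the minimum dissimilarity, and accept only if it is below a preset threshold. The Hamming-style score function, the feature vectors, and the threshold value are illustrative assumptions.

```python
def score(probe, template):
    # fraction of differing elements between two equal-length feature vectors
    return sum(a != b for a, b in zip(probe, template)) / len(probe)

def authenticate(probe, enrolled, threshold=0.3):
    # dissimilarity against all registered IDs (S16), decide on the minimum (S17)
    scores = {uid: score(probe, tmpl) for uid, tmpl in enrolled.items()}
    best = min(scores, key=scores.get)
    return best if scores[best] < threshold else None  # None: return to capture

enrolled = {"user_a": [1, 0, 1, 1, 0], "user_b": [0, 0, 1, 0, 1]}
result = authenticate([1, 0, 1, 0, 0], enrolled)   # one element off user_a
```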
 図4Bは、本実施例のシステムの認証時の処理フローにおける照合(S16)の処理フローの一例を示す。指画像からの生体特徴の抽出処理では、静脈紋や関節紋、表皮のしわ、脂肪紋などの空間特性の異なる複数の生体特徴を抽出することが可能である。特に、本実施例のシステム構成で取得する撮影画像には、静脈紋および関節紋が比較的鮮明に映っている。静脈紋は指の長軸方向に流れる線パターンをもち、関節紋は指の短軸方向に流れる線パターンをもつ。そこで、ガボールフィルタなどの特定の方向の線パターンを抽出するフィルタを撮影画像に適用することにより、静脈紋や関節紋のパターンをそれぞれ分離して抽出することが可能である。 FIG. 4B shows an example of the process flow of verification (S16) in the process flow at the time of authentication of the system of this embodiment. In the extraction process of biological features from finger images, it is possible to extract a plurality of biological features having different spatial characteristics, such as vein patterns, joint patterns, wrinkles of the epidermis, and fatty patterns. In particular, the vein pattern and the joint pattern appear relatively clearly in the photographed image acquired by the system configuration of the present embodiment. The vein pattern has a line pattern flowing in the long axis direction of the finger, and the joint pattern has a line pattern flowing in the short axis direction of the finger. Therefore, by applying a filter for extracting a line pattern in a specific direction, such as a Gabor filter, to the captured image, it is possible to separate and extract the vein pattern and the joint pattern from each other.
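As a rough illustration of this direction-selective separation, the sketch below substitutes a 1-D second-difference ("valley") filter along each image axis for a full Gabor filter bank — a deliberate simplification. A dark line running across the filtering axis produces a strong negative response, so filtering along the vertical axis picks out long-axis (vein-like) lines and filtering along the horizontal axis picks out short-axis (joint-like) lines; the assumption that the finger lies horizontally in the image is mine.

```python
import numpy as np

def valley_response(img, axis):
    """[-1, 2, -1] second difference along `axis`; a dark one-pixel line
    running perpendicular to `axis` yields a strong negative response."""
    return 2 * img - np.roll(img, 1, axis=axis) - np.roll(img, -1, axis=axis)

# synthetic finger patch: background 1.0, one horizontal "vein" (row 4),
# one vertical "joint crease" (column 2)
img = np.ones((9, 9))
img[4, :] = 0.0
img[:, 2] = 0.0

vein_like = valley_response(img, axis=0)   # responds to horizontal (long-axis) lines
joint_like = valley_response(img, axis=1)  # responds to vertical (short-axis) lines
```

Thresholding each response map below a negative value then yields the two separated line patterns.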
 図4Bに示すように、照合(S16)では、生体特徴の抽出(S13)で得られた認証データの特徴と、装置内部もしくは外部に保存してある登録データの生体特徴との相違度を算出するが、相違度の算出は登録済みの全てのID(0≦i<N)に対して行う(S161)。一つのIDとの相違度の算出では、一つ以上の生体特徴を総当たりで相違度を算出する(S162)。相違度の算出では、第一の生体特徴により登録データと認証データの間の指の位置・姿勢情報の一部を正規化し(S163)、第二の生体特徴により指の位置・姿勢情報の全体を正規化し、第二の生体特徴により相違度を算出する(S164)。 As shown in FIG. 4B, in the collation (S16), the degree of difference between the features of the authentication data obtained in the biometric feature extraction (S13) and the biometric features of the registered data stored inside or outside the device is calculated; this calculation is performed for all registered IDs (0 ≤ i < N) (S161). In calculating the degree of difference with one ID, the degree of difference is computed exhaustively over all combinations of one or more biometric features (S162). In the calculation of the degree of difference, part of the finger position / posture information between the registered data and the authentication data is normalized using the first biometric feature (S163), the finger position / posture information is then fully normalized using the second biometric feature, and the degree of difference is calculated from the second biometric feature (S164).
 図5に模式的に示した撮影画像及び特徴抽出の結果得られる画像の一例のように、本実施例においては、取り扱う生体特徴を2次元画像であるとし、指画像18から抽出した関節紋19と静脈紋20を用いる。まず、登録データと認証データの、第一の生体特徴としての関節紋19の間の指の長軸方向の位置ずれ量を求める。次に、求めた指の長軸方向の位置ずれ量に基づき位置ずれを補正した上で、第二の生体特徴としての静脈紋20を用いて指の短軸方向の位置ずれ量を求める。最後に、求めた指の短軸方向の位置ずれ量に基づき位置ずれを補正した上で、第二の生体特徴である静脈紋20のデータ間の相違度を算出し、第一の相違度とする。 As exemplified by the photographed image and the feature-extraction results shown schematically in FIG. 5, in this embodiment the biometric features handled are two-dimensional images, and the joint print 19 and the vein pattern 20 extracted from the finger image 18 are used. First, the amount of positional deviation in the long axis direction of the finger between the joint prints 19, the first biometric feature, of the registered data and the authentication data is determined. Next, after the positional deviation is corrected based on the determined long-axis deviation amount, the amount of positional deviation in the short axis direction of the finger is determined using the vein pattern 20 as the second biometric feature. Finally, after the positional deviation is corrected based on the determined short-axis deviation amount, the degree of difference between the vein pattern 20 data, the second biometric feature, is calculated and taken as the first degree of difference.
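A minimal numeric sketch of this two-stage alignment: the joint print fixes the long-axis offset, the vein pattern then fixes the short-axis offset, and the score is computed on the aligned vein patterns. The circular `np.roll` shifts, the mean-absolute-difference score, and the search range are simplifying assumptions.

```python
import numpy as np

def best_shift(ref, probe, axis, max_shift=3):
    """Exhaustively shift `probe` along `axis` and return the offset that
    minimizes the mean absolute difference to `ref`."""
    best_s, best_d = 0, np.inf
    for s in range(-max_shift, max_shift + 1):
        d = np.abs(ref - np.roll(probe, s, axis=axis)).mean()
        if d < best_d:
            best_s, best_d = s, d
    return best_s

def two_stage_dissimilarity(joint_ref, vein_ref, joint_prb, vein_prb):
    dx = best_shift(joint_ref, joint_prb, axis=1)   # long-axis offset from the joint print
    vein_prb = np.roll(vein_prb, dx, axis=1)
    dy = best_shift(vein_ref, vein_prb, axis=0)     # short-axis offset from the vein pattern
    return np.abs(vein_ref - np.roll(vein_prb, dy, axis=0)).mean()

# joint print varies only along the long axis (axis=1); vein pattern is fully 2-D
joint_ref = np.tile(np.arange(10.0), (8, 1))
rng = np.random.default_rng(0)
vein_ref = rng.random((8, 10))
# probe displaced by (+1 short-axis, +2 long-axis) relative to the registration
joint_prb = np.roll(joint_ref, 2, axis=1)
vein_prb = np.roll(np.roll(vein_ref, 2, axis=1), 1, axis=0)
print(two_stage_dissimilarity(joint_ref, vein_ref, joint_prb, vein_prb))  # → 0.0
```

Because the displacement is fully recovered, the aligned vein patterns match exactly and the dissimilarity is zero.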
 更に、関節紋19と静脈紋20を入れ替え、第二の生体特徴としての静脈紋20で指の短軸方向の位置補正を行い、第一の生体特徴としての関節紋19で指の長軸方向の位置補正を行い、第一の生体特徴である関節紋19のデータ間の相違度を算出し、第二の相違度とする。 Furthermore, the joint print 19 and the vein pattern 20 are interchanged: the position of the finger in the short axis direction is corrected using the vein pattern 20 as the second biometric feature, the position in the long axis direction is corrected using the joint print 19 as the first biometric feature, and the degree of difference between the joint print 19 data, the first biometric feature, is calculated and taken as the second degree of difference.
 最後に、求めた静脈紋20および関節紋19のデータ間の第一の相違度と第二の相違度を合成し、合成相違度を求め、生体認証を実行する。つまり、本実施例の照合処理においては、関節紋や静脈紋などの複数の生体特徴それぞれの空間的な特性を考慮し、関節紋19では指の長軸方向のみを位置補正し、静脈紋20では指の短軸方向のみを位置補正して、合成相違度を求め、登録情報との間の照合を行う。 Finally, the first and second degrees of difference obtained between the vein pattern 20 and joint print 19 data are combined into a composite degree of difference, and biometric authentication is performed. In other words, the collation processing of this embodiment takes into account the spatial characteristics of each of the plural biometric features, such as the joint print and the vein pattern: position correction with the joint print 19 is applied only in the long axis direction of the finger, and position correction with the vein pattern 20 only in the short axis direction, after which the composite degree of difference is obtained and collation with the registered information is performed.
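The final decision combines the two single-feature dissimilarities into one composite score. The patent does not specify the combination rule, so the convex-weighted sum and the threshold values below are assumptions for illustration only.

```python
def composite_dissimilarity(d_first, d_second, weight=0.5):
    """Convex combination of the two single-feature dissimilarities."""
    return weight * d_first + (1.0 - weight) * d_second

def authenticate(d_first, d_second, threshold, weight=0.5):
    """Accept when the composite dissimilarity falls below the threshold."""
    return composite_dissimilarity(d_first, d_second, weight) < threshold

print(authenticate(0.10, 0.20, threshold=0.18))  # genuine-like scores → True
print(authenticate(0.60, 0.55, threshold=0.18))  # impostor-like scores → False
```

In practice the weight and threshold would be tuned on enrollment data to balance false acceptance against false rejection.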
 ここで、例えば、静脈紋20のみで指の短軸及び長軸方向の位置補正を実施し、位置補正結果に基づき静脈紋20の相違度を算出したと仮定する。静脈紋20は指の短軸と長軸いずれの方向にも流れるパターンをもつため、テンプレートマッチングにより登録データと認証データの位置ずれ量を正確に求めることができる。テンプレートマッチングでは2つの画像の位置をずらしながら反復的に相違度を算出し、相違度が最小となる位置を求める。本人照合時は、データ間の相違度が最小となる位置がデータ間で指の位置や姿勢がおおむね一致する位置であるといえる。 Here, suppose, for example, that the position correction in both the short axis and long axis directions of the finger is performed with the vein pattern 20 alone, and the degree of difference of the vein pattern 20 is calculated from the result. Since the vein pattern 20 contains lines running in both the short-axis and long-axis directions of the finger, the amount of positional deviation between the registered data and the authentication data can be determined accurately by template matching. In template matching, the degree of difference is calculated repeatedly while shifting the positions of the two images, and the position at which the degree of difference becomes minimum is determined. In genuine (same-person) collation, the position minimizing the degree of difference between the data can be regarded as the position at which the finger position and posture of the two data sets roughly coincide.
 しかしながら、他人照合時は、データ間の相違度が最小となる位置がデータ間で指の位置や姿勢がおおむね一致する位置であるとは限らず、最も他人受入しやすい位置で求めた相違度を最終的な相違度スコアとしてしまい、他人受入増加の要因となる。 However, in impostor (different-person) collation, the position minimizing the degree of difference between the data is not necessarily the position at which the finger positions and postures roughly coincide; the degree of difference obtained at the position most prone to accepting another person would then be adopted as the final dissimilarity score, which becomes a factor in increased false acceptance.
 また、例えば、関節紋19のみで指の短軸および長軸方向の位置補正を実施し、位置補正結果に基づき静脈紋の相違度を算出したとする。関節紋は指の短軸方向に流れるパターンをもつため、登録データと認証データで指の短軸方向に位置ずれがあったとしても、関節紋を用いて正確な位置ずれ量を求めるのは難しい。よって、関節紋のみでは位置ずれ量を補正しきれず、本人照合時の静脈紋20による相違度が大きくなり、本人拒否増加の要因となる。 Further, suppose, for example, that the position correction in the short axis and long axis directions of the finger is performed with the joint print 19 alone, and the degree of difference of the vein pattern is calculated from the result. Since the joint print consists of lines running in the short axis direction of the finger, it is difficult to determine an accurate deviation amount from the joint print when the registered data and the authentication data are displaced in the short axis direction. The deviation therefore cannot be fully corrected with the joint print alone, the degree of difference of the vein pattern 20 in genuine collation grows, and this becomes a factor in increased false rejection.
 それに対し、上述の本実施例の構成のように、複数の生体特徴の空間的な特性を考慮して位置合わせして登録情報と照合することにより、他人受入や本人拒否を抑制することが可能な生体認証システム、装置を提供することが可能となる。 In contrast, with the configuration of this embodiment described above, performing alignment that takes the spatial characteristics of the plural biometric features into account before collation with the registered information makes it possible to provide a biometric authentication system and apparatus capable of suppressing both false acceptance and false rejection.
 実施例2は、スマートフォンなどに標準搭載された可視光に感度をもつ撮像装置であるカメラで、環境光の下で認証を行う生体認証装置の一実施例である。実施例2のシステム構成は実施例1と同一であるが、図6に示すように、その入力装置の構造が異なる。 The second embodiment is a biometric authentication device that performs authentication under ambient light using a camera, an imaging device sensitive to visible light that comes as standard equipment on smartphones and the like. The system configuration of the second embodiment is the same as that of the first embodiment, but, as shown in FIG. 6, the structure of its input device differs.
 図6において、本実施例の入力装置2は、スマートフォン23の撮像装置としてカラーカメラ21を含み、可視光の波長帯に感度を持つ複数の受光素子を有する。カラーカメラ21は、例えば青(B)、緑(G)、赤(R)に感度を持つ三種類のCMOS又はCCD素子などの固体撮像素子を有し、これらが画像の画素ごとに格子状に配置されている。このような本実施例のカラーカメラ21は、受光感度のピーク波長が異なる三つの受光センサを複数持つ。各受光センサの受光感度は、例えば、青で480nm付近、緑で550nm付近、赤で620nm付近に受光感度のピークを持つセンサから構成されているため、それぞれの波長の感度より生体から放射された光の空間的な色分布、すなわち色素濃度の空間分布の特性である空間特性を取得できる。また、スマートフォン23は、光源として白色光源22を内蔵する。 In FIG. 6, the input device 2 of this embodiment includes a color camera 21 as the imaging device of a smartphone 23, and has a plurality of light receiving elements sensitive to the visible wavelength band. The color camera 21 has solid-state imaging elements, such as three types of CMOS or CCD elements sensitive to blue (B), green (G), and red (R), arranged in a grid for each pixel of the image. The color camera 21 of this embodiment thus has many sets of three light receiving sensors whose peak wavelengths of sensitivity differ. Since the sensors have sensitivity peaks at, for example, around 480 nm for blue, around 550 nm for green, and around 620 nm for red, the spatial color distribution of the light emitted from the living body — that is, the spatial characteristic given by the spatial distribution of pigment concentration — can be acquired from the sensitivities at the respective wavelengths. The smartphone 23 also incorporates a white light source 22 as its light source.
 図7は、実施例2における、生体特徴の空間的な特性に基づき、複数の生体特徴を効率良く組み合わせて認証を行う生体認証装置の登録時の処理フローである。実施例2の登録時の処理フローは実施例1と同一であるが、生体特徴の抽出(S13)が異なる。 FIG. 7 is a process flow at the time of registration of a biometric authentication apparatus that performs authentication by combining a plurality of biometric features efficiently based on the spatial characteristics of biometric features in the second embodiment. The processing flow at the time of registration of the second embodiment is the same as that of the first embodiment, but the extraction of the biological feature (S13) is different.
 本実施例において、指画像からの生体特徴の抽出(S13)では、静脈紋および関節紋、表皮のしわを特徴として抽出する。指には、静脈紋や関節紋、表皮のしわ、脂肪紋などの複数の生体組織が重畳して分布しているが、本実施例では、図8に示すように、取得した生体画像である指の撮影画像18から、静脈紋20および関節紋19、更に表皮のしわ24を生体特徴として抽出する。 In the present embodiment, in the extraction of the biological feature from the finger image (S13), vein pattern, joint pattern, and wrinkles of the epidermis are extracted as features. On the finger, a plurality of biological tissues such as vein pattern, joint pattern, wrinkles of the epidermis, fat pattern and the like are distributed in an overlapping manner, but in the present embodiment, as shown in FIG. From the photographed image 18 of the finger, a vein pattern 20 and a joint pattern 19 as well as the wrinkles 24 of the epidermis are extracted as biological features.
 静脈紋20の抽出について述べる。上述した撮像装置であるカラーカメラで指を撮影して得られたカラー画像において、静脈はその他の生体組織に比べ赤みが弱い。カラー画像がRGB画像であるとすると、Rの画素値が周辺の画素より小さい画素を強調することで静脈の色を強調することができる。また、静脈紋20は指の長軸方向に流れるパターンをもつ。よって、静脈の色を強調した画像に対し、指の長軸方向を強調するエッジ強調フィルタ、例えばアンシャープマスクやガボールフィルタなどを適用することで、静脈紋のパターンをその他の生体組織から分離して抽出することが可能である。 The extraction of the vein pattern 20 will be described. In a color image obtained by photographing a finger with a color camera, which is the above-described imaging device, veins are less reddish than other living tissues. Assuming that the color image is an RGB image, it is possible to emphasize the color of the vein by emphasizing the pixel whose R pixel value is smaller than the surrounding pixels. In addition, the vein pattern 20 has a pattern flowing in the long axis direction of the finger. Therefore, by applying an edge enhancement filter that emphasizes the long axis direction of the finger, such as an unsharp mask or a Gabor filter, to the image in which the vein color is enhanced, the vein pattern is separated from the other biological tissues. Can be extracted.
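A crude sketch of the "R value lower than its surroundings" emphasis just described: each pixel of the R band is compared with the mean of its k x k neighbourhood, and the positive excess of the neighbourhood mean over the pixel is kept. The neighbourhood size and the plain box average are assumptions, and the subsequent direction-selective edge enhancement step is omitted.

```python
import numpy as np

def vein_emphasis(r_band, k=3):
    """Emphasize pixels whose R value is below the local k x k mean
    (vein candidates are less red than their surroundings)."""
    pad = k // 2
    padded = np.pad(r_band.astype(float), pad, mode="edge")
    out = np.zeros(r_band.shape)
    for y in range(r_band.shape[0]):
        for x in range(r_band.shape[1]):
            local_mean = padded[y:y + k, x:x + k].mean()
            out[y, x] = max(local_mean - float(r_band[y, x]), 0.0)
    return out

r_band = np.full((5, 5), 100.0)
r_band[2, 2] = 40.0          # a dark (vein-like) pixel in the R band
emphasized = vein_emphasis(r_band)
```

A production implementation would replace the explicit loops with a separable box filter or `scipy.ndimage.uniform_filter`; the loop keeps the comparison explicit.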
 次に、関節紋19の抽出について述べる。RGB画像の各バンドの画像を確認すると、GやBの単バンド画像には、毛細血管と脂肪紋によるまだらパターンが多く映る。その一方、Rの単バンド画像には、上記のまだら模様のパターンは少なく、関節紋のパターンが比較的鮮明に映る。また、関節紋のパターンは指の短軸方向に流れている。よって、Rの単バンド画像に対し、指の短軸方向を強調するエッジ強調フィルタを適用することで、関節紋のパターンをその他の生体組織から分離して抽出することが可能である。 Next, the extraction of the joint print 19 will be described. Examining each band of the RGB image, the G and B single-band images show many mottled patterns caused by capillaries and fat prints. The R single-band image, on the other hand, shows little of this mottling, and the joint print pattern appears relatively clearly. The joint print pattern also runs in the short axis direction of the finger. Therefore, by applying an edge enhancement filter that emphasizes the short axis direction of the finger to the R single-band image, the joint print pattern can be separated and extracted from the other biological tissues.
 次に、表皮のしわ24の抽出について述べる。表皮のしわは、静脈紋20や関節紋19に比べると、乾燥や生活習慣などによりパターンの変動が大きく不安定ではあるが、深いしわパターンは比較的安定しており、認証に有用である。認証に利用する表皮のしわパターンの抽出には、まず、取得した指のカラー画像から毛細血管と脂肪紋によるまだらパターンの少ないRの単バンド画像を取り出し、上述の関節紋パターン抽出を行う。次に、抽出した関節紋パターンを用い、Rの単バンド画像から関節紋パターンを除去し、関節除去画像を得る。関節除去画像に映る生体組織は、表皮のしわが支配的である。ガウシアンフィルタやアベレージフィルタなどのノイズ除去フィルタを関節除去画像に適用し浅いしわを除去することで、しわの深い表皮のしわパターンのみを抽出することができる。 Next, the extraction of the epidermal wrinkles 24 will be described. Compared with the vein pattern 20 and the joint print 19, epidermal wrinkles vary greatly and are unstable due to dryness, lifestyle, and so on, but deep wrinkle patterns are relatively stable and useful for authentication. To extract the epidermal wrinkle pattern used for authentication, the R single-band image, which contains little mottling from capillaries and fat prints, is first taken from the acquired color image of the finger, and the joint print extraction described above is performed. Next, using the extracted joint print pattern, the joint print pattern is removed from the R single-band image to obtain a joint-removed image. Epidermal wrinkles dominate the biological tissue visible in the joint-removed image. By applying a noise removal filter such as a Gaussian filter or an averaging filter to the joint-removed image to eliminate shallow wrinkles, only the deep epidermal wrinkle pattern can be extracted.
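The wrinkle pipeline just described — remove the joint print from the R band, smooth away shallow wrinkles, keep the deep ones — might be sketched as below. Here the joint print is modelled as a pixel mask whose pixels are filled with the image mean, smoothing is a plain box average rather than a Gaussian, and the depth threshold is hand-picked; all three are illustrative assumptions.

```python
import numpy as np

def box_filter(img, k=3):
    """k x k averaging (box) filter with edge padding."""
    pad = k // 2
    padded = np.pad(img.astype(float), pad, mode="edge")
    out = np.zeros(img.shape)
    for y in range(img.shape[0]):
        for x in range(img.shape[1]):
            out[y, x] = padded[y:y + k, x:x + k].mean()
    return out

def deep_wrinkle_mask(r_band, joint_mask, k=3, thresh=0.7):
    """Crude joint-print removal followed by smoothing; only grooves that
    stay dark after smoothing (deep wrinkles) survive the threshold."""
    img = np.where(joint_mask, r_band.mean(), r_band.astype(float))
    return box_filter(img, k) < thresh

r_band = np.ones((7, 9))
r_band[2, :] = 0.0    # deep wrinkle: very dark groove
r_band[5, :] = 0.85   # shallow wrinkle: barely darker than the skin
mask = deep_wrinkle_mask(r_band, np.zeros((7, 9), dtype=bool))
```

The shallow groove is averaged back toward the skin tone and drops out, while the deep groove remains below the threshold after smoothing.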
 さらに本実施例では、図8に示すように関節紋19と表皮のしわ24を合成して得られる表皮紋25を新たな生体特徴とする。これは、実施例1で述べたように、関節紋単体では登録データと認証データの間の指の位置・姿勢情報の正規化が正確に行えないのに対し、生体特徴の空間的な特性に基づき、関節紋だけでなく表皮のしわも用いた新たな生体特徴とすることで、指の姿勢情報の正規化をより高精度に実施可能となる。 Further, in this embodiment, as shown in FIG. 8, the skin print 25 obtained by combining the joint print 19 and the epidermal wrinkles 24 is used as a new biometric feature. As described in the first embodiment, the joint print alone cannot accurately normalize the finger position / posture information between the registered data and the authentication data; by building, based on the spatial characteristics of the biometric features, a new biometric feature that uses the epidermal wrinkles as well as the joint print, the finger posture information can be normalized with higher accuracy.
 図9Aは、上述した生体特徴の空間的な特性に着目し、複数の生体特徴を効率良く組み合わせて認証を行う、実施例2の生体認証装置の認証時の処理フローである。実施例2における認証時の処理フローは実施例1と基本的に同一であるが、照合(S16)の処理内容が異なる。 FIG. 9A is a process flow at the time of authentication of the biometric apparatus according to the second embodiment, in which authentication is performed by efficiently combining a plurality of biometric features, focusing on the spatial characteristics of the biometric features described above. Although the process flow at the time of authentication in the second embodiment is basically the same as that of the first embodiment, the processing content of the collation (S16) is different.
 図9Bに、照合(S16)の処理フローの詳細を示す。生体特徴の抽出(S13)で得られた認証データの生体特徴と、装置内部もしくは外部に保存してある登録データの生体特徴の相違度を算出するが、相違度の算出は実施例1と同様、登録済みの全てのID(0≦i<N)に対して行う(S161)。一つのIDとの相違度の算出では、一つ以上の特徴を総当たりで相違度を算出する(S162)。本実施例の相違度の算出では、第一の生体特徴により登録データと認証データの間の指の位置・姿勢情報を正規化し(S165)、第二の生体特徴により両者間の相違度を算出する(S166)。 FIG. 9B shows the details of the process flow of the collation (S16). The degree of difference between the biometric features of the authentication data obtained in the biometric feature extraction (S13) and those of the registered data stored inside or outside the device is calculated; as in the first embodiment, the calculation is performed for all registered IDs (0 ≤ i < N) (S161). In calculating the degree of difference with one ID, the degree of difference is computed exhaustively over all combinations of one or more features (S162). In the calculation of the degree of difference in this embodiment, the finger position / posture information between the registered data and the authentication data is normalized using the first biometric feature (S165), and the degree of difference between the two is calculated using the second biometric feature (S166).
 ここで、図8に示すように実施例1同様、取り扱う生体特徴を2次元画像であるとし、指画像18から抽出した表皮紋25と静脈紋20を用いた実施例2の装置の認証処理部による照合の一例を述べる。まず、登録データと認証データの第一の生体特徴である表皮紋25の間の指の位置ずれ量を求める。次に、求めた指の位置ずれ量に基づき位置ずれを補正した上で、第二の生体特徴である静脈紋20のデータ間の相違度を算出し、第一の相違度とする。 Here, as in the first embodiment, the biometric features handled are two-dimensional images, and an example of collation by the authentication processing unit of the device of the second embodiment, using the skin print 25 and the vein pattern 20 extracted from the finger image 18 as shown in FIG. 8, is described. First, the amount of finger positional deviation between the skin prints 25, the first biometric feature, of the registered data and the authentication data is determined. Next, after the positional deviation is corrected based on the determined deviation amount, the degree of difference between the vein pattern 20 data, the second biometric feature, is calculated and taken as the first degree of difference.
 さらに、表皮紋25と静脈紋20を入れ替え、第二の生体特徴である静脈紋20で指の位置ずれ補正を行い、第一の生体特徴である表皮紋25で第二の相違度を算出する。最後に、求めた静脈紋20および表皮紋25のデータ間の第一、第二の相違度を合成し、最終的な合成相違度を求め、この相違度に基づき、前記生体認証を実行する。 Furthermore, the skin print 25 and the vein pattern 20 are interchanged: the finger positional deviation is corrected using the vein pattern 20 as the second biometric feature, and the second degree of difference is calculated from the skin print 25 as the first biometric feature. Finally, the first and second degrees of difference obtained between the vein pattern 20 and skin print 25 data are combined into a final composite degree of difference, and the biometric authentication is performed based on this degree of difference.
 以上の構成により、本実施例においても、生体特徴の空間的な特性を考慮して複数の生体特徴を効率よく組み合わせることによって照合することで、他人受入や本人拒否を抑制することができる生体認証装置を提供することが可能となる。 With the above configuration, in this embodiment as well, collating by efficiently combining plural biometric features in consideration of their spatial characteristics makes it possible to provide a biometric authentication device capable of suppressing false acceptance and false rejection.
 実施例3は、光源とカメラが手指に対し反対側に配置した生体認証装置の実施例である。本実施例のシステム構成は、実施例1、2と同一であるが、図10A及び図10Bに示すように、入力装置2の構造が異なる。 The third embodiment is an embodiment of a biometric authentication device in which the light source and the camera are arranged on opposite sides of the finger. The system configuration of this embodiment is the same as in the first and second embodiments, but, as shown in FIGS. 10A and 10B, the structure of the input device 2 differs.
 図10Aの入力装置2では、赤外光源9及び赤色光源26が手指に対し撮像装置10の反対側に配置する。図10Bの入力装置2では、赤外光源9が手指に対し撮像装置10の反対側に配置する。 In the input device 2 of FIG. 10A, the infrared light source 9 and the red light source 26 are disposed on the opposite side of the imaging device 10 with respect to the finger. In the input device 2 of FIG. 10B, the infrared light source 9 is disposed on the opposite side of the imaging device 10 with respect to the finger.
 実施例3の生体認証装置における生体情報の登録時の処理フロー及び認証時の処理フローは、基本的に実施例1と同一であるが、生体特徴の抽出(S13)の処理内容が異なる。まず、図10Aに示す入力装置2で収集した生体情報に基づき、図10Bに示す装置で実施する生体特徴の抽出(S13)方法の最適化を行う。更に、本実施例の構成において、実際の生体情報の登録及び認証は図10Bに示す入力装置2を用いて行う。 The processing flow at the time of registration of biometric information in the biometric authentication device of the third embodiment and the processing flow at the time of authentication are basically the same as in the first embodiment, but the processing content of extraction of the biometric feature (S13) is different. First, based on the biological information collected by the input device 2 shown in FIG. 10A, optimization of a method of extracting biological features (S13) performed by the device shown in FIG. 10B is performed. Furthermore, in the configuration of the present embodiment, registration and authentication of actual biometric information are performed using the input device 2 shown in FIG. 10B.
 図12に、本実施例の生体特徴の抽出(S13)の最適化の処理フローを示す。まず、画像読み込み(S131)では、図10Aに示す装置で収集した生体画像を読み込む。図10Aの装置では、赤外光源9、及び赤色光源26を交互に点灯してそれぞれ手指を撮影することで、生体組織の見え方の異なる二種類の指画像を取得することができる。具体的には、赤外光源9で手指を撮影して得られた画像(赤外画像)では、血管像が比較的鮮明に映っているが、赤色光源26で手指を撮影して得られた画像(赤色画像)では、血管像が比較的不鮮明に映っている。これらの赤外画像と赤色画像を大量に取得し、同一の試行で撮影した二枚の画像を一つのペアとする。 FIG. 12 shows the process flow for optimizing the biometric feature extraction (S13) of this embodiment. First, in the image reading (S131), the biometric images collected with the device shown in FIG. 10A are read. In the device of FIG. 10A, two types of finger images, in which the biological tissue appears differently, can be acquired by alternately lighting the infrared light source 9 and the red light source 26 and photographing the finger each time. Specifically, in an image obtained by photographing the finger with the infrared light source 9 (infrared image) the blood vessel image appears relatively clearly, whereas in an image obtained by photographing the finger with the red light source 26 (red image) the blood vessel image appears relatively unclear. A large number of these infrared and red images are acquired, and the two images captured in the same trial form one pair.
 次に、画像読み込み(S131)で読み込んだ赤外画像と赤色画像のペアを用い、画像変換器Aの最適化(S132)を行う。画像変換器Aの最適化(S132)では、深層学習などの機械学習による画像変換器に対し、赤色画像を入力とし、その赤色画像とペアの赤外画像を出力となるように画像変換器Aを最適化する。 Next, using the pairs of infrared and red images read in the image reading (S131), the optimization of image converter A (S132) is performed. In the optimization of image converter A (S132), an image converter based on machine learning such as deep learning takes the red image as input, and image converter A is optimized so that it outputs the infrared image paired with that red image.
 図13に画像変換器Aの最適化(S132)の具体的な処理フローを示す。画像変換器Aの最適化(S132)では、入力画像から所望の出力画像が得られるように、画像変換パラメータの更新(S1321)を繰り返す。ここでは、画像変換パラメータの更新(S1321)をN回繰り返すとして、s回目の画像変換パラメータの更新(S1321)での処理を説明する。 The specific processing flow of optimization (S132) of the image converter A is shown in FIG. In the optimization (S132) of the image converter A, updating (S1321) of image conversion parameters is repeated so that a desired output image can be obtained from the input image. Here, assuming that the image conversion parameter update (S1321) is repeated N times, processing in the s-th image conversion parameter update (S1321) will be described.
 まず、赤色画像を画像変換器Aに入力し、画像変換(S1322)を行う。次に、画像変換(S1322)により画像変換器Aから出力される出力画像と、入力した赤色画像とペアになっている正解の赤外画像の距離を算出する(S1323)。距離の尺度は、マンハッタン距離やユークリッド距離、敵対的学習で決められる距離などでもよい。次に、画像変換パラメータの更新量の算出を行う(S1324)。パラメータの更新量は、上記で求めた距離の尺度や画像変換の方法によって決められる。最後に画像変換パラメータを更新する(S1325)。 First, a red image is input to image converter A, and image conversion (S1322) is performed. Next, the distance between the output image produced by image converter A in the image conversion (S1322) and the ground-truth infrared image paired with the input red image is calculated (S1323). The distance measure may be the Manhattan distance, the Euclidean distance, a distance determined by adversarial learning, or the like. Next, the update amount of the image conversion parameters is calculated (S1324). The parameter update amount is determined by the distance measure obtained above and the image conversion method. Finally, the image conversion parameters are updated (S1325).
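The update cycle S1322-S1325 can be illustrated with a deliberately tiny stand-in: a per-pixel affine map out = a·x + b trained by gradient descent on the squared Euclidean distance to the paired target image. A real converter A would be a deep network; only the loop structure — convert, measure the distance, compute the update amount, update the parameters — is the point here, and the synthetic image pair is an assumption.

```python
import numpy as np

def train_converter(pairs, lr=0.1, epochs=500):
    """Repeat S1321: convert (S1322), measure the distance (S1323),
    derive the update amounts (S1324), update the parameters (S1325)."""
    a, b = 1.0, 0.0                           # initial conversion parameters
    for _ in range(epochs):
        for inp, target in pairs:
            out = a * inp + b                 # S1322 image conversion
            err = out - target                # residual of the squared Euclidean distance (S1323)
            grad_a = 2.0 * np.mean(err * inp) # S1324 update amounts
            grad_b = 2.0 * np.mean(err)
            a -= lr * grad_a                  # S1325 parameter update
            b -= lr * grad_b
    return a, b

# synthetic pair: the "infrared" target is an affine function of the "red" input
red = np.array([[0.0, 0.5], [1.0, 0.25]])
infrared = 2.0 * red + 1.0
a, b = train_converter([(red, infrared)])
```

With many real image pairs the same loop would be run over the full training set each epoch, as the text describes.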
 大量の画像ペアを用い、反復的に画像変換器Aのパラメータを更新することで、あらゆる手指画像に汎用的に適用可能な画像変換器Aを構築できる。その結果、この画像変換器Aは、手指の透過光画像を入力することで、その血管像を鮮明化した手指の透過光像を出力する。 By iteratively updating the parameters of the image converter A using a large number of image pairs, it is possible to construct an image converter A that can be universally applied to any hand image. As a result, the image converter A receives the transmitted light image of the finger, and outputs the transmitted light image of the finger in which the blood vessel image is sharpened.
 このように、認証処理部3は、まず指の赤色画像、及び赤外画像をペアとし、赤色画像を画像変換器Aに入力し、ペアの赤外画像が出力されるよう画像変換器Aの画像変換パラメータを更新して、画像変換器Aの最適化を図る。 In this way, the authentication processing unit 3 first pairs a red image and an infrared image of the finger, inputs the red image to image converter A, and updates the image conversion parameters of image converter A so that the paired infrared image is output, thereby optimizing image converter A.
 続いて、図12の画像変換器Aによる画像変換(S133)では、図10Aに示す装置で収集した赤外画像を画像変換器Aに入力して画像変換の出力として血管像が鮮明化された赤外画像を得る。 Subsequently, in the image conversion by image converter A (S133) in FIG. 12, the infrared images collected with the device shown in FIG. 10A are input to image converter A, and infrared images with sharpened blood vessel images are obtained as the output of the image conversion.
 続いて、赤外画像から静脈パターン画像を生成する画像変換器Bの最適化(S134)を行う。画像変換器Bの最適化(S134)では、深層学習などの機械学習による画像変換器に対し、図10Aに示す装置で収集した赤外画像を入力とし、画像変換器Aによる画像変換(S133)で得られた血管像が鮮明化された赤外画像から生成した静脈パターン画像が出力となるように画像変換器Bを最適化する。画像変換器Bの最適化(S134)の具体的な処理フローは、図13に示した画像変換器Aの最適化と同一でもよい。大量の画像ペアを用い、反復的に画像変換器Bのパラメータを更新することで、あらゆる手指画像に汎用的に適用可能な画像変換器Bを構築できる。この画像変換器Bは、赤外画像を入力することで、静脈を鮮明化した静脈パターン画像を出力する。 Next, the optimization (S134) of image converter B, which generates a vein pattern image from an infrared image, is performed. In the optimization of image converter B (S134), an image converter based on machine learning such as deep learning takes as input the infrared images collected with the device shown in FIG. 10A, and image converter B is optimized so that it outputs the vein pattern image generated from the infrared image whose blood vessel image was sharpened by the image conversion of image converter A (S133). The specific process flow of the optimization of image converter B (S134) may be the same as that of image converter A shown in FIG. 13. By iteratively updating the parameters of image converter B with a large number of image pairs, an image converter B generically applicable to any finger image can be constructed. Given an infrared image as input, this image converter B outputs a vein pattern image in which the veins are sharpened.
 上記の方法により最適化された生体特徴の抽出(S13)では、図14に示す画像変換器Bによる画像変換(S135)を行う。画像変換器Bによる画像変換(S135)では、図10Bに示す赤外光源9が照射した赤外光の手指の透過光を撮影(S11)して得られた画像に対し、指検出(S12)を実施して得られた赤外画像を画像変換器Bに入力することで鮮明な静脈パターン画像を得る。 In the biometric feature extraction (S13) optimized by the above method, the image conversion by image converter B (S135) shown in FIG. 14 is performed. In the image conversion by image converter B (S135), finger detection (S12) is applied to the image obtained by photographing (S11) the light of the infrared light source 9 shown in FIG. 10B transmitted through the finger, and the resulting infrared image is input to image converter B to obtain a clear vein pattern image.
 以上説明した実施例3では、手指の透過光を撮影して得られた画像を認証に利用する透過光方式について述べたが、あくまで重要なのは異なる中心波長をもつ二種類の光源を利用することにある。よって、撮像装置の構成は、実施例の図2と同様、図11に示すようなカメラと光源が手指に対して同じ側にある反射光方式でもよい。 The third embodiment described above uses the transmitted-light method, in which an image obtained by photographing light transmitted through the finger is used for authentication; what matters, however, is the use of two types of light sources with different center wavelengths. Therefore, the configuration of the imaging device may also be a reflected-light arrangement in which, as shown in FIG. 11, the camera and the light sources are on the same side of the finger, as in FIG. 2 of the first embodiment.
 なお、本発明は上記した実施例に限定されるものではなく、様々な変形例が含まれる。
例えば、上記した実施例は本発明のより良い理解のために詳細に説明したのであり、必ずしも説明の全ての構成を備えるものに限定されるものではない。また、ある実施例の構成の一部を他の実施例の構成に置き換えることが可能であり、また、ある実施例の構成に他の実施例の構成を加えることが可能である。また、各実施例の構成の一部について、他の構成の追加・削除・置換をすることが可能である。
The present invention is not limited to the embodiments described above, but includes various modifications.
For example, the embodiments described above have been described in detail for better understanding of the present invention, and are not necessarily limited to those having all the configurations of the description. In addition, part of the configuration of one embodiment can be replaced with the configuration of another embodiment, and the configuration of another embodiment can be added to the configuration of one embodiment. In addition, with respect to a part of the configuration of each embodiment, it is possible to add, delete, and replace other configurations.
 更に、上述した各構成、機能、認証処理部等は、それらの一部又は全部を実現するプログラムを作成する例を説明したが、それらの一部又は全部を例えば集積回路で設計する等によりハードウェアで実現しても良いことは言うまでもない。すなわち、処理部の全部または一部の機能は、プログラムに代え、例えば、ASIC(Application Specific Integrated Circuit)、FPGA(Field Programmable Gate Array)などの集積回路などにより実現してもよい。 Furthermore, while the above description gave examples in which the configurations, functions, authentication processing unit, and so on are realized by creating programs that implement some or all of them, it goes without saying that some or all of them may also be realized in hardware, for example by designing them as integrated circuits. That is, all or part of the functions of the processing unit may be realized, instead of by a program, by an integrated circuit such as an ASIC (Application Specific Integrated Circuit) or an FPGA (Field Programmable Gate Array).
1 指、2 入力装置、3 認証処理部、4 記憶部、5 表示部、6 入力部、7 音声出力部、8 画像入力部、9 赤外光源、10 撮像装置、11 CPU、12 メモリ、13 IF、14 装置筐体、15 アクリル板、16 偏光板A、17 偏光板B、18 撮影画像、19 関節紋抽出画像、20 静脈紋抽出画像、21 カラーカメラ、22 光源、23 スマートフォン、24 表皮のしわ、25 表皮紋、26 赤色光源 Reference Signs List: 1 finger, 2 input device, 3 authentication processing unit, 4 storage unit, 5 display unit, 6 input unit, 7 audio output unit, 8 image input unit, 9 infrared light source, 10 imaging device, 11 CPU, 12 memory, 13 IF, 14 device housing, 15 acrylic plate, 16 polarizing plate A, 17 polarizing plate B, 18 photographed image, 19 joint print extraction image, 20 vein print extraction image, 21 color camera, 22 light source, 23 smartphone, 24 epidermal wrinkles, 25 skin print, 26 red light source

Claims (15)

  1. A biometric authentication device comprising:
    an image input unit that captures a living body and acquires a biometric image;
    an authentication processing unit that processes the acquired biometric image to perform biometric authentication; and
    a storage unit that stores registration information on a plurality of biometric features obtained from the biometric image,
    wherein the authentication processing unit performs alignment by combining a plurality of the biometric features, obtained by the processing, that have different spatial characteristics, and performs the biometric authentication using the aligned biometric features and the registration information.
  2. The biometric authentication device according to claim 1, wherein
    the biometric image acquired by the image input unit is an image of a finger of the living body.
  3. The biometric authentication device according to claim 2, wherein
    the spatial characteristics are characteristics of the spatial distribution of pigment concentration in the image of the finger.
  4. The biometric authentication device according to claim 3, wherein
    the plurality of biometric features include a vein pattern, a joint pattern, epidermal wrinkles, or a fat pattern of the finger.
  5. The biometric authentication device according to claim 4, wherein
    the authentication processing unit separates the vein pattern or the joint pattern using an edge enhancement filter that emphasizes the long-axis direction or the short-axis direction of the finger.
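A direction-selective edge enhancement of this kind can be sketched as below. The axis convention (finger long axis vertical, so joint creases appear as horizontal lines and veins run mostly vertically) and the 3×3 line-enhancement kernels are illustrative assumptions, not the patent's actual filter:

```python
import numpy as np

# Assumed convention: the finger's long axis runs vertically in the image,
# so joint creases show up as horizontal lines and veins as vertical ones.
KERNEL_HORIZONTAL = np.array([[-1., -1., -1.],
                              [ 2.,  2.,  2.],
                              [-1., -1., -1.]])  # responds to horizontal lines (creases)
KERNEL_VERTICAL = KERNEL_HORIZONTAL.T            # responds to vertical lines (veins)

def _filter2d(img, kernel):
    """Minimal 'valid'-mode 2-D filtering in pure NumPy (kernels here are
    symmetric, so correlation and convolution coincide)."""
    kh, kw = kernel.shape
    h, w = img.shape
    out = np.zeros((h - kh + 1, w - kw + 1))
    for i in range(out.shape[0]):
        for j in range(out.shape[1]):
            out[i, j] = (img[i:i + kh, j:j + kw] * kernel).sum()
    return out

def separate_patterns(finger_image):
    """Return (crease-like, vein-like) responses of a grayscale finger image,
    keeping only positive (line-enhancing) filter responses."""
    crease = np.clip(_filter2d(finger_image, KERNEL_HORIZONTAL), 0, None)
    vein = np.clip(_filter2d(finger_image, KERNEL_VERTICAL), 0, None)
    return crease, vein
```

A synthetic horizontal stripe excites only the crease channel, which is the separation effect the claim relies on.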
  6. The biometric authentication device according to claim 2, wherein
    the authentication processing unit determines an amount of positional displacement of the finger in the long-axis direction using a first biometric feature of the biometric image, corrects the long-axis displacement based on the determined amount, then determines an amount of positional displacement of the finger in the short-axis direction using a second biometric feature of the biometric image, corrects the short-axis displacement based on the determined amount, and then calculates a first dissimilarity between the second biometric feature and that of the registration information.
  7. The biometric authentication device according to claim 6, wherein
    the authentication processing unit determines an amount of positional displacement of the finger in the short-axis direction using the second biometric feature of the biometric image, corrects the short-axis displacement based on the determined amount, then determines an amount of positional displacement of the finger in the long-axis direction using the first biometric feature of the biometric image, corrects the long-axis displacement based on the determined amount, and then calculates a second dissimilarity between the second biometric feature and that of the registration information.
  8. The biometric authentication device according to claim 7, wherein
    the authentication processing unit combines the first dissimilarity and the second dissimilarity into a combined dissimilarity, and performs the biometric authentication based on the combined dissimilarity.
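The two-pass alignment of claims 6 to 8 can be sketched as follows. The brute-force integer shift search, the mean-absolute-difference score, the axis convention (axis 1 taken as the finger's long axis), and the equal-weight blend of the two dissimilarities are all assumptions of this illustration:

```python
import numpy as np

def best_shift(moving, fixed, axis, max_shift=4):
    """Brute-force search for the integer shift along `axis` that minimizes
    the mean absolute difference between two feature images."""
    best, best_err = 0, np.inf
    for s in range(-max_shift, max_shift + 1):
        err = np.abs(np.roll(moving, s, axis=axis) - fixed).mean()
        if err < best_err:
            best, best_err = s, err
    return best

def combined_dissimilarity(feat1, feat2, reg1, reg2, w=0.5):
    """Two alignment orders: long axis first via feature 1, then short axis
    via feature 2, and the reverse; each pass scores feature 2 against its
    registered counterpart, and the two scores are blended."""
    # Pass 1: long-axis shift from feature 1, then short-axis shift from feature 2.
    s_long = best_shift(feat1, reg1, axis=1)
    f2 = np.roll(feat2, s_long, axis=1)
    s_short = best_shift(f2, reg2, axis=0)
    d1 = np.abs(np.roll(f2, s_short, axis=0) - reg2).mean()
    # Pass 2: short-axis shift from feature 2, then long-axis shift from feature 1.
    s_short2 = best_shift(feat2, reg2, axis=0)
    f1 = np.roll(feat1, s_short2, axis=0)
    s_long2 = best_shift(f1, reg1, axis=1)
    d2 = np.abs(np.roll(np.roll(feat2, s_short2, axis=0), s_long2, axis=1) - reg2).mean()
    return w * d1 + (1 - w) * d2
```

Running both orders and blending the scores makes the final decision less sensitive to whichever feature happens to localize the finger poorly in a given capture.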
  9. The biometric authentication device according to claim 8, wherein
    the first biometric feature is a joint pattern of the finger, and the second biometric feature is a vein pattern of the finger.
  10. The biometric authentication device according to claim 3, wherein
    the authentication processing unit extracts a skin pattern and a vein pattern of the finger as the plurality of biometric features, takes the skin pattern as a first biometric feature and uses it to determine an amount of positional displacement of the finger relative to the registration information, corrects the displacement based on that amount, and calculates a first dissimilarity between the vein pattern, as a second biometric feature, and the registration information; the authentication processing unit further determines and corrects the positional displacement of the finger using the vein pattern, calculates a second dissimilarity using the skin pattern, combines the first and second dissimilarities into a combined dissimilarity, and performs the biometric authentication based on the combined dissimilarity.
  11. The biometric authentication device according to claim 10, wherein
    the authentication processing unit extracts a joint pattern of the finger and epidermal wrinkles, and combines the joint pattern and the epidermal wrinkles to calculate the skin pattern.
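Claim 11 only states that the joint pattern and the epidermal wrinkles are combined into the skin pattern. One plausible composition, a pixel-wise maximum of the two responses, is purely an assumption of this sketch:

```python
import numpy as np

def compose_skin_pattern(joint_pattern, wrinkles):
    """Combine a joint-crease response map and an epidermal-wrinkle response
    map into one skin-pattern image by keeping the stronger response at each
    pixel. The pixel-wise maximum is an assumed composition, not necessarily
    the patent's."""
    return np.maximum(joint_pattern, wrinkles)
```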
  12. The biometric authentication device according to claim 2, wherein
    the image input unit includes a light source, and an imaging device that captures light from the light source that has been reflected by or transmitted through the finger.
  13. The biometric authentication device according to claim 12, wherein
    the light source includes a red light source and an infrared light source, and
    the authentication processing unit pairs a red image and an infrared image of the finger acquired by alternately lighting the red light source and the infrared light source, inputs the red image to an image converter A, and updates the image conversion parameters of the image converter A so that the paired infrared image is output, thereby optimizing the image converter A.
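The optimization in claim 13 is a paired image-to-image training loop: convert the red image, compare against its paired infrared image, update the conversion parameters. As a toy stand-in (the real "image converter A" would presumably be a learned model; the global affine pixel map, learning rate, and epoch count below are assumptions of this sketch), the update loop can look like:

```python
import numpy as np

def train_converter_a(red_images, ir_images, epochs=500, lr=0.1):
    """Fit a toy 'image converter A': a global affine map ir ≈ a*red + b,
    with (a, b) updated by gradient descent on the mean squared error
    between the converted red image and its paired infrared image."""
    a, b = 1.0, 0.0  # image conversion parameters
    for _ in range(epochs):
        for red, ir in zip(red_images, ir_images):
            err = (a * red + b) - ir             # pixel-wise residual
            a -= lr * (err * red).mean()         # d(MSE)/da (up to a constant factor)
            b -= lr * err.mean()                 # d(MSE)/db (up to a constant factor)
    return a, b
```

On synthetic pairs generated by a known affine relation, the loop recovers the parameters, which is the "update the conversion parameters so the paired infrared image is output" behavior the claim describes, in miniature.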
  14. The biometric authentication device according to claim 13, wherein
    the authentication processing unit inputs the infrared image to the image converter A to obtain an infrared image in which the blood vessel image is sharpened.
  15. The biometric authentication device according to claim 14, wherein
    the authentication processing unit optimizes an image converter B so that, when the infrared image is input, the image converter B outputs a vein pattern image generated from the sharpened-blood-vessel infrared image obtained by the image conversion of the image converter A.
PCT/JP2018/042458 2018-01-22 2018-11-16 Biometric authentication device WO2019142479A1 (en)

Applications Claiming Priority (2)

Application Number Priority Date Filing Date Title
JP2018-007907 2018-01-22
JP2018007907A JP7002348B2 (en) 2018-01-22 2018-01-22 Biometric device

Publications (1)

Publication Number Publication Date
WO2019142479A1 (en)

Family

ID=67301440

Family Applications (1)

Application Number Title Priority Date Filing Date
PCT/JP2018/042458 WO2019142479A1 (en) 2018-01-22 2018-11-16 Biometric authentication device

Country Status (2)

Country Link
JP (1) JP7002348B2 (en)
WO (1) WO2019142479A1 (en)

Families Citing this family (2)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
EP4131148A1 (en) * 2020-03-26 2023-02-08 NEC Corporation Authentication device, authentication method, and recording medium
WO2023218551A1 (en) * 2022-05-11 2023-11-16 富士通株式会社 Image processing device, image processing method, and image processing program

Citations (3)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
JP2008134862A (en) * 2006-11-29 2008-06-12 Hitachi Omron Terminal Solutions Corp Vein authentication device
JP2016096987A (en) * 2014-11-20 2016-05-30 株式会社日立製作所 Biometric authentication device
JP2017091186A (en) * 2015-11-10 2017-05-25 株式会社日立製作所 Authentication apparatus using biological information and authentication method


Also Published As

Publication number Publication date
JP7002348B2 (en) 2022-02-10
JP2019128630A (en) 2019-08-01

Similar Documents

Publication Publication Date Title
US11188734B2 (en) Systems and methods for performing fingerprint based user authentication using imagery captured using mobile devices
CN110326001B (en) System and method for performing fingerprint-based user authentication using images captured with a mobile device
US9361507B1 (en) Systems and methods for performing fingerprint based user authentication using imagery captured using mobile devices
US10599932B2 (en) Personal electronic device for performing multimodal imaging for non-contact identification of multiple biometric traits
KR102538405B1 (en) Biometric authentication system, biometric authentication method and program
US10922512B2 (en) Contactless fingerprint recognition method using smartphone
Fletcher et al. Development of mobile-based hand vein biometrics for global health patient identification
WO2019142479A1 (en) Biometric authentication device
KR20110119214A (en) Robust face recognizing method in disguise of face
Gomez-Barrero et al. Multi-spectral Short-Wave Infrared Sensors and Convolutional Neural Networks for Biometric Presentation Attack Detection
JP2018067206A (en) Imaging device
JP6759142B2 (en) Biometric device and method
KR20210050649A (en) Face verifying method of mobile device
KR20220052828A (en) Biometric authentication apparatus and biometric authentication method

Legal Events

Date Code Title Description
121 Ep: the epo has been informed by wipo that ep was designated in this application (Ref document number: 18901236; Country of ref document: EP; Kind code of ref document: A1)
NENP Non-entry into the national phase (Ref country code: DE)
122 Ep: pct application non-entry in european phase (Ref document number: 18901236; Country of ref document: EP; Kind code of ref document: A1)