WO2019142479A1 - Biometric authentication device - Google Patents

Biometric authentication device

Info

Publication number
WO2019142479A1
Authority
WO
WIPO (PCT)
Prior art keywords
image
finger
biometric
biometric authentication
authentication device
Prior art date
Application number
PCT/JP2018/042458
Other languages
English (en)
Japanese (ja)
Inventor
渓一郎 中崎
三浦 直人
友輔 松田
洋 野々村
長坂 晃朗
宮武 孝文
Original Assignee
株式会社日立産業制御ソリューションズ
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by 株式会社日立産業制御ソリューションズ filed Critical 株式会社日立産業制御ソリューションズ
Publication of WO2019142479A1

Classifications

    • G PHYSICS
    • G06 COMPUTING; CALCULATING OR COUNTING
    • G06T IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
    • G06T1/00 General purpose image data processing
    • G PHYSICS
    • G06 COMPUTING; CALCULATING OR COUNTING
    • G06T IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
    • G06T7/00 Image analysis

Definitions

  • the present invention relates to a biometric authentication apparatus that authenticates an individual using a living body.
  • Among biometric authentication technologies, finger vein authentication is known to realize highly accurate authentication.
  • Finger vein authentication uses blood vessel patterns inside the finger and achieves excellent authentication accuracy. Since finger vein authentication is harder to forge or tamper with than fingerprint authentication, high security can be realized.
  • In recent years, there are increasing cases of securing devices such as mobile phones, laptop PCs (Personal Computers), smartphones, tablet terminals, lockers, safes, and printers by installing a biometric authentication device on them. Biometric authentication has also come to be used in fields such as entry and exit management, attendance management, and login to computers. It is particularly important for biometric authentication devices used by the public to realize reliable personal authentication. Furthermore, in view of the recent spread of tablet-type portable terminals and the trend toward wearable computing, miniaturizing the device while securing the above convenience is also an important requirement.
  • Patent Document 1 discloses a biometric authentication technology that extracts a plurality of features from a narrow-area biometric image acquired by a small device and uses them for authentication.
  • Patent Document 2 discloses a biometric authentication technology that acquires a wide-area biometric image with a compact device and realizes authentication robust against posture change.
  • In Patent Document 1, a plurality of visible light sources having different wavelengths illuminate a finger, a vein pattern and a fat pattern are extracted as biometric features from an image of the reflected light, and high-accuracy authentication is performed by collating those features with one another in an efficient manner.
  • However, the characteristics of each feature are not considered.
  • In Patent Document 2, near-infrared and green light sources illuminate a finger, patterns such as the vein pattern, joint pattern, fat pattern, and finger outline are extracted from an image of the reflected light, and the extracted patterns are combined and matched. However, no study has been made of a specific method for combining those patterns.
  • The object of the present invention is to solve the above-mentioned problems and to provide a biometric authentication device that realizes stable and high-accuracy authentication by efficiently combining and collating features in consideration of the characteristics of a plurality of biometric features.
  • an image input unit for capturing a living body to acquire a living body image
  • an authentication processing unit for processing the acquired living body image to perform biometric authentication
  • a storage unit for storing registration information about the feature
  • The authentication processing unit combines a plurality of biometric features having different spatial characteristics obtained by the processing, performs alignment, and performs biometric authentication using the aligned biometric features and the registration information stored in the storage unit.
  • Thereby, an authentication device is provided that performs stable and high-accuracy authentication by efficiently combining biometric features in consideration of the characteristics of the plurality of biometric features.
  • FIG. 1 is a diagram illustrating an example of the entire configuration of a biometric authentication system according to the first embodiment.
  • FIG. 2 is a diagram illustrating an example of the configuration of an input device according to the first embodiment.
  • FIG. 3 is a diagram illustrating an example of the processing flow at the time of registration in the first embodiment.
  • FIG. 4A is a diagram showing an example of the processing flow at the time of authentication in the first embodiment.
  • FIG. 4B is a diagram showing an example of the processing flow of verification within the processing flow at the time of authentication in the first embodiment.
  • FIG. 5 is a view showing an example of a photographed image of Example 1 and an image obtained as a result of feature extraction.
  • FIG. 6 is a schematic view showing an example of a configuration for photographing and authenticating a living body with the built-in camera of a mobile terminal according to the second embodiment.
  • FIG. 7 is a diagram illustrating an example of the processing flow at the time of registration in the second embodiment.
  • FIG. 8 is a view showing an example of a photographed image of Example 2 and an image obtained as a result of feature extraction.
  • FIG. 9A is a diagram illustrating an example of the processing flow at the time of authentication in the second embodiment.
  • FIG. 9B is a diagram showing an example of the processing flow of verification within the processing flow at the time of authentication in the second embodiment.
  • FIG. 10A is a diagram showing a transmitted-light configuration of an input device used for data collection for optimization of the feature extraction processing in the third embodiment.
  • FIG. 11 is a diagram showing an example of the configuration of an input device used at the time of registration and authentication in the third embodiment.
  • FIG. 10B is a diagram showing a reflected-light configuration of an input device used for data collection for optimization of the feature extraction processing in the third embodiment.
  • FIG. 12 is a diagram illustrating an example of the processing flow of optimization of the extraction of biometric features in the third embodiment.
  • FIG. 13 is a diagram illustrating an example of the processing flow of optimization of the image converter A in the third embodiment.
  • FIG. 14 is a diagram illustrating an example of image conversion by the image converter B in Embodiment 3.
  • The first embodiment includes an image input unit that captures a living body and acquires a biometric image, an authentication processing unit that processes the acquired biometric image to perform biometric authentication, and a storage unit that stores registration information on biometric features obtained from the biometric image.
  • The authentication processing unit combines a plurality of biometric features with different spatial characteristics obtained by the processing, performs alignment, and performs biometric authentication using the aligned biometric features and the registration information.
  • Here, a spatial characteristic is a characteristic of the spatial distribution of the dye concentration in the image of the living body. The present embodiment is an example of a biometric authentication device that performs alignment by combining a plurality of biometric features having different spatial characteristics and then performs matching against the registration information.
  • FIG. 1 is a diagram illustrating an example of an entire configuration of a biometric authentication system using a blood vessel of a finger according to a first embodiment.
  • The configuration of the present embodiment is not limited to the system configuration shown in FIG. 1; the configuration may of course be a device in which all or part of the components are mounted in one housing.
  • The apparatus may be a biometric authentication apparatus that includes the authentication processing, or the authentication processing may be performed outside the apparatus, in which case the apparatus may be a blood vessel image acquisition apparatus or a blood vessel image extraction apparatus specialized for acquiring blood vessel images.
  • the embodiment may be a terminal such as a smartphone.
  • all the embodiments including the biometric authentication system may be collectively referred to as a biometric authentication device.
  • the biometric authentication system includes an input device 2, an authentication processing unit 3, a storage unit 4, a display unit 5, an input unit 6, an audio output unit 7, and an image input unit 8.
  • the input device 2 includes the light source 9 installed in the housing and the imaging device 10 installed inside the housing, and inputs a living body image to the authentication processing unit 3 through the image input unit 8.
  • the image input unit 8 acquires a living body image captured by the imaging device 10 of the input device 2, and inputs the acquired living body image to the authentication processing unit 3. Therefore, in the present specification, the input device 2 and the image input unit 8 may be collectively referred to as an image input unit.
  • The authentication processing unit 3 is a generic name for the processing units that execute processing related to biometric authentication. As functional processing units, it includes a determination unit that determines from the image the distance between the living body (finger) and the system or the posture of the living body (finger); a state control unit that instructs the display unit and other units to prompt correction of that distance or posture; an unnecessary-information removal unit that removes unnecessary information (wrinkles, background, etc.) from the captured image; a feature extraction unit that extracts biometric feature information from the captured image; and a matching unit that matches the extracted biometric feature information with registration information stored in advance in the storage unit.
  • the light source 9 disposed in the input device 2 is, for example, a light emitting element such as a light emitting diode (LED), and emits light to the finger 1 presented on the upper portion of the input device 2.
  • the imaging device 10 captures an image of the finger 1 presented to the input device 2.
  • Not just one but a plurality of fingers 1 may be presented.
  • the authentication processing unit 3 includes, as its hardware configuration, a central processing unit (CPU: Central Processing Unit) 11, a memory 12 and various interfaces (IF) 13.
  • the interface 13 connects the authentication processing unit 3 to an external device. Specifically, the interface 13 connects the input device 2, the storage unit 4, the display unit 5, the input unit 6, the audio output unit 7, the image input unit 8 and the like to the CPU 11, the memory 12 and the like.
  • the storage unit 4 stores in advance registration data of the user.
  • the registration data is information for matching the user, and is, for example, an image of a finger vein pattern.
  • the image of the finger vein pattern is an image obtained by imaging blood vessels (finger veins) mainly distributed subcutaneously on the palm side of the finger as a dark shadow pattern.
  • The authentication processing unit 3 has a function of extracting, from the image of the finger vein pattern, a plurality of biometric features having different spatial characteristics, where a spatial characteristic is a characteristic of the spatial distribution of the dye concentration in the image of the finger, and of collating them with the registration information.
  • the display unit 5 is, for example, a liquid crystal display (LCD), and is an output device that displays various information received from the authentication processing unit 3.
  • the input unit 6 is, for example, a keyboard, and transmits information input by the user to the authentication processing unit 3.
  • the voice output unit 7 is an output device that transmits information received from the authentication processing unit 3 as an acoustic signal such as voice.
  • the display unit 5 and the voice output unit 7 are instruction units for instructing the user who uses the biometric authentication system to correct the distance between the living body (finger) and the system and the posture of the living body (finger).
  • This embodiment is an example, and the present embodiment is not limited to this device configuration.
  • the authentication processing unit described above may perform all processing by one CPU, or may use a CPU for each function processing unit.
  • FIG. 2 is a diagram for explaining an example of a specific structure of the input device 2 of the biometric authentication system of the first embodiment.
  • the input device 2 captures biological features such as blood vessels (finger veins) distributed on the surface of the finger or under the skin.
  • the input device 2 is enclosed by a device housing 14, and one imaging device 10 is disposed in the inside thereof.
  • the plurality of infrared light sources 9 are annularly arranged around the imaging device 10, and can uniformly illuminate the finger 1 through the opening.
  • the infrared light source 9 emits light of an infrared wavelength.
  • the infrared light source 9 can emit light with an arbitrary intensity.
  • As a specific example, a wavelength of 850 nm is selected for the infrared light source 9.
  • An acrylic material 15 is inserted into the opening to prevent dust and the like from intruding into the inside of the apparatus, and has an effect of physically protecting members inside the apparatus.
  • a polarizing plate A16 and a polarizing plate B17 are inserted between the imaging device 10 and the acrylic material 15 and between the light source 9 and the acrylic material 15, respectively.
  • the polarizing plate A is a polarizing plate that polarizes the P wave
  • the polarizing plate B is a polarizing plate that polarizes the S wave.
  • a polarizing plate that polarizes the S wave may be used as the polarizing plate A
  • a polarizing plate that polarizes the P wave may be used as the polarizing plate B.
  • By using the polarizing plates (a generic term for the polarizing plate A16 and the polarizing plate B17), light emitted by the infrared light source 9 and reflected at the surface of the finger 1 can be prevented from reaching the imaging device 10.
  • the imaging device 10 of the present embodiment is a monochrome camera, and has a light receiving element having sensitivity in only the wavelength band of infrared light.
  • the imaging device 10 may use a monochrome camera or a color camera having a light receiving element having sensitivity to the wavelength band of infrared light and visible light.
  • In that case, an optical filter (for example, a band-pass filter or a low-pass filter) that blocks visible light is inserted in front of or inside the camera lens so that only the infrared wavelength band reaches the light receiving element.
  • FIG. 3 shows an example of a processing flow at the time of registration of the biometric authentication system of the present embodiment, in which authentication is performed by efficiently combining a plurality of biometric features based on spatial characteristics of biometric features.
  • the user presents a finger to the input device 2 of the system, and the system shoots the finger with the camera of the imaging device 10 while emitting infrared light (S11).
  • If luminance saturation occurs in the captured image, the exposure time or the irradiation light amount of the light source may be adjusted and set to a value at which the luminance saturation disappears.
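The saturation-driven adjustment described above can be sketched as a simple retry loop. Here `capture` is a hypothetical callback standing in for the camera driver, and the halving strategy is an illustrative assumption (the patent only says the setting is adjusted until saturation disappears):

```python
import numpy as np

def adjust_exposure(capture, exposure=8.0, max_tries=8, sat_level=255):
    """Halve the exposure time until the captured 8-bit frame contains
    no luminance-saturated pixels, then return that setting and frame.
    `capture(exposure)` is a hypothetical camera callback."""
    frame = capture(exposure)
    for _ in range(max_tries):
        if int(frame.max()) < sat_level:  # no saturated pixels: done
            break
        exposure /= 2.0                   # reduce exposure and retry
        frame = capture(exposure)
    return exposure, frame
```

The same loop could instead lower the light-source intensity; only the adjusted variable changes.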
  • the position and posture of the finger are detected from the image captured by the camera (S12).
  • the posture information of the finger includes the position of the fingertip or the root of the finger, and the image of each finger is cut out using the positional information of one or more fingers to be authenticated.
  • a plurality of biological features having different spatial characteristics are extracted from the finger image (S13).
  • Whether the predetermined registration quality is satisfied may be judged, for example, by whether the density of the extracted vein pattern or the amount of change in the pattern falls within a predetermined range. If the extracted biometric feature does not satisfy the predetermined quality, the finger is photographed again. If it does, the biometric feature is stored as registration data, i.e., the registration information (S15).
  • FIG. 4A is an example of a process flow at the time of authentication of the biometric authentication system of the present embodiment, in which authentication is performed by efficiently combining a plurality of biometric features, focusing on spatial characteristics of biometric features.
  • the processing flow from the photographing of the finger (S11), the detection of the finger position / posture from the photographed image (S12), and the extraction of the biometric feature from the finger image (S13) is the same as at the time of registration.
  • After extraction of the biometric feature, the biometric feature is matched against the registration data (S16).
  • Matching is carried out for every combination of registration data and authentication data, and the authentication decision is made on the combination with the smallest dissimilarity score obtained (S17). That is, if the calculated dissimilarity score is below a preset threshold, authentication succeeds (Yes) and the process ends; if the score exceeds the threshold, authentication fails (No) and the process returns to photographing of the finger.
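A minimal sketch of this decision step (S16/S17), assuming a dissimilarity function and a dictionary of registered features; all names here are illustrative, not from the patent:

```python
import numpy as np

def authenticate(auth_feature, registry, dissimilarity, threshold):
    """Match the authentication data against every registration entry
    and decide on the smallest dissimilarity score."""
    scores = {rid: dissimilarity(auth_feature, reg)
              for rid, reg in registry.items()}
    best_id = min(scores, key=scores.get)          # smallest score wins
    accepted = scores[best_id] <= threshold        # success if not above threshold
    return accepted, best_id, scores[best_id]
```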
  • FIG. 4B shows an example of the process flow of verification (S16) in the process flow at the time of authentication of the system of this embodiment.
  • The living body contains a plurality of biometric features with different spatial characteristics, such as vein patterns, joint patterns, wrinkles of the epidermis, and fat patterns.
  • the vein pattern and the joint pattern appear relatively clearly in the photographed image acquired by the system configuration of the present embodiment.
  • The vein pattern has a line pattern flowing in the long-axis direction of the finger, while the joint pattern has a line pattern flowing in the short-axis direction of the finger. Therefore, by applying a filter that extracts line patterns in a specific direction, such as a Gabor filter, to the captured image, the vein pattern and the joint pattern can be separated and extracted from each other.
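This direction-selective extraction can be sketched with hand-built Gabor kernels in NumPy. The parameter values are illustrative assumptions, and the finger's long axis is assumed horizontal in the image:

```python
import numpy as np

def gabor_kernel(ksize=21, sigma=4.0, theta=0.0, lambd=10.0, gamma=0.5):
    """Real-valued Gabor kernel; theta is the direction along which the
    cosine carrier varies (theta=0 responds to vertical lines,
    theta=pi/2 to horizontal lines)."""
    r = ksize // 2
    y, x = np.mgrid[-r:r + 1, -r:r + 1]
    xp = x * np.cos(theta) + y * np.sin(theta)
    yp = -x * np.sin(theta) + y * np.cos(theta)
    envelope = np.exp(-(xp**2 + (gamma * yp)**2) / (2 * sigma**2))
    return envelope * np.cos(2 * np.pi * xp / lambd)

def filter2d(img, kernel):
    """Plain correlation with edge padding (fine for small images)."""
    kh, kw = kernel.shape
    ph, pw = kh // 2, kw // 2
    pad = np.pad(img, ((ph, ph), (pw, pw)), mode='edge')
    out = np.zeros(img.shape)
    for i in range(kh):
        for j in range(kw):
            out += kernel[i, j] * pad[i:i + img.shape[0], j:j + img.shape[1]]
    return out

def separate_line_patterns(img):
    """Long-axis (vein-like) vs short-axis (joint-like) line responses."""
    vein_like = filter2d(img, gabor_kernel(theta=np.pi / 2))  # horizontal lines
    joint_like = filter2d(img, gabor_kernel(theta=0.0))       # vertical lines
    return vein_like, joint_like
```

An equivalent kernel is available as `cv2.getGaborKernel` in OpenCV; the pure-NumPy version is used here only to keep the sketch self-contained.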
  • In the verification, the dissimilarity between the feature of the authentication data obtained in the extraction of the biometric feature (S13) and the biometric feature of the registration data stored inside or outside the device is calculated.
  • The calculation of the dissimilarity is performed for all registered IDs (0 ≤ i ≤ N) (S161).
  • The dissimilarity is calculated using one or more biometric features (S162).
  • Part of the finger position/posture information between the registration data and the authentication data is normalized using the first biometric feature (S163); the remaining position/posture information is then normalized using the second biometric feature, and the dissimilarity is calculated with the second biometric feature (S164).
  • In the present embodiment, the biometric features handled are two-dimensional images, and the joint pattern 19 and the vein pattern 20 extracted from the finger image 18 are used.
  • First, using the joint pattern 19 as the first biometric feature, the amount of positional deviation between the registration data and the authentication data in the long-axis direction of the finger is determined.
  • The positional deviation is corrected based on that amount; then, using the vein pattern 20 as the second biometric feature, the amount of positional deviation in the short-axis direction of the finger is determined.
  • The positional deviation is corrected based on the amount in the short-axis direction, and the dissimilarity between the vein pattern 20 data, the second biometric feature, is then calculated and taken as the first dissimilarity.
  • Next, the roles of the joint pattern 19 and the vein pattern 20 are exchanged: position correction in the short-axis direction of the finger is performed with the vein pattern 20 as the second biometric feature, position correction in the long-axis direction is performed with the joint pattern 19 as the first biometric feature, and the dissimilarity between the joint pattern 19 data is calculated and taken as the second dissimilarity.
  • Finally, the first and second dissimilarities determined for the vein pattern 20 and joint pattern 19 data are synthesized, the combined dissimilarity is determined, and biometric authentication is executed. That is, in the matching processing of the present embodiment, considering the spatial characteristics of each of the plural biometric features such as the joint pattern and the vein pattern, position correction is performed only in the long-axis direction of the finger with the joint pattern 19 and only in the short-axis direction with the vein pattern 20 to obtain the combined dissimilarity, which is compared with the registration information.
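The two-path alignment and score synthesis can be sketched as follows. The exhaustive 1-D shift search stands in for template matching, and averaging the two dissimilarities is an assumption (the patent only says the scores are synthesized):

```python
import numpy as np

def best_shift_1d(ref, probe, axis, max_shift=8):
    """Exhaustive template-matching search: the integer shift of `probe`
    along `axis` minimizing the mean absolute difference to `ref`."""
    best_s, best_d = 0, np.inf
    for s in range(-max_shift, max_shift + 1):
        d = np.mean(np.abs(ref - np.roll(probe, s, axis=axis)))
        if d < best_d:
            best_s, best_d = s, d
    return best_s, best_d

def combined_dissimilarity(reg_joint, auth_joint, reg_vein, auth_vein):
    """Path 1: long-axis (axis=1) correction by the joint pattern, then
    short-axis (axis=0) correction and scoring by the vein pattern.
    Path 2: the roles exchanged. Scores averaged (assumed rule)."""
    # Path 1 -> first dissimilarity (vein score)
    sx, _ = best_shift_1d(reg_joint, auth_joint, axis=1)
    _, d1 = best_shift_1d(reg_vein, np.roll(auth_vein, sx, axis=1), axis=0)
    # Path 2 -> second dissimilarity (joint score)
    sy, _ = best_shift_1d(reg_vein, auth_vein, axis=0)
    _, d2 = best_shift_1d(reg_joint, np.roll(auth_joint, sy, axis=0), axis=1)
    return (d1 + d2) / 2.0, d1, d2
```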
  • In contrast, consider the case where position correction in both the short-axis and long-axis directions of the finger is performed with the vein pattern 20 alone, and the dissimilarity of the vein pattern 20 is calculated based on that correction result.
  • Because the vein pattern 20 flows predominantly in one direction of the finger, it is not guaranteed that the amount of positional deviation between the registration data and the authentication data can be determined accurately by template matching.
  • In template matching, the dissimilarity is calculated repeatedly while shifting the positions of the two images, and the position at which the dissimilarity is minimal is determined.
  • Ideally, the position at which the dissimilarity between the data is smallest is the position at which the position and posture of the finger substantially match between the data.
  • In practice, however, the position with the smallest dissimilarity is not necessarily the position where the finger position and posture roughly match; that minimum then becomes the final dissimilarity score, which is a factor in increasing false acceptance.
  • The second embodiment is an example of a biometric authentication device that performs authentication under ambient light using an imaging device with sensitivity to visible light, such as the camera mounted as standard on a smartphone or similar device.
  • the system configuration of the second embodiment is the same as that of the first embodiment, but as shown in FIG. 6, the structure of the input device is different.
  • The input device 2 of the present embodiment uses, as its imaging device, the color camera 21 of the smartphone 23, which has a plurality of light receiving elements with sensitivity in the visible wavelength band.
  • The color camera 21 has, for example, three types of solid-state imaging elements (CMOS or CCD) with sensitivity to blue (B), green (G), and red (R), arranged in a grid for each pixel of the image.
  • the color camera 21 of this embodiment has a plurality of three light receiving sensors having different peak wavelengths of light receiving sensitivity.
  • each light receiving sensor is, for example, a sensor having a peak of light receiving sensitivity in the vicinity of 480 nm in blue, in the vicinity of 550 nm in green, and in the vicinity of 620 nm in red.
  • Thereby, the spatial color distribution of light, i.e., the spatial characteristics that characterize the spatial distribution of dye concentration, can be obtained.
  • the smartphone 23 incorporates a white light source 22 as a light source.
  • FIG. 7 is a process flow at the time of registration of a biometric authentication apparatus that performs authentication by combining a plurality of biometric features efficiently based on the spatial characteristics of biometric features in the second embodiment.
  • the processing flow at the time of registration of the second embodiment is the same as that of the first embodiment, but the extraction of the biological feature (S13) is different.
  • vein pattern, joint pattern, and wrinkles of the epidermis are extracted as features.
  • A plurality of biological tissues, such as the vein pattern, joint pattern, wrinkles of the epidermis, and fat pattern, are distributed in an overlapping manner, but in the present embodiment, as shown in the figure, the vein pattern 20 and joint pattern 19 as well as the wrinkles 24 of the epidermis are extracted as biometric features from the photographed image 18 of the finger.
  • Veins appear less reddish than other living tissue.
  • Since the color image is an RGB image, this color difference can be used to obtain an image in which the vein color is enhanced.
  • The vein pattern 20 also has a pattern flowing in the long-axis direction of the finger. Therefore, by applying an edge-enhancement filter that emphasizes the long-axis direction of the finger, such as an unsharp mask or a Gabor filter, to the vein-color-enhanced image, the vein pattern can be separated and extracted from the other biological tissues.
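The color-based enhancement and the directional sharpening might be sketched as below. The R-deficiency measure and the box-blur unsharp mask are illustrative assumptions (the patent does not give formulas), with the finger's long axis assumed horizontal:

```python
import numpy as np

def vein_enhance(rgb):
    """Veins are less reddish than surrounding tissue, so mark pixels
    whose R channel falls below the overall brightness."""
    r = rgb[..., 0]
    brightness = rgb.mean(axis=-1)
    return np.clip(brightness - r, 0.0, None)  # high where R is deficient

def unsharp_long_axis(img, radius=1, amount=1.0):
    """Unsharp mask that blurs across the short axis (rows), which
    emphasizes lines running along the long axis (columns)."""
    k = 2 * radius + 1
    pad = np.pad(img, ((radius, radius), (0, 0)), mode='edge')
    blurred = np.stack([pad[i:i + img.shape[0], :] for i in range(k)]).mean(axis=0)
    return img + amount * (img - blurred)
```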
  • Although the wrinkles of the epidermis are largely unstable compared with the vein pattern 20 and the joint pattern 19, varying with dryness and lifestyle, deep wrinkle patterns are relatively stable and useful for authentication.
  • The joint pattern extraction described above is performed by taking out the single-band R image, which has few mottled patterns due to capillary blood vessels and fat patterns, from the acquired color image of the finger.
  • The joint pattern is then removed from the single-band R image to obtain a joint-removed image; the wrinkles of the epidermis are the predominant living tissue shown in this joint-removed image.
  • A noise-removal filter, such as a Gaussian filter or an averaging filter, is applied to suppress fine noise in the joint-removed image.
  • a skin pattern 25 obtained by combining the joint pattern 19 and the wrinkles 24 of the epidermis is used as a new biological feature.
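One way to read the composition of the skin pattern 25 is sketched below. The subtraction, the averaging-filter denoise, and the elementwise-max combination are all assumptions, since the patent does not fix the exact operations:

```python
import numpy as np

def box_blur(img, radius=1):
    """Averaging filter used here as the noise-removal step."""
    k = 2 * radius + 1
    pad = np.pad(img, radius, mode='edge')
    out = np.zeros(img.shape)
    for dy in range(k):
        for dx in range(k):
            out += pad[dy:dy + img.shape[0], dx:dx + img.shape[1]]
    return out / (k * k)

def skin_pattern(r_band, joint_map):
    """Remove the joint response from the R band, denoise the remainder
    as the wrinkle map, and combine joint and wrinkle maps."""
    joint_removed = np.clip(r_band - joint_map, 0.0, None)
    wrinkle_map = box_blur(joint_removed)
    return np.maximum(joint_map, wrinkle_map)
```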
  • This is because, as described in the first embodiment, the joint pattern alone cannot accurately normalize the finger position/posture information between registration data and authentication data; based on the spatial characteristics of the biometric features, using not only the joint pattern but also the wrinkles of the epidermis as a new biometric feature makes it possible to normalize the finger posture information with higher accuracy.
  • FIG. 9A is a process flow at the time of authentication of the biometric apparatus according to the second embodiment, in which authentication is performed by efficiently combining a plurality of biometric features, focusing on the spatial characteristics of the biometric features described above.
  • Although the processing flow at the time of authentication in the second embodiment is basically the same as that of the first embodiment, the processing content of the verification (S16) is different.
  • FIG. 9B shows the details of the process flow of verification (S16).
  • The dissimilarity between the biometric feature of the authentication data obtained in the biometric feature extraction (S13) and the biometric feature of the registration data stored inside or outside the apparatus is calculated; the calculation of the dissimilarity is the same as in the first embodiment.
  • The calculation is performed for all registered IDs (0 ≤ i ≤ N) (S161).
  • The dissimilarity is calculated using one or more biometric features (S162).
  • The position/posture information of the finger between the registration data and the authentication data is normalized using the first biometric feature (S165), and the dissimilarity between the two is calculated using the second biometric feature (S166).
  • In the present embodiment, the biometric features handled are two-dimensional images. An example of verification by the authentication processing unit of the apparatus of the second embodiment uses the skin pattern 25 and the vein pattern 20 extracted from the finger image 18. First, the amount of positional deviation of the finger between the registration data and the authentication data is determined using the skin pattern 25, the first biometric feature. Next, after the positional deviation is corrected based on that amount, the dissimilarity between the vein pattern 20 data, the second biometric feature, is calculated and taken as the first dissimilarity.
  • Then the roles of the skin pattern 25 and the vein pattern 20 are exchanged: the positional deviation of the finger is corrected with the vein pattern 20 as the second biometric feature, and the second dissimilarity is calculated with the skin pattern 25 as the first biometric feature. Finally, the first and second dissimilarities determined for the vein pattern 20 and skin pattern 25 data are combined, the final combined dissimilarity is determined, and biometric authentication is performed based on it.
  • By combining a plurality of biometric features in matching so that the features complement one another efficiently, it becomes possible to provide a biometric authentication device that suppresses both false acceptance of others and false rejection of the genuine person.
  • The third embodiment is an embodiment of a biometric authentication device in which the light source is disposed on the opposite side of the finger from the camera.
  • the system configuration of this embodiment is the same as that of Embodiments 1 and 2, but the structure of the input device 2 is different as shown in FIGS. 10A and 10B.
  • In the configuration of FIG. 10A, the infrared light source 9 and the red light source 26 are disposed on the side of the finger opposite to the imaging device 10.
  • In the configuration of FIG. 10B, the infrared light source 9 is disposed on the side of the finger opposite to the imaging device 10.
  • The processing flow at the time of registration of biometric information and the processing flow at the time of authentication in the biometric authentication device of the third embodiment are basically the same as in the first embodiment, but the processing content of the biometric feature extraction (S13) differs.
  • FIG. 12 shows the processing flow of the biometric feature extraction (S13) of the present embodiment.
  • First, image reading (S131) is performed.
  • Two types of finger images, in which the living tissue appears differently, can be acquired by alternately lighting the infrared light source 9 and the red light source 26 and photographing the finger.
  • In an image obtained by photographing the finger with the infrared light source 9 (an infrared image), the blood vessel image appears relatively clearly, whereas in an image obtained by photographing the finger with the red light source 26 (a red image), the blood vessel image is relatively unclear. A large number of these infrared and red images are acquired, and the two images taken in the same trial are treated as one pair.
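A minimal sketch of this pairing step follows; the frame ordering (the infrared frame first, then the red frame of the same trial) is an assumption for illustration:

```python
def make_pairs(frames):
    """Group an alternating capture sequence [ir0, red0, ir1, red1, ...]
    into one (infrared, red) pair per trial; a trailing unpaired frame
    is dropped."""
    return [(frames[i], frames[i + 1]) for i in range(0, len(frames) - 1, 2)]
```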
  • Next, the image converter A is optimized (S132) using the pairs of infrared and red images obtained in the image reading (S131).
  • The image converter A, which is based on machine learning such as deep learning, is optimized so that, when a red image is input, it outputs the infrared image paired with that red image.
  • The specific processing flow of the optimization (S132) of the image converter A is shown in FIG.
  • The update of the image conversion parameters (S1321) is repeated so that the desired output image is obtained from the input image.
  • Assuming that the image conversion parameter update (S1321) is repeated N times, the processing in the s-th image conversion parameter update (S1321) is described below.
  • First, a red image is input to the image converter A, and the image conversion (S1322) is performed.
  • Next, the distance between the output image produced by the image conversion (S1322) and the correct infrared image paired with the input red image is calculated (S1323).
  • The distance measure may be the Manhattan distance, the Euclidean distance, or a distance determined by adversarial learning.
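The two closed-form distance measures can be written directly; a sketch assuming equal-shaped NumPy image arrays (the adversarially learned distance would require a trained discriminator network and is only indicated by a comment):

```python
import numpy as np

def manhattan_distance(a, b):
    # L1 distance: sum of absolute pixel differences.
    return float(np.sum(np.abs(a - b)))

def euclidean_distance(a, b):
    # L2 distance: square root of the sum of squared pixel differences.
    return float(np.sqrt(np.sum((a - b) ** 2)))

# A distance determined by adversarial learning would instead score the
# converted image with a trained discriminator (not shown here).
```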
  • Next, the update amount of the image conversion parameters is calculated (S1324).
  • The update amount of the parameters is determined by the distance measure obtained above and by the image conversion method.
  • Finally, the image conversion parameters are updated (S1325).
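Steps S1321–S1325 form a standard gradient-based training loop. As an illustrative stand-in for the deep-learning converter, the sketch below optimizes a deliberately tiny converter — a single gain and offset applied to every pixel — under the squared Euclidean distance. The toy model, learning rate, and update count are all assumptions for demonstration; the real image converter A would be a neural network.

```python
import numpy as np

def optimize_converter(pairs, n_updates=2000, lr=0.1):
    """Repeat the image conversion parameter update (S1321) over
    (red image, infrared image) pairs."""
    gain, offset = 1.0, 0.0  # image conversion parameters
    for s in range(n_updates):
        red, infrared = pairs[s % len(pairs)]
        output = gain * red + offset   # S1322: image conversion
        residual = output - infrared   # S1323: basis of the (squared L2) distance
        # S1324: update amounts from the gradient of the mean squared error
        d_gain = 2.0 * np.mean(residual * red)
        d_offset = 2.0 * np.mean(residual)
        # S1325: apply the updates
        gain -= lr * d_gain
        offset -= lr * d_offset
    return gain, offset
```

With pairs generated by, say, `infrared = 2.0 * red + 0.5`, the loop recovers a gain near 2 and an offset near 0.5, mirroring how the real converter learns the red-to-infrared mapping from many pairs.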
  • By iteratively updating the parameters of the image converter A using a large number of image pairs in this way, an image converter A that can be applied universally to any finger image can be constructed. As a result, the image converter A receives a transmitted-light image of the finger and outputs a transmitted-light image of the finger in which the blood vessel image is sharpened.
  • To summarize, the authentication processing unit 3 first forms pairs, each consisting of a red image and an infrared image of a finger, inputs the red image to the image converter A, and obtains the paired infrared image as output.
  • The image conversion parameters are updated accordingly to optimize the image converter A.
  • The infrared image collected by the apparatus shown in FIG. 10A is then input to the image converter A, and an infrared image in which the blood vessel image is sharpened is obtained as the output of the image conversion.
  • Next, the image converter B, which generates a vein pattern image from an infrared image, is optimized (S134).
  • Specifically, the image converter B, which is based on machine learning such as deep learning, is optimized so that, when an infrared image collected by the apparatus shown in FIG. 10A is input, it outputs a vein pattern image generated from the sharpened infrared image obtained by the image conversion by the image converter A (S133).
  • The specific processing flow of the optimization (S134) of the image converter B may be the same as that of the optimization of the image converter A shown in FIG.
  • The optimized image converter B receives an infrared image as input and outputs a vein pattern image in which the veins are sharpened.
  • At the time of authentication, the image conversion (S135) by the image converter B shown in FIG. 14 is performed.
  • That is, finger detection (S12) is performed on the image obtained by photographing (S11) the transmitted infrared light from the finger irradiated by the infrared light source 9 shown in FIG. 10B, and a clear vein pattern image is obtained by inputting the resulting infrared image into the image converter B.
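The authentication-time flow — photographing (S11), finger detection (S12), and conversion by the image converter B (S135) — can be sketched as a pipeline. Every stage body below is a placeholder heuristic of my own (bright-region cropping, inversion plus thresholding), since the patent does not specify the stage internals:

```python
import numpy as np

def detect_finger(image, threshold=0.1):
    """Placeholder finger detection (S12): crop to the bright region where
    transmitted light reaches the sensor (assumed heuristic)."""
    ys, xs = np.where(image > threshold)
    return image[ys.min():ys.max() + 1, xs.min():xs.max() + 1]

def converter_b(infrared):
    """Placeholder for the optimized image converter B: veins absorb
    infrared light, so invert intensities and threshold (an assumed
    stand-in for the learned converter)."""
    veins = 1.0 - infrared
    return (veins > veins.mean()).astype(float)

def authenticate_capture(raw_image):
    finger = detect_finger(raw_image)  # S12: finger detection on the S11 photo
    return converter_b(finger)         # S135: conversion by image converter B
```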
  • The configuration of the imaging apparatus may instead be a reflected-light type, in which the camera and the light source are on the same side with respect to the finger as shown in FIG. 11, as in FIG. 2 of the first embodiment.
  • The present invention is not limited to the embodiments described above and includes various modifications.
  • The embodiments described above have been described in detail for better understanding of the present invention, and the invention is not necessarily limited to those having all of the configurations described.
  • Part of the configuration of one embodiment can be replaced with the configuration of another embodiment, and the configuration of another embodiment can be added to the configuration of one embodiment.
  • Each of the configurations, functions, authentication processing units, etc. described above has been explained using an example in which a program realizes part or all of them, but it goes without saying that part or all of them may instead be realized in hardware, for example by designing them as an integrated circuit. That is, all or part of the functions of the processing unit may be realized by an integrated circuit such as an application specific integrated circuit (ASIC) or a field programmable gate array (FPGA) instead of a program.
  • ASIC: application specific integrated circuit
  • FPGA: field programmable gate array


Abstract

In order to achieve stable and highly accurate biometric authentication by efficiently combining and aligning biometric features while taking into account the particular characteristics of a plurality of biometric features, the invention relates to a biometric authentication device comprising: an image input unit that acquires a biometric image by photographing a living body; an authentication processing unit that processes the acquired biometric image and performs biometric authentication; and a storage unit that stores registration information concerning a plurality of biometric features obtained from the biometric image. The authentication processing unit performs biometric authentication by combining and aligning a plurality of different biometric features, obtained by the processing, that have different spatial characteristics, and by using the aligned biometric features and the registration information to calculate a degree of difference (S161–S164).
PCT/JP2018/042458 2018-01-22 2018-11-16 Dispositif d'authentification biométrique WO2019142479A1 (fr)

Applications Claiming Priority (2)

Application Number Priority Date Filing Date Title
JP2018007907A JP7002348B2 (ja) 2018-01-22 2018-01-22 生体認証装置
JP2018-007907 2018-01-22

Publications (1)

Publication Number Publication Date
WO2019142479A1 true WO2019142479A1 (fr) 2019-07-25

Family

ID=67301440

Family Applications (1)

Application Number Title Priority Date Filing Date
PCT/JP2018/042458 WO2019142479A1 (fr) 2018-01-22 2018-11-16 Dispositif d'authentification biométrique

Country Status (2)

Country Link
JP (1) JP7002348B2 (fr)
WO (1) WO2019142479A1 (fr)

Families Citing this family (2)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US20230252119A1 (en) * 2020-03-26 2023-08-10 Nec Corporation Authentication device, authentication method, and recording medium
WO2023218551A1 (fr) * 2022-05-11 2023-11-16 富士通株式会社 Dispositif de traitement des images, procédé de traitement des images et programme de traitement des images

Citations (3)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
JP2008134862A (ja) * 2006-11-29 2008-06-12 Hitachi Omron Terminal Solutions Corp 静脈認証装置
JP2016096987A (ja) * 2014-11-20 2016-05-30 株式会社日立製作所 生体認証装置
JP2017091186A (ja) * 2015-11-10 2017-05-25 株式会社日立製作所 生体情報を用いた認証装置及び認証方法


Also Published As

Publication number Publication date
JP2019128630A (ja) 2019-08-01
JP7002348B2 (ja) 2022-02-10


Legal Events

Date Code Title Description
121 Ep: the epo has been informed by wipo that ep was designated in this application
    Ref document number: 18901236; Country of ref document: EP; Kind code of ref document: A1
NENP Non-entry into the national phase
    Ref country code: DE
122 Ep: pct application non-entry in european phase
    Ref document number: 18901236; Country of ref document: EP; Kind code of ref document: A1