WO2013145280A1 - Biometric authentication device, biometric authentication method, and biometric authentication computer program - Google Patents
- Publication number
- WO2013145280A1 (PCT/JP2012/058650)
- Authority
- WO
- WIPO (PCT)
- Prior art keywords
- hand
- finger
- posture
- user
- image
- Prior art date
Classifications
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06V—IMAGE OR VIDEO RECOGNITION OR UNDERSTANDING
- G06V40/00—Recognition of biometric, human-related or animal-related patterns in image or video data
- G06V40/10—Human or animal bodies, e.g. vehicle occupants or pedestrians; Body parts, e.g. hands
- G06V40/107—Static hand or arm
- G06V40/11—Hand-related biometrics; Hand pose recognition
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06V—IMAGE OR VIDEO RECOGNITION OR UNDERSTANDING
- G06V40/00—Recognition of biometric, human-related or animal-related patterns in image or video data
- G06V40/10—Human or animal bodies, e.g. vehicle occupants or pedestrians; Body parts, e.g. hands
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06V—IMAGE OR VIDEO RECOGNITION OR UNDERSTANDING
- G06V40/00—Recognition of biometric, human-related or animal-related patterns in image or video data
- G06V40/10—Human or animal bodies, e.g. vehicle occupants or pedestrians; Body parts, e.g. hands
- G06V40/14—Vascular patterns
- G06V40/145—Sensors therefor
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06V—IMAGE OR VIDEO RECOGNITION OR UNDERSTANDING
- G06V40/00—Recognition of biometric, human-related or animal-related patterns in image or video data
- G06V40/10—Human or animal bodies, e.g. vehicle occupants or pedestrians; Body parts, e.g. hands
- G06V40/107—Static hand or arm
- G06V40/117—Biometrics derived from hands
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06V—IMAGE OR VIDEO RECOGNITION OR UNDERSTANDING
- G06V40/00—Recognition of biometric, human-related or animal-related patterns in image or video data
- G06V40/10—Human or animal bodies, e.g. vehicle occupants or pedestrians; Body parts, e.g. hands
- G06V40/14—Vascular patterns
Definitions
- the present invention relates to, for example, a biometric authentication apparatus, a biometric authentication method, and a biometric authentication computer program that use an image showing hand biometric information for biometric authentication.
- Biometric authentication technology has been developed to determine whether or not to authenticate an individual using biometric information such as fingerprints or hand vein patterns.
- Biometric authentication technology is widely used, from large-scale systems with many registered users, such as entrance/exit management systems, border control systems, and systems using national identification numbers, down to devices used by a specific individual, such as computers and mobile terminals.
- the biometric authentication device acquires a biometric image representing the vein pattern as an input biometric image. The biometric authentication device then collates the input biometric information, which is the user's vein pattern represented in the input biometric image, with the registered biometric information, which is the hand vein pattern represented in the registered user's biometric image. If the biometric authentication device determines, based on the result of the collation process, that the input biometric information and the registered biometric information match, it authenticates the user as a registered user having legitimate authority. The biometric authentication device permits the authenticated user to use the device in which the biometric authentication device is incorporated, or another device connected to the biometric authentication device.
- when the user's biometric information is collated with a registered user's biometric information, the posture of the body part containing the biometric information at the time of collation is preferably the same as the posture of that part at the time of registration. If the posture is the same at the time of collation and at the time of registration, the similarity between the shape of the biometric information captured in the biometric image obtained at registration and the shape captured in the biometric image obtained at collation is high, and as a result the genuine-acceptance rate improves.
- the posture of the part including the biological information at the time of registration may be different from the posture of the part at the time of verification.
- the shape of the biometric information captured in the biometric image obtained at the time of registration and the shape of the biometric information captured in the biometric image obtained at the time of collation do not necessarily coincide completely.
- Patent Document 1 discloses a handprint authentication method in which the rotation angle of the hand is detected from handprint image data, the rotational misalignment relative to the rotation angle of the hand detected in advance from a registered handprint image is corrected, and the corrected image is matched against the registered handprint image.
- Patent Document 2 discloses biometric authentication that authenticates by comparing at least a vein pattern image of a subject and a template based on an evaluation result of a relative position between the skin surface pattern image of the subject and a template of the subject. An apparatus is disclosed.
- Patent Literature 1 and 2 are based on the premise that the entire biometric information captured in one biometric image is rotated or translated uniformly with respect to the biometric information captured in the other biometric image.
- however, in addition to an overall positional shift, the biometric information on one biometric image may be locally distorted relative to the biometric information on the other biometric image.
- for example, when the biometric information used for authentication is that of a hand, part of the hand's biometric information may deform according to the posture of the fingers.
- An object of the present invention is to provide a biometric authentication device that can suppress degradation of authentication accuracy even when the hand biometric information is locally distorted.
- a biometric authentication device includes: a storage unit that stores first shape data representing the shape of the biometric information of a registered user's hand with the fingers of the hand in a first posture, a first index representing the first posture, second shape data representing the shape of the biometric information of the registered user's hand with the fingers of the hand in a second posture, and a second index representing the second posture; a biometric information acquisition unit that generates a biometric image representing the biometric information of a user's hand; a posture identification unit that calculates, from the biometric image, a third index representing the third posture of the user's fingers shown in the biometric image; a biometric information extraction unit that generates, based on the biometric image, third shape data representing the shape of the biometric information of the user's hand shown in the biometric image; and a correction unit that corrects the first or second shape data according to the ratio of the difference between the third index and the first or second index to the difference between the first index and the second index.
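The ratio-based correction recited above amounts to a linear interpolation between the two registered postures. The following is an illustrative sketch only, with hypothetical names; it is not the claimed implementation.

```python
def correct_position(p_open, p_closed, theta_open, theta_closed, theta_obs):
    """Interpolate a vein point between its registered open-finger and
    closed-finger positions, weighted by the ratio of the observed
    angle difference to the registered angle difference.
    All names here are illustrative, not from the specification."""
    if theta_open == theta_closed:
        return p_open  # degenerate case: both registered postures agree
    r = (theta_obs - theta_closed) / (theta_open - theta_closed)
    x = p_closed[0] + r * (p_open[0] - p_closed[0])
    y = p_closed[1] + r * (p_open[1] - p_closed[1])
    return (x, y)
```

For example, a vein point at (10, 0) when a finger is at 90 degrees and at (0, 0) when it is at 60 degrees would be placed at (5.0, 0.0) for an observed angle of 75 degrees.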
- the biometric authentication device disclosed in the present specification can suppress degradation of authentication accuracy even when the hand biometric information shown in the biometric image at the time of collation is locally distorted relative to the hand biometric information shown in the biometric image at the time of registration.
- FIG. 1 is a schematic configuration diagram of a biometric authentication device.
- FIG. 2 is a functional block diagram of a processing unit according to one embodiment.
- FIG. 3A is a diagram illustrating an example of a vein pattern of a hand on a biological image when a finger is open.
- FIG. 3B is a diagram illustrating an example of a hand vein pattern on a biological image when a finger is closed.
- FIG. 3C is a diagram in which the vein pattern shown in FIG. 3A and the vein pattern shown in FIG. 3B are superimposed.
- FIG. 4 is a diagram illustrating an example of the angle of each finger.
- FIG. 5 is a flowchart illustrating the registration process.
- FIG. 6 is a diagram illustrating an example of a vein pattern whose position is corrected from a vein pattern when the finger is open and a vein pattern when the finger is closed according to the angle of the finger at the time of collation.
- FIG. 7 is a diagram illustrating an operation flowchart of the biometric authentication process.
- FIG. 8 is a functional block diagram of a processing unit according to a modification.
- FIG. 9 is a schematic configuration diagram of an example of a computer system in which a biometric authentication device according to an embodiment or a modification thereof is mounted.
- This biometric authentication device uses a biometric image showing a hand vein pattern, which is an example of biometric information of a user's hand, for biometric authentication.
- In order to correct partial deformation of the vein pattern near the bases of the fingers caused by differences in how far the fingers are spread, this biometric authentication device acquires, at registration time, a biometric image representing the vein pattern of the hand with the fingers open and a biometric image representing the vein pattern of the hand with the fingers closed.
- the biometric authentication device registers, for each finger, the angle of the finger and the position of the vein near the base of the finger, together with data representing the shape of the vein pattern, for each of the two biometric images.
- at the time of collation, the biometric authentication device obtains the angle of each finger from the acquired biometric image and, according to the difference between that angle and the angle of each finger in the two registered biometric images, corrects the position of the veins near each finger in the registered data representing the shape of the vein pattern.
- the biometric authentication device collates data representing the shape of the vein pattern with the corrected vein position and data representing the shape of the vein pattern obtained from the biometric image acquired at the time of collation.
- the vein pattern on the hand may be either a vein pattern on the palm side or a vein pattern on the back side of the hand.
- the term “collation process” is used to indicate a process of calculating an index representing the degree of difference or similarity between the user's biometric information and the registered user's biometric information.
- the term "biometric authentication process" is used to indicate the entire authentication process, including not only the collation process but also the process of determining whether to authenticate the user using the index obtained by the collation process.
- FIG. 1 shows a schematic configuration diagram of a biometric authentication apparatus.
- the biometric authentication device 1 includes a display unit 2, an input unit 3, a biometric information acquisition unit 4, a storage unit 5, a storage medium access device 6, and a processing unit 7.
- the display unit 2, the input unit 3, and the biological information acquisition unit 4 may be provided separately from the housing in which the storage unit 5, the storage medium access device 6, and the processing unit 7 are accommodated.
- the biometric authentication device 1 executes the biometric authentication process by using the biometric image representing the vein pattern of the user's hand generated by the biometric information acquisition unit 4 to collate that vein pattern with a registered user's vein pattern.
- when, as a result of the biometric authentication process, the biometric authentication device 1 authenticates the user as one of the registered users, it permits the user to use the device in which the biometric authentication device 1 is incorporated.
- the biometric authentication device 1 transmits a signal indicating that the user has been authenticated to another device (not shown), thereby permitting the user to use the other device.
- the display unit 2 includes a display device such as a liquid crystal display or an organic electroluminescence display. The display unit 2 displays messages showing the result of the biometric authentication process executed by the processing unit 7, and various information related to applications.
- the input unit 3 has a user interface such as a keyboard, a mouse, or a touch pad. The user's identification information, such as a user name, or commands and data entered by the user via the input unit 3 are passed to the processing unit 7. However, the input unit 3 may be omitted when the user does not need to input information other than biometric information to the biometric authentication device 1.
- the biometric information acquisition unit 4 includes a vein sensor for generating a biometric image in which an image of a vein pattern of a user's hand is captured.
- the vein sensor includes, for example, a housing on which a user's hand can be placed, an infrared light emitting diode, an area sensor, and an imaging optical system arranged in the housing.
- the infrared light emitting diode illuminates the hand placed on the housing.
- the imaging optical system forms an image of the vein pattern of the hand illuminated by the infrared light emitting diode on the area sensor.
- the area sensor has solid-state image sensing elements that are arranged two-dimensionally and are sensitive to infrared light, and each element outputs a signal corresponding to the intensity of the infrared light it receives.
- a biological image in which an image of the vein pattern of the hand is captured is generated.
- the value of each pixel of the biological image is, for example, a value in the range of 0 to 255, and the pixel value increases as the luminance of the pixel increases.
- on the biometric image, regions where veins are shown and regions where the hand is not shown are dark, while regions of the hand where no veins are shown are bright.
- the biological image is generated such that the tip of the finger is positioned on the upper end side of the biological image and the wrist side is positioned on the lower end side of the biological image.
- the biometric information acquisition unit 4 passes the biometric image to the processing unit 7 every time a biometric image is generated.
- the storage unit 5 includes, for example, at least one of a semiconductor memory, a magnetic disk device, and an optical disk device.
- the storage unit 5 stores a program for executing biometric authentication processing. Further, for each registered user, the storage unit 5 stores data related to the vein pattern of the hand, which is registered biometric information of the registered user, together with the registered user identification information such as the user name and password of the registered user.
- the data related to the registered biometric information includes, for example, images representing the shape of the vein pattern obtained from biometric images of the hand captured with the fingers open and with the fingers closed, and the angle of each finger.
- the storage medium access device 6 is a device that accesses the storage medium 8 such as a magnetic disk, a semiconductor memory card, and an optical storage medium.
- the storage medium access device 6 reads, for example, a biometric authentication computer program executed on the processing unit 7 and stored in the storage medium 8 and passes it to the processing unit 7 or stores it in the storage unit 5.
- the processing unit 7 has one or a plurality of processors and their peripheral circuits.
- the processing unit 7 executes biometric authentication processing or registration processing using the biometric image acquired from the biometric information acquisition unit 4.
- FIG. 2 is a functional block diagram of the processing unit 7.
- the processing unit 7 includes a living body region extraction unit 11, an invariant region extraction unit 12, an angle calculation unit 13, a vein pattern extraction unit 14, a registration unit 15, a correction unit 16, a collation unit 17, and an authentication determination unit 18.
- Each of these units included in the processing unit 7 is a functional module implemented by a computer program executed on a processor included in the processing unit 7.
- these units included in the processing unit 7 may be implemented in the biometric authentication device 1 as firmware.
- the biometric region extraction unit 11, the invariant region extraction unit 12, the angle calculation unit 13, and the vein pattern extraction unit 14 are used in both the biometric authentication process and the registration process.
- the registration unit 15 is used in the registration process.
- the correction unit 16, the collation unit 17, and the authentication determination unit 18 are used in the biometric authentication process.
- the registration process generates data related to the registered biometric information from a biometric image showing the vein pattern of the registered user's hand, i.e., the registered biometric information, and stores that data in the storage unit 5 together with identification information such as the registered user's user name.
- the data related to the registered biometric information includes data obtained from a biometric image capturing the vein pattern with the fingers open and data obtained from a biometric image capturing the vein pattern with the fingers closed.
- since the processing performed on each biometric image is the same, the processing of one biometric image is described below except where a distinction is necessary. In the state where the fingers are open, the opening angle between at least one pair of adjacent fingers is wider than in the state where the fingers are closed.
- the living body region extraction unit 11 detects pixels whose luminance value is higher than a predetermined threshold as pixels in which part of the user's hand is captured. The living body region extraction unit 11 then performs labeling on the detected pixels to extract the living body region, which is the region of the biometric image in which part or all of the hand is shown.
- the predetermined threshold can be, for example, the minimum luminance value of the pixels of the biometric image plus an offset corresponding to the fluctuation width of the luminance values of background pixels in which nothing is captured.
- the biological area extraction unit 11 generates a binary image having the same size as the size of the biological image, for example, as information representing the biological area.
- the biological region extraction unit 11 sets the value of the pixel included in the biological region in the binary image to “1”, and sets the value of the background pixel to “0”. Then, the biological region extraction unit 11 passes information representing the biological region to the invariant region extraction unit 12, the angle calculation unit 13, and the vein pattern extraction unit 14.
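The thresholding step above can be sketched as follows. The offset value and the plain list-of-lists image format are assumptions, and the labeling (connected-component) step is omitted for brevity:

```python
def extract_body_region(img, offset=10):
    """Binarize a grayscale image (rows of 0-255 values) using a
    threshold of (global minimum luminance + offset), as the
    description suggests; above-threshold pixels become '1' (hand),
    others '0' (background). offset=10 is an assumed fluctuation
    width for the background luminance."""
    lo = min(min(row) for row in img)
    th = lo + offset
    return [[1 if v > th else 0 for v in row] for row in img]
```

A real implementation would follow this with connected-component labeling to keep only the hand region.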
- the invariant area extraction unit 12 extracts an invariant area from the living body area.
- the invariant area is an area in which the vein pattern included in the invariable area does not deform regardless of the posture of the finger.
- FIG. 3A is a diagram illustrating an example of a vein pattern of a hand on a biological image when a finger is open.
- FIG. 3B is a diagram illustrating an example of a hand vein pattern on a biological image when a finger is closed.
- FIG. 3C is a diagram in which the vein pattern shown in FIG. 3A and the vein pattern shown in FIG. 3B are superimposed. In the vicinity of the base of a finger, the position of the vein connected to the finger varies with the angle of the finger. Therefore, comparing the vein pattern 301 shown in the image 300 of FIG. 3A with the vein pattern 302 shown in the image 310 of FIG. 3B, the portions near the bases of the fingers differ. This is apparent, for example, with reference to FIG. 3C.
- the invariant region extraction unit 12 performs pattern matching while shifting the relative position of the biometric image corresponding to the fingers-closed state with respect to the biometric image corresponding to the fingers-open state, and obtains the displacement at which the two images best match.
- the invariant region extraction unit 12 may obtain a positional deviation amount by performing pattern matching between a part of each biological image including the center of gravity of the biological region. Then, the invariant region extraction unit 12 moves at least one of the two biological images so as to cancel out the displacement, and matches the vein patterns reflected in the two biological images as much as possible.
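As a toy illustration of the displacement search (not the actual pattern matching used by the device), an exhaustive integer-shift search over small binary images might look like:

```python
def best_shift(a, b, max_shift=2):
    """Exhaustively search integer shifts (dy, dx) of binary image b
    against binary image a, returning the shift with the largest
    count of overlapping '1' pixels. Illustrative only; real pattern
    matching would use a correlation measure over larger images."""
    h, w = len(a), len(a[0])
    best, best_dydx = -1, (0, 0)
    for dy in range(-max_shift, max_shift + 1):
        for dx in range(-max_shift, max_shift + 1):
            score = 0
            for y in range(h):
                for x in range(w):
                    yy, xx = y + dy, x + dx
                    if 0 <= yy < h and 0 <= xx < w:
                        score += a[y][x] * b[yy][xx]
            if score > best:
                best, best_dydx = score, (dy, dx)
    return best_dydx
```

One of the two images is then moved so as to cancel out the returned displacement.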
- the invariant region extraction unit 12 divides one biological image into a plurality of partial regions in a state where the vein patterns of the two biological images are aligned. Note that the size of each partial region is set to be smaller than the size of the portion where the position of the vein changes depending on the angle of the finger, for example. Then, the invariant region extraction unit 12 obtains a normalized cross-correlation value for each partial region including a part of the living body region and a corresponding partial region of the other image, and the normalized cross-correlation value is Partial areas that are equal to or greater than a predetermined threshold are extracted as invariant area candidates.
- the predetermined threshold is set to a lower limit value of a normalized cross-correlation value that can be considered that the images of the subject in each of the two partial areas match, for example, 0.8 to 0.9.
- the invariant region extraction unit 12 performs a labeling process on the invariant region candidates to connect the invariant region candidates adjacent to each other. Then, the invariant area extraction unit 12 sets the maximum set among the connected sets of invariant area candidates as the invariant area.
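The normalized cross-correlation test on partial regions can be sketched as below; the flat-list patch format and the 0.8 default threshold (the low end of the range given above) are assumptions:

```python
import math

def ncc(p, q):
    """Zero-mean normalized cross-correlation between two equal-size
    patches given as flat lists of pixel values."""
    mp = sum(p) / len(p)
    mq = sum(q) / len(q)
    num = sum((a - mp) * (b - mq) for a, b in zip(p, q))
    dp = math.sqrt(sum((a - mp) ** 2 for a in p))
    dq = math.sqrt(sum((b - mq) ** 2 for b in q))
    return num / (dp * dq) if dp and dq else 0.0

def is_invariant_candidate(p, q, threshold=0.8):
    # Patches whose NCC meets the threshold are kept as
    # invariant-region candidates, per the description above.
    return ncc(p, q) >= threshold
```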
- a plurality of hand reference model images representing the positional relationship between the hand outline and the invariant region may be prepared in advance, and the reference model images may be stored in the storage unit 5.
- the invariant region extraction unit 12 performs pattern matching between the biometric image and each of the plurality of reference model images, and selects the reference model image that best matches the biometric image. Then, with the contour of the hand in the selected reference model image aligned with the contour of the hand in the biometric image, the invariant region extraction unit 12 may use as the invariant region of the biometric image the region on the biometric image that overlaps the invariant region set in the reference model image.
- the invariant region extraction unit 12 sets a reference coordinate system that serves as a reference for obtaining the angle of the finger based on the invariant region. For example, as described above, when two biological images are aligned with each other, the center of gravity of the invariable region of each biological image is set as the origin of the reference coordinate system in the aligned state. Then, the invariant region extraction unit 12 sets the x-axis of the reference coordinate system so as to be parallel to the horizontal direction of the non-moving biological image of the two biological images, and with respect to the vertical direction of the biological image. Set the y-axis of the reference coordinate system to be parallel.
- when one of the two biometric images has been rotated by an angle θ for alignment, the x-axis and y-axis of the reference coordinates for that image are set to point in directions rotated counterclockwise by θ from the image's horizontal and vertical directions, respectively.
- the center of gravity of the invariant area of each biological image is set as the origin of the reference coordinate system.
- the invariant region extraction unit 12 generates, as information representing the invariant region, a binary image having the same size as the biometric image. The invariant region extraction unit 12 then sets the value of each pixel included in the invariant region in the binary image to "1", and the values of the other pixels to "0".
- the invariant region extraction unit 12 passes the information representing the invariant region of each biometric image, along with the coordinates of the origin and the directions of the coordinate axes of the reference coordinate system set for the biometric image, to the angle calculation unit 13, the vein pattern extraction unit 14, and the registration unit 15.
- the angle calculation unit 13 is an example of a posture specifying unit, and obtains the angle of each finger shown in a biological image representing a vein pattern with the finger closed as an example of a first index that represents the posture of the finger. In addition, the angle calculation unit 13 obtains the angle of each finger in the biological image representing the vein pattern with the finger open as an example of a second index representing the finger posture. For this purpose, the angle calculation unit 13 first obtains a center line along the longitudinal direction of each finger on the biological image. Therefore, the angle calculation unit 13 scans in the horizontal direction sequentially from the upper end of the biological image, and detects a contour pixel located at the contour of the finger adjacent to the background region in the biological region.
- the angle calculation unit 13 extracts those contour pixels as the contour of the finger, and from them obtains the center line of each finger.
- the angle calculation unit 13 obtains the coordinates of the middle point of the contour pixel at the left end of the finger and the contour pixel at the right end of the finger at two points having different heights in the y-axis direction of the reference coordinate system for each finger.
- the angle calculation unit 13 sets a straight line connecting the two midpoints as the center line of the finger.
- FIG. 4 is a diagram illustrating an example of the angle of each finger.
- the x-axis and the y-axis each represent a coordinate axis of the reference coordinate system.
- the angle calculation unit 13 calculates, as the angle of each finger, the angles θ1 to θ5 formed between the center lines 401 to 405 of the fingers and the x-axis, respectively.
- the angle calculation unit 13 passes the angle of each finger to the vein pattern extraction unit 14 and the registration unit 15 for each biological image.
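The midpoint and angle computation described above can be sketched as follows (illustrative only; contour detection itself is omitted):

```python
import math

def midpoint(left_x, right_x, y):
    # Midpoint of the left and right contour pixels at height y.
    return ((left_x + right_x) / 2.0, y)

def finger_angle(p_low, p_high):
    """Angle, in degrees, between the finger center line through two
    midpoints and the x-axis of the reference coordinate system."""
    dx = p_high[0] - p_low[0]
    dy = p_high[1] - p_low[1]
    return math.degrees(math.atan2(dy, dx))
```

A finger pointing straight along the y-axis thus yields an angle of 90 degrees.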
- the vein pattern extraction unit 14 is an example of a biometric information extraction unit, and extracts the vein pattern of the hand from the living body region on the biometric image. To do so, the vein pattern extraction unit 14 binarizes the living body region by classifying each pixel whose value is less than a predetermined threshold as a vein pixel, in which a vein is captured, and each pixel whose value is greater than or equal to the threshold as another pixel. The predetermined threshold is set, for example, to the average pixel value of the living body region. The vein pattern extraction unit 14 then applies thinning to the vein pixels of the binarized image to generate a vein pattern image in which each vein is represented by a line one pixel wide. This vein pattern image is an example of shape data representing the shape of the hand's biometric information.
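The mean-threshold binarization can be sketched as below; the flat-list region format is an assumption, and the subsequent thinning (skeletonization) step is omitted:

```python
def vein_mask(region):
    """region: flat list of grayscale values inside the living body
    region. Returns 1 for vein pixels (value below the region mean;
    veins absorb infrared light, so they appear dark) and 0 otherwise,
    matching the mean-threshold binarization described. The thinning
    step that follows is not shown in this sketch."""
    th = sum(region) / len(region)
    return [1 if v < th else 0 for v in region]
```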
- the vein pattern extraction unit 14 compares the vein pattern image with the invariant region, and identifies a pixel that is out of the invariant region and has a vein closer to the finger than the invariant region.
- the position of a part of the vein reflected in the specified pixel changes according to the posture of the finger. Therefore, for the sake of convenience, this identified pixel is hereinafter referred to as a variable vein pixel.
- for each variable vein pixel, the vein pattern extraction unit 14 obtains position-specifying information that is used, when correcting the vein pattern at the time of collation, to find the pair of variable vein pixels in which the same portion of a vein appears in the two biometric images. To do so, for each variable vein pixel, the vein pattern extraction unit 14 follows the vein shown in that pixel, detects the invariant boundary pixel at which the vein reaches the boundary of the invariant region, and obtains its coordinates in the reference coordinate system. The vein pattern extraction unit 14 also counts the number of pixels along the vein from the variable vein pixel to the invariant boundary pixel. It then associates with each variable vein pixel, as position-specifying information, the coordinates of the corresponding invariant boundary pixel and the pixel count from that boundary pixel.
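Assuming the walk along a vein is available as an ordered list of skeleton coordinates, the position-specifying information could be derived as in this sketch (hypothetical names; vein tracing itself is omitted):

```python
def position_key(path, invariant):
    """path: skeleton pixel coordinates walked from a variable vein
    pixel along the vein; invariant: set of coordinates inside the
    invariant region. Returns (boundary_pixel, steps_along_vein),
    the position-specifying information described above, or
    (None, None) if the vein never reaches the invariant region."""
    for steps, p in enumerate(path):
        if p in invariant:
            return p, steps
    return None, None
```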
- the vein pattern extraction unit 14 specifies the closest center line among the center lines of each finger for each of the variable vein pixels, and assigns the finger identification number corresponding to the closest center line to the variable vein pixel. Associate. This finger identification number is also an example of position specifying information.
- the vein pattern extraction unit 14 passes the vein pattern image, each variable vein pixel and its position specifying information to the registration unit 15.
- the registration unit 15 stores in the storage unit 5 the user's identification information entered via the input unit 3, together with the information representing the invariant region, the vein pattern image, each variable vein pixel with its position-specifying information, and the angle of each finger. As a result, the user is registered as a registered user who is permitted to use the device in which the biometric authentication device 1 is incorporated.
- FIG. 5 is an operation flowchart of a registration process executed by the processing unit 7.
- when the processing unit 7 receives from the biometric information acquisition unit 4 two biometric images representing the vein pattern of the user's hand, captured with the fingers open and with the fingers closed, the living body region extraction unit 11 extracts a living body region from each biometric image (step S101).
- the invariant region extraction unit 12 of the processing unit 7 extracts an invariant region that is a region in which the position of the vein does not change even if the angle of the finger is changed from the living body region for each biological image (step S102).
- the invariant area extraction unit 12 sets a reference coordinate system with respect to the invariant area for each biological image (step S103).
- the angle calculation unit 13 of the processing unit 7 calculates the angle of each finger according to the reference coordinate system for each biological image (step S104).
- the vein pattern extraction unit 14 of the processing unit 7 extracts the vein pattern from each biometric image and binarizes and thins it, thereby generating vein pattern images corresponding to the finger-open state and the finger-closed state (step S105). Furthermore, the vein pattern extraction unit 14 obtains position specifying information for each variable vein pixel located between a finger and the invariant region (step S106).
- the registration unit 15 of the processing unit 7 stores, in the storage unit 5, the identification information of the user acquired from the input unit 3 together with the information indicating the invariant region, the vein pattern image, each variable vein pixel and its position specifying information, and the angle of each finger, for both the finger-closed state and the finger-open state (step S107). Thereafter, the processing unit 7 ends the registration process.
- the biometric authentication process may be executed according to either the so-called 1:1 authentication method, in which the target registered user is known because the registered user's identification information is obtained via the input unit 3, or the so-called 1:N authentication method, in which the target registered user is unknown.
- when the biometric authentication process is executed, the processing of the living body region extraction unit 11, the invariant region extraction unit 12, the angle calculation unit 13, and the vein pattern extraction unit 14 is applied to the input biometric image generated at that time. The angle of each finger is thereby calculated and a vein pattern image is generated. Note that when the biometric authentication process is executed, the biometric authentication device 1 generates only one input biometric image. Therefore, the invariant region extraction unit 12 identifies the invariant region by pattern matching between the input biometric image and the reference model image.
- alternatively, the invariant region extraction unit 12 may identify the invariant region on the input biometric image by pattern matching between the input biometric image and the registered biometric image, in either the finger-closed or finger-open state, specified by the identification information of the user input via the input unit 3.
- for each finger, the correction unit 16 obtains, as position correction coefficients, the ratios of the differences between the finger angle obtained from the input biometric image and the finger angles in the finger-closed and finger-open states to the difference between the finger angle in the finger-closed state and the finger angle in the finger-open state. The correction unit 16 then corrects the finger-closed vein pattern image or the finger-open vein pattern image according to the position correction coefficients so as to cancel the difference in the shape of the vein pattern caused by the difference in finger angle. For this purpose, the correction unit 16 calculates, for each finger, a position correction coefficient rc for the finger-closed vein pattern image and a position correction coefficient ro for the finger-open vein pattern image according to the following equation:

rc = (φo-φ)/(φo-φc), ro = (φ-φc)/(φo-φc) (1)
- ⁇ is an angle with respect to the x axis of the reference coordinate system for the finger of interest in the input biological image.
- ⁇ c and ⁇ o are angles with respect to the x axis of the reference coordinate system for the finger of interest in the biological image with the finger closed and the biological image with the finger open, respectively.
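The coefficient calculation of equation (1) can be sketched as follows (a minimal illustration; the function name and the use of plain floats are assumptions, not part of the patent):

```python
def position_correction_coefficients(phi, phi_c, phi_o):
    """Equation (1): rc = (phi_o - phi) / (phi_o - phi_c),
    ro = (phi - phi_c) / (phi_o - phi_c).

    phi   -- angle of the finger of interest in the input biometric image
    phi_c -- angle of the same finger in the registered finger-closed image
    phi_o -- angle of the same finger in the registered finger-open image
    """
    span = phi_o - phi_c
    rc = (phi_o - phi) / span
    ro = (phi - phi_c) / span
    return rc, ro
```

Because rc + ro = 1, the two coefficients later act as internal-division weights in equation (2).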
- for each finger, the correction unit 16 takes, for each pair of variable vein pixels associated with that finger in which the same portion of a vein appears in both the finger-closed and finger-open vein pattern images, the internal dividing point between the pair as the pixel in which that portion appears after position correction. The internal dividing point dp(x, y) is calculated according to the following equation using the position correction coefficients rc and ro:

dp(x, y) = rc*vc(x, y) + ro*vo(x, y) (2)
- vc(x, y) represents the position of a variable vein pixel in the finger-closed vein pattern image, and vo(x, y) represents the position of the variable vein pixel, showing the same portion of the vein as vc(x, y), in the finger-open vein pattern image.
- the correction unit 16 specifies a pair of variable vein pixels in which the same part of the vein is shown according to the following procedure.
- first, the correction unit 16 detects pairs consisting of an invariant boundary pixel on the finger-closed vein pattern image and an invariant boundary pixel on the finger-open vein pattern image whose coordinates in the reference coordinate system are identical or substantially coincide. The same portion of a vein is presumed to appear in these two invariant boundary pixels. The correction unit 16 therefore takes two variable vein pixels, each connected along a vein to one of these invariant boundary pixels on its vein pattern image and at an equal along-vein distance from that invariant boundary pixel, as a pair of variable vein pixels in which the same portion of the vein appears.
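Assuming, purely for illustration, that each vein is stored as a list of pixel coordinates beginning at its invariant boundary pixel, pixels at equal along-vein distance then share a list index, and the pairing plus internal division can be sketched as:

```python
def correct_vein(closed_path, open_path, rc, ro):
    """Pair pixels at equal along-vein distance from the matched
    invariant boundary pixels (index 0 of each path) and apply
    equation (2): dp = rc * vc + ro * vo."""
    corrected = []
    for (xc, yc), (xo, yo) in zip(closed_path, open_path):
        corrected.append((rc * xc + ro * xo, rc * yc + ro * yo))
    return corrected
```

The boundary pixel itself (index 0) has the same coordinates in both images, so its corrected position is unchanged, as expected for the invariant region.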
- FIG. 6 is a diagram showing an example of a vein pattern whose position is corrected from the vein pattern when the finger is open and the vein pattern when the finger is closed according to the angle of the finger at the time of collation.
- the vein pattern 600 is the vein pattern near the index finger extracted from a biometric image taken with the fingers closed, and the vein pattern 601 is the vein pattern near the index finger extracted from a biometric image taken with the fingers open.
- the angle of the index finger with respect to the x axis of the reference coordinate system is 90 degrees when the fingers are closed and 120 degrees when the fingers are open.
- the position-corrected vein pattern 602 lies at the positions that internally divide the segments between the vein pattern 600 and the vein pattern 601 in the ratio 1:2.
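As a check on FIG. 6's numbers: with the registered angles of 90 and 120 degrees, equation (1) yields a 1:2 internal division if, for example, the index-finger angle at collation is 100 degrees (this input angle is not stated in the text; it is an assumed value chosen only to reproduce the ratio):

```python
# Registered index-finger angles from FIG. 6: 90 deg (closed), 120 deg (open).
# The collation-time angle of 100 deg is an assumed illustrative value.
phi_c, phi_o, phi = 90.0, 120.0, 100.0
rc = (phi_o - phi) / (phi_o - phi_c)   # weight of the closed-finger pattern 600
ro = (phi - phi_c) / (phi_o - phi_c)   # weight of the open-finger pattern 601
# dp = rc*vc + ro*vo lies one third of the way from pattern 600 to
# pattern 601, i.e. it divides the segment between them in the ratio 1:2.
print(round(rc, 6), round(ro, 6))  # 0.666667 0.333333
```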
- the correction unit 16 passes the corrected vein pattern image representing the vein pattern position-corrected based on the two vein pattern images of the registered user to the matching unit 17 together with the identification information of the registered user.
- when the 1:N authentication method is employed, the correction unit 16 performs the above processing for all registered users, generates a corrected vein pattern image for each registered user, and passes each corrected vein pattern image to the collation unit 17 together with the identification information of the corresponding registered user.
- the collation unit 17 collates the input vein pattern image generated from the input biometric image with the corrected vein pattern image, thereby calculating the similarity between the vein pattern represented in the input vein pattern image and the vein pattern represented in the corrected vein pattern image.
- the matching unit 17 may calculate the similarity by performing pattern matching between the input vein pattern image and the corrected vein pattern image.
- the matching unit 17 obtains a normalized cross-correlation value while changing the relative position of the input vein pattern image with respect to the corrected vein pattern image, and calculates the maximum value of the normalized cross-correlation value as a similarity value.
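As a concrete, simplified illustration of this similarity computation, the sketch below evaluates the normalized cross-correlation of two images, given as nested lists of floats, over a small range of relative shifts; the search range and the data layout are assumptions, not specified in the text:

```python
def ncc_similarity(input_img, corrected_img, max_shift=2):
    """Similarity as the maximum normalized cross-correlation of the
    input vein pattern image over small relative shifts."""
    h, w = len(corrected_img), len(corrected_img[0])

    def ncc(dy, dx):
        a, b = [], []
        for y in range(h):
            for x in range(w):
                sy, sx = y + dy, x + dx
                if 0 <= sy < h and 0 <= sx < w:
                    a.append(input_img[sy][sx])
                    b.append(corrected_img[y][x])
        n = len(a)
        if n == 0:
            return 0.0
        ma, mb = sum(a) / n, sum(b) / n
        num = sum((u - ma) * (v - mb) for u, v in zip(a, b))
        da = sum((u - ma) ** 2 for u in a)
        db = sum((v - mb) ** 2 for v in b)
        return num / (da * db) ** 0.5 if da > 0 and db > 0 else 0.0

    return max(ncc(dy, dx)
               for dy in range(-max_shift, max_shift + 1)
               for dx in range(-max_shift, max_shift + 1))
```

Taking the maximum over shifts makes the similarity tolerant of small translations of the hand between registration and collation.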
- the matching unit 17 may extract feature points such as vein branch points or end points from each of the input vein pattern image and the corrected vein pattern image, and calculate the degree of coincidence of the feature points as similarity.
- for example, the collation unit 17 scans the input vein pattern image and the corrected vein pattern image with a plurality of templates corresponding to vein branch points or end points. The collation unit 17 then detects, as feature points, the positions on those vein pattern images where the image matches any of the templates.
- the collation unit 17 obtains the number of feature points that match between the feature points detected from the input vein pattern image and the feature points detected from the corrected vein pattern image. Then, the matching unit 17 can calculate the similarity by dividing the number of coincidences by the number of feature points detected from the input vein pattern image.
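A minimal sketch of this feature-point similarity follows; the matching tolerance `tol` is an assumed parameter, since the text does not define when two feature points "match":

```python
def feature_similarity(input_pts, corrected_pts, tol=1.0):
    """Similarity as the number of matched feature points divided by the
    number of feature points detected in the input vein pattern image.
    A point matches when some corrected-image point lies within tol."""
    if not input_pts:
        return 0.0
    matched = 0
    for (x, y) in input_pts:
        if any(abs(x - u) <= tol and abs(y - v) <= tol
               for (u, v) in corrected_pts):
            matched += 1
    return matched / len(input_pts)
```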
- the collation unit 17 passes the similarity to the authentication determination unit 18 together with the identification information of the registered user.
- when the identification information of the user is not input, the collation unit 17 performs the above-described process for each registered user and obtains the similarity for each. The collation unit 17 then selects the registered user with the maximum similarity and passes that maximum similarity and the identification information of the corresponding registered user to the authentication determination unit 18.
- the authentication determination unit 18 determines whether or not to authenticate the user as a registered user by comparing the similarity with an authentication determination threshold. For example, when the similarity is equal to or higher than the authentication determination threshold, the authentication determination unit 18 determines that the vein pattern of the user shown in the input biometric image matches the vein pattern of the registered user, and authenticates the user as that registered user. When authenticating the user, the authentication determination unit 18 notifies the processing unit 7 of the authentication result, and the processing unit 7 permits the authenticated user to use the device on which the biometric authentication device 1 is mounted or the device to which the biometric authentication device 1 is connected.
- otherwise, the authentication determination unit 18 does not authenticate the user and notifies the processing unit 7 that authentication has failed. In this case, the processing unit 7 refuses to allow the unauthenticated user to use the device on which the biometric authentication device 1 is mounted or the device to which it is connected. The processing unit 7 may also cause the display unit 2 to display a message indicating that authentication has failed.
- the authentication determination threshold is preferably set to a value such that the authentication determination unit 18 succeeds in authentication only when the user is the registered user himself or herself, and fails when the user is someone other than the registered user.
- for example, the authentication determination threshold may be set to the minimum value of the similarity plus 0.7 times the difference between the maximum and minimum values of the similarity.
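The threshold choice above and the accept/reject rule can be sketched as follows (function names are illustrative assumptions; the 0.7 weighting comes from the text):

```python
def authentication_threshold(sim_min, sim_max):
    """One threshold suggested in the text: the minimum similarity plus
    0.7 times the difference between the maximum and minimum similarity."""
    return sim_min + 0.7 * (sim_max - sim_min)

def authenticate(similarity, threshold):
    """The user is authenticated when the similarity is at least the
    authentication determination threshold."""
    return similarity >= threshold
```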
- FIG. 7 is an operation flowchart of the biometric authentication process executed by the processing unit 7.
- when the processing unit 7 receives from the biometric information acquisition unit 4 an input biometric image representing the vein pattern of the user's hand, the living body region extraction unit 11 of the processing unit 7 extracts the living body region from the input biometric image (step S201).
- the invariant region extraction unit 12 of the processing unit 7 extracts, from the living body region of the input biometric image, the invariant region in which the positions of the veins do not change even when the finger angle changes (step S202).
- the invariant region extraction unit 12 sets a reference coordinate system with respect to the input biometric image with reference to the invariant region (step S203).
- the angle calculation unit 13 of the processing unit 7 calculates the angle of each finger according to the reference coordinate system for the input biological image (step S204).
- the angle calculation unit 13 notifies the angle of each finger to the correction unit 16 of the processing unit 7.
- the vein pattern extraction unit 14 of the processing unit 7 generates an input vein pattern image representing the shape of the vein pattern of the user's hand by extracting the vein pattern from the input biological image and binarizing and thinning it ( Step S205).
- the vein pattern extraction unit 14 passes the input vein pattern image to the matching unit 17 of the processing unit 7.
- the correction unit 16 calculates, as position correction coefficients, the internal division ratio of the finger angle obtained from the input biometric image with respect to the finger angles in the finger-closed and finger-open states (step S206). The correction unit 16 then generates the corrected vein pattern image by correcting the positions of the corresponding variable vein pixels in the finger-closed and finger-open vein pattern images according to the position correction coefficients (step S207), and passes the corrected vein pattern image to the matching unit 17.
- the collation unit 17 calculates the similarity between the vein pattern on the input vein pattern image and the vein pattern on the corrected vein pattern image (step S208), and passes the similarity together with the registered user's identification information to the authentication determination unit 18 of the processing unit 7. When the user's identification information has not been obtained, the collation unit 17 takes the maximum of the similarities obtained for the registered users and passes that maximum value, together with the identification information of the corresponding registered user, to the authentication determination unit 18.
- the authentication determination unit 18 determines whether or not the similarity is equal to or higher than an authentication determination threshold (step S209). If the similarity is equal to or greater than the threshold for authentication determination (step S209—Yes), the authentication determination unit 18 authenticates the user as a registered user (step S210). On the other hand, when the similarity is less than the threshold for authentication determination (step S209—No), the authentication determination unit 18 does not authenticate the user (step S211). After step S210 or S211, the processing unit 7 ends the biometric authentication process.
- as described above, at the time of registration this biometric authentication device obtains, from two biometric images photographed with different finger postures that affect the shape of the hand's biometric information, an index representing each finger posture.
- at the time of collation, the biometric authentication device obtains an index representing the finger posture from the acquired input biometric image. Based on the index at collation and the indices at registration, the device then deforms the data representing the shape of the registered biometric information so as to cancel the difference in the shape of the biometric information caused by the difference between the finger posture at registration and the finger posture at authentication, and performs the collation process using the deformed biometric information.
- FIG. 8 is a functional block diagram of the processing unit according to this modification.
- the processing unit 71 includes a living body region extracting unit 11, an invariant region extracting unit 12, an inter-finger distance calculating unit 19, a vein pattern extracting unit 14, a registering unit 15, a correcting unit 16, a matching unit 17, and an authentication. And a determination unit 18.
- the processing unit 71 shown in FIG. 8 is different from the processing unit 7 shown in FIG. 2 in that it includes an inter-finger distance calculation unit 19 instead of the angle calculation unit 13. Therefore, hereinafter, the inter-finger distance calculation unit 19 and related portions will be described.
- the inter-finger distance calculating unit 19 is another example of the posture specifying unit.
- the inter-finger distance calculation unit 19 calculates, for the biometric image representing the vein pattern with the fingers closed and the biometric image representing the vein pattern with the fingers open, the distance between the bases of two adjacent fingers as an index representing the finger posture. For example, the inter-finger distance calculation unit 19 sequentially traces the contour pixels representing the contour of each finger from the upper side toward the lower side of the biometric image, thereby finding the coordinate, in the y-axis direction of the reference coordinate system, of the point at which the contour pixels of adjacent fingers connect.
- for the finger of interest, the inter-finger distance calculation unit 19 takes, of the left and right connection points, the one closer to the upper end of the biometric image as the connection point corresponding to the base of that finger. For the thumb and the little finger, however, only one connection point is detected, so the inter-finger distance calculation unit 19 takes that connection point as the one corresponding to the base. For each finger, the inter-finger distance calculation unit 19 then takes the intersection of the finger's center line with the perpendicular dropped from the finger's connection point onto that center line as the base position of the finger, and calculates the distance between the base positions of adjacent fingers as the inter-finger distance.
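The base-position construction just described can be sketched geometrically as follows (a minimal sketch; the function names and the plain-tuple point representation are illustrative assumptions):

```python
import math

def foot_of_perpendicular(p, a, b):
    """Base position of a finger: the foot of the perpendicular dropped
    from the finger's connection point p onto the finger's center line,
    given here by two points a and b on that line."""
    ax, ay = a
    bx, by = b
    px, py = p
    dx, dy = bx - ax, by - ay
    t = ((px - ax) * dx + (py - ay) * dy) / (dx * dx + dy * dy)
    return (ax + t * dx, ay + t * dy)

def inter_finger_distance(base_a, base_b):
    """Inter-finger distance: Euclidean distance between the base
    positions of two adjacent fingers."""
    return math.hypot(base_a[0] - base_b[0], base_a[1] - base_b[1])
```

For a vertical center line through (0, 0) and (0, 10) and a connection point at (1, 5), the base position is (0, 5).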
- at the time of registration, a biometric image of the hand vein pattern with the fingers open and a biometric image of the hand vein pattern with the fingers closed are acquired, and the inter-finger distance is calculated for each biometric image. The inter-finger distance is stored in the storage unit together with the vein pattern image, the variable vein pixels, the position specifying information, and so on.
- at the time of biometric authentication, the inter-finger distance calculation unit 19 calculates the distance between the bases of adjacent fingers from the input biometric image. The correction unit 16 then calculates the position correction coefficients rc and ro by using the inter-finger distances instead of the finger angles in equation (1). In this case, however, the correction unit 16 uses the position correction coefficients calculated from the inter-finger distance of the two fingers of interest to correct the positions of the variable vein pixels located between the center lines of those two fingers.
- the shape data representing the shape of the biological information of the hand may include the coordinates of feature points such as vein branch points and end points.
- in this case, the vein pattern extraction unit may extract the feature points by template matching between the vein pattern image and templates representing branch points or end points, as described for the matching unit.
- the correction unit selects, for each first feature point outside the invariant region on the finger-closed vein pattern image, the closest second feature point, in the reference coordinate system, among the feature points outside the invariant region on the finger-open vein pattern image. The correction unit then corrects the position of the feature point by applying equation (2) with the coordinates of the first feature point as vc(x, y) and the coordinates of the second feature point as vo(x, y).
- the biological information of the hand used according to the present embodiment or the modification is not limited to the vein pattern of the hand.
- a palm print may be used as the biological information of the hand.
- the biometric authentication device and the biometric authentication method disclosed in this specification are applicable to various devices or systems that execute a biometric authentication process between a user's biometric information and preregistered biometric information in order for the user to perform some operation.
- FIG. 9 is a schematic configuration diagram of an example of a computer system in which the biometric authentication device according to each of the above embodiments or modifications thereof is mounted.
- the computer system 100 includes at least one terminal 110 and a server 120.
- the terminal 110 and the server 120 are connected via a wired or wireless communication network 130.
- in FIG. 9, among the constituent elements of the computer system 100, those corresponding to constituent elements of the biometric authentication device 1 shown in FIG. 1 are assigned the same reference numbers as the corresponding elements.
- the terminal 110 is a mobile terminal such as a mobile phone or a tablet terminal, or a terminal that is fixedly installed, and includes a display unit 2, an input unit 3, and a biological information acquisition unit 4. Further, the terminal 110 includes a storage unit 21, an image acquisition control unit 22, and an interface unit 23.
- the storage unit 21 includes, for example, a semiconductor memory circuit, and temporarily stores a biological image generated by the biological information acquisition unit 4.
- the image acquisition control unit 22 includes one or a plurality of processors and peripheral circuits thereof, controls each unit of the terminal 110, and executes various programs that operate on the terminal 110.
- the image acquisition control unit 22 transmits the biometric image showing the hand's vein pattern, generated by the biometric information acquisition unit 4, to the server 120 via the interface unit 23, which has an interface circuit for connecting the terminal 110 to the communication network 130. Further, the image acquisition control unit 22 may also transmit user identification information input via the input unit 3 to the server 120.
- the server 120 includes a storage unit 5, a processing unit 7, and an interface unit 24 having an interface circuit for connecting the server 120 to the communication network 130.
- the processing unit 7 of the server 120 uses the biometric image received via the interface unit 24 to realize each function of the processing unit according to any one of the above embodiments or a modification thereof, thereby performing biometric authentication processing. Execute. Then, the server 120 returns a determination result as to whether or not the authentication is successful to the terminal 110 via the interface unit 24.
- alternatively, the image acquisition control unit 22 of the terminal 110 may execute, among the functions of the processing unit according to the above embodiments or modifications, the processing of the living body region extraction unit, the invariant region extraction unit, the angle calculation unit or inter-finger distance calculation unit, and the vein pattern extraction unit.
- in this case, data relating to the biometric information extracted from the user's biometric image, together with the user's identification information, may be transmitted from the terminal 110 to the server 120.
- on the other hand, the processing unit 7 of the server 120 executes only the processes of the collation unit, the authentication determination unit, and the registration unit among the functions of the processing unit according to the above embodiments. This reduces the load on the server 120, so the computer system 100 can keep the user's waiting time short even when many biometric authentication processes are executed simultaneously.
- a computer program containing instructions for causing a computer to realize the functions of the processing unit according to the above embodiments may be provided in a form recorded on a recording medium such as a magnetic recording medium, an optical recording medium, or a nonvolatile semiconductor memory.
Abstract
Description
It should be understood that both the general description above and the detailed description below are exemplary and explanatory, and do not limit the invention as claimed.

This biometric authentication device uses, for biometric authentication, a biometric image showing a vein pattern of the hand, which is one example of biometric information of a user's hand. In order to correct the deformation of the part of the vein pattern near the bases of the fingers caused by differences in how far the fingers are spread, at the time of registration this biometric authentication device acquires a biometric image representing the vein pattern of the hand with the fingers open and a biometric image representing the vein pattern of the hand with the fingers closed. For each of the two biometric images, the device registers, for each finger, the angle of that finger and the positions of the veins near the base of that finger, together with data representing the shape of the vein pattern. At the time of collation, the device obtains the angle of each finger from the acquired biometric image and, according to the difference between that angle and the angle of the corresponding finger in each of the two registered biometric images, corrects the positions of the veins near each finger in the data representing the shape of the registered vein pattern. The device then collates the data representing the shape of the vein pattern with the corrected vein positions against the data representing the shape of the vein pattern obtained from the biometric image acquired at collation. The device thereby suppresses the drop in the genuine-acceptance rate caused by local deformation of the vein pattern due to the difference in finger posture between collation and registration. Note that the hand vein pattern may be either the vein pattern on the palm side or the vein pattern on the back side of the hand.

In this embodiment, the biometric image is generated so that the fingertips are located at the upper end of the biometric image and the wrist side is located at the lower end.

Each time the biometric information acquisition unit 4 generates a biometric image, it passes that biometric image to the processing unit 7.

The registration process generates, from a biometric image showing the vein pattern of a registered user's hand, i.e., the registered biometric information, data relating to that registered biometric information, and stores the data in the storage unit 5 together with identification information such as the registered user's user name. In this embodiment, as described above, the data relating to the registered biometric information includes data obtained from a biometric image of the vein pattern photographed with the fingers open and data obtained from a biometric image photographed with the fingers closed. Since the same processing is applied to both biometric images, the processing for one biometric image is described below unless a distinction is needed. Note that in the finger-open state, compared with the finger-closed state, the opening angle between at least one pair of adjacent fingers is relatively wide.

As information representing the living body region, the living body region extraction unit 11 generates, for example, a binary image having the same size as the biometric image. The living body region extraction unit 11 sets the value of pixels included in the living body region to '1' and the value of background pixels to '0' in that binary image. The living body region extraction unit 11 then passes the information representing the living body region to the invariant region extraction unit 12, the angle calculation unit 13, and the vein pattern extraction unit 14.

Near the base of a finger, the positions of the veins connected to that finger also vary with the angle of the finger. Therefore, when the vein pattern 301 shown in the image 300 of FIG. 3(A) is compared with the vein pattern 302 shown in the image 310 of FIG. 3(B), the portions of the vein patterns near the bases of the fingers differ. This is apparent, for example, from FIG. 3(C).

The invariant region extraction unit 12 connects mutually adjacent invariant region candidates by applying a labeling process to the invariant region candidates. The invariant region extraction unit 12 then takes the largest of the connected sets of invariant region candidates as the invariant region.

When the invariant region of each biometric image is set based on the reference model image, the centroid of the invariant region of each biometric image is likewise set as the origin of the reference coordinate system. If the biometric image is rotated clockwise by an angle θ with respect to the reference model image, the x and y axes of the reference coordinate system are set to point in directions rotated counterclockwise by the angle θ from the horizontal and vertical directions of the image, respectively.

Note that the angle calculation unit 13 may detect the region in which each individual finger appears on the biometric image using any of various other techniques for detecting finger regions in an image.

The angle calculation unit 13 passes the angle of each finger for each biometric image to the vein pattern extraction unit 14 and the registration unit 15.

The vein pattern extraction unit 14 generates a vein pattern image in which each vein is represented by a line one pixel wide by applying a thinning process to the vein pixels of the binarized biometric image. This vein pattern image is one example of shape data representing the shape of the hand's biometric information.
To this end, the correction unit 16 calculates, for each finger, a position correction coefficient rc for the finger-closed vein pattern image and a position correction coefficient ro for the finger-open vein pattern image according to the following equation:

rc = (φo-φ)/(φo-φc), ro = (φ-φc)/(φo-φc) (1)

Here, φ is the angle, with respect to the x axis of the reference coordinate system, of the finger of interest in the input biometric image, and φc and φo are the angles of that finger with respect to the x axis of the reference coordinate system in the finger-closed and finger-open biometric images, respectively.

dp(x,y) = rc*vc(x,y)+ro*vo(x,y) (2)

vc(x, y) represents the position of a variable vein pixel in the finger-closed vein pattern image, and vo(x, y) represents the position of the variable vein pixel, showing the same portion of the vein as vc(x, y), in the finger-open vein pattern image. The correction unit 16 identifies pairs of variable vein pixels showing the same portion of a vein according to the following procedure.

First, the correction unit 16 detects pairs consisting of an invariant boundary pixel on the finger-closed vein pattern image and an invariant boundary pixel on the finger-open vein pattern image whose coordinates in the reference coordinate system are identical or substantially coincide. The same portion of a vein is presumed to appear in these two invariant boundary pixels. The correction unit 16 therefore takes two variable vein pixels, each connected along a vein to one of these invariant boundary pixels on its vein pattern image and at an equal along-vein distance from that invariant boundary pixel, as a pair of variable vein pixels showing the same portion of the vein.
In FIG. 8, the constituent elements of the processing unit 71 are assigned the same reference numbers as the corresponding constituent elements of the processing unit 7 shown in FIG. 2.
2 display unit
3 input unit
4 biometric information acquisition unit
5 storage unit
6 storage medium access device
7, 71 processing unit
8 storage medium
11 living body region extraction unit
12 invariant region extraction unit
13 angle calculation unit
14 vein pattern extraction unit
15 registration unit
16 correction unit
17 collation unit
18 authentication determination unit
19 inter-finger distance calculation unit
100 computer system
110 terminal
120 server
130 communication network
21 storage unit
22 image acquisition control unit
23, 24 interface unit
Claims (16)
- 手の指が第1の姿勢をとった状態における登録利用者の手の生体情報の形状を表す第1の形状データ及び前記第1の姿勢を表す第1の指標と、前記手の指が第2の姿勢をとった状態における前記登録利用者の手の生体情報の形状を表す第2の形状データ及び前記第2の姿勢を表す第2の指標とを記憶する記憶部と、
利用者の手の生体情報を表した生体画像を生成する生体情報取得部と、
前記生体画像から、前記生体画像に写っている前記利用者の手の指の第3の姿勢を表す第3の指標を算出する姿勢特定部と、
前記生体画像に基づいて、当該生体画像に写っている前記利用者の前記手の生体情報の形状を表す第3の形状データを生成する生体情報抽出部と、
前記第1の指標と前記第2の指標の差に対する、前記第3の指標と前記第1の指標または前記第2の指標との差の比に応じて、前記第3の姿勢と前記第1の姿勢または前記第2の姿勢の違いによる前記生体情報の形状の差を打ち消すように前記第1の形状データまたは前記第2の形状データを補正することで、補正された形状データを求める補正部と、
前記第3の形状データと前記補正された形状データとを照合する照合部と、
を有する生体認証装置。 - 前記第1の指標及び前記第2の指標は、それぞれ、所定の方向に対する前記登録利用者の少なくとも第1の指の角度であり、前記第3の指標は、前記所定の方向に対する前記利用者の前記第1の指の角度である、請求項1に記載の生体認証装置。
- 前記第1の姿勢における前記登録利用者の隣接する2本の指の開き角が前記第2の姿勢における前記2本の指の開き角と異なる、請求項2に記載の生体認証処理装置。
- 前記第1の指標及び前記第2の指標は、それぞれ、前記登録利用者の隣接する第1の指の付け根と第2の指の付け根間の距離であり、前記第3の指標は、前記利用者の前記第1の指の付け根と前記第2の指の付け根間の距離である、請求項1に記載の生体認証装置。
- The biometric authentication device according to any one of claims 1 to 4, wherein the biometric information of the registered user's hand and the biometric information of the user's hand are each a vein pattern of a hand, the first shape data is a first image representing the vein pattern of the registered user's hand in the first posture, and the second shape data is a second image representing the vein pattern of the registered user's hand in the second posture,
the storage unit further stores an invariant region obtained, in each of the first and second images, by excluding, from the region in which the registered user's hand is captured, the portions where the shape of the vein pattern of the hand with the fingers in the first posture does not match the shape of the vein pattern with the fingers in the second posture, and
the correction unit obtains the corrected shape data by taking, as the position of a vein, the internal division point, determined according to the ratio, between a first pixel located a predetermined number of pixels along any vein in the first image from a first point at which that vein crosses the boundary of the invariant region, and a second pixel located the predetermined number of pixels along the vein from the first point in the second image.
- A biometric authentication method comprising:
generating a biometric image representing biometric information of a user's hand;
calculating, from the biometric image, a first index representing a first posture of the fingers of the user's hand captured in the biometric image;
generating, based on the biometric image, first shape data representing the shape of the biometric information of the user's hand captured in the biometric image;
obtaining corrected shape data by correcting second shape data, which represents the shape of the biometric information of a registered user's hand in a state in which the fingers of the hand take a second posture, or third shape data, which represents the shape of the biometric information of the registered user's hand in a state in which the fingers take a third posture, so as to cancel the difference in the shape of the biometric information caused by the difference between the first posture and the second or third posture, according to the ratio of the difference between the first index and a second index representing the second posture or a third index representing the third posture to the difference between the second index and the third index; and
matching the first shape data against the corrected shape data.
- The biometric authentication method according to claim 6, wherein the second index and the third index are each an angle of at least a first finger of the registered user with respect to a predetermined direction, and the first index is the angle of the first finger of the user with respect to the predetermined direction.
- The biometric authentication method according to claim 7, wherein an opening angle between two adjacent fingers of the registered user in the second posture differs from the opening angle between the two fingers in the third posture.
- The biometric authentication method according to claim 6, wherein the second index and the third index are each a distance between the base of a first finger and the base of an adjacent second finger of the registered user, and the first index is the distance between the base of the first finger and the base of the second finger of the user.
- The biometric authentication method according to any one of claims 6 to 9, wherein the biometric information of the registered user's hand and the biometric information of the user's hand are each a vein pattern of a hand, the second shape data is a first image representing the vein pattern of the registered user's hand in the second posture, and the third shape data is a second image representing the vein pattern of the registered user's hand in the third posture, and
obtaining the corrected shape data includes determining, in each of the first and second images, an invariant region by excluding, from the region in which the registered user's hand is captured, the portions where the shape of the vein pattern of the hand with the fingers in the second posture does not match the shape of the vein pattern with the fingers in the third posture, and taking, as the position of a vein, the internal division point, determined according to the ratio, between a first pixel located a predetermined number of pixels along any vein in the first image from a first point at which that vein crosses the boundary of the invariant region, and a second pixel located the predetermined number of pixels along the vein from the first point in the second image.
- A biometric authentication computer program for causing a computer to execute a process comprising:
calculating, from a biometric image representing biometric information of a user's hand, a first index representing a first posture of the fingers of the user's hand captured in the biometric image;
generating, based on the biometric image, first shape data representing the shape of the biometric information of the user's hand captured in the biometric image;
obtaining corrected shape data by correcting second shape data, which represents the shape of the biometric information of a registered user's hand in a state in which the fingers of the hand take a second posture, or third shape data, which represents the shape of the biometric information of the registered user's hand in a state in which the fingers take a third posture, so as to cancel the difference in the shape of the biometric information caused by the difference between the first posture and the second or third posture, according to the ratio of the difference between the first index and a second index representing the second posture or a third index representing the third posture to the difference between the second index and the third index; and
matching the first shape data against the corrected shape data.
- The biometric authentication computer program according to claim 11, wherein the second index and the third index are each an angle of at least a first finger of the registered user with respect to a predetermined direction, and the first index is the angle of the first finger of the user with respect to the predetermined direction.
- The biometric authentication computer program according to claim 12, wherein an opening angle between two adjacent fingers of the registered user in the second posture differs from the opening angle between the two fingers in the third posture.
- The biometric authentication computer program according to claim 11, wherein the second index and the third index are each a distance between the base of a first finger and the base of an adjacent second finger of the registered user, and the first index is the distance between the base of the first finger and the base of the second finger of the user.
- The biometric authentication computer program according to any one of claims 11 to 14, wherein the biometric information of the registered user's hand and the biometric information of the user's hand are each a vein pattern of a hand, the second shape data is a first image representing the vein pattern of the registered user's hand in the second posture, and the third shape data is a second image representing the vein pattern of the registered user's hand in the third posture, and
obtaining the corrected shape data includes determining, in each of the first and second images, an invariant region by excluding, from the region in which the registered user's hand is captured, the portions where the shape of the vein pattern of the hand with the fingers in the second posture does not match the shape of the vein pattern with the fingers in the third posture, and taking, as the position of a vein, the internal division point, determined according to the ratio, between a first pixel located a predetermined number of pixels along any vein in the first image from a first point at which that vein crosses the boundary of the invariant region, and a second pixel located the predetermined number of pixels along the vein from the first point in the second image.
- A biometric information registration device comprising:
an input unit that acquires identification information of a user;
a biometric information acquisition unit that generates a first biometric image representing biometric information of the user's hand in a state in which the fingers of the hand take a first posture, and a second biometric image representing the biometric information of the user's hand in a state in which the fingers take a second posture;
an invariant region extraction unit that determines, in each of the first and second biometric images, an invariant region by excluding, from the region in which the hand is captured, the portions where the shape of the biometric information of the user's hand with the fingers in the first posture does not match the shape of the biometric information with the fingers in the second posture;
a posture identification unit that calculates, from the first and second biometric images respectively, a first index representing the first posture and a second index representing the second posture;
a biometric information extraction unit that generates first shape data representing the shape of the biometric information of the user's hand captured in the first biometric image and second shape data representing the shape of the biometric information of the user's hand captured in the second biometric image; and
a registration unit that stores the invariant region, the first and second indices, and the first and second shape data in a storage unit together with the identification information of the user.
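The posture correction recited in the claims above, reduced to its geometric core, can be sketched as follows. This is a minimal illustration under stated assumptions, not the patented implementation: `index1` and `index2` stand for the two enrolled posture indices (e.g. finger angles or inter-finger-base distances), `index3` for the index measured at matching time, and each vein is represented by hypothetical lists of corresponding 2-D pixel coordinates sampled along it in the two enrolled images. All names are illustrative.

```python
def interpolation_ratio(index1: float, index2: float, index3: float) -> float:
    """Where the observed posture lies between the two enrolled postures:
    the ratio of (index3 - index1) to (index1 - index2 difference)."""
    if index2 == index1:
        raise ValueError("the two enrolled postures must differ")
    return (index3 - index1) / (index2 - index1)

def correct_vein_points(points1, points2, ratio):
    """For each pair of corresponding pixels sampled along the same vein in
    the two enrolled images, take the internal division point according to
    `ratio`, yielding a vein shape matched to the observed posture."""
    corrected = []
    for (x1, y1), (x2, y2) in zip(points1, points2):
        corrected.append(((1.0 - ratio) * x1 + ratio * x2,
                          (1.0 - ratio) * y1 + ratio * y2))
    return corrected

# Example: fingers enrolled at 0 and 20 degrees, observed at 5 degrees.
r = interpolation_ratio(0.0, 20.0, 5.0)
print(r)                                               # 0.25
print(correct_vein_points([(10, 40)], [(18, 44)], r))  # [(12.0, 41.0)]
```

The internal division makes the corrected template drift linearly from the first enrolled shape toward the second as the observed posture moves between the two enrolled postures, so the posture-induced deformation is cancelled before matching.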
Priority Applications (6)
Application Number | Priority Date | Filing Date | Title |
---|---|---|---|
EP12872969.6A EP2833319B1 (en) | 2012-03-30 | 2012-03-30 | Biometric authentication device, biometric authentication method, and biometric authentication computer program |
KR1020147027430A KR101603469B1 (ko) | 2012-03-30 | 2012-03-30 | Biometric authentication device, biometric authentication method, and computer program for biometric authentication
PCT/JP2012/058650 WO2013145280A1 (ja) | 2012-03-30 | 2012-03-30 | Biometric authentication device, biometric authentication method, and computer program for biometric authentication
JP2014507241A JP5930023B2 (ja) | 2012-03-30 | 2012-03-30 | Biometric authentication device, biometric authentication method, and computer program for biometric authentication
CN201280071889.1A CN104246824B (zh) | 2012-03-30 | 2012-03-30 | Biometric authentication device and biometric authentication method
US14/495,222 US9305209B2 (en) | 2012-03-30 | 2014-09-24 | Biometric authentication apparatus, biometric authentication method, and computer program for biometric authentication |
Applications Claiming Priority (1)
Application Number | Priority Date | Filing Date | Title |
---|---|---|---|
PCT/JP2012/058650 WO2013145280A1 (ja) | 2012-03-30 | 2012-03-30 | Biometric authentication device, biometric authentication method, and computer program for biometric authentication
Related Child Applications (1)
Application Number | Title | Priority Date | Filing Date |
---|---|---|---|
US14/495,222 Continuation US9305209B2 (en) | 2012-03-30 | 2014-09-24 | Biometric authentication apparatus, biometric authentication method, and computer program for biometric authentication |
Publications (1)
Publication Number | Publication Date |
---|---|
WO2013145280A1 true WO2013145280A1 (ja) | 2013-10-03 |
Family
ID=49258643
Family Applications (1)
Application Number | Title | Priority Date | Filing Date |
---|---|---|---|
PCT/JP2012/058650 WO2013145280A1 (ja) | 2012-03-30 | 2012-03-30 | Biometric authentication device, biometric authentication method, and computer program for biometric authentication
Country Status (6)
Country | Link |
---|---|
US (1) | US9305209B2 (ja) |
EP (1) | EP2833319B1 (ja) |
JP (1) | JP5930023B2 (ja) |
KR (1) | KR101603469B1 (ja) |
CN (1) | CN104246824B (ja) |
WO (1) | WO2013145280A1 (ja) |
Families Citing this family (11)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
US9646147B2 (en) * | 2014-09-26 | 2017-05-09 | The United States Of America As Represented By The Secretary Of The Navy | Method and apparatus of three-type or form authentication with ergonomic positioning |
CN105556410B * | 2014-12-31 | 2018-06-26 | SZ DJI Technology Co., Ltd. | Mobile object and method and system for automatic antenna alignment thereof |
JP6531395B2 * | 2015-01-08 | 2019-06-19 | Fujitsu Limited | Three-dimensional shape measurement device, three-dimensional shape measurement method, and program |
JP6589309B2 * | 2015-03-16 | 2019-10-16 | Fujitsu Limited | Terminal device, biometric authentication program, and biometric authentication method |
US10055661B2 (en) * | 2015-03-24 | 2018-08-21 | Intel Corporation | Skin texture-based authentication |
KR102536252B1 * | 2016-03-25 | 2023-05-25 | Samsung Display Co., Ltd. | Display device and method of manufacturing display device |
CN106650703A * | 2017-01-06 | 2017-05-10 | Xiamen ZKTeco Biometric Identification Technology Co., Ltd. | Palm anti-counterfeiting method and device |
KR102331464B1 * | 2017-04-18 | 2021-11-29 | Samsung Electronics Co., Ltd. | Method for acquiring biometric information using a biometric information sensing area formed in a display area, and electronic device supporting the same |
US11711440B2 (en) * | 2021-01-06 | 2023-07-25 | Shopify Inc. | Code monitoring to recommend alternative tracking applications |
CN112926516B * | 2021-03-26 | 2022-06-14 | Changchun University of Technology | Robust method for extracting a region of interest from finger vein images |
US11823488B1 (en) * | 2021-03-29 | 2023-11-21 | Amazon Technologies, Inc. | System for training embedding network |
Citations (6)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
JP2002117405A | 2000-10-05 | 2002-04-19 | Nippon Telegr & Teleph Corp <Ntt> | Palm shape authentication method
JP2006277341A * | 2005-03-29 | 2006-10-12 | Fujitsu Ltd | Device for simultaneous input of plural items of biometric information and device for simultaneous authentication of plural items of biometric information
JP2010015365A | 2008-07-03 | 2010-01-21 | Hitachi Maxell Ltd | Biometric authentication device and biometric authentication method
JP2010152706A * | 2008-12-25 | 2010-07-08 | Fujitsu Ltd | Biometric authentication device
WO2011030441A1 * | 2009-09-11 | 2011-03-17 | Fujitsu Limited | Biometric authentication device, biometric authentication system, and biometric authentication method
WO2012029150A1 * | 2010-09-01 | 2012-03-08 | Fujitsu Limited | Biometric authentication system, biometric authentication method, and program
Family Cites Families (9)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
US8190239B2 (en) * | 2002-09-03 | 2012-05-29 | Fujitsu Limited | Individual identification device |
JP4601380B2 * | 2004-10-08 | 2010-12-22 | Fujitsu Limited | Registration method for biometric authentication system, biometric authentication system, and program therefor |
JP4680158B2 * | 2006-09-13 | 2011-05-11 | Hitachi, Ltd. | Biometric authentication device |
JP4862644B2 * | 2006-12-15 | 2012-01-25 | Sony Corporation | Registration device, registration method, and program |
JP5151396B2 * | 2007-10-29 | 2013-02-27 | Hitachi, Ltd. | Finger vein authentication device |
JP5271669B2 * | 2008-10-31 | 2013-08-21 | Hitachi, Ltd. | Biometric authentication method and system |
JP5098973B2 * | 2008-11-27 | 2012-12-12 | Fujitsu Limited | Biometric authentication device, biometric authentication method, and biometric authentication program |
JP2012098988A * | 2010-11-04 | 2012-05-24 | Sony Corporation | Image processing device and method, and program |
JP5708662B2 * | 2010-12-28 | 2015-04-30 | Fujitsu Limited | Biometric authentication device, biometric authentication method, and biometric authentication program |
-
2012
- 2012-03-30 KR KR1020147027430A patent/KR101603469B1/ko active IP Right Grant
- 2012-03-30 CN CN201280071889.1A patent/CN104246824B/zh active Active
- 2012-03-30 JP JP2014507241A patent/JP5930023B2/ja active Active
- 2012-03-30 WO PCT/JP2012/058650 patent/WO2013145280A1/ja active Application Filing
- 2012-03-30 EP EP12872969.6A patent/EP2833319B1/en active Active
-
2014
- 2014-09-24 US US14/495,222 patent/US9305209B2/en active Active
Non-Patent Citations (1)
Title |
---|
See also references of EP2833319A4 |
Cited By (3)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
JP2017049955A (ja) * | 2015-09-04 | 2017-03-09 | Fujitsu Limited | Biometric authentication device, biometric authentication method, and biometric authentication program |
US20220138297A1 (en) * | 2019-07-19 | 2022-05-05 | Hewlett-Packard Development Company, L.P. | Biometric input device |
US12001534B2 (en) * | 2019-07-19 | 2024-06-04 | Hewlett-Packard Development Company, L.P. | Biometric input device |
Also Published As
Publication number | Publication date |
---|---|
EP2833319B1 (en) | 2018-06-06 |
CN104246824B (zh) | 2017-05-10 |
KR20140131984A (ko) | 2014-11-14 |
CN104246824A (zh) | 2014-12-24 |
US20150010215A1 (en) | 2015-01-08 |
JP5930023B2 (ja) | 2016-06-08 |
EP2833319A1 (en) | 2015-02-04 |
JPWO2013145280A1 (ja) | 2015-08-03 |
US9305209B2 (en) | 2016-04-05 |
EP2833319A4 (en) | 2015-06-24 |
KR101603469B1 (ko) | 2016-03-14 |
Similar Documents
Publication | Publication Date | Title |
---|---|---|
JP5930023B2 (ja) | Biometric authentication device, biometric authentication method, and computer program for biometric authentication | |
JP6467852B2 (ja) | Biometric information correction device, biometric information correction method, and computer program for biometric information correction | |
JP5799586B2 (ja) | Biometric authentication device, biometric authentication method, and computer program for biometric authentication | |
JP2020074174A (ja) | Systems and methods for performing fingerprint-based user authentication using images captured with a mobile device | |
JP5505504B2 (ja) | Biometric authentication device, biometric authentication method, computer program for biometric authentication, and biometric information registration device | |
JP5812109B2 (ja) | Biometric information processing device, biometric information processing method, and computer program for biometric information processing | |
JP5699845B2 (ja) | Biometric information processing device, biometric information processing method, and computer program for biometric information processing | |
JP6369078B2 (ja) | Image correction device, image correction method, and computer program for image correction | |
JP5799960B2 (ja) | Biometric authentication device, biometric authentication method, and computer program for biometric authentication | |
JP6024141B2 (ja) | Biometric information processing device, biometric information processing method, and biometric information processing program | |
JP2010146073A (ja) | Biometric authentication device, biometric authentication method, computer program for biometric authentication, and computer system | |
WO2013161077A1 (ja) | Biometric authentication device, biometric authentication program, and biometric authentication method | |
JP5915336B2 (ja) | Biometric authentication device, biometric authentication method, and computer program for biometric authentication | |
JP6056398B2 (ja) | Biometric authentication device, biometric authentication method, and computer program for biometric authentication | |
JP6349817B2 (ja) | Alignment device, alignment method, and computer program for alignment | |
JP2015129997A (ja) | Biometric information processing device, biometric information processing method, and computer program for biometric information processing | |
JP5509769B2 (ja) | Biometric authentication device and biometric authentication method | |
Legal Events
Date | Code | Title | Description |
---|---|---|---|
121 | Ep: the epo has been informed by wipo that ep was designated in this application |
Ref document number: 12872969 Country of ref document: EP Kind code of ref document: A1 |
|
ENP | Entry into the national phase |
Ref document number: 2014507241 Country of ref document: JP Kind code of ref document: A |
|
WWE | Wipo information: entry into national phase |
Ref document number: 2012872969 Country of ref document: EP |
|
ENP | Entry into the national phase |
Ref document number: 20147027430 Country of ref document: KR Kind code of ref document: A |
|
NENP | Non-entry into the national phase |
Ref country code: DE |