CN112906448A - Biometric authentication device - Google Patents

Biometric authentication device

Info

Publication number
CN112906448A
CN112906448A (application CN202011237076.8A)
Authority
CN
China
Prior art keywords
finger
image
authentication device
saturation region
unit
Prior art date
Legal status
Pending
Application number
CN202011237076.8A
Other languages
Chinese (zh)
Inventor
野野村洋
宫武孝文
长坂晃朗
三浦直人
松田友辅
中崎溪一郎
Current Assignee
Hitachi Industry and Control Solutions Co Ltd
Original Assignee
Hitachi Industry and Control Solutions Co Ltd
Priority date
Filing date
Publication date
Application filed by Hitachi Industry and Control Solutions Co Ltd filed Critical Hitachi Industry and Control Solutions Co Ltd
Publication of CN112906448A


Classifications

    • G PHYSICS
    • G06 COMPUTING; CALCULATING OR COUNTING
    • G06V IMAGE OR VIDEO RECOGNITION OR UNDERSTANDING
    • G06V40/00 Recognition of biometric, human-related or animal-related patterns in image or video data
    • G06V40/10 Human or animal bodies, e.g. vehicle occupants or pedestrians; Body parts, e.g. hands
    • G06V40/14 Vascular patterns

Landscapes

  • Engineering & Computer Science (AREA)
  • Human Computer Interaction (AREA)
  • Physics & Mathematics (AREA)
  • General Physics & Mathematics (AREA)
  • Multimedia (AREA)
  • Theoretical Computer Science (AREA)
  • Image Input (AREA)
  • Collating Specific Patterns (AREA)
  • Measurement Of The Respiration, Hearing Ability, Form, And Blood Characteristics Of Living Organisms (AREA)

Abstract

The invention provides a biometric authentication device capable of maintaining authentication accuracy even when a presentation unit is miniaturized. A biometric authentication device including a presenting unit for presenting a biometric object further includes: a light source that irradiates the living body with an electromagnetic wave; an imaging unit that captures an image based on the electromagnetic wave scattered by the living body; an extraction unit that extracts a luminance saturation region in the image, the luminance saturation region being a region in which luminance exceeds an upper limit value of the imaging unit; and an authentication unit that authenticates an individual by comparing, with respect to the luminance saturation region, registered data registered in advance with newly generated data.

Description

Biometric authentication device
Technical Field
The present invention relates to a biometric authentication device for authenticating a person using a near-infrared image captured by irradiating a finger of the person with near-infrared light.
Background
As one of biometric authentication devices that can perform personal authentication with high accuracy, there is a finger vein authentication device that uses a pattern of blood vessels that differs for each person. The finger vein authentication device is a device that authenticates an individual by comparing a blood vessel pattern extracted from a near-infrared image of a finger with a blood vessel pattern registered in advance. In a biometric authentication device such as a finger vein authentication device, it is important to improve authentication accuracy and to reduce the size of the device.
Patent document 1 discloses the following technique: in order to improve the authentication accuracy of a finger vein authentication device, the irradiation light amount is adjusted so that the ratio of the area of the luminance saturation region to the area of the finger region in the near-infrared image of the finger falls within a predetermined range, and a near-infrared image for extracting a blood vessel pattern is then captured.
Patent document 1: japanese patent laid-open publication No. 2019-96168
Disclosure of Invention
However, patent document 1 does not mention downsizing of the biometric authentication device. To miniaturize a finger vein authentication device, the presentation unit on which a finger is presented for capturing a near-infrared image must also be miniaturized, and miniaturization of the presentation unit causes a reduction in authentication accuracy. That is, if the presentation unit is miniaturized, the blood vessel pattern extracted from the near-infrared image becomes small, and the authentication accuracy is degraded.
Therefore, an object of the present invention is to provide a biometric authentication device capable of maintaining authentication accuracy even when a presentation unit on which a biometric object such as a finger is presented is miniaturized.
In order to achieve the above object, the present invention provides a biometric authentication device including a presenting unit for presenting a biometric object, the biometric authentication device further including: a light source that irradiates the living body with an electromagnetic wave; an imaging unit that captures an image based on the electromagnetic wave scattered by the living body; an extraction unit that extracts a luminance saturation region in the image, the luminance saturation region being a region in which luminance exceeds an upper limit value of the imaging unit; and an authentication unit that authenticates an individual by comparing, with respect to the luminance saturation region, registered data registered in advance with newly generated data.
Effects of the invention
According to the present invention, it is possible to provide a biometric authentication device capable of maintaining authentication accuracy even when the presentation unit on which a biometric object such as a finger is presented is miniaturized.
Drawings
Fig. 1 is a schematic configuration diagram of the biometric authentication device of example 1.
Fig. 2 is a perspective view showing the presentation unit of example 1.
Fig. 3 is a diagram showing an example of the flow of the processing of example 1.
Fig. 4A to 4C are diagrams showing examples of near-infrared images of a finger presented to the presentation unit in example 1.
Fig. 5 is a diagram showing a modification of the flow of the processing of example 1.
Fig. 6 is a diagram showing examples of near-infrared images of a finger captured at different irradiation light amounts.
Fig. 7 is a diagram showing an example of a near-infrared image of a fake finger presented to the presentation unit in example 1.
Fig. 8 is a diagram showing an example of a finger presented obliquely to the presentation unit.
Fig. 9 is a diagram illustrating a simulated image of a near-infrared image of a finger presented obliquely.
Fig. 10 is a perspective view showing the presentation unit of example 3.
Fig. 11A and 11B are diagrams showing examples of visible light irradiated onto a finger presented to the presentation unit of example 3.
Fig. 12 is a schematic configuration diagram of the biometric authentication device of example 4.
Fig. 13 is a diagram showing an example of the flow of the processing of example 4.
Fig. 14 is a diagram showing an example of the flow of the processing of example 5.
Fig. 15 is a perspective view showing the presentation unit of example 6.
Fig. 16A and 16B are diagrams showing examples of near-infrared images of a finger presented to the presentation unit in example 6.
Description of the reference symbols
10: a finger; 100: a presentation unit; 101: a finger placement stage; 102: a fingertip support stage; 103: a diffusion plate; 104: a light-transmitting portion; 105: a near-infrared light source; 110: a housing; 111: a near-infrared imaging unit; 112: a near-infrared light source control unit; 120: a computer; 121: a CPU; 122: a memory; 123: an interface; 124: a storage device; 125: a display device; 126: a keyboard; 127: a speaker; 1000: a visible light source; 1101: a left irradiation region; 1102: a right irradiation region; 1201: a visible light imaging unit; 1202: a visible light source control unit; 1500: a light shielding portion.
Detailed Description
Hereinafter, embodiments of the present invention will be described with reference to the drawings. In the following description, a finger is taken as an example of a living body, and near infrared rays are taken as an example of electromagnetic waves.
[ example 1]
The biometric authentication device according to the present embodiment will be described with reference to fig. 1. Fig. 1 is a schematic configuration diagram of the biometric authentication device according to the present embodiment. The biometric authentication device of the present embodiment includes a presentation unit 100 and a computer 120 provided on a housing 110 in order to execute authentication processing using a near infrared image of a finger 10. The presentation unit 100 presents the finger 10.
The presentation unit 100 of the present embodiment will be described with reference to Fig. 2. Fig. 2 is a perspective view of the presentation unit 100 of the present embodiment. The presentation unit 100 of the present embodiment includes a finger placement stage 101, a fingertip support stage 102, a diffusion plate 103, a light-transmitting portion 104, and a plurality of near-infrared light sources 105.
The fingertip support stage 102 is a member against which the tip of the finger 10 abuts and which covers the side of the finger 10, and is used for positioning the finger 10 in the longitudinal direction. The finger placement stage 101 is a member on which the base of the finger 10 is placed and which positions the finger 10 in the thickness direction. The finger placement stage 101 and the fingertip support stage 102 are made of a material that blocks ambient light.
The plurality of near-infrared light sources 105 are arranged beside the finger 10 along its longitudinal direction, and irradiate near-infrared light onto the side of the finger 10 through the diffusion plate 103. The diffusion plate 103 is disposed between the near-infrared light sources 105 and the finger 10, and diffuses the near-infrared light emitted from the near-infrared light sources 105 in the longitudinal direction of the finger 10. A milky-white plastic material, for example, is used for the diffusion plate 103. The light-transmitting portion 104 is a transparent plate disposed between the finger 10 and the housing 110, and prevents foreign matter such as dust and dirt from entering the housing 110. A filter that transmits only light of a specific wavelength, such as near-infrared light, may be used for the light-transmitting portion 104.
Returning to Fig. 1, the housing 110 is provided with a near-infrared imaging unit 111 and a near-infrared light source control unit 112. The near-infrared imaging unit 111 detects the near-infrared light transmitted through the light-transmitting portion 104 and captures a near-infrared image of the finger 10. That is, the near-infrared light emitted from the near-infrared light sources 105, incident on the finger 10, and scattered by the finger 10 is captured as a near-infrared image of the finger 10. The near-infrared image of the finger 10 is sent to the computer 120 for authentication processing. The authentication process using the near-infrared image of the finger 10 will be described later with reference to Fig. 3. The near-infrared light source control unit 112 controls the plurality of near-infrared light sources 105 based on instructions transmitted from the computer 120, for example, to selectively turn the near-infrared light sources 105 on or off, or to increase or decrease their irradiation light amount.
The computer 120 includes a CPU121, a memory 122, and a plurality of interfaces 123. The CPU121 loads a program stored in the storage device 124, for example a program for authentication processing, together with the data necessary for its execution into the memory 122, executes it, and controls the operation of each unit. The memory 122 holds the programs executed by the CPU121 and the data being processed. The interfaces 123 connect the near-infrared imaging unit 111 and the near-infrared light source control unit 112, as well as a storage device 124, a display device 125, a keyboard 126, and a speaker 127. Further, a device that operates according to the result of the authentication process, such as a gate or a door, is connected to an interface 123.
The storage device 124 stores the programs executed by the CPU121 and the data necessary for their execution, for example the registered data used for authentication processing. The storage device 124 is a recording device such as an HDD (Hard Disk Drive) or an SSD (Solid State Drive), or a device that reads from or writes to a recording medium such as an IC card, an SD card, or a DVD. The display device 125 is a device that displays the result of program execution, for example the result of the authentication process, and is specifically a liquid crystal display or the like. The keyboard 126 is a device through which operation instructions for the computer 120 are input. The speaker 127 is a device that outputs sound, such as the result of program execution, for example the result of the authentication process.
An example of the flow of the processing of the present embodiment will be described with reference to fig. 3. Note that the flow of the following processing may be executed by the CPU121, or may be executed by an external server.
(S301)
The CPU121 instructs the near-infrared imaging unit 111 to acquire a near-infrared image. Specifically, near infrared rays with a predetermined irradiation light amount are irradiated from the near infrared light source 105 on one of the right and left sides of the finger 10 to capture a near infrared image, and then near infrared rays are irradiated from the other side to capture a near infrared image. Alternatively, near infrared rays may be simultaneously irradiated from both the right and left sides of the finger 10 to capture a near infrared image.
Examples of the near-infrared image of the finger 10 will be described with reference to Fig. 4A to 4C. Fig. 4A is an example of a near-infrared image of the finger 10 when near-infrared light is irradiated from only the left side, Fig. 4B is an example when near-infrared light is irradiated from only the right side, and Fig. 4C is an example when near-infrared light is irradiated from both sides simultaneously. Because the fingertip support stage 102 positions the finger 10 in the longitudinal direction, a near-infrared image including the first joint and the second joint of the finger 10 is captured. The black portion is the background region, which does not include the finger; the white portion is the luminance saturation region, a region whose luminance exceeds the upper limit value detectable by the near-infrared imaging unit 111; and the gray portion is the non-luminance-saturated region, which is neither the background region nor the luminance saturation region. In the luminance saturation region, luminance equal to the upper limit value of the near-infrared imaging unit 111 is uniformly distributed. The blood vessel pattern is contained in the non-luminance-saturated region. The union of the luminance saturation region and the non-luminance-saturated region is called the finger region. Further, the luminance saturation region is wider at the first joint and the second joint than at other portions.
There are individual differences in the luminance saturation region of the near-infrared image of the finger 10, and the size of the luminance saturation region does not change even if the presentation unit 100 is miniaturized, so the authentication process in the present embodiment is performed using the luminance saturation region. Further, when near-infrared light is irradiated from only one of the left and right sides, the luminance saturation region can be obtained more accurately, whereas when near-infrared light is irradiated from both sides simultaneously, a near-infrared image can be acquired in a shorter time.
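For illustration only (not part of the patent), the three-way split into background region, luminance saturation region, and non-luminance-saturated region described above could be sketched roughly as follows; the two threshold values, the function name, and the assumption of a single-channel 8-bit image are hypothetical.

```python
import numpy as np

def segment_regions(image, background_threshold=30, saturation_threshold=250):
    """Split a grayscale near-infrared image (2-D uint8 array) into the three
    regions described in the text. Thresholds are illustrative assumptions."""
    background = image <= background_threshold   # black portion: no finger
    saturated = image >= saturation_threshold    # white portion: luminance at the sensor limit
    non_saturated = ~background & ~saturated     # gray portion: contains the vessel pattern
    finger_region = saturated | non_saturated    # union = "finger region"
    return background, saturated, non_saturated, finger_region
```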
(S302)
The CPU121 determines whether or not the finger 10 is included in the near-infrared image acquired in S301. Specifically, the near-infrared image acquired in S301 is compared with the near-infrared image of only the background region and determined. If the finger 10 is included, the process proceeds to S303, and if the finger 10 is not included, the process returns to S301, and the near-infrared image is acquired again.
(S303)
The CPU121 extracts a luminance saturation region from the near-infrared image of the finger 10 acquired in S301. That is, the CPU121 functions as an extraction unit that extracts a luminance saturation region. In the extraction of the luminance saturated region, for example, threshold processing of the luminance value is used.
In addition, the near-infrared image of the finger 10 may be normalized before the luminance saturation region is extracted. Specifically, the average finger width is calculated from the contour of the finger region, which is extracted based on the background region of the near-infrared image of the finger 10, and the near-infrared image of the finger 10 is enlarged or reduced so that the average finger width becomes a predetermined value. Normalizing the near-infrared image of the finger 10 improves the accuracy of the authentication process.
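A minimal sketch of the threshold-based extraction and the finger-width normalization described above is given below, purely as an illustration; the threshold values, the target width, the nearest-neighbour resizing, and all names are assumptions rather than the patent's implementation.

```python
import numpy as np

def extract_saturation_region(image, saturation_threshold=250):
    """Boolean mask of pixels at or above the imaging unit's upper limit,
    approximated here by a fixed threshold on an 8-bit grayscale image."""
    return image >= saturation_threshold

def normalize_finger_width(image, background_threshold=30, target_width=120):
    """Rescale the image so that the average finger width equals target_width
    pixels, using the background region to find the finger area."""
    finger_mask = image > background_threshold          # finger = non-background
    widths = finger_mask.sum(axis=0)                    # finger width per column
    widths = widths[widths > 0]
    if widths.size == 0:
        return image                                    # no finger detected
    scale = target_width / widths.mean()
    h, w = image.shape
    new_h, new_w = max(1, int(h * scale)), max(1, int(w * scale))
    rows = (np.arange(new_h) / scale).astype(int).clip(0, h - 1)
    cols = (np.arange(new_w) / scale).astype(int).clip(0, w - 1)
    return image[np.ix_(rows, cols)]                    # nearest-neighbour resize
```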
(S304)
The CPU121 collates the generated data newly generated from the luminance saturation region extracted in S303 with the registered data registered in advance with respect to the luminance saturation region. The registered data may be an image of a luminance saturation region extracted from a near-infrared image of the finger 10 captured in advance, or may be a measurement value measured from the image of the luminance saturation region, for example, a value of the width of the luminance saturation region. The generated data is generated in the same form as the registered data. That is, in the case where the registered data is an image of a luminance saturation region, the generated data is an image of the luminance saturation region extracted in S303, and in the case where the registered data is a value of the width of the luminance saturation region, the generated data is a value of the width of the luminance saturation region extracted in S303.
(S305)
The CPU121 determines whether the degree of inconsistency between the registered data and the generated data is lower than a preset threshold. That is, the CPU121 functions as an authentication unit that authenticates an individual by comparing the registered data with the generated data. When the registered data and the generated data are images of the luminance saturation region, the two images are superimposed and compared, and the degree of inconsistency is calculated. When the registered data and the generated data are values of the width of the luminance saturation region, the width values are compared for each position along the longitudinal direction of the finger 10, and the degree of inconsistency is calculated. If the degree of inconsistency is lower than the threshold value, the process proceeds to S306; otherwise, the process proceeds to S307.
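As a hedged illustration of S304 and S305 for the width-value form of the data, the width profile of the saturation region and a simple degree-of-inconsistency measure might look like the following; the patent does not specify the measure, so the mean absolute difference and the threshold value are assumptions.

```python
import numpy as np

def width_profile(saturation_mask):
    """Width of the luminance saturation region at each position along the
    finger's longitudinal axis (assumed to run along the columns)."""
    return saturation_mask.sum(axis=0)

def degree_of_inconsistency(registered_widths, generated_widths):
    """Compare width values position by position; smaller means more alike."""
    n = min(len(registered_widths), len(generated_widths))
    r = np.asarray(registered_widths[:n], dtype=float)
    g = np.asarray(generated_widths[:n], dtype=float)
    return float(np.abs(r - g).mean())

def authenticate(registered_widths, generated_widths, threshold=5.0):
    """S305: authentication succeeds if the inconsistency is below the threshold."""
    return degree_of_inconsistency(registered_widths, generated_widths) < threshold
```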
(S306)
The CPU121 performs the post-processing for successful authentication. Specifically, when the biometric authentication device of the present embodiment is attached to a gate or a door, the state of the gate or door is changed. That is, a closed gate is opened, and a locked door is unlocked.
(S307)
The CPU121 performs the post-processing for failed authentication. Specifically, when the biometric authentication device of the present embodiment is attached to a gate or a door, the state of the gate or door is maintained. That is, a closed gate remains closed, and a locked door remains locked.
According to the flow of the process described above, the authentication process is performed using the luminance saturation region extracted from the near-infrared ray image of the finger 10.
A modified example of the flow of the processing of the present embodiment will be described with reference to fig. 5. Note that the flow of the following processing may be executed by the CPU121, or may be executed by an external server. In fig. 5, S501 to S503 are added instead of S304 of fig. 3, and the other steps are the same as those in fig. 3, and therefore S501 to S503 will be described.
(S501)
The CPU121 instructs the near-infrared light source control unit 112 to increase the irradiation light amount of the near-infrared light emitted from the near-infrared light source 105. The increase in the irradiation light amount is a predetermined value.
(S502)
The CPU121 determines whether the irradiation light amount of the near infrared ray reaches the upper limit value of the near infrared ray light source 105. If the irradiation light amount reaches the upper limit value, the process proceeds to S503, and if the irradiation light amount does not reach the upper limit value, the process returns to S301. That is, a plurality of near-infrared images captured at different irradiation light amounts are acquired through the flow of the processing in S301 to S502, and a luminance saturation region is extracted from each near-infrared image.
Examples of near-infrared images of the finger 10 captured at different irradiation light amounts will be described with reference to Fig. 6. Fig. 6 shows near-infrared images of the finger 10 captured under left-side and right-side illumination at three irradiation light amounts (small, medium, and large), together with composite images obtained by combining the left-side and right-side images for each light amount. Each composite image is generated by adding the regions with luminance higher than a predetermined threshold extracted from the finger regions of the two images.
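A possible sketch of the composite-image step just described is shown below; the threshold value and the 8-bit clipping are illustrative assumptions.

```python
import numpy as np

def composite_bright_regions(left_image, right_image, threshold=128):
    """Add the regions brighter than a threshold taken from the two finger images
    (left-side and right-side illumination) into one composite image."""
    left = np.where(left_image > threshold, left_image, 0).astype(np.uint16)
    right = np.where(right_image > threshold, right_image, 0).astype(np.uint16)
    return np.clip(left + right, 0, 255).astype(np.uint8)
```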
As shown in fig. 6, as the irradiation light amount increases, the width of the luminance saturation region expands, and particularly, the expansion of the width is significant at the first joint and the second joint. Since there is a personal difference in the expansion of the width of the luminance saturation region accompanying the increase in the irradiation light amount, the authentication process is performed using each luminance saturation region for each different irradiation light amount in the present modification. By using the respective luminance saturation regions for each different irradiation light amount, the authentication accuracy can be improved.
(S503)
The CPU121 collates the generated data newly generated from the luminance saturation region for each different irradiation light amount with the registered data. The registration data is registered in advance for each different irradiation light amount. That is, the registration data is collated with the generation data for each of the luminance saturation regions of different irradiation light amounts. In this step, it is also possible to determine whether the finger 10 being presented is a fake finger made of a plastic material or the like.
An example of a near-infrared image of a fake finger captured for each different irradiation light amount will be described with reference to fig. 7. Since the plastic material used in the artificial finger is homogeneous, the boundary between the luminance saturated region and the non-luminance saturated region is linear regardless of the change in the amount of illumination light. On the other hand, in the near-infrared image of the finger 10 which is not a dummy finger, as shown in fig. 6, the boundary between both regions is not linear, and is curved particularly in the vicinity of the joint portion, and is largely curved as the irradiation light amount increases.
Therefore, an approximate straight line of the boundary between the luminance saturated region and the non-luminance saturated region is calculated, and if the deviation between the boundary and the approximate straight line is equal to or less than a certain limit, it can be determined as a fake finger. In addition, when a photograph or the like of the near infrared image of the finger 10 is displayed as a fake finger in the display unit 100, the luminance saturation region is not extracted, or the luminance saturation region does not change even if the irradiation light amount changes, and therefore it can be determined that the finger is a fake finger.
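The fake-finger check described above could be sketched as follows, assuming the boundary is available as one y-coordinate per column and using the maximum distance from a least-squares line as the deviation; the deviation limit is an arbitrary illustrative value.

```python
import numpy as np

def is_fake_finger(boundary_y, deviation_limit=2.0):
    """Fit an approximate straight line to the boundary between the saturated and
    non-saturated regions; a nearly straight boundary suggests a homogeneous
    fake finger, while a real finger's boundary curves near the joints."""
    boundary_y = np.asarray(boundary_y, dtype=float)
    x = np.arange(len(boundary_y))
    slope, intercept = np.polyfit(x, boundary_y, deg=1)
    deviation = np.abs(boundary_y - (slope * x + intercept)).max()
    return deviation <= deviation_limit
```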
According to the flow of the process described above, the authentication process is performed using the luminance saturation region extracted from the near-infrared ray image of the finger 10. Further, since the authentication process is performed using the respective luminance saturation regions for each different irradiation light amount, the authentication accuracy can be further improved.
As described above, according to the present embodiment, since the size of the luminance saturation region does not change even if the size of the presentation unit 100 changes, it is possible to provide a biometric authentication device capable of maintaining authentication accuracy even when the presentation unit on which a biometric object such as a finger is presented is miniaturized.
[ example 2]
In example 1, the authentication process using the luminance saturation region extracted from the near-infrared image of the finger 10 presented parallel to the presentation unit 100 was described. When the user's finger 10 is thin, the finger 10 may be presented obliquely to the presentation unit 100. In the present embodiment, the case where the finger 10 is presented obliquely with respect to the presentation unit 100 will be described. The configuration of the biometric authentication device and the flow of the authentication process in the present embodiment are the same as those in example 1, and therefore their description is omitted.
An example of the finger 10 presented obliquely to the presentation unit 100 will be described with reference to Fig. 8. In Fig. 8, the tip side of the finger 10 is close to the right side, and the base side is close to the left side. That is, since the tip side of the finger 10 is closer to the right near-infrared light source 105 and farther from the left near-infrared light source 105, the luminance saturation region in the near-infrared image captured under right-side illumination is large at the tip, and the luminance saturation region under left-side illumination is small there. Conversely, since the base side of the finger 10 is farther from the right light source and closer to the left light source, the luminance saturation region under right-side illumination is small at the base, and the luminance saturation region under left-side illumination is large there. As a result, the difference from the registered data of the luminance saturation region becomes large, and the authentication process may become difficult.
Therefore, in the present embodiment, a simulated image imitating a near-infrared image of the finger 10 presented obliquely to the presentation unit 100 is generated using near-infrared images of the finger 10 captured at different irradiation light amounts, and an image of the luminance saturation region extracted from the simulated image, or the like, is used as the registered data.
A simulated image generated from near-infrared images of the finger 10 captured at different irradiation light amounts will be described with reference to Fig. 9. Fig. 9 shows near-infrared images of the finger 10 captured under left-side illumination at three irradiation light amounts (small, medium, and large), together with a simulated image generated from these near-infrared images. The simulated image of Fig. 9 imitates a near-infrared image captured under left-side illumination of the finger 10 presented with its tip side on the right and its base side on the left, as shown in Fig. 8.
The simulated image of Fig. 9 is generated by combining the tip portion of the near-infrared image captured at the small irradiation light amount, the central portion of the near-infrared image captured at the medium light amount, and the base portion of the near-infrared image captured at the large light amount. In addition, an approximate curve that makes the boundary between the luminance saturation region and the non-luminance-saturated region smoothly continuous may be fitted at the connection portions between the respective parts. For the approximate curve, a polynomial curve or a spline curve is used, for example, and its coefficients are adjusted so that the boundary value and its derivative are continuous at the connection portions.
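The assembly of the simulated image can be illustrated roughly as below; the equal thirds used for the split, the column-axis/tip-at-the-right orientation, and the global polynomial fit standing in for the constrained polynomial or spline curve are all assumptions.

```python
import numpy as np

def simulate_tilted_finger(img_small, img_medium, img_large):
    """Combine the base portion of the large-light-amount image, the central
    portion of the medium one, and the tip portion of the small one
    (finger axis along the columns, tip at the right as in Fig. 8)."""
    h, w = img_small.shape
    b, a = w // 3, w * 2 // 3
    return np.hstack([img_large[:, :b],      # base portion
                      img_medium[:, b:a],    # central portion
                      img_small[:, a:]])     # tip portion

def smooth_boundary(boundary_y, degree=4):
    """Fit an approximate curve to the stitched saturation boundary so that its
    value and derivative vary smoothly across the connection portions."""
    boundary_y = np.asarray(boundary_y, dtype=float)
    x = np.arange(len(boundary_y))
    coeffs = np.polyfit(x, boundary_y, degree)
    return np.polyval(coeffs, x)
```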
Alternatively, a near-infrared image for registration may be captured in advance with the finger 10 presented obliquely. In this case, several tilt angles of the finger 10 may be set, and registered data may be registered for each angle. Furthermore, registered data for an arbitrary angle can also be generated by interpolation or the like using registered data for different angles.
As described above, according to the present embodiment, even when the finger 10 is presented obliquely with respect to the presentation unit 100, the authentication process using the luminance saturation region extracted from the near-infrared image of the finger 10 can be performed. Further, as in example 1, it is possible to provide a biometric authentication device capable of maintaining authentication accuracy even when the presentation unit on which a biometric object such as a finger is presented is miniaturized.
[ example 3]
In example 2, the case where a simulated image generated from near-infrared images of the finger 10 captured at different irradiation light amounts is used when the finger 10 is presented obliquely to the presentation unit 100 was described. In the present embodiment, a case will be described in which visible light is irradiated as a guide so that the finger 10 is not presented obliquely with respect to the presentation unit 100. Only the points that differ from example 1 are described; the description of the configuration and the flow of processing that are the same as in example 1 is omitted.
The presentation unit 100 of the present embodiment will be described with reference to Fig. 10. Fig. 10 is a perspective view of the presentation unit 100 of the present embodiment. The presentation unit 100 of the present embodiment includes, as in example 1, a finger placement stage 101, a fingertip support stage 102, a diffusion plate 103, a light-transmitting portion 104, and a plurality of near-infrared light sources 105, and further includes a plurality of visible light sources 1000.
The plurality of visible light sources 1000 are arranged beside the finger 10 along its longitudinal direction, and irradiate visible light onto the side of the finger 10 through the diffusion plate 103. The visible light emitted from the visible light sources 1000 is preferably light of a color that is clearly distinguishable from the near-infrared light and sufficiently recognizable, for example green light. The diffusion plate 103 diffuses the visible light emitted from the visible light sources 1000, together with the near-infrared light emitted from the near-infrared light sources 105, in the longitudinal direction of the finger 10.
The visible light irradiated onto the finger 10 presented to the presentation unit 100 of the present embodiment will be described with reference to Fig. 11A and 11B. Fig. 11A shows the finger 10 parallel to the presentation unit 100, and Fig. 11B shows the finger 10 inclined with respect to the presentation unit 100. Fig. 11A and 11B show a left irradiation region 1101, which is the region colored by the visible light irradiated onto the left side of the finger 10, and a right irradiation region 1102, which is the region colored by the visible light irradiated onto the right side of the finger 10.
When the finger 10 is presented parallel to the presentation unit 100, the widths of the left irradiation region 1101 and the right irradiation region 1102 are substantially the same along the longitudinal direction of the finger 10, whereas when the finger 10 is presented obliquely, the widths of the two regions differ between the tip portion and the base portion of the finger 10. Accordingly, when the user of the biometric authentication device of the present embodiment moves the finger 10 so that the widths of the left irradiation region 1101 and the right irradiation region 1102 become substantially the same, the finger 10 becomes parallel to the presentation unit 100.
As described above, according to the present embodiment, since the finger 10 is guided to be presented parallel to the presentation unit 100, the authentication accuracy of the authentication process performed using the luminance saturation region extracted from the near-infrared image of the finger 10 is improved.
[ example 4]
In example 3, the case where visible light is irradiated onto the finger 10 as a guide for presenting the finger 10 parallel to the presentation unit 100 was described. In the present embodiment, a case will be described in which a visible light image, captured by detecting the visible light irradiated onto and reflected by the finger 10, is also used. Only the points that differ from example 1 are described; the description of the configuration and the flow of processing that are the same as in example 1 is omitted.
The biometric authentication device according to the present embodiment will be described with reference to Fig. 12. The biometric authentication device of the present embodiment includes, as in example 1, a presentation unit 100 and a computer 120 provided on a housing 110. The presentation unit 100 of the present embodiment includes, as in example 3, a finger placement stage 101, a fingertip support stage 102, a diffusion plate 103, a light-transmitting portion 104, a plurality of near-infrared light sources 105, and a plurality of visible light sources 1000. For the light-transmitting portion 104 of the present embodiment, it is preferable to use a filter that transmits both the near-infrared light emitted from the near-infrared light sources 105 and the visible light emitted from the visible light sources 1000.
The housing 110 of the present embodiment is provided, as in example 1, with a near-infrared imaging unit 111 and a near-infrared light source control unit 112, and is further provided with a visible light imaging unit 1201 and a visible light source control unit 1202. The visible light imaging unit 1201 and the visible light source control unit 1202 are connected to the interfaces 123 of the computer 120.
The visible light imaging unit 1201 detects the visible light transmitted through the light-transmitting portion 104 and captures a visible light image of the finger 10. That is, the visible light emitted from the visible light sources 1000 and reflected by the finger 10 is captured as a visible light image of the finger 10. The visible light image of the finger 10 is sent to the computer 120 together with the near-infrared image of the finger 10 for authentication processing. The visible light source control unit 1202 controls the plurality of visible light sources 1000 based on instructions transmitted from the computer 120, for example, to selectively turn the visible light sources 1000 on or off.
An example of the flow of the processing of the present embodiment will be described with reference to fig. 13. Note that the flow of the following processing may be executed by the CPU121, or may be executed by an external server. In fig. 13, S1301 to S1303 are added instead of S303 in fig. 3, and the other steps are the same as those in fig. 3, so S1301 to S1303 will be described.
(S1301)
The CPU121 instructs the visible light imaging unit 1201 to acquire visible light images. Specifically, visible light is irradiated from the visible light sources 1000 on one of the left and right sides of the finger 10 to capture a visible light image, and then visible light is irradiated from the other side to capture another visible light image.
(S1302)
The CPU121 extracts the outline of the finger 10 based on the near-infrared images of the finger 10 acquired in S301 and the visible light images of the finger 10 acquired in S1301. Specifically, the contour of the finger 10 is extracted based on the background region, which is the region whose brightness does not change in any of the four images: the near-infrared image under left-side illumination, the near-infrared image under right-side illumination, the visible light image under left-side illumination, and the visible light image under right-side illumination. That is, the contour of the finger 10 is extracted by using the property that the background region, i.e., the region other than the finger 10, keeps the same brightness regardless of whether near-infrared or visible light is irradiated from the left or the right. The outline of the finger 10 extracted based on both the near-infrared images and the visible light images is obtained with higher accuracy.
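For illustration, S1302 might be sketched as follows: the background is taken as the set of pixels whose brightness stays nearly constant across the four images, and the contour is the edge of everything else. The variation threshold and the 4-neighbour edge test are assumptions, not the patent's method.

```python
import numpy as np

def finger_mask_from_four_images(nir_left, nir_right, vis_left, vis_right,
                                 variation_threshold=10):
    """Pixels whose brightness barely changes across all four images form the
    background; the remaining pixels form the finger region."""
    stack = np.stack([nir_left, nir_right, vis_left, vis_right]).astype(np.int16)
    variation = stack.max(axis=0) - stack.min(axis=0)
    background = variation <= variation_threshold
    return ~background

def contour_from_mask(finger_mask):
    """Contour = finger pixels that have at least one non-finger 4-neighbour."""
    padded = np.pad(finger_mask, 1, constant_values=False)
    all_neighbours = (padded[:-2, 1:-1] & padded[2:, 1:-1] &
                      padded[1:-1, :-2] & padded[1:-1, 2:])
    return finger_mask & ~all_neighbours
```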
(S1303)
The CPU121 extracts a luminance saturation region from the near-infrared image of the finger 10 based on the outline of the finger 10 extracted in S1302. Since the outline of the finger 10 extracted in S1302 is obtained with high accuracy, the luminance saturated region extracted in this step is also extracted with higher accuracy.
According to the flow of the processing described above, since the luminance saturation region is extracted from the near infrared image of the finger 10 with higher accuracy, the authentication accuracy of the authentication processing using the luminance saturation region can be further improved.
[ example 5]
In embodiment 1, the authentication process using the luminance saturation region extracted from the near-infrared ray image of the finger 10 presented to the presentation unit 100 is described. In this embodiment, an authentication process using a blood vessel pattern included in a non-luminance saturated region together with a luminance saturated region will be described. Since the configuration of the biometric authentication device according to the present embodiment is the same as that of embodiment 1, the description thereof will be omitted, and the point of difference in the flow of the authentication process will be described.
An example of the flow of the processing of the present embodiment will be described with reference to fig. 14. Note that the flow of the following processing may be executed by the CPU121, or may be executed by an external server. In fig. 14, steps S1401 to S1404 are added between steps S305 and S306 in fig. 3, and the other steps are the same as those in fig. 3, and therefore steps S1401 to S1404 are described.
(S1401)
The CPU121 calculates the amount of positional deviation of the generated data from the registered data. In S304, when the generated data, i.e., the image of the luminance saturation region, is compared with the registered data, the CPU121 calculates the similarity between the two images while shifting the position of one of them, and searches for the position with the highest similarity. Since the degree of inconsistency is calculated at the position with the highest similarity, the amount by which the image was shifted to reach that position is taken as the amount of positional deviation.
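A sketch of the positional-deviation search in S1401 is given below, assuming the registered and generated data are binary masks of the luminance saturation region; the search range and the overlap-based similarity are illustrative choices, not the patent's definition.

```python
import numpy as np

def positional_deviation(registered_mask, generated_mask, max_shift=10):
    """Shift one mask over a small range and keep the shift with the highest
    similarity; that shift is the amount of positional deviation."""
    best_shift, best_score = (0, 0), -1.0
    for dy in range(-max_shift, max_shift + 1):
        for dx in range(-max_shift, max_shift + 1):
            shifted = np.roll(np.roll(generated_mask, dy, axis=0), dx, axis=1)
            score = (registered_mask & shifted).sum() / max(registered_mask.sum(), 1)
            if score > best_score:
                best_shift, best_score = (dy, dx), score
    return best_shift, best_score
```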
(S1402)
The CPU121 extracts a blood vessel pattern included in the non-luminance saturated region. In the extraction of the blood vessel pattern, threshold processing of, for example, a luminance value is used.
Further, the non-luminance-saturated-region image from which the blood vessel pattern is extracted may be a combined image in which the left region of the near-infrared image under right-side illumination and the right region of the near-infrared image under left-side illumination are combined. By combining the regions on the side opposite to the illuminated side, the blood vessel pattern becomes clearer. When the two regions are combined, it is preferable to adjust the luminance values of at least one of the regions so that the luminance is continuous at the seam.
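The half-and-half combination described in the preceding paragraph could look like the sketch below; the seam position at the image centre and the single gain factor used to keep the luminance continuous at the seam are assumptions.

```python
import numpy as np

def combine_opposite_halves(right_illuminated, left_illuminated):
    """Left half from the right-illuminated image, right half from the
    left-illuminated image, with a gain so luminance is continuous at the seam."""
    h, w = right_illuminated.shape
    seam = w // 2
    left_half = right_illuminated[:, :seam].astype(np.float32)
    right_half = left_illuminated[:, seam:].astype(np.float32)
    # match the mean brightness of the columns adjacent to the seam
    gain = (left_half[:, -1].mean() + 1e-6) / (right_half[:, 0].mean() + 1e-6)
    right_half *= gain
    return np.clip(np.hstack([left_half, right_half]), 0, 255).astype(np.uint8)
```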
Before the blood vessel pattern is extracted, the near-infrared image of the finger 10 may be normalized in the same manner as when the luminance saturation region is extracted in S303.
(S1403)
The CPU121 collates the generated pattern newly generated from the blood vessel pattern extracted in S1402 with the registered pattern that is a blood vessel pattern registered in advance. The registration pattern is a blood vessel pattern extracted from a near infrared ray image of the finger 10 photographed in advance.
In addition, the amount of positional deviation calculated in S1401 may be used when the generated pattern is collated with the registered pattern. As in S1401, the similarity between the two patterns is calculated while shifting the position of one of them, and the position with the highest similarity is searched for. By starting this search from the position obtained by shifting one pattern by the amount of positional deviation calculated in S1401, the search range is narrowed, so the time required to collate the generated pattern with the registered pattern can be shortened.
(S1404)
The CPU121 determines whether the degree of inconsistency between the registered pattern and the generated pattern is lower than a preset threshold. If the degree of inconsistency is lower than the threshold value, the process proceeds to S306, and if not, the process proceeds to S307.
According to the flow of the processing described above, since the authentication processing using the luminance saturation region and the blood vessel pattern extracted from the near-infrared image of the finger 10 is executed, the authentication accuracy can be further improved. Further, by using the amount of positional deviation, the blood vessel pattern can be collated in a shorter time.
[ example 6]
In example 1, the authentication process using the luminance saturation region extracted from the near-infrared image of the finger 10 presented apart from the diffusion plate 103 was described. When the user's finger 10 is presented close to the diffusion plate 103, extraction of the luminance saturation region may be difficult. In the present embodiment, the case where the finger 10 is presented close to the diffusion plate 103 will be described. Only the points that differ from example 1 are described; the description of the configuration and the flow of processing that are the same as in example 1 is omitted.
The presentation unit 100 of the present embodiment will be described with reference to Fig. 15. The presentation unit 100 of the present embodiment includes, as in example 1, a finger placement stage 101, a fingertip support stage 102, a diffusion plate 103, a light-transmitting portion 104, and a plurality of near-infrared light sources 105, and further includes a light shielding portion 1500. The light shielding portion 1500 is a member that blocks near-infrared light; it is provided between the near-infrared light sources 105 and divides the diffusion plate 103.
Near-infrared images of the finger 10 presented to the presentation unit 100 of the present embodiment will be described with reference to Fig. 16A and 16B. Fig. 16A shows a near-infrared image of the finger 10 presented close to the diffusion plate 103 without the light shielding portion 1500, and Fig. 16B shows the case with the light shielding portion 1500.
When the light shielding portion 1500 is not provided, the near-infrared light emitted directly from the near-infrared light sources 105 is captured in the near-infrared image adjacent to the luminance saturation region, so the outline of the finger 10 becomes unclear and extraction of the luminance saturation region becomes difficult. In contrast, when the light shielding portion 1500 is provided, the outline of the finger 10 becomes clear at the portions covered by the light shielding portion 1500, so the luminance saturation region can be extracted.
As described above, according to the present embodiment, even when the finger 10 is presented close to the diffusion plate 103, the luminance saturation region can be extracted from the near-infrared image of the finger 10, so the authentication process using the luminance saturation region can be executed. In particular, when the presentation unit on which a living body such as a finger is presented is miniaturized, the finger 10 tends to be presented close to the diffusion plate 103, so this configuration is especially effective.
The six embodiments have been described above, but the biometric authentication device of the present invention is not limited to the above embodiments, and the constituent elements may be modified and embodied within a range not departing from the gist of the invention. Further, a plurality of constituent elements disclosed in the above embodiments may be appropriately combined. Further, several components may be deleted from all the components shown in the above embodiments.

Claims (10)

1. A biometric authentication device comprising a presenting unit for presenting a biometric subject,
the biometric authentication device further includes:
a light source for irradiating the living body with an electromagnetic wave;
an imaging unit that captures an image based on the electromagnetic wave scattered by the living body;
an extraction unit that extracts a luminance saturation region in the image, the luminance saturation region being a region in which luminance exceeds an upper limit value of the imaging unit; and
an authentication unit that authenticates an individual by comparing, with respect to the luminance saturation region, registered data registered in advance with newly generated data.
2. The biometric authentication device according to claim 1,
the living body is a finger, the electromagnetic wave is near infrared rays, and the image is a near infrared ray image of the finger.
3. The biometric authentication device according to claim 2,
further comprising a light source control unit that controls the irradiation light amount of the light source, wherein
the extraction unit extracts a luminance saturation region from each near-infrared image captured at a different irradiation light amount, and
the authentication unit compares the registered data with the generated data for each luminance saturation region of each different irradiation light amount.
4. The biometric authentication device according to claim 3,
the authentication unit generates a simulated image simulating a near-infrared image of a finger obliquely presented to the presentation unit using a luminance saturation region for each different amount of illumination light, and performs authentication using the simulated image.
5. The biometric authentication device according to claim 3,
the authentication unit determines whether a real finger or a fake finger is presented to the presentation unit based on the generated data of the luminance saturation region for each different irradiation light amount.
6. The biometric authentication device according to claim 2,
further comprising a plurality of visible light sources that are arranged along a longitudinal direction of the finger and irradiate visible light from a side of the finger.
7. The biometric authentication device according to claim 6,
further comprising a visible light imaging unit that captures a visible light image of the finger when visible light is irradiated from the visible light sources, wherein
the extraction unit extracts the luminance saturation region based on the outline of the finger extracted using the visible light image and the near-infrared image.
8. The biometric authentication device according to claim 2,
the extraction unit further extracts a blood vessel pattern from the near-infrared image;
the authentication unit also authenticates an individual by comparing a registered pattern registered in advance with a newly generated pattern with respect to the blood vessel pattern.
9. The biometric authentication device according to claim 8,
the authentication unit calculates a positional deviation amount of the generated data with respect to the registered data, and compares the registered pattern with the generated pattern based on the positional deviation amount.
10. The biometric authentication device according to claim 2,
the light source includes: a plurality of point light sources arranged along a longitudinal direction of the finger and configured to irradiate the near infrared ray from a side of the finger; and a light shielding portion provided between the point light sources and shielding the near infrared rays.
CN202011237076.8A 2019-12-04 2020-11-09 Biometric authentication device Pending CN112906448A (en)

Applications Claiming Priority (2)

Application Number Priority Date Filing Date Title
JP2019-219256 2019-12-04
JP2019219256A JP7349891B2 (en) 2019-12-04 2019-12-04 biometric authentication device

Publications (1)

Publication Number Publication Date
CN112906448A 2021-06-04

Family

ID=76111163

Family Applications (1)

Application Number Title Priority Date Filing Date
CN202011237076.8A Pending CN112906448A (en) 2019-12-04 2020-11-09 Biometric authentication device

Country Status (2)

Country Link
JP (1) JP7349891B2 (en)
CN (1) CN112906448A (en)


Patent Citations (9)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN101513348A (en) * 2005-06-13 2009-08-26 株式会社日立制作所 Vein authentication device
CN101661555A (en) * 2008-08-25 2010-03-03 索尼株式会社 Vein imaging apparatus, vein imaging method and vein authentication apparatus
CN101957914A (en) * 2009-07-20 2011-01-26 深圳大学 Finger vein image acquisition device
CN107004114A (en) * 2014-11-28 2017-08-01 株式会社日立制作所 Blood-vessel image filming apparatus and individual authentication system
JP2018205785A (en) * 2015-10-29 2018-12-27 バイオニクス株式会社 Personal authentication device
JP2018081469A (en) * 2016-11-16 2018-05-24 株式会社 日立産業制御ソリューションズ Blood vessel image pickup apparatus and personal authentication system
JP2019096168A (en) * 2017-11-27 2019-06-20 株式会社日立製作所 Biometric authentication device and biometric authentication system
JP2020071144A (en) * 2018-10-31 2020-05-07 日本放送協会 Mtf measuring device and program therefor
JP2020191553A (en) * 2019-05-22 2020-11-26 シャープ株式会社 Image forming apparatus and image forming method

Non-Patent Citations (2)

* Cited by examiner, † Cited by third party
Title
CHRISTOF KAUBA et al.: "Combined Fully Contactless Finger and Hand Vein Capturing Device with a Corresponding Dataset", Sensors, 17 November 2019 (2019-11-17), pages 1 - 25 *
YANG Lu: "Research on Finger Vein Recognition Methods", China Doctoral Dissertations Full-text Database, Information Science and Technology, no. 2016, 15 November 2016 (2016-11-15), pages 138 - 12 *

Also Published As

Publication number Publication date
JP2021089550A (en) 2021-06-10
JP7349891B2 (en) 2023-09-25

Similar Documents

Publication Publication Date Title
TWI381319B (en) Finger vein authentication device
CN101866419B (en) Personal authentication device
US7957563B2 (en) Personal identification apparatus and method using living body
CN103177242B (en) Finger vein authentication device
KR101551369B1 (en) Vein authentication device, vein authentication imaging device, and vein illuminating method
JP6607755B2 (en) Biological imaging apparatus and biological imaging method
JP4770936B2 (en) Vein authentication device and template registration method
JP5055210B2 (en) Finger vein authentication device
JP2017091186A (en) Authentication apparatus using biological information and authentication method
EP2172911A2 (en) Authentication apparatus
KR102538405B1 (en) Biometric authentication system, biometric authentication method and program
JP2004265269A (en) Personal identification device
US9355297B2 (en) Biometric information input apparatus and biometric information input method
CN104063679B (en) Blood-vessel image filming apparatus
KR20170116530A (en) Apparatus and Method for Recognizing Fake Fingerprint Using Wave Length of Light
KR101468381B1 (en) Method and apparatus for detecting fingerprint image
CN112906448A (en) Biometric authentication device
JP6759065B2 (en) Blood vessel imaging device and personal authentication system
JP4604117B2 (en) Personal authentication apparatus and method
JP2010277611A (en) Personal authentication device and method
JP5292500B2 (en) Finger vein authentication device
CN114283459A (en) Blood vessel image capturing device and person authentication system
JP7184538B2 (en) Blood vessel imaging device and personal authentication system
JP2008176690A (en) Biometrics device
JP6772710B2 (en) Biological imaging device

Legal Events

Date Code Title Description
PB01 Publication
SE01 Entry into force of request for substantive examination