WO2006051976A1 - Biometric Feature Input Device - Google Patents
Biometric Feature Input Device
- Publication number
- WO2006051976A1 (PCT/JP2005/020905)
- Authority
- WO
- WIPO (PCT)
- Prior art keywords
- finger
- image
- image sensor
- input device
- dimensional
- Prior art date
Classifications
-
- A—HUMAN NECESSITIES
- A61—MEDICAL OR VETERINARY SCIENCE; HYGIENE
- A61B—DIAGNOSIS; SURGERY; IDENTIFICATION
- A61B5/00—Measuring for diagnostic purposes; Identification of persons
- A61B5/117—Identification of persons
- A61B5/1171—Identification of persons based on the shapes or appearances of their bodies or parts thereof
- A61B5/1172—Identification of persons based on the shapes or appearances of their bodies or parts thereof using fingerprinting
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06V—IMAGE OR VIDEO RECOGNITION OR UNDERSTANDING
- G06V40/00—Recognition of biometric, human-related or animal-related patterns in image or video data
- G06V40/10—Human or animal bodies, e.g. vehicle occupants or pedestrians; Body parts, e.g. hands
- G06V40/12—Fingerprints or palmprints
- G06V40/1335—Combining adjacent partial images (e.g. slices) to create a composite input or reference pattern; Tracking a sweeping finger movement
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06V—IMAGE OR VIDEO RECOGNITION OR UNDERSTANDING
- G06V40/00—Recognition of biometric, human-related or animal-related patterns in image or video data
- G06V40/10—Human or animal bodies, e.g. vehicle occupants or pedestrians; Body parts, e.g. hands
- G06V40/12—Fingerprints or palmprints
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06V—IMAGE OR VIDEO RECOGNITION OR UNDERSTANDING
- G06V40/00—Recognition of biometric, human-related or animal-related patterns in image or video data
- G06V40/10—Human or animal bodies, e.g. vehicle occupants or pedestrians; Body parts, e.g. hands
- G06V40/12—Fingerprints or palmprints
- G06V40/1341—Sensing with light passing through the finger
Definitions
- the present invention relates to an apparatus for inputting biometric features for authenticating an individual.
- a biometric feature input device of this type for authenticating an individual is typically a device that reads a fingerprint, the pattern on the skin of a fingertip. Fingerprints have been known since ancient times to be unique to each person and to remain unchanged throughout life; they have been studied especially in police and judicial fields and are used for high-accuracy personal identification.
- a method using the total-reflection critical angle of a fiber-optic plate or a prism is widely used in fingerprint input devices.
- a conventional fingerprint input device using, for example, the total reflection critical angle of a prism will be described with reference to FIG.
- the finger skin 104 is drawn with its skin pattern enlarged.
- the lens 106 and the two-dimensional image sensor 107 are disposed orthogonal to the prism surface 109 of the prism 105.
- the light 101 from the portions of the finger not in contact with the prism is strongly refracted on entering prism surface 108, passing from air with a refractive index of 1.0 into glass with a refractive index of 1.4 or more, and is totally reflected at prism surface 109.
- it therefore reaches neither the lens 106 nor the two-dimensional image sensor 107.
- because the refractive index of the skin, or of oil or water on its surface, is close to that of the prism glass, the light 102 from the portions in contact with the prism is refracted only slightly at prism surface 108.
- it strikes prism surface 109 below the total-reflection angle, passes through, is focused by the lens 106, and reaches the two-dimensional image sensor 107.
- a fingerprint image can thus be obtained based on whether or not the uneven skin pattern of the finger, such as the fingerprint ridges, contacts the prism.
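The contact/no-contact distinction above comes down to Snell's law: total reflection at prism surface 109 occurs only beyond the critical angle, which depends on what medium backs the entry surface. A minimal numeric sketch (the index values below are illustrative assumptions, not values from the specification):

```python
import math

def critical_angle_deg(n_glass, n_outside):
    """Total internal reflection occurs for incidence angles beyond
    arcsin(n_outside / n_glass) at the boundary of the denser glass."""
    return math.degrees(math.asin(n_outside / n_glass))

# Against air (non-contacting parts of the finger) the critical angle
# is small, so rays strongly refracted at surface 108 are totally
# reflected at surface 109:
air_crit = critical_angle_deg(1.5, 1.0)    # roughly 42 degrees

# Where skin, oil, or water (index close to the glass) touches, the
# critical angle rises, and the same rays pass through to the lens:
skin_crit = critical_angle_deg(1.5, 1.33)  # roughly 62 degrees
```

The contrast of the fingerprint image is thus a direct consequence of the roughly 20-degree shift in critical angle between wetted-contact and air-backed regions of the prism face.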
- this conventional fingerprint input device uses expensive and large optical components, which hinders the miniaturization and low cost of the device.
- a quasi-one-dimensional sensor using pressure, temperature, or capacitance is employed, and partial images of the fingerprint obtained by moving the finger are stitched together to reconstruct a fingerprint image.
- Japanese Patent Application Laid-Open Nos. 10-91769 and 2001-155137 propose techniques of this kind. Techniques that move an object past a one-dimensional sensor and reconstruct an image are known from facsimiles and copiers, but a special mechanism is needed to obtain the speed in the direction of finger movement. To omit such a mechanism, the technique proposed in Japanese Patent Laid-Open No. 10-91769 reconstructs the image based on the similarity of quasi-one-dimensional images of several lines.
- An example of fingerprint image reconstruction by this scheme will be described with reference to FIGS. 2A and 2B.
- partial images I1 to In are obtained by moving the finger. Overlapping similar portions are removed from these partial images to obtain a reconstructed fingerprint image 302.
- as shown in FIGS. 3A and 3B, when the finger is moved slowly relative to the imaging speed of the sensor, the overlap between adjacent partial images is large, and judging their similarity becomes difficult.
- the resulting fingerprint image 303 is longitudinally stretched and distorted. Conversely, when the finger is slid faster than the imaging speed, as shown in FIGS. 4A and 4B, adjacent partial images fail to overlap and the reconstructed image is shortened and distorted.
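The similarity-based stitching described above can be sketched as follows. This is a minimal illustration with hypothetical helper names, assuming least-mean-square-error row matching rather than whatever similarity measure JP 10-91769 actually specifies:

```python
import numpy as np

def best_overlap(prev, nxt, max_overlap):
    """Return the number of leading rows of `nxt` that best match the
    trailing rows of `prev` (least mean squared error)."""
    best_k, best_err = 0, np.inf
    for k in range(1, max_overlap + 1):
        err = np.mean((prev[-k:].astype(float) - nxt[:k].astype(float)) ** 2)
        if err < best_err:
            best_err, best_k = err, k
    return best_k

def stitch(partials):
    """Join quasi-one-dimensional partial images, dropping from each new
    partial the rows that overlap the image reconstructed so far."""
    image = partials[0].copy()
    for part in partials[1:]:
        k = best_overlap(image, part, max_overlap=part.shape[0] - 1)
        image = np.vstack([image, part[k:]])
    return image
```

Note how the failure modes in the text fall out of this sketch: with a very slow finger the overlaps are near-total and many candidate `k` values give similar errors, while with a too-fast finger no `k` gives a good match at all.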
- this conventional example has the problem that fingerprint authentication, that is, authentication by biometric features, is difficult when the skin is partially peeled off due to dermatitis or the like.
- a contactless fingerprint detection device has been proposed, for example, in Japanese Laid-Open Patent Publication No. 2003-85538.
- with this non-contact method, even for a finger with peeled portions that is difficult to read by the contact-based methods above, an image is obtained as long as the part of the internal skin structure that gives rise to the skin pattern is preserved. Also, because the method is non-contact, it is not susceptible to changes in skin surface conditions such as wetness and dryness.
- light incident on the finger is scattered inside it, and the light emitted from the finger reflects the internal structure of the skin:
- the concave parts of the fingerprint become bright areas,
- the convex parts become dark areas,
- and a gray-scale image of the same shape as the fingerprint is obtained.
- the structure of the dermis, which is the basis of epidermal patterns such as fingerprints, is preserved even if the stratum corneum has come off due to dermatitis or the like, and regardless of whether the epidermis is wet or dry,
- a fingerprint image can be obtained.
- an intended image cannot be obtained unless a space is provided between the finger and the imaging system.
- because the finger and the imaging system are separated, even though the amount of light emitted from the skin surface varies with the internal structure of the finger, the light scatters at the skin surface and diffuses over the distance to the imaging system, which is considered an adverse effect; in cases where the skin had actually peeled, a fingerprint image with good contrast could not be obtained at that portion.
- a fingerprint input device in which a two-dimensional image sensor is placed in proximity to the finger and the radiation scattered out of the finger is imaged by the two-dimensional image sensor through a transparent protective cover made of glass or the like, acquiring a fingerprint image in which the concave portions of the fingerprint are dark areas and the convex portions are bright areas, was proposed by the inventor of the present application in Japanese Patent No. 3150126.
- near-infrared light is irradiated from above the finger, and the light that has passed through the inside of the finger and is emitted from its underside is imaged.
- a blood vessel image is obtained because the near-infrared light is absorbed by the blood abundant in the vessels, which therefore appear dark.
- since a fingerprint pattern can be read at the same time, the blood vessel image can complement the fingerprint information, and it is also a powerful source of information as to whether the object is a living body.
- however, the effective information content of a blood vessel pattern is generally lower than that of a fingerprint, and the pattern changes with nutritional status and with disorders such as blood clots and blood pressure.
- its accuracy is unconfirmed and remains a topic for future research, in contrast to fingerprints, whose uniqueness and lifelong invariance have been established through research mainly in police and judicial fields.
- in the non-contact fingerprint detection device proposed in JP 2003-85538, a space is required between the finger and the imaging optical system, focusing is needed, and a fixed frame for the finger is also required, all of which hinder the operability and downsizing of the device.
- only capillaries are present in the fingerprint portion of the fingertip, and their pattern cannot be read by the above method. Since the readable veins lie at the base of the finger below the first joint, that part must be read simultaneously with the fingerprint portion above the first joint through a reduction optical system, so the device becomes large; this is a further problem.
- a fingerprint detection method is disclosed in Japanese Patent Laid-Open No. 5-168610.
- this conventional fingerprint detection method is an optical method in which a light source irradiates a specimen including a latent fingerprint with light, and the resulting fingerprint image is arithmetically processed to detect the fingerprint.
- the surface temperature of the specimen is measured in advance and stored as thermal image information; then light of a wavelength at which absorption characteristics change with the amount of water or organic substances contained in the fingerprint components is projected for a fixed time and then cut off.
- the temperature of the specimen surface at that moment is measured and taken as thermal image information.
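As a sketch of the differencing step that follows the two temperature measurements described above (the function name and the threshold value are illustrative assumptions, not from the patent):

```python
import numpy as np

def thermal_fingerprint(before, after, threshold=0.1):
    """Subtract the baseline thermal image from the one taken after the
    timed light projection. Pixels that warmed by more than `threshold`
    are taken to hold absorbing fingerprint residue (water and organic
    substances heat up under the chosen wavelength; clean substrate
    does not)."""
    rise = np.asarray(after, dtype=float) - np.asarray(before, dtype=float)
    return rise > threshold
```

The returned boolean mask is the detected latent-fingerprint region.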
- a fingerprint information processing apparatus is disclosed in Japanese Patent Application Laid-Open No. 10-143663.
- This conventional fingerprint information processing apparatus has a fingerprint image detection unit for partially optically detecting the fingerprint of a subject.
- the relative position detection unit detects relative positions of a plurality of partial fingerprint images detected by the fingerprint image detection unit.
- the image combining unit forms a combined fingerprint image by correcting and combining positional deviations among the plurality of partial fingerprint images based on the relative position information detected by the relative position detection unit.
- the storage unit registers data of the synthesized fingerprint image as a registered fingerprint image for personal identification information.
- the photosensor area of the optical image sensor 2 has an effective image area that converts scattered light from the inside of the finger into an image signal, and a black reference area that does not react to light.
- the black reference area is formed by providing an optical light-shielding film on the silicon dioxide that covers the photosensor area, which is connected by a thermally conductive film to the silicon substrate forming the base of the optical image sensor.
- the black reference area reading unit reads the dark current of the photodiode of the optical image sensor before and after the finger is placed on the optical image sensor, and the dark current comparison unit compares both current signals.
- the fingerprint collating unit takes in the image signal of the effective image area and collates it against the fingerprint database.
- the fingerprint determination unit determines that the finger is genuine only when the comparison of the dark-current signals shows a difference greater than or equal to a predetermined value and the comparison against the fingerprint database shows that the features match.
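The two-condition decision described above (a sufficient dark-current rise indicating a live, warm finger, plus a template match) might be sketched as follows; all names, the unit of the readings, and the threshold are hypothetical:

```python
import numpy as np

def finger_is_live(dark_before, dark_after, min_rise=5.0):
    """A finger resting on the sensor warms the silicon through the
    thermally conductive path, raising photodiode dark current; require
    at least `min_rise` (arbitrary units) between the black-reference
    readings taken before and after placement."""
    return bool(np.mean(dark_after) - np.mean(dark_before) >= min_rise)

def authenticate(dark_before, dark_after, features, database, min_rise=5.0):
    """Genuine only when the dark-current rise indicates a live finger
    AND the extracted features match a registered template."""
    return finger_is_live(dark_before, dark_after, min_rise) and (features in database)
```

Requiring both conditions is what lets the device reject, for example, a photograph of a fingerprint that matches the database but produces no thermal response.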
- An object of the present invention is to provide a compact and inexpensive biometric feature input device that can stably input biometric features such as fingerprints using a one-dimensional or quasi-one-dimensional image sensor.
- Another object of the present invention is to provide a small and inexpensive biometric feature input device capable of inputting a blood vessel image of a finger simultaneously with a fingerprint on the finger surface.
- Still another object of the present invention is to provide an electronic device equipped with a finger travel guide for stably inputting a biometric feature such as a fingerprint using a one-dimensional or quasi-one-dimensional image sensor.
- the biometric feature input device includes: a one-dimensional or quasi-one-dimensional image sensor; a finger travel guide that keeps the finger and the effective pixel portion of the image sensor at a substantially constant distance, without contact, during the relative movement in which the finger and the image sensor slide against each other; and image processing means that joins the one-dimensional or quasi-one-dimensional partial images captured by the image sensor during the relative movement, each formed by radiation scattered inside the finger and emitted from the surface of its skin.
- another biometric feature input device includes: a one-dimensional or quasi-one-dimensional image sensor; a finger travel guide that keeps the finger and the effective pixel portion of the image sensor at a substantially constant distance, without contact, during the relative movement in which they slide against each other; an upper light source that emits light for blood vessel imaging onto the back of the finger; and image processing means. During the relative movement, the image sensor alternately captures a first image, formed by radiation scattered inside the finger and emitted from its skin surface, and a second image, formed by light from the upper light source that has passed through the finger and is emitted from its skin surface. The image processing means joins the one-dimensional or quasi-one-dimensional partial images separately for the first images and for the second images, and extracts a blood vessel image as the difference between the two reconstructed images.
- the finger travel guide may have a gap immediately above the effective pixel portion of the image sensor. Preferably, the height of the gap is 10 µm or more and 200 µm or less, and its width parallel to the direction of relative movement is at least the effective pixel length of the image sensor in the sub-scanning direction and at most 2.0 mm.
- a solid having light transparency may be inserted into the gap.
- the finger travel guide immediately above the effective pixel portion of the image sensor may be made of a solid having light transparency.
- the height of the solid is preferably 10 µm or more and 200 µm or less.
- preferably, the refractive index of the solid is larger than 1.1; more preferably, it is greater than 1.1 and less than 1.4.
- alternatively, the refractive index of the solid is preferably larger than 2.0; more preferably, it is greater than 2.0 and less than 5.0.
- the biometric feature input device may further include a lower light source that generates scattered light inside the finger by irradiating light onto the pad of the finger from the vicinity of the region read by the image sensor.
- the biometric feature input device may further include a band-pass filter that extracts the image component at the fingerprint pitch from the output image signal of the image sensor, and an automatic gain control circuit that amplifies the output of the band-pass filter.
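A software sketch of the band-pass-plus-AGC stage described above (in the device this would be done by circuitry; the band limits and function names here are illustrative assumptions):

```python
import numpy as np

def bandpass(line, pitch_px, rel_band=0.5):
    """Zero all spatial frequencies outside a band around the expected
    fingerprint ridge pitch (given in pixels), removing slow
    illumination drift and high-frequency pixel noise."""
    f0 = 1.0 / pitch_px
    spec = np.fft.rfft(line - np.mean(line))
    freqs = np.fft.rfftfreq(len(line))
    spec[np.abs(freqs - f0) > rel_band * f0] = 0.0
    return np.fft.irfft(spec, n=len(line))

def agc(line, target=1.0, eps=1e-9):
    """Automatic gain control: scale the filtered line to a fixed peak
    amplitude so weak and strong fingers give comparable contrast."""
    return line * (target / (np.max(np.abs(line)) + eps))
```

Together the two stages turn a drifting, weak scan line into a ridge signal of known amplitude, which simplifies the later image reconstruction.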
- the image processing means may include correction means for correcting distortion of the joined image by frequency analysis of the fingerprint portion.
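One way such frequency-based distortion correction could work, as a hedged sketch rather than the patented method: since the ridge period along the slide direction is roughly constant on a real finger, a band whose measured period (found by frequency analysis) exceeds the reference must have been imaged while the finger moved slowly, and it can be resampled by the period ratio:

```python
import numpy as np

def correct_stretch(image, measured_period, reference_period):
    """Resample the rows of a band of the joined image by the ratio
    reference_period / measured_period. A band stretched by slow finger
    motion (measured period too long) is shrunk back; a compressed band
    is expanded."""
    scale = reference_period / measured_period
    rows = image.shape[0]
    new_rows = max(1, int(round(rows * scale)))
    src = np.linspace(0.0, rows - 1.0, new_rows)  # sample positions
    idx = np.arange(rows)
    cols = [np.interp(src, idx, image[:, c]) for c in range(image.shape[1])]
    return np.stack(cols, axis=1)
```

Applying this per band, with the reference period taken from an undistorted part of the fingerprint, evens out speed variations across the whole reconstructed image.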
- the first electronic device includes, directly above the effective pixel portion of a one-dimensional or quasi-one-dimensional image sensor, a gap whose height is 10 µm or more and 200 µm or less and whose width along the short side of the image sensor is at least the effective pixel length in the sub-scanning direction and at most 2.0 mm, and it is preferable to provide a finger travel guide that keeps the finger from contacting the effective pixel portion of the image sensor and maintains a substantially constant distance during the relative movement in which the finger and the image sensor slide against each other.
- the finger travel guide prevents contact between the finger and the effective pixel portion of the image sensor, while also preventing the distance between them from becoming so large that the image blurs or distorts.
- the radiation emitted from the skin surface of the finger can therefore be stably imaged by the image sensor during the relative movement, improving the accuracy of the whole-finger image generated by joining the captured one-dimensional or quasi-one-dimensional partial images.
- FIG. 1 is a view for explaining the principle of a conventional contact-based optical prism system.
- FIGS. 2A and 2B are diagrams for explaining a conventional image reconstruction method.
- FIGS. 3A and 3B are diagrams for explaining problems in the case where the finger is moved slowly in conventional image reconstruction.
- FIGS. 4A and 4B are diagrams for explaining the problems in the case where the finger is moved quickly in conventional image reconstruction.
- FIGS. 5A and 5B are a top view and a cross-sectional view of a biometric feature input device according to a first embodiment of the present invention.
- FIG. 6 is a view for explaining a state in which a finger is pressed against the gap of the finger travel guide of the biometric feature input device according to the first embodiment of the present invention.
- FIG. 7 is a view for explaining a state in which the finger is moved along the finger travel guide of the biometric feature input device according to the first embodiment of the present invention.
- FIG. 8 is a view for explaining the internal structure of the skin of a finger.
- FIG. 9 is a flow chart showing an example of processing by the microprocessor unit of the biometric feature input device of the first embodiment of the present invention.
- FIGS. 10A and 10B are a top view and a cross-sectional view of a biometric feature input device according to a second embodiment of the present invention.
- FIG. 11 is a view for explaining the function of the light source of the biometric feature input device of the second embodiment of the present invention.
- FIG. 12 is a view showing an example of a fingerprint image read by a biometric feature input device according to a second embodiment of the present invention.
- FIG. 13 is a view showing an example of a fingerprint image inputted by the conventional fingerprint input device using the total reflection critical angle.
- FIG. 14 is a cross-sectional view of a biometric feature input device according to a third embodiment of the present invention.
- FIGS. 15A and 15B are explanatory views of the operation of the biometric feature input device according to the third embodiment of the present invention.
- FIG. 16 is a graph showing the relationship between the refractive index and the contrast of the transparent solid film interposed between the two-dimensional image sensor and the finger.
- FIG. 17 is a view showing an example of a fingerprint image read by the biometric feature input device according to the third embodiment of the present invention.
- FIG. 18 is a cross-sectional view of a biometric feature input device according to a fourth embodiment of the present invention.
- FIG. 19 is a cross-sectional view of a biometric feature input device according to a fifth embodiment of the present invention.
- FIG. 20 is a cross-sectional view of a biometric feature input device according to a sixth embodiment of the present invention.
- FIG. 21 is a view for explaining the principle of reading a blood vessel image at the same time as a fingerprint image by the biometric feature input device according to the sixth embodiment of the present invention.
- FIG. 22 is a flow chart showing an example of processing by a microprocessor unit of the biometric feature input device of the sixth embodiment of the present invention.
- FIGS. 23A and 23B are diagrams for explaining an image correction method for a spiral fingerprint in a biometric feature input device according to a sixth embodiment of the present invention.
- FIGS. 24A and 24B are diagrams for explaining an image correction method for a scale-like fingerprint in a biometric feature input device according to a sixth embodiment of the present invention.
- FIG. 25 is a view for explaining the shape of an arcuate fingerprint.
- FIG. 26 is a flow chart showing an example of processing by a microprocessor unit of the biometric feature input device of the seventh embodiment of the present invention.
- the biometric feature input device includes a one-dimensional or quasi-one-dimensional image sensor 5 and a finger travel guide 3 with a gap 2 directly above the effective pixel portion 1 of the image sensor 5.
- it further includes an A/D conversion unit 7 that converts the analog output signal of the image sensor 5 into a digital signal, and a microprocessor unit 8 that controls the imaging timing of the image sensor 5 and the A/D conversion unit 7 and executes image processing and the like on the output digital signal.
- the one-dimensional image sensor 5 is a one-line image sensor, and the quasi-one-dimensional image sensor 5 is a rectangular image sensor of about two to twenty lines.
- the ridge spacing of a fingerprint is about 0.5 mm for adults and about 0.2 mm for children and women.
- it is therefore appropriate that the sensor (light-receiving element) pitch be approximately 20 to 50 µm.
- taking the width and curvature of the finger into consideration, a width of about 15 mm is taken as the effective contact portion; for example, if sensor lines of 512 dots are arranged in 12 lines to form a quasi-one-dimensional image sensor, an image 15.15 mm wide and 0.35 mm long can be obtained at a time.
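The arithmetic behind these figures checks out against the sampling requirement. Since the exact dot pitch is garbled in the text, 30 µm is assumed here as a round value inside the 20 to 50 µm range stated above:

```python
# Values from the text; the per-pixel pitch is an assumed round figure.
ridge_pitch_mm = 0.2                    # finest ridge spacing to resolve
nyquist_limit_mm = ridge_pitch_mm / 2   # sampling theorem: pitch <= 0.1 mm
sensor_pitch_mm = 0.030                 # assumed 30 µm pixel pitch

# The stated 20-50 µm range satisfies the Nyquist criterion:
assert sensor_pitch_mm <= nyquist_limit_mm

dots, lines = 512, 12
width_mm = dots * sensor_pitch_mm    # swath width per capture
height_mm = lines * sensor_pitch_mm  # swath height per capture
```

With a 30 µm pitch this gives a swath of roughly 15.4 mm by 0.36 mm, consistent with the approximately 15 mm by 0.35 mm image the text describes.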
- the image sensor 5 can be manufactured with CMOS, CCD, TFT, or similar technology; the required density and size are well within current integrated-circuit capability, and necessary and sufficient sensitivity can be obtained, as shown by the image sensors already in practical use in video cameras and the like.
- the finger travel guide 3 is provided between the finger 4 and the image sensor 5 so that, during the relative movement at fingerprint acquisition in which the finger 4 and the image sensor 5 slide against each other, a substantially constant distance is maintained, without contact, between the finger 4 and the effective pixel portion 1 of the image sensor 5.
- the finger travel guide 3 is made of an opaque material and is attached to a housing (not shown) dedicated to the biometric feature input device, attached to the housing of an electronic device such as a cellular phone or a personal computer, or forms part of such a chassis.
- the gap 2 provided in the finger travel guide 3 of the present embodiment is rectangular when viewed from directly above, and its long side is large enough for light to reach the effective pixel portion 1 of the image sensor 5.
- the short side (sub-scanning direction) is at least the effective pixel length in that direction. If the gap 2 is too large, the skin of the finger 4 comes into direct contact with the effective pixel portion 1 during fingerprint capture, so the short side is 2.0 mm or less, preferably 1.0 mm or less.
- if the height (depth) of the gap 2 is too small, the skin of the finger 4 directly contacts the effective pixel portion 1 during fingerprint capture; if it is too large, the distance between the skin of the finger 4 and the effective pixel portion 1 becomes so great that image blur worsens. The height is therefore 10 µm to 200 µm, preferably 20 µm to 80 µm.
- The A/D conversion unit 7 converts the analog output signal of the image sensor 5 into a digital signal and outputs the digital signal to the microprocessor unit 8.
- The microprocessor unit 8 receives the digital signal from the A/D conversion unit 7 and executes image processing.
- The first joint of the finger 4 is placed near the gap 2 of the finger travel guide 3, and the finger 4 is drawn in the direction of arrow 6 in FIG. 5B while tracing the gap. Since the skin of the finger 4 is elastic, as shown in FIG. 6, if the finger 4 is pressed in the direction of arrow 601, its skin can contact the effective pixel portion 1 of the image sensor 5 even through the gap 2. However, if the gap 2 is lightly traced by the pad of the finger 4 as described above, a force 603 opposite to the pulling direction 602 acts on the skin surface of the finger 4 as shown in FIG. 7, so the finger 4 does not contact the effective pixel portion 1, and the distance between the skin of the finger 4 and the effective pixel portion 1 is kept constant while the finger 4 is moving.
- The tissue inside the finger beneath the epidermis 1004 includes the dermis 1005, and papillary gland tissue 1003 lies below each ridge 1002 of the fingerprint.
- the dermis 1005 including the papillary gland contains more water and oil than the epidermis 1004, causing a difference in refractive index.
- Light emitted from a ridge 1002 is attenuated by the papillary gland tissue projecting into the fingerprint ridge, compared with light emitted from a valley portion 1001, the concave part of the fingerprint. Therefore, among the sensors (light-receiving elements) arranged in the effective pixel portion 1 of the image sensor 5, a sensor closer to the ridge 1002 at the timing of imaging receives less incident light than a sensor closer to the valley portion 1001. As a result, a partial image in which the valley portion 1001 appears as a bright area and the ridge 1002 as a dark area is obtained.
- The analog signal of each one-dimensional or quasi-one-dimensional partial image obtained at an appropriate timing is converted into a digital signal by the A/D conversion unit 7 and input to the microprocessor unit 8.
- The microprocessor unit 8 reconstructs a pattern image of the skin of the entire finger 4 by connecting the sequentially input partial images.
- The connection processing of partial images performed by the microprocessor unit 8 is basically performed by judging the similarity between partial images, in the same manner as the method described with reference to FIG. 2A. An example of the process is shown in FIG. 9.
- a partial image of one frame of the image sensor 5 is read, and written to the bottom of the first memory (not shown) (step S101).
- The partial image for one frame means the image obtained by all the lines when the image sensor 5 is a quasi-one-dimensional image sensor composed of several lines, and the image obtained by the single line when the image sensor 5 is a one-dimensional image sensor of one line.
- A partial image of one frame of the image sensor 5 is read again and compared, line by line from the top line, with the partial image stored in the first memory (step S102).
- The image portion from the line where a difference is found onward is added above the topmost line of the partial image stored in the first memory (steps S103 and S104).
- For example, if the image sensor 5 is composed of 12 lines, and the last 3 of the 12 lines read this time are identical to the 3 lines on the top-line side of the partial image stored in the first memory while the fourth line from the last differs, the image portion from the first line to the ninth line is added above the topmost line of the first memory.
- Steps S102 to S104 described above are repeated until image data for one finger is acquired (step S105).
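The line-matching procedure of steps S102 to S104 can be sketched as follows. This is an illustrative reconstruction, not the patent's implementation; representing each frame and the accumulated memory as a list of scan lines (top line first) is an assumption of the sketch.

```python
def stitch(stored, frame):
    """Prepend the new lines of `frame` above the image accumulated in `stored`.

    The largest suffix of `frame` that matches the top of `stored` is treated
    as the overlap with the previous read (cf. steps S102-S104); only the
    lines above the overlap are added on the topmost-line side.
    """
    for k in range(min(len(frame), len(stored)), 0, -1):
        if frame[len(frame) - k:] == stored[:k]:
            return frame[:len(frame) - k] + stored
    # No overlap found: the whole frame is new.
    return frame + stored

# Example: the last line of the new frame repeats the stored top line,
# so only lines 'a' and 'b' are added above it.
image = stitch(['c', 'd', 'e'], ['a', 'b', 'c'])
```

Repeating this call for each frame until the whole finger has passed the sensor corresponds to the loop closed by step S105.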
- In this way, a pattern image of the skin directly reflecting the internal structure of the finger 4 can be read stably, unaffected by wetting or drying of the skin, using the one-dimensional or quasi-one-dimensional image sensor 5 without special optical components, and the device can be made simple and compact. The reason is that a small and inexpensive one-dimensional or quasi-one-dimensional image sensor is used as the image sensor 5; that a finger travel guide 3 is provided to maintain a substantially constant, non-contact distance between the finger 4 and the effective pixel portion 1 during the relative movement in which the finger 4 and the image sensor 5 slide against each other; and that the light scattered inside the finger 4 and emitted from its skin surface is captured directly by the image sensor 5 during that relative movement, the obtained one-dimensional or quasi-one-dimensional partial images being stitched together by the image processing of the microprocessor unit 8 to reconstruct the finger pattern image.
- The biometric feature input device according to the second embodiment of the present invention differs from the device shown in FIGS. 5A and 5B, i.e. the biometric feature input device according to the first embodiment, in that a plurality of light sources 151 are disposed on the finger travel guide 3; in all other respects this embodiment is the same as the first embodiment.
- The light sources 151 are arranged in a line along the long side near the gap 2 of the finger travel guide 3; at the time of fingerprint capture they illuminate the finger 4 moving on the finger travel guide 3 in the direction of arrow 6 from the pad side (the finger travel guide 3 side) and generate scattered light inside the finger.
- The light sources 151 are arranged, with the gap 2 at the center, on the side toward which the finger 4 is drawn, so that scattered light can be sufficiently generated inside the fingertip even when the fingertip reaches the vicinity of the gap 2.
- The light scattered inside the finger and emitted from its surface carries the skin pattern even under ambient light alone; however, if the light sources 151 are placed parallel to the one-dimensional or quasi-one-dimensional image sensor 5 along the finger pulling direction, the light from the light sources 151 is scattered inside the finger and the light component in that direction is emitted strongly. This situation will be described with reference to FIG. 12.
- FIG. 11 shows an image of the same part of the same finger read by the method using the total-reflection critical angle, one of the conventional contact-based methods.
- In FIG. 12, the light source is on the lower side of the figure, the side toward which the finger is pulled.
- The fingerprint ridges farther from the light source appear darker, and the nearer sides of the valleys brighter.
- On the light-source side of the fingerprint ridges the image is darker, and the contrast is increased.
- This effect is considered to overlap with the attenuation of the light scattered inside the skin of the finger by the papillary gland tissue 1003.
- In FIG. 11, a pattern is missing around the center of the image, in a portion that corresponds to a peeled area of the skin.
- In FIG. 12, the same site shows a pattern, and the image of the peeled portion, which was conventionally lost, is obtained with high contrast.
- In the present embodiment the light sources 151 are disposed, with the gap 2 at the center, on the side toward which the finger 4 is pulled, but they may also be disposed on the opposite side of the gap 2.
- In the biometric feature input device according to the third embodiment of the present invention, a protective cover 801 made of a light-transmitting solid is inserted in the gap 2 of the finger travel guide 3. In this respect this embodiment differs from the biometric feature input device according to the first embodiment shown in FIG. 1; it is otherwise the same as the first embodiment.
- The lower surface of the protective cover 801 is substantially in contact with the effective pixel portion 1 of the image sensor 5, and its upper surface is substantially flush with the upper surface of the finger travel guide 3.
- The first joint of the finger 4 is placed near the protective cover 801 embedded in the gap 2 of the finger travel guide 3, and the finger 4 is pulled while its pad traces the protective cover 801 under light pressure, so that part of the skin of the finger 4 always touches the protective cover. For this reason, among the light scattered in the finger and emitted from the finger's skin surface, the light emitted from the fingerprint ridges in contact with the protective cover enters the protective cover 801 directly, as shown by reference numeral 1111 in FIG. 15A, propagates through the protective cover 801, and reaches the effective pixel portion 1 of the image sensor 5.
- Light emitted from parts not in contact with the protective cover 801, such as the fingerprint valleys, first enters the air layer and propagates through it, as shown by reference numeral 1112, then enters the protective cover 801, thereafter propagating through the protective cover 801 in the same manner as the light emitted from the fingerprint ridges and reaching the effective pixel portion 1 of the image sensor 5.
- Without the protective cover 801, the light scattered in the finger and emitted from its skin surface enters the air layer and propagates through it before reaching the effective pixel portion 1, regardless of whether it comes from a fingerprint ridge or a fingerprint valley, and the ridge portions are detected by the image sensor 5 as dark areas and the valley portions as bright areas.
- When the protective cover 801 shown in FIG. 15A is interposed, if the refractive index of the protective cover 801 is close to that of air, i.e. close to 1, the situation is equivalent to the case without the cover: the ridges are detected by the image sensor 5 as dark areas and the valleys as bright areas. However, as the refractive index of the protective cover 801 increases, the ridges come to be detected as bright areas and the valleys as dark areas. This is because, as the refractive index of the protective cover 801 increases, the refractive index differences between the finger 4 and the air and between the air and the protective cover 801 become greater than the refractive index difference between the finger 4 and the protective cover 801.
- The light 1112 emitted from a valley passes through two interfaces with a large refractive index difference (the finger-air interface and the air-cover interface), so that even though at the skin surface the intensity of the light emitted from a valley is stronger than that from a ridge, by the time it reaches the effective pixel portion 1 the light arriving from the ridge is stronger than that from the valley.
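The two light paths can be compared with normal-incidence Fresnel power transmittances. This is a simplified sketch of the interface effect only: it assumes normal incidence, ignores multiple reflections and the internal-structure effect, and takes the finger's refractive index as 1.4 (one of the values assumed for the graph of FIG. 16), so it illustrates the qualitative argument rather than reproducing the graph.

```python
def fresnel_t(n1, n2):
    """Power transmittance of a single interface at normal incidence."""
    r = (n1 - n2) / (n1 + n2)
    return 1.0 - r * r

def path_transmittances(n_cover, n_finger=1.4, n_air=1.0):
    # Ridge in contact with the cover: one finger-to-cover interface (path 1111).
    t_ridge = fresnel_t(n_finger, n_cover)
    # Valley not in contact: finger-to-air, then air-to-cover (path 1112).
    t_valley = fresnel_t(n_finger, n_air) * fresnel_t(n_air, n_cover)
    return t_ridge, t_valley

# For a high-index cover, the ridge path crosses one small index step while
# the valley path crosses two large ones, so the ridge light arrives stronger.
t_ridge, t_valley = path_transmittances(2.5)
```

At a cover index of 1.0 the two paths are identical in this simplified model, consistent with the cover becoming optically equivalent to air.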
- The same holds for a conventional two-dimensional image sensor in which the light emitted from the finger is captured close to the finger through a transparent protective cover made of glass or the like: a fingerprint image is obtained in which the valley portions of the fingerprint are dark areas and the ridge portions are bright areas.
- When the refractive index of the protective cover 801 is at a certain value, the contrast between the ridges and the valleys becomes 0. Such a refractive index value is referred to here as a singular point.
- Therefore, the protective cover 801 is made of a light-transmitting solid having a refractive index away from the vicinity of the singular point.
- the refractive index of the protective cover 801 will be considered below.
- In FIG. 16, the line connecting the + marks is for the case where the refractive index of the finger is assumed to be 1.4, and the line connecting the × marks is for the case where it is assumed to be 1.5.
- The graph in FIG. 16 is calculated from only the effect of the refractive index differences at the interfaces between the skin of the finger, the air, and the transparent solid film; the separate effect due to the internal structure of the skin of the finger is excluded, i.e. treated as contributing 0% contrast.
- The graph in FIG. 16 also assumes that the power of the light directed from inside the skin toward the ridges is the same as that directed toward the valleys.
- When the refractive index is 1.0, the same contrast as in the first embodiment is obtained. As the refractive index increases from there, the contrast decreases, becoming zero with a protective cover 801 of refractive index 1.1 and negative for higher refractive indices.
- Consequently, the refractive index of the protective cover 801 needs to be 1.0 or more and less than 1.1, or greater than 1.1. Since there is almost no light-transmitting solid with a refractive index of less than 1.1, the protective cover 801 may in practice be composed of a light-transmitting solid having a refractive index substantially greater than 1.1.
- On the other hand, the contrast is particularly high when the refractive index of the transparent solid film is in the range 1.4 to 2.0. If a peeled part of the skin does not contact the transparent solid film as a whole, that part shows not this contrast but rather a pattern reflecting the structure of the inside of the finger, as described above. Therefore, if the contrast between the ridges in contact with the transparent solid film and the valleys not in contact with it is abnormally high compared with the contrast of that pattern, detection becomes difficult unless the sensor's dynamic range is broad. For this reason a refractive index in the range 1.4 to 2.0, for which the contrast in FIG. 16 is particularly high, is not suitable for the protective cover 801.
- Therefore, it is desirable that the refractive index of the protective cover 801 be greater than 1.1 and less than 1.4, or greater than 2.0 and less than 5.0.
- An example of a substance with a refractive index of less than 1.4 suitable for the protective cover 801 is a glass containing BeF2 (beryllium fluoride) as a main component.
- Examples of solids with a refractive index greater than 2.0 suitable for the protective cover 801 include glasses containing a large amount of BaO (barium oxide) or PbO (lead oxide), hematite, rutile, germanium, diamond, and silicon.
- Silicon is easy to obtain as a semiconductor material, easy to process, and relatively inexpensive. If a silicon wafer with a thickness of 200 µm or less is used as the protective cover, its high transparency in the long-wavelength region of light, in particular near-infrared light with a wavelength of 800 to 1000 nm, gives a sufficient sensor light output.
- Silicon is also an environmentally friendly material compared to glasses containing harmful substances.
- Alternatively, if the lower part of an image sensor such as a CMOS or CCD fabricated from a silicon wafer is polished thin so that the thickness down to the photosensitive layer is 200 µm or less, and the sensor is turned upside down so that what was the base of the original silicon wafer contacts the skin, an equivalent structure can be obtained without attaching a separate cover.
- The image of the fingerprint of the finger 4 read by the biometric feature input device of the present embodiment provided with the protective cover 801 is shown in FIG. 17. It can be seen that ridge contrast is obtained even in the round peeled area at the upper left of the image, although the bright and dark areas there are reversed relative to other places.
- When the protective cover 801 is provided in this way, depending on its refractive index as described above, the fingerprint ridges become bright and the valleys dark where the skin is in contact, with light and dark reversed where it is not in contact.
- This reversal can be handled by the image processing for fingerprint authentication: it suffices to extract only the continuity of the ridges by edge emphasis and connect them.
- Since the authentication method is based on the positional relationship of feature points such as the bifurcation points and end points of the fingerprint ridges, the inversion of light and dark does not affect authentication.
- In the biometric feature input device according to the fourth embodiment of the present invention, a band-pass filter 1801 and an automatic gain control circuit 1802 are connected between the image sensor 5 and the A/D conversion unit 7. In this respect this embodiment differs from the biometric feature input device according to the third embodiment; it is otherwise the same as the third embodiment.
- The band-pass filter 1801 extracts only the image component at the fingerprint ridge pitch from the image signal output from the image sensor 5.
- The optimum frequency characteristics of the band-pass filter 1801 are determined from the sensor density and the scanning frequency, taking into account the fingerprint ridge pitch of 0.2 mm to 0.5 mm.
- The image component extracted by the band-pass filter 1801 is amplified by the automatic gain control circuit 1802 in the subsequent stage and output to the A/D conversion unit 7.
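The passband implied by the 0.2-0.5 mm ridge pitch can be illustrated by relating it to the speed at which the finger sweeps past the sensor. This is a sketch only: the finger speed value is an assumption for illustration, whereas the patent derives the filter characteristics from the sensor density and scanning frequency.

```python
def ridge_passband_hz(finger_speed_mm_s, pitch_min_mm=0.2, pitch_max_mm=0.5):
    """Temporal frequency band of the ridge signal seen by one sensor element.

    A ridge pattern with spatial period p swept past the sensor at speed v
    produces a temporal frequency v / p, so the widest pitch (0.5 mm) sets
    the low cutoff and the narrowest pitch (0.2 mm) the high cutoff.
    """
    return finger_speed_mm_s / pitch_max_mm, finger_speed_mm_s / pitch_min_mm

# A finger pulled at an assumed 20 mm/s puts the ridge signal in 40-100 Hz.
low_hz, high_hz = ridge_passband_hz(20.0)
```

Signal components outside this band, such as slow illumination drift, would then be rejected before amplification.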
- Since the band-pass filter 1801 for extracting only the image component at the fingerprint pitch from the output of the image sensor 5 and the automatic gain control circuit 1802 for amplifying that output are provided, even the small output of a peeled portion can be increased.
- Without this, the output of a peeled portion becomes too small and recognition becomes difficult; the present embodiment can improve this problem. It is advantageous in price if a material such as ordinary glass, with a refractive index of 1.4 to 2.0, can be used for the protective cover 801. Of course, this embodiment is also effective when the protective cover 801 is made of a material with a refractive index outside the range 1.4 to 2.0.
- The biometric feature input device according to the fifth embodiment of the present invention differs from the third embodiment in that the entire finger travel guide 3 is a protective cover 901; the other points are the same as in the third embodiment.
- The protective cover 901 is made of a light-transmitting solid having the same refractive index as the protective cover 801 in the third embodiment, and the thickness condition is also the same as for the protective cover 801.
- Since the entire finger travel guide 3 serves as the protective cover 901, this embodiment has the advantage of excellent ease of assembly.
- The biometric feature input device according to the sixth embodiment of the present invention includes a light source 161 for illuminating the back (nail side) of the finger 4 from above, and reads a blood vessel image simultaneously with the skin pattern of the finger 4. In this respect it differs from the second embodiment; the other points are the same as in the second embodiment.
- The light source 161, attached above the back of the finger 4 by a support (not shown), is for reading the blood vessels of the finger. Since hemoglobin absorbs near-infrared light more strongly than other biological tissue, light of around 800 to 1000 nm is irradiated.
- LEDs developed for infrared remote control, with wavelengths in the range 820 to 950 nm, have large outputs and are suitable for the light source 161.
- Whereas the light from the light source 151 placed under the finger 4 yields only an image of the skin surface, the light irradiated from the upper light source 161 passes through the finger and through the thick blood vessels where blood containing hemoglobin is concentrated, so that the blood vessels appear in the image.
- The second joint of the finger 4 is placed near the gap 2 of the finger travel guide 3, and the finger 4 is pulled while its pad traces the gap 2 in the same manner as before. While the finger 4 is moving, images are captured by the image sensor 5.
- For one frame, the light source 151 disposed under the finger is turned on to obtain an image; for the next frame of the image sensor 5, the light source 151 below the finger is turned off and the light source 161 above the finger is turned on.
- In this way, an image 1701 by the lower light source 151 and an image 1702 by the upper light source 161, as shown in FIG. 21, are obtained.
- In both images, the fingerprint 1704 and the skin texture 1707 between the first joint 1705 and the second joint 1706 are present, but the image by the upper light source 161 additionally includes a blood vessel image 1708.
- An image 1703 containing only the blood vessel image 1709 can be obtained by taking the difference between the two images 1701 and 1702, which are captured with the light sources switched alternately. The processing to obtain this difference is performed by the microprocessor unit 8.
- Figure 22 shows an example of this process.
- a partial image of one frame of the image sensor 5 is read in a state where only the light source 151 is lit, and is written in a first memory (not shown) (steps S201 and S202).
- Next, a partial image of one frame of the image sensor 5 is read in a state where only the light source 161 is lit, and is written to a second memory (not shown) (steps S203 and S204).
- Then a partial image of one frame of the image sensor 5 is read again with only the light source 151 lit and compared, line by line, with the lines on the top-line side of the image stored in the first memory (steps S205 and S206); if any of the lines read this time differs from the lines on the top-line side of the image stored in the first memory, the lines from the differing line onward are added on the top-line side of the first memory (steps S207 and S208).
- Similarly, a partial image of one frame of the image sensor 5 is read with only the light source 161 lit and compared, line by line, with the lines on the top-line side of the image stored in the second memory (steps S209 and S210); if any of the lines read this time differs from the lines on the top-line side of the image stored in the second memory, the lines from the differing line onward are added on the top-line side of the second memory (steps S211 and S212).
- the processes in steps S205 to S212 are repeated until image data for one finger is obtained (step S213).
- As a result, the image 1701 of FIG. 21 is stored in the first memory and the image 1702 of FIG. 21 is stored in the second memory.
- Finally, the difference between the image stored in the second memory and the image stored in the first memory is computed to generate the image 1703 of FIG. 21 (step S214).
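The subtraction of step S214 can be sketched as a pixel-wise difference of the two reconstructed images. This is an illustration only: representing each image as a list of pixel rows, and using the absolute difference so that the vessel regions (which differ between the two exposures) stand out, are assumptions of the sketch, since the exact form of the subtraction is not specified in this text.

```python
def vessel_image(img_lower, img_upper):
    """Isolate the blood-vessel pattern from the two alternately lit images.

    `img_lower` (light source 151 on) holds only the skin pattern, while
    `img_upper` (light source 161 on) holds the skin pattern plus vessels,
    so the per-pixel difference leaves mainly the vessels (cf. image 1703).
    """
    return [[abs(u - l) for u, l in zip(row_u, row_l)]
            for row_u, row_l in zip(img_upper, img_lower)]

diff = vessel_image([[10, 0]], [[5, 7]])
```

The common skin pattern cancels in the difference, leaving the vessel image 1709.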
- In this way, the fingerprint 1704 of the fingertip, the skin pattern 1707 between the first joint and the second joint, and the blood vessels 1709 can be read simultaneously in a single operation.
- The skin pattern between the joints and the blood vessel image are not accurate enough on their own to identify an individual, but they can be used as supplementary data for personal identification by skin patterns such as fingerprints, or for the detection of fake fingers. Taken together, all the images therefore enable personal authentication with higher accuracy than the fingerprint of the fingertip alone.
- The biometric feature input device according to the seventh embodiment of the present invention differs from the first embodiment in that the microprocessor unit 8 executes processing to correct the distortion of the image after the connection processing of the partial images; the other points are the same as in the first embodiment. The configuration of this embodiment is therefore the same as that of the first embodiment.
- The microprocessor unit 8 performs, in this order, the processing of connecting the partial images and the processing of correcting the distortion of the image.
- the process of connecting partial images is the same as in the first embodiment.
- In the horizontal direction of the finger, the image is uniquely determined by the sensor pitch of the image sensor 5 and is not distorted; in the vertical direction, however, the image expands or contracts depending on the speed at which the finger is pulled, even when the correlation between partial images is checked.
- Among fingerprint authentication methods, those that examine the positional correlation of ridge bifurcation points and end points are relatively resistant to distortion, but correction is still desirable to improve authentication accuracy. The present embodiment therefore exploits the fact that a fingerprint has ridge components in both the horizontal and vertical directions and that the ridge spacing is almost constant within an individual, and predicts and corrects the distortion as follows.
- Let f1 be the frequency component 1202 of the horizontal ridges of the original finger pattern image 1201, f2 the frequency component 1203 of the vertical ridges, and Y the ordinate of a pixel on a ridge before correction; the ordinate Y′ of the pixel on the ridge after correction is then given by the following equation.
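The equation itself is not reproduced in this text. A form consistent with the surrounding description, i.e. rescaling the ordinates so that the vertical ridge frequency f2 matches the horizontal frequency f1 (the two being assumed equal on the real finger), would be Y′ = Y · f2 / f1; this is a plausible reconstruction, not the patent's stated equation.

```python
def corrected_ordinate(y, f1, f2):
    """Rescale a ridge pixel's ordinate so the vertical ridge frequency f2
    matches the horizontal frequency f1 (assumed equal on the real finger).

    Hypothetical form of the correction of step S308: a vertically
    compressed image (f2 > f1, finger pulled too fast) is stretched,
    and a vertically elongated image (f2 < f1) is shrunk.
    """
    return y * f2 / f1

# With f1 = 2 and f2 = 3 ridges per unit length, ordinate 10 maps to 15.
y_new = corrected_ordinate(10, 2.0, 3.0)
```

Applying this mapping to every pixel row resamples the image to the corrected height.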
- FIG. 23A shows a case where the finger is pulled slowly and the image is drawn out too long, and FIG. 23B a case where the finger is pulled too fast and the image becomes too short.
- FIGS. 23A and 23B show whorl-type fingerprint patterns, and FIGS. 24A and 24B loop-type patterns.
- These two pattern types account for most human fingerprints.
- In the case of the arch-type pattern of FIG. 25, the frequency components of the horizontal and vertical ridges differ greatly, so the above method cannot be applied.
- However, since the arch pattern is statistically rare (said to be less than 1% among Japanese), most pattern images can be corrected by the above correction method.
- FIG. 26 shows the flow of image processing performed by the microprocessor unit 8.
- Steps S301 to S305 show the process of connecting the partial images, which is the same as steps S101 to S105 in FIG. 9.
- Steps S306 to S308 show the procedure of the process of correcting the distortion of the image.
- First, image processing such as edge enhancement and skeletonization is performed on the image stored in the first memory to extract the ridges (step S306).
- Next, the numbers of horizontal and vertical ridges in the image are counted and divided by the numbers of horizontal and vertical pixels to determine the frequency component f1 of the horizontal ridges and the frequency component f2 of the vertical ridges (step S307).
- Then the type of the fingerprint pattern is judged from the shape of the ridge lines; if it is a whorl-type or loop-type pattern as shown in FIGS. 23A and 23B and FIGS. 24A and 24B, the ordinates of the pixels on the ridge lines are corrected and the image is stretched vertically (step S308).
- According to the present embodiment, the same effects as those of the first embodiment can be obtained, and in addition a pattern image with little distortion can be obtained.
- The reason is that the vertical distortion of the image can be predicted and corrected from the difference between the frequency components of the horizontal and vertical ridges in the image reconstructed by connecting the partial images.
- the present invention has been described with reference to several embodiments.
- the present invention is not limited to only the above embodiments, and various other additions and modifications are possible.
- For example, an embodiment in which the light source 151 of the second embodiment is provided in the first or third to seventh embodiments, or an embodiment in which the band-pass filter 1801 and the automatic gain control circuit 1802 of the fourth embodiment are provided in the first to third or fifth to seventh embodiments, may be configured by combining the above-described embodiments as appropriate.
- The biometric feature input device according to the present invention is useful as a small and inexpensive reading device for stably reading a pattern such as a fingerprint or a blood vessel image of a finger, and is suitable as a device that can input biometric features even under adverse conditions such as skin peeling due to dermatitis.
- a biometric feature such as a fingerprint of a finger can be stably input by the image sensor.
- With a two-dimensional image sensor it is difficult to stably obtain a fingerprint image while maintaining a fixed, non-contact distance to a finger, which has curvature. The present invention instead uses a one-dimensional or quasi-one-dimensional image sensor and provides a finger travel guide that keeps a substantially constant, non-contact distance between the finger and the effective pixel portion of the image sensor during the relative movement in which the finger and the one-dimensional or quasi-one-dimensional image sensor slide against each other.
- FIG. 12 is an example image of the same part of the same finger according to an embodiment of the present invention.
- FIG. 17 is an example image of the same site according to another embodiment of the present invention; in this image too, although the contrast is reversed at the peeled site, the image is obtained without any loss.
- This makes possible a biometric feature input device with higher accuracy than a conventional biometric feature input device using only a fingerprint.
Priority Applications (6)
Application Number | Priority Date | Filing Date | Title |
---|---|---|---|
CA2588172A CA2588172C (en) | 2004-11-15 | 2005-11-15 | Apparatus for inputting biometrical feature |
AU2005302945A AU2005302945B2 (en) | 2004-11-15 | 2005-11-15 | Living body feature innput device |
EP05806995A EP1834581B1 (en) | 2004-11-15 | 2005-11-15 | Living body feature input device |
US11/719,293 US7903847B2 (en) | 2004-11-15 | 2005-11-15 | Apparatus for inputting biometrical feature |
HK07113730.1A HK1108341A1 (en) | 2004-11-15 | 2007-12-18 | Living body feature input device |
US12/905,184 US8170301B2 (en) | 2004-11-15 | 2010-10-15 | Apparatus for inputting biometrical feature |
Applications Claiming Priority (2)
Application Number | Priority Date | Filing Date | Title |
---|---|---|---|
JP2004330830 | 2004-11-15 | ||
JP2004-330830 | 2004-11-15 |
Related Child Applications (2)
Application Number | Title | Priority Date | Filing Date |
---|---|---|---|
US11/719,293 A-371-Of-International US7903847B2 (en) | 2004-11-15 | 2005-11-15 | Apparatus for inputting biometrical feature |
US12/905,184 Division US8170301B2 (en) | 2004-11-15 | 2010-10-15 | Apparatus for inputting biometrical feature |
Publications (1)
Publication Number | Publication Date |
---|---|
WO2006051976A1 true WO2006051976A1 (ja) | 2006-05-18 |
Family
ID=36336641
Family Applications (1)
Application Number | Title | Priority Date | Filing Date |
---|---|---|---|
PCT/JP2005/020905 WO2006051976A1 (ja) | 2004-11-15 | 2005-11-15 | 生体特徴入力装置 |
Country Status (9)
Country | Link |
---|---|
US (2) | US7903847B2 (ja) |
EP (1) | EP1834581B1 (ja) |
KR (1) | KR100944021B1 (ja) |
CN (1) | CN100577102C (ja) |
AU (1) | AU2005302945B2 (ja) |
CA (1) | CA2588172C (ja) |
HK (1) | HK1108341A1 (ja) |
TW (1) | TW200632764A (ja) |
WO (1) | WO2006051976A1 (ja) |
Families Citing this family (47)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
US9286457B2 (en) | 2004-06-14 | 2016-03-15 | Rodney Beatson | Method and system for providing password-free, hardware-rooted, ASIC-based authentication of a human to a mobile device using biometrics with a protected, local template to release trusted credentials to relying parties |
JP2008020942A (ja) | 2006-07-10 | 2008-01-31 | Rohm Co Ltd | Personal identification device and electronic apparatus using the same |
JP4640295B2 (ja) * | 2006-09-07 | 2011-03-02 | Hitachi Ltd | Personal authentication device and method |
JP2008123206A (ja) * | 2006-11-10 | 2008-05-29 | Sony Corp | Registration device, verification device, registration method, verification method, and program |
TWI340920B (en) * | 2007-02-09 | 2011-04-21 | Egis Technology Inc | Biometrics method based on a thermal image of a finger |
JP5773645B2 (ja) * | 2007-06-25 | 2015-09-02 | Real Imaging Ltd | Method, device and system for image analysis |
JP4941311B2 (ja) * | 2008-01-09 | 2012-05-30 | Sony Corp | Mouse |
JP5292821B2 (ja) * | 2008-01-16 | 2013-09-18 | Sony Corp | Vein image acquisition device and vein image acquisition method |
US7792334B2 (en) * | 2008-03-31 | 2010-09-07 | Immersion Corporation | Locating blood vessels |
JP5040835B2 (ja) * | 2008-07-04 | 2012-10-03 | Fujitsu Ltd | Biometric information reading device, biometric information reading method, and biometric information reading program |
ES2335565B1 (es) | 2008-09-26 | 2011-04-08 | Hanscan Ip, B.V. | Optical system, method and computer program for detecting the presence of a living biological element. |
JP5287868B2 (ja) * | 2008-12-17 | 2013-09-11 | Fujitsu Ltd | Biometric authentication device and biometric authentication method |
EP2383694B1 (en) * | 2009-01-28 | 2018-05-09 | Fujitsu Limited | Fingerprint reader and electronic device |
JP5424788B2 (ja) * | 2009-09-16 | 2014-02-26 | Hitachi Solutions Ltd | Biometric information creation method, authentication method, and device for use in biometric authentication devices |
EP2511872B1 (en) * | 2009-12-07 | 2020-05-13 | Nec Corporation | Fake finger discrimination device |
JP2011243042A (ja) * | 2010-05-19 | 2011-12-01 | Nec Corp | Biological imaging device and biological imaging method |
EP2596478B1 (en) * | 2010-07-19 | 2019-09-04 | Risst Ltd. | Fingerprint sensors and systems incorporating fingerprint sensors |
TWI485629B (zh) * | 2011-11-21 | 2015-05-21 | Pixart Imaging Inc | An optical input device, an input detection method thereof, and a method for the optical input device |
US9846799B2 (en) | 2012-05-18 | 2017-12-19 | Apple Inc. | Efficient texture comparison |
US9135496B2 (en) | 2012-05-18 | 2015-09-15 | Apple Inc. | Efficient texture comparison |
US20140003683A1 (en) * | 2012-06-29 | 2014-01-02 | Apple Inc. | Far-Field Sensing for Rotation of Finger |
US9715616B2 (en) | 2012-06-29 | 2017-07-25 | Apple Inc. | Fingerprint sensing and enrollment |
US9202099B2 (en) | 2012-06-29 | 2015-12-01 | Apple Inc. | Fingerprint sensing and enrollment |
TWI518306B (zh) * | 2012-10-04 | 2016-01-21 | Pixart Imaging Inc | Image capturing device and optical displacement estimation device |
US9111125B2 (en) | 2013-02-08 | 2015-08-18 | Apple Inc. | Fingerprint imaging and quality characterization |
US10068120B2 (en) | 2013-03-15 | 2018-09-04 | Apple Inc. | High dynamic range fingerprint sensing |
JP6134662B2 (ja) * | 2014-01-31 | 2017-05-24 | Hitachi Industry & Control Solutions Ltd | Biometric authentication device and biometric authentication method |
JPWO2015145778A1 (ja) | 2014-03-28 | 2017-04-13 | Pioneer Corp | In-vehicle lighting device |
US9760755B1 (en) * | 2014-10-03 | 2017-09-12 | Egis Technology Inc. | Fingerprint matching methods and device |
JP6660720B2 (ja) * | 2015-12-08 | 2020-03-11 | Hitachi Ltd | Finger vein authentication device |
KR102466995B1 (ko) * | 2015-12-21 | 2022-11-14 | Samsung Electronics Co Ltd | User authentication apparatus and method |
JP6743429B2 (ja) * | 2016-03-11 | 2020-08-19 | Fujitsu Ltd | Biometric imaging device, biometric imaging method, and biometric imaging program |
US10713458B2 (en) | 2016-05-23 | 2020-07-14 | InSyte Systems | Integrated light emitting display and sensors for detecting biologic characteristics |
US10931859B2 (en) * | 2016-05-23 | 2021-02-23 | InSyte Systems | Light emitter and sensors for detecting biologic characteristics |
CN106022067B (zh) | 2016-05-30 | 2018-03-27 | Guangdong Oppo Mobile Telecommunications Corp Ltd | Unlocking control method and terminal device |
US10037454B2 (en) * | 2016-12-19 | 2018-07-31 | Fingerprint Cards Ab | Method and device for forming a fingerprint representation |
US10339361B2 (en) * | 2017-03-23 | 2019-07-02 | International Business Machines Corporation | Composite fingerprint authenticator |
CN107256068B (zh) | 2017-05-12 | 2019-08-23 | Guangdong Oppo Mobile Telecommunications Corp Ltd | Fingerprint collection method and related products |
JP6978665B2 (ja) | 2017-07-25 | 2021-12-08 | Fujitsu Ltd | Biometric image processing device, biometric image processing method, and biometric image processing program |
CN108073912B (zh) * | 2018-01-03 | 2021-01-26 | BOE Technology Group Co Ltd | Fingerprint identification device and fingerprint identification equipment |
US11308339B2 (en) * | 2018-01-30 | 2022-04-19 | T-Mobile Usa, Inc. | Methods and systems for identifying and profiling biological tissue |
TWI654441B (zh) * | 2018-06-29 | 2019-03-21 | Gingy Technology Inc | Image capturing device |
US10599909B2 (en) * | 2018-08-07 | 2020-03-24 | UITResFP, LLC | Electronic device and method for non-contact capacitive and optical pin hole fingerprint detection |
KR102716356B1 (ko) * | 2019-01-25 | 2024-10-10 | Samsung Electronics Co Ltd | Texture interface for biosignal measurement and biosignal measuring apparatus including the same |
US11353993B2 (en) * | 2020-05-05 | 2022-06-07 | Pixart Imaging Inc. | Optical detection device |
CN112464866B (zh) * | 2020-06-15 | 2024-02-27 | Egis Technology Inc | Fingerprint sensing device and fingerprint sensing method |
CN112603276B (zh) * | 2020-12-28 | 2022-08-02 | Zhongke Pengzhou Smart Industry Innovation Center Co Ltd | Device and method for rapid detection of radial ("cunkou") pulse waves of both hands |
Citations (7)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
JPH04190470A (ja) * | 1990-11-26 | 1992-07-08 | Sharp Corp | Fingerprint input device |
JPH08154921A (ja) * | 1994-12-06 | 1996-06-18 | Nippon Telegr & Teleph Corp <Ntt> | Fingerprint imaging device |
JPH10222641A (ja) * | 1997-02-05 | 1998-08-21 | Nec Corp | Fingerprint image input device with finger guide |
JP2003303178A (ja) * | 2002-04-12 | 2003-10-24 | Nec Corp | Personal identification system |
JP2004234040A (ja) * | 2003-01-28 | 2004-08-19 | Hitachi Ltd | Personal authentication device |
JP2005174280A (ja) * | 2003-11-18 | 2005-06-30 | Canon Inc | Image acquisition device, fingerprint authentication device, and image acquisition method |
JP2005242907A (ja) * | 2004-02-27 | 2005-09-08 | Casio Comput Co Ltd | Ring-type reading device |
Family Cites Families (34)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
EP0210988B1 (en) | 1984-03-20 | 1989-07-05 | National Research Development Corporation | Method and apparatus for the identification of individuals |
JPH0345629A (ja) | 1989-07-14 | 1991-02-27 | Shin Etsu Chem Co Ltd | Bis-silylalkoxyarylene compound |
JPH03150126A (ja) | 1989-11-08 | 1991-06-26 | Toppan Printing Co Ltd | Vacuum/pressure forming machine |
US5177802A (en) * | 1990-03-07 | 1993-01-05 | Sharp Kabushiki Kaisha | Fingerprint input apparatus |
JPH05168610A (ja) | 1991-12-20 | 1993-07-02 | Kawatetsu Techno Res Corp | Fingerprint detection method |
JP3045629B2 (ja) | 1993-02-17 | 2000-05-29 | Mitsubishi Electric Corp | Concavo-convex pattern detection device |
FR2749955B1 (fr) | 1996-06-14 | 1998-09-11 | Thomson Csf | Fingerprint reading system |
JPH10143663A (ja) | 1996-11-13 | 1998-05-29 | Hamamatsu Photonics Kk | Fingerprint information processing device |
JPH10208022A (ja) | 1997-01-23 | 1998-08-07 | Hamamatsu Photonics Kk | Fiber optic plate |
JP2980051B2 (ja) | 1997-03-12 | 1999-11-22 | Nec Corp | Fingerprint detection method and device |
US6259804B1 (en) * | 1997-05-16 | 2001-07-10 | Authentic, Inc. | Fingerprint sensor with gain control features and associated methods |
NO307065B1 (no) * | 1998-02-26 | 2000-01-31 | Idex As | Fingeravtrykksensor |
US6381347B1 (en) | 1998-11-12 | 2002-04-30 | Secugen | High contrast, low distortion optical acquistion system for image capturing |
JP3150126B2 (ja) | 1999-02-03 | 2001-03-26 | Nec Shizuoka Ltd | Fingerprint input device |
JP4253827B2 (ja) | 1999-09-27 | 2009-04-15 | Casio Comput Co Ltd | Two-dimensional image reading device |
JP3738629B2 (ja) | 1999-11-25 | 2006-01-25 | Mitsubishi Electric Corp | Portable electronic device |
JP3825222B2 (ja) * | 2000-03-24 | 2006-09-27 | Matsushita Electric Ind Co Ltd | Personal authentication device, personal authentication system, and electronic payment system |
JP2002049913A (ja) | 2000-08-02 | 2002-02-15 | Nec Corp | Fingerprint authentication device and fingerprint authentication method |
WO2002061668A1 (en) * | 2000-12-05 | 2002-08-08 | Arete Associates, A California Corporation | Linear contact sensor apparatus and method for use in imaging features of an object |
JP2003006627A (ja) | 2001-06-18 | 2003-01-10 | Nec Corp | Fingerprint input device |
JP4281272B2 (ja) | 2001-09-14 | 2009-06-17 | Mitsubishi Electric Corp | Fingerprint image capturing method, fingerprint image acquisition method, fingerprint image capturing device, and personal identification device |
JP2003150943A (ja) | 2001-11-13 | 2003-05-23 | Casio Comput Co Ltd | Image reading device |
JP4169185B2 (ja) * | 2002-02-25 | 2008-10-22 | Fujitsu Ltd | Image joining method, program, and device |
JP2003308516A (ja) | 2002-04-12 | 2003-10-31 | Matsushita Electric Ind Co Ltd | Fingerprint sensor and electronic device |
US6904126B2 (en) | 2002-06-19 | 2005-06-07 | Canon Kabushiki Kaisha | Radiological imaging apparatus and method |
DE60238281D1 (de) | 2002-09-17 | 2010-12-23 | Fujitsu Ltd | Device for recording biological information and authorization device using biological information |
JP4457593B2 (ja) * | 2003-08-11 | 2010-04-28 | Hitachi Ltd | Finger authentication device |
JP4207717B2 (ja) | 2003-08-26 | 2009-01-14 | Hitachi Ltd | Personal authentication device |
JP2005182474A (ja) | 2003-12-19 | 2005-07-07 | Fujitsu Ltd | Sensor surface protection structure for a fingerprint sensor |
JP4556111B2 (ja) | 2004-09-02 | 2010-10-06 | Sony Corp | Information processing device |
JP2006098340A (ja) | 2004-09-30 | 2006-04-13 | Sharp Corp | Internal detection device |
US7376451B2 (en) | 2004-10-27 | 2008-05-20 | General Electric Company | Measurement and treatment system and method |
JP2006221514A (ja) * | 2005-02-14 | 2006-08-24 | Canon Inc | Biometric authentication device and image acquisition method |
JP4101243B2 (ja) | 2005-02-23 | 2008-06-18 | Hiwin Technologies Corp | Ball screw with replaceable oil reservoir |
2005
- 2005-11-15 KR KR1020077010894A patent/KR100944021B1/ko not_active IP Right Cessation
- 2005-11-15 CA CA2588172A patent/CA2588172C/en not_active Expired - Fee Related
- 2005-11-15 CN CN200580039057A patent/CN100577102C/zh not_active Expired - Fee Related
- 2005-11-15 AU AU2005302945A patent/AU2005302945B2/en not_active Ceased
- 2005-11-15 WO PCT/JP2005/020905 patent/WO2006051976A1/ja active Application Filing
- 2005-11-15 TW TW094140058A patent/TW200632764A/zh not_active IP Right Cessation
- 2005-11-15 US US11/719,293 patent/US7903847B2/en not_active Expired - Fee Related
- 2005-11-15 EP EP05806995A patent/EP1834581B1/en not_active Not-in-force
2007
- 2007-12-18 HK HK07113730.1A patent/HK1108341A1/xx not_active IP Right Cessation
2010
- 2010-10-15 US US12/905,184 patent/US8170301B2/en active Active
Non-Patent Citations (1)
Title |
---|
See also references of EP1834581A4 * |
Also Published As
Publication number | Publication date |
---|---|
CA2588172C (en) | 2012-08-14 |
US20090074263A1 (en) | 2009-03-19 |
HK1108341A1 (en) | 2008-05-09 |
EP1834581A4 (en) | 2009-09-16 |
US7903847B2 (en) | 2011-03-08 |
TWI312136B (ja) | 2009-07-11 |
EP1834581A1 (en) | 2007-09-19 |
CN101056578A (zh) | 2007-10-17 |
CA2588172A1 (en) | 2006-05-18 |
TW200632764A (en) | 2006-09-16 |
US8170301B2 (en) | 2012-05-01 |
AU2005302945B2 (en) | 2012-07-19 |
US20110025835A1 (en) | 2011-02-03 |
AU2005302945A1 (en) | 2006-05-18 |
EP1834581B1 (en) | 2012-01-11 |
KR20070068453A (ko) | 2007-06-29 |
CN100577102C (zh) | 2010-01-06 |
KR100944021B1 (ko) | 2010-02-24 |
Similar Documents
Publication | Publication Date | Title |
---|---|---|
WO2006051976A1 (ja) | Living body feature input device | |
JP4182988B2 (ja) | Image reading device and image reading method | |
JP4466529B2 (ja) | Living body feature input device | |
US7801338B2 (en) | Multispectral biometric sensors | |
US7945073B2 (en) | Vein authentication device | |
JP5292821B2 (ja) | Vein image acquisition device and vein image acquisition method | |
JP4182987B2 (ja) | Image reading device | |
JP2000217803A (ja) | Fingerprint input device | |
WO2013146761A1 (ja) | Authentication device, authentication prism body, and authentication method | |
JP5556663B2 (ja) | Verification device, verification method, and program | |
JP6320277B2 (ja) | Biometric authentication device | |
JP2010503079A (ja) | System and device for robust fingerprint acquisition | |
KR20070094736A (ko) | Authentication device, authentication method, and program | |
JP6443349B2 (ja) | Biometric authentication device and biometric authentication method using a finger authentication prism | |
JP5811386B2 (ja) | Authentication device, authentication prism body, and authentication method | |
JP5811385B2 (ja) | Authentication device, authentication prism body, and authentication method | |
JPH025190A (ja) | Fingerprint sensor | |
KR101547659B1 (ko) | Biometric authentication device using finger vein scanning and terminal equipped with the same | |
JP2799054B2 (ja) | Fingerprint input device | |
CN110580433A (zh) | Identity recognition device | |
JPH04120671A (ja) | Fingerprint input device | |
JPH04242486A (ja) | Fingerprint input device |
Legal Events
Date | Code | Title | Description |
---|---|---|---|
AK | Designated states |
Kind code of ref document: A1 Designated state(s): AE AG AL AM AT AU AZ BA BB BG BR BW BY BZ CA CH CN CO CR CU CZ DE DK DM DZ EC EE EG ES FI GB GD GE GH GM HR HU ID IL IN IS JP KE KG KM KN KP KR KZ LC LK LR LS LT LU LV LY MA MD MG MK MN MW MX MZ NA NG NI NO NZ OM PG PH PL PT RO RU SC SD SE SG SK SL SM SY TJ TM TN TR TT TZ UA UG US UZ VC VN YU ZA ZM ZW |
AL | Designated countries for regional patents |
Kind code of ref document: A1 Designated state(s): GM KE LS MW MZ NA SD SL SZ TZ UG ZM ZW AM AZ BY KG KZ MD RU TJ TM AT BE BG CH CY CZ DE DK EE ES FI FR GB GR HU IE IS IT LT LU LV MC NL PL PT RO SE SI SK TR BF BJ CF CG CI CM GA GN GQ GW ML MR NE SN TD TG |
121 | Ep: the epo has been informed by wipo that ep was designated in this application | ||
WWE | Wipo information: entry into national phase |
Ref document number: 2005302945 Country of ref document: AU Ref document number: 2588172 Country of ref document: CA Ref document number: 2005806995 Country of ref document: EP Ref document number: 11719293 Country of ref document: US Ref document number: 1020077010894 Country of ref document: KR |
WWE | Wipo information: entry into national phase |
Ref document number: 200580039057.1 Country of ref document: CN |
NENP | Non-entry into the national phase |
Ref country code: DE |
ENP | Entry into the national phase |
Ref document number: 2005302945 Country of ref document: AU Date of ref document: 20051115 Kind code of ref document: A |
WWP | Wipo information: published in national office |
Ref document number: 2005302945 Country of ref document: AU |
WWP | Wipo information: published in national office |
Ref document number: 2005806995 Country of ref document: EP |
NENP | Non-entry into the national phase |
Ref country code: JP |