WO2006025289A1 - Information Processing Apparatus (情報処理装置) - Google Patents
Information Processing Apparatus (情報処理装置)
- Publication number
- WO2006025289A1 (PCT/JP2005/015595, application JP2005015595W)
- Authority
- WO
- WIPO (PCT)
- Prior art keywords
- light
- imaging
- image
- blood vessel
- finger
Classifications
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06F—ELECTRIC DIGITAL DATA PROCESSING
- G06F18/00—Pattern recognition
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06V—IMAGE OR VIDEO RECOGNITION OR UNDERSTANDING
- G06V40/00—Recognition of biometric, human-related or animal-related patterns in image or video data
- G06V40/10—Human or animal bodies, e.g. vehicle occupants or pedestrians; Body parts, e.g. hands
- G06V40/14—Vascular patterns
- G06V40/145—Sensors therefor
-
- A—HUMAN NECESSITIES
- A61—MEDICAL OR VETERINARY SCIENCE; HYGIENE
- A61B—DIAGNOSIS; SURGERY; IDENTIFICATION
- A61B5/00—Measuring for diagnostic purposes; Identification of persons
- A61B5/117—Identification of persons
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06T—IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
- G06T1/00—General purpose image data processing
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06V—IMAGE OR VIDEO RECOGNITION OR UNDERSTANDING
- G06V40/00—Recognition of biometric, human-related or animal-related patterns in image or video data
- G06V40/10—Human or animal bodies, e.g. vehicle occupants or pedestrians; Body parts, e.g. hands
- G06V40/14—Vascular patterns
Definitions
- the present invention relates to an information processing apparatus, and is suitably applied to, for example, imaging a blood vessel as a biometric authentication target.
- Biometric authentication can use structures unique to the individual that lie inside the living body, such as blood vessels. Compared with a structure on the body surface such as a fingerprint, an internal structure is difficult to steal directly from the living body and makes it difficult for a third party to impersonate a registrant, so it is attracting attention as a way to enhance security.
- In a conventional authentication device of this kind, a finger is irradiated from the finger pad surface with near-infrared light whose intensity is higher than that of the light ordinarily reflected from the living body (ambient light such as visible light). The near-infrared light is absorbed by the hemoglobin in the vascular tissue inside the finger and scattered in the tissues other than the vascular tissue, and the light passing through the finger is guided to a CCD (Charge Coupled Device) through a macro lens that transmits near-infrared light.
- The authentication device photoelectrically converts this near-infrared light and adjusts the CCD so that the amount of charge stored per unit time, that is, the imaging sensitivity for the near-infrared light, becomes higher than for ordinary light, thereby generating a blood vessel image signal. Whether the person is a registered user is then determined based on the blood vessel formation pattern in that blood vessel image signal.
- The present invention has been made in consideration of the above points, and proposes an information processing apparatus that can improve responsiveness (immediacy).
- To this end, the present invention provides an irradiating unit that irradiates a living body with a plurality of lights having different wavelengths, a spectroscopic unit that disperses each light obtained from the living body, and an imaging element that images each light dispersed by the spectroscopic unit.
- With this configuration, the imaging targets can be imaged at the same time without requiring control processing of the optical system, so that the processing load at the time of imaging can be reduced.
- From the imaging signal output as the imaging result of the imaging element for each of the dispersed lights, obtained by irradiating the living body with a plurality of lights having different wavelengths, the plurality of image components corresponding to the respective lights are separated and the processing corresponding to each of these image components is executed individually. The imaging targets can thus be captured at the same time without optical system control processing, so the processing load at the time of imaging can be reduced and an information processing apparatus that can improve responsiveness can be realized.
- FIG. 1 is a schematic diagram showing the overall configuration of the authentication apparatus according to the present embodiment.
- FIG. 2 is a schematic diagram showing an external configuration of the blood vessel imaging unit.
- Fig. 3 is a schematic diagram showing the direction of near-infrared light irradiation.
- FIG. 4 is a schematic diagram showing the structural units and characteristics of the filter array.
- FIG. 5 is a block diagram showing the configuration of the signal processing unit.

BEST MODE FOR CARRYING OUT THE INVENTION

- Embodiments to which the present invention is applied will be described in detail below with reference to the drawings.
- In FIG. 1, reference numeral 1 denotes the authentication apparatus according to the present embodiment as a whole. It includes a hybrid imaging unit 2 that images the surface of a living finger FG and the blood vessels inside the finger FG as authentication targets, an imaging control unit 3 that controls the hybrid imaging unit 2 so that the surface of the finger FG and the blood vessels are imaged simultaneously, and a signal processing unit 4 that executes various kinds of processing based on the imaging signal output as the imaging result of the hybrid imaging unit 2.
- the hybrid imaging unit 2 has a substantially rectangular parallelepiped casing 11, and a curved guide groove 12 is formed on the upper surface of the casing 11.
- the imaging opening 13 is formed on the bottom surface in the vicinity of the tip of the guide groove 12.
- The hybrid imaging unit 2 guides the finger pad of the finger FG placed in the guide groove 12 onto the imaging opening 13, and since the fingertip is placed so as to abut against the tip of the guide groove 12, the position of the imaging opening 13 with respect to the finger FG is determined for each person being imaged.
- A transparent opening cover 14 made of a predetermined material is provided on the surface of the imaging opening 13.
- Inside the casing 11, a camera unit 15 is provided directly below the imaging opening 13.
- This hybrid imaging unit 2 thereby prevents foreign matter from flowing into the casing 11 through the imaging opening 13 when the finger FG is placed over it, and prevents contamination of the camera unit 15 in advance.
- A pair of near-infrared light sources 16 (16A and 16B) that emit near-infrared light as blood vessel imaging light are provided on the side surfaces of the guide groove 12 so as to sandwich the imaging opening 13 in a state parallel to the short direction of the guide groove 12.
- These near-infrared light sources 16 are designed to emit near-infrared light in a wavelength range of approximately 900 to 1000 [nm] on which both the oxygenated hemoglobin and the deoxygenated hemoglobin passing through the blood vessels are wavelength-dependent (hereinafter referred to as the blood-vessel-dependent wavelength region).
- The near-infrared light is emitted not from a direction perpendicular to the imaging surface of the camera unit 15 but from an irradiation direction that forms an acute angle with that imaging surface (hereinafter referred to as the near-infrared light irradiation direction).
- In practice, an irradiation direction that forms 30° to 60° with the imaging surface of the camera unit 15 is most effective.
- The hybrid imaging unit 2 can thus irradiate the finger pad side portion of the finger FG placed in the guide groove 12 with near-infrared light from the near-infrared light sources 16.
- This near-infrared light is absorbed by the hemoglobin present in the blood vessel tissue inside the finger FG and scattered in the tissues other than the blood vessel tissue, so that it passes through the finger FG and enters the camera unit 15 through the imaging opening 13 and the opening cover 14 as blood vessel projection light.
- Because both the oxygenated hemoglobin and the deoxygenated hemoglobin in the blood vessel tissue inherent in the finger FG are irradiated with near-infrared light in the blood-vessel-dependent wavelength region on which they are wavelength-dependent, this blood vessel projection light reflects the blood vessel tissue inherent in the finger FG all the more faithfully.
- A pair of visible light sources 17 (17A and 17B) for emitting visible light as fingerprint imaging light are provided on the side surfaces of the guide groove 12 so as to sandwich the imaging opening 13 in a state parallel to the longitudinal direction of the guide groove 12.
- The visible light sources 17 emit visible light from an irradiation direction substantially orthogonal to the imaging surface of the camera unit 15 (hereinafter referred to as the visible light irradiation direction). The hybrid imaging unit 2 can therefore irradiate the center of the finger pad of the finger FG placed in the guide groove 12 with visible light from the visible light sources 17. In this case, the visible light is reflected at the surface of the finger FG and enters the camera unit 15 through the imaging opening 13 and the opening cover 14 as finger surface projection light.
- This camera unit 15 is configured by sequentially arranging a macro lens 21, a filter array 22, and a CCD image sensor 23 on the optical path of the light incident from the opening cover 14.
- The macro lens 21 condenses the blood vessel projection light and the finger surface projection light incident from the opening cover 14 onto the filter array 22.
- The filter array 22 has a plurality of pixel filters, each transmitting light of a wavelength corresponding to a predetermined color, arranged in a grid pattern with a fixed block of filters as a unit (hereinafter referred to as a color spectroscopic unit).
- In this embodiment, “R”, “G” and “B” pixel filters are used to form the color spectroscopic unit.
- As shown in Fig. 4(A) and Fig. 4(B), of the four adjacent pixel filters in one unit, the upper left and lower right are “G” pixel filters that transmit light in the wavelength region of approximately 500 to 600 [nm], the upper right is a “B” pixel filter that transmits light in the wavelength region of approximately 400 to 500 [nm], and the lower left is an “R” pixel filter that transmits light in the wavelength region of approximately 600 to 700 [nm], so that the filter array 22 is arranged like a general RGB filter array.
- However, the “R” pixel filter differs from a general one in that it is also designed to transmit the blood-vessel-dependent wavelength region (approximately 900 to 1000 [nm]).
- The filter array 22 can thereby disperse the finger surface projection light and the blood vessel projection light obtained through the macro lens 21.
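The spectral behavior of one color spectroscopic unit described above can be sketched as follows. This is a hypothetical illustration (the patent gives only approximate passbands, and the exact transmission curves are not specified); the extended "R" passband models the near-infrared-transmitting modification of this device.

```python
# Model of one 2x2 color spectroscopic unit of filter array 22 (assumed layout).
# position in the 2x2 unit -> (filter name, list of passbands in nm)
UNIT = {
    (0, 0): ("G", [(500, 600)]),               # upper left
    (1, 1): ("G", [(500, 600)]),               # lower right
    (0, 1): ("B", [(400, 500)]),               # upper right
    (1, 0): ("R", [(600, 700), (900, 1000)]),  # lower left, also passes NIR
}

def filters_passing(wavelength_nm):
    """Return the set of filter names in the unit that transmit the wavelength."""
    names = set()
    for name, bands in UNIT.values():
        if any(lo <= wavelength_nm <= hi for lo, hi in bands):
            names.add(name)
    return names
```

With this layout, near-infrared light of the blood-vessel-dependent region (e.g. 950 nm) reaches only the "R" sites, while visible finger-surface light reaches the "G" and "B" sites, which is what allows the two projections to be separated downstream.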
- The CCD image sensor 23 has a plurality of photoelectric conversion elements arranged in a grid on its imaging surface in correspondence with the pixels, and photoelectrically converts the blood vessel projection light and the finger surface projection light incident on the imaging surface. Then, under the control of the imaging control unit 3, the CCD image sensor 23 reads out the charge stored as the photoelectric conversion result and outputs it to the signal processing unit 4 as the imaging signal S10.
- Incidentally, the near-infrared light incident on the camera unit 15 from the opening cover 14 is not only the light obtained via the inside of the finger FG (the blood vessel projection light) as described above; some of it is reflected mainly at the surface of the finger FG (hereinafter, near-infrared light reflected at the surface of the finger FG is referred to as surface-reflected near-infrared light).
- This surface-reflected near-infrared light is incident mainly from a direction corresponding to the oblique near-infrared light irradiation direction, whereas the blood vessel projection light and the finger surface projection light incident on the camera unit 15 from the opening cover 14 enter from a direction substantially perpendicular to the imaging surface, owing to the presence of the bone at the center of the finger cross section, the irradiation directions, and so on.
- In addition to the above-described configuration, therefore, the camera unit 15 is provided with a polarizing plate 24 on the filter array 22, whose polarization axis is perpendicular to the near-infrared light irradiation direction and parallel to the visible light irradiation direction.
- Since the polarization axis of this polarizing plate 24 deviates from the path of the surface-reflected near-infrared light incident on the camera unit 15 while remaining parallel to the visible light irradiation direction, the polarizing plate 24 transmits the blood vessel projection light and the finger surface projection light that are perpendicular to the imaging surface, and can thus selectively guide the blood vessel projection light and the finger surface projection light incident through the opening cover 14 to the imaging surface of the CCD image sensor 23.
- In this way, the hybrid imaging unit 2 is designed to image both the surface of the finger FG and the blood vessels inside it.
- The imaging control unit 3 drives and controls the near-infrared light sources 16, the visible light sources 17, and the CCD image sensor 23.
- Specifically, the imaging control unit 3 generates a near-infrared light source control signal S21 at a first voltage level and a visible light source control signal S22 at a second voltage level from the power supplied from a main power supply unit (not shown) provided in the authentication apparatus 1. The imaging control unit 3 then drives the light sources by applying the near-infrared light source control signal S21 and the visible light source control signal S22 to the corresponding near-infrared light sources 16 and visible light sources 17.
- As a result, the finger pad side portion of the finger FG placed in the guide groove 12 is irradiated with near-infrared light from the near-infrared light irradiation direction, and at the same time the center portion of the finger pad of the finger FG is irradiated with visible light from the visible light irradiation direction.
- Consequently, the finger surface projection light obtained from the surface of the finger FG and the blood vessel projection light obtained via the inside of the finger FG are incident on the imaging surface of the CCD image sensor 23 at the same time.
- The imaging control unit 3 also generates a CCD image sensor control signal S23 having a predetermined duty ratio based on a clock signal supplied from a clock generation unit (not shown), and drives the CCD image sensor 23 by outputting this signal to it.
- In this case, the CCD image sensor 23 uses the falling edge (or rising edge) of the CCD image sensor control signal S23 as a readout time point, and sequentially outputs the charge stored by that time point as the photoelectric conversion result of both the finger surface projection light and the blood vessel projection light to the signal processing unit 4 as the imaging signal S10.
- the imaging control unit 3 can control the hybrid imaging unit 2 so as to simultaneously image the surface of the finger FG and the blood vessel.
- As shown in FIG. 5, the signal processing unit 4 includes a signal separation unit 31 that separates, from the imaging signal S10, a first image signal component corresponding to the finger surface projection light (hereinafter referred to as the finger surface image component) and a second image signal component corresponding to the blood vessel projection light (hereinafter referred to as the blood vessel image component); a misalignment detection processing unit 32 that executes position shift detection processing for the blood vessel image in the blood vessel image component based on the finger surface image component; and an authentication processing unit 33 that executes authentication processing based on the blood vessel image component.
- The signal separation unit 31 generates imaging data by performing A/D (Analog/Digital) conversion on the imaging signal S10 output from the CCD image sensor 23.
- Then the signal separation unit 31 extracts, for example, the pixel data corresponding to “G” from each color spectroscopic unit of the imaging data, and sends this pixel data group to the misalignment detection processing unit 32 as finger surface image component data (hereinafter referred to as finger surface image data) D31.
- The signal separation unit 31 also extracts the pixel data corresponding to “R” from each color spectroscopic unit of the imaging data, and sends this pixel data group to the authentication processing unit 33 as blood vessel image component data (hereinafter referred to as blood vessel image data) D32.
- In this way, the signal separation unit 31 can separate the finger surface image component corresponding to the finger surface projection light and the blood vessel image component corresponding to the blood vessel projection light from the imaging signal S10.
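The per-unit pixel extraction performed by the signal separation unit 31 can be sketched as follows, assuming the 2x2 layout described for the filter array (G at upper left/lower right, B at upper right, R at lower left). The averaging of the two "G" sites is an illustrative choice; the patent only says the "G" pixel data are extracted.

```python
import numpy as np

def separate_components(raw):
    """Split a raw mosaic frame (H x W, even dimensions) into the
    finger-surface component ("G" pixels) and the blood-vessel
    component ("R" pixels) by subsampling the 2x2 units."""
    g = (raw[0::2, 0::2] + raw[1::2, 1::2]) / 2.0  # average the two G sites
    r = raw[1::2, 0::2]                            # NIR-transmitting "R" sites
    return g, r
```

Each output is a quarter-resolution image: the "G" plane plays the role of the finger surface image data D31 and the "R" plane that of the blood vessel image data D32.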
- The misalignment detection processing unit 32 holds an image of the surface of a finger FG placed at a reference position (hereinafter referred to as the reference finger surface image), calculates the cross-correlation between this image and the finger surface image of the finger surface image data D31, and detects the positional deviation in the X direction and the Y direction of the finger surface image.
- The misalignment detection processing unit 32 then sends the detection result to the authentication processing unit 33 as data for correcting the position of the blood vessel image in the blood vessel image data D32 (hereinafter referred to as position correction data) D33.
- In this case, the misalignment detection processing unit 32 detects the positional deviation of the finger FG at the time of imaging from the imaging result of the finger FG surface, so the deviation can be detected with higher accuracy than from the imaging result of the inside of the finger FG, which contains more noise components caused by scattering and the like.
- Moreover, since the misalignment detection processing unit 32 uses as the imaging result of the finger FG surface the pixel data corresponding to “G”, which has the largest light quantity in the color spectroscopic unit (finger surface image data D31), the resolution of the finger surface image can be increased, and as a result the positional deviation can be detected with still higher accuracy.
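The cross-correlation-based deviation detection above can be sketched as an FFT cross-correlation whose peak gives the (Y, X) shift. This is a minimal illustration under the assumption of circular (wrap-around) correlation; the patent does not specify the correlation method.

```python
import numpy as np

def detect_shift(reference, observed):
    """Estimate the (dy, dx) displacement of `observed` relative to
    `reference` from the peak of their circular cross-correlation
    (a sketch of the processing in misalignment detection unit 32)."""
    ref = reference - reference.mean()
    obs = observed - observed.mean()
    corr = np.fft.ifft2(np.conj(np.fft.fft2(ref)) * np.fft.fft2(obs)).real
    dy, dx = np.unravel_index(np.argmax(corr), corr.shape)
    # map wrap-around peak positions to signed shifts
    h, w = corr.shape
    if dy > h // 2:
        dy -= h
    if dx > w // 2:
        dx -= w
    return int(dy), int(dx)
```

The returned pair corresponds to the position correction data D33: shifting the blood vessel image back by this amount aligns it with the reference position.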
- The authentication processing unit 33 includes a blood vessel extraction unit 33A and a collation unit 33B; the blood vessel image data D32 supplied from the signal separation unit 31 and the position correction data D33 supplied from the misalignment detection processing unit 32 are input to the blood vessel extraction unit 33A.
- The blood vessel extraction unit 33A corrects the position of the blood vessel image based on the blood vessel image data D32 by shifting it by an amount corresponding to the position correction data D33, and removes the noise components from the corrected blood vessel image data D32 by performing median filter processing.
- The blood vessel extraction unit 33A then performs, for example, Laplacian processing on the blood vessel image data D32 from which the noise components have been removed, so that the blood vessel contours in the blood vessel image are extracted in emphasized form, and sends the blood vessel image with the contours thus extracted to the collation unit 33B as authentication information D34.
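The extraction pipeline of unit 33A (position correction, median filtering, Laplacian emphasis) can be sketched as below. Kernel sizes and the wrap-around edge handling are assumptions for brevity; the patent names only the operations, not their parameters.

```python
import numpy as np

def extract_vessels(img, shift):
    """Sketch of blood vessel extraction unit 33A: undo the detected
    displacement, suppress noise with a 3x3 median filter, then
    emphasize vessel contours with a discrete Laplacian."""
    dy, dx = shift
    img = np.roll(img, (-dy, -dx), axis=(0, 1))  # position correction by D33
    # 3x3 median filter, built from the nine shifted copies of the image
    stack = np.stack([np.roll(img, (i, j), axis=(0, 1))
                      for i in (-1, 0, 1) for j in (-1, 0, 1)])
    img = np.median(stack, axis=0)
    # discrete Laplacian to emphasize blood vessel contours
    lap = (np.roll(img, 1, 0) + np.roll(img, -1, 0)
           + np.roll(img, 1, 1) + np.roll(img, -1, 1) - 4 * img)
    return lap
```

The median stage removes isolated speckle (a single bright pixel vanishes entirely), so the Laplacian responds only to extended structures such as vessel walls.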
- The collation unit 33B executes a registration mode or an authentication mode in response to a mode determination signal supplied from an operation unit (not shown).
- In the registration mode, the collation unit 33B registers the authentication information D34 supplied from the blood vessel extraction unit 33A in the registration database DB as registered authentication information D35.
- In the authentication mode, the collation unit 33B calculates the cross-correlation between the blood vessel image of the authentication information D34 supplied from the blood vessel extraction unit 33A and the blood vessel image of the registered authentication information D35 registered in the registration database DB, thereby collating the blood vessel formation patterns of the two blood vessel images.
- If a cross-correlation value higher than a threshold is obtained, the collation unit 33B determines that the person imaged by the hybrid imaging unit 2 at this time is the registrant registered in the registration database DB; otherwise it determines that the person is not the registrant. The determination result is sent to the outside as determination data D36.
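The authentication-mode decision of collation unit 33B can be sketched as a normalized cross-correlation against a threshold. The threshold value used here is an assumption; the patent states only that a threshold comparison is made.

```python
import numpy as np

def is_registrant(candidate, registered, threshold=0.9):
    """Sketch of collation unit 33B: normalized cross-correlation
    between the candidate blood vessel pattern (D34) and a registered
    pattern (D35); True when the correlation exceeds the threshold."""
    a = candidate - candidate.mean()
    b = registered - registered.mean()
    denom = np.sqrt((a * a).sum() * (b * b).sum())
    if denom == 0:
        return False  # a flat image cannot be collated
    return float((a * b).sum() / denom) > threshold
```

Normalizing by the image energies keeps the score in [-1, 1], so the same threshold applies regardless of illumination level.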
- In this way, the authentication processing unit 33 performs the authentication processing on the blood vessel formation pattern inherent inside the living body. Compared with authentication based on a fingerprint formation pattern on the body surface, this not only prevents direct theft from the living body but also prevents impersonation of the registrant by a third party.
- In addition, since the authentication processing unit 33 corrects the positional deviation of the blood vessel image as pre-processing of the authentication processing, an erroneous determination of whether the person is the registrant due to a positional deviation of the finger FG during imaging can be avoided, and a decrease in authentication accuracy (collation accuracy) due to such deviation can be prevented. Furthermore, the authentication processing unit 33 does not detect the deviation from the imaging result of the inside of the finger FG (the blood vessel image), which has relatively many image-quality degradation factors, but corrects it using the position correction data D33 detected from the imaging result with fewer image-quality degradation factors (the finger surface image). The positional deviation of the blood vessel image can therefore be corrected easily and with high accuracy, and a decrease in authentication accuracy (collation accuracy) can be prevented all the more.
- In the above configuration, the authentication apparatus 1 simultaneously irradiates the living body with the first light (visible light) and the second light (near-infrared light) having a wavelength different from that of the first light.
- The first light obtained from the living body (the finger surface projection light (visible light)) is transmitted mainly through the “G” pixel filters of the filter array 22, and the second light (the blood vessel projection light (near-infrared light)) is transmitted through the “R” pixel filters, so that the two are dispersed.
- From the imaging signal S10 output as the imaging result of the image sensor for the first light and the second light thus dispersed, the authentication apparatus 1 separates the first image signal component corresponding to the first light (the finger surface image data D31) and the second image signal component corresponding to the second light (the blood vessel image data D32), executes the first processing (misalignment correction processing) based on the first image signal component, and executes the second processing (authentication processing) based on the second image signal component.
- Therefore, by imaging the imaging targets at the same time and performing different processing on the imaging result, this authentication apparatus 1 makes it unnecessary to image the imaging target twice, and the processing load can be reduced accordingly.
- Moreover, since the first image signal component and the second image signal component are separated by the signal processing system alone, the authentication apparatus 1 avoids both complicated signal processing and control processing of the optical system, so that the processing load at the time of imaging can be reduced further.
- Furthermore, since the authentication apparatus 1 avoids physical switching of the optical system at the time of imaging, it can also be downsized.
- In addition, the authentication apparatus 1 employs visible light as the first light and, as the second light, near-infrared light that differs in wavelength from the first light and has a dependency on the blood vessels inside the living body to be authenticated, and uses these to disperse the finger surface projection light (visible light) obtained from the surface of the living body and the blood vessel projection light (near-infrared light) obtained via the inside of the living body. The authentication apparatus 1 can therefore simultaneously obtain imaging results with different properties in the depth direction of the living body while reducing the processing load during imaging.
- Furthermore, the authentication apparatus 1 detects the positional deviation of the blood vessel image in the second image signal component corresponding to the blood vessel projection light (near-infrared light) using the image signal component corresponding to the finger surface projection light (visible light), and executes the authentication processing based on the second image signal component corrected according to the detection result.
- The authentication apparatus 1 thus does not detect the positional deviation from the second component signal obtained from the inside of the finger FG, which has relatively many image-quality degradation factors, but corrects it using the result detected from the first component signal, which has fewer image-quality degradation factors. The positional deviation of the blood vessel image of the second component signal can therefore be corrected easily and with high accuracy, and as a result a decrease in authentication accuracy can be prevented.
- According to the above configuration, the living body is irradiated with the first light having the first wavelength and the second light having a wavelength different from the first wavelength; from the imaging signal S10 output as the imaging result of the image sensor for the first and second lights obtained from the living body, the first component signal corresponding to the first light and the second component signal corresponding to the second light are separated; and the first processing is executed based on the first component signal while the second processing is executed based on the second component signal. This eliminates the need to image the imaging target twice, so an information processing apparatus that can reduce the processing load at the time of imaging and thereby improve responsiveness can be realized.
- In the above embodiment, the case has been described in which, as the irradiating means for irradiating the living body with a plurality of lights having different wavelengths, visible light and near-infrared light of 900 to 1000 [nm] having a dependency on the blood vessels to be authenticated are used. However, the present invention is not limited to this; for example, a marker specific to a lesion inside the living body may be injected, and a third light having a wavelength different from the visible light and the near-infrared light and having a dependency on that marker may be irradiated in addition.
- In the above embodiment, the case has been described in which the RGB filter array 22 shown in FIG. 4 is applied as the spectroscopic means for dispersing each light obtained from the living body. However, the present invention is not limited to this, and various other filter arrays can be applied.
- For example, a complementary color filter array that color-separates the visible light (finger surface projection light) obtained from the surface of the living body into “Cy”, “Ye”, “Mg” and “G” may be used, and various units can be used as the color spectroscopic unit in such a complementary color filter array.
- In this case, since the pixel filter corresponding to “Mg” is designed to transmit infrared light, there is an advantage that the array can be applied without changing the filter characteristics.
- A filter array 22 configured so that the “R” pixel filter has a characteristic of transmitting the blood-vessel-dependent wavelength region (approximately 900 to 1000 [nm]) is suitable, but a commonly used RGB filter array may also be applied.
- This is because the “R” pixel filter of a common RGB filter array is not strictly configured to cut near-infrared light near the wavelength region corresponding to “R”. The resolution of the blood vessel image data D32 obtained by extracting the pixel data corresponding to “R” from each color spectroscopic unit is then inferior to that of the above-described embodiment, but this is not reflected to any large degree in the authentication processing result. Therefore, even in this case, the same effect as in the above-described embodiment can be obtained.
- Various color spectroscopic units other than the one shown in Fig. 4(A) can also be used in the RGB filter array.
- Alternatively, a filter array in which pixel filters that transmit the visible light, the near-infrared light, or the third light respectively are configured as the color spectroscopic unit can be applied.
- In this case the manufacturing cost increases, but there is an advantage that the plurality of lights irradiated by the irradiating means can be dispersed with high accuracy.
- This is effective in applications in which a specific marker is injected into a lesion inside the living body and a third light having a wavelength different from the visible light and the near-infrared light and having a dependency on the marker is irradiated.
- In the above embodiment, the case has been described in which, as the signal processing means for executing the processing corresponding to each image component separated by the separating means, the positional deviation of the blood vessel image in the second image signal component corresponding to the blood vessel projection light (near-infrared light) is detected based on the first image signal component corresponding to the finger surface projection light (visible light). However, the present invention is not limited to this, and other processing may be executed in the signal processing unit 4.
- For example, the signal processing unit 4 may detect the positional deviation of the blood vessel image in the second image signal component corresponding to the blood vessel projection light (near-infrared light) based on the image signal component corresponding to the finger surface projection light (visible light), and also execute fingerprint collation processing against a pre-registered fingerprint image. Then, only when a determination that the person is the registrant is obtained as the fingerprint collation result does the signal processing unit 4 execute the authentication processing based on the second image signal component corrected according to the detection result. In this way, the authentication accuracy of the authentication apparatus 1 can be improved further.
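This two-stage variation (fingerprint collation gating the blood vessel authentication) can be sketched as follows. The shared normalized-correlation matcher and the threshold value are illustrative assumptions; the patent specifies only the ordering of the two checks.

```python
import numpy as np

def correlate(a, b):
    """Normalized correlation used for both fingerprint and vessel matching."""
    a = a - a.mean()
    b = b - b.mean()
    d = np.sqrt((a * a).sum() * (b * b).sum())
    return float((a * b).sum() / d) if d else 0.0

def authenticate(finger_img, vessel_img, reg_finger, reg_vessel, t=0.9):
    """Hypothetical two-stage check: the blood vessel authentication is
    executed only if the fingerprint image first matches the registered one."""
    if correlate(finger_img, reg_finger) <= t:
        return False  # fingerprint collation failed; skip vessel stage
    return correlate(vessel_img, reg_vessel) > t
```

Requiring both the surface pattern and the internal pattern to match is what raises the overall collation accuracy in this variation.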
- In addition, as described above, when a marker specific to a lesion inside the living body is injected and a third light having a wavelength different from the visible light and the near-infrared light and having a dependency on that marker is irradiated, the signal processing unit 4 may, for example, generate tomographic image data based on the third light. The signal processing unit 4 then executes the authentication processing based on the second image signal component corrected according to the positional deviation detection result in the same manner as in the above-described embodiment, and, when the registrant determination is obtained as the authentication processing result, executes processing such as registering the tomographic image data in a database or displaying it on a display unit.
- In this way, the signal processing means can select, according to the application, the processing corresponding to each image component separated by the separating means, and execute those processes accordingly.
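- That per-application selection of processing for each separated component could be modeled as a dispatch table. The handler names and applications below are hypothetical, purely for illustration; they are not taken from the patent:

```python
from typing import Callable, Dict

import numpy as np

Handler = Callable[[np.ndarray], str]

def match_fingerprint(component: np.ndarray) -> str:
    # Placeholder for fingerprint matching on the visible-light component.
    return "fingerprint"

def authenticate_vessels(component: np.ndarray) -> str:
    # Placeholder for blood vessel authentication on the near-infrared component.
    return "blood_vessel"

def build_tomogram(component: np.ndarray) -> str:
    # Placeholder for tomographic imaging from the marker-excited third light.
    return "tomogram"

# Each application selects which processing runs on which separated component.
APPLICATIONS: Dict[str, Dict[str, Handler]] = {
    "authentication": {
        "visible": match_fingerprint,
        "near_infrared": authenticate_vessels,
    },
    "diagnosis": {
        "near_infrared": authenticate_vessels,
        "third_light": build_tomogram,
    },
}

def process(application: str, components: Dict[str, np.ndarray]) -> Dict[str, str]:
    """Run the application's selected handler on each separated component it uses."""
    handlers = APPLICATIONS[application]
    return {name: handlers[name](img)
            for name, img in components.items() if name in handlers}
```

- The point of the table is that the separation stage stays fixed while the downstream processing per component is swapped per application, mirroring the flexibility the passage describes.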
- Furthermore, in the above-described embodiment, the case has been described in which the imaging unit 2, which uses a CMOS (Complementary Metal Oxide Semiconductor) image sensor, irradiates near-infrared light from the finger pad side of the finger FG and images the blood vessel projection light obtained from the finger pad side after passing through the inside of the finger FG. However, the present invention is not limited to this, and an imaging unit that irradiates near-infrared light from the finger back side of the finger FG and images the blood vessel projection light obtained from the finger pad side after passing through the inside of the finger FG may be applied. Even when such an imaging unit is applied, the same effects as those of the above-described embodiment can be obtained.
- Furthermore, although the above-described embodiment applies the imaging unit 2 having the configuration shown in FIG. 1 and FIG. 2, the present invention is not limited to this, and imaging units of other configurations may be adopted.

Industrial applicability

- The present invention can be used, for example, when observing an imaging target from multiple angles.
Priority Applications (5)
Application Number | Priority Date | Filing Date | Title |
---|---|---|---|
KR1020067006886A KR101159142B1 (ko) | 2004-09-02 | 2005-08-22 | 정보 처리 장치 |
BRPI0506170 BRPI0506170A (pt) | 2004-09-02 | 2005-08-22 | aparelho de processamento de imagem |
EP05775162.0A EP1785937B1 (en) | 2004-09-02 | 2005-08-22 | Information processing device |
US10/577,728 US7634116B2 (en) | 2004-09-02 | 2005-08-22 | Information processing device |
HK07101566A HK1096751A1 (en) | 2004-09-02 | 2007-02-09 | Information processing device |
Applications Claiming Priority (2)
Application Number | Priority Date | Filing Date | Title |
---|---|---|---|
JP2004256188A JP4556111B2 (ja) | 2004-09-02 | 2004-09-02 | 情報処理装置 |
JP2004-256188 | 2004-09-02 |
Publications (1)
Publication Number | Publication Date |
---|---|
WO2006025289A1 true WO2006025289A1 (ja) | 2006-03-09 |
Family
ID=35999943
Family Applications (1)
Application Number | Title | Priority Date | Filing Date |
---|---|---|---|
PCT/JP2005/015595 WO2006025289A1 (ja) | 2004-09-02 | 2005-08-22 | 情報処理装置 |
Country Status (9)
Country | Link |
---|---|
US (1) | US7634116B2 (ja) |
EP (1) | EP1785937B1 (ja) |
JP (1) | JP4556111B2 (ja) |
KR (1) | KR101159142B1 (ja) |
CN (1) | CN100478989C (ja) |
BR (1) | BRPI0506170A (ja) |
HK (1) | HK1096751A1 (ja) |
RU (1) | RU2328035C2 (ja) |
WO (1) | WO2006025289A1 (ja) |
Families Citing this family (41)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
US7460696B2 (en) | 2004-06-01 | 2008-12-02 | Lumidigm, Inc. | Multispectral imaging biometrics |
US7751594B2 (en) | 2003-04-04 | 2010-07-06 | Lumidigm, Inc. | White-light spectral biometric sensors |
US8229185B2 (en) * | 2004-06-01 | 2012-07-24 | Lumidigm, Inc. | Hygienic biometric sensors |
US8787630B2 (en) * | 2004-08-11 | 2014-07-22 | Lumidigm, Inc. | Multispectral barcode imaging |
CN100577102C (zh) | 2004-11-15 | 2010-01-06 | 日本电气株式会社 | 生物体特征输入装置 |
JP4864511B2 (ja) * | 2006-03-31 | 2012-02-01 | 富士フイルム株式会社 | 電子内視鏡装置およびプログラム |
JP4182987B2 (ja) | 2006-04-28 | 2008-11-19 | 日本電気株式会社 | 画像読取装置 |
JP5015496B2 (ja) * | 2006-06-01 | 2012-08-29 | ルネサスエレクトロニクス株式会社 | 固体撮像装置、撮像方法および撮像システム |
US7995808B2 (en) | 2006-07-19 | 2011-08-09 | Lumidigm, Inc. | Contactless multispectral biometric capture |
US8175346B2 (en) * | 2006-07-19 | 2012-05-08 | Lumidigm, Inc. | Whole-hand multispectral biometric imaging |
US8355545B2 (en) * | 2007-04-10 | 2013-01-15 | Lumidigm, Inc. | Biometric detection using spatial, temporal, and/or spectral techniques |
JP4969206B2 (ja) * | 2006-11-01 | 2012-07-04 | 京セラ株式会社 | 生体認証装置 |
US8027519B2 (en) * | 2006-12-13 | 2011-09-27 | Hitachi Maxwell, Ltd. | Imaging module for biometrics authentication, biometrics authentication apparatus and prism |
JP2008198083A (ja) * | 2007-02-15 | 2008-08-28 | Mitsubishi Electric Corp | 個人識別装置 |
FR2913788B1 (fr) * | 2007-03-14 | 2009-07-03 | Sagem Defense Securite | Procede et installation d'identification d'un individu par capture optique d'une image d'une empreinte corporelle |
KR101484566B1 (ko) * | 2007-03-21 | 2015-01-20 | 루미다임 인크. | 국소적으로 일관된 피처를 기초로 하는 생체인식 |
JP5050644B2 (ja) * | 2007-05-15 | 2012-10-17 | ソニー株式会社 | 登録装置、照合装置、プログラム及びデータ構造 |
JP5050642B2 (ja) * | 2007-05-15 | 2012-10-17 | ソニー株式会社 | 登録装置、照合装置、プログラム及びデータ構造 |
JP5034713B2 (ja) * | 2007-06-28 | 2012-09-26 | 株式会社日立製作所 | 指静脈認証装置および情報処理装置 |
JP4910923B2 (ja) * | 2007-07-20 | 2012-04-04 | ソニー株式会社 | 撮像装置、撮像方法及び撮像プログラム |
US7787112B2 (en) * | 2007-10-22 | 2010-08-31 | Visiongate, Inc. | Depth of field extension for optical tomography |
US20090159786A1 (en) * | 2007-12-19 | 2009-06-25 | Sony Corporation | Display apparatus and illumination apparatus |
JP5186929B2 (ja) * | 2008-01-21 | 2013-04-24 | 日本電気株式会社 | 認証用撮像装置 |
US20100246902A1 (en) * | 2009-02-26 | 2010-09-30 | Lumidigm, Inc. | Method and apparatus to combine biometric sensing and other functionality |
BR112012004177A2 (pt) * | 2009-08-26 | 2016-03-29 | Lumidigm Inc | método e sistema biométrico, sistema, método, métodos de localização de objeto, e de discriminação de objeto e de segundo plano, e, prisma multifacetado |
EP2500863B1 (en) * | 2009-11-10 | 2023-09-13 | Nec Corporation | Fake-finger determination device, fake-finger determination method and fake-finger determination program |
US8570149B2 (en) | 2010-03-16 | 2013-10-29 | Lumidigm, Inc. | Biometric imaging using an optical adaptive interface |
JP2011197786A (ja) * | 2010-03-17 | 2011-10-06 | Sony Corp | 情報処理装置および情報処理方法 |
JP5435746B2 (ja) * | 2011-01-24 | 2014-03-05 | 富士フイルム株式会社 | 内視鏡装置 |
BR112013019253B1 (pt) * | 2011-01-27 | 2021-06-29 | Lynxrail Corporation | Sistema de visão de máquina para a extração de descontinuidade de profundidade de imagem |
KR101517371B1 (ko) * | 2012-03-16 | 2015-05-04 | 유니버셜 로봇 가부시키가이샤 | 개인인증방법 및 개인인증장치 |
TWI536272B (zh) | 2012-09-27 | 2016-06-01 | 光環科技股份有限公司 | 生物辨識裝置及方法 |
US10229257B2 (en) * | 2013-01-31 | 2019-03-12 | Nec Corporation | Authentication apparatus, prism member for authentication, and authentication method |
CN103279733A (zh) * | 2013-03-19 | 2013-09-04 | 陈威霖 | 可提高辨识成功率的手指静脉辨识装置 |
US10254855B2 (en) | 2013-06-04 | 2019-04-09 | Wen-Chieh Geoffrey Lee | High resolution and high sensitivity three-dimensional (3D) cursor maneuvering device |
JP2013225324A (ja) * | 2013-06-12 | 2013-10-31 | Hitachi Ltd | 個人認証装置、画像処理装置、端末、及びシステム |
FR3049089B1 (fr) * | 2016-03-21 | 2018-02-16 | Sebastien Jean Serge Dupont | Procede permettant de gerer les validations des messages relatifs a une chaine de messages de facon unitaire a travers un reseau de validation decentralise |
FR3049090B1 (fr) * | 2016-03-21 | 2021-06-25 | Sebastien Jean Serge Dupont | Dispositif d'authentification biometrique adaptatif par echographie, photographies en lumiere visible de contraste et infrarouge, sans divulgation, a travers un reseau informatique decentralise |
DE102016213111B4 (de) * | 2016-07-19 | 2018-08-09 | Koenig & Bauer Ag | Inspektionssystem mit mehreren Erfassungsbereichen |
EP3657381B1 (en) * | 2018-09-25 | 2022-08-17 | Shenzhen Goodix Technology Co., Ltd. | Fingerprint recognition apparatus and method, and terminal device |
JP7519871B2 (ja) | 2020-10-21 | 2024-07-22 | 株式会社日立製作所 | 生体認証装置および生体認証方法 |
Citations (4)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
JPH11128176A (ja) * | 1997-10-29 | 1999-05-18 | Hitachi Ltd | 生体光計測装置 |
JP2003075135A (ja) * | 2001-08-31 | 2003-03-12 | Nec Corp | 指紋画像入力装置および指紋画像による生体識別方法 |
JP2003303178A (ja) * | 2002-04-12 | 2003-10-24 | Nec Corp | 個人識別システム |
JP2004054698A (ja) * | 2002-07-22 | 2004-02-19 | Io Network:Kk | 個人識別装置 |
Family Cites Families (16)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
JP2565168B2 (ja) * | 1986-07-16 | 1996-12-18 | ソニー株式会社 | 撮像装置 |
JP2655571B2 (ja) * | 1986-12-27 | 1997-09-24 | オリンパス光学工業株式会社 | 撮像装置 |
US5246002A (en) * | 1992-02-11 | 1993-09-21 | Physio-Control Corporation | Noise insensitive pulse transmittance oximeter |
US5726443A (en) * | 1996-01-18 | 1998-03-10 | Chapman Glenn H | Vision system and proximity detector |
JPH10289304A (ja) * | 1997-02-12 | 1998-10-27 | Nec Corp | 指紋画像入力装置 |
JP3869545B2 (ja) * | 1998-01-19 | 2007-01-17 | 株式会社日立製作所 | 指の特徴パターン特徴検出装置及び個人識別装置 |
JP3658227B2 (ja) * | 1999-01-20 | 2005-06-08 | シャープ株式会社 | 画像読取装置 |
US6292576B1 (en) * | 2000-02-29 | 2001-09-18 | Digital Persona, Inc. | Method and apparatus for distinguishing a human finger from a reproduction of a fingerprint |
AU2001241925A1 (en) * | 2000-02-29 | 2001-09-12 | Digitalpersona, Inc. | Method and apparatus for detecting a color change of a live finger |
JP3558025B2 (ja) * | 2000-09-06 | 2004-08-25 | 株式会社日立製作所 | 個人認証装置及び方法 |
JP3396680B2 (ja) * | 2001-02-26 | 2003-04-14 | バイオニクス株式会社 | 生体認証装置 |
JP2003006627A (ja) * | 2001-06-18 | 2003-01-10 | Nec Corp | 指紋入力装置 |
JP3617476B2 (ja) * | 2001-07-19 | 2005-02-02 | 株式会社日立製作所 | 指認証装置 |
JP2003050993A (ja) * | 2001-08-06 | 2003-02-21 | Omron Corp | 指紋読取方法および指紋読取装置 |
JP3751872B2 (ja) * | 2001-10-30 | 2006-03-01 | 日本電気株式会社 | 指紋入力装置 |
EP1353292B1 (en) * | 2002-04-12 | 2011-10-26 | STMicroelectronics (Research & Development) Limited | Biometric sensor apparatus and methods |
- 2004
  - 2004-09-02 JP JP2004256188A patent/JP4556111B2/ja not_active Expired - Fee Related
- 2005
  - 2005-08-22 WO PCT/JP2005/015595 patent/WO2006025289A1/ja active Application Filing
  - 2005-08-22 US US10/577,728 patent/US7634116B2/en not_active Expired - Fee Related
  - 2005-08-22 CN CNB200580001260XA patent/CN100478989C/zh not_active Expired - Fee Related
  - 2005-08-22 RU RU2006114755A patent/RU2328035C2/ru not_active IP Right Cessation
  - 2005-08-22 BR BRPI0506170 patent/BRPI0506170A/pt not_active IP Right Cessation
  - 2005-08-22 KR KR1020067006886A patent/KR101159142B1/ko not_active IP Right Cessation
  - 2005-08-22 EP EP05775162.0A patent/EP1785937B1/en not_active Ceased
- 2007
  - 2007-02-09 HK HK07101566A patent/HK1096751A1/xx not_active IP Right Cessation
Non-Patent Citations (1)
Title |
---|
See also references of EP1785937A4 * |
Also Published As
Publication number | Publication date |
---|---|
JP4556111B2 (ja) | 2010-10-06 |
CN1879127A (zh) | 2006-12-13 |
HK1096751A1 (en) | 2007-06-08 |
KR20070050860A (ko) | 2007-05-16 |
RU2328035C2 (ru) | 2008-06-27 |
CN100478989C (zh) | 2009-04-15 |
EP1785937A1 (en) | 2007-05-16 |
RU2006114755A (ru) | 2007-11-20 |
BRPI0506170A (pt) | 2006-10-31 |
EP1785937B1 (en) | 2015-05-06 |
KR101159142B1 (ko) | 2012-06-22 |
US20070014437A1 (en) | 2007-01-18 |
US7634116B2 (en) | 2009-12-15 |
EP1785937A4 (en) | 2012-03-21 |
JP2006072764A (ja) | 2006-03-16 |
Similar Documents
Publication | Publication Date | Title |
---|---|---|
WO2006025289A1 (ja) | 情報処理装置 | |
KR101349892B1 (ko) | 다중 생체인식 다중 스펙트럼 이미저 | |
JP4745084B2 (ja) | 撮像装置 | |
JP4636140B2 (ja) | 静脈撮像装置、静脈撮像方法および静脈認証装置 | |
US20110200237A1 (en) | Pattern matching device and pattern matching method | |
WO2017187718A1 (ja) | 撮像装置、認証処理装置、撮像方法、認証処理方法およびプログラム | |
CN107028602B (zh) | 生物体信息测定装置、生物体信息测定方法以及记录介质 | |
KR102560710B1 (ko) | 광학적 스펙클을 이용하는 장치 및 방법 | |
TWI403960B (zh) | 用來鑑定使用者之成像設備及方法 | |
US20090214083A1 (en) | Vein authentication device and vein authentication method | |
US7835546B2 (en) | Pseudorandom number generation apparatus, pseudorandom number generation method and program | |
JP4281272B2 (ja) | 指紋画像撮像方法、指紋画像取得方法、指紋画像撮像装置および個人識別装置 | |
JP4708232B2 (ja) | 撮像装置 | |
JP5229489B2 (ja) | 生体認証装置 | |
JP3788043B2 (ja) | 指紋像入力装置 | |
JP2006288872A (ja) | 血管画像入力装置、血管画像構成方法、およびこれらを用いた個人認証システム | |
US10726283B2 (en) | Finger vein authentication device | |
CN219960739U (zh) | 图像传感器、摄像头模组及识别设备 | |
JP5229490B2 (ja) | 生体認証装置 | |
JP2007219624A (ja) | 血管画像入力装置、及び個人認証システム | |
JP2008097328A (ja) | 画像入力装置、個人認証装置及び電子機器 | |
JP4626801B2 (ja) | 撮像装置 | |
JP2007133656A (ja) | 指紋照合装置 | |
EP4239586A1 (en) | Photographing apparatus and authentication apparatus | |
US20240161536A1 (en) | Biometric information acquiring apparatus |
Legal Events
Date | Code | Title | Description |
---|---|---|---|
WWE | Wipo information: entry into national phase |
Ref document number: 200580001260.X Country of ref document: CN |
|
AK | Designated states |
Kind code of ref document: A1 Designated state(s): AE AG AL AM AT AU AZ BA BB BG BR BW BY BZ CA CH CN CO CR CU CZ DE DK DM DZ EC EE EG ES FI GB GD GE GH GM HR HU ID IL IN IS KE KG KM KP KR KZ LC LK LR LS LT LU LV MA MD MG MK MN MW MX MZ NA NG NI NO NZ OM PG PH PL PT RO RU SC SD SE SG SK SL SM SY TJ TM TN TR TT TZ UA UG US UZ VC VN YU ZA ZM ZW |
|
AL | Designated countries for regional patents |
Kind code of ref document: A1 Designated state(s): GM KE LS MW MZ NA SD SL SZ TZ UG ZM ZW AM AZ BY KG KZ MD RU TJ TM AT BE BG CH CY CZ DE DK EE ES FI FR GB GR HU IE IS IT LT LU LV MC NL PL PT RO SE SI SK TR BF BJ CF CG CI CM GA GN GQ GW ML MR NE SN TD TG |
|
WWE | Wipo information: entry into national phase |
Ref document number: 1020067006886 Country of ref document: KR |
|
WWE | Wipo information: entry into national phase |
Ref document number: 2005775162 Country of ref document: EP |
|
WWE | Wipo information: entry into national phase |
Ref document number: 2006114755 Country of ref document: RU |
|
WWE | Wipo information: entry into national phase |
Ref document number: 2007014437 Country of ref document: US Ref document number: 10577728 Country of ref document: US |
|
121 | Ep: the epo has been informed by wipo that ep was designated in this application | ||
ENP | Entry into the national phase |
Ref document number: PI0506170 Country of ref document: BR |
|
WWP | Wipo information: published in national office |
Ref document number: 10577728 Country of ref document: US |
|
NENP | Non-entry into the national phase |
Ref country code: DE |
|
WWP | Wipo information: published in national office |
Ref document number: 2005775162 Country of ref document: EP |