WO2013137078A1 - Method for authenticating an individual and device for authenticating individuals - Google Patents

Method for authenticating an individual and device for authenticating individuals

Info

Publication number
WO2013137078A1
Authority
WO
WIPO (PCT)
Prior art keywords
signal
color space
data
image
template image
Prior art date
Application number
PCT/JP2013/056122
Other languages
English (en)
Japanese (ja)
Inventor
岩田英三郎
Original Assignee
Iwata Eizaburo
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by Iwata Eizaburo filed Critical Iwata Eizaburo
Publication of WO2013137078A1

Classifications

    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06VIMAGE OR VIDEO RECOGNITION OR UNDERSTANDING
    • G06V10/00Arrangements for image or video recognition or understanding
    • G06V10/10Image acquisition
    • G06V10/12Details of acquisition arrangements; Constructional details thereof
    • G06V10/14Optical characteristics of the device performing the acquisition or on the illumination arrangements
    • G06V10/143Sensing or illuminating at different wavelengths
    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06VIMAGE OR VIDEO RECOGNITION OR UNDERSTANDING
    • G06V10/00Arrangements for image or video recognition or understanding
    • G06V10/40Extraction of image or video features
    • G06V10/56Extraction of image or video features relating to colour
    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06VIMAGE OR VIDEO RECOGNITION OR UNDERSTANDING
    • G06V40/00Recognition of biometric, human-related or animal-related patterns in image or video data
    • G06V40/10Human or animal bodies, e.g. vehicle occupants or pedestrians; Body parts, e.g. hands
    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06VIMAGE OR VIDEO RECOGNITION OR UNDERSTANDING
    • G06V40/00Recognition of biometric, human-related or animal-related patterns in image or video data
    • G06V40/10Human or animal bodies, e.g. vehicle occupants or pedestrians; Body parts, e.g. hands
    • G06V40/14Vascular patterns

Definitions

  • the present invention relates to a personal authentication method and a personal authentication device. More specifically, the present invention relates to a technique for performing authentication using a template image acquired in advance and a target image to be authenticated.
  • NIR: near infrared.
  • NIR light is applied to the wrist of the subject, and a wrist image is taken in this state. Since NIR light is transmitted easily through the living body and is strongly absorbed in the vein portions, an image showing the veins can be obtained by photographing with a camera capable of capturing NIR light. This image is collated with a template image acquired in the same manner in advance, and an individual can be authenticated by judging whether or not the two images match. Authentication using vein images has the advantage that impersonation is difficult.
  • However, an NIR light source and a camera for it are not in general use. Further, in an authentication device using such special equipment, the SN ratio deteriorates when light in the visible region mixes with the reflected light, making personal authentication difficult. If the camera (visible light camera) provided in a general mobile phone could perform accurate authentication, no special equipment would be required, so personal authentication could spread further.
  • Mounting the above-described NIR light source and its camera on a general mobile phone leads to an increase in cost and weight, and cannot completely block the intrusion of visible light present in the environment. It is therefore difficult in practice.
  • Patent Document 1 described above describes a biometric authentication device that positions a body using visible light. However, this technique only uses visible light for positioning, and does not use an image obtained with visible light for personal authentication.
  • An object of the present invention is to provide means capable of authenticating an individual even when using visible light and a camera therefor.
  • One aspect of the present invention is a personal authentication device that uses an image of the vein portion appearing on the surface of a human body, comprising: a light source that irradiates the human body surface with light having a wavelength in the visible light region; an imaging unit that obtains a reflected image composed of the light irradiated from the light source and reflected by the human body surface; a target image processing unit that, at the time of authentication, acquires a first color signal in a first color space and a second color signal in a second color space from data corresponding to the reflected image and extracts features of the target image data; and a collation unit that collates the target image data with the template image data. When the template image is registered, the features of the template image data are extracted by acquiring the first color signal in the first color space and the second color signal in the second color space from data corresponding to the reflected image.
  • The first color signal in the first color space is a cyan signal in the CMYK color space, and the second color signal in the second color space is an R signal and a B signal in the RGB color space.
  • the apparatus obtains a cyan signal in the CMYK color space and an R signal and a B signal in the RGB color space from data corresponding to the reflected image when registering the template image.
  • The apparatus further includes a template image processing unit that extracts features of the data of the template image.
  • The RGB color space data used in the template image processing unit and the target image processing unit is obtained by converting data corresponding to the reflected image into an HSV color space, changing the phase of the H signal and the intensity of the S signal, and then converting back from the HSV color space to the RGB color space.
  • The template image processing unit is configured to extract the features of the template image data by further using a magenta signal or a yellow signal in the CMYK color space, and the target image processing unit likewise extracts the features of the target image data by further using a magenta signal or a yellow signal in the CMYK color space.
  • The image of the vein portion appearing on the surface of the human body is photographed in a non-contact manner by an ordinary photographing device (that is, one capable of photographing visible light). Continuous shooting is also possible.
  • the imaged part is preferably a part where veins are concentrated in a shallow part under the skin. For example, it is a part of the wrist where the veins can be seen to be raised, but in addition, a part such as the back of the hand where the veins are easily visible is suitable.
  • Grayscale data can be obtained from the cyan signal in the CMYK color space and the R signal and the B signal in the RGB color space by the following equation (1), for example.
  • GP = abs(α1*R − α2*B − α3*C)  (1)
    where
    GP: grayscale data obtained from the cyan, R, and B signal values
    abs(): absolute value
    R: value of the R signal in the RGB color space
    B: value of the B signal in the RGB color space
    C: value of the cyan signal in the CMYK color space
    α1, α2, α3: coefficients
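As a concrete illustration of equation (1), the sketch below computes the grayscale feature for a single pixel. The naive RGB-to-CMYK conversion used to derive the cyan value, and the unit coefficients, are assumptions for illustration only; the patent determines the coefficients experimentally.

```python
# Minimal sketch of equation (1): a grayscale feature value per pixel,
# combining the R and B signals (RGB space) with the cyan signal (CMYK space).
# The RGB->CMYK conversion and the default coefficients are illustrative
# assumptions, not taken from the patent.

def cyan_from_rgb(r, g, b):
    """Naive RGB -> CMYK cyan channel (0..1)."""
    k = 1.0 - max(r, g, b) / 255.0          # black component
    if k >= 1.0:
        return 0.0
    return (1.0 - r / 255.0 - k) / (1.0 - k)

def grayscale_feature(r, g, b, a1=1.0, a2=1.0, a3=1.0):
    """GP = abs(a1*R - a2*B - a3*C), clamped to an 8-bit range."""
    c = cyan_from_rgb(r, g, b) * 255.0      # put cyan on the same 0..255 scale
    gp = abs(a1 * r - a2 * b - a3 * c)
    return min(255, int(gp))

# Example: a reddish skin-tone pixel vs. a bluish vein-like pixel
print(grayscale_feature(200, 120, 110))
print(grayscale_feature(140, 120, 160))
```

Bluish pixels yield a different feature value than reddish ones, which is the separation the equation relies on.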
  • the light source may be an artificial light source having a wavelength in the visible light region.
  • The light source is, for example, a white LED, which expresses white by any one of: (1) blue LED + green LED + red LED, (2) blue LED + phosphor, or (3) near-ultraviolet LED + phosphor.
  • The artificial light source or white LED described above may be mixed with ambient light (available light) such as natural light or indoor lighting.
  • This apparatus includes a light source 1, an imaging unit 2, a template image processing unit 3, a target image processing unit 4, a storage unit 5, and a matching unit 6.
  • the imaging unit 2 has a function of acquiring image data.
  • Such an imaging unit 2 can be configured by an appropriate device such as a digital camera or an image scanner.
  • the imaging unit 2 acquires a digital image.
  • the imaging unit 2 can capture and record an image composed of light in the visible light region.
  • the storage unit 5 is configured by an appropriate device capable of recording digital data, such as a hard disk, an optical disk, a magneto-optical disk, and a semiconductor memory.
  • The storage unit 5 includes a template storage unit 51 that records template data and a target image storage unit 52 that records target image data. These storage units 51 and 52 are separated into different blocks for convenience of description, but they may be configured by physically the same storage device or by different storage devices.
  • the light source 1 is composed of a light emitter having a wavelength in the visible light region.
  • the light of the mobile phone can be used as a light source.
  • the effect of the light source will be described in detail in the authentication method in the embodiment described later.
  • the light source can basically be sunlight or ambient light.
  • the accuracy of authentication can be improved by using the light source as artificial light and accurately grasping the wavelength range of the irradiated light.
  • The template image processing unit 3 is configured to acquire template image data from the cyan signal in the CMYK color space and the R signal and the B signal in the RGB color space, and to extract the features of the template image data.
  • Such a template image processing unit 3 can be realized by a combination of computer software and computer hardware, for example. The detailed operation of the template image processing unit 3 will be described in the authentication method of the embodiment described later.
  • The target image processing unit 4 is configured to acquire the data of the target image from the cyan signal in the CMYK color space and the R signal and the B signal in the RGB color space, and to extract the features of the target image data.
  • a target image processing unit 4 can be realized by a combination of computer software and computer hardware, for example. Detailed operation of the target image processing unit 4 will be described in detail in an authentication method in an embodiment described later.
  • the collation unit 6 is configured to perform authentication by collating both images using the feature of the template image data and the feature of the target image data.
  • Such a collation unit 6 can be realized by a combination of computer software and computer hardware, for example. The detailed operation of the collation unit 6 will be described in the authentication method of the embodiment described later.
  • FIG. 1 An example of a hardware configuration for realizing the authentication apparatus according to the present embodiment is shown in FIG. This configuration example includes a lighting device 7, an image input device 8, a CPU 9, a storage device 10, and a communication path 11.
  • the illuminating device 7 is hardware corresponding to the light source 1, and can be constituted by, for example, an LED that emits light in the visible light region.
  • The illuminating device 7 can be a white LED of the kind with which mobile telephones and similar devices are equipped.
  • White LEDs include: (1) those that express white by emitting blue, green, and red LEDs, the three primary colors of light; (2) those that express white by applying the blue light emitted by a blue LED to a yellow phosphor; and (3) those that express white by applying the light emitted from near-ultraviolet LEDs to blue, green, and red phosphors. Any of these types can be used.
  • the image input device 8 is hardware corresponding to the imaging unit 2 described above, and includes, for example, a camera or a scanner.
  • the CPU 9 is, for example, a microcomputer or a general-purpose processor.
  • the CPU 9 performs a predetermined operation according to the computer program stored in the storage device 10.
  • the storage device 10 is hardware corresponding to the storage unit 5 described above, and can be configured by, for example, a magnetic recording device, an optical storage device, a semiconductor memory, or the like.
  • the communication path 11 enables data exchange between elements, and is a bus line, for example.
  • The communication path 11 may be a wireless communication path, and its physical configuration is not limited. Further, the protocol used for communication is not particularly limited.
  • A hardware circuit designed for image processing can also be used to implement part of, or in some cases all of, the functions of the computer program.
  • Step SA-1 in FIG. 3 First, after obtaining a template image, the image is processed to generate data necessary for authentication.
  • the number of template images is determined according to the purpose of use.
  • a plurality of template images are often prepared for each person. It is usual to prepare template images for a plurality of people.
  • Step SA-2 in FIG. 3 Next, the target image is acquired. Thereafter, as in the case of the template image, the target image is processed to generate data necessary for authentication.
  • Step SA-3 in FIG. 3 Next, authentication is performed by comparing the target image with the template image.
  • the processing is performed in the order of template image processing, target image processing, and authentication (collation), but each processing may be performed in whole or in part in parallel. The order of each process can be appropriately set so as to suit the purpose of use.
  • a template image (that is, a template original image) is acquired by the imaging unit 2.
  • the acquired original image is stored in the template storage unit 51.
  • the number of original images to be acquired is determined according to the purpose of use. In the following, a single original image will be described as an example, but in principle, the same processing is performed for each original image.
  • a vein image of a person's wrist part (that is, an image of a vein part appearing on the surface of a human body) can be used.
  • This original image can be acquired by a normal visible light camera under a light source in the visible light range. In this embodiment, this image is acquired and stored as data in the RGB color space.
  • the color space of the image acquired in hardware by the imaging unit 2 need not be RGB. Rather, in general, there are many devices that acquire data in the YUV color space by hardware. In this case, for example, data in the YUV color space can be converted by software to generate RGB color space data, which can be used for subsequent calculations.
  • The imaging unit 2 may be configured to acquire RGB color space data in hardware. Note that the RGB color space and the YUV color space are mutually convertible.
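A minimal sketch of the software conversion mentioned above is given below, assuming the device emits 8-bit Y'CbCr with BT.601 coefficients (one common YUV variant; devices differ, so the coefficient set is an assumption, not something specified by the patent).

```python
# Sketch: YUV (8-bit Y'CbCr, BT.601 coefficients assumed) -> RGB, so that
# subsequent processing can work in the RGB color space.

def yuv_to_rgb(y, u, v):
    """8-bit Y'CbCr (BT.601, 128-centered chroma) -> clamped 8-bit RGB."""
    r = y + 1.402 * (v - 128)
    g = y - 0.344136 * (u - 128) - 0.714136 * (v - 128)
    b = y + 1.772 * (u - 128)
    clamp = lambda x: max(0, min(255, int(round(x))))
    return clamp(r), clamp(g), clamp(b)

print(yuv_to_rgb(128, 128, 128))   # neutral gray maps to equal R, G, B
```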
  • Step SB-2 in FIG. 4 the template image processing unit 3 converts the RGB color space data acquired by the imaging unit 2 to generate, for example, a bitmap image, and further converts it into a grayscale image for extracting data features.
  • the R signal and the B signal in the RGB value of each pixel on the image are HSV converted and mapped onto the hue circle.
  • the R signal value and the B signal value mapped on the hue circle (that is, the phase of the hue H in the HSV space) are moved by appropriately set values.
  • For example, the phase of H is shifted by +115°.
  • the width of the hue shift (that is, the width of change) is determined by experiment.
  • the intensity (magnitude) of the saturation (value of S) in the HSV space is changed to an appropriately set value.
  • the value of S is increased by 30% in the positive direction.
  • the value of this saturation change is also determined by experiment.
  • The image data in the RGB color space can be converted into the HSV space by the standard RGB-to-HSV conversion formulas.
  • The R′G′B′ space data obtained in this way is regarded as RGB space data, and the subsequent processing is performed. That is, with the R signal value in this space taken as R1 and the B signal value as B1, the feature is extracted by taking the difference from the cyan signal in the CMYK space.
  • GP1 = abs(α1*R1 − α2*B1 − α3*C)  (1)
    where
    GP1: grayscale data obtained from the cyan, R1, and B1 signal values
    abs(): absolute value
    R1: value obtained by converting the R signal in the RGB color space to HSV, moving the hue, and changing the saturation
    B1: value obtained by converting the B signal in the RGB color space to HSV, moving the hue, and changing the saturation
    C: value of the cyan signal in the CMYK color space
    α1, α2, α3: coefficients
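The hue and saturation adjustment described in the steps above can be sketched per pixel with Python's standard `colorsys` module. The +115° hue shift and 30% saturation gain are the example values given in the embodiment; the text notes they are determined experimentally.

```python
import colorsys

# Sketch: convert an RGB pixel to HSV, rotate the hue on the hue circle,
# boost the saturation, and convert back to RGB (the R'G'B' data).

def shift_hue_saturation(r, g, b, hue_shift_deg=115.0, sat_gain=1.3):
    """Return (R', G', B') after hue rotation and saturation increase."""
    h, s, v = colorsys.rgb_to_hsv(r / 255.0, g / 255.0, b / 255.0)
    h = (h + hue_shift_deg / 360.0) % 1.0   # rotate hue (wraps on the circle)
    s = min(1.0, s * sat_gain)              # increase saturation, clamp to 1
    r2, g2, b2 = colorsys.hsv_to_rgb(h, s, v)
    return (int(round(r2 * 255)), int(round(g2 * 255)), int(round(b2 * 255)))

print(shift_hue_saturation(180, 140, 130))
```

A neutral gray pixel (zero saturation) is unchanged by the rotation, which is a quick sanity check on the conversion.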
  • the image data GP is, for example, an 8-bit grayscale image.
  • The G signal in the RGB color space, and the magenta signal and the yellow signal in the CMYK color space, can additionally be used.
  • the magenta signal can be added to the feature quantity as in the following equation.
  • GP2 = abs(α1*R + α2*M − α3*B − α4*C)  (2)
    where
    GP2: grayscale data obtained from the cyan, magenta, R, and B signal values
    abs(): absolute value
    R: value of the R signal in the RGB color space
    M: value of the magenta signal in the CMYK color space
    B: value of the B signal in the RGB color space
    C: value of the cyan signal in the CMYK color space
    α1, α2, α3, α4: coefficients
  • GP3 = abs(α1*R − α2*B − α3*C ± β1*M ± β2*Y)  (3)
    where
    GP3: grayscale data obtained from the values of the cyan signal, magenta signal, yellow signal, R signal, and B signal
    abs(): absolute value
    R: value of the R signal in the RGB color space
    B: value of the B signal in the RGB color space
    C: value of the cyan signal in the CMYK color space
    M: value of the magenta signal in the CMYK color space
    Y: value of the yellow signal in the CMYK color space
    α1, α2, α3: coefficients
    β1, β2: coefficients (however, the coefficient range is preferably
  • In the above description, the RGB color space is used directly, but a color space that can be converted to the RGB color space (for example, CMYK, HSV, YCbCr, YIQ, Luv, Lab, or XYZ) may be used in place of the RGB color space to extract the features of the data in the template image or the target image. That is, data in the RGB space and data in a convertible color space can be converted by a predetermined mathematical formula, so the description above also applies, via a predetermined data conversion, to the case where data in a color space other than RGB are used. Therefore, representing the features of an image using data obtained by mapping the RGB-space feature data to another color space, or authenticating using features represented in this way, is within the scope of the present invention.
  • An optimum value for each coefficient in the above description can be determined experimentally.
  • the coefficient may be negative.
  • The coefficient α is generally determined experimentally according to the external light source environment (for example, brightness).
  • Step SB-3 in FIG. 4 the template image processing unit 3 binarizes the image data GP obtained as described above. Since the binarization can be performed by a general method, such as taking a moving average over each pixel or each block, a detailed description is omitted here. An example of binarized data is shown in FIG.
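The moving-average binarization mentioned above can be sketched as a local adaptive threshold: each pixel is set to 1 when it exceeds the mean of a small window around it. The window size and offset below are illustrative assumptions, not values from the patent.

```python
# Sketch of binarization by local moving average (adaptive thresholding).

def binarize_moving_average(gray, rows, cols, win=3, offset=0):
    """gray: flat list of 0..255 values, row-major order -> flat 0/1 list."""
    out = [0] * (rows * cols)
    r = win // 2
    for y in range(rows):
        for x in range(cols):
            acc, n = 0, 0
            for dy in range(-r, r + 1):          # average over the window,
                for dx in range(-r, r + 1):      # clipped at the borders
                    yy, xx = y + dy, x + dx
                    if 0 <= yy < rows and 0 <= xx < cols:
                        acc += gray[yy * cols + xx]
                        n += 1
            mean = acc / n
            out[y * cols + x] = 1 if gray[y * cols + x] > mean + offset else 0
    return out

# 3x3 image with a single bright center pixel
img = [10, 10, 10,
       10, 200, 10,
       10, 10, 10]
print(binarize_moving_average(img, 3, 3))
```

Only the bright center survives the threshold, which is the behavior wanted for isolating vein pixels against the local background.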
  • the template image processing unit 3 extracts the data features of the template image.
  • As a method for extracting features, a method using the Hough transform, for example, is known. In this method, votes are cast for straight-line candidates, and candidates with a large number of votes are extracted as straight lines. The extracted straight lines can be taken to represent the features of the image. Since feature extraction by the Hough transform is well known, a detailed description is omitted.
  • Step SB-5 in FIG. 4 the template image processing unit 3 uses the straight lines extracted in step SB-4 to perform coordinate conversion so as to be convenient for post-processing.
  • Specifically, the data in the (ρ, θ) space is subjected to a Fourier transform, and the coordinate transformation ρ → log(ρ) is further performed for post-processing. In this coordinate conversion, the differences log(ρi) − log(ρi−1) are further calculated so as to be convenient for post-processing.
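The ρ → log(ρ) step above can be illustrated with a few numbers: on a log scale, a multiplicative change in ρ (a scaling of the image) becomes an additive shift, which the later correlation step can handle. The sample values are purely illustrative.

```python
import math

# Sketch: resampling rho on a log scale turns scaling into shifting.
rhos = [1.0, 2.0, 4.0, 8.0]                     # rho doubles at each step
log_rhos = [math.log(r) for r in rhos]          # rho -> log(rho)
diffs = [log_rhos[i] - log_rhos[i - 1] for i in range(1, len(log_rhos))]
print(diffs)    # constant differences: each doubling adds log(2)
```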
  • coordinate conversion is also well known, and detailed description thereof is omitted.
  • data obtained by coordinate conversion is data representing the characteristics of the template image.
  • Step SB-6 in FIG. 4 The data generated as described above is recorded in the template storage unit 51. Various types of data generated during the processing are also recorded in the template storage unit 51 as necessary.
  • a target image (that is, a target original image) is acquired by the imaging unit 2.
  • the acquired original image is stored in the storage unit 52 for the target image.
  • The number of original images to be acquired is determined according to the purpose of use. In the following, a single original image is described as an example, but in principle the same processing is performed for each original image. For example, if authentication fails, the target image can be acquired again and the same processing performed.
  • the acquisition conditions for the target image are the same as those for the template image.
  • the target image processing unit 4 can convert the RGB color space data acquired by the imaging unit 2 to obtain target image data.
  • the formula (1) or (2) can be used as described above.
  • Step SC-3 in FIG. 5 the target image processing unit 4 binarizes the image data obtained as described above.
  • the binarization method may be the same as that for the template image.
  • Step SC-4 in FIG. 5 the target image processing unit 4 extracts data features of the target image.
  • the method for extracting features is the same as that for the template image.
  • Step SC-5 in FIG. 5 the target image processing unit 4 performs coordinate conversion. This is the same as in the case of the template image.
  • data that has undergone coordinate transformation (including difference calculation) is data representing the characteristics of the target image.
  • The data generated as described above may be recorded in the target image storage unit 52. Alternatively, the data may be passed on to the following authentication process without being saved.
  • the collation unit 6 performs the collation and authentication shown in step SA-3.
  • collation using the phase only correlation method can be performed.
  • the phase-only correlation is calculated from the data (coordinate-transformed data) generated in the above-described Step SB-5 and Step SC-5.
  • the rotation angle ( ⁇ ) and magnification ( ⁇ ) of the target image with respect to the template image can be calculated.
  • The maximum value of the phase-only correlation in the (θ, ρ) space and the values obtained from its periphery are evaluated against threshold values, and identity is determined.
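As a self-contained illustration of the phase-only correlation principle used above, the sketch below correlates two 1-D signals (the method in the text operates on the 2-D coordinate-transformed data). The naive O(N²) DFT and the toy signals are assumptions for illustration; a matching pair produces a sharp peak whose height would be compared with a threshold.

```python
import cmath

def dft(x, inverse=False):
    """Naive O(N^2) discrete Fourier transform (inverse divides by N)."""
    n = len(x)
    sign = 1 if inverse else -1
    out = []
    for k in range(n):
        s = sum(x[t] * cmath.exp(sign * 2j * cmath.pi * k * t / n)
                for t in range(n))
        out.append(s / n if inverse else s)
    return out

def phase_only_correlation(a, b):
    """POC of two equal-length 1-D signals; peak index = circular shift."""
    fa, fb = dft(a), dft(b)
    cross = []
    for u, v in zip(fa, fb):
        p = v * u.conjugate()                               # cross spectrum
        cross.append(p / abs(p) if abs(p) > 1e-12 else 0j)  # keep phase only
    return [c.real for c in dft(cross, inverse=True)]

sig = [0, 1, 3, 1, 0, 0, 0, 0]
shifted = [0, 0, 0, 1, 3, 1, 0, 0]   # same pattern, circularly shifted by 2
poc = phase_only_correlation(sig, shifted)
peak = max(range(len(poc)), key=lambda i: poc[i])
print(peak, round(poc[peak], 3))     # sharp peak at the shift position
```

Because only the phase is kept, the peak stays sharp even when the two signals differ in overall brightness, which is why POC is attractive for image matching.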
  • the method of this embodiment can be implemented by a computer program that can be executed by a computer.
  • the program can be recorded on various types of computer-readable media.
  • the reason why personal authentication is possible is as follows. That is, when a visible light source is applied to the human body surface, a part of the visible light is absorbed by the human body and the remaining part is reflected. Since the absorbed wavelength is different between the vein portion and the other skin portion, the wavelength of the light contained in the reflected light is different between the vein portion and the other skin portion. It is considered that the vein shape can be accurately extracted by selecting and emphasizing color information representing the characteristics of the vein.
  • In the conventional technique, the vein shape is extracted using the property that near-infrared rays are easily absorbed in the veins. That conventional technique therefore cannot extract vein features from a color image.
  • Each component described above may exist as a functional block and need not exist as independent hardware.
  • As an implementation method, hardware or computer software may be used.
  • one functional element in the present invention may be realized by a set of a plurality of functional elements.
  • a plurality of functional elements in the present invention may be realized by one functional element.
  • In the embodiment described above, the RGB color space data used in the template image processing unit and the target image processing unit is obtained by changing the H signal and the S signal in the HSV color space and then converting back to the RGB color space.
  • However, such a change is not essential, and the change of the H signal and/or the S signal in the HSV space can be omitted.
  • The functional elements may be arranged in a distributed manner.
  • the functional elements may be connected by a network.

Landscapes

  • Engineering & Computer Science (AREA)
  • Physics & Mathematics (AREA)
  • General Physics & Mathematics (AREA)
  • Multimedia (AREA)
  • Theoretical Computer Science (AREA)
  • Human Computer Interaction (AREA)
  • Color Image Communication Systems (AREA)
  • Collating Specific Patterns (AREA)

Abstract

The invention provides means enabling authentication of an individual even when visible light and a camera are used. By emitting light having a wavelength in the visible light region onto the surface of a human body, a reflection image composed of the light reflected by the human body is obtained. When registering a template image, a cyan signal in the CMYK color space and an R signal and a B signal in the RGB color space are acquired from the data corresponding to the reflection image, and features of the template image data are thereby extracted. At the time of authentication, a cyan signal in the CMYK color space and an R signal and a B signal in the RGB color space are acquired from the data corresponding to the reflection image, and features of the data of a target image are thereby acquired. Using the features of the template image data and the features of the target image data, the two images are collated.
PCT/JP2013/056122 2012-03-16 2013-03-06 Method for authenticating an individual and device for authenticating individuals WO2013137078A1 (fr)

Applications Claiming Priority (2)

Application Number Priority Date Filing Date Title
JP2012060602A JP2013196152A (ja) 2012-03-16 2012-03-16 Personal authentication method and personal authentication device
JP2012-060602 2012-03-16

Publications (1)

Publication Number Publication Date
WO2013137078A1 true WO2013137078A1 (fr) 2013-09-19

Family

ID=49160990

Family Applications (1)

Application Number Title Priority Date Filing Date
PCT/JP2013/056122 WO2013137078A1 (fr) 2012-03-16 2013-03-06 Method for authenticating an individual and device for authenticating individuals

Country Status (2)

Country Link
JP (1) JP2013196152A (fr)
WO (1) WO2013137078A1 (fr)

Cited By (1)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN105046681A (zh) * 2015-05-14 2015-11-11 江南大学 SoC-based image saliency region detection method

Families Citing this family (1)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
KR101901519B1 (ko) 2017-05-24 2018-09-28 전자부품연구원 개인 피부 속 광학 패턴인식 장치의 신호 처리 방법

Citations (4)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
JPH11203452A (ja) * 1998-01-19 1999-07-30 Hitachi Ltd Personal feature pattern detection device and personal identification device using the same
JP2004265269A (ja) * 2003-03-04 2004-09-24 Hitachi Ltd Personal authentication device
WO2008139883A1 (fr) * 2007-05-16 2008-11-20 Sony Corporation Vein pattern management system, vein pattern registration device and method, vein pattern authentication device and method, program, and vein data structure
WO2012014300A1 (fr) * 2010-07-29 2012-02-02 ユニバーサルロボット株式会社 Device and method for verifying the identity of an individual


Also Published As

Publication number Publication date
JP2013196152A (ja) 2013-09-30

Similar Documents

Publication Publication Date Title
KR101517371B1 (ko) Personal authentication method and personal authentication device
US11321963B2 (en) Face liveness detection based on neural network model
Steiner et al. Design of an active multispectral SWIR camera system for skin detection and face verification
CN104463074B (zh) Method and device for identifying genuine and fake fingerprints
US11341348B2 (en) Hand biometrics system and method using digital fingerprints
Lynch et al. Colour constancy from both sides of the shadow edge
TW201839654A (zh) Fingerprint identification method and fingerprint identification device
KR20220052828A (ko) Biometric authentication device and biometric authentication method
AU2021286405A1 (en) Personal authentication method and personal authentication device
WO2013137078A1 (fr) Method for authenticating an individual and device for authenticating individuals
US10726282B2 (en) Biometric authentication apparatus, biometric authentication system and biometric authentication method
KR101336834B1 (ko) USB iris recognizer
EP3312767B1 (fr) Appareil de capture d'image et appareil d'authentification biométrique
Vyas et al. A collaborative approach using ridge-valley minutiae for more accurate contactless fingerprint identification
JP4694352B2 (ja) Fingerprint collation device
WO2023210081A1 (fr) Biometric authentication system and authentication method
TW201510882A (zh) Personal authentication method and system, and authentication-image acquisition device and template-image acquisition device therefor
JP2004013768A (ja) Personal identification method
Bianco Color constancy using single colors

Legal Events

Date Code Title Description
121 Ep: the epo has been informed by wipo that ep was designated in this application

Ref document number: 13761152

Country of ref document: EP

Kind code of ref document: A1

NENP Non-entry into the national phase

Ref country code: DE

32PN Ep: public notification in the ep bulletin as address of the addressee cannot be established

Free format text: NOTING OF LOSS OF RIGHTS PURSUANT TO RULE 112(1) EPC (EPO FORM 1205A DATED 30.01.2015)

122 Ep: pct application non-entry in european phase

Ref document number: 13761152

Country of ref document: EP

Kind code of ref document: A1