WO2009107470A1 - Mole identification device, personal authentication device, method, and program - Google Patents
- Publication number
- WO2009107470A1 (PCT/JP2009/052023)
- Authority
- WO
- WIPO (PCT)
- Prior art keywords
- mole
- image
- moles
- true
- absorption spectrum
- Prior art date
Classifications
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06F—ELECTRIC DIGITAL DATA PROCESSING
- G06F21/00—Security arrangements for protecting computers, components thereof, programs or data against unauthorised activity
- G06F21/30—Authentication, i.e. establishing the identity or authorisation of security principals
- G06F21/31—User authentication
- G06F21/32—User authentication using biometric data, e.g. fingerprints, iris scans or voiceprints
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06V—IMAGE OR VIDEO RECOGNITION OR UNDERSTANDING
- G06V10/00—Arrangements for image or video recognition or understanding
- G06V10/10—Image acquisition
- G06V10/12—Details of acquisition arrangements; Constructional details thereof
- G06V10/14—Optical characteristics of the device performing the acquisition or on the illumination arrangements
- G06V10/143—Sensing or illuminating at different wavelengths
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06V—IMAGE OR VIDEO RECOGNITION OR UNDERSTANDING
- G06V40/00—Recognition of biometric, human-related or animal-related patterns in image or video data
- G06V40/10—Human or animal bodies, e.g. vehicle occupants or pedestrians; Body parts, e.g. hands
-
- A—HUMAN NECESSITIES
- A61—MEDICAL OR VETERINARY SCIENCE; HYGIENE
- A61B—DIAGNOSIS; SURGERY; IDENTIFICATION
- A61B5/00—Measuring for diagnostic purposes; Identification of persons
- A61B5/117—Identification of persons
- A61B5/1171—Identification of persons based on the shapes or appearances of their bodies or parts thereof
- A61B5/1176—Recognition of faces
Definitions
- The present invention relates to a mole identification device, method, and program, and more particularly to a mole identification device, method, and program for extracting moles from a human face image.
- the present invention also relates to a personal authentication apparatus, method, and program for performing personal authentication using moles extracted from face images.
- Face authentication technology that uses human face images and feature points extracted from them is known. Face authentication using nevi, such as moles, appearing on the face and skin is considered an excellent technique because nevi show little change over time.
- As face authentication using moles, there is the technique described in Non-Patent Document 1.
- In Non-Patent Document 1, the degree of separation of the luminance between the central part and the peripheral part of a circular region is obtained, and this degree of separation is used as the likelihood that the region is a mole.
- Ten moles are extracted from the face image in descending order of mole likelihood, and personal authentication is performed based on the similarity of the mole positions.
- A region where the brightness of the central portion is lower than that of the peripheral portion, and where the area of the dark portion is small, is defined as a mole.
- There is also the technique described in Patent Document 1 as another technique related to face authentication.
- Patent Document 1 describes that a facial feature to be searched is designated as a search condition, and a facial image having the designated feature is searched from a facial image database.
- Moles, eyelids, beards, glasses, sex, estimated age, and skin color are listed as search conditions when searching for a face image.
- As a technique for extracting moles, there is also a technique in which a region where a predetermined number or more of pixels, whose luminance difference from the surrounding region is at or below a threshold, are clustered is treated as a mole.
- JP 2006-318375 A. Tomokazu Kawahara, Osamu Yamaguchi, and Kazuhiro Fukui, "Personal Authentication Using Global Structures of Micro-features on the Face", 5th System Integration Division Annual Conference (SI2004), December 17-19, 2004, pp. 619-620.
- In Patent Document 1 and Non-Patent Document 1, a low-brightness area in the face image is treated as a mole, so moles and other low-brightness areas appearing on the face are not distinguished. For example, when a black dot is drawn on the face using ink or the like, the brightness of that region is low in the grayscale image, and it is therefore recognized as a mole. Consequently, when a malicious impersonator adds a mole at the same position as that of a registered person, this cannot be detected and spoofing cannot be prevented.
- An object of the present invention is to provide a mole identification device, a personal authentication device, a method, and a program that can resolve the above problems and can discriminate between true moles and pseudo moles.
- The present invention provides a mole identification method including a step of inputting a multispectral image composed of a plurality of spectra captured using an imaging device, a step of detecting mole candidates from the multispectral image, and a step of identifying whether each mole candidate is a true mole or a pseudo mole based on the absorption spectrum of the detected candidate.
- The present invention also provides a personal authentication method including a step of inputting a multispectral image composed of a plurality of spectra captured using an imaging device, a step of detecting mole candidates from the multispectral image, a step of identifying whether each candidate is a true mole or a pseudo mole and detecting the positions of the moles identified as true moles, and
- a step of collating face images based on the positional relationship between the positions of the identified moles and the mole positions detected from a registered image for collation.
- The present invention further provides a mole identification device comprising image input means for inputting a multispectral image composed of a plurality of spectra captured using an imaging device, and
- mole position estimation means for detecting mole candidates from the multispectral image and identifying, based on the absorption spectrum of each detected candidate, whether the candidate is a true mole or a pseudo mole.
- The present invention further provides a personal authentication device comprising image input means for inputting a multispectral image composed of a plurality of spectra captured using an imaging device; mole position estimation means for detecting mole candidates from the multispectral image, identifying, based on the absorption spectrum of each detected candidate, whether the candidate is a true mole or a pseudo mole, and detecting the positions of the moles identified as true moles; and
- image collation means for collating face images based on the positional relationship between the positions of the identified true moles and the mole positions detected from a registered image.
- The present invention further provides a program for causing a computer to execute a process of identifying moles included in a face image, the program causing the computer to execute a process of inputting a multispectral image composed of a plurality of spectra captured using an imaging device, a process of detecting mole candidates from the multispectral image, and a process of identifying whether each candidate is a true mole or a pseudo mole based on the absorption spectrum of the detected candidate.
- The present invention further provides a program for causing a computer to execute a personal authentication process using moles included in a face image, the program causing the computer to execute a process of inputting a multispectral image composed of a plurality of spectra captured using an imaging device,
- a process of detecting mole candidates from the multispectral image and identifying, based on the absorption spectrum of each detected candidate, whether the candidate is a true mole or a pseudo mole,
- a process of detecting the positions of the moles identified as true moles, and a process of collating face images based on the positional relationship between the positions of those moles and the mole positions detected from a registered image.
- the mole identification device, personal authentication apparatus, method, and program of the present invention can discriminate true moles and pseudo moles from areas that appear to be moles included in a face image.
- FIG. 1 shows a personal authentication device (system) according to a first embodiment of the present invention.
- The personal authentication system includes an image input unit (means) 10, a skin region extraction unit 11, a mole position estimation unit 12, an image collation unit 13, and an identity determination unit 14.
- the image input means 10 inputs an image used for collation at the time of personal authentication.
- the image input by the image input means 10 is two or more spectral images (multispectral images) composed of a plurality of spectra.
- the skin region extraction unit 11 extracts a skin region of the face that does not include the eye, mouth, and hair regions from the multispectral image input by the image input unit 10.
- the means of the personal authentication system is constituted by one or a plurality of programs stored in a computer-readable recording medium.
- the mole position estimation means 12 extracts moles from the extracted skin region and estimates mole positions. At this time, a true mole and a pseudo mole are identified based on an average absorption spectrum of each mole.
- The image collating unit 13 collates the mole positions of the moles detected as true moles by the mole position estimating unit 12, using a geometric constraint on the positional deviation of moles between the images used for collation, and calculates the similarity between the images.
- the person determination unit 14 performs threshold processing on the obtained similarity and determines whether or not the target person is the person.
- FIG. 2 shows the overall operation procedure.
- the image input means (unit) 10 captures a face image using a multispectral camera that can simultaneously acquire a plurality of spectrum images, and inputs the multispectral image (step A1).
- The luminance value of the input multispectral image is represented by I(x, λ), where x is the pixel position in the face image and λ is the wavelength.
- When the number of spectra is Nsp, Nsp face images are obtained.
- Skin area extracting means 11 extracts a skin area in the face image for each spectrum image (step A2).
- Specifically, the face image is first blurred to some extent using, e.g., a Gaussian filter, and the region whose luminance is close to the median luminance value in each spectrum is extracted as the skin area.
- Because blurring with a Gaussian filter or the like is performed first, moles contained in the skin are extracted as part of the skin area, while areas with different luminance distributions, such as the eyes and lips, can be excluded from the skin area.
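The skin-region extraction of step A2 can be sketched as follows. This is a minimal illustration, not the patent's exact procedure: a simple box blur stands in for the Gaussian filter, and the tolerance `tol` around the median luminance is an assumed parameter not specified in the text.

```python
import numpy as np

def box_blur(img, k=1):
    """Simple box blur standing in for the Gaussian filter of step A2."""
    padded = np.pad(img.astype(float), k, mode="edge")
    out = np.zeros(img.shape, dtype=float)
    for dy in range(-k, k + 1):
        for dx in range(-k, k + 1):
            out += padded[k + dy : k + dy + img.shape[0],
                          k + dx : k + dx + img.shape[1]]
    return out / (2 * k + 1) ** 2

def extract_skin_region(spectral_image, k=1, tol=0.2):
    """Blur one spectral image, then keep pixels whose blurred luminance
    lies near the median, as a boolean skin mask. `tol` is illustrative."""
    blurred = box_blur(spectral_image, k)
    median = np.median(blurred)
    return np.abs(blurred - median) <= tol * median
```

Because the blur is applied first, a small dark mole is pulled toward the surrounding skin luminance and survives in the mask, while large dark structures such as the eyes fall outside the tolerance and are excluded.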
- Next, the mole position estimation means 12 detects mole positions from the skin area extracted in step A2 for each spectrum image (step A3). At this time, the mole position estimation means 12 distinguishes true moles from pseudo moles. Compared with a normal skin region, moles (including pseudo moles) tend to absorb light more strongly across the entire spectrum, so their average luminance is lower. Therefore, first, a single grayscale image is generated from the plurality of spectral images, and mole-like regions are extracted from it. Next, an average absorption spectrum over the spectrum images is obtained for each extracted mole-like region. Then, true moles and pseudo moles are identified by comparing the average absorption spectra of the moles.
- FIG. 3 shows the detailed procedure of mole position detection in step A3.
- the mole position estimation means 12 calculates the average of the luminance values for each pixel over the entire spectrum, and generates a grayscale image (step B1).
- The luminance value I(x) at pixel position x of the grayscale image is obtained as the average over wavelengths, I(x) = (1/Nsp) Σ_λ I(x, λ), where I(x, λ) is the luminance value at pixel position x of the spectral image of wavelength λ and Nsp is the number of spectra.
- Next, the mole position estimation means 12 obtains the moleiness of each pixel of the grayscale image generated in step B1, in order to find mole-like areas (step B2).
- Here, the value obtained by dividing the luminance of the central pixel by the minimum luminance value of the peripheral pixels included within a radius of 3 pixels is defined as the moleiness. That is, this luminance ratio r between the central pixel and its peripheral pixels, defined by Equation 1 below, is used as the moleiness.
- Note that the central pixel is not included when taking the minimum luminance of the peripheral pixels in the denominator of Equation 1.
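The moleiness r of Equation 1 can be sketched as below. The 3-pixel radius and the exclusion of the central pixel follow the text; the handling of image borders is an assumption.

```python
import numpy as np

def moleiness(gray, x, y, radius=3):
    """Equation 1 sketch: centre-pixel luminance divided by the minimum
    luminance of the peripheral pixels within `radius`, excluding the
    centre pixel itself, as the text specifies."""
    h, w = gray.shape
    neighbours = []
    for dy in range(-radius, radius + 1):
        for dx in range(-radius, radius + 1):
            if dx == 0 and dy == 0:
                continue  # the centre pixel is not part of the denominator
            if dx * dx + dy * dy <= radius * radius:
                ny, nx = y + dy, x + dx
                if 0 <= ny < h and 0 <= nx < w:  # assumed border handling
                    neighbours.append(gray[ny, nx])
    return gray[y, x] / min(neighbours)
```

A dark centre surrounded by bright skin yields a small r, so step B3 can rank pixels by r and take the N smallest as mole centres.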
- the mole position estimation means 12 estimates the mole position using the moleiness of each pixel obtained in step B2 (step B3).
- The moleiness r defined by Equation 1 takes a large value when the luminance of the central pixel is higher than that of the peripheral pixels, and a small value when it is lower. Since moles tend to have lower luminance at the central pixel than at the surrounding pixels, in step B3 areas with a small moleiness r are estimated to be mole areas. For example, the top N pixels with the smallest moleiness r are selected and set as the centre positions of moles.
- the mole position estimation means 12 calculates the average absorption spectrum of each mole for each wavelength from each spectrum image for the mole-like area (step B4).
- In step B4, with I(x_i, λ) denoting the luminance value of the central pixel x_i of the i-th of the N moles, the average over the pixels included within radius c of pixel x_i is obtained for each wavelength according to the following equation, giving the average absorption spectrum of the mole.
- N_i is the number of pixels included within radius c, and Ω_i denotes the set of pixels included within radius c around x_i.
- c is a variable indicating the size of moles, and is adjusted to an appropriate value depending on the resolution of the image.
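The per-wavelength averaging of step B4 can be sketched as follows. Representing the multispectral image as an (Nsp, H, W) array is an illustrative assumption about data layout, not something the text specifies.

```python
import numpy as np

def average_spectrum(spectral_images, center, c=2):
    """Step B4 sketch: for each wavelength, average the luminance of the
    pixels within radius `c` of the mole centre (the set Omega_i).
    `spectral_images` is assumed to have shape (Nsp, H, W)."""
    nsp, h, w = spectral_images.shape
    cy, cx = center
    ys, xs = np.mgrid[0:h, 0:w]
    mask = (ys - cy) ** 2 + (xs - cx) ** 2 <= c * c  # Omega_i
    # Boolean-mask the two spatial axes, then average the N_i pixels
    # per wavelength, giving one value per spectral image.
    return spectral_images[:, mask].mean(axis=1)
```

As the text notes, c reflects the expected mole size and should be scaled with the image resolution.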
- the mole whose position is estimated in step B3 includes a pseudo mole in addition to a true mole.
- The mole position estimation means 12 identifies pseudo moles by exploiting the fact that a plurality of spectral images are available and comparing the average absorption spectra across wavelengths (step B5).
- Pseudo moles are identified as follows. A true mole is a region of high melanin concentration in the skin, and thus basically has characteristics similar to the absorption spectrum of the skin.
- Here, the absorption spectrum of the skin is expressed by the following Equation 3.
- ⁇ s is a set of pixels determined to be a skin region
- N s is the number of pixels.
- The absorption spectrum of a mole is approximated by the following Equation 4, using Equation 3 and an appropriate coefficient a (that is, the mole spectrum is assumed to be proportional to the skin spectrum).
- Next, the similarity between the mole absorption spectrum of Equation 4 and the average absorption spectrum of each mole is obtained; this similarity is defined by Equation 5. I_m(λ) in Equation 5 includes the unknown coefficient a, but when Equation 4 is substituted into Equation 5, a cancels between the numerator and denominator, so Equation 5 can be expressed by Equation 6 below.
- In this way, it is determined whether each extracted mole is a pseudo mole. That is, the value t_i calculated by Equation 6 is compared with a predetermined threshold T: when t_i is equal to or larger than the threshold, the i-th mole is determined to be a true mole, and when it is smaller than the threshold, it is determined to be a pseudo mole.
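Since Equations 5 and 6 themselves are not reproduced in this text, the sketch below substitutes a normalized correlation (cosine similarity) for t_i: in that form the coefficient a of Equation 4 does cancel between numerator and denominator, as the text requires, but the patent's exact formula may differ. The threshold value is likewise illustrative.

```python
import numpy as np

def is_true_mole(mole_spectrum, skin_spectrum, threshold=0.95):
    """Hedged sketch of step B5. t_i is taken here as the cosine
    similarity between the candidate's average spectrum and the skin
    spectrum, which is invariant to the scale factor `a` of Equation 4.
    `threshold` stands in for the unspecified value T."""
    m = np.asarray(mole_spectrum, dtype=float)
    s = np.asarray(skin_spectrum, dtype=float)
    t = np.dot(m, s) / (np.linalg.norm(m) * np.linalg.norm(s))
    return t >= threshold  # true mole if the spectral shapes are similar
```

A true mole, being a scaled-down copy of the skin spectrum, scores near 1; an ink dot with a flat spectral shape scores lower and is rejected.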
- The mole position estimating means 12 then detects, among the N moles extracted in step B3, those not determined to be pseudo moles as true moles (step B6). In step B6, if the number of moles determined to be pseudo moles is Nf, the positions of (N - Nf) moles are detected.
- The image matching means 13 performs a matching process against an image registered in advance in a database (not shown), using the true mole positions detected in step A3 (step A4).
- The registered image in the database consists of the same spectral images as the multispectral image input in step A1. For example, if the wavelengths of the input multispectral image are λ1 and λ2, two spectral images of wavelengths λ1 and λ2 are prepared for the registered image.
- The identity determination means 14 determines whether or not the subject is the registered person based on the collation result of step A4 (step A5).
- FIG. 4 shows the detailed procedure of the collation process in step A4.
- the face images used for collation are called registered images and collation images.
- The registered image is a face image registered in the database in association with user identification information or the like, and the collation image is the face image input in step A1 of FIG. 2. It is assumed that N1 mole positions are obtained from the registered image and N2 mole positions from the collation image.
- the mole position of the registered image can be obtained by the same procedure as shown in FIG. 3 when estimating the mole position from the collation image.
- the mole position of the registered image may be estimated from the registered image each time collation is performed, or may be registered in advance in the database as mole position data.
- the image collating means 13 searches for corresponding points of moles in the registered image and the collated image (step C1).
- the face positions and sizes in the registered image and the collation image are normalized in advance using the eye positions and the like.
- The positions of the N1 moles obtained from the registered image are x1(1), x1(2), ..., x1(N1).
- The positions of the N2 moles obtained from the collation image are x2(1), x2(2), ..., x2(N2).
- Each mole position coordinate is represented by a two-dimensional vector.
- After searching for the corresponding points, the image matching unit 13 obtains the difference vectors of the corresponding mole positions (step C2).
- Specifically, the difference z1 between each mole position in the registered image and its corresponding point in the collation image, and the difference z2 between each mole position in the collation image and its corresponding point in the registered image, are obtained.
- Expressed as formulas, the difference vectors z1 and z2 are given by the following Equation 7.
- z1(i) = x1(i) - x2(i*)
- z2(i) = x2(i) - x1(i*) (7)
- the image matching means 13 calculates a weighting coefficient using the distance between moles (step C3).
- the weight coefficient is a value corresponding to the distance between moles. Desirably, the smaller the distance between moles, the smaller the value.
- Let d1(i, j) denote the distance between the i-th mole and the j-th mole in the registered image; the weighting coefficient is defined by the following equation using this distance.
- For the collation image, the weighting coefficient is likewise defined by the following equation using the distance d2(i, j) between moles in the collation image.
- In step C3, the weighting coefficients are obtained for all pairs (i, j) for both the registered image and the collation image.
- Then, the image collating means 13 calculates the similarity between the mole positions (step C4). If the subject is the registered person, the difference vectors of the corresponding points between the registered image and the collation image are expected to point in a similar direction for all moles. Therefore, the similarity between mole positions is defined by the following Equation 8.
- s0 is a normalization term and is expressed by the following equation.
- The identity determination means 14 performs identity determination in step A5 of FIG. 2 using the similarity s calculated by the image matching means 13. For example, threshold processing is applied to the similarity: if it is equal to or greater than the threshold, the subject is determined to be the registered person; if it is smaller, the subject is determined to be an impersonator.
- As described above, in this embodiment the input image is a multispectral image having a plurality of wavelengths, and regions that appear to be moles are extracted from it. Exploiting the multispectral nature of the input, true moles and pseudo moles are identified by comparing the average spectra of the mole-like regions: a true mole shows an absorption spectrum similar to that of the skin region in each spectral image, while a pseudo mole does not. At authentication time, pseudo moles are excluded, and only the moles identified as true moles are collated against the moles in the registered image. Mismatching can be reduced by not matching against pseudo moles. Further, "spoofing" by a malicious impersonator who draws a false mole can be prevented, and the intruder can be rejected.
- Non-Patent Document 1 uses the similarity based on the position on the image, so the robustness when the posture changes is not sufficient. That is, in Non-Patent Document 1, the performance tends to be greatly reduced in the case of collation other than the same posture.
- Patent Document 1 only describes that the degree of similarity is calculated for collation between moles, and no consideration is given to the position fluctuation of moles.
- the degree of similarity of moles is calculated using a geometric constraint condition regarding the position shift of moles between images used for collation.
- the configuration of the personal authentication system is the same as that of the first embodiment shown in FIG. FIG. 5 shows an operation procedure in the present embodiment.
- the image input means 10 inputs a multispectral image (step D1).
- the skin region extracting means 11 extracts a skin region from the multispectral image (step D2). Up to this point, the operations are the same as those in steps A1 and A2 in FIG.
- the mole position estimation means 12 detects the mole position and the number of pseudo moles from the extracted skin region (step D3).
- FIG. 6 shows the detailed procedure of step D3.
- the mole position estimating means 12 generates a grayscale image from the multispectral image (step E1), and estimates the moleiness of each pixel (step E2). Thereafter, the mole position is estimated (step E3), the average absorption spectrum of each mole is calculated (step E4), and the pseudo mole is identified (step E5).
- the operation from step E1 to step E5 is the same as the operation from step B1 to step B5 in FIG.
- The mole position estimation means 12 outputs the number Nf of moles determined to be pseudo moles among the N moles extracted in step E3, together with the positions of the (N - Nf) moles determined to be true moles (step E6).
- Before collation, the image collation means 13 determines whether or not the subject is an impersonator based on the number of pseudo moles detected in step D3 (step D4). For example, if the number of pseudo moles detected in step D3 is equal to or larger than a predetermined threshold, the subject can be regarded as a malicious impersonator and is determined to be one. In the strictest setting, the subject is determined to be an impersonator when even one pseudo mole is detected. If the subject is determined to be an impersonator, the process ends without performing collation.
- Otherwise, collation is performed (step D5), and whether or not the registered image and the collation image belong to the same person is determined based on the collation result (step D6).
- the operations of Step D5 and Step D6 are the same as the operations of Step A4 and Step A5.
- For the registered image, the number of pseudo moles may be obtained by a procedure similar to that shown in FIG. 6 to confirm that it is zero. If the registered image contains one or more pseudo moles, image acquisition may be redone, or the image may simply not be registered in the database.
- In this way, in the present embodiment, whether the subject is an impersonator can be determined from the number of pseudo moles before collation is performed.
- the present invention can be used in the security field where personal authentication is required.
Description
Note that the central pixel is not included when obtaining the minimum luminance value of the peripheral pixels in the denominator of Equation 1.
Here, Ni is the number of pixels included within radius c, and Ωi denotes the set of pixels included within radius c around xi. c is a variable representing the size of a mole, and is adjusted to an appropriate value according to the resolution of the image.
Here, Ωs is the set of pixels determined to be the skin region, and Ns is the number of such pixels.
The absorption spectrum of a mole is approximated by Equation 4 below, using Equation 3 and an appropriate coefficient a.
Next, the similarity between the mole absorption spectrum of Equation 4 and the average absorption spectrum of each mole from Equation 2 is obtained. The similarity is defined by Equation 5 below.
Im(λ) in Equation 5 includes the unknown coefficient a. Substituting Equation 4 into Equation 5, the coefficient a cancels between the numerator and denominator, so Equation 5 can be expressed as Equation 6 below.
Similarly, for each mole position coordinate x2(i) (i = 1, ..., N2) of the collation image, the distance to each mole position on the registered image is calculated, and the nearest mole is taken as the corresponding point.
z1(i) = x1(i) - x2(i*)
z2(i) = x2(i) - x1(i*) (7)
The weighting coefficient is defined by the following equation using the distance between moles.
Similarly, for the collation image, it is defined by the following equation using the distance d2(i, j) between moles in the collation image.
In step C3, the weighting coefficients are obtained for all pairs of i and j for both the registered image and the collation image.
Here, s0 is a normalization term, expressed by the following equation.
In step A5 of FIG. 2, the identity determination means 14 performs identity determination using the similarity s calculated by the image matching means 13. For example, threshold processing is applied to the similarity: if it is equal to or greater than the threshold, the subject is determined to be the registered person; if it is smaller, the subject is determined to be an impersonator.
Claims (12)
1. A mole identification method comprising: a step of inputting a multispectral image composed of a plurality of spectra captured using an imaging device; a step of detecting mole candidates from the multispectral image; and a step of identifying, based on the absorption spectrum of each detected mole candidate, whether the candidate is a true mole or a pseudo mole.
2. The mole identification method according to claim 1, wherein the identifying step discriminates between true moles and pseudo moles based on the absorption spectrum of the skin and the absorption spectrum of the mole.
3. The mole identification method according to claim 2, wherein the identifying step, assuming that the absorption spectrum of a true mole can be expressed as a function of the absorption spectrum of the skin, obtains the similarity between the absorption spectrum of the mole candidate and the absorption spectrum of the skin, and discriminates between true moles and pseudo moles according to the obtained similarity.
4. The mole identification method according to claim 3, wherein the identifying step obtains the similarity on the assumption that the absorption spectrum of a true mole is proportional to the absorption spectrum of the skin.
5. The mole identification method according to any one of claims 1 to 4, wherein the step of detecting mole candidates comprises: a step of extracting a skin region from the multispectral image; a step of obtaining the luminance ratio between a pixel of the extracted skin region and its peripheral pixels; and a step of detecting mole candidates based on the luminance ratio.
6. A personal authentication method comprising: a step of inputting a multispectral image composed of a plurality of spectra captured using an imaging device; a step of detecting mole candidates from the multispectral image; a step of identifying, based on the absorption spectrum of each detected mole candidate, whether the candidate is a true mole or a pseudo mole, and detecting the positions of the moles identified as true moles; and a step of collating face images based on the positional relationship between the positions of the moles identified as true moles and the mole positions detected from a registered image for collation.
7. The personal authentication method according to claim 6, wherein the step of collating face images comprises: a step of extracting corresponding points between the moles identified as true moles and the moles identified as true moles in the registered image; a step of calculating difference vectors from the coordinates of the corresponding points; and a step of obtaining the similarity between feature points by exploiting the fact that the directions of the difference vectors at the feature points are close to one another.
8. The personal authentication method according to claim 6 or 7, wherein, when a predetermined number or more of pseudo moles are detected in the step of identifying true moles and pseudo moles, the subject is determined to be an impersonator and the process is terminated.
9. A mole identification device comprising: image input means for inputting a multispectral image composed of a plurality of spectra captured using an imaging device; and mole position estimation means for detecting mole candidates from the multispectral image and identifying, based on the absorption spectrum of each detected candidate, whether the candidate is a true mole or a pseudo mole.
10. A personal authentication device comprising: image input means for inputting a multispectral image composed of a plurality of spectra captured using an imaging device; mole position estimation means for detecting mole candidates from the multispectral image, identifying, based on the absorption spectrum of each detected candidate, whether the candidate is a true mole or a pseudo mole, and detecting the positions of the moles identified as true moles; and image collation means for collating face images based on the positional relationship between the positions of the moles identified as true moles and the mole positions detected from a registered image.
11. A program for causing a computer to execute a process of identifying moles included in a face image, the program causing the computer to execute: a process of inputting a multispectral image composed of a plurality of spectra captured using an imaging device; a process of detecting mole candidates from the multispectral image; and a process of identifying, based on the absorption spectrum of each detected mole candidate, whether the candidate is a true mole or a pseudo mole.
12. A program for causing a computer to execute a personal authentication process using moles included in a face image, the program causing the computer to execute: a process of inputting a multispectral image composed of a plurality of spectra captured using an imaging device; a process of detecting mole candidates from the multispectral image; a process of identifying, based on the absorption spectrum of each detected mole candidate, whether the candidate is a true mole or a pseudo mole, and detecting the positions of the moles identified as true moles; and a process of collating face images based on the positional relationship between the positions of the moles identified as true moles and the mole positions detected from a registered image.
Priority Applications (2)
Application Number | Priority Date | Filing Date | Title |
---|---|---|---|
JP2010500632A JP5278424B2 (ja) | 2008-02-27 | 2009-02-06 | 黒子識別装置、個人認証装置、方法、及び、プログラム |
US12/919,061 US20110002511A1 (en) | 2008-02-27 | 2009-02-06 | Mole identifying device, and personal authentication device, method, and program |
Applications Claiming Priority (2)
Application Number | Priority Date | Filing Date | Title |
---|---|---|---|
JP2008046463 | 2008-02-27 | ||
JP2008-046463 | 2008-02-27 |
Publications (1)
Publication Number | Publication Date |
---|---|
WO2009107470A1 true WO2009107470A1 (ja) | 2009-09-03 |
Family
ID=41015873
Family Applications (1)
Application Number | Title | Priority Date | Filing Date |
---|---|---|---|
PCT/JP2009/052023 WO2009107470A1 (ja) | 2008-02-27 | 2009-02-06 | 黒子識別装置、個人認証装置、方法、及び、プログラム |
Country Status (3)
Country | Link |
---|---|
US (1) | US20110002511A1 (ja) |
JP (1) | JP5278424B2 (ja) |
WO (1) | WO2009107470A1 (ja) |
Cited By (3)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
JP2016096987A (ja) * | 2014-11-20 | 2016-05-30 | 株式会社日立製作所 | 生体認証装置 |
CN109978810A (zh) * | 2017-12-26 | 2019-07-05 | 柴岗 | 痣的检测方法、系统、设备及存储介质 |
WO2020158158A1 (ja) * | 2019-02-01 | 2020-08-06 | ミツミ電機株式会社 | 認証装置 |
Families Citing this family (9)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
US8666130B2 (en) * | 2010-03-08 | 2014-03-04 | Medical Image Mining Laboratories, Llc | Systems and methods for bio-image calibration |
US9135693B2 (en) | 2010-05-18 | 2015-09-15 | Skin Of Mine Dot Com, Llc | Image calibration and analysis |
US8837832B2 (en) * | 2010-05-18 | 2014-09-16 | Skin Of Mine Dot Com, Llc | Systems and methods for monitoring the condition of the skin |
JP2016053482A (ja) * | 2014-09-02 | 2016-04-14 | キヤノン株式会社 | 光音響波測定装置および光音響波測定方法 |
US10547610B1 (en) * | 2015-03-31 | 2020-01-28 | EMC IP Holding Company LLC | Age adapted biometric authentication |
US10674953B2 (en) * | 2016-04-20 | 2020-06-09 | Welch Allyn, Inc. | Skin feature imaging system |
AU2019208182B2 (en) | 2018-07-25 | 2021-04-08 | Konami Gaming, Inc. | Casino management system with a patron facial recognition system and methods of operating same |
US11521460B2 (en) | 2018-07-25 | 2022-12-06 | Konami Gaming, Inc. | Casino management system with a patron facial recognition system and methods of operating same |
US11074340B2 (en) | 2019-11-06 | 2021-07-27 | Capital One Services, Llc | Systems and methods for distorting CAPTCHA images with generative adversarial networks |
Citations (2)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
JP2005071118A (ja) * | 2003-08-26 | 2005-03-17 | Hitachi Ltd | 個人認証装置及び方法 |
JP2006107288A (ja) * | 2004-10-07 | 2006-04-20 | Toshiba Corp | 個人認証方法、装置及びプログラム |
Family Cites Families (4)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
AU2002952748A0 (en) * | 2002-11-19 | 2002-12-05 | Polartechnics Limited | A method for monitoring wounds |
JPWO2006088042A1 (ja) * | 2005-02-16 | 2008-07-03 | 松下電器産業株式会社 | 生体判別装置および認証装置ならびに生体判別方法 |
JP4702598B2 (ja) * | 2005-03-15 | 2011-06-15 | オムロン株式会社 | 監視システム、監視装置および方法、記録媒体、並びにプログラム |
US8131029B2 (en) * | 2005-09-20 | 2012-03-06 | Brightex Bio-Photonics Llc | Systems and methods for automatic skin-based identification of people using digital images |
-
2009
- 2009-02-06 WO PCT/JP2009/052023 patent/WO2009107470A1/ja active Application Filing
- 2009-02-06 JP JP2010500632A patent/JP5278424B2/ja active Active
- 2009-02-06 US US12/919,061 patent/US20110002511A1/en not_active Abandoned
Patent Citations (2)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
JP2005071118A (ja) * | 2003-08-26 | 2005-03-17 | Hitachi Ltd | 個人認証装置及び方法 |
JP2006107288A (ja) * | 2004-10-07 | 2006-04-20 | Toshiba Corp | 個人認証方法、装置及びプログラム |
Non-Patent Citations (1)
Title |
---|
TOMOKAZU KAWAHARA ET AL.: "Kao Hyomen no Bisho Tokucho ga Nasu Global Kozo o Mochiita Jinbutsu Ninsho", DAI 5 KAI SYSTEM INTEGRATION BUMON GAKUJUTSU KOENKAI (SI2004), 17 December 2004 (2004-12-17), pages 619 - 620 * |
Cited By (5)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
JP2016096987A (ja) * | 2014-11-20 | 2016-05-30 | 株式会社日立製作所 | 生体認証装置 |
CN109978810A (zh) * | 2017-12-26 | 2019-07-05 | 柴岗 | 痣的检测方法、系统、设备及存储介质 |
CN109978810B (zh) * | 2017-12-26 | 2024-03-12 | 南通罗伯特医疗科技有限公司 | 痣的检测方法、系统、设备及存储介质 |
WO2020158158A1 (ja) * | 2019-02-01 | 2020-08-06 | ミツミ電機株式会社 | 認証装置 |
JP2020126371A (ja) * | 2019-02-01 | 2020-08-20 | ミツミ電機株式会社 | 認証装置 |
Also Published As
Publication number | Publication date |
---|---|
US20110002511A1 (en) | 2011-01-06 |
JP5278424B2 (ja) | 2013-09-04 |
JPWO2009107470A1 (ja) | 2011-06-30 |
Legal Events
Date | Code | Title | Description |
---|---|---|---|
121 | Ep: the epo has been informed by wipo that ep was designated in this application |
Ref document number: 09713927 Country of ref document: EP Kind code of ref document: A1 |
|
WWE | Wipo information: entry into national phase |
Ref document number: 12919061 Country of ref document: US |
|
WWE | Wipo information: entry into national phase |
Ref document number: 2010500632 Country of ref document: JP |
|
NENP | Non-entry into the national phase |
Ref country code: DE |
|
122 | Ep: pct application non-entry in european phase |
Ref document number: 09713927 Country of ref document: EP Kind code of ref document: A1 |