WO2004097743A1 - False eye identification method and device, false eye identification program, iris authentication method, forged printed matter identification method, and image identification method - Google Patents
False eye identification method and device, false eye identification program, iris authentication method, forged printed matter identification method, and image identification method
- Publication number
- WO2004097743A1 (PCT/JP2004/006224)
- Authority
- WO
- WIPO (PCT)
- Prior art keywords
- image
- eye
- feature amount
- false
- image data
- Prior art date
Links
Classifications
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06T—IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
- G06T7/00—Image analysis
- G06T7/0002—Inspection of images, e.g. flaw detection
- G06T7/0012—Biomedical image inspection
Definitions
- The present invention belongs to a technology for identifying whether or not an eye image provided for iris authentication is a forged image (a false eye image), and also relates to applications of that technology. Background Art
- Patent Document 1 and Patent Document 2 disclose conventional systems for detecting forgery of an iris image. These systems discriminate between a living eye and a false eye by selectively turning on a plurality of near-infrared light sources and observing a biological response, namely a temporal change in pupil diameter and a change in the position of a bright spot generated near the pupil.
- Patent Document 1: Patent No. 3312303
- Patent Document 2: Patent No. 335648
- an object of the present invention is to provide a false eye identification method that can be realized with a simple configuration. Disclosure of the invention
- The present invention focuses on the fact that an image output from a printer has a feeling of roughness (graininess), and detects the presence or absence of this roughness in an image by image processing.
- Specifically, for example, a band limitation is applied to the image data, a predetermined feature amount is extracted from the band-limited image data, and the presence or absence of roughness is detected from that feature amount.
- The present invention also uses the false eye identification technology described above to identify whether a banknote or a security captured in an image is a forged print, and to identify whether or not an image is a photograph of a printed matter.
- According to the present invention, a false eye image and a living eye image can be distinguished from each other by image processing. Therefore, with a simple configuration, illegal spoofing using a false eye image can be eliminated, which is of great practical effect.
- FIG. 1 is a flowchart showing a processing flow of the false eye identification method according to the first embodiment of the present invention.
- FIG. 2 is a living eye image obtained by capturing a living eye.
- FIG. 3 is a false eye image obtained by capturing an eye image printed out on recycled paper.
- FIG. 4 is a false eye image obtained by capturing an eye image printed out on high-quality dedicated paper.
- FIG. 5 is a block diagram showing a configuration of the false eye identification device according to the first embodiment of the present invention.
- FIG. 6 shows the result of band-limiting the living eye image of FIG. 2.
- FIG. 7 shows the result of band-limiting the false eye image of FIG. 3.
- FIG. 8 shows the result of band-limiting the false eye image of FIG. 4.
- FIG. 9 is a graph showing the feature amount distributions of the living eye image and the false eye image, in which the variation around the center of gravity and the variance are used as the feature amounts.
- FIG. 10 is a graph showing the feature amount distributions of the living eye image and the false eye image, in which the cubic mean and the variance are used as the feature amounts.
- FIG. 11 is a graph showing the feature amount distributions of the living eye image and the false eye image, in which the skewness and the variance are used as the feature amounts.
- FIG. 12 is a graph showing the feature amount distributions of the living eye image and the false eye image, in which the quartic mean and the variance are used as the feature amounts.
- FIG. 13 is a graph showing the feature amount distributions of the living eye image and the false eye image, in which the kurtosis and the variance are used as the feature amounts.
- FIG. 14 is a graph showing the feature amount distributions of the living eye image and the false eye image, in which the cubic mean and the quartic mean are used as the feature amounts.
- FIG. 15 is a flowchart showing the flow of the process of the iris authentication method according to the second embodiment of the present invention.
- FIG. 16 is a diagram illustrating a configuration example of a vending machine that executes a method of identifying a forged printed material according to the third embodiment of the present invention.
- According to a first aspect of the present invention, there is provided a false eye identification method comprising: a step of receiving image data of an image including an eye; a step of detecting the presence or absence of roughness in the image from the image data by image processing; and a step of determining that the eye is a false eye when it is detected that the image has a feeling of roughness.
- According to a second aspect of the present invention, there is provided the false eye identification method of the first aspect, wherein the image processing includes a step of performing a band limitation on the image data and a step of extracting a predetermined feature amount from the band-limited image data, and the presence or absence of roughness is detected using the feature amount data.
- The present invention also provides the false eye identification method according to the second aspect, wherein the predetermined feature amount is one of, or a combination of two or more of, a moment of pixel values, a central moment, a skewness, and a kurtosis.
- The present invention also provides the false eye identification method according to the second aspect, wherein a pixel coordinate value is used together with the pixel value in extracting the predetermined feature amount.
- The present invention also provides the false eye identification method according to the second aspect, wherein a center position of the pupil or the iris is used in combination with the pixel value.
- The present invention also provides the false eye identification method according to the second aspect, wherein a high-pass filter or a band-pass filter is used for the band limitation.
- The present invention also provides the false eye identification method according to the second aspect, wherein the extraction of the predetermined feature amount is performed in the iris region or the pupil region or in the vicinity thereof.
- The present invention also provides the false eye identification method according to the second aspect, wherein the extraction of the predetermined feature amount is performed on or near a line passing through the pupil center or the iris center.
- The present invention also provides a false eye identification method in which the image processing includes a step of performing frequency analysis on the image data and a step of extracting a predetermined feature amount from the frequency-analyzed data.
- According to a tenth aspect of the present invention, there is provided a false eye identification method comprising: a step of receiving image data of an image including an eye; a step of performing band limitation on the image data; a step of extracting a predetermined feature amount from the band-limited image data; and a step of identifying whether the eye is a false eye or a living eye based on the extracted feature amount data.
- The present invention further provides the false eye identification method according to the tenth aspect, wherein distributions of the predetermined feature amount are prepared in advance for living eye images and false eye images, the distance of the extracted feature amount data from the feature amount distribution of the living eye images and the distance from the feature amount distribution of the false eye images are calculated, and the eye is identified as a living eye or a false eye according to which distribution gives the shorter calculated distance.
- The present invention also provides a false eye identification device comprising: an image input unit that receives image data of an image including an eye; a band limiting unit that performs band limitation on the image data input to the image input unit; a feature amount extracting unit that extracts a predetermined feature amount from the image data processed by the band limiting unit; and an identification unit that identifies whether the eye is a false eye or a living eye based on the feature amount data extracted by the feature amount extracting unit.
- The present invention also provides a false eye identification program that causes a computer to execute: performing a band limitation on image data of an image including an eye; extracting a predetermined feature amount from the band-limited image data; and identifying whether the eye is a false eye or a living eye based on the extracted feature amount data.
- The present invention also provides an image identification method comprising: a step of receiving image data; a step of detecting the presence or absence of roughness in the image from the image data by image processing; and a step of determining that the image is a photograph of a printed matter when it is detected that the image has a feeling of roughness.
- FIG. 1 is a flowchart showing a processing flow of the false eye identification method according to the first embodiment of the present invention.
- In step S1, image data of an image including an eye is input, and in step S2, band limitation is performed on the image data input in step S1.
- In step S3, a predetermined feature amount is extracted from the image data band-limited in step S2, and in step S4, whether the eye shown in the image is a false eye or a living eye is identified based on the feature amount extracted in step S3.
- The predetermined feature amount used here is one from which the presence or absence of a feeling of roughness in the image can be detected, and will be described in detail later. When it is detected from the extracted feature amount data that the image has a feeling of roughness, the eye shown in the image is determined to be a false eye.
- FIG. 5 is a block diagram showing the configuration of the false eye identification device according to the present embodiment.
- The false eye identification device 10 includes an image input unit 11 that executes step S1, a band limiting unit 12 that executes step S2, a feature amount extracting unit 13 that executes step S3, and an identification unit 14 that executes step S4.
- The image input unit 11 receives, for example, an iris image captured by the camera 5.
- FIG. 2 is a schematic diagram of an iris image (living eye image) taken by a camera having infrared sensitivity under near-infrared illumination.
- FIGS. 3 and 4 show iris images (false eye images) obtained by outputting the image of FIG. 2 to a printer and photographing the printer output again with a camera having infrared sensitivity under near-infrared illumination.
- A false eye image can be obtained by outputting the iris image of a legitimate user, that is, the living eye image of FIG. 2, to a printer and using the printout.
- However, it can be seen that the printer output, as shown in FIGS. 3 and 4, looks grainy in appearance, that is, the image has a so-called "feeling of roughness".
- In step S1, an image including an eye, more preferably an iris image, is input.
- the input of the image is usually performed by the camera on the spot at the time of identification, but the image captured by the camera may be transmitted and input via a network.
- In step S2, the image input in step S1 is band-limited by a band-pass filter or a high-pass filter.
- As the band-pass filter, a known filter such as a DOG (Difference of Gaussian) filter, represented by the difference between Gaussian filters having different characteristics, or a Laplacian of Gaussian filter can be used.
- As the high-pass filter, a known filter such as a Sobel filter can be used.
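- As an illustration of this band limitation step, a minimal Python sketch of a DOG band-pass filter is given below; the function name and the sigma values are assumptions for illustration and are not specified in the patent.

```python
import numpy as np
from scipy import ndimage

def dog_bandpass(image, sigma_fine=1.0, sigma_coarse=2.0):
    """Band-limit an image with a DOG (Difference of Gaussian) filter:
    the difference of two Gaussian-smoothed versions of the image acts
    as a band-pass filter.  The sigma values are illustrative only."""
    img = np.asarray(image, dtype=np.float64)
    return (ndimage.gaussian_filter(img, sigma_fine)
            - ndimage.gaussian_filter(img, sigma_coarse))

# FIGS. 6-8 visualise the squared filter output (its "power"), e.g.:
# power = dog_bandpass(iris_image) ** 2   # iris_image: 2-D grayscale array
```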
- FIGS. 6, 7, and 8 are examples of band limiting the iris images of FIGS. 2, 3, and 4 using a band-pass filter.
- In FIGS. 6 to 8, a brighter pixel indicates a larger power. The square of the filter output is shown so that the power distribution can be easily understood.
- When FIG. 6 is compared with FIG. 7 and FIG. 8, it is obvious that their characteristics differ greatly.
- In FIG. 6, in which the band limitation is applied to the living eye image, the power is concentrated near the pupil and the eyelids, and the shape of the eye stands out.
- In FIGS. 7 and 8, in which the band limitation is applied to the false eye images output from the printer, the power distribution is uniform outside the pupil, and the shape of the eye cannot be recognized at all. This uniform power distribution is considered to be due to the texture of the paper or a pattern specific to the toner.
- a predetermined feature amount to be used for identification is extracted from the image band-limited in step S2.
- As the predetermined feature amount, for example, the statistics shown in (Equation 1), (Equation 2), and (Equation 3) can be used, where I(x, y) is the pixel value of the band-limited image, N is the number of pixels from which the feature amount is extracted, and A is the target area from which the feature amount is extracted.
- Alternatively, the statistics shown in (Equation 4), (Equation 5), and (Equation 6) may be used.
- The feature amounts above are statistics related to the distribution of pixel values.
- Further, a feature amount such as (Equation 7), which takes the pixel coordinate values (x, y) into consideration, may be used.
- In (Equation 7), the square of the pixel coordinate value is multiplied by the square of the pixel value I(x, y) in order to evaluate the power of the band-limited image together with its position.
- Alternatively, a feature amount as shown in (Equation 8), which is obtained by normalizing (Equation 7) with the bandwidth of the band-limited image, may be used.
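- The bodies of (Equation 1) through (Equation 8) are not reproduced in this text. Assuming standard definitions of the statistics named above, plausible reconstructions are sketched below; in particular, the normalizer of (Equation 8) is described only as the bandwidth of the band-limited image, and the total power is used here merely as one reasonable stand-in.

```latex
% Plausible reconstructions of the referenced statistics (not the patent's own equation images)
\begin{align*}
m_k &= \frac{1}{N}\sum_{(x,y)\in A} I(x,y)^k, \quad k=2,3,4
      && \text{(Eqs. 1--3: mean square, cubic mean, quartic mean)}\\
\sigma^2 &= \frac{1}{N}\sum_{(x,y)\in A}\bigl(I(x,y)-\bar{I}\bigr)^2
      && \text{(Eq. 4: variance)}\\
\gamma_1 &= \frac{1}{N\sigma^{3}}\sum_{(x,y)\in A}\bigl(I(x,y)-\bar{I}\bigr)^3
      && \text{(Eq. 5: skewness)}\\
\gamma_2 &= \frac{1}{N\sigma^{4}}\sum_{(x,y)\in A}\bigl(I(x,y)-\bar{I}\bigr)^4
      && \text{(Eq. 6: kurtosis)}\\
v &= \sum_{(x,y)\in A}\bigl((x-x_g)^2+(y-y_g)^2\bigr)\,I(x,y)^2
      && \text{(Eq. 7: variation around the centroid)}\\
\tilde{v} &= v \Big/ \sum_{(x,y)\in A} I(x,y)^2
      && \text{(Eq. 8: normalized variation, one plausible normalizer)}
\end{align*}
```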
- The feature amount may of course be extracted from the entire image, but it may also be extracted from pixels sampled every few pixels, from a predetermined area, or from pixels on a predetermined line.
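- A minimal Python sketch of the feature extraction step S3, computing the statistics reconstructed above, is given below; the optional rectangular region argument, which restricts extraction to, e.g., the neighbourhood of the iris, is an illustrative assumption.

```python
import numpy as np

def extract_features(band_limited, region=None):
    """Compute roughness-related statistics of a band-limited image.

    band_limited: 2-D float array (e.g. the DOG filter output).
    region:       optional (y0, y1, x0, x1) rectangle restricting the
                  extraction area A, e.g. to the iris neighbourhood.
    """
    img = np.asarray(band_limited, dtype=np.float64)
    if region is not None:
        y0, y1, x0, x1 = region
        img = img[y0:y1, x0:x1]

    mean = img.mean()
    var = img.var()
    std = np.sqrt(var) + 1e-12             # guard against division by zero
    centred = img - mean

    power = img ** 2                        # squared filter output ("power")
    total_power = power.sum() + 1e-12
    ys, xs = np.indices(img.shape)
    y_g = (ys * power).sum() / total_power  # power-weighted centroid
    x_g = (xs * power).sum() / total_power
    variation = ((((xs - x_g) ** 2 + (ys - y_g) ** 2) * power).sum()
                 / total_power)             # ~ (Equations 7-8)

    return {
        "mean_square":  power.mean(),                       # ~ (Equation 1)
        "cubic_mean":   (img ** 3).mean(),                  # ~ (Equation 2)
        "quartic_mean": (img ** 4).mean(),                  # ~ (Equation 3)
        "variance":     var,                                # ~ (Equation 4)
        "skewness":     (centred ** 3).mean() / std ** 3,   # ~ (Equation 5)
        "kurtosis":     (centred ** 4).mean() / std ** 4,   # ~ (Equation 6)
        "centroid_variation": variation,
    }
```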
- In step S4, whether the image input in step S1 is a living eye image or a false eye image is identified from the feature amount data obtained in step S3. The identification here is performed, for example, as follows. First, the distribution of feature amounts extracted from a plurality of living eye images and the distribution of feature amounts extracted from a plurality of false eye images are prepared in advance. Then, for the feature amount data obtained in step S3, the distance from the feature amount distribution of the living eye images and the distance from the feature amount distribution of the false eye images are calculated. For this distance calculation, for example, the Mahalanobis distance shown in (Equation 9) is used. The input is judged to belong to the distribution giving the shorter distance, and according to this judgment it is determined whether the eye shown in the image is a living eye or a false eye.
- In (Equation 9), x is the feature vector extracted from the input image, μ is the mean of the feature amount distribution, Σ is its covariance matrix, and d is the Mahalanobis distance.
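- A minimal Python sketch of this identification step is given below; the Mahalanobis distance is the standard form, and the two reference feature distributions are assumed to have been gathered beforehand from known living eye images and false eye images.

```python
import numpy as np

def mahalanobis(x, mean, cov):
    """Mahalanobis distance d of feature vector x from a distribution
    with mean vector `mean` and covariance matrix `cov` (cf. Equation 9)."""
    diff = np.asarray(x, dtype=np.float64) - mean
    return float(np.sqrt(diff @ np.linalg.inv(cov) @ diff))

def identify(x, live_features, fake_features):
    """Step S4: assign the input feature vector to the nearer distribution.

    live_features, fake_features: arrays of shape (n_samples, n_features)
    extracted in advance from living eye and false eye images."""
    d_live = mahalanobis(x, live_features.mean(axis=0),
                         np.cov(live_features, rowvar=False))
    d_fake = mahalanobis(x, fake_features.mean(axis=0),
                         np.cov(fake_features, rowvar=False))
    return "living eye" if d_live < d_fake else "false eye"
```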
- FIGS. 9 to 14 are graphs showing the feature amount distributions of living eye images and false eye images.
- In FIG. 9, the vertical axis represents the variance (Equation 4) and the horizontal axis represents the variation around the center of gravity (Equation 8).
- In FIG. 10, the vertical axis represents the variance and the horizontal axis represents the cubic mean (Equation 2).
- In FIG. 11, the vertical axis represents the variance and the horizontal axis represents the skewness (Equation 5).
- In FIG. 12, the vertical axis represents the variance and the horizontal axis represents the quartic mean (Equation 3).
- In FIG. 13, the vertical axis represents the variance and the horizontal axis represents the kurtosis (Equation 6).
- In FIG. 14, the cubic mean and the quartic mean are taken as the two axes.
- From the distributions shown in FIGS. 9 to 14, it can be seen that a living eye and a false eye can be distinguished with a relatively small number of feature amounts, such as one or two. In particular, as can be seen from FIGS. 10 and 12, accurate discrimination is possible even if the cubic mean or the quartic mean is used alone.
- As described above, according to the present embodiment, a living eye image obtained by capturing a living eye and a false eye image obtained by capturing an eye image output by a printer can be distinguished by image processing.
- The filter used for the band limitation may be a single filter or a plurality of filters. That is, one or more types of feature amounts may be extracted from the output of a single filter to distinguish between a living eye image and a false eye image, or more feature amounts may be extracted using a plurality of filters having different frequency characteristics; in either case, the same or better effects as in the present embodiment can be obtained.
- the method of detecting the roughness is not limited to the method described in the present embodiment, and another method may be used.
- For example, a frequency analysis such as an FFT (Fast Fourier Transform) may be performed, and a predetermined feature amount may be extracted from the frequency-analyzed data to detect the feeling of roughness.
- Alternatively, a luminance frequency distribution (luminance histogram) may be generated, and the roughness may be detected from the shape of the luminance histogram.
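- As a sketch of such an alternative, one plausible FFT-based roughness measure is the fraction of spectral power in the high-frequency band; the radial cutoff used below is an assumption for illustration.

```python
import numpy as np

def high_frequency_power_ratio(image, cutoff=0.25):
    """Fraction of the spectral power lying above a radial frequency cutoff.

    A re-captured printout tends to carry extra high-frequency power from
    paper texture and toner dots, so a large ratio suggests roughness.
    The cutoff of 0.25 (in normalised frequency) is illustrative only."""
    img = np.asarray(image, dtype=np.float64)
    spectrum = np.abs(np.fft.fftshift(np.fft.fft2(img))) ** 2
    h, w = spectrum.shape
    yy, xx = np.indices((h, w))
    # normalised radial distance from the centre of the shifted spectrum
    r = np.sqrt(((yy - h / 2) / h) ** 2 + ((xx - w / 2) / w) ** 2)
    return spectrum[r > cutoff].sum() / spectrum.sum()
```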
- FIG. 15 is a flowchart showing the flow of the process of the iris authentication method according to the second embodiment of the present invention.
- In step S11, image data of an image including an eye is input, and in step S12, iris authentication is performed.
- the iris authentication here may be performed using a known method disclosed in, for example, Japanese Patent Application Laid-Open No. 8-504979, and a detailed description thereof will be omitted.
- For example, when it is determined by the above-described method that the input iris pattern matches a registered iris pattern, the authentication is judged OK (YES in S13), and the process proceeds to the next false eye identification step S14. Otherwise (NO in S13), the person being authenticated is refused passage through the gate or access to the information.
- In step S14, a living eye and a false eye are identified by the same method as described in the first embodiment. That is, the band limitation step S2, the feature amount extraction step S3, and the living-eye/false-eye identification step S4 of the flowchart of FIG. 1 are executed in step S14. When it is determined in step S14 that the eye is a living eye (YES in S15), access is permitted; when it is determined that the eye is a false eye (NO in S15), access is denied. When access is denied, for example, a notification may be sent to a predetermined place or organization such as a guard room or the police.
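- A minimal sketch of the overall flow of FIG. 15 is given below; match_iris and is_false_eye are hypothetical placeholders standing in for the iris authentication step S12 and the false eye identification step S14 of the first embodiment.

```python
def match_iris(eye_image, registered_template):
    """Placeholder for iris authentication (step S12); a real system would
    compare iris codes by a known method, as noted in the text."""
    raise NotImplementedError

def is_false_eye(eye_image):
    """Placeholder wrapping steps S2-S4 of the first embodiment
    (band limitation, feature extraction, living/false identification)."""
    raise NotImplementedError

def authenticate(eye_image, registered_template):
    """Second-embodiment flow: iris authentication first, then liveness."""
    if not match_iris(eye_image, registered_template):
        return "rejected"            # S13 NO: authentication failed
    if is_false_eye(eye_image):
        # S15 NO: deny access; optionally notify a guard room or the police
        return "access denied"
    return "access granted"          # S15 YES: living eye, access permitted
```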
- The centroids x_g and y_g in (Equation 7) and (Equation 8) may be replaced with the coordinate values of the pupil center or the iris center. As a result, the amount of calculation for false eye identification can be reduced.
- The target area for feature extraction may be limited to the iris area or its vicinity, to the pupil outer edge or the iris outer edge, or to a line passing through the center of the pupil or the iris (preferably a horizontal or vertical line) or its vicinity. As a result, the calculation time for false eye identification can be significantly reduced.
- the above process may be performed by obtaining information on the pupil region and the iris region in the process of false eye identification itself.
- Alternatively, the false eye identification may be performed prior to the iris authentication, and the iris authentication may be performed only when the subject is determined to be a living eye. In this case as well, a notification may be sent to a predetermined organization when a false eye is detected.
- the iris authentication process may be performed halfway, and false eye identification may be performed at a stage where the pupil region or the iris region is detected.
- Each step of the false eye identification method according to the present invention may be implemented in whole or in part using dedicated hardware, or may be implemented in software by a computer program. That is, the false eye identification method according to the present invention can be realized by an apparatus including a computer that executes a program implementing the method, or by recording such a program on a computer-readable recording medium and causing a computer to execute the program recorded on the recording medium. (Third embodiment)
- The false eye identification technology described above can be widely used for other purposes. For example, it can be used to identify whether banknotes or securities captured in images are genuine or are counterfeit printed matter output from a printer.
- FIG. 16 is a block diagram showing a schematic configuration of a vending machine which executes a counterfeit printed matter identification method according to the third embodiment of the present invention.
- The banknote take-in section 21 takes in a banknote inserted by the user.
- The first authenticity determination unit 22 determines whether or not the banknote taken in by the banknote take-in section 21 is genuine, by the same processing as the false eye identification method described in the first embodiment. That is, image data of the banknote is obtained, the presence or absence of roughness in the image is detected from this image data by image processing, and when roughness is detected, the banknote is determined to be a forged print output from a printer.
- The coin take-in section 23 takes in coins inserted by the user.
- The second authenticity determination unit 24 determines whether or not a coin taken in by the coin take-in section 23 is genuine by an existing counterfeit coin identification method.
- The processing unit 25 performs commodity transaction processing with the user in accordance with the amount of the banknotes and coins determined to be genuine by the first and second authenticity determination units 22 and 24. As a result, the user can receive the product of his or her choice.
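- The first authenticity determination unit 22 could reuse the same roughness test; a minimal sketch is given below, assuming the hypothetical dog_bandpass, extract_features, and mahalanobis helpers sketched in the first embodiment and reference feature distributions gathered from genuine and forged banknote images.

```python
import numpy as np

def is_forged_banknote(banknote_image, genuine_features, forged_features):
    """Decide whether a captured banknote image is a printer-output forgery
    by the nearer of the genuine / forged reference feature distributions."""
    feats = extract_features(dog_bandpass(banknote_image))
    x = np.array(list(feats.values()))
    d_genuine = mahalanobis(x, genuine_features.mean(axis=0),
                            np.cov(genuine_features, rowvar=False))
    d_forged = mahalanobis(x, forged_features.mean(axis=0),
                           np.cov(forged_features, rowvar=False))
    return d_forged < d_genuine
```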
- Each step of the forged printed matter identification method and the printed matter identification method described here may likewise be realized using dedicated hardware, or may be realized in software by a computer program. That is, these methods can be realized by an apparatus including a computer that executes a program implementing them, or by recording such a program on a computer-readable recording medium and causing a computer to execute the program recorded on the recording medium.
- In this way, counterfeit banknotes can be detected, and unauthorized intrusion using printed images can be prevented.
Landscapes
- Engineering & Computer Science (AREA)
- Quality & Reliability (AREA)
- General Health & Medical Sciences (AREA)
- Medical Informatics (AREA)
- Nuclear Medicine, Radiotherapy & Molecular Imaging (AREA)
- Radiology & Medical Imaging (AREA)
- Health & Medical Sciences (AREA)
- Computer Vision & Pattern Recognition (AREA)
- Physics & Mathematics (AREA)
- General Physics & Mathematics (AREA)
- Theoretical Computer Science (AREA)
- Collating Specific Patterns (AREA)
- Image Analysis (AREA)
Priority Applications (2)
Application Number | Priority Date | Filing Date | Title |
---|---|---|---|
EP04730091A EP1619624A4 (en) | 2003-04-28 | 2004-04-28 | ARTIFICIAL EYE DISTINCTION DEVICE AND METHOD, PROGRAM THEREOF, IRIS RECOGNITION METHOD, FALSE PRINT DISTINGUISHING METHOD, AND IMAGE DISTINCTION METHOD |
US10/529,316 US7660443B2 (en) | 2003-04-28 | 2004-04-28 | Artificial eye distinguishing method and device, artificial eye distinguishing program, iris recognition method, false printed matter distinguishing method, and image distinguishing method |
Applications Claiming Priority (2)
Application Number | Priority Date | Filing Date | Title |
---|---|---|---|
JP2003-123204 | 2003-04-28 | ||
JP2003123204 | 2003-04-28 |
Publications (1)
Publication Number | Publication Date |
---|---|
WO2004097743A1 (ja) | 2004-11-11 |
Family
ID=33410118
Family Applications (1)
Application Number | Title | Priority Date | Filing Date |
---|---|---|---|
PCT/JP2004/006224 WO2004097743A1 (ja) | 2003-04-28 | 2004-04-28 | 偽眼識別方法および装置、偽眼識別プログラム、虹彩認証方法、偽造印刷物識別方法、並びに画像識別方法 |
Country Status (4)
Country | Link |
---|---|
US (1) | US7660443B2 (ja) |
EP (1) | EP1619624A4 (ja) |
CN (1) | CN1698068A (ja) |
WO (1) | WO2004097743A1 (ja) |
Families Citing this family (7)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
GB0705223D0 (en) * | 2007-03-19 | 2007-04-25 | Univ Sussex | Method, apparatus and computer program for analysing medical image data |
US8229803B2 (en) * | 2007-10-16 | 2012-07-24 | Eb Associates Inc. | Systems and methods for tracking lumber in a sawmill |
US9827643B2 (en) | 2011-03-07 | 2017-11-28 | Weyerhaeuser Nr Company | Machine vision based sawmill audit system |
US20120328160A1 (en) * | 2011-06-27 | 2012-12-27 | Office of Research Cooperation Foundation of Yeungnam University | Method for detecting and recognizing objects of an image using haar-like features |
US9505072B2 (en) | 2012-02-03 | 2016-11-29 | Weyerhaeuser Nr Company | Systems and methods for auditing optimizers tracking lumber in a sawmill |
US9230383B2 (en) | 2012-12-28 | 2016-01-05 | Konica Minolta Laboratory U.S.A., Inc. | Document image compression method and its application in document authentication |
US9595038B1 (en) * | 2015-05-18 | 2017-03-14 | Amazon Technologies, Inc. | Inventory confirmation |
Family Cites Families (17)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
US4641349A (en) * | 1985-02-20 | 1987-02-03 | Leonard Flom | Iris recognition system |
US5291560A (en) * | 1991-07-15 | 1994-03-01 | Iri Scan Incorporated | Biometric personal identification system based on iris analysis |
US6363164B1 (en) * | 1996-05-13 | 2002-03-26 | Cummins-Allison Corp. | Automated document processing system using full image scanning |
JP3436293B2 (ja) * | 1996-07-25 | 2003-08-11 | 沖電気工業株式会社 | 動物の個体識別装置及び個体識別システム |
EP0953183B1 (en) * | 1997-01-17 | 2003-06-04 | BRITISH TELECOMMUNICATIONS public limited company | Security apparatus and method |
DE69811472T2 (de) * | 1997-09-16 | 2003-12-18 | Invisitech Corp | Personenidentifizierungssystem mit auswertung multipler parameter mit geringer kreuzkorrelation |
US6104812A (en) * | 1998-01-12 | 2000-08-15 | Juratrade, Limited | Anti-counterfeiting method and apparatus using digital screening |
JP3315648B2 (ja) * | 1998-07-17 | 2002-08-19 | 沖電気工業株式会社 | アイリスコード生成装置およびアイリス認識システム |
US6138185A (en) | 1998-10-29 | 2000-10-24 | Mcdata Corporation | High performance crossbar switch |
US6377699B1 (en) * | 1998-11-25 | 2002-04-23 | Iridian Technologies, Inc. | Iris imaging telephone security module and method |
JP2000185031A (ja) | 1998-12-22 | 2000-07-04 | Oki Electric Ind Co Ltd | 個体識別装置 |
US6332193B1 (en) * | 1999-01-18 | 2001-12-18 | Sensar, Inc. | Method and apparatus for securely transmitting and authenticating biometric data over a network |
US6247813B1 (en) * | 1999-04-09 | 2001-06-19 | Iritech, Inc. | Iris identification system and method of identifying a person through iris recognition |
JP2001034754A (ja) * | 1999-07-19 | 2001-02-09 | Sony Corp | 虹彩認証装置 |
US6920236B2 (en) * | 2001-03-26 | 2005-07-19 | Mikos, Ltd. | Dual band biometric identification system |
US7054461B2 (en) * | 2002-02-15 | 2006-05-30 | Pitney Bowes Inc. | Authenticating printed objects using digital watermarks associated with multidimensional quality metrics |
EP1481347A4 (en) * | 2002-02-19 | 2009-08-26 | Digimarc Corp | SECURITY SYSTEMS USING DRIVING LICENSES AND OTHER DOCUMENTS |
-
2004
- 2004-04-28 EP EP04730091A patent/EP1619624A4/en not_active Withdrawn
- 2004-04-28 US US10/529,316 patent/US7660443B2/en active Active
- 2004-04-28 CN CN200480000461.3A patent/CN1698068A/zh active Pending
- 2004-04-28 WO PCT/JP2004/006224 patent/WO2004097743A1/ja active Application Filing
Patent Citations (8)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
JPH04282776A (ja) * | 1991-03-12 | 1992-10-07 | Omron Corp | パターン照合装置 |
JPH0662233A (ja) * | 1992-08-04 | 1994-03-04 | Ricoh Co Ltd | 特殊原稿判定機能付き複写機 |
JPH0662241A (ja) * | 1992-08-13 | 1994-03-04 | Ricoh Co Ltd | 画像形成装置 |
JPH07121722A (ja) * | 1993-06-23 | 1995-05-12 | Toshiba Corp | 画像評価装置 |
JPH11244261A (ja) * | 1998-03-05 | 1999-09-14 | Oki Electric Ind Co Ltd | アイリス認識方法及び装置、データ変換方法及び装置 |
JP2000076514A (ja) * | 1998-08-31 | 2000-03-14 | Oji Paper Co Ltd | 偽札識別方法および装置 |
JP2000298727A (ja) * | 1999-03-23 | 2000-10-24 | Lg Electronics Inc | 虹彩認識システムの偽造判別方法 |
JP2003030659A (ja) * | 2001-07-16 | 2003-01-31 | Matsushita Electric Ind Co Ltd | 虹彩認証装置及び虹彩撮像装置 |
Also Published As
Publication number | Publication date |
---|---|
US7660443B2 (en) | 2010-02-09 |
CN1698068A (zh) | 2005-11-16 |
US20050286747A1 (en) | 2005-12-29 |
EP1619624A1 (en) | 2006-01-25 |
EP1619624A4 (en) | 2010-06-23 |
Similar Documents
Publication | Publication Date | Title |
---|---|---|
KR102387571B1 (ko) | 라이브니스 검사 방법 및 장치 | |
US9355299B2 (en) | Fraud resistant biometric financial transaction system and method | |
JP4686153B2 (ja) | 情報処理装置、不正者検出方法および現金自動預け払い装置 | |
Jain | Biometric recognition: how do I know who you are? | |
JP5076563B2 (ja) | 顔照合装置 | |
KR20190089387A (ko) | 라이브니스 검사 방법 및 장치 | |
EP1416425A1 (en) | System and method for detecting a face | |
WO2018129687A1 (zh) | 指纹防伪方法和设备 | |
US20200349372A1 (en) | Method and apparatus with liveness detection | |
KR102038576B1 (ko) | 홍채 인식 시스템의 부정행위 검출 방법 | |
JP2002236666A (ja) | 個人認証装置 | |
KR100825689B1 (ko) | 얼굴 위장 판별 방법 | |
CN108491768A (zh) | 角膜反射人脸认证抗欺诈攻击方法、人脸特征认证系统 | |
WO2004097743A1 (ja) | 偽眼識別方法および装置、偽眼識別プログラム、虹彩認証方法、偽造印刷物識別方法、並びに画像識別方法 | |
Aishwarya et al. | Palm print recognition using liveness detection technique | |
JP3598114B1 (ja) | 偽眼識別方法および装置、偽眼識別プログラム並びに虹彩認証方法 | |
WO2022059151A1 (ja) | 顔認証方法、顔認証プログラム、および顔認証装置 | |
Park et al. | Iris recognition against counterfeit attack using gradient based fusion of multi-spectral images | |
CN115775409A (zh) | 一种人脸图像防篡改融合检测方法 | |
JP2010244570A (ja) | 情報処理装置、不正者検出方法および現金自動預け払い装置 | |
Thukral et al. | IRIS spoofing through print attack using SVM classification with gabor and HOG features | |
JP2004355652A (ja) | 偽造印刷物識別方法および装置、並びに画像識別方法および装置 | |
JP2005084979A (ja) | 顔認証システムおよび方法並びにプログラム | |
Priyanka et al. | Genuine selfie detection algorithm for social media using image quality measures | |
WO2024042674A1 (ja) | 情報処理装置、認証方法および記憶媒体 |
Legal Events
Date | Code | Title | Description |
---|---|---|---|
AK | Designated states |
Kind code of ref document: A1 Designated state(s): AE AG AL AM AT AU AZ BA BB BG BR BW BY BZ CA CH CN CO CR CU CZ DE DK DM DZ EC EE EG ES FI GB GD GE GH GM HR HU ID IL IN IS KE KG KP KR KZ LC LK LR LS LT LU LV MA MD MG MK MN MW MX MZ NA NI NO NZ OM PG PH PL PT RO RU SC SD SE SG SK SL SY TJ TM TN TR TT TZ UA UG US UZ VC VN YU ZA ZM ZW |
|
AL | Designated countries for regional patents |
Kind code of ref document: A1 Designated state(s): BW GH GM KE LS MW MZ NA SD SL SZ TZ UG ZM ZW AM AZ BY KG KZ MD RU TJ TM AT BE BG CH CY CZ DE DK EE ES FI FR GB GR HU IE IT LU MC NL PL PT RO SE SI SK TR BF BJ CF CG CI CM GA GN GQ GW ML MR NE SN TD TG |
|
121 | Ep: the epo has been informed by wipo that ep was designated in this application | ||
WWE | Wipo information: entry into national phase |
Ref document number: 20048004613 Country of ref document: CN |
|
WWE | Wipo information: entry into national phase |
Ref document number: 2004730091 Country of ref document: EP |
|
WWE | Wipo information: entry into national phase |
Ref document number: 10529316 Country of ref document: US |
|
WWP | Wipo information: published in national office |
Ref document number: 2004730091 Country of ref document: EP |