WO2010050206A1 - Impersonation detection system, impersonation detection method, and impersonation detection program - Google Patents

Impersonation detection system, impersonation detection method, and impersonation detection program

Info

Publication number
WO2010050206A1
Authority
WO
WIPO (PCT)
Prior art keywords
feature point
image
feature
coordinates
impersonation
Prior art date
Application number
PCT/JP2009/005709
Other languages
English (en)
Japanese (ja)
Inventor
鈴木哲明
Original Assignee
日本電気株式会社 (NEC Corporation)
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by NEC Corporation (日本電気株式会社)
Priority to JP2010535673A priority Critical patent/JP5445460B2/ja
Priority to CN200980142953.9A priority patent/CN102197412B/zh
Priority to US13/126,339 priority patent/US8860795B2/en
Publication of WO2010050206A1 publication Critical patent/WO2010050206A1/fr

Classifications

    • G PHYSICS
    • G06 COMPUTING; CALCULATING OR COUNTING
    • G06V IMAGE OR VIDEO RECOGNITION OR UNDERSTANDING
    • G06V40/00 Recognition of biometric, human-related or animal-related patterns in image or video data
    • G06V40/10 Human or animal bodies, e.g. vehicle occupants or pedestrians; Body parts, e.g. hands
    • G06V40/16 Human faces, e.g. facial parts, sketches or expressions
    • G PHYSICS
    • G06 COMPUTING; CALCULATING OR COUNTING
    • G06V IMAGE OR VIDEO RECOGNITION OR UNDERSTANDING
    • G06V40/00 Recognition of biometric, human-related or animal-related patterns in image or video data
    • G06V40/40 Spoof detection, e.g. liveness detection
    • A HUMAN NECESSITIES
    • A63 SPORTS; GAMES; AMUSEMENTS
    • A63F CARD, BOARD, OR ROULETTE GAMES; INDOOR GAMES USING SMALL MOVING PLAYING BODIES; VIDEO GAMES; GAMES NOT OTHERWISE PROVIDED FOR
    • A63F13/00 Video games, i.e. games using an electronically generated display having two or more dimensions
    • A63F13/20 Input arrangements for video game devices
    • A63F13/21 Input arrangements for video game devices characterised by their sensors, purposes or types
    • A63F13/213 Input arrangements for video game devices characterised by their sensors, purposes or types comprising photodetecting means, e.g. cameras, photodiodes or infrared cells
    • A HUMAN NECESSITIES
    • A63 SPORTS; GAMES; AMUSEMENTS
    • A63K RACING; RIDING SPORTS; EQUIPMENT OR ACCESSORIES THEREFOR
    • A63K1/00 Race-courses; Race-tracks
    • G PHYSICS
    • G06 COMPUTING; CALCULATING OR COUNTING
    • G06T IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
    • G06T11/00 2D [Two Dimensional] image generation
    • G06T11/003 Reconstruction from projections, e.g. tomography
    • G06T11/008 Specific post-processing after tomographic reconstruction, e.g. voxelisation, metal artifact correction
    • G PHYSICS
    • G06 COMPUTING; CALCULATING OR COUNTING
    • G06T IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
    • G06T3/00 Geometric image transformations in the plane of the image
    • G06T3/06 Topological mapping of higher dimensional structures onto lower dimensional surfaces

Definitions

  • The present invention relates to an impersonation detection system, an impersonation detection method, and an impersonation detection program capable of detecting, when authenticating an individual, an attempt to pass the authentication using a photograph of a registered person or a face image displayed on a monitor.
  • An impersonation detection system refers to a system that, when identity authentication is performed using a face or head, detects attempts to impersonate the registered person using, for example, a photograph of that person or a face image displayed on a monitor.
  • One example of an impersonation detection system and impersonation detection method is described in Patent Document 1.
  • Paragraph [0015] of the same document describes that the illumination environment is changed using an illuminating lamp, and impersonation is excluded based on the similarity of face images under the different illumination environments.
  • Paragraph [0061] of the same document describes that images in which the face orientation is varied are acquired in advance, and the acquired face images from specific directions are registered as an authentication dictionary.
  • In Patent Document 2, three-dimensional information of a user is generated using a plurality of user images and their respective imaging angles, and impersonation by a photograph is eliminated by collating the generated information with the three-dimensional shape of the person's face stored in advance.
  • Patent Document 3, Non-Patent Document 1, Non-Patent Document 2, and Non-Patent Document 3 will be described later.
  • Kanazawa and Kanatani, "Image Feature Point Extraction for Computer Vision", Journal of IEICE, Vol. 87, No. 12, 2004
  • T. K. Leung, M. C. Burl, and P. Perona, "Finding Faces in Cluttered Scenes using Random Labeled Graph Matching", Fifth International Conference on Computer Vision, pp. 637-644, 1995
  • R. Sukthankar, R. G. Stockton, and M. D. Mullin, "Smarter Presentations: Exploiting Homography in Camera-Projector Systems", Proceedings of International Conference on Computer Vision, Vol. 1, pp. 247-253, July 2001
  • In the conventional techniques described above, an additional device other than the image capturing device used for authentication is required.
  • Controllable external lighting is required to create different lighting environments.
  • Dedicated distance detection devices are required for distance measurement.
  • Complicated processing devices are required to acquire the three-dimensional information of the person.
  • There is also the problem that impersonation with a photograph or the like is erroneously accepted due to a change in the lighting environment at the time of authentication.
  • That is, when impersonation is determined using variations in the luminance of the subject, it is easy to increase those variations by changing the lighting environment during authentication; as a result, the probability of accepting a photograph as the real thing increases.
  • The present invention has been made in view of the above circumstances, and an object thereof is to provide an impersonation detection system, an impersonation detection method, and an impersonation detection program that can detect impersonation with high accuracy based on captured images alone, without using any additional apparatus other than the image capturing apparatus. Another object of the present invention is to provide an impersonation determination system that is robust against changes in the lighting environment.
  • According to the present invention, there is provided an impersonation detection system comprising: an imaging unit that photographs an inspection target object from a first angle to obtain a first image and photographs the inspection target object from a second angle different from the first angle to obtain a second image; a calculation unit that detects a first feature point from the first image, obtains first feature point coordinates representing the position of the detected feature point, detects a second feature point from the second image, and obtains second feature point coordinates representing the position of the detected feature point; a feature point association unit that associates the first feature point with the second feature point; a feature conversion unit that applies to the second feature point coordinates a planar projective transformation from the second image to the first image to obtain converted coordinates; and a similarity determination unit that determines that impersonation has been attempted when the error between the converted coordinates and the first feature point coordinates is equal to or less than a certain value.
  • According to the present invention, there is also provided an impersonation detection method including: a step of photographing the inspection target object from the first angle to obtain the first image; a step of calculating the first feature point coordinates from the first image; a step of photographing the inspection target object from the second angle to obtain the second image; a step of calculating the second feature point coordinates from the second image; a feature point association step of associating the first feature point coordinates with the second feature point coordinates; a feature conversion step of applying to the second feature point coordinates a planar projective transformation from the second image to the first image to obtain converted coordinates; and a similarity determination step of determining that impersonation has been attempted when the error between the converted coordinates and the first feature point coordinates is equal to or less than a certain value.
  • According to the present invention, there is further provided an impersonation detection program that causes a computer to obtain the first feature point coordinates, which are the positions of the first feature points, from the first image of the inspection target object photographed from the first angle, to obtain the second feature point coordinates from the second image photographed from a second angle different from the first angle, and to execute the association, conversion, and similarity determination described above.
  • In this manner, an impersonation detection system, an impersonation detection method, and an impersonation detection program can be provided.
  • Since the determination is performed based only on the image information of the inspection target object, it is possible to provide an impersonation detection system, an impersonation detection method, and an impersonation detection program without requiring an additional device other than the imaging device.
  • Since the determination is based on variations in the positions of the feature points rather than variations in the brightness of the inspection target object, it is possible to provide an impersonation detection system, an impersonation detection method, and an impersonation detection program that are robust against variations in the lighting environment.
  • FIG. 1 is a diagram illustrating a configuration example of an impersonation detection system.
  • As shown in FIG. 1, the impersonation detection system comprises: an imaging unit 2 that photographs the inspection target object from the first angle to obtain the first image and photographs the inspection target object from the second angle different from the first angle to obtain the second image; a feature point coordinate calculation unit 101 that detects the first feature point from the first image, obtains the first feature point coordinates representing the position of the detected feature point, detects the second feature point from the second image, and obtains the second feature point coordinates representing the position of the detected feature point; a feature point association unit 102 that associates the first feature point with the second feature point; a feature conversion unit 104 that applies to the second feature point coordinates a planar projective transformation from the second image to the first image to obtain converted coordinates; and a similarity determination unit 105 that determines that impersonation has been attempted when the error between the converted coordinates and the first feature point coordinates is equal to or less than a certain value.
  • When the transformation matrix is calculated in advance, the plane transformation matrix estimation unit 103 need not perform any processing.
  • Otherwise, a transformation matrix is calculated by the plane transformation matrix estimation unit 103 each time, as described later, and the similarity determination is then performed.
  • The impersonation detection system includes a data processing device 1 that operates under program control and an imaging unit 2 that captures the inspection target object.
  • The data processing device 1 includes the feature point coordinate calculation unit 101, the feature point association unit 102, the plane transformation matrix estimation unit 103, the feature conversion unit 104, and the similarity determination unit 105.
  • The imaging unit 2 is a digital still camera, a digital video camera, a CCD camera module, or the like, and has a function of photographing the inspection target object and a function of outputting the captured image data to the data processing device 1.
  • The feature point coordinate calculation unit 101 obtains, from the image of the inspection target object captured by the imaging unit 2 at the first angle (hereinafter referred to as the first image) and the image captured at the second angle (hereinafter referred to as the second image), the first feature point coordinates representing the positions of the feature points of the first image and the second feature point coordinates representing the positions of the feature points of the second image.
  • Here, a feature point means a point that looks the same whenever the part is viewed from a specific angle, such as the position of an eye, the edge of the mouth, a mole, the tip of the nose, the highest part of the cheekbone, or a part of the beard.
  • It may also be an area where a specific pattern exists, such as a hairline or a corner of an eyeglass frame.
  • Feature points are detected, for example, by a method that extracts points with large changes in shading on the image, such as the KLT method (Non-Patent Document 1), or by a method in which a specific pattern to be detected is registered in advance and the corresponding part is detected using template matching (Non-Patent Document 2).
  • Template matching refers to a method in which a specific pattern (template image) registered in advance is compared with a captured image, and an image region similar to the specific pattern is searched for in the captured image.
  • Specific patterns representing various feature points, such as facial features and the background, are prepared in advance; the first feature points are extracted by comparing the specific patterns with the first image, and the second feature points are extracted by comparing them with the second image.
  • The degree of correlation between a specific pattern and the image area determined to correspond to it is called "reliability"; the higher the reliability, the more similar the specific pattern and the corresponding image area can be judged to be.
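  • As an illustration of the template matching just described, the following sketch slides a registered pattern over a captured image and returns the best-matching position together with its correlation score, i.e. the "reliability" above. This is a minimal sketch assuming grayscale images as numpy arrays; the function and variable names are illustrative, not taken from the patent.

```python
import numpy as np

def match_template(image: np.ndarray, template: np.ndarray):
    """Slide `template` over `image`; return ((x, y), reliability).

    Reliability is the normalized cross-correlation at the best match,
    in [-1, 1]: the higher it is, the more similar the specific pattern
    and the corresponding image area."""
    ih, iw = image.shape
    th, tw = template.shape
    t = template.astype(np.float64)
    t = t - t.mean()
    t_norm = np.linalg.norm(t)
    best_score, best_pos = -1.0, (0, 0)
    for y in range(ih - th + 1):
        for x in range(iw - tw + 1):
            patch = image[y:y + th, x:x + tw].astype(np.float64)
            p = patch - patch.mean()
            denom = np.linalg.norm(p) * t_norm
            if denom == 0:
                continue  # flat region or flat template: no correlation
            score = float((p * t).sum() / denom)
            if score > best_score:
                best_score, best_pos = score, (x, y)
    return best_pos, best_score
```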
  • In the above description, the feature points are described as positions on the face; however, the obtained feature points may also be feature points obtained from the background, or from an object concealing part of the face or head.
  • This is because, in the case of a photograph, the feature points of the background and of the concealing object move in a plane together with the face, whereas in the case of the real thing a change different from the change in the angle of the face can be expected.
  • The feature point association unit 102 associates the first feature point with the second feature point. That is, the second feature point coordinates calculated by the feature point coordinate calculation unit 101 are associated with the first feature point coordinates calculated by the same unit.
  • The correspondence between the first feature point and the second feature point associated by the feature point association unit 102 is obtained based on the luminance patterns around the first feature point and the second feature point.
  • That is, feature points whose surroundings have similar luminance patterns, frequencies, edge components, and the like are associated between the first feature points and the second feature points.
  • Alternatively, feature points detected using template matching are associated. This association is performed, for example, by calculating the degree of correlation between the image area representing the first feature point and the image area representing the second feature point, and extracting the pairs (a first and a second feature point forming a pair) with a high degree of correlation.
  • Hereinafter, a set of a first feature point and a second feature point associated in this way is referred to as a corresponding point.
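  • The association by surrounding luminance patterns can be sketched as follows: correlate the patch around each first feature point with the patches around the candidate second feature points, and keep the best-scoring pair when the correlation is high enough. A minimal sketch under the same grayscale-array assumption; the patch radius and score floor are illustrative values, and feature points are assumed to lie far enough from the image border.

```python
import numpy as np

def patch(img, pt, r=8):
    """Luminance pattern around a feature point: a (2r+1) x (2r+1) patch."""
    x, y = pt
    return img[y - r:y + r + 1, x - r:x + r + 1].astype(np.float64)

def ncc(a, b):
    """Normalized cross-correlation between two equal-sized patches."""
    a = a - a.mean()
    b = b - b.mean()
    d = np.linalg.norm(a) * np.linalg.norm(b)
    return float((a * b).sum() / d) if d else -1.0

def associate(img1, pts1, img2, pts2, min_score=0.7):
    """Return corresponding points: (p1, p2) pairs with similar patches."""
    pairs = []
    for p1 in pts1:
        scores = [ncc(patch(img1, p1), patch(img2, p2)) for p2 in pts2]
        j = int(np.argmax(scores))
        if scores[j] >= min_score:
            pairs.append((p1, pts2[j]))
    return pairs
```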
  • The plane transformation matrix estimation unit 103 obtains a transformation matrix for planarly transforming the second image into the first image, using the coordinates of some of the feature points associated by the feature point association unit 102.
  • The method of Non-Patent Document 3 may be used to calculate the transformation matrix for the planar transformation.
  • Non-Patent Document 3 describes a method for calculating a transformation matrix called a homography matrix.
  • The homography matrix is a coordinate transformation matrix between two cameras that capture the same point on the inspection target object. Let (x, y) be the coordinates of that point on the first image obtained by camera 1, and (X, Y) be the coordinates of the same point on the second image obtained by camera 2.
  • The homography matrix H can be expressed by the following Formula 1, as a 3 × 3 matrix:

        H = | h11 h12 h13 |
            | h21 h22 h23 |
            | h31 h32 h33 |

  • When the coordinates of n (n ≥ 4) points on the object surface are (x_i, y_i) on the first camera and (X_i, Y_i) on the second camera, the matrix A is defined as follows, with two rows for each corresponding point i = 1, ..., n:

        A = | X_1 Y_1 1 0   0   0 -x_1*X_1 -x_1*Y_1 -x_1 |
            | 0   0   0 X_1 Y_1 1 -y_1*X_1 -y_1*Y_1 -y_1 |
            |                     ...                     |
            | X_n Y_n 1 0   0   0 -x_n*X_n -x_n*Y_n -x_n |
            | 0   0   0 X_n Y_n 1 -y_n*X_n -y_n*Y_n -y_n |

  • By finding the vector h = (h11, h12, ..., h33)^T with ||h|| = 1 that minimizes ||Ah||, namely the eigenvector of A^T A associated with the smallest eigenvalue, an optimal homography matrix H that defines the projection between the two images is obtained. Here, the symbol T means transposition of the matrix.
  • Using this homography matrix H as a transformation matrix, (xw, yw, w) can be obtained by the following projective transformation: (xw, yw, w)^T = H (X, Y, 1)^T. The converted coordinates on the first image are then (xw/w, yw/w).
  • The second feature point coordinates can be converted onto the first image by the plane transformation matrix estimated in this way, and the converted coordinates can be calculated. Note that when calculating the transformation matrix, it may be obtained using only some corresponding points that are highly likely to be correctly matched.
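  • The estimation just described can be written down directly: build the 2n × 9 matrix A from n ≥ 4 corresponding points, take the right singular vector of A for the smallest singular value (equivalently, the eigenvector of A^T A with the smallest eigenvalue), and reshape it into H. A minimal numpy sketch of this standard least-squares construction, not the patent's verbatim implementation; function names are illustrative.

```python
import numpy as np

def estimate_homography(pts2, pts1):
    """DLT: H maps points (X, Y) of the second image to (x, y) of the first.

    pts2, pts1: arrays of shape (n, 2), n >= 4, corresponding points."""
    rows = []
    for (X, Y), (x, y) in zip(pts2, pts1):
        rows.append([X, Y, 1, 0, 0, 0, -x * X, -x * Y, -x])
        rows.append([0, 0, 0, X, Y, 1, -y * X, -y * Y, -y])
    A = np.asarray(rows, dtype=np.float64)
    # The h minimizing ||A h|| with ||h|| = 1 is the right singular
    # vector of A associated with the smallest singular value.
    _, _, vt = np.linalg.svd(A)
    return vt[-1].reshape(3, 3)

def project(H, pts2):
    """Converted coordinates: (xw, yw, w)^T = H (X, Y, 1)^T, then divide by w."""
    q = np.hstack([pts2, np.ones((len(pts2), 1))]) @ H.T
    return q[:, :2] / q[:, 2:3]
```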
  • The plane transformation matrix estimation unit 103 may use the first feature points and the second feature points associated by the feature point association unit 102 to obtain a transformation matrix that minimizes the error (difference) between the converted coordinates of the second feature points and the corresponding first feature point coordinates. A concrete example follows.
  • For example, suppose that eight pairs of feature points have been associated. First, a provisional homography matrix, which serves as a first provisional transformation matrix, is computed using four of the eight pairs.
  • With this matrix, the second feature point coordinates of those four pairs are projected onto the first image to obtain four converted coordinates; then the error (difference) between these four converted coordinates and the corresponding first feature point coordinates is calculated.
  • Next, a provisional homography matrix, which serves as a second provisional transformation matrix, is calculated using the remaining four pairs.
  • With this second matrix, the second feature point coordinates of those four pairs are likewise projected onto the first image to obtain four converted coordinates, and the error (difference) from the corresponding first feature point coordinates is calculated.
  • Finally, the provisional transformation matrix with the smaller error (difference) is adopted as the formal transformation matrix (homography matrix).
  • The above contents are generally expressed as follows. Assume that there are N correspondences (pairs) between the first feature points and the second feature points. First to M-th groups G_1 to G_M, each including four pairs, are generated from the N pairs. Note that the pairs included in the groups G_1 to G_M may overlap each other.
  • Provisional homography matrices PH_1 to PH_M are calculated for the groups G_1 to G_M, respectively.
  • Converted coordinates are then calculated for each of the groups G_1 to G_M, and the error between the converted coordinates and the coordinates representing the corresponding first feature points (for example, the norm in Euclidean space or the Manhattan distance) is computed. The provisional transformation matrix with the smallest sum of errors is adopted as the transformation matrix (homography matrix).
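  • This group-based selection reduces to sampling M groups of four pairs, fitting a provisional homography to each, and keeping the one whose projections of the second feature points land closest to the first feature points. A sketch of that loop, with the DLT estimator repeated so the block runs on its own; the number of groups and the random sampling are illustrative assumptions.

```python
import numpy as np

def estimate_homography(pts2, pts1):
    """Least-squares homography (DLT) from corresponding points."""
    rows = []
    for (X, Y), (x, y) in zip(pts2, pts1):
        rows += [[X, Y, 1, 0, 0, 0, -x * X, -x * Y, -x],
                 [0, 0, 0, X, Y, 1, -y * X, -y * Y, -y]]
    _, _, vt = np.linalg.svd(np.asarray(rows, dtype=np.float64))
    return vt[-1].reshape(3, 3)

def project(H, pts):
    q = np.hstack([pts, np.ones((len(pts), 1))]) @ H.T
    return q[:, :2] / q[:, 2:3]

def best_homography(pts1, pts2, m_groups=20, seed=0):
    """Fit provisional matrices PH_1..PH_M to groups of four pairs and
    keep the one with the smallest sum of reprojection errors
    (Euclidean norm) over all N correspondences.

    pts1, pts2: (n, 2) numpy arrays of corresponding points."""
    rng = np.random.default_rng(seed)
    n = len(pts1)
    best_h, best_err = None, np.inf
    for _ in range(m_groups):
        idx = rng.choice(n, size=4, replace=False)  # groups may overlap
        ph = estimate_homography(pts2[idx], pts1[idx])
        err = np.linalg.norm(project(ph, pts2) - pts1, axis=1).sum()
        if err < best_err:
            best_h, best_err = ph, err
    return best_h, best_err
```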
  • The feature conversion unit 104 uses the transformation matrix calculated by the plane transformation matrix estimation unit 103 to apply the planar projective transformation to the second feature point coordinates, thereby obtaining their projected (converted) coordinates on the first image.
  • The similarity determination unit 105 obtains the error (difference) between the projective transformation coordinates of the second image on the first image, obtained by the feature conversion unit 104, and the first feature point coordinates on the first image. If this error (difference) is equal to or less than a certain value, it is determined that the inspection target object in the first image and the second image has a planar correspondence, and hence that impersonation with a photograph is being attempted.
  • Conversely, low similarity means that the inspection target object is close to the real thing; that is, the error (difference) between the projective transformation coordinates of the second image on the first image and the first feature point coordinates on the first image is large.
  • If the subject appearing in the first image and the second image keeps deforming only in a planar manner for a certain period of time, it may be determined that the subject is a photograph.
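  • The determination itself is then a threshold test on the residual: project the second feature points onto the first image with the estimated matrix and compare the mean error against a fixed value. A minimal sketch; the threshold is an assumed placeholder, not a value taken from the patent.

```python
import numpy as np

def is_photo_spoof(H, pts1, pts2, threshold=3.0):
    """True if the two views are related by a planar projection,
    i.e. the inspection target object behaves like a photograph.

    threshold: maximum mean reprojection error in pixels (assumed)."""
    q = np.hstack([pts2, np.ones((len(pts2), 1))]) @ H.T
    projected = q[:, :2] / q[:, 2:3]
    mean_err = float(np.linalg.norm(projected - pts1, axis=1).mean())
    # Small error -> planar correspondence -> impersonation attempt.
    return mean_err <= threshold
```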
  • The spoofing detection method shown in FIG. 2 includes step A01, in which the first image is obtained by photographing the inspection target object from the first angle, followed by the steps described below, in which the first feature points are detected from the first image and the coordinates of the detected feature points are obtained.
  • If the transformation matrix used in step A06 described later is calculated in advance, the plane transformation matrix estimation step A06 in FIG. 2 can be omitted.
  • Otherwise, the transformation matrix is calculated in estimation step A06 each time, after which the similarity calculation step A08 and the similarity determinations A09 to A11 are performed.
  • Step A01: First, an image of the subject at the first angle is captured by the imaging unit 2.
  • Step A02: Next, in the first feature point coordinate calculation step A02, which receives the first image, the first coordinates, which are the positions of the first feature points, are calculated from the first image.
  • Here, a feature point means a point that looks the same whenever the part is viewed from a specific angle, such as the position of an eye, the edge of the mouth, a mole, the tip of the nose, the highest part of the cheekbone, or a part of the beard.
  • It may also be an area where a specific pattern exists, such as a hairline or a corner of an eyeglass frame.
  • Here too, the feature points are described as positions on the face; however, the obtained feature points may also be feature points obtained from the background, or from an object concealing part of the face or head.
  • This is because, in the case of a photograph, the feature points of the background and of the concealing object move in a plane together with the face, whereas in the case of the real thing a change different from the change in the angle of the face can be expected.
  • Feature points may be obtained, for example, by a method of extracting points with large changes in shading on the image, or by a method of registering in advance a specific pattern to be detected and finding the corresponding part by template matching.
  • The feature points consist of one or more points obtained from various parts of the image.
  • Step A03: Subsequently, the imaging unit 2 captures an image of the subject at the second angle.
  • Step A04: The feature point coordinate calculation unit 101, which receives the second image, calculates the second coordinates, which are the positions of the second feature points, from the second image.
  • Step A05: Association is performed using the two sets of feature point coordinates calculated in the first feature point coordinate calculation step A02 and the second feature point coordinate calculation step A04.
  • In this correspondence relationship, feature points whose surroundings have similar luminance patterns, frequencies, edge components, and the like are associated between the first feature points and the second feature points.
  • Alternatively, it can be obtained by a method such as detecting matching feature points using template matching.
  • Step A06: Using the first feature point coordinates and the second feature point coordinates for which correspondence has been obtained as described above, a transformation matrix for planarly transforming the second image into the first image is obtained. That is, the plane transformation matrix estimation step determines the planar projective transformation of the second image onto the first image using the coordinates of some of the associated feature points.
  • As noted above, if the transformation matrix is calculated in advance, the plane transformation matrix estimation step A06 can be omitted.
  • An impersonation detection method may be used in which the correspondence between the first feature point and the second feature point associated by the feature point association unit 102 is obtained based on the luminance patterns around the first feature point and the second feature point.
  • An impersonation detection method that obtains the transformation matrix using only a subset of corresponding points that are highly likely to be correctly matched may also be used.
  • Alternatively, the plane transformation matrix estimation step may transform the coordinates of the second feature points with provisional transformation matrices, using the first and second feature points associated in the feature point association step, and adopt as the transformation matrix the provisional matrix that minimizes the error (difference) between the calculated converted coordinates and the corresponding first feature points.
  • The transformation matrix may be the homography matrix described above. If the first angle and the second angle are given in advance, the converted coordinates under the plane transformation matrix may also be obtained in advance.
  • Step A07: The feature point conversion process includes a feature conversion (pseudo front feature calculation) step that executes the projective transformation using the transformation matrix obtained in the plane transformation matrix estimation step.
  • That is, the feature conversion (pseudo front feature calculation) step projectively transforms the second coordinates onto the first image using the transformation matrix calculated in the plane transformation matrix estimation step A06.
  • Step A08: In the similarity calculation step A08, the error (difference) is obtained between the projected conversion coordinates of the second image on the first image, obtained in the feature conversion (pseudo front feature calculation) step A07, and the feature point coordinates of the first image.
  • Step A09: The error (difference) between the feature point coordinates of the first image and the projected conversion coordinates of the second image is compared with a threshold value.
  • Steps A10 and A11: If this error (difference) is equal to or less than a certain value, it is determined that the inspection target object in the first image and the second image has a planar correspondence, and hence that impersonation with a photograph is being attempted.
  • Here, low similarity means that the object is close to the real thing; that is, the error (difference) is large.
  • If the subject appearing in the first image and the second image keeps deforming only in a planar manner for a certain period of time, it may be determined that the subject is a photograph.
  • If the similarity is large, the object is determined to be a photograph (step A10); if the similarity is small, it is determined to be the real thing (step A11), and the process is terminated.
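  • Putting steps A01 to A11 together, the flow of FIG. 2 can be prototyped end to end with off-the-shelf components. The sketch below substitutes ORB keypoints and descriptor matching for the KLT/template-matching detection named in the text, and OpenCV's least-squares homography fit for the estimation step; these substitutions, and the threshold, are assumptions for illustration, not the patent's prescribed implementation.

```python
import cv2
import numpy as np

def detect_photo_spoof(img1, img2, threshold=3.0):
    """Steps A01-A11: decide 'photograph' when two views taken from
    different angles are related by a planar projective transformation.

    img1, img2: grayscale frames from the first and second angles
    (steps A01/A03). threshold: assumed mean-error limit in pixels."""
    orb = cv2.ORB_create()                            # A02/A04: feature points
    k1, d1 = orb.detectAndCompute(img1, None)
    k2, d2 = orb.detectAndCompute(img2, None)
    if d1 is None or d2 is None:
        return False                                  # no feature points found
    matcher = cv2.BFMatcher(cv2.NORM_HAMMING, crossCheck=True)
    matches = sorted(matcher.match(d2, d1), key=lambda m: m.distance)[:50]
    if len(matches) < 4:
        return False                                  # too few corresponding points
    pts2 = np.float32([k2[m.queryIdx].pt for m in matches])
    pts1 = np.float32([k1[m.trainIdx].pt for m in matches])
    H, _ = cv2.findHomography(pts2, pts1, 0)          # A06: plane transformation
    if H is None:
        return False
    proj = cv2.perspectiveTransform(pts2.reshape(-1, 1, 2), H).reshape(-1, 2)
    err = float(np.linalg.norm(proj - pts1, axis=1).mean())  # A08: similarity
    return err <= threshold                           # A09-A11: planar -> photo
```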
  • Since the impersonation inspection can be performed simply by transforming the second image onto the first image with the plane transformation matrix and calculating the converted coordinates, spoofing is detected based only on whether a plurality of images of the inspection target object taken from different angles are in a planar projection relationship with each other when performing personal authentication. It is therefore unnecessary to register three-dimensional information of the certifier at the time of identity registration. Since person images need not be taken from a plurality of directions at the time of user registration, convenience for the user is improved.
  • FIG. 3 is a diagram showing a specific embodiment corresponding to FIG. 1, described later.
  • The data processing device 1 in FIG. 1 and the data processing device 6 in FIG. 3 can each be constituted by an integrated circuit including a microprocessor, a ROM, a RAM, a signal transmission bus, and an input/output interface. All or part of the functional blocks 101, 102, 103, 104, 601, and 602 of the data processing devices 1 and 6 may be realized by hardware such as a semiconductor integrated circuit, or by a program or program code recorded on a recording medium such as a non-volatile memory or an optical disc. Such a program or program code causes a computer having an arithmetic device such as a CPU to execute all or part of the processing of the functional blocks 101, 102, 103, 104, 601, and 602.
  • The impersonation detection program causes a computer to execute: a process of obtaining the first feature point coordinates, which are the positions of the first feature points, from the first image of the inspection target object photographed from the first angle; a process of obtaining the second feature point coordinates from the second image photographed from a second angle different from the first angle; a feature point association process; a feature conversion process of obtaining converted coordinates by executing a planar projective transformation from the second image to the first image; and a similarity determination process of determining that impersonation has been attempted when the error between the converted coordinates and the first feature point coordinates is equal to or less than a certain value.
  • Specifically, the computer is caused to execute a process of acquiring the image of the subject at the first angle from the imaging unit 2 and calculating the first coordinates of the first feature points from the first image, and a process of calculating the second coordinates of the second feature points from the second image of the inspection target object photographed from the second angle different from the first angle.
  • Here, a feature point means a point that looks the same whenever the part is viewed from a specific angle, such as the position of an eye, the edge of the mouth, a mole, the tip of the nose, the highest part of the cheekbone, or a part of the beard.
  • It may also be an area where a specific pattern exists, such as a hairline or a corner of an eyeglass frame.
  • Here, the feature points are described as positions on the face, but the obtained feature points may also be feature points obtained from the background, an object in front of the face or head, or the like.
  • Feature points may be obtained, for example, by a method of extracting points with large changes in shading on the image, or by a method of registering in advance a specific pattern to be detected and finding the corresponding part by template matching.
  • The feature points consist of one or more points obtained from various parts of the image.
  • Next, the computer is caused to execute the feature point association process using the two sets of feature point coordinates calculated by the feature point coordinate calculation unit 101.
  • In this correspondence relationship, feature points whose surroundings have similar luminance patterns, frequencies, edge components, and the like are associated between the first feature points and the second feature points.
  • Alternatively, it can be obtained by a method such as detecting matching feature points using template matching.
  • The plane transformation matrix estimation unit 103 in FIG. 1 estimates a plane transformation matrix from the first feature point coordinates and the second feature point coordinates for which correspondence has been obtained as described above, and the computer is caused to execute a process of converting the second feature point coordinates onto the first image with the estimated plane transformation matrix to obtain the converted coordinates.
  • If the angle formed between the first angle at which the first image is photographed and the second angle at which the second image is photographed is fixed and the transformation matrix is calculated in advance, the plane transformation matrix estimation unit 103 can be omitted.
  • The association between the first feature point and the second feature point by the feature point correspondence unit 102 may be a process obtained, for example, based on the luminance patterns in the vicinity of the first feature point and the second feature point, as described above.
  • A spoofing detection program that obtains the transformation matrix using only a subset of corresponding points that are highly likely to be correctly matched may also be used.
  • That is, a provisional transformation matrix is calculated using some of the corresponding points, the second feature points not used for that calculation are transformed with the provisional matrix, and their converted coordinates are computed; the spoofing detection program may then adopt, as the transformation matrix, the provisional matrix that minimizes the error (difference) between the converted coordinates and the corresponding first feature points.
  • The transformation matrix may be a matrix that projects the second image onto the first image in a planar manner.
  • The transformation matrix may be a homography matrix. If the first angle and the second angle are given in advance, the converted coordinates under the plane transformation matrix may be obtained in advance.
  • First to M-th groups G_1 to G_M, each including four pairs, are generated from the N pairs, and provisional homography matrices PH_1 to PH_M are calculated for the groups G_1 to G_M, respectively.
  • The computer then executes a process of obtaining converted coordinates for each of the groups G_1 to G_M, a process of calculating the error between the converted coordinates and the coordinates representing the corresponding first feature points (for example, the norm in Euclidean space or the Manhattan distance), and a process of selecting the provisional transformation matrix with the smallest sum of errors as the transformation matrix (homography matrix). Note that the pairs included in the groups G_1 to G_M may overlap each other.
  • The similarity determination unit 105 in FIG. 1 causes the computer to execute a process of calculating the error (difference) between the first feature point coordinates on the first image and the projective transformation coordinates of the second image obtained by the feature conversion unit 104.
  • That is, the computer is caused to execute a process of comparing the error (difference) between the converted coordinates and the corresponding first coordinates with a threshold value. If this error (difference) is equal to or less than a certain value, it is determined that the inspection target object in the first image and the second image has a planar correspondence, and hence that impersonation with a photograph is being attempted.
  • Here, low similarity means that the object is close to the real thing; that is, the error (difference) is large.
  • If the subject appearing in the first image and the second image keeps deforming only in a planar manner for a certain period of time, it may be determined that the subject is a photograph.
  • If the similarity is greater than the threshold, the object is determined to be a photograph; if it is smaller, the object is determined to be the real thing, and the computer ends the process.
  • Since the impersonation inspection can be performed simply by transforming the second image onto the first image with the plane transformation matrix and calculating the converted coordinates, spoofing is detected based only on whether a plurality of images of the inspection target object taken from different angles are in a planar projection relationship with each other when performing personal authentication. It is therefore unnecessary to register three-dimensional information of the certifier at the time of identity registration. Since person images need not be taken from a plurality of directions at the time of user registration, convenience for the user is improved.
  • FIG. 3 is a diagram in which the impersonation detection system of FIG. 1 is applied to a mobile phone 5.
  • This specific aspect of the embodiment assumes a camera-equipped mobile phone 5: the imaging unit 2 is a CCD camera 7 attached to the mobile phone 5, and the mobile phone 5 has been security-locked in advance.
  • When unlocking the security lock of the mobile phone 5, the user performs an operation for using the phone, such as pressing a key button, opening the phone if it is a foldable model, or sliding it if it is a slide model.
  • The mobile phone 5 acquires an image of the inspection target object 12 from the CCD camera 7, using these operations as a trigger. Subsequently, the personal authentication unit 601 determines whether the biometric information in the acquired image is obtained from the same person as the features registered in advance in the registered feature storage unit 801 of the storage device 8.
  • The collation between the input image and the registered features is performed based on, for example, the face collation method described in Patent Document 3. If, as a result of this collation, it is determined that the image of the inspection target object 12 shows the same person as the person from whom the registered features were obtained, an instruction is issued to the user through the monitor 9, the speaker 10, the LED display 11, or the like, to change the angle at which the biometric information is captured.
  • At this time, the image used for the personal authentication is sent to the feature point coordinate calculation unit 101 as the image at the first angle, and the feature point coordinate calculation unit 101 obtains the first coordinates of the first feature points.
  • Subsequently, the feature point coordinate calculation unit 101 acquires from the CCD camera 7 an image at a second angle different from that of the first image, and calculates the second feature point coordinates.
  • The feature point association unit 102 associates the first feature point with the second feature point. That is, the second feature point coordinates calculated by the feature point coordinate calculation unit 101 are associated with the first feature point coordinates calculated by the same unit.
  • In this correspondence relationship, feature points whose surroundings have similar luminance patterns, frequencies, edge components, and the like are associated between the first feature points and the second feature points.
  • Alternatively, the correspondence can be obtained by a method such as detecting matching feature points using template matching.
  • The plane transformation matrix estimation unit 103 uses the first feature points and the second feature points associated by the feature point association unit 102 to obtain the transformation matrix so that the error (difference) between the converted coordinates of the second feature points and the corresponding first feature point coordinates is minimized.
  • FIG. 4A shows the flow of determination when a real face is photographed. In this case, a position error (difference) occurs between the converted coordinates, obtained by planarly transforming the feature point coordinates of the second image onto the first image, and the first coordinates.
  • The first image, obtained by photographing the inspection target object from the first angle, and the image 14, photographed from the second angle, are shown at the top.
  • Below the feature point coordinates 15 of the first image and the feature point coordinates 16 of the second image, the feature point coordinates of the second image planarly projected onto the first image are indicated by stars.
  • Since the feature point coordinates of the first image and the projected feature point coordinates of the second image do not coincide, the inspection target object is three-dimensional and can be determined to be a real face.
  • FIG. 4B shows the flow of determination when a photograph of a face is taken.
  • The first image 23, obtained by photographing the inspection target object from the first angle, and the image 24, photographed from the second angle with the inspection target object rotated to the left relative to the frame of the first image 23, are shown at the top.
  • Below the feature point coordinates 25 of the first image and the feature point coordinates 26 of the second image, the feature point coordinates of the second image planarly projected onto the first image are indicated by stars.
  • A comparison is then made between the feature point coordinates of the first image and the coordinates obtained by planarly projecting the feature point coordinates of the second image onto the first image.
  • The thick arrows indicate that the feature point coordinates of the first image and the feature point coordinates of the second image projected onto the first image substantially coincide. From this, the inspection target object is a planar photograph, and impersonation can be determined.
  • In that case, the process is terminated without releasing the security lock.
  • The processing may be terminated immediately, or the acquisition of the second image may be repeated for a certain period of time until the coordinate error (difference) exceeds the threshold value, with impersonation determined if it never does.
  • As an effect of the first embodiment, since impersonation is detected based only on images of the inspection target object 12, impersonation can be detected without requiring any additional device other than the imaging device.
  • In addition, since the determination does not rely on the luminance of the subject, impersonation can be detected robustly against fluctuations in the illumination environment around the inspection target object 12.
  • Furthermore, since spoofing detection is based only on whether a plurality of images obtained by photographing the inspection target object 12 from different angles are in a planar projection relationship when performing personal authentication, it is not necessary to register three-dimensional information of the certifier at the time of identity registration. Since person images need not be taken from a plurality of directions at the time of user registration, convenience for the user is improved.
  • The second embodiment provides an impersonation detection system comprising: a first imaging unit 2 that photographs the inspection target object 12 from a first angle to obtain a first image; a second imaging unit 3 that photographs the inspection target object 12 from a second angle different from the first angle to obtain a second image; a feature point coordinate calculation unit 401 that detects the first feature point from the first image, obtains the first feature point coordinates representing the position of the detected feature point, detects the second feature point from the second image, and obtains the second feature point coordinates representing the position of the detected feature point; a feature point association unit 102 that associates the first feature point with the second feature point; a feature conversion unit 104 that planarly transforms the second feature point coordinates onto the first image to obtain converted coordinates; and a similarity determination unit 105 that determines that impersonation has been attempted when the error (difference) between the converted coordinates and the first feature point coordinates is equal to or less than a certain value.
  • The data processing device 4, which operates under program control, is the same as the data processing device 1 of the first embodiment shown in FIG. 1, except that the feature point coordinate calculation unit 101 is replaced with the feature point coordinate calculation unit 401 and the imaging unit 3 is added.
  • The imaging unit 2 and the imaging unit 3 are arranged so as to photograph the inspection target object 12 from different angles.
  • For example, the imaging units 2 and 3 may be realized by providing two CCD cameras 7 in the mobile phone 5.
  • The feature point coordinate calculation unit 401 acquires, almost simultaneously, the first image at the first angle and the second image at the second angle different from the first angle from the imaging unit 2 and the imaging unit 3, and obtains the first feature point coordinates and the second feature point coordinates, respectively.
  • The feature point association unit 102 associates the first feature point with the second feature point. That is, the second feature point coordinates calculated by the feature point coordinate calculation unit 401 are associated with the first feature point coordinates calculated by the same unit.
  • In this correspondence relationship, feature points whose surroundings have similar luminance patterns, frequencies, edge components, and the like are associated between the first feature points and the second feature points. Alternatively, the correspondence can be obtained by a method such as detecting matching feature points using template matching.
  • The plane transformation matrix estimation unit 103 obtains a transformation matrix for planarly transforming the second image into the first image based on the first coordinates and the second coordinates extracted by the feature point coordinate calculation unit 401.
  • Since the angle formed by the imaging unit 2 and the imaging unit 3 is fixed, the transformation matrix may be calculated in advance from that angle.
  • In that case, the plane transformation matrix estimation unit 103 performs no processing.
  • This embodiment of the present invention obtains the same effects as the first embodiment. Furthermore, with this configuration, it is not necessary to request the inspection target object 12 to change the shooting angle, and improved convenience can be expected.

Landscapes

  • Engineering & Computer Science (AREA)
  • Multimedia (AREA)
  • Physics & Mathematics (AREA)
  • General Physics & Mathematics (AREA)
  • Theoretical Computer Science (AREA)
  • Human Computer Interaction (AREA)
  • Health & Medical Sciences (AREA)
  • General Health & Medical Sciences (AREA)
  • Oral & Maxillofacial Surgery (AREA)
  • Image Analysis (AREA)
  • Collating Specific Patterns (AREA)
  • Image Processing (AREA)

Abstract

The invention relates to a spoofing detection system comprising: an imaging unit (2) for imaging an object (12) to be inspected from a first angle to obtain a first image, and imaging the object from a second angle different from the first to obtain a second image; a feature point coordinate calculation unit (101) for detecting a first feature point from the first image, obtaining coordinates representing the position of the detected first feature point, detecting a second feature point from the second image, and obtaining coordinates representing the position of the detected second feature point; a feature point association unit (102) for associating the first feature point with the second feature point; a feature conversion unit (104) for obtaining converted coordinates by applying a planar projective transformation to the coordinates of the second feature point from the second image to the first image; and a similarity determination unit (105) for determining that spoofing is being attempted if the difference between the converted coordinates and the coordinates of the first feature point is below a certain value.
PCT/JP2009/005709 2008-10-28 2009-10-28 Impersonation detection system, impersonation detection method, and impersonation detection program WO2010050206A1 (fr)

Priority Applications (3)

Application Number Priority Date Filing Date Title
JP2010535673A JP5445460B2 (ja) 2008-10-28 2009-10-28 なりすまし検知システム、なりすまし検知方法及びなりすまし検知プログラム
CN200980142953.9A CN102197412B (zh) 2008-10-28 2009-10-28 伪装检测系统和伪装检测方法
US13/126,339 US8860795B2 (en) 2008-10-28 2009-10-28 Masquerading detection system, masquerading detection method, and computer-readable storage medium

Applications Claiming Priority (2)

Application Number Priority Date Filing Date Title
JP2008-277212 2008-10-28
JP2008277212 2008-10-28

Publications (1)

Publication Number Publication Date
WO2010050206A1 true WO2010050206A1 (fr) 2010-05-06

Family

ID=42128577

Family Applications (1)

Application Number Title Priority Date Filing Date
PCT/JP2009/005709 WO2010050206A1 (fr) 2008-10-28 2009-10-28 Impersonation detection system, impersonation detection method, and impersonation detection program

Country Status (4)

Country Link
US (1) US8860795B2 (fr)
JP (1) JP5445460B2 (fr)
CN (1) CN102197412B (fr)
WO (1) WO2010050206A1 (fr)

Cited By (10)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
JP2012052884A (ja) * 2010-08-31 2012-03-15 Honda Motor Co Ltd 車載カメラを用いた測距装置
CN103021004A (zh) * 2012-11-30 2013-04-03 中国人民解放军61517部队 与起伏地表空间特征相匹配的遮障面制作方法
JP2015007919A (ja) * 2013-06-25 2015-01-15 Kddi株式会社 異なる視点の画像間で高精度な幾何検証を実現するプログラム、装置及び方法
US9424487B2 (en) 2014-01-22 2016-08-23 Fujitsu Limited Image matching method and image processing system
US9641523B2 (en) 2011-08-15 2017-05-02 Daon Holdings Limited Method of host-directed illumination and system for conducting host-directed illumination
KR101877816B1 (ko) * 2017-02-20 2018-07-12 주식회사 에스원 단일 가시광 카메라를 이용한 복합 사진 동영상 위조 얼굴 판단방법 및 이를 이용한 위조 얼굴 판단 시스템
JP2019509545A (ja) * 2016-04-21 2019-04-04 ▲騰▼▲訊▼科技(深▲セン▼)有限公司 生きた人の顔検証方法およびデバイス
JP2019095345A (ja) * 2017-11-24 2019-06-20 国立大学法人 岡山大学 物体識別システム
WO2019216091A1 (fr) * 2018-05-10 2019-11-14 パナソニックIpマネジメント株式会社 Dispositif d'authentification faciale, procédé d'authentification faciale et système d'authentification faciale
WO2022049690A1 (fr) * 2020-09-03 2022-03-10 日本電信電話株式会社 Dispositif d'estimation de degré de mouvement, procédé d'estimation de degré de mouvement et programme

Families Citing this family (39)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US11321772B2 (en) 2012-01-12 2022-05-03 Kofax, Inc. Systems and methods for identification document processing and business workflow integration
US10146795B2 (en) 2012-01-12 2018-12-04 Kofax, Inc. Systems and methods for mobile image capture and processing
US8989515B2 (en) 2012-01-12 2015-03-24 Kofax, Inc. Systems and methods for mobile image capture and processing
KR101899978B1 (ko) * 2012-07-11 2018-09-19 엘지전자 주식회사 이동 단말기 및 그것의 제어 방법
MX346218B (es) 2012-09-05 2017-03-09 Element Inc Sistema y método de autenticación biométrica en conexión con dispositivos equipados con cámara.
KR101937323B1 (ko) * 2012-09-18 2019-01-11 한국전자통신연구원 위장 얼굴 판별 장치 및 방법
FR2997211B1 (fr) * 2012-10-18 2021-01-01 Morpho Procede d'authentification d'une capture d'image d'une entite tridimensionnelle
US10127636B2 (en) 2013-09-27 2018-11-13 Kofax, Inc. Content-based detection and three dimensional geometric reconstruction of objects in image and video data
US10783615B2 (en) * 2013-03-13 2020-09-22 Kofax, Inc. Content-based object detection, 3D reconstruction, and data extraction from digital images
CN104933389B (zh) * 2014-03-18 2020-04-14 北京细推科技有限公司 一种基于指静脉的身份识别方法和装置
US9916495B2 (en) * 2014-03-28 2018-03-13 Nec Corporation Face comparison device, method, and recording medium
JP6376873B2 (ja) * 2014-07-16 2018-08-22 キヤノン株式会社 画像処理装置、画像処理方法及びプログラム
EP3208770B1 (fr) * 2014-10-15 2022-05-04 Nec Corporation Dispositif de détection d'usurpation d'identité, procédé de détection d'usurpation d'identité et support d'enregistrement
US9760788B2 (en) 2014-10-30 2017-09-12 Kofax, Inc. Mobile document detection and orientation based on reference object characteristics
US9934443B2 (en) * 2015-03-31 2018-04-03 Daon Holdings Limited Methods and systems for detecting head motion during an authentication transaction
CN104966316B (zh) * 2015-05-22 2019-03-15 腾讯科技(深圳)有限公司 一种3d人脸重建方法、装置及服务器
KR102410300B1 (ko) * 2015-06-26 2022-06-20 한국전자통신연구원 스테레오 카메라를 이용한 카메라 위치 측정 장치 및 방법
US10242285B2 (en) 2015-07-20 2019-03-26 Kofax, Inc. Iterative recognition-guided thresholding and data extraction
US10467465B2 (en) 2015-07-20 2019-11-05 Kofax, Inc. Range and/or polarity-based thresholding for improved data extraction
US9898674B2 (en) 2015-12-10 2018-02-20 International Business Machines Corporation Spoof detection for facial recognition
JP6845612B2 (ja) * 2016-03-07 2021-03-17 中村留精密工業株式会社 工作機械における機械精度の測定方法及び装置
US10217009B2 (en) 2016-08-09 2019-02-26 Daon Holdings Limited Methods and systems for enhancing user liveness detection
US11115408B2 (en) 2016-08-09 2021-09-07 Daon Holdings Limited Methods and systems for determining user liveness and verifying user identities
US10210380B2 (en) * 2016-08-09 2019-02-19 Daon Holdings Limited Methods and systems for enhancing user liveness detection
US10628661B2 (en) 2016-08-09 2020-04-21 Daon Holdings Limited Methods and systems for determining user liveness and verifying user identities
JP6886260B2 (ja) * 2016-09-07 2021-06-16 キヤノン株式会社 画像処理装置、その制御方法、およびプログラム
WO2018058554A1 (fr) * 2016-09-30 2018-04-05 Intel Corporation Anti-mystification de visage à l'aide d'une analyse de réseau neuronal convolutif spatial et temporel
KR20200073222A (ko) * 2017-09-18 2020-06-23 엘리먼트, 인크. 모바일 인증에서 스푸핑을 검출하기 위한 방법, 시스템 및 매체
US11062176B2 (en) 2017-11-30 2021-07-13 Kofax, Inc. Object detection and image cropping using a multi-detector approach
AU2019208182B2 (en) 2018-07-25 2021-04-08 Konami Gaming, Inc. Casino management system with a patron facial recognition system and methods of operating same
US11521460B2 (en) 2018-07-25 2022-12-06 Konami Gaming, Inc. Casino management system with a patron facial recognition system and methods of operating same
CN109873978B (zh) * 2018-12-26 2020-10-16 深圳市天彦通信股份有限公司 定位追踪方法及相关装置
EP3674973A1 (fr) 2018-12-28 2020-07-01 Samsung Electronics Co., Ltd. Procédé et appareil de détection de vivacité et reconnaissance d'objet
US11343277B2 (en) 2019-03-12 2022-05-24 Element Inc. Methods and systems for detecting spoofing of facial recognition in connection with mobile devices
US20220270360A1 (en) * 2019-08-20 2022-08-25 Technology Innovation Momentum Fund (Israel) Limited Partnership Method and apparatus for authentication of a three-dimensional object
US11557124B2 (en) * 2019-10-25 2023-01-17 7-Eleven, Inc. Homography error correction
US11551454B2 (en) * 2019-10-25 2023-01-10 7-Eleven, Inc. Homography error correction using marker locations
US11074340B2 (en) 2019-11-06 2021-07-27 Capital One Services, Llc Systems and methods for distorting CAPTCHA images with generative adversarial networks
US11507248B2 (en) 2019-12-16 2022-11-22 Element Inc. Methods, systems, and media for anti-spoofing using eye-tracking

Citations (2)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
JP2004192378A (ja) * 2002-12-12 2004-07-08 Toshiba Corp 顔画像処理装置およびその方法
JP2007304801A (ja) * 2006-05-10 2007-11-22 Nec Corp 立体性認証方法、立体性認証装置および立体性認証プログラム

Family Cites Families (13)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
KR100560030B1 (ko) * 2001-11-22 2006-03-13 고나미 가부시끼가이샤 당구 게임의 입력 장치, 당구 게임 시스템, 게임용 입력장치 및 컴퓨터 판독 가능한 기억 매체
JP2003178306A (ja) 2001-12-12 2003-06-27 Toshiba Corp 個人認証装置および個人認証方法
US7221809B2 (en) * 2001-12-17 2007-05-22 Genex Technologies, Inc. Face recognition system and method
US7532750B2 (en) * 2002-04-17 2009-05-12 Sony Corporation Image processing apparatus and method, program, and image processing system
JP2004362079A (ja) 2003-06-02 2004-12-24 Fuji Photo Film Co Ltd 本人認証装置
US7218760B2 (en) * 2003-06-30 2007-05-15 Microsoft Corporation Stereo-coupled face shape registration
JP2006338092A (ja) 2005-05-31 2006-12-14 Nec Corp パタン照合方法、パタン照合システム及びパタン照合プログラム
WO2006138643A2 (fr) * 2005-06-16 2006-12-28 Nomos Corporation Systeme, dispositif de poursuite et produit-programme pour faciliter et verifier l'alignement correct recherche pour l'application de rayonnement, et procedes associes
JP2007249585A (ja) * 2006-03-15 2007-09-27 Omron Corp 認証装置およびその制御方法、認証装置を備えた電子機器、認証装置制御プログラム、ならびに該プログラムを記録した記録媒体
JP2008054754A (ja) 2006-08-29 2008-03-13 Toshiba Corp 個人認証装置及び個人認証システム
JP4902316B2 (ja) * 2006-11-10 2012-03-21 東芝機械株式会社 斜め加工のための5軸加工機の姿勢保証システム
KR101404527B1 (ko) * 2007-12-26 2014-06-09 다이니폰 인사츠 가부시키가이샤 화상 변환 장치 및 화상 변환 방법
JP5159950B2 (ja) * 2009-05-28 2013-03-13 株式会社東芝 画像処理装置、方法、プログラム

Patent Citations (2)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
JP2004192378A (ja) * 2002-12-12 2004-07-08 Toshiba Corp 顔画像処理装置およびその方法
JP2007304801A (ja) * 2006-05-10 2007-11-22 Nec Corp 立体性認証方法、立体性認証装置および立体性認証プログラム

Cited By (17)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
JP2012052884A (ja) * 2010-08-31 2012-03-15 Honda Motor Co Ltd 車載カメラを用いた測距装置
US10984271B2 (en) 2011-08-15 2021-04-20 Daon Holdings Limited Method of host-directed illumination and system for conducting host-directed illumination
US10503991B2 (en) 2011-08-15 2019-12-10 Daon Holdings Limited Method of host-directed illumination and system for conducting host-directed illumination
US11462055B2 (en) 2011-08-15 2022-10-04 Daon Enterprises Limited Method of host-directed illumination and system for conducting host-directed illumination
US9641523B2 (en) 2011-08-15 2017-05-02 Daon Holdings Limited Method of host-directed illumination and system for conducting host-directed illumination
US10002302B2 (en) 2011-08-15 2018-06-19 Daon Holdings Limited Method of host-directed illumination and system for conducting host-directed illumination
US10169672B2 (en) 2011-08-15 2019-01-01 Daon Holdings Limited Method of host-directed illumination and system for conducting host-directed illumination
CN103021004A (zh) * 2012-11-30 2013-04-03 中国人民解放军61517部队 与起伏地表空间特征相匹配的遮障面制作方法
JP2015007919A (ja) * 2013-06-25 2015-01-15 Kddi株式会社 異なる視点の画像間で高精度な幾何検証を実現するプログラム、装置及び方法
US9424487B2 (en) 2014-01-22 2016-08-23 Fujitsu Limited Image matching method and image processing system
JP2019509545A (ja) * 2016-04-21 2019-04-04 ▲騰▼▲訊▼科技(深▲セン▼)有限公司 生きた人の顔検証方法およびデバイス
KR101877816B1 (ko) * 2017-02-20 2018-07-12 주식회사 에스원 단일 가시광 카메라를 이용한 복합 사진 동영상 위조 얼굴 판단방법 및 이를 이용한 위조 얼굴 판단 시스템
JP2019095345A (ja) * 2017-11-24 2019-06-20 国立大学法人 岡山大学 物体識別システム
WO2019216091A1 (fr) * 2018-05-10 2019-11-14 パナソニックIpマネジメント株式会社 Dispositif d'authentification faciale, procédé d'authentification faciale et système d'authentification faciale
WO2022049690A1 (fr) * 2020-09-03 2022-03-10 日本電信電話株式会社 Dispositif d'estimation de degré de mouvement, procédé d'estimation de degré de mouvement et programme
JPWO2022049690A1 (fr) * 2020-09-03 2022-03-10
JP7464135B2 (ja) 2020-09-03 2024-04-09 日本電信電話株式会社 移動量推定装置、移動量推定方法およびプログラム

Also Published As

Publication number Publication date
CN102197412A (zh) 2011-09-21
JPWO2010050206A1 (ja) 2012-03-29
JP5445460B2 (ja) 2014-03-19
CN102197412B (zh) 2014-01-08
US8860795B2 (en) 2014-10-14
US20110254942A1 (en) 2011-10-20

Similar Documents

Publication Publication Date Title
JP5445460B2 (ja) なりすまし検知システム、なりすまし検知方法及びなりすまし検知プログラム
KR102120241B1 (ko) 얼굴 생체 검증 방법 및 장치
JP4734980B2 (ja) 顔認証装置およびその制御方法、顔認証装置を備えた電子機器、顔認証装置制御プログラム、ならびに該プログラムを記録した記録媒体
JP5106459B2 (ja) 立体物判定装置、立体物判定方法及び立体物判定プログラム
WO2018042996A1 (fr) Dispositif de détection de forme de vie
JP5170094B2 (ja) なりすまし検知システム、なりすまし検知方法およびなりすまし検知用プログラム
KR20200116138A (ko) 안면 인식을 위한 방법 및 시스템
JP2003178306A (ja) 個人認証装置および個人認証方法
JP5915664B2 (ja) 静脈認証方法及び静脈認証装置
JP2007241402A (ja) 顔認証におけるなりすまし判定装置およびそれを用いた顔認証装置
US11315360B2 (en) Live facial recognition system and method
CN107016330B (zh) 通过预记录的图像投影对欺诈进行检测的方法
JP5416489B2 (ja) 三次元指先位置検出方法、三次元指先位置検出装置、およびプログラム
WO2021166289A1 (fr) Dispositif d'enregistrement de données, dispositif d'authentification biométrique, et support d'enregistrement
JP4141090B2 (ja) 画像認識装置、陰影除去装置、陰影除去方法及び記録媒体
US11216679B2 (en) Biometric authentication apparatus and biometric authentication method
JP4446383B2 (ja) 画像処理装置および画像認識装置
KR101711307B1 (ko) 깊이정보 기반의 안면인식 휴대장치 또는 컴퓨터 기기 잠금해제시스템
KR101718244B1 (ko) 얼굴 인식을 위한 광각 영상 처리 장치 및 방법
JP2001331804A (ja) 画像領域検出装置及び方法
US20220392256A1 (en) Authentication device, registration device, authentication method, registration method, and storage medium
JP2018088064A (ja) 画像処理装置、画像処理方法及び画像処理プログラム
KR101845419B1 (ko) 이미지의 보정여부 분석방법 및 프로그램
JP2007004534A (ja) 顔判別方法および装置ならびに顔認証装置
KR20210001270A (ko) 블러 추정 방법 및 장치

Legal Events

Date Code Title Description
WWE Wipo information: entry into national phase

Ref document number: 200980142953.9

Country of ref document: CN

121 Ep: the epo has been informed by wipo that ep was designated in this application

Ref document number: 09823319

Country of ref document: EP

Kind code of ref document: A1

ENP Entry into the national phase

Ref document number: 2010535673

Country of ref document: JP

Kind code of ref document: A

NENP Non-entry into the national phase

Ref country code: DE

WWE Wipo information: entry into national phase

Ref document number: 13126339

Country of ref document: US

122 Ep: pct application non-entry in european phase

Ref document number: 09823319

Country of ref document: EP

Kind code of ref document: A1