WO2013088707A1 - Dictionary learning device, pattern matching device, dictionary learning method, and storage medium - Google Patents
Dictionary learning device, pattern matching device, dictionary learning method, and storage medium
- Publication number
- WO2013088707A1 (PCT/JP2012/007929)
- Authority
- WO
- WIPO (PCT)
- Prior art keywords
- pattern
- matching
- dictionary
- score
- quality
- Prior art date
Classifications
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06V—IMAGE OR VIDEO RECOGNITION OR UNDERSTANDING
- G06V10/00—Arrangements for image or video recognition or understanding
- G06V10/98—Detection or correction of errors, e.g. by rescanning the pattern or by human intervention; Evaluation of the quality of the acquired patterns
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06F—ELECTRIC DIGITAL DATA PROCESSING
- G06F18/00—Pattern recognition
- G06F18/20—Analysing
- G06F18/22—Matching criteria, e.g. proximity measures
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06F—ELECTRIC DIGITAL DATA PROCESSING
- G06F18/00—Pattern recognition
- G06F18/20—Analysing
- G06F18/28—Determining representative reference patterns, e.g. by averaging or distorting; Generating dictionaries
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06V—IMAGE OR VIDEO RECOGNITION OR UNDERSTANDING
- G06V10/00—Arrangements for image or video recognition or understanding
- G06V10/70—Arrangements for image or video recognition or understanding using pattern recognition or machine learning
- G06V10/74—Image or video pattern matching; Proximity measures in feature spaces
- G06V10/761—Proximity, similarity or dissimilarity measures
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06V—IMAGE OR VIDEO RECOGNITION OR UNDERSTANDING
- G06V10/00—Arrangements for image or video recognition or understanding
- G06V10/70—Arrangements for image or video recognition or understanding using pattern recognition or machine learning
- G06V10/77—Processing image or video features in feature spaces; using data integration or data reduction, e.g. principal component analysis [PCA] or independent component analysis [ICA] or self-organising maps [SOM]; Blind source separation
- G06V10/772—Determining representative reference patterns, e.g. averaging or distorting patterns; Generating dictionaries
-
- G—PHYSICS
- G10—MUSICAL INSTRUMENTS; ACOUSTICS
- G10L—SPEECH ANALYSIS OR SYNTHESIS; SPEECH RECOGNITION; SPEECH OR VOICE PROCESSING; SPEECH OR AUDIO CODING OR DECODING
- G10L15/00—Speech recognition
- G10L15/20—Speech recognition techniques specially adapted for robustness in adverse environments, e.g. in noise, of stress induced speech
Definitions
- The present invention relates to a technique concerning a quality dictionary used for pattern matching.
- Pattern matching is a process for determining whether a plurality of patterns to be collated are the same pattern, where a pattern is, for example, an image such as a face or a fingerprint, or the waveform of an audio signal.
- This pattern matching is a particularly important technique in the field of biometric authentication.
- In pattern matching, a collation score indicating how similar the two patterns to be collated are is calculated.
- The matching score is calculated using a feature vector extracted from each pattern to be matched and a feature vector group, called an identification dictionary, prepared in advance.
- The identification dictionary is often generated by machine learning using a large number of pattern examples.
- The collation score is compared with a threshold value, and the collation result (that is, whether the two patterns to be collated are the same) is determined based on this comparison. Pattern matching is performed in this way.
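The score-and-threshold flow described above can be sketched as follows. This is a minimal illustration only: the cosine-similarity score and the threshold value 0.8 are stand-ins, since the actual score function in the embodiments depends on an identification dictionary learned in advance.

```python
import numpy as np

def match(x, y, threshold=0.8):
    """Toy collation: score two feature vectors by cosine similarity,
    then compare the score against a threshold.  (Illustrative only:
    the patent's score is computed via an identification dictionary.)"""
    s = float(x @ y / (np.linalg.norm(x) * np.linalg.norm(y)))
    return s, s >= threshold

x = np.array([1.0, 0.0, 1.0])
same_score, same = match(x, x)                          # identical patterns
diff_score, diff = match(x, np.array([0.0, 1.0, 0.0]))  # orthogonal patterns
```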
- One problem here is deterioration of pattern quality (in other words, unclearness). That is, depending on the degree of pattern quality degradation, information necessary for collation (for example, information indicating features of a face or a fingerprint) may be lost. For example, if a face image is blurred and unclear, the face images of different persons become similar. For this reason, in face matching based on a blurred face image, an erroneous determination is likely to occur in which the two face images to be compared are determined to be the same even though they are not images of the same person.
- Patent Document 1 (Japanese Patent Laid-Open No. 2010-129045) discloses a technique for detecting (determining) the quality of a fingerprint image as information for estimating pattern quality in fingerprint matching.
- Patent Document 2 (Japanese Patent Application Laid-Open No. 2011-002494) discloses a technique for estimating a sound quality level as information for estimating pattern quality in speech recognition.
- Patent Document 3 (Japanese Patent Application Laid-Open No. 2007-140823) discloses a technique that takes imaging conditions into account in face matching, for example the lighting environment, the face orientation, and the presence or absence of worn items (e.g., sunglasses).
- Patent Document 4 (Japanese Patent Application Laid-Open No. 2006-072553) discloses a technique for, in biometric authentication, performing collation after correcting the input image when it is unclear.
- Patent Document 5 (Japanese Patent Laid-Open No. 2008-107408) discloses a technique for performing speech recognition processing in consideration of the surrounding environment.
- [Patent Document 1] JP 2010-129045 A; [Patent Document 2] JP 2011-002494 A; [Patent Document 3] JP 2007-140823 A; [Patent Document 4] JP 2006-072553 A; [Patent Document 5] JP 2008-107408 A
- Pattern matching is employed in authentication techniques such as biometric authentication, where an erroneous determination can become a serious problem. Pattern matching is therefore required to be highly accurate.
- The present invention has been made to solve the above problem. That is, the main object of the present invention is to provide a technique capable of further improving the accuracy of pattern matching.
- The dictionary learning apparatus of the present invention has: score calculation means for calculating a matching score indicating the degree of similarity between a sample pattern, which is a sample of a pattern subjected to pattern matching, and a degradation pattern obtained by applying degradation processing to the sample pattern; and learning means for learning, based on the calculated matching score and the degradation pattern, a quality dictionary used in a process of evaluating the degradation state of a collation target pattern, which is a pattern subjected to pattern matching.
- The pattern matching device of the present invention has: a quality dictionary learned by the dictionary learning device of the present invention; quality evaluation means for evaluating, based on the quality dictionary, the degradation state of a collation target pattern, which is a pattern subjected to pattern matching; score calculation means for calculating a matching score indicating the degree of similarity between a plurality of collation target patterns by a process corresponding to the process by which the dictionary learning device calculates a matching score; score correction means for correcting the matching score using the evaluation result of the quality evaluation means; and determination means for determining the collation result of the collation target patterns based on the corrected matching score.
- In the dictionary learning method of the present invention, a computer calculates a matching score indicating the degree of similarity between a sample pattern, which is a sample of a pattern subjected to pattern matching, and a degradation pattern obtained by applying degradation processing to the sample pattern, and the computer learns, based on the calculated matching score and the degradation pattern, a quality dictionary used in a process of evaluating the degradation state of a collation target pattern, which is a pattern subjected to pattern matching.
- The storage medium of the present invention stores a control procedure for causing a computer to execute: a process of calculating a matching score indicating the degree of similarity between a sample pattern, which is a sample of a pattern subjected to pattern matching, and a degradation pattern obtained by applying degradation processing to the sample pattern; and a process of learning, based on the calculated matching score and the degradation pattern, a quality dictionary used in a process of evaluating the degradation state of a collation target pattern, which is a pattern subjected to pattern matching.
- the main object of the present invention is also achieved by a dictionary learning method corresponding to the dictionary learning apparatus of the present invention having the above-described configuration.
- The main object of the present invention is also achieved by a computer program that implements, with a computer, the dictionary learning apparatus and the corresponding dictionary learning method of the present invention, and by a computer-readable storage medium storing the computer program.
- the accuracy of pattern matching can be further improved. This makes it possible to provide a highly reliable pattern matching device.
- FIG. 1 is a block diagram showing a simplified configuration of the dictionary learning apparatus according to the first embodiment of the present invention.
- FIG. 2 is a block diagram showing a simplified configuration of a pattern matching device using a quality dictionary learned by the dictionary learning device of the first embodiment.
- FIG. 3 is a block diagram showing a simplified configuration of the dictionary learning apparatus according to the second and third embodiments of the present invention.
- FIG. 4 is a flowchart illustrating an example of processing related to dictionary learning performed by the dictionary learning device according to the second embodiment.
- FIG. 5 is a block diagram showing a simplified configuration of the pattern matching apparatus according to the fourth embodiment of the present invention.
- FIG. 6 is a flowchart illustrating an example of a matching process executed by the pattern matching apparatus according to the fourth embodiment.
- FIG. 1 is a block diagram showing a simplified configuration of the dictionary learning apparatus according to the first embodiment of the present invention.
- the dictionary learning device 1 according to the first embodiment is a device (for example, a computer) that learns (generates by machine learning) a quality dictionary used when pattern matching is performed.
- The dictionary learning device 1 includes a score calculation unit (score calculation means) 2 and a learning unit (learning means) 3.
- the score calculation unit 2 has a function of calculating a matching score indicating the degree of similarity between a sample pattern that is a sample of a pattern to be subjected to pattern matching and a deterioration pattern obtained by performing deterioration processing on the sample pattern.
- the learning unit 3 has a function of learning a quality dictionary based on the collation score calculated by the score calculation unit 2 and the deterioration pattern.
- The quality dictionary is a dictionary used in a process of evaluating the degradation state (quality) of a collation target pattern, that is, a pattern subjected to pattern matching in a pattern matching device.
- FIG. 2 is a block diagram showing a simplified configuration example of a pattern matching device that uses the quality dictionary generated by the dictionary learning device 1.
- The pattern matching device 5 includes a score calculation unit (score calculation means) 6, a quality evaluation unit (quality evaluation means) 7, a score correction unit (score correction means) 8, a determination unit (determination means) 9, and a quality dictionary 10.
- the quality dictionary 10 is a dictionary previously learned (generated) by the dictionary learning device 1 described above.
- The score calculation unit 6 has a function of calculating a collation score indicating the degree of similarity between a plurality of collation target patterns, by a process similar to the process by which the score calculation unit 2 of the dictionary learning device 1 calculates a collation score.
- the quality evaluation unit 7 has a function of evaluating (calculating) the degradation state (quality) of the verification target pattern using the quality dictionary 10.
- the score correction unit 8 has a function of correcting the collation score calculated by the score calculation unit 6 using the evaluation result of the quality evaluation unit 7.
- the determination unit 9 has a function of determining a verification result based on the corrected verification score.
- the dictionary learning device 1 and the pattern matching device 5 of the first embodiment can obtain the following effects.
- Patent Documents 1 to 3 described above merely evaluate the deterioration state of the pattern itself, such as image quality or sound quality, and do not evaluate the deterioration state (quality) of the pattern from the viewpoint of pattern matching.
- the dictionary learning device 1 according to the first embodiment generates a quality dictionary by machine learning using a matching score indicating a similarity between patterns used in pattern matching.
- Therefore, the evaluation result for the collation target pattern reflects pattern matching itself.
- That is, pattern quality evaluation specialized for pattern matching can be realized.
- The pattern matching device 5 can thus evaluate the degradation state (quality) of the collation target pattern with pattern matching taken into account. The pattern matching device 5 can therefore use this evaluation result to further improve the accuracy of pattern matching, which in turn further improves the reliability of the pattern matching device 5.
- FIG. 3 is a block diagram showing a simplified configuration of the dictionary learning apparatus according to the second embodiment.
- the dictionary learning device 20 includes a control device 21 and a storage device 22.
- the storage device 22 includes a storage medium (nonvolatile storage medium) 22A such as a hard disk.
- the storage medium 22A stores various data and computer programs (hereinafter also abbreviated as programs).
- One of the data stored in the storage medium 22A is an identification dictionary 23.
- This identification dictionary 23 is a dictionary used in processing for calculating a matching score, which will be described later.
- The storage medium 22A also stores a program 24 that describes a procedure by which the dictionary learning device 20 learns a dictionary.
- data and programs stored in the storage medium 22A may be stored in a portable storage medium.
- the program 24 may be written in the storage medium 22A by being read into the dictionary learning device 20 from a portable storage medium.
- the control device 21 has, for example, a CPU (Central Processing Unit).
- The control device 21 controls the overall operation of the dictionary learning device 20 by operating according to the program read from the storage device 22. Specifically, the control device 21 implements the following functions by executing the program 24. That is, the control device 21 includes a reception unit (reception means) 26, a degradation processing unit (degradation processing means) 27, a score calculation unit (score calculation means) 28, a feature extraction unit (feature extraction means) 29, and a learning unit (learning means) 30.
- the reception unit 26 has a function of receiving a sample pattern (learning pattern).
- the sample pattern is a sample of a pattern to be subjected to pattern matching (a pattern that can be a pattern matching target).
- the degradation processing unit 27 has a function of performing degradation processing on the received sample pattern.
- the sample pattern subjected to the degradation process is referred to as a degradation pattern here. That is, the degradation processing unit 27 has a function of generating a degradation pattern by performing degradation processing on the sample pattern.
- As the degradation processing, for example, smoothing (processing that blurs the image) can be used when the sample pattern is an image; when the sample pattern is a signal (for example, an audio signal), a degradation process appropriate to that signal can be used.
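For the image case, the smoothing degradation named above might be sketched as a simple box blur (the kernel size and edge handling here are illustrative assumptions, not taken from the document):

```python
import numpy as np

def degrade(image, k=3):
    """Generate a degradation pattern by smoothing (box blur),
    the example degradation the text gives for image patterns."""
    pad = k // 2
    padded = np.pad(image, pad, mode="edge")
    out = np.empty_like(image, dtype=float)
    h, w = image.shape
    for i in range(h):
        for j in range(w):
            out[i, j] = padded[i:i + k, j:j + k].mean()
    return out

sharp = np.zeros((5, 5))
sharp[2, 2] = 1.0          # a single bright pixel
blurred = degrade(sharp)   # the energy spreads over a 3x3 neighbourhood
```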
- the feature extraction unit 29 has a function of extracting features from each of the sample pattern and its degradation pattern.
- Examples of the extracted features include the pixel values of the image and filter responses (for example, of a Gabor filter or a Sobel filter).
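As a hedged illustration of such feature extraction, the sketch below concatenates raw pixel values with Sobel filter responses; the specific kernel and the "valid" convolution are assumptions for the example, not details from the document:

```python
import numpy as np

SOBEL_X = np.array([[-1, 0, 1], [-2, 0, 2], [-1, 0, 1]], dtype=float)

def conv2_valid(image, kernel):
    """Plain 'valid' 2-D correlation (no SciPy dependency)."""
    kh, kw = kernel.shape
    h, w = image.shape
    out = np.empty((h - kh + 1, w - kw + 1))
    for i in range(out.shape[0]):
        for j in range(out.shape[1]):
            out[i, j] = np.sum(image[i:i + kh, j:j + kw] * kernel)
    return out

def extract_features(image):
    """Concatenate raw pixel values with Sobel filter responses."""
    edges = conv2_valid(image, SOBEL_X)
    return np.concatenate([image.ravel(), edges.ravel()])

img = np.tile(np.arange(5.0), (5, 1))   # horizontal-gradient test image
feat = extract_features(img)            # 25 pixels + 9 filter responses
```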
- The score calculation unit 28 calculates a matching score s using the combination of the features extracted by the feature extraction unit 29 (the features of the sample pattern and of the degradation pattern) and the identification dictionary 23 stored in the storage device 22.
- the matching score s can be calculated based on the following mathematical formula (1).
- x in Equation (1) represents a feature vector obtained by converting the feature of the sample pattern extracted by the feature extraction unit 29 into a vector.
- y represents a feature vector obtained by converting the feature of the degradation pattern extracted by the feature extraction unit 29 into a vector.
- ⁇ d represents the identification dictionary 23.
- An example of the identification dictionary 23 is a matrix that projects a feature vector onto a low-dimensional subspace.
- the learning unit 30 has a function of learning (generating) a quality dictionary based on a deterioration pattern (feature extracted from the deterioration pattern) and a matching score calculated based on the deterioration pattern.
- the quality dictionary is a dictionary that estimates a collation score between a deterioration pattern and a non-deterioration virtual pattern corresponding to the deterioration pattern.
- the learning unit 30 learns the quality dictionary ⁇ q in the equation (2) so that the following equation (2) is established.
- In Equation (2), q represents a predetermined function, and θq represents the quality dictionary.
- The matching score s used in the process of learning the quality dictionary θq may be a corrected matching score obtained by correcting the matching score s calculated by the score calculation unit 28 according to the quality of the sample pattern and of the degradation pattern.
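The bodies of formulas (1) and (2) do not survive in this text (in the published document they are images). The following is a hedged reconstruction that merely restates the symbol definitions given around them — x the sample-pattern feature vector, y the degradation-pattern feature vector, θd the identification dictionary, q a predetermined function, θq the quality dictionary — and should not be read as the published formulas themselves:

```latex
s = f(x,\, y;\ \theta_d) \qquad \text{(1)}
```
```latex
s \approx q(y;\ \theta_q) \qquad \text{(2)}
```

The linear instantiations in the third embodiment (Equations (3) and (4)) are consistent with these generic forms.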
- FIG. 4 is a flowchart showing an example of a dictionary learning operation (processing) executed by the dictionary learning device 20 of the second embodiment. This flowchart represents an example of a control procedure of a program executed by the control device 21 in the dictionary learning device 20.
- the control device 21 receives the sample pattern (step S101 in FIG. 4). Then, the control device 21 (degradation processing unit 27) performs a degradation process on the sample pattern to generate a degradation pattern (step S102). Thereafter, the control device 21 (feature extraction unit 29) extracts the feature of the generated degradation pattern. In addition, the control device 21 (feature extraction unit 29) also extracts the features of the sample pattern itself that is not subjected to the deterioration process (step S103).
- The control device 21 (score calculation unit 28) then uses the identification dictionary 23 to calculate, based on the combination (x, y) of the extracted features (the features of the sample pattern and of the degradation pattern), the matching score s, that is, a score representing the degree of similarity between the sample pattern and the degradation pattern corresponding to it (step S104). Thereafter, the control device 21 (learning unit 30) learns the quality dictionary θq by a learning method such as regression analysis, based on the degradation pattern and the matching score s calculated for it (step S105).
- The dictionary learning device 20 of the second embodiment also generates a quality dictionary by machine learning using the matching score, as in the first embodiment. For this reason, quality evaluation specialized for pattern matching can be realized by evaluating the quality of the collation target pattern using the generated quality dictionary.
- By using this evaluation result, a pattern matching apparatus can further improve the accuracy of its matching results. That is, the dictionary learning device 20 can provide a quality dictionary that can further improve the reliability of a pattern matching device.
- The dictionary learning device of the third embodiment has the same configuration as that of the second embodiment; the same components are given the same reference symbols, and duplicated description is omitted.
- the score calculation unit 28 calculates the collation score s using the following formula (3).
- The collation score s increases as it becomes more likely that the feature vectors x and y, based on the features of the sample pattern and the degradation pattern extracted by the feature extraction unit 29, originate from the same pattern. When the correlation coefficient is used accordingly, the matching score s can be expressed as Equation (3).
- A is a matrix having feature vectors as row vectors, and represents the identification dictionary 23 (corresponding to θd in Equation (1)).
- This matrix A is learned (generated) in advance as a projection matrix onto a partial space where it is easy to determine whether or not they are the same pattern by a method such as linear discriminant analysis.
- (Ax)^T represents the transposed vector of the vector (Ax).
- |Ax| and |Ay| represent the norms of the vectors (Ax) and (Ay), in this order.
- The learning unit 30 learns (generates) the quality dictionary u so that the approximation in the following Equation (4) holds. That is, the quality dictionary is a dictionary that estimates the collation score between a degradation pattern and the corresponding pattern in a non-degraded state. When a linear function is used for this estimation, the matching score s can be expressed as in Equation (4).
- x represents a feature vector based on the feature of the sample pattern extracted by the feature extraction unit 29
- y represents a feature vector based on the features of the degradation pattern extracted by the feature extraction unit 29.
- y T denotes a transposed vector of the feature vector y.
- u represents a quality dictionary (corresponding to ⁇ q in Equation (2)).
- the quality dictionary u is a vector corresponding to a coefficient of a linear function.
- the learning unit 30 learns the quality dictionary u by, for example, a linear regression method.
- FIG. 5 is a block diagram showing a simplified configuration of the pattern matching apparatus of the fourth embodiment.
- the pattern matching device 40 includes a control device 41 and a storage device 42.
- the storage device 42 includes a computer-readable storage medium (nonvolatile storage medium) 43.
- the storage medium 43 stores various data and programs.
- the storage medium 43 stores at least a quality dictionary 45, an identification dictionary 46, and a program 47.
- Data and programs written in the storage medium 43 may be stored in a portable storage medium. For example, these data and programs may be written in the storage medium 43 by being read into the pattern matching device 40 from a portable storage medium.
- the quality dictionary 45 is a dictionary learned (generated) by the dictionary learning device described in the first to third embodiments.
- the identification dictionary 46 is a dictionary similar to the identification dictionary 23 used when learning the quality dictionary 45.
- the program 47 is a program showing a control procedure for controlling the overall operation of the pattern matching device 40.
- the control device 41 has, for example, a CPU.
- the control device 41 has various functions for controlling the overall operation of the pattern matching device 40 by operating according to the program read from the storage device 42.
- The control device 41 includes, as functional units, a reception unit 50, a feature extraction unit 51, a quality evaluation unit 52, a score calculation unit 53, a score correction unit 54, and a determination unit 55.
- the accepting unit 50 has a function of accepting a verification target pattern when a verification target pattern for pattern verification is input to the pattern verification device 40.
- the feature extraction unit 51 has a function of extracting features from the received verification target pattern in the same manner as the feature extraction unit 29 of the dictionary learning device 20 described above.
- The quality evaluation unit 52 has a function of evaluating the quality of a collation target pattern using the extracted features of the collation target pattern and the quality dictionary 45. For example, suppose that two collation target patterns Ix and Iy, which are to be collated against each other, are input to the pattern matching device 40, and that the features extracted from the collation target patterns Ix and Iy by the feature extraction unit 51 are the feature vectors x and y. Based on the feature vectors x and y and the quality dictionary 45, the quality evaluation unit 52 calculates the qualities q_x and q_y of the collation target patterns Ix and Iy according to the following Equations (5) and (6).
- x^T in Equation (5) represents the transposed vector of the feature vector x.
- y^T in Equation (6) represents the transposed vector of the feature vector y.
- u in the equations (5) and (6) indicates the quality dictionary 45.
- That is, the same quality dictionary 45 is used for both of the two collation target patterns Ix and Iy to be collated.
- The score calculation unit 53 has a function of calculating a collation score using the features (feature vectors x, y) of the collation target patterns Ix and Iy extracted by the feature extraction unit 51 and the identification dictionary 46.
- the function for calculating the matching score is the same as the function used by the score calculation unit 28 in the dictionary learning device 20 that has learned (generated) the quality dictionary 45. Specifically, for example, the score calculation unit 53 calculates the matching score s according to the following mathematical formula (7).
- P is a function used by the score correction unit 54 to correct the matching score s according to the qualities q_x, q_y of the collation target patterns Ix, Iy, yielding the corrected matching score s_h.
- a polynomial function can be used as the correction function P.
- the correction function P can be expressed as the following formula (9).
- a_1 to a_6 are parameters that determine the correction function P. These parameters are also learned (generated) in advance, as is the matrix A described above.
- q x and q y represent the quality of the verification target patterns Ix and Iy.
- The determination unit 55 has a function of determining the collation result of the collation target patterns Ix and Iy based on the corrected matching score s_h. More specifically, the determination unit 55 compares the corrected matching score s_h with a predetermined threshold α and determines whether the matching score s_h is equal to or larger than the threshold α. When the matching score s_h is equal to or larger than the threshold α, the determination unit 55 outputs the numerical value "1", indicating that the collation target patterns Ix and Iy are the same pattern, as the determination result D(s_h).
- When the determination unit 55 determines that the matching score s_h is not equal to or larger than the threshold α (that is, the matching score s_h is smaller than the threshold α), it outputs the numerical value "0", indicating that the collation target patterns Ix and Iy are different patterns, as the determination result D(s_h).
- the determination result can be expressed as the following equation (10).
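The quality evaluation, score correction, and determination steps (Equations (5)-(10)) might be sketched as follows. The exact monomials of the polynomial correction function P are not recoverable from this text, so the polynomial below — along with the toy quality dictionary, feature vectors, raw score, and parameter values — is a hypothetical stand-in:

```python
import numpy as np

def quality(u, v):
    """Equations (5)-(6): quality of a collation-target feature vector,
    obtained by applying the quality dictionary u to it."""
    return float(u @ v)

def corrected_score(s, qx, qy, a):
    """A polynomial correction in the spirit of Equation (9); the exact
    terms are a hypothetical stand-in with parameters a1..a6."""
    a1, a2, a3, a4, a5, a6 = a
    return a1 * s + a2 * qx + a3 * qy + a4 * qx * qy + a5 * s * (qx + qy) + a6

def decide(s_h, alpha=0.5):
    """Equation (10): D(s_h) = 1 if s_h >= alpha, else 0."""
    return 1 if s_h >= alpha else 0

u = np.array([0.5, 0.5])                    # toy quality dictionary
x, y = np.array([1.0, 1.0]), np.array([0.2, 0.2])
qx, qy = quality(u, x), quality(u, y)       # per-pattern qualities
a = (1.0, 0.0, 0.0, 0.2, 0.0, 0.0)          # hypothetical parameters
s_h = corrected_score(0.9, qx, qy, a)       # raw score s = 0.9 assumed
result = decide(s_h)
```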
- FIG. 6 is a flowchart showing an example of the collation process executed by the pattern collation apparatus 40.
- This flowchart represents an example of a control procedure of a program executed by the control device 41 in the pattern matching device 40.
- the control device 41 receives the verification target patterns Ix and Iy (step S201). Then, the control device 41 (feature extraction unit 51) extracts the features (feature vectors x, y) of the matching target patterns Ix, Iy (step S202). Thereafter, the control device 41 (quality evaluation unit 52) evaluates the quality q x and q y of the verification target patterns Ix and Iy based on the extracted features (step S203).
- control device 41 uses the characteristics (feature vectors x, y) of the verification target patterns Ix, Iy and the identification dictionary 46 to determine the verification score s for the verification target patterns Ix, Iy. Calculate (step S204).
- control device 41 (score correction unit 54) corrects the matching score s using the quality q x , q y of the matching target patterns Ix, Iy and the quality dictionary 45, and the corrected matching score s h is calculated (step S205).
- The control device 41 (determination unit 55) then determines whether the corrected matching score s_h is equal to or greater than the threshold α.
- When determining that the corrected matching score s_h is equal to or greater than the threshold α, the control device 41 (determination unit 55) determines that the collation target patterns Ix and Iy are the same pattern and outputs the numerical value "1" (step S207).
- When determining that the corrected matching score s_h is not equal to or greater than the threshold α (that is, smaller than the threshold α), the control device 41 (determination unit 55) determines that the collation target patterns Ix and Iy are different patterns and outputs the numerical value "0" (step S208). After making this determination, the control device 41 enters a standby state in preparation for the next collation process.
- As described above, the pattern matching device 40 of the fourth embodiment evaluates the quality of the collation target patterns using the quality dictionary learned by the dictionary learning device 20 of the second or third embodiment. Since that quality dictionary is learned using the matching score, the pattern matching device 40 can perform quality evaluation of the collation target patterns that is specialized for the matching process. The pattern matching device 40 can thereby further improve the accuracy of the collation result, and hence the reliability of the collation determination.
- the present invention is not limited to the first to fourth embodiments, and various embodiments can be adopted.
- In the fourth embodiment, the same quality dictionary is used both when evaluating the collation target pattern Ix and when evaluating the collation target pattern Iy.
- the quality dictionary for evaluating the verification target pattern Ix and the quality dictionary for evaluating the verification target pattern Iy may be different quality dictionaries.
- For example, suppose that one of the collation target patterns Ix and Iy is an image based on a photograph (still image) and the other is an image based on a video (moving image). In this case, the degradation states (degradation processes) of the patterns Ix and Iy differ.
- Such separate quality dictionaries are generated separately by machine learning, by the dictionary learning devices described in the first to third embodiments, based on separate sample patterns corresponding to the collation target patterns Ix and Iy and on degradation patterns subjected to the separate degradation processing corresponding to the collation target patterns Ix and Iy, respectively.
- There is also a case where one of the collation target patterns Ix and Iy is of high quality. For example, one of the collation target patterns Ix and Iy may be a reference pattern given in advance, and a reference pattern is usually not adopted unless it is a high-quality pattern.
- in biometric authentication, a verification target pattern acquired as the object of verification is matched against the reference pattern, and whether authentication succeeds is determined based on this matching.
- the pattern matching device 40 may omit the process of evaluating the quality of the high-quality one of the matching target patterns (the reference pattern). In this case, the pattern matching device 40 corrects the matching score based on the quality evaluation result of only the other matching target pattern, whose quality has been evaluated.
- each function described above is realized by a computer executing a computer program (software) stored in a storage medium.
- alternatively, a configuration in which the functions are realized by hardware may be used.
- the present invention is applicable to fields such as authentication technology using pattern matching.
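The matching flow described above (quality evaluation using the learned quality dictionary, score correction, and threshold decision, optionally omitting the quality evaluation of a high-quality reference pattern) can be sketched as follows. This is an illustrative sketch only, not the patented implementation: the linear form of the quality model, the cosine similarity, the multiplicative score correction, and all function names are assumptions introduced for illustration.

```python
import numpy as np

def quality_score(pattern, quality_dict):
    """Evaluate the degradation state of a pattern. The learned 'quality
    dictionary' is assumed here to be a linear model (w, b) predicting the
    matching score the pattern would achieve against an ideal counterpart."""
    w, b = quality_dict
    return float(np.clip(w @ pattern + b, 0.0, 1.0))

def match(ix, iy, quality_dict, threshold=0.5, iy_is_reference=True):
    # Raw matching score: cosine similarity between feature vectors.
    raw = float(ix @ iy / (np.linalg.norm(ix) * np.linalg.norm(iy)))
    # Quality evaluation; skipped for a reference pattern known in advance
    # to be of high quality (its quality is taken as 1.0).
    qx = quality_score(ix, quality_dict)
    qy = 1.0 if iy_is_reference else quality_score(iy, quality_dict)
    # Correct the raw score using the quality estimates: a low-quality
    # probe makes the raw score less trustworthy, so it is discounted.
    corrected = raw * qx * qy
    return corrected, corrected >= threshold
```

The threshold decision on the corrected score corresponds to the determination means of the pattern matching device.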
Abstract
Description
[Patent Document 2] JP 2011-002494 A
[Patent Document 3] JP 2007-140823 A
[Patent Document 4] JP 2006-072553 A
[Patent Document 5] JP 2008-107408 A
The present invention has been made to solve the above problem. That is, a main object of the present invention is to provide a technique capable of further improving the accuracy of pattern matching.
The dictionary learning device of the present invention has:
score calculation means for calculating a matching score indicating the degree of similarity between a sample pattern, which is a sample of the patterns subjected to pattern matching, and a degraded pattern obtained by applying a degradation process to that sample pattern; and
learning means for learning a quality dictionary, used in a process of evaluating the degradation state of a verification target pattern that is a pattern subjected to pattern matching, based on the calculated matching score and the degraded pattern.
The pattern matching device of the present invention has:
a quality dictionary learned by the dictionary learning device of the present invention;
quality evaluation means for evaluating, based on the quality dictionary, the degradation state of a verification target pattern that is a pattern subjected to pattern matching;
score calculation means for calculating a matching score indicating the degree of similarity between a plurality of the verification target patterns, by a process corresponding to the process by which the dictionary learning device calculates a matching score;
score correction means for correcting the matching score using the evaluation result of the quality evaluation means; and
determination means for determining a matching result for the verification target patterns based on the corrected matching score.
In the dictionary learning method of the present invention, a computer calculates a matching score indicating the degree of similarity between a sample pattern, which is a sample of the patterns subjected to pattern matching, and a degraded pattern obtained by degrading that sample pattern, and
the computer learns, based on the calculated matching score and the degraded pattern, a quality dictionary used in a process of evaluating the degradation state of a verification target pattern that is a pattern subjected to pattern matching.
The storage medium of the present invention stores a control procedure that causes a computer to execute:
a process of calculating a matching score indicating the degree of similarity between a sample pattern, which is a sample of the patterns subjected to pattern matching, and a degraded pattern obtained by degrading that sample pattern; and
a process of learning, based on the calculated matching score and the degraded pattern, a quality dictionary used in a process of evaluating the degradation state of a verification target pattern that is a pattern subjected to pattern matching.
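The dictionary learning summarized above (compute a matching score between each sample pattern and its degraded counterpart, then learn a quality dictionary from the degraded patterns and those scores) can be sketched as follows. This is a hedged illustration: the additive-noise degradation process, the cosine matching score, and the least-squares linear model standing in for the quality dictionary are all assumptions introduced for illustration, not the claimed method.

```python
import numpy as np

rng = np.random.default_rng(0)

def degrade(pattern, noise=0.3):
    # Degradation process (assumed): additive noise standing in for
    # blur, low resolution, compression, and similar degradations.
    return pattern + noise * rng.standard_normal(pattern.shape)

def matching_score(a, b):
    # Similarity between the sample pattern and its degraded pattern.
    return float(a @ b / (np.linalg.norm(a) * np.linalg.norm(b)))

def learn_quality_dictionary(samples):
    """Learn a 'quality dictionary': here, a least-squares linear map from
    a degraded pattern to the matching score it achieved against its
    non-degraded original."""
    X, y = [], []
    for s in samples:
        d = degrade(s)
        X.append(d)
        y.append(matching_score(s, d))
    X = np.column_stack([np.array(X), np.ones(len(X))])  # append bias term
    coef, *_ = np.linalg.lstsq(X, np.array(y), rcond=None)
    return coef[:-1], coef[-1]  # (weights, bias)
```

At verification time such a dictionary predicts, from a degraded pattern alone, the matching score it would achieve against its ideal, non-degraded counterpart.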
FIG. 1 is a block diagram showing, in simplified form, the configuration of a dictionary learning device according to a first embodiment of the present invention. The dictionary learning device 1 of the first embodiment is a device (for example, a computer) that learns (generates by machine learning) a quality dictionary used when performing pattern matching. The dictionary learning device 1 has a score calculation unit (score calculation means) 2 and a learning unit (learning means) 3.
A second embodiment of the present invention is described below.
A third embodiment of the present invention is described below.
A fourth embodiment of the present invention is described below.
The fourth embodiment describes one example of a pattern matching device that uses a quality dictionary learned (generated) by the dictionary learning device described in the second or third embodiment. In the description of the fourth embodiment, explanations overlapping those given for the second and third embodiments are omitted.
The present invention is not limited to the first to fourth embodiments, and various other embodiments can be adopted. For example, the fourth embodiment describes an example in which the same quality dictionary is used both when evaluating the verification target pattern Ix and when evaluating the verification target pattern Iy. Alternatively, the quality dictionary for evaluating the verification target pattern Ix and the quality dictionary for evaluating the verification target pattern Iy may be separate quality dictionaries. One example of a case where separate quality dictionaries are used in this way is when the degradation states (degradation processes) of the verification target patterns Ix and Iy differ, for instance when one of the patterns Ix and Iy is an image based on a photograph (still image) and the other is an image based on a video (moving image).
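The use of separate quality dictionaries for patterns with different degradation processes can be sketched as follows. This is an illustrative sketch only: the `photo`/`video` keys, the linear form of the quality model, and the multiplicative score correction are assumptions introduced for illustration.

```python
import numpy as np

def evaluate_quality(pattern, quality_dict):
    # Linear quality model (assumed form): weights and bias learned
    # separately for each degradation process by a dictionary learning
    # device of the first to third embodiments.
    w, b = quality_dict
    return float(np.clip(w @ pattern + b, 0.0, 1.0))

def corrected_score(ix, iy, raw_score, dictionaries):
    """Correct a raw matching score using a separate quality dictionary
    per degradation process: one for the still-image pattern and one for
    the moving-image pattern."""
    qx = evaluate_quality(ix, dictionaries["photo"])
    qy = evaluate_quality(iy, dictionaries["video"])
    return raw_score * qx * qy
```

Each verification target pattern is thus evaluated with the dictionary trained on its own degradation process, so the correction reflects how each pattern was actually degraded.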
2, 28, 53 Score calculation unit
3, 30 Learning unit
5, 40 Pattern matching device
7, 52 Quality evaluation unit
8, 54 Score correction unit
9, 55 Determination unit
29, 51 Feature extraction unit
Claims (9)
- A dictionary learning device comprising: score calculation means for calculating a matching score indicating the degree of similarity between a sample pattern, which is a sample of the patterns subjected to pattern matching, and a degraded pattern obtained by applying a degradation process to the sample pattern; and learning means for learning a quality dictionary, used in a process of evaluating the degradation state of a verification target pattern that is a pattern subjected to pattern matching, based on the calculated matching score and the degraded pattern.
- The dictionary learning device according to claim 1, further comprising degradation processing means for generating the degraded pattern by applying the degradation process to the sample pattern.
- The dictionary learning device according to claim 1 or 2, further comprising feature extraction means for extracting features of each of the sample pattern and the degraded pattern, wherein the score calculation means calculates the matching score based on the extracted features of the sample pattern and the degraded pattern.
- The dictionary learning device according to any one of claims 1 to 3, wherein the quality dictionary is a dictionary for obtaining a matching score indicating the degree of similarity between the verification target pattern and a non-degraded virtual pattern corresponding to the verification target pattern.
- The dictionary learning device according to any one of claims 1 to 4, wherein the sample pattern and the degraded pattern are image patterns or audio signal patterns.
- A pattern matching device comprising: a quality dictionary learned by the dictionary learning device according to any one of claims 1 to 5; quality evaluation means for evaluating, based on the quality dictionary, the degradation state of a verification target pattern that is a pattern subjected to pattern matching; score calculation means for calculating a matching score indicating the degree of similarity between a plurality of the verification target patterns, by a process corresponding to the process by which the dictionary learning device calculates a matching score; score correction means for correcting the matching score using the evaluation result of the quality evaluation means; and determination means for determining a matching result for the verification target patterns based on the corrected matching score.
- The pattern matching device according to claim 6, further comprising feature extraction means for extracting features of the verification target pattern, wherein the score calculation means calculates the matching score based on the extracted features of the verification target pattern.
- A dictionary learning method in which a computer calculates a matching score indicating the degree of similarity between a sample pattern, which is a sample of the patterns subjected to pattern matching, and a degraded pattern obtained by degrading the sample pattern, and the computer learns, based on the calculated matching score and the degraded pattern, a quality dictionary used in a process of evaluating the degradation state of a verification target pattern subjected to pattern matching.
- A storage medium storing a control procedure that causes a computer to execute: a process of calculating a matching score indicating the degree of similarity between a sample pattern, which is a sample of the patterns subjected to pattern matching, and a degraded pattern obtained by degrading the sample pattern; and a process of learning, based on the calculated matching score and the degraded pattern, a quality dictionary used in a process of evaluating the degradation state of a verification target pattern that is a pattern subjected to pattern matching.
Priority Applications (2)
Application Number | Priority Date | Filing Date | Title |
---|---|---|---|
JP2013549117A JP6065842B2 (ja) | 2011-12-16 | 2012-12-12 | Dictionary learning device, pattern matching device, dictionary learning method, and computer program |
US14/364,467 US9262694B2 (en) | 2011-12-16 | 2012-12-12 | Dictionary learning device, pattern matching apparatus, method for learning dictionary and storage medium |
Applications Claiming Priority (2)
Application Number | Priority Date | Filing Date | Title |
---|---|---|---|
JP2011275872 | 2011-12-16 | ||
JP2011-275872 | 2011-12-16 |
Publications (1)
Publication Number | Publication Date |
---|---|
WO2013088707A1 true WO2013088707A1 (ja) | 2013-06-20 |
Family
ID=48612181
Family Applications (1)
Application Number | Title | Priority Date | Filing Date |
---|---|---|---|
PCT/JP2012/007929 WO2013088707A1 (ja) | Dictionary learning device, pattern matching device, dictionary learning method, and storage medium | 2011-12-16 | 2012-12-12 |
Country Status (3)
Country | Link |
---|---|
US (1) | US9262694B2 (ja) |
JP (1) | JP6065842B2 (ja) |
WO (1) | WO2013088707A1 (ja) |
Cited By (4)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
WO2015025704A1 (ja) * | 2013-08-23 | 2015-02-26 | 日本電気株式会社 | Video processing device, video processing method, and video processing program |
JP2020508531A (ja) * | 2017-09-08 | 2020-03-19 | ジョンアン インフォメーション テクノロジー サービシズ カンパニー リミテッド | Image quality evaluation method and image quality evaluation system |
WO2021136029A1 (zh) * | 2019-12-31 | 2021-07-08 | 百果园技术(新加坡)有限公司 | Rescoring model training method and device, and speech recognition method and device |
US11443552B2 (en) | 2016-03-17 | 2022-09-13 | Kabushiki Kaisha Toshiba | Image pattern similarity calculation device and recognition device |
Families Citing this family (2)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
EP3056138B1 (en) | 2015-02-11 | 2020-12-16 | Samsung Electronics Co., Ltd. | Electrocardiogram (ecg)-based authentication apparatus and method thereof, and training apparatus and method thereof for ecg-based authentication |
JP6333871B2 (ja) * | 2016-02-25 | 2018-05-30 | ファナック株式会社 | Image processing device that displays an object detected from an input image |
Citations (3)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
JPH04123276A (ja) * | 1990-09-14 | 1992-04-23 | Fujitsu Ltd | 指紋照合装置 |
JP2010026848A (ja) * | 2008-07-22 | 2010-02-04 | Hitachi Omron Terminal Solutions Corp | 紙葉類識別装置 |
JP2011197902A (ja) * | 2010-03-18 | 2011-10-06 | Fujitsu Ltd | 画像処理装置、画像処理方法及び画像処理用コンピュータプログラム |
Family Cites Families (10)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
JPH0225898A (ja) * | 1988-07-15 | 1990-01-29 | Toshiba Corp | 音声認識装置 |
CN100377171C (zh) * | 2004-08-13 | 2008-03-26 | 富士通株式会社 | 生成劣化字符图像的方法和装置 |
JP4564804B2 (ja) | 2004-08-31 | 2010-10-20 | セコム株式会社 | 生体情報照合装置 |
JP2007140823A (ja) | 2005-11-17 | 2007-06-07 | Omron Corp | 顔照合装置、顔照合方法及びプログラム |
JP4749990B2 (ja) | 2006-10-23 | 2011-08-17 | 三菱電機株式会社 | 音声認識装置 |
EP2129136A4 (en) * | 2007-01-31 | 2016-04-13 | Nec Corp | IMAGE QUALITY EVALUATION PROCESS, IMAGE QUALITY EVALUATION DEVICE AND IMAGE QUALITY EVALUATION PROGRAM |
JP4881278B2 (ja) * | 2007-10-31 | 2012-02-22 | 株式会社東芝 | 物体認識装置及びその方法 |
JP2010129405A (ja) | 2008-11-28 | 2010-06-10 | Autonetworks Technologies Ltd | 絶縁電線およびワイヤーハーネス |
JP4981850B2 (ja) | 2009-06-16 | 2012-07-25 | 日本電信電話株式会社 | 音声認識装置とその方法と、プログラムと記録媒体 |
US8948467B2 (en) * | 2010-08-06 | 2015-02-03 | Honeywell International Inc. | Ocular and iris processing system and method |
2012
- 2012-12-12 US US14/364,467 patent/US9262694B2/en active Active
- 2012-12-12 WO PCT/JP2012/007929 patent/WO2013088707A1/ja active Application Filing
- 2012-12-12 JP JP2013549117A patent/JP6065842B2/ja active Active
Non-Patent Citations (2)
Title |
---|
GO KOTAKI: "Pattern Matching for Blurred Image Using Eigen Decomposed Template", THE INSTITUTE OF ELECTRICAL ENGINEERS OF JAPAN KENKYUKAI SHIRYO, 25 March 2011 (2011-03-25), pages 73 - 78 * |
HIROYUKI ISHIDA: "Generative Learning Method for the Recognition of Low-Resolution Characters Using the Subspace Method", IEICE TECHNICAL REPORT, vol. 104, no. 90, 20 May 2004 (2004-05-20), pages 37 - 42 * |
Cited By (6)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
WO2015025704A1 (ja) * | 2013-08-23 | 2015-02-26 | 日本電気株式会社 | Video processing device, video processing method, and video processing program |
JPWO2015025704A1 (ja) * | 2013-08-23 | 2017-03-02 | 日本電気株式会社 | Video processing device, video processing method, and video processing program |
US10037466B2 (en) | 2013-08-23 | 2018-07-31 | Nec Corporation | Video processing apparatus, video processing method, and video processing program |
US11443552B2 (en) | 2016-03-17 | 2022-09-13 | Kabushiki Kaisha Toshiba | Image pattern similarity calculation device and recognition device |
JP2020508531A (ja) * | 2017-09-08 | 2020-03-19 | ジョンアン インフォメーション テクノロジー サービシズ カンパニー リミテッド | Image quality evaluation method and image quality evaluation system |
WO2021136029A1 (zh) * | 2019-12-31 | 2021-07-08 | 百果园技术(新加坡)有限公司 | Rescoring model training method and device, and speech recognition method and device |
Also Published As
Publication number | Publication date |
---|---|
JPWO2013088707A1 (ja) | 2015-04-27 |
US20140301634A1 (en) | 2014-10-09 |
US9262694B2 (en) | 2016-02-16 |
JP6065842B2 (ja) | 2017-01-25 |
Similar Documents
Publication | Publication Date | Title |
---|---|---|
JP6065842B2 (ja) | Dictionary learning device, pattern matching device, dictionary learning method, and computer program | |
KR101365789B1 (ko) | Facial feature point position correction device, facial feature point position correction method, and computer-readable recording medium recording a facial feature point position correction program | |
US8805002B2 (en) | Method of determining reference features for use in an optical object initialization tracking process and object initialization tracking method | |
JP5766564B2 (ja) | Face authentication device and face authentication method | |
KR20020077257A (ko) | Pattern matching device and pattern matching method | |
JP4858612B2 (ja) | Object recognition system, object recognition method, and object recognition program | |
JP6833620B2 (ja) | Image analysis device, neural network device, learning device, image analysis method, and program | |
US20140093142A1 (en) | Information processing apparatus, information processing method, and information processing program | |
US20100239128A1 (en) | Registering device, checking device, program, and data structure | |
JP5040835B2 (ja) | Biometric information reading device, biometric information reading method, and biometric information reading program | |
US11526963B2 (en) | Image processing apparatus, image processing method, and storage medium | |
WO2013122009A1 (ja) | Reliability acquisition device, reliability acquisition method, and reliability acquisition program | |
WO2014112346A1 (ja) | Feature point position detection device, feature point position detection method, and feature point position detection program | |
JP2018147240A (ja) | Image processing device, image processing method, and image processing program | |
US20190188443A1 (en) | Information processing apparatus, biometric authentication method, and recording medium having recorded thereon biometric authentication program | |
JP6194880B2 (ja) | Information processing device, information processing method, and recording medium | |
US8483449B2 (en) | Registration device, checking device, program, and data structure | |
CN113139564A (zh) | Training method and apparatus for a keypoint detection model, electronic device, and storage medium | |
EP4156087A1 (en) | Authentication method, authentication program, and authentication device | |
US8873810B2 (en) | Feature-based method and system for blur estimation in eye images | |
CN113222480A (zh) | Training method and apparatus for an adversarial example generation model | |
JP7348945B2 (ja) | Information processing method and information processing system | |
JP6763408B2 (ja) | Information processing device, information processing method, and program | |
WO2021111832A1 (ja) | Information processing method, information processing system, and information processing device | |
CN116663655B (zh) | Defense method against adversarial attacks and electronic device | |
Legal Events
Date | Code | Title | Description |
---|---|---|---|
121 | Ep: the epo has been informed by wipo that ep was designated in this application |
Ref document number: 12858001 Country of ref document: EP Kind code of ref document: A1 |
ENP | Entry into the national phase |
Ref document number: 2013549117 Country of ref document: JP Kind code of ref document: A |
WWE | Wipo information: entry into national phase |
Ref document number: 14364467 Country of ref document: US |
NENP | Non-entry into the national phase |
Ref country code: DE |
122 | Ep: pct application non-entry in european phase |
Ref document number: 12858001 Country of ref document: EP Kind code of ref document: A1 |