WO2006009043A1 - Data collation method, data collation device, and data collation program - Google Patents
Data collation method, data collation device, and data collation program
- Publication number
- WO2006009043A1 (PCT/JP2005/012952)
- Authority
- WO
- WIPO (PCT)
- Prior art keywords
- data
- state
- component
- collation
- state change
- Prior art date
Classifications
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06T—IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
- G06T7/00—Image analysis
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06V—IMAGE OR VIDEO RECOGNITION OR UNDERSTANDING
- G06V40/00—Recognition of biometric, human-related or animal-related patterns in image or video data
- G06V40/10—Human or animal bodies, e.g. vehicle occupants or pedestrians; Body parts, e.g. hands
- G06V40/16—Human faces, e.g. facial parts, sketches or expressions
- G06V40/168—Feature extraction; Face representation
- G06V40/169—Holistic features and representations, i.e. based on the facial image taken as a whole
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06T—IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
- G06T1/00—General purpose image data processing
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06T—IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
- G06T15/00—3D [Three Dimensional] image rendering
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06V—IMAGE OR VIDEO RECOGNITION OR UNDERSTANDING
- G06V10/00—Arrangements for image or video recognition or understanding
- G06V10/70—Arrangements for image or video recognition or understanding using pattern recognition or machine learning
- G06V10/77—Processing image or video features in feature spaces; using data integration or data reduction, e.g. principal component analysis [PCA] or independent component analysis [ICA] or self-organising maps [SOM]; Blind source separation
- G06V10/7715—Feature extraction, e.g. by transforming the feature space, e.g. multi-dimensional scaling [MDS]; Mappings, e.g. subspace methods
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06V—IMAGE OR VIDEO RECOGNITION OR UNDERSTANDING
- G06V40/00—Recognition of biometric, human-related or animal-related patterns in image or video data
- G06V40/10—Human or animal bodies, e.g. vehicle occupants or pedestrians; Body parts, e.g. hands
Definitions
- the present invention relates to a data collation method and a data collation system (apparatus) that collate data of an object subject to state changes, captured at one point in time, against data of that object.
- the present invention also relates to a data collation program that collates an image of data of such a state-changing object at a certain point in time.
- in particular, the invention relates to a data collation method and data collation system for collating images and voices of biometric information, such as human faces and voiceprints.
- the present invention further relates to a data collation program that collates images and sounds of biometric information, such as human faces and voiceprints, captured at one point in time against images of the same biometric information from other points in time.
- state changes may be an obstacle to collation.
- in data subject to state changes, such as a person's face and voice, which are typical of biometric information, characteristics present at an early age may be lost through aging, and characteristics of later age may appear.
- as a result, even for the same object, the state data at one point in time may differ from the state data at another point in time, and this difference can hinder collation.
- techniques such as CG (computer graphics) can be used to apply aging changes to a face image, producing an aged face image.
- US Pat. No. 6,556,196 describes an image processing method that, when processing a face image, uses a three-dimensional model to simulate the addition of aging and facial features to an image and to add unclear features to the image.
- a general model of a deformable face is created from the 3D face data stored in the face database, and the input face image is pasted onto the created model.
- the model is then deformed using a modeler to produce feature changes, including state changes.
- Japanese Patent Application Laid-Open No. 2003-233671 describes a method for predicting the progress of at least one state on an external body surface of a subject: the method receives first data representing the at least one state, receives second data reflecting how that state of the body exterior is expected to evolve over time, generates a predicted progression of the state based on the first data and the second data, and sends the predicted progression to the subject.
- An object of the present invention is to provide a data collation method, a data collation apparatus and a data collation program that can improve the collation performance.
- an object of the present invention is to provide a data collation method, a data collation apparatus, and a data collation program that improve collation performance for data subject to state changes, by adding statistical state changes to the data before the state change and using, for collation, state change data to which the features of general state changes have been added.
- a further object of the present invention is to provide a data collation method, a data collation apparatus, and a data collation program that improve data collation performance in a specific state, by using state change data in which the general state change of each age is added to the pre-change data, thereby establishing correspondence of the constituent components between state categories.
- another object is to provide a data verification method, a data verification device, and a data verification program that reduce the burden on the operator at the time of data collation, by adding a statistical state change to the pre-change data so that the state change data used for collation can be created automatically.
- the data collation method according to the present invention comprises: a component storage step of decomposing a measured quantity of an object by a predetermined method and storing the obtained components in association with each of a plurality of states taken by the object; a component decomposition step of decomposing the measured quantity of the object to be collated into components in a predetermined state among the plurality of states; a parameter conversion step of converting a parameter corresponding to a component in the predetermined state into a post-conversion parameter in a second state, different from the predetermined state, among the plurality of states; a state change data creation step of creating state change data that gives a predetermined state change to the data of the object to be collated, using the component corresponding to the second state stored in the component storage step and the converted parameter; and a collation step of collating the state change data with pre-stored collation data.
- verification is performed using data obtained by adding state information to the input data. As a result, the collation performance of the data is improved.
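As a rough illustration, the steps above (component decomposition, parameter conversion, state change data creation, and collation) can be strung together as in the following Python sketch. All function and variable names here are hypothetical; the patent does not prescribe this implementation, and the distance-based collation at the end is an assumption.

```python
import numpy as np

def collate_with_state_change(probe, state_dbs, src_state, dst_state,
                              convert_params, gallery):
    """Sketch of the claimed collation flow (all names hypothetical).

    probe          : measured data of the object to be collated (1-D vector)
    state_dbs      : dict mapping a state label to its component matrix
                     (columns are the components stored for that state)
    convert_params : function mapping coefficients of src_state to dst_state
    gallery        : dict mapping identity labels to registered data vectors
    """
    # Component decomposition step: fit the probe with the components
    # stored for its (predetermined) state, minimising reconstruction error.
    P = state_dbs[src_state]
    c, *_ = np.linalg.lstsq(P, probe, rcond=None)

    # Parameter conversion step: map the coefficients to the second state.
    d = convert_params(c)

    # State change data creation step: rebuild the data in the second state.
    Q = state_dbs[dst_state]
    state_change_data = Q @ d

    # Collation step: compare against the pre-stored collation data group.
    best = min(gallery,
               key=lambda k: np.linalg.norm(gallery[k] - state_change_data))
    return best, state_change_data
```

In this sketch the coefficient fit, the conversion function, and the nearest-neighbour comparison each stand in for one claimed step; any of them could be replaced by the alternatives the patent describes (e.g. a learned neural-network conversion).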
- the predetermined method is principal component analysis.
- the collation performance of data accompanying a state change is improved.
- the configuration of the data collating device is simplified. "Learning" means, for example, creating a face image by a conversion using a neural network trained with data specific to the same person as the learning data.
- the data of the object to be collated is human biometric data.
- the biometric data is, for example, an image of a part of the human body, such as a face or a fingerprint; one-dimensional data such as voice; or three-dimensional data such as face shape.
- each of the plurality of states corresponds to a state at a different point in time.
- a data collating apparatus is a computer having a function of executing each of the steps included in the data collating method according to the present invention.
- the data verification program according to the present invention causes a computer to execute the data verification method according to the present invention.
- the data collation method according to the present invention is a data collation method for collating data of an object to be collated with data corresponding to that object included in a data group, and comprises: a collation data accumulation step of accumulating in advance the data group registered for collation; a component decomposition step of decomposing the data to be collated into constituent components in a predetermined state; a parameter conversion step of converting the parameters corresponding to the constituent components in the predetermined state (realized by the component coefficients ci and di) into parameters in a different state; a state change data creation step of accumulating the constituent components of the data divided by state and creating, using the accumulated components and the parameters converted in the parameter conversion step, state change data in which a predetermined state change is given to the data to be collated; and a collation step of collating the state change data created in the state change data creation step with the collation data group accumulated in the collation data accumulation step.
- the data collation method may be a data collation method for collating data of the object to be collated with corresponding data included in a data group, comprising: a collation data storage step of storing in advance the data group registered for collation; a state change data creation step of creating state change data in which data in a predetermined state is changed to data in a different state by a conversion that uses learning based on data classified by state; and a collation step of collating the state change data created in the state change data creation step with the collation data group accumulated in the collation data accumulation step.
- the data collation method may be a data collation method for collating human biometric data with corresponding biometric data included in a data group, comprising: a collation data storage step of storing in advance the data group registered for collation; a component decomposition step of decomposing the biometric data to be collated into constituent components in a predetermined state; a parameter conversion step of converting the parameters corresponding to the constituent components in the predetermined state into parameters of a different state; a state change data creation step of accumulating the components of biometric data divided by state and creating, using the accumulated components and the parameters converted in the parameter conversion step, state change data that gives a predetermined state change to the biometric data to be collated; and a collation step of collating the created state change data with the collation data group accumulated in the collation data accumulation step.
- the data collation method may also store in advance the data group registered for collation in a collation data accumulation step, and change biometric data in a given state into biometric data in a state different from the given state by a conversion that uses learning based on biometric data divided by state.
- in the component decomposition step, the biometric data is decomposed into constituent components of a predetermined age; in the parameter conversion step, the parameters corresponding to the constituent components of the predetermined age are converted into parameters for an age different from the predetermined age; in the state change data creation step, the components of biometric data divided by age are accumulated, and aging data (realized by the state change data) that gives the biometric data a predetermined aging change is created using the accumulated components and the parameters converted in the parameter conversion step; and in the collation step, the aging data created in the state change data creation step is collated with the collation data group accumulated in the collation data accumulation step.
- in the state change data creation step, the biometric data at a predetermined age may be changed into biometric data at a different age by a conversion that uses learning based on biometric data divided by age, producing aging data; in the collation step, this aging data is collated with the collation data group accumulated in the collation data storage step.
- the data matching method may be a data matching method for matching a face image of a person to be matched with a corresponding face image included in a face image group, comprising: a collation data storage step of storing in advance the face image group registered for matching; a component decomposition step of decomposing the face image into components for a predetermined facial expression; and a parameter conversion step of converting the parameters corresponding to the components for the predetermined facial expression into parameters for a facial expression different from the predetermined one.
- the data collation method may be a data collation method for collating a face image of a person to be collated with a corresponding face image included in a face image group, comprising: a collation data storage step of storing in advance the face image group registered for collation; a facial expression change data creation step of creating facial expression change data in which a face image with a predetermined facial expression is changed into a face image with a facial expression different from the predetermined one, by a conversion that uses learning based on face images divided by expression; and a collation step of collating the facial expression change data created in the facial expression change data creation step with the collation data group accumulated in the collation data accumulation step.
- the data collation apparatus according to the present invention is a data collation apparatus that collates data of an object to be collated with corresponding data included in a data group, and comprises: component decomposition means for decomposing the data into constituent components in a predetermined state (implemented by the component analysis means 101); parameter conversion means for converting the parameters corresponding to the components in the predetermined state into parameters in a different state (implemented by the component coefficient conversion means 103); state change data creation means that accumulates the constituent components of the data divided by state and creates, using the accumulated components and the parameters converted by the parameter conversion means, state change data that gives a predetermined state change to the data (realized by the state change data creation means 102); collation data storage means for storing in advance the data group registered for collation (realized by the collation data storage means 104); and collation means for collating the state change data created by the state change data creation means with the collation data group accumulated by the collation data accumulation means (realized by the collation means 105).
- the data collation device may be a device that collates data of the object to be collated with corresponding data included in a data group, comprising: state change data creation means (implemented by the state change data creation means 102b) that creates state change data in which data in a predetermined state is changed to data in a different state by a conversion that uses learning based on data classified by state; collation data storage means for storing in advance the data group registered for collation; and collation means for collating the state change data created by the state change data creation means with the collation data group accumulated by the collation data storage means. With such a configuration, the data collation performance under state changes can be improved and the configuration of the device can be simplified.
- the data collation device may be a device that collates human biometric data with corresponding biometric data included in a data group, and may comprise: component decomposition means for decomposing the biometric data to be collated into constituent components in a predetermined state; parameter conversion means for converting the parameters corresponding to the constituent components in the predetermined state into parameters in a different state; state change data creation means that accumulates the components of biometric data divided by state and creates state change data giving a predetermined state change to the biometric data to be collated, using the accumulated components and the parameters converted by the parameter conversion means; collation data storage means for accumulating in advance the data group registered for collation; and collation means for collating the created state change data with the accumulated collation data group.
- the data collation device may alternatively comprise: state change data creation means for creating state change data by changing biometric data in a predetermined state into biometric data in a different state, by a conversion that uses learning based on biometric data classified by state; collation data storage means that prestores the data group registered for collation; and collation means for collating the state change data created by the state change data creation means with the collation data group accumulated by the collation data storage means.
- the component decomposition means may decompose the biometric data into the components of a predetermined age; the parameter conversion means may convert the parameters corresponding to the components of the predetermined age into parameters for a different age; the state change data creation means may accumulate the components of biometric data divided by age and generate aging data giving a predetermined secular change to the biometric data, using the accumulated components and the parameters converted by the parameter conversion means; and the collation means may collate the aging data created by the state change creation means with the collation data group accumulated by the collation data storage means.
- the state change data creation means may create aging data in which biometric data at a predetermined age is changed into biometric data at a different age, by a conversion that uses learning based on biometric data divided by age; the collation means may then collate this aging data with the collation data group accumulated by the collation data storage means.
- the data collation device may be a device that collates a face image of a person to be collated with a corresponding face image included in a face image group, and may comprise: component decomposition means for decomposing the face image into components for a predetermined facial expression; parameter conversion means for converting the parameters corresponding to the components for the predetermined facial expression into parameters for a different facial expression; facial expression change data creation means that creates facial expression change data giving a predetermined expression change to the face image, using the accumulated components divided by expression and the parameters converted by the parameter conversion means; collation data storage means for storing in advance the face image groups registered for collation; and collation means for collating the created facial expression change data with the accumulated collation data group.
- the data matching device may comprise: expression change data creation means (state change data creation means 102b) for creating facial expression change data by changing a face image with a predetermined expression into a face image with a different expression, by a conversion that uses learning based on face images divided by expression; matching data storage means for storing in advance the face image groups registered for matching; and matching means for checking the facial expression change data created by the expression change data creation means against the matching data group stored by the matching data storage means.
- the data collation program according to the present invention causes a computer, equipped with collation data accumulation means that accumulates in advance the data group registered for collation, to execute: a process of decomposing the data to be collated into constituent components in a predetermined state; a process of converting the parameters corresponding to the constituent components in the predetermined state into parameters for a different state; a process of accumulating the constituent components of the data divided by state and creating, using the accumulated components and the converted parameters, state change data in which a predetermined state change is given to the data to be collated; and a process of collating the created state change data with the collation data group stored in the collation data storage means.
- the data collation program may cause a computer, equipped with collation data storage means that stores in advance the data group registered for collation, to execute: a process of creating state change data in which data in a predetermined state is changed to data in a different state by a conversion that uses learning based on data divided by state; and a process of collating the created state change data with the collation data group accumulated by the collation data storage means.
- according to the present invention, the state change data is created and collated, so highly accurate collation can be performed using only object data from a single point in time. Furthermore, when creating the state change data, using state change data that takes into account the uniqueness of the object, based on correlations within data of the same object, can further improve collation performance for data subject to state changes.
- FIG. 1 is a block diagram showing an example of the configuration of a data collating apparatus according to the present invention.
- FIG. 2 is a flowchart showing an example of data collation processing in which a data collation apparatus collates face images.
- FIG. 3 is a block diagram showing another configuration example of the data collating apparatus.
- FIG. 4 is a flowchart showing another example of data matching processing in which the data matching device matches face images.
- FIG. 5 is a block diagram showing an example of the configuration of a component analyzer.
- FIG. 6 is a block diagram showing an example of a configuration of a state change creation device.
- FIG. 7 is a block diagram showing an example of a configuration of a component coefficient conversion device.
- FIG. 8 is a block diagram showing an example of a configuration of a state change data creation device.
- FIG. 1 is a block diagram showing an example of the configuration of a data collating apparatus according to the present invention.
- in this example, the type of state change is secular change (aging), a face image is used as the state change data, and a face image at one time point is compared with a face image at another time point using the data matching device.
- the data matching device is not limited to the case where the state change is a secular change; it may collate data with other state changes, such as facial expression changes.
- the state change data is not limited to face images; biometric data such as images of other parts of the human body (e.g. fingerprints), one-dimensional data such as voice, and three-dimensional data of face shape may also be used.
- the state change data may be data on animals and plants other than human beings, and data on objects that change over time while having individual characteristics like living things.
- the data collation device includes: component analysis means 101 for analyzing the components of the input data 11; state change data creation means 102 for creating state (age) change data from the input data 11; component coefficient conversion means 103 for associating the databases by state (age); collation data accumulation means 104 for accumulating collation data in advance; and matching means 105 for matching the state (aging) change data against the data accumulated by the collation data accumulation means 104.
- the state change data creation means 102 stores a plurality of state (age) databases DB1, …, DBi, …, DBn that accumulate data components classified by state (age).
- the state databases DB1 to DBn are simply referred to as state databases when expressed collectively, or when referring to an arbitrary one of them.
- the collation data storage unit 104 is realized by, for example, a magnetic disk device.
- the component analysis unit 101, the component coefficient conversion unit 103, and the collation unit 105 are realized by, for example, a central processing unit in a computer and a program executed by the central processing unit.
- the state change data creating means 102 is realized by, for example, a magnetic disk device, a central processing unit in a computer, and a program executed by the central processing unit.
- the component analysis means 101 provides a function of reconstructing the face image given as the input data 11, using the components in the state database DBi corresponding to the state (age) information 12 of the face image, so that the reconstruction error is minimized.
- for example, based on the state information 12, the component analysis unit 101 selects from the state change data creation unit 102 the state-specific database DBi corresponding to the state of the face image of the input data 11, and reconstructs the image.
- linear component analysis, such as principal component analysis, may be used.
- the face image is represented by Expression (1).
- Ip = c1P1 + c2P2 + … + cmPm (Pi: principal component, ci: coefficient)
- from the face images represented by Equation (1), the minimum-error coefficient set ci that minimizes the error from the input face image 10 is selected.
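The coefficient selection of Equation (1) can be sketched as follows. Assuming the principal components are orthonormal column vectors (as the left singular vectors of an SVD are), the minimum-error coefficients are simply the projections of the input image onto each component. Function names here are illustrative, not from the patent.

```python
import numpy as np

def fit_coefficients(P, image_vec):
    # P: (pixels, m) matrix whose columns are the principal components Pi.
    # With orthonormal columns, ci = <Pi, image> gives the least-squares fit.
    return P.T @ image_vec

def reconstruct(P, c):
    # Equation (1): Ip = c1*P1 + c2*P2 + ... + cm*Pm
    return P @ c
```

For components that are not orthonormal, a general least-squares solve (e.g. `np.linalg.lstsq`) would replace the projection.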
- the component analysis unit 101 sends the selected minimum-error coefficient set ci to the component coefficient conversion unit 103 via the state database DBi of the state change data creation unit 102.
- the state change data creating means 102 has a plurality of state-by-state databases DB1,..., DBi,..., DBn for accumulating data components classified by state (age). Further, the state change data creation means 102 has a function of passing the coefficient set ci for each state database calculated by the component analysis means 101 to the component coefficient conversion means 103.
- the state change data creation means 102 has a function of reconstructing a face image using the coefficient set di, converted by the component coefficient conversion means 103 for a state-specific database different from the one selected by the component analysis means 101, together with the components in that other database. It also has a function of sending the reconstructed face image p to the matching means 105 as state (age) change data. When principal component analysis is used, the reconstructed face image p is expressed by Equation (2) as a linear combination of the components Qi of that state database with the coefficients di: p = d1Q1 + d2Q2 + … + dmQm.
- the state-based database DBi calculates a face image of a certain age ⁇ Al, A2, ⁇
- the state database stores as a component a value obtained by singular value differentiation of a matrix in which pixels Ai (x, y) of each image are arranged as a column vector.
- a matrix in which the pixels Ai (x, y) of each image are arranged as a column vector is expressed by Equation (3).
- the state-specific database stores as components the first p column vectors {U1, U2, …, Uj, …, Up} of the orthogonal matrix obtained by applying the singular value decomposition of Equation (4) to the matrix represented by Equation (3).
- in Equation (4), S is a matrix whose off-diagonal elements are 0 and whose diagonal elements are arranged in descending order of absolute value.
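The database construction of Equations (3)-(4) can be sketched with numpy's SVD (function name and data are illustrative); numpy already returns the singular values, i.e. the diagonal of S, in descending order:

```python
import numpy as np

def build_state_database(images, p):
    """Stack each image's pixels Ai(x, y) as a column vector (Equation (3)),
    take the singular value decomposition A = U S V^T (Equation (4)),
    and keep the first p columns of the orthogonal matrix U as the
    state-specific database's components."""
    A = np.column_stack([img.ravel() for img in images])  # Equation (3)
    U, s, Vt = np.linalg.svd(A, full_matrices=False)      # Equation (4)
    return U[:, :p]

# toy database: five random 8x8 "face images", keep 3 components
imgs = [np.random.rand(8, 8) for _ in range(5)]
components = build_state_database(imgs, p=3)
print(components.shape)  # → (64, 3)
```

The retained columns of U are orthonormal, which is what makes the coefficient computation of Equation (1) a simple projection.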
- the same number of components, derived from the same persons' face images, are prepared in advance in the two state-specific databases. For example, when 30 components are used, face images of 30 or more people are prepared for both states (ages) in the two state-specific databases, and the components of each state-specific database are created and accumulated in advance.
- the component coefficient conversion means 103 has a function of converting the coefficients to be multiplied by the components of the state-specific databases. A case where principal component analysis is used will be described as an example.
- a plurality of face images Ip and Jp of the same persons, belonging to the two states (ages) corresponding to the databases DBi and DBj, are used.
- the face images Ip and Jp are expressed by Expression (5) and Expression (6), respectively, using Expression (1).
- CiA and DjA are column vectors in which the coefficients ci and dj of Expression (5) and Expression (6) are arranged vertically.
- the component coefficient conversion means 103 may convert ci into dj using a nonlinear conversion.
- for example, the component coefficient conversion unit 103 may perform the coefficient conversion using a neural network, with the coefficient sets {ci, dj} of the same persons' faces corresponding to the databases DBi and DBj as learning data.
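The patent does not spell out the conversion formula in this passage; one plausible linear realization, sketched here as an assumption, learns a matrix T with dj ≈ T·ci by least squares from the same persons' coefficient pairs (the neural-network variant mentioned above would replace this map):

```python
import numpy as np

def learn_coefficient_conversion(C, D):
    """Learn a linear map T with D ≈ T C from paired coefficient sets of
    the same persons in databases DBi and DBj. The columns of C and D are
    the per-person coefficient vectors (CiA and DjA). Least squares is one
    plausible choice; the patent also allows nonlinear conversions."""
    T, *_ = np.linalg.lstsq(C.T, D.T, rcond=None)
    return T.T

def convert(T, ci):
    """Apply the learned conversion to a new coefficient set ci."""
    return T @ ci

# toy training pairs where dj = 2 * ci exactly
C = np.array([[1.0, 2.0, 3.0], [0.0, 1.0, 1.0]])   # one person per column
D = 2.0 * C
T = learn_coefficient_conversion(C, D)
print(convert(T, np.array([1.0, 1.0])))  # → [2. 2.]
```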
- the collation data storage unit 104 is a database for accumulating data for collation, and accumulates data registered for collation in advance.
- the collation data storage unit 104 is realized by, for example, the data storage unit of an ordinary collation device.
- the collation data storage means 104 stores, for example, a data group of a person's current face image or expressionless face image in advance as collation data.
- the matching unit 105 has a function of comparing the state change data created by the state change data creating unit 102 with the registered data stored in the matching data storage unit 104 and collating them, and outputting a matching result 18.
- the collating means 105 obtains, for example, the difference between the state (aging) change data and the registered data, and determines that the registered data with the smallest difference is the person's data.
- the collation method is not limited to obtaining differences; the collation unit 105 may perform collation using another collation method. Further, as shown in FIG. 1, the collating unit 105 collates a plurality of state change data created by the state change data creating unit 102 with the registered data.
- by collating a plurality of state change data with the registered data and regarding the state change data with the smallest difference as corresponding to the original data, the collation means 105 can perform collation that accommodates state (aging) changes. If the state (age) of the data stored in the collation database (collation data storage means 104) is known, the collation means 105 may collate only the state change data corresponding to that state against the registration data. By doing so, the collation time can be shortened.
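One simple realization of this smallest-difference matching rule (names and the Euclidean difference measure are illustrative assumptions; the patent allows other collation methods):

```python
import numpy as np

def collate(state_change_data, registered_data):
    """Compare each state change datum against the registered data and
    return the index of the registered datum with the smallest difference,
    treating it as belonging to the same person."""
    best = None
    for changed in state_change_data:
        diffs = [np.linalg.norm(changed - reg) for reg in registered_data]
        idx = int(np.argmin(diffs))
        if best is None or diffs[idx] < best[1]:
            best = (idx, diffs[idx])
    return best[0]

registered = [np.array([0.0, 0.0]), np.array([5.0, 5.0])]
changed = [np.array([4.5, 5.2])]
print(collate(changed, registered))  # → 1
```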
- FIG. 2 is a flowchart showing an example of data matching processing in which the data matching device matches face images.
- the component analysis means 101 inputs the input data 11 to be collated according to the user's input operation (step S101).
- the component analysis unit 101 inputs a face image to be collated as input data 11.
- the state change data creating means 102 inputs state information 12 indicating the state of the input data 11 according to the user's input operation (step S102). For example, the state change data creating means 102 inputs the age or age group of the person in the face image of the input data 11 as the state information 12. Further, for example, the state change data creating means 102 inputs, as the state information 12, information indicating the facial expression of the face image of the input data 11, such as anger.
- the component analysis means 101 selects, based on the state information 12, the state-specific database DBi corresponding to the state of the input data 11 to be collated from the state-specific databases DB of the state change data creation means 102 (step S103). Further, the component analysis means 101 extracts the components from the selected state-specific database, analyzes the input data 11, and calculates the component coefficients (step S104). Further, the component analysis unit 101 sends the calculated component coefficients to the component coefficient conversion unit 103 via the state-specific database of the state change data creation unit 102.
- the component coefficient conversion means 103 converts the component coefficient calculated by the component analysis means 101 into a component coefficient corresponding to a state database different from the state database selected by the component analysis means 101 (step S105).
- the state change data creation unit 102 extracts the components from the state-specific database corresponding to the converted component coefficients, and creates state change data based on the component coefficients converted by the component coefficient conversion means 103 and the extracted components (step S106).
- the matching unit 105 extracts the registration data from the matching data storage unit 104 (step S107).
- the matching unit 105 then collates the state change data created by the state change data creation unit 102 with the registration data accumulated in the collation data storage unit 104, and outputs a collation result 18 (step S108).
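The steps above can be tied together in one end-to-end sketch; all names are illustrative, and the linear coefficient conversion T stands in for whatever conversion the component coefficient conversion means actually uses:

```python
import numpy as np

def pipeline(input_face, db_i, db_j, T, registered):
    """Sketch of steps S101-S108: analyze the input against the source-state
    database DBi (S103-S104), convert the coefficients for the target-state
    database DBj (S105), reconstruct state change data (S106), and collate
    it against the registered data (S107-S108)."""
    ci, *_ = np.linalg.lstsq(db_i, input_face, rcond=None)  # S104
    dj = T @ ci                                             # S105
    changed = db_j @ dj                                     # S106, Equation (2)
    diffs = [np.linalg.norm(changed - r) for r in registered]
    return int(np.argmin(diffs))                            # S108

# toy setup: identity component bases, conversion that doubles coefficients
db_i = np.eye(2); db_j = np.eye(2); T = 2.0 * np.eye(2)
registered = [np.array([1.0, 1.0]), np.array([2.0, 2.0])]
print(pipeline(np.array([1.0, 1.0]), db_i, db_j, T, registered))  # → 1
```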
- since the object features are decomposed into components, highly accurate collation can be performed using statistical state features that are difficult to express manually. Therefore, by adding statistical state changes to the data before the state change and using state change data having global state change characteristics for collation, the collation performance for data accompanied by state changes can be improved.
- state change data for a plurality of states can be created and used for collation, so that highly accurate collation can be performed in response to a plurality of state changes.
- since the state change data is obtained by adding the general state change of each age to the data before the state change, the data collation performance in a specific state can be improved.
- according to the present embodiment, the state change can be added automatically at the time of collation, so the burden on the operator at the time of collation can be reduced. That is, by automatically adding the statistical state change to the data before the change to create the state change data at the time of collation, the burden on the worker at the time of data collation is reduced.
- a data collating apparatus can be realized by reusing many existing collating systems. Therefore, the system (data collating apparatus) can be easily assembled or modified.
- FIG. 3 is a block diagram showing another configuration example of the data collating apparatus.
- the present embodiment differs from the first embodiment in that the data matching device does not include the component analysis unit 101 and the component coefficient conversion unit 103 among the components shown in FIG. Further, in the present embodiment, the processing content of the state change data creation means 102b of the data collating apparatus is different from the processing content of the state change data creation means 102 shown in the first embodiment.
- the state change data creating means 102b inputs the face image as the input data 11 to the state-specific database corresponding to the state (age) information 12 of the input data 11.
- the state change data creating means 102b directly sends the input data 11 to state-specific databases other than the one corresponding to the input data 11, and creates face images (state change data) of states (ages) other than the state of the input data 11. Then, the state change data creating unit 102b sends the created face images to the matching unit 105.
- the state change data creating means 102b creates state change data by directly converting a face image from one state to another without using components. For this purpose, data classified according to the states (ages) of the same persons are stored in advance in the databases for the respective states, and a neural network is formed between them. The state change data creation means 102b then creates the converted face image using the pre-formed neural network. In the present embodiment, for example, the state change data creating unit 102b creates the face image by performing a conversion process using a neural network trained with the pre-accumulated state-classified data of the same persons as learning data.
- although this data collating apparatus requires a large amount of state-classified data of the same persons for neural network learning, its configuration can be simplified.
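The direct conversion can be sketched as follows; the patent specifies a neural network, and this single linear layer trained by gradient descent on same-person state pairs is a deliberately simplified stand-in with illustrative names:

```python
import numpy as np

def train_direct_converter(pairs, lr=0.1, epochs=500):
    """Train a minimal linear 'network' W mapping a face image in one state
    directly to the same person's face image in another state, from
    pre-accumulated same-person pairs (a simplified stand-in for the
    neural network of the second embodiment)."""
    n = pairs[0][0].size
    W = np.zeros((n, n))
    for _ in range(epochs):
        for x, y in pairs:
            x, y = x.ravel(), y.ravel()
            W -= lr * np.outer(W @ x - y, x)  # gradient step on squared error
    return W

# toy pairs where the target state doubles every pixel
pairs = [(np.array([1.0, 0.0]), np.array([2.0, 0.0])),
         (np.array([0.0, 1.0]), np.array([0.0, 2.0]))]
W = train_direct_converter(pairs)
print(np.round(W @ np.array([1.0, 1.0]), 2))  # → [2. 2.]
```

A practical converter would be nonlinear and trained on many image pairs per state, as the text notes.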
- FIG. 4 is a flowchart showing another example of data collation processing in which the data collation apparatus collates face images.
- the state change data creation unit 102b inputs the input data 11 to be verified in accordance with the input operation of the user (step S201).
- the state change data creating means 102b inputs the face image to be collated as the input data 11.
- the state change data creating unit 102b inputs state information 12 indicating the state of the input data 11 in accordance with the user's input operation (step S202). For example, the state change data creating means 102b inputs the age or age group of the person in the face image of the input data 11. Further, for example, the state change data creating unit 102b inputs information indicating the facial expression of the face image of the input data 11, such as anger, as the state information 12.
- the state change data creating means 102b converts the input data 11 into state change data of states (ages) other than the state of the input data 11 (step S203). In this case, the state change data creating unit 102b creates the state change data using the trained neural network based on the state information 12.
- the matching unit 105 extracts registration data from the matching data storage unit 104.
- the matching unit 105 collates the state change data created by the state change data creation unit 102b with the registered data stored in the collation data storage unit 104, and outputs a collation result 18 (step S204).
- although the data collating apparatus does not include the component analysis unit 101 and the component coefficient conversion unit 103 described in the first embodiment, the same effects as those of the data collating apparatus shown in the first embodiment can be obtained. Furthermore, the collation performance for data accompanied by state changes can be improved while the configuration of the data collation apparatus is simplified.
- in the present embodiment, the data collating device is configured by devices in which the respective means are implemented, including a component coefficient conversion device in which the component coefficient conversion means 103 is implemented.
- FIG. 5 is a block diagram showing an example of the configuration of a component analysis apparatus 101a in which the component analysis means 101 is implemented.
- the component analyzer 101a includes an arithmetic device 101b, an input data storage device 101c, and a component storage device 101d.
- the input data storage device 101c is realized by a memory or a magnetic disk device.
- the input data storage device 101c has a function of storing a face image that is the input data 11.
- the component storage device 101d is realized by a memory or a magnetic disk device.
- the component storage device 101d has a function of accumulating components sent from the state change creation device 102a, in which the state change data creation means 102 is implemented.
- the arithmetic unit 101b is realized by a CPU that operates according to a program.
- the arithmetic device 101b performs data processing using the input data 11 and the constituent components.
- based on the face image accumulated in the input data storage device 101c and the components stored in the component storage device 101d, the computing device 101b obtains the component coefficients by performing the same calculation processing as the component analysis means 101 shown in the first embodiment. Then, the computing device 101b sends the obtained component coefficients to the state change creation device 102a.
- FIG. 6 is a block diagram showing an example of the configuration of a state change creation device 102a in which the state change data creation means 102 is implemented as an apparatus.
- the state change creating device 102a includes a computing device 102c, a state-specific component storage device 102d, and a state sorting device 102e.
- the state selection device 102e is realized by a switching semiconductor circuit or the like. Based on the state information 12, the state selection device 102e has a function of selecting a state-by-state database from which constituent components to be sent to the component analysis device 101a are extracted.
- the state-specific component storage device 102d is realized by a memory or a magnetic disk device.
- the state-specific component storage device 102d has a function of storing the components of the face image for each state.
- the computing device 102c is realized by a CPU that operates according to a program.
- the computing device 102c has a function of creating state change data based on the constituent components and the constituent component coefficients of the face image for each state.
- the state selection device 102e selects the state-specific database corresponding to the state information 12, extracts the components from the selected database, and sends them to the component analysis device 101a. Further, the state selection device 102e sends the component coefficients calculated by the component analysis device 101a to the component coefficient conversion device 103a, in which the component coefficient conversion means 103 is implemented. The computing device 102c creates the state change data by using Equation (2), based on the component coefficients converted by the component coefficient conversion device 103a and the components of the state-specific database corresponding to the converted coefficients.
- FIG. 7 is a block diagram showing an example of a configuration of a component coefficient conversion device 103a in which the component coefficient conversion means 103 is implemented.
- the component coefficient conversion device 103a includes an arithmetic device 103b.
- the arithmetic unit 103b is realized by a CPU that operates according to a program.
- the arithmetic device 103b has a function of converting the component coefficients from the state change creating device 102a into component coefficients corresponding to a state-specific database different from the one to which those coefficients correspond. Note that the arithmetic device 103b converts the component coefficients using the same conversion method as the component coefficient conversion means 103 shown in the first embodiment.
- the collation data storage device in which the collation data storage unit 104 is implemented is realized by a storage device such as a memory or a magnetic disk device.
- although the configuration of the collation device 105a, in which the collation means 105 is implemented, may vary slightly depending on the collation method, it is realized by, for example, a computer equipped with an arithmetic device.
- the data collation apparatus is a state change creation apparatus that implements the state change data creation means 102b, a collation data storage apparatus that implements the collation data storage means 104, and a collation means 105. And a verification device.
- the configurations of the collation data storage device and the collation device are the same as the configurations of the collation data storage device and the collation device 105a shown in the third embodiment.
- FIG. 8 is a block diagram showing an example of the configuration of a state change data creation device 102f in which the state change data creation means 102b is implemented.
- the state change data creation device 102f includes a state selection device 102g and an arithmetic device 102i.
- the state selection device 102g is realized by a switching semiconductor circuit or the like.
- the state selection device 102g receives the input data 11 and the state (age) information 12, and sends the input image (input data) 11 to the arithmetic device 102i so that the arithmetic device 102i performs a neural network operation converting the input data 11 to a state other than the state indicated by the state information 12.
- the arithmetic unit 102i is realized by a CPU that operates according to a program.
- the arithmetic device 102i converts the input data 11 into a face image in a state different from the state indicated by the state information 12, creates state change data, and outputs the state change data to the collation device 105a.
- the data matching method performed by the data matching device may be realized by a data matching program that can be executed on a computer.
- the data collation process shown in the first embodiment may be executed on the computer by storing the data collation program in a computer-readable information recording medium and causing the computer to read it.
- that is, the data collation process is executed by causing the computer to read a data collation program for executing: a process of decomposing the data to be collated into components of a predetermined state; a process of converting the parameters corresponding to the components of the predetermined state into parameters corresponding to the components of a state different from the predetermined state; a process of creating state change data that gives the specified state change to the data to be collated, using the accumulated components of the state-classified data and the converted parameters; and a process of collating the created state change data with the collation data group stored in the collation data storage means.
- the data matching method performed by the data matching device may also be realized by a data matching program that can be executed on a computer. The data collation process shown in the second embodiment may be executed on the computer by storing the data collation program in a computer-readable information recording medium and causing the computer to read it.
- that is, the data collation process is executed by causing the computer to read a data collation program for executing: a process of creating state change data in which data in a predetermined state is changed to data in a state different from the predetermined state by a conversion that uses learning based on state-classified data; and a process of collating the created state change data with the collation data group stored in the collation data storage means.
Landscapes
- Engineering & Computer Science (AREA)
- Theoretical Computer Science (AREA)
- General Physics & Mathematics (AREA)
- Physics & Mathematics (AREA)
- Computer Vision & Pattern Recognition (AREA)
- Health & Medical Sciences (AREA)
- Multimedia (AREA)
- Human Computer Interaction (AREA)
- Oral & Maxillofacial Surgery (AREA)
- General Health & Medical Sciences (AREA)
- Evolutionary Computation (AREA)
- Software Systems (AREA)
- Medical Informatics (AREA)
- Databases & Information Systems (AREA)
- Computing Systems (AREA)
- Artificial Intelligence (AREA)
- Computer Graphics (AREA)
- Collating Specific Patterns (AREA)
- Measurement Of The Respiration, Hearing Ability, Form, And Blood Characteristics Of Living Organisms (AREA)
- Image Processing (AREA)
- Information Retrieval, Db Structures And Fs Structures Therefor (AREA)
- Exchange Systems With Centralized Control (AREA)
- Processing Or Creating Images (AREA)
- Image Analysis (AREA)
Priority Applications (3)
Application Number | Priority Date | Filing Date | Title |
---|---|---|---|
EP05760141A EP1768064A4 (en) | 2004-07-15 | 2005-07-13 | METHOD FOR VERIFYING DATA, DEVICE FOR VERIFYING DATA, AND PROGRAM FOR VERIFYING DATA |
US10/573,843 US20080247639A1 (en) | 2004-07-15 | 2005-07-13 | Data Matching Method, Data Matching Apparatus and Data Matching Program |
JP2006524549A JP4029413B2 (ja) | 2004-07-15 | 2005-07-13 | データ照合方法、データ照合装置及びデータ照合プログラム |
Applications Claiming Priority (2)
Application Number | Priority Date | Filing Date | Title |
---|---|---|---|
JP2004-208603 | 2004-07-15 | ||
JP2004208603 | 2004-07-15 |
Publications (1)
Publication Number | Publication Date |
---|---|
WO2006009043A1 true WO2006009043A1 (ja) | 2006-01-26 |
Family
ID=35785157
Family Applications (1)
Application Number | Title | Priority Date | Filing Date |
---|---|---|---|
PCT/JP2005/012952 WO2006009043A1 (ja) | 2004-07-15 | 2005-07-13 | データ照合方法、データ照合装置及びデータ照合プログラム |
Country Status (6)
Country | Link |
---|---|
US (1) | US20080247639A1 (ja) |
EP (1) | EP1768064A4 (ja) |
JP (1) | JP4029413B2 (ja) |
KR (1) | KR100845634B1 (ja) |
CN (1) | CN100437641C (ja) |
WO (1) | WO2006009043A1 (ja) |
Families Citing this family (4)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
JP4742193B2 (ja) | 2009-04-28 | 2011-08-10 | Necソフト株式会社 | 年齢推定装置、年齢推定方法及びプログラム |
US9886945B1 (en) * | 2011-07-03 | 2018-02-06 | Reality Analytics, Inc. | System and method for taxonomically distinguishing sample data captured from biota sources |
US11521460B2 (en) | 2018-07-25 | 2022-12-06 | Konami Gaming, Inc. | Casino management system with a patron facial recognition system and methods of operating same |
AU2019208182B2 (en) | 2018-07-25 | 2021-04-08 | Konami Gaming, Inc. | Casino management system with a patron facial recognition system and methods of operating same |
Citations (2)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
JP2000113197A (ja) * | 1998-10-02 | 2000-04-21 | Victor Co Of Japan Ltd | 個人識別装置 |
JP2000357221A (ja) * | 1999-06-15 | 2000-12-26 | Minolta Co Ltd | 画像処理装置および画像処理方法、ならびに画像処理プログラムを記録した記録媒体 |
Family Cites Families (4)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
US5422961A (en) * | 1992-04-03 | 1995-06-06 | At&T Corp. | Apparatus and method for improving recognition of patterns by prototype transformation |
EP1039417B1 (en) * | 1999-03-19 | 2006-12-20 | Max-Planck-Gesellschaft zur Förderung der Wissenschaften e.V. | Method and device for the processing of images based on morphable models |
US6937744B1 (en) * | 2000-06-13 | 2005-08-30 | Microsoft Corporation | System and process for bootstrap initialization of nonparametric color models |
US7093941B2 (en) * | 2001-04-25 | 2006-08-22 | Matsushita Electric Industrial Co., Ltd. | Video display apparatus and video display method |
-
2005
- 2005-07-13 KR KR1020067006216A patent/KR100845634B1/ko not_active IP Right Cessation
- 2005-07-13 EP EP05760141A patent/EP1768064A4/en not_active Ceased
- 2005-07-13 JP JP2006524549A patent/JP4029413B2/ja active Active
- 2005-07-13 US US10/573,843 patent/US20080247639A1/en not_active Abandoned
- 2005-07-13 WO PCT/JP2005/012952 patent/WO2006009043A1/ja not_active Application Discontinuation
- 2005-07-13 CN CNB200580001020XA patent/CN100437641C/zh active Active
Patent Citations (2)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
JP2000113197A (ja) * | 1998-10-02 | 2000-04-21 | Victor Co Of Japan Ltd | 個人識別装置 |
JP2000357221A (ja) * | 1999-06-15 | 2000-12-26 | Minolta Co Ltd | 画像処理装置および画像処理方法、ならびに画像処理プログラムを記録した記録媒体 |
Non-Patent Citations (1)
Title |
---|
See also references of EP1768064A4 * |
Also Published As
Publication number | Publication date |
---|---|
EP1768064A4 (en) | 2008-01-02 |
JPWO2006009043A1 (ja) | 2008-05-01 |
CN100437641C (zh) | 2008-11-26 |
CN1842823A (zh) | 2006-10-04 |
JP4029413B2 (ja) | 2008-01-09 |
KR100845634B1 (ko) | 2008-07-10 |
EP1768064A8 (en) | 2007-06-20 |
EP1768064A1 (en) | 2007-03-28 |
US20080247639A1 (en) | 2008-10-09 |
KR20060054477A (ko) | 2006-05-22 |
Similar Documents
Publication | Publication Date | Title |
---|---|---|
EP1433118B1 (en) | System and method of face recognition using portions of learned model | |
JP4721052B2 (ja) | 特徴変化画像作成方法、特徴変化画像作成装置および特徴変化画像作成プログラム | |
Cambria et al. | Extreme learning machines [trends & controversies] | |
JPH1055444A (ja) | Dctをベースとするフィーチャー・ベクトルを使った顔の認識 | |
JP2000081894A (ja) | 音声評価方法 | |
US20020143539A1 (en) | Method of determining an eigenspace for representing a plurality of training speakers | |
Kim et al. | Analysis of 3d hand trajectory gestures using stroke-based composite hidden markov models | |
WO2021003813A1 (zh) | 基于神经网络模型的答案生成方法及相关设备 | |
EP2115737B1 (en) | Method and system to improve automated emotional recognition | |
CN111126233A (zh) | 基于距离值的通话通道构建方法、装置和计算机设备 | |
US6243695B1 (en) | Access control system and method therefor | |
Coviello et al. | The variational hierarchical EM algorithm for clustering hidden Markov models | |
CN111724458A (zh) | 一种语音驱动的三维人脸动画生成方法及网络结构 | |
JP4029413B2 (ja) | データ照合方法、データ照合装置及びデータ照合プログラム | |
JP5812505B2 (ja) | マルチモーダル情報に基づく人口学的分析方法及びシステム | |
US7454062B2 (en) | Apparatus and method of pattern recognition | |
JP2962549B2 (ja) | 顔動画像からの表情認識方法 | |
US7207068B2 (en) | Methods and apparatus for modeling based on conversational meta-data | |
JP2021021978A (ja) | 情報処理装置及びプログラム | |
US7516071B2 (en) | Method of modeling single-enrollment classes in verification and identification tasks | |
CN108596094A (zh) | 人物风格检测系统、方法、终端及介质 | |
JP2002082694A (ja) | 先行知識に基づく話者確認および話者識別 | |
KR102432854B1 (ko) | 잠재 벡터를 이용하여 군집화를 수행하는 방법 및 장치 | |
JPWO2006030687A1 (ja) | データ照合システム、データ照合装置、及びデータ照合方法 | |
CN117150320B (zh) | 对话数字人情感风格相似度评价方法及系统 |
Legal Events
Date | Code | Title | Description |
---|---|---|---|
WWE | Wipo information: entry into national phase |
Ref document number: 200580001020.X Country of ref document: CN |
|
WWE | Wipo information: entry into national phase |
Ref document number: 2006524549 Country of ref document: JP |
|
AK | Designated states |
Kind code of ref document: A1 Designated state(s): AE AG AL AM AT AU AZ BA BB BG BR BW BY BZ CA CH CN CO CR CU CZ DE DK DM DZ EC EE EG ES FI GB GD GE GH GM HR HU ID IL IN IS JP KE KG KM KP KR KZ LC LK LR LS LT LU LV MA MD MG MK MN MW MX MZ NA NG NI NO NZ OM PG PH PL PT RO RU SC SD SE SG SK SL SM SY TJ TM TN TR TT TZ UA UG US UZ VC VN YU ZA ZM ZW |
|
AL | Designated countries for regional patents |
Kind code of ref document: A1 Designated state(s): GM KE LS MW MZ NA SD SL SZ TZ UG ZM ZW AM AZ BY KG KZ MD RU TJ TM AT BE BG CH CY CZ DE DK EE ES FI FR GB GR HU IE IS IT LT LU LV MC NL PL PT RO SE SI SK TR BF BJ CF CG CI CM GA GN GQ GW ML MR NE SN TD TG |
|
WWE | Wipo information: entry into national phase |
Ref document number: 10573843 Country of ref document: US Ref document number: 2005760141 Country of ref document: EP |
|
WWE | Wipo information: entry into national phase |
Ref document number: 1020067006216 Country of ref document: KR |
|
121 | Ep: the epo has been informed by wipo that ep was designated in this application | ||
WWP | Wipo information: published in national office |
Ref document number: 1020067006216 Country of ref document: KR |
|
NENP | Non-entry into the national phase |
Ref country code: DE |
|
WWW | Wipo information: withdrawn in national office |
Ref document number: DE |
|
WWP | Wipo information: published in national office |
Ref document number: 2005760141 Country of ref document: EP |