WO2013132695A1 - Information processing device and method, and recording medium

Information processing device and method, and recording medium

Info

Publication number
WO2013132695A1
Authority
WO
WIPO (PCT)
Prior art keywords
score
face image
unit
integration
information processing
Prior art date
Application number
PCT/JP2012/079070
Other languages
English (en)
Japanese (ja)
Inventor
昭裕 早坂
Original Assignee
日本電気株式会社
Priority date
Filing date
Publication date
Application filed by 日本電気株式会社 filed Critical 日本電気株式会社
Priority to JP2014503416A priority Critical patent/JP6194880B2/ja
Publication of WO2013132695A1 publication Critical patent/WO2013132695A1/fr

Links

Images

Classifications

    • G PHYSICS
    • G06 COMPUTING; CALCULATING OR COUNTING
    • G06V IMAGE OR VIDEO RECOGNITION OR UNDERSTANDING
    • G06V40/00Recognition of biometric, human-related or animal-related patterns in image or video data
    • G06V40/10Human or animal bodies, e.g. vehicle occupants or pedestrians; Body parts, e.g. hands
    • G06V40/16Human faces, e.g. facial parts, sketches or expressions
    • G06V40/161Detection; Localisation; Normalisation
    • G PHYSICS
    • G06 COMPUTING; CALCULATING OR COUNTING
    • G06F ELECTRIC DIGITAL DATA PROCESSING
    • G06F18/00Pattern recognition
    • G06F18/20Analysing
    • G06F18/25Fusion techniques
    • G06F18/254Fusion techniques of classification results, e.g. of results related to same input data

Definitions

  • the present invention relates to a face image matching technique.
  • The technique disclosed in the above document by Huang et al. is a method that uses dictionary data for each finely divided face angle and integrates all the matching results by learning.
  • The technique disclosed in Japanese Patent Laid-Open No. 11-253427 integrates verification scores obtained from a plurality of different biometric authentication units into a verification score vector whose number of dimensions equals the number of scores, and classifies the input into either the "legitimate" or the "fraudulent" category based on the verification score vector space.
  • the technique disclosed in Japanese Patent Laid-Open No. 2006-018707 is a method for setting each discriminator according to a target subject.
  • However, because each classifier is set according to a target subject, there was a problem that the detection accuracy decreases when a face image with a posture different from the target is input.
  • The present invention has been made to solve the above-described problems, and an object thereof is to provide an information processing apparatus, an information processing method, and a recording medium that can collate face images with different postures with high accuracy.
  • The information processing apparatus of the present invention comprises: first matching means for calculating a first score for an input face image using a first discriminator corresponding to face images captured in a first state; second matching means for calculating a second score for the input face image using a second discriminator corresponding to face images captured in a second state different from the first state; holding means for holding in advance integration coefficients used for integrating the first score and the second score, in association with the relationship between the first score and the second score; determination means for selecting, from among the integration coefficients held in the holding means, an integration coefficient corresponding to the relationship between the first score and the second score; and integration means for calculating an integrated score by integrating the first score and the second score using the integration coefficient selected by the determination means.
  • The information processing method of the present invention includes: a first matching step of calculating a first score for an input face image using a first discriminator corresponding to face images captured in a first state; a second matching step of calculating a second score for the input face image using a second discriminator corresponding to face images captured in a second state different from the first state; a determination step of acquiring, from holding means that holds in advance integration coefficients used for integrating the first score and the second score in association with the relationship between the two scores, an integration coefficient corresponding to the relationship between the first score and the second score; and an integration step of calculating an integrated score by integrating the first score and the second score using the integration coefficient selected in the determination step.
  • The computer-readable recording medium of the present invention stores a program that causes a computer to execute: a first matching step of calculating a first score for an input face image using a first discriminator corresponding to face images captured in a first state; a second matching step of calculating a second score for the input face image using a second discriminator corresponding to face images captured in a second state different from the first state; a determination step of acquiring, from holding means that holds in advance integration coefficients used for integrating the first score and the second score in association with the relationship between the two scores, an integration coefficient corresponding to the relationship between the first score and the second score; and an integration step of calculating an integrated score by integrating the first score and the second score using the integration coefficient acquired in the determination step.
  • According to the present invention, a first score for the input face image is calculated using the first discriminator corresponding to face images captured in the first state, and a second score for the input face image is calculated using the second discriminator corresponding to face images captured in a second state different from the first state. An integration coefficient corresponding to the relationship between the first score and the second score is then acquired from holding means that stores integration coefficients in advance in association with that relationship, and the two scores are integrated using the acquired integration coefficient.
  • FIG. 1 is a block diagram showing the configuration of the information processing apparatus according to the first embodiment of the present invention.
  • FIG. 2 is a flowchart showing the operation of the information processing apparatus according to the first embodiment of the present invention.
  • FIG. 3 is a block diagram showing the configuration of the information processing apparatus according to the second embodiment of the present invention.
  • FIG. 4 is a flowchart showing the operation of the information processing apparatus according to the second embodiment of the present invention.
  • FIG. 5 is a block diagram showing the configuration of the information processing apparatus according to the third embodiment of the present invention.
  • FIG. 6 is a flowchart showing the operation of the information processing apparatus according to the third embodiment of the present invention.
  • FIG. 7 is a block diagram showing the configuration of the information processing apparatus according to the fourth embodiment of the present invention.
  • FIG. 8 is a block diagram showing the configuration of the information processing apparatus according to the fifth embodiment of the present invention.
  • FIG. 9 is a block diagram showing another configuration of the information processing apparatus according to the fifth embodiment of the present invention.
  • FIG. 10 is a block diagram showing another configuration of the information processing apparatus according to the fifth embodiment of the present invention.
  • FIG. 11 is a block diagram showing another configuration of the information processing apparatus according to the fifth embodiment of the present invention.
  • The information processing apparatus 100 is an apparatus for collating face images. As shown in FIG. 1, the information processing apparatus 100 includes a first collation unit 101 (first matching means), a second collation unit 102 (second matching means), a determination unit 103 (determination means), an integration unit 104 (integration means), a database unit 105, and a holding unit 106 (holding means).
  • The first collation unit 101 collates the input face image A with the registered face image recorded in the database unit 105 using a classifier 1010 trained in advance on face images of a specific first posture, and calculates a matching score indicating the similarity between the input face image A and the registered face image.
  • The second collation unit 102 collates the input face image A with the registered face image recorded in the database unit 105 using a classifier 1020 trained in advance on face images of a second posture different from the first posture, and calculates a matching score.
  • In this way, the first collation unit 101 and the second collation unit 102 use classifiers 1010 and 1020 specialized for collating face images of different postures. The number of collation units is not limited to two; three or more may be provided.
  • The determination unit 103 determines the magnitude relationship between the matching score calculated by the first collation unit 101 and the matching score calculated by the second collation unit 102, and selects an appropriate integration coefficient, based on this magnitude relationship, from the matching-score integration coefficients recorded in the holding unit 106.
  • The integration unit 104 integrates the matching score calculated by the first collation unit 101 and the matching score calculated by the second collation unit 102 using the integration coefficient selected by the determination unit 103, and outputs the integrated matching score ST.
  • a feature quantity vector may be recorded in the database unit 105 instead of a face image.
  • In the holding unit 106, integration coefficients (integration weights) to be applied when integrating the matching score calculated by the first collation unit 101 and the matching score calculated by the second collation unit 102 are recorded in advance.
  • Specifically, learning is performed in advance to determine the optimum integration coefficients with which a desired score is obtained from the integration unit 104, in a situation where the registered face image is limited to front-facing images and the input face image A is limited to images with a specific face orientation. By repeating this learning while changing the face orientation of the person shown in the input face image A, coefficient patterns corresponding to different postures are determined, and these patterns may be registered in the holding unit 106. That is, in the present embodiment, the holding unit 106 stores the integration coefficients in advance in association with the face orientation of the input face image A (i.e., the magnitude relationship of the matching scores). One plausible learning sketch follows.
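  • The patent does not specify the learning algorithm; as one plausible stand-in, the following Python sketch fits (w1, w2) for a single pose condition with a simple logistic separator over genuine and impostor score pairs. All names and hyperparameters are illustrative, not part of the patent.

```python
import numpy as np

def learn_weights(genuine, impostor, lr=0.1, epochs=500):
    """Learn integration weights (w1, w2) for one pose condition.

    genuine, impostor: arrays of shape (n, 2) holding (S1, S2) score pairs
    for same-person and different-person comparisons, respectively.
    """
    X = np.vstack([genuine, impostor])
    y = np.concatenate([np.ones(len(genuine)), np.zeros(len(impostor))])
    w, b = np.zeros(2), 0.0
    for _ in range(epochs):
        p = 1.0 / (1.0 + np.exp(-(X @ w + b)))   # sigmoid of the weighted sum
        w -= lr * (X.T @ (p - y)) / len(y)        # gradient step on the weights
        b -= lr * float(np.mean(p - y))           # gradient step on the bias
    return w  # (w1, w2), applied as in Eq. (1) below
```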
  • the database unit 105 and the holding unit 106 may be configured to use the same storage medium. According to the above configuration, face matching with higher accuracy can be realized.
  • First, the first collation unit 101 and the second collation unit 102 acquire the input face image A (step S101). Specifically, they acquire a still image or moving image of the real space captured with a digital camera or video camera as the input face image A, or they acquire the input face image A from a recording medium on which still images or moving images have been accumulated.
  • Next, the first collation unit 101 and the second collation unit 102 collate the input face image A acquired in step S101 with the registered face image recorded in advance in the database unit 105, and calculate matching scores (step S102).
  • Any face matching method may be used by the first collation unit 101 and the second collation unit 102.
  • For example, the Eigenface method proposed by Turk may be used; a sketch follows.
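  • As a concrete illustration of a matcher producing such a score, the following Python sketch computes an Eigenface-style similarity, assuming a PCA basis (eigenfaces) and mean face have already been learned; all function and variable names are illustrative, not taken from the patent.

```python
import numpy as np

def eigenface_score(input_face, registered_face, mean_face, eigenfaces):
    """Illustrative Eigenface-style matching score.

    input_face, registered_face: flattened grayscale face images (1-D arrays).
    mean_face: mean of the training faces used to build the PCA basis.
    eigenfaces: (k, d) matrix whose rows are the leading principal components.
    """
    # Project both faces into the eigenface subspace.
    a = eigenfaces @ (input_face - mean_face)
    b = eigenfaces @ (registered_face - mean_face)
    # Cosine similarity in the subspace serves as the matching score.
    return float(a @ b / (np.linalg.norm(a) * np.linalg.norm(b) + 1e-12))
```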
  • The determination unit 103 determines the magnitude relationship between the matching score calculated by the first collation unit 101 and that calculated by the second collation unit 102. Based on this magnitude relationship information, the determination unit 103 then selects the corresponding integration coefficient from the plurality of coefficient patterns recorded in advance in the holding unit 106 (step S103). For example, if the first collation unit 101 specializes in matching left-facing faces and the second collation unit 102 specializes in matching right-facing faces, the rough posture (face orientation) of the person shown in the input face image A can be estimated from the magnitude relationship between the two matching scores.
  • If the matching score calculated by the first collation unit 101 is the larger, the determination unit 103 estimates that the input face image A is a left-facing face image and selects the integration coefficient corresponding to left-facing face images.
  • Conversely, if the matching score calculated by the second collation unit 102 is the larger, the determination unit 103 determines that the input face image A is a right-facing face image and selects the integration coefficient corresponding to right-facing face images.
  • The integration unit 104 integrates the matching scores calculated by the first collation unit 101 and the second collation unit 102 using the integration coefficient selected by the determination unit 103, and outputs the final matching score (step S104).
  • Let S1 be the matching score calculated by the first collation unit 101, S2 the matching score calculated by the second collation unit 102, w1 the integration coefficient applied to S1 among the selected coefficients, and w2 the integration coefficient applied to S2. The integration unit 104 then calculates the integrated matching score ST, for example, as in the following equation:
  • ST = w1 × S1 + w2 × S2 … (1)
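  • A minimal sketch of the first embodiment's score integration, assuming a hypothetical two-entry coefficient table (the actual coefficients would be learned as described above; the values below are placeholders):

```python
# Hypothetical pre-learned integration coefficients (w1, w2), keyed by the
# magnitude relationship of the two matching scores.
COEFFICIENTS = {
    "left":  (0.7, 0.3),   # S1 > S2: unit 101 (left-facing specialist) dominates
    "right": (0.3, 0.7),   # S1 <= S2: unit 102 (right-facing specialist) dominates
}

def integrated_score(s1: float, s2: float) -> float:
    # Determination unit 103: estimate the rough pose from the score relation
    # and select the corresponding coefficients from the holding unit.
    w1, w2 = COEFFICIENTS["left" if s1 > s2 else "right"]
    # Integration unit 104: weighted sum of the two scores, as in Eq. (1).
    return w1 * s1 + w2 * s2
```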
  • The information processing apparatus 200 according to the present embodiment further includes an estimation unit 207 (estimation means) in addition to the configuration of the information processing apparatus 100 of the first embodiment shown in FIG. 1, and includes a determination unit 203 (determination means) and a database unit 205 in place of the determination unit 103 and the database unit 105. The other components are the same as in the first embodiment, so detailed description is omitted here.
  • In the database unit 205, face orientation information indicating the face orientation of the person shown in the registered face image is recorded in advance together with the registered face image of the person to be collated.
  • a feature quantity vector may be recorded in the database unit 205 instead of a face image.
  • The estimation unit 207 estimates the face orientation of the person shown in the input face image A. Any method may be used to estimate the face orientation; for example, the technique disclosed in Kitomi Yamada, Ayako Nakajima, Kazuhiro Fukui, "Estimation of Face Orientation by Factorization Method and Subspace Method", IEICE Technical Research Report, 101 (568), 2002, may be used.
  • the estimation unit 207 outputs face orientation information indicating the estimated face orientation.
  • The determination unit 203 selects the optimum integration coefficient from the plurality of coefficient patterns recorded in advance in the holding unit 106. Specifically, when the integration coefficients are learned in advance, a plurality of coefficient patterns are learned with the face orientation of the learning data restricted; at matching time, the corresponding integration coefficient is selected based on the face orientation information of the input face image A and the face orientation information of the registered face image.
  • For example, let Wa be the set of integration coefficients learned in a situation where the registered face image is limited to front-facing images and the input face image A is limited to right-facing images, and let Wb be the set of integration coefficients learned in a situation where the registered face image is limited to front-facing images and the input face image A is limited to left-facing images.
  • If, at matching time, the determination unit 203 finds from the face orientation information of the registered face image that it is front-facing and from the face orientation information of the input face image A that it is right-facing, the determination unit 203 selects the integration coefficients Wa. That is, in the present embodiment, the holding unit 106 stores the integration coefficients in advance in association with the matching score magnitude relationship, the face orientation information of the input face image A, and the face orientation information of the registered face image.
  • the determination unit 203 may be configured to select the integration coefficient in consideration of not only the face orientation information but also the matching score calculated by the first matching unit 101 and the second matching unit 102.
  • For example, when the first collation unit 101 specializes in front-face matching and the second collation unit 102 specializes in right-facing face matching, the magnitude relationship between their matching scores changes with the face angle of the input face image A. The integration coefficient is therefore determined in consideration of not only the face orientation information but also the magnitude relationship of the matching scores, as sketched below.
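  • A sketch of this selection logic, assuming a hypothetical coefficient table keyed by the pose pair, with the score magnitude relation as a fallback when the pose pair is not tabulated (all table contents are placeholders):

```python
# Hypothetical coefficient sets keyed by (registered pose, input pose);
# ("front", "right") plays the role of Wa in the text and ("front", "left") of Wb.
POSE_COEFFICIENTS = {
    ("front", "right"): (0.4, 0.6),   # Wa
    ("front", "left"):  (0.6, 0.4),   # Wb
}

def select_coefficients(registered_pose, input_pose, s1, s2):
    """Determination unit 203: pick weights from pose information, falling
    back on the score magnitude relation when the pose pair is unknown."""
    weights = POSE_COEFFICIENTS.get((registered_pose, input_pose))
    if weights is None:
        # Fall back on the first-embodiment rule: the larger score dominates.
        weights = (0.7, 0.3) if s1 > s2 else (0.3, 0.7)
    return weights
```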
  • The process of step S201 in FIG. 4 is the same as step S101 in FIG. 2.
  • the estimation unit 207 estimates the face orientation of the person shown in the input face image A (step S202).
  • The process of step S203 is the same as step S102 in FIG. 2. Based on the face orientation information of the input face image A estimated in step S202 and the face orientation information of the registered face image recorded in advance in the database unit 205, the determination unit 203 selects the optimum integration coefficient from the plurality of coefficient patterns recorded in advance in the holding unit 106 (step S204). The process of step S205 is the same as step S104 in FIG. 2. According to the above configuration, the integration coefficient can be selected more accurately, so face images with different postures can be collated with high accuracy.
  • The information processing apparatus 300 according to the present embodiment further includes a selection unit 308 (selection means) in addition to the configuration of the information processing apparatus 200 of the second embodiment shown in FIG. 3, and includes a database unit 305 in place of the database unit 205. The other components are the same as in the second embodiment, so detailed description is omitted here.
  • In the database unit 305, in addition to the registered face image of the person to be collated and its face orientation information, an inverted registered face image obtained by flipping the registered face image left to right, together with face orientation information indicating the face orientation of the person shown in the inverted registered face image, is recorded in advance.
  • a feature quantity vector may be recorded in the database unit 305 instead of a face image.
  • the face orientation of the inverted registered face image is opposite to the face orientation of the registered face image. If the face orientation of the registered face image is rightward, the face orientation of the inverted registered face image is leftward.
  • Based on the face orientation information of the input face image A estimated by the estimation unit 207, the selection unit 308 selects, from the registered face image and the inverted registered face image recorded in advance in the database unit 305, the image whose face orientation is the same as that of the input face image A (see the sketch below).
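  • A sketch of the selection unit's behavior, assuming left/right pose labels and NumPy image arrays; the names are illustrative:

```python
import numpy as np

OPPOSITE = {"left": "right", "right": "left"}

def select_gallery_image(input_pose, registered_img, registered_pose):
    """Selection unit 308: return whichever of the registered image and its
    left-right mirror has the same pose label as the input face."""
    flipped_img = np.fliplr(registered_img)        # the inverted registered image
    flipped_pose = OPPOSITE.get(registered_pose, registered_pose)
    if input_pose == flipped_pose:
        return flipped_img, flipped_pose
    return registered_img, registered_pose
```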
  • The process of step S301 in FIG. 6 is the same as step S101 in FIG. 2.
  • The process of step S302 is the same as step S202 in FIG. 4.
  • Based on the face orientation information of the input face image A estimated in step S302, the selection unit 308 selects, from the registered face image and the inverted registered face image recorded in advance in the database unit 305, the image with the same face orientation as the input face image A (step S303).
  • The process of step S304 is the same as step S102 in FIG. 2, except that the first collation unit 101 and the second collation unit 102 each collate the input face image A acquired in step S301 with the image selected by the selection unit 308 in step S303.
  • The process of step S305 is the same as step S204 in FIG. 4: the determination unit 203 selects the corresponding integration coefficient from the plurality of coefficient patterns recorded in advance in the holding unit 106, based on the face orientation information of the input face image A estimated in step S302 and the face orientation information recorded in advance in the database unit 305 for the image selected by the selection unit 308.
  • The process of step S306 is the same as step S104 in FIG. 2. According to the above configuration, a registered face image whose face orientation is closer to that of the input face image A can be used for collation, so face images with different postures can be collated with high accuracy.
  • The information processing apparatus 400 includes a front face collation unit 401, a right-facing face collation unit 402, a left-facing face collation unit 403, a determination unit 404, an integration unit 405, a database unit 406, and a holding unit 407.
  • This embodiment shows a specific example of the first embodiment.
  • The processing flow of the information processing apparatus 400 is the same as the processing flow of the information processing apparatus 100 shown in FIG. 2.
  • The front face collation unit 401 collates the input face image A with the registered face image recorded in the database unit 406 using a discriminator 4010 trained in advance on front-face images, and calculates a matching score indicating the degree of similarity between the input face image A and the registered face image.
  • The right-facing face collation unit 402 collates the input face image A with the registered face image recorded in the database unit 406 using a discriminator 4020 trained in advance on right-facing face images, and calculates a matching score.
  • The left-facing face collation unit 403 collates the input face image A with the registered face image recorded in the database unit 406 using a discriminator 4030 trained in advance on left-facing face images, and calculates a matching score.
  • the front face collation unit 401, the right face collation unit 402, and the left face collation unit 403 use discriminators 4010, 4020, and 4030 specialized for face image collation with different postures.
  • In the database unit 406, registered face images of the persons to be collated are recorded in advance. It is desirable that registered face images be recorded for three postures (front, right-facing, and left-facing) for each person.
  • In this case, the front face collation unit 401 collates the input face image A with the front-facing registered face image, the right-facing face collation unit 402 collates the input face image A with the right-facing registered face image, and the left-facing face collation unit 403 collates the input face image A with the left-facing registered face image.
  • Alternatively, one front-facing registered face image per person may be recorded in the database unit 406 in advance; at matching time, the right-facing face collation unit 402 generates a right-facing face image from the front-facing registered face image recorded in the database unit 406 and collates it with the input face image A, and the left-facing face collation unit 403 likewise generates a left-facing face image from the front-facing registered face image and collates it with the input face image A.
  • The determination unit 404 determines the magnitude relationship between the matching score calculated by the right-facing face collation unit 402 and that calculated by the left-facing face collation unit 403, and thereby determines the face orientation of the person shown in the input face image A. If the input face image A is a right-facing face image, the right-facing face collation unit 402 outputs a high matching score; if it is a left-facing face image, the left-facing face collation unit 403 outputs a high matching score. The face orientation of the person shown in the input face image A can therefore be determined.
  • In the holding unit 407, integration coefficients (integration weights) used when integrating the matching scores calculated by the front face collation unit 401, the right-facing face collation unit 402, and the left-facing face collation unit 403 are recorded in advance. Specifically, in a situation where the registered face image is limited to front-facing images and the input face image A is limited to right-facing images, learning is performed to determine the optimum integration coefficients with which a desired score is obtained from the integration unit 405, and the learned coefficients are recorded in the holding unit 407.
  • Let w1 be the coefficient applied to the matching score calculated by the front face collation unit 401, w2 the coefficient applied to the matching score calculated by the right-facing face collation unit 402, and w3 the coefficient applied to the matching score calculated by the left-facing face collation unit 403; the set of integration coefficients w1, w2, and w3 is recorded in the holding unit 407. Furthermore, a set of integration coefficients learned using the matching scores obtained when front-facing face images are used as learning data may also be recorded in the holding unit 407.
  • Based on the face orientation of the person shown in the input face image A (i.e., the magnitude relationship of the matching scores), the determination unit 404 selects an appropriate integration coefficient from the matching-score integration coefficients recorded in the holding unit 407. Specifically, when it determines that the input face image A is a right-facing face image, the determination unit 404 selects the integration coefficient corresponding to right-facing face images. When it estimates that the input face image A is a left-facing face image, the determination unit 404 likewise selects the integration coefficient corresponding to right-facing face images, but then instructs the integration unit 405 to swap the matching score calculated by the right-facing face collation unit 402 with the matching score calculated by the left-facing face collation unit 403.
  • The function max(S2, S3) in Expression (2) selects the larger of S2 and S3, and min(S2, S3) selects the smaller of the two.
  • The magnitude relationship among the integration coefficients is w2 > w1 >> w3.
  • When the input face image A is right-facing, the magnitude relationship of the matching scores is S2 >> S1 >> S3, so the integration coefficients corresponding to right-facing face images can be applied directly to the respective matching scores to calculate the integrated matching score ST.
  • When the input face image A is left-facing, on the other hand, the magnitude relationship of the matching scores is S3 >> S1 >> S2, so applying the right-facing integration coefficients directly to the respective matching scores would not yield the desired integrated matching score ST. Yet if both right-facing and left-facing face images were used to learn the integration coefficients, w2 and w3 would become approximately equal weights, which is not an optimal integration; hence the score swap described above.
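  • Expression (2) itself is not reproduced in this excerpt; from the description of max() and min() and the score-swap instruction, one plausible reading is ST = w1·S1 + w2·max(S2, S3) + w3·min(S2, S3). The sketch below assumes that reading:

```python
def integrated_score_3way(s1, s2, s3, w1, w2, w3):
    # Assumed form of Expression (2): the larger of the two profile scores
    # receives the larger weight w2 and the smaller receives w3, so the
    # coefficients learned only for right-facing inputs (w2 > w1 >> w3)
    # also cover left-facing inputs without relearning.
    return w1 * s1 + w2 * max(s2, s3) + w3 * min(s2, s3)
```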
  • Since the integration coefficient learning need only be carried out with the input face image A facing in one direction (either right or left), a reduction in the prior learning processing can be expected.
  • In an evaluation, the authentication error rate (the false rejection rate at a false acceptance rate of 0.1%) is about 13% in one comparative configuration and about 9% in another, whereas when the integration coefficients learned using only front-facing and right-facing face images are used as described above, the authentication error rate is about 4.6%.
  • Alternatively, the integrated matching score ST may be calculated using Expression (3).
  • Alternatively, the integrated matching score ST may be calculated using Expression (4).
  • The information processing apparatuses 100, 200, 300, and 400 may further include an output unit 107 (output means) that determines, based on the integrated matching score ST, whether the person shown in the input face image A is the same as the person shown in the registered face image. That is, when the integrated matching score ST is equal to or greater than a predetermined threshold, the output unit 107 outputs a determination result indicating that the person shown in the input face image A is the same as the person shown in the registered face image; when the integrated matching score ST is below the threshold, it outputs a determination result indicating that the two persons are different.
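  • A sketch of the output unit's decision rule; the threshold value is application-dependent and the names are illustrative:

```python
def output_decision(st: float, threshold: float) -> str:
    """Output unit 107: compare the integrated matching score ST with a
    predetermined threshold and report the identity decision."""
    if st >= threshold:
        return "same person as the registered face image"
    return "different person from the registered face image"
```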
  • FIG. 8 shows the configuration when the output unit 107 is applied to the first embodiment, FIG. 9 the configuration when it is applied to the second embodiment, FIG. 10 the configuration for the third embodiment, and FIG. 11 the configuration for the fourth embodiment.
  • The information processing apparatuses 100, 200, 300, and 400 described in the first to fifth embodiments can be realized by, for example, a computer having a CPU, a storage device, and interfaces, together with an information processing program that controls these hardware resources.
  • the CPU executes the processes described in the first to fifth embodiments according to the information processing program stored in the storage device.
  • The present invention may be applied to a system composed of a plurality of devices or to a single device. Furthermore, the present invention is also applicable to cases where an information processing program that implements the functions of the embodiments is supplied directly or remotely to a system or apparatus. Therefore, a program installed in a computer to implement the functions of the present invention, a non-transitory computer-readable recording medium storing the program, and a WWW (World Wide Web) server from which the program is downloaded are also within the scope of the present invention.
  • (Supplementary note 1) An information processing apparatus comprising: first matching means for calculating a first score for an input face image using a first discriminator corresponding to face images captured in a first state; second matching means for calculating a second score for the input face image using a second discriminator corresponding to face images captured in a second state different from the first state; holding means for holding in advance integration coefficients used for integrating the first score and the second score, in association with the relationship between the first score and the second score; determination means for selecting, from among the integration coefficients held in the holding means, an integration coefficient corresponding to the relationship between the first score and the second score; and integration means for calculating an integrated score by integrating the first score and the second score using the integration coefficient selected by the determination means.
  • (Supplementary note 2) The information processing apparatus according to supplementary note 1, further comprising a database that stores in advance a registered face image of a person to be collated, wherein the first matching means includes means for calculating the first score by collating the input face image with the registered face image, and the second matching means includes means for calculating the second score by collating the input face image with the registered face image.
  • (Supplementary note 3) The information processing apparatus according to supplementary note 2, further comprising estimation means for estimating the face orientation of the person shown in the input face image, wherein the database stores in advance face orientation information of the registered face image together with the registered face image, the holding means holds the integration coefficients in association with the relationship between the first score and the second score, the face orientation information of the input face image, and the face orientation information of the registered face image, and the determination means includes means for selecting, from among the integration coefficients held by the holding means, an integration coefficient corresponding to the relationship between the first score and the second score, the face orientation information of the input face image estimated by the estimation means, and the face orientation information of the registered face image.
  • (Supplementary note 4) The information processing apparatus according to supplementary note 3, further comprising selection means for selecting a registered face image according to face orientation, wherein the database stores in advance, in addition to the registered face image of the person to be collated and its face orientation information, an inverted registered face image obtained by flipping the registered face image left to right and face orientation information of the inverted registered face image, and the determination means includes means for selecting, from among the integration coefficients held in the holding means, an integration coefficient corresponding to the relationship between the first score and the second score, the face orientation information of the input face image estimated by the estimation means, and the face orientation information of the registered face image selected by the selection means.
  • (Supplementary note 5) The information processing apparatus, wherein the relationship between the first score and the second score is the magnitude relationship between the first score and the second score.
  • (Supplementary note 6) An information processing method comprising: a first matching step of calculating a first score for an input face image using a first discriminator corresponding to face images captured in a first state; a second matching step of calculating a second score for the input face image using a second discriminator corresponding to face images captured in a second state different from the first state; a determination step of acquiring, from holding means that holds in advance integration coefficients used for integrating the first score and the second score in association with the relationship between the first score and the second score, an integration coefficient corresponding to the relationship between the first score and the second score; and an integration step of calculating an integrated score by integrating the first score and the second score using the acquired integration coefficient.
  • (Supplementary note 7) A computer-readable recording medium storing a program that causes a computer to execute: a first matching step of calculating a first score for an input face image using a first discriminator corresponding to face images captured in a first state; a second matching step of calculating a second score for the input face image using a second discriminator corresponding to face images captured in a second state different from the first state; a determination step of acquiring, from holding means that holds in advance integration coefficients used for integrating the first score and the second score in association with the relationship between the first score and the second score, an integration coefficient corresponding to the relationship between the first score and the second score; and an integration step of calculating an integrated score by integrating the first score and the second score using the acquired integration coefficient.
  • the present invention can be applied to face image matching technology.

Landscapes

  • Engineering & Computer Science (AREA)
  • Theoretical Computer Science (AREA)
  • Physics & Mathematics (AREA)
  • General Physics & Mathematics (AREA)
  • Human Computer Interaction (AREA)
  • Health & Medical Sciences (AREA)
  • Oral & Maxillofacial Surgery (AREA)
  • Multimedia (AREA)
  • General Health & Medical Sciences (AREA)
  • Data Mining & Analysis (AREA)
  • Artificial Intelligence (AREA)
  • Life Sciences & Earth Sciences (AREA)
  • Bioinformatics & Cheminformatics (AREA)
  • Bioinformatics & Computational Biology (AREA)
  • Computer Vision & Pattern Recognition (AREA)
  • Evolutionary Biology (AREA)
  • Evolutionary Computation (AREA)
  • General Engineering & Computer Science (AREA)
  • Collating Specific Patterns (AREA)

Abstract

The invention concerns an information processing device (100) comprising: a first comparison section (101) that calculates a first score for an input face image using a recognizer corresponding to a face image captured under a first condition; a second comparison section (102) that calculates a second score for the input face image using a recognizer corresponding to a face image captured under a second condition different from the first condition; a holding section (106) that holds a combination coefficient used for combining the first and second scores in association with the relationship between the first and second scores; a decision section (103) that selects, from the combination coefficients held in the holding section (106), a combination coefficient corresponding to the relationship between the first and second scores; and a combination section (104) that calculates a combined score integrating the first and second scores using the selected combination coefficient.
PCT/JP2012/079070 2012-03-09 2012-11-09 Dispositif et procédé de traitement d'informations, et support d'enregistrement WO2013132695A1 (fr)

Priority Applications (1)

Application Number Priority Date Filing Date Title
JP2014503416A JP6194880B2 (ja) 2012-03-09 2012-11-09 情報処理装置、情報処理方法および記録媒体

Applications Claiming Priority (2)

Application Number Priority Date Filing Date Title
JP2012-053187 2012-03-09
JP2012053187 2012-03-09

Publications (1)

Publication Number Publication Date
WO2013132695A1 true WO2013132695A1 (fr) 2013-09-12

Family

ID=49116197

Family Applications (1)

Application Number Title Priority Date Filing Date
PCT/JP2012/079070 WO2013132695A1 (fr) 2012-03-09 2012-11-09 Dispositif et procédé de traitement d'informations, et support d'enregistrement

Country Status (2)

Country Link
JP (1) JP6194880B2 (fr)
WO (1) WO2013132695A1 (fr)

Cited By (3)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
JP2016006616A (ja) * 2014-06-20 2016-01-14 ヤフー株式会社 学習装置、学習方法及び学習プログラム
JP2022136179A (ja) * 2020-02-12 2022-09-15 日本電気株式会社 制御装置、検知装置、制御方法、およびプログラム
US12073647B2 (en) 2014-03-13 2024-08-27 Nec Corporation Detecting device, detecting method, and recording medium

Families Citing this family (2)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
AU2019208182B2 (en) 2018-07-25 2021-04-08 Konami Gaming, Inc. Casino management system with a patron facial recognition system and methods of operating same
US11521460B2 (en) 2018-07-25 2022-12-06 Konami Gaming, Inc. Casino management system with a patron facial recognition system and methods of operating same

Citations (4)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
JP2006178651A (ja) * 2004-12-21 2006-07-06 Toshiba Corp 人物認識装置、人物認識方法および通行制御装置
JP2006343791A (ja) * 2005-06-07 2006-12-21 Hitachi Ltd 顔画像データベース作成方法
JP2010061465A (ja) * 2008-09-04 2010-03-18 Sony Corp 画像処理装置、撮像装置、画像処理方法およびプログラム
JP2011128916A (ja) * 2009-12-18 2011-06-30 Fujifilm Corp オブジェクト検出装置および方法並びにプログラム

Family Cites Families (1)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
JP5554984B2 (ja) * 2009-12-24 2014-07-23 キヤノン株式会社 パターン認識方法およびパターン認識装置

Patent Citations (4)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
JP2006178651A (ja) * 2004-12-21 2006-07-06 Toshiba Corp 人物認識装置、人物認識方法および通行制御装置
JP2006343791A (ja) * 2005-06-07 2006-12-21 Hitachi Ltd 顔画像データベース作成方法
JP2010061465A (ja) * 2008-09-04 2010-03-18 Sony Corp 画像処理装置、撮像装置、画像処理方法およびプログラム
JP2011128916A (ja) * 2009-12-18 2011-06-30 Fujifilm Corp オブジェクト検出装置および方法並びにプログラム

Cited By (4)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US12073647B2 (en) 2014-03-13 2024-08-27 Nec Corporation Detecting device, detecting method, and recording medium
JP2016006616A (ja) * 2014-06-20 2016-01-14 ヤフー株式会社 学習装置、学習方法及び学習プログラム
JP2022136179A (ja) * 2020-02-12 2022-09-15 日本電気株式会社 制御装置、検知装置、制御方法、およびプログラム
JP7444204B2 (ja) 2020-02-12 2024-03-06 日本電気株式会社 制御装置、検知装置、制御方法、およびプログラム

Also Published As

Publication number Publication date
JP6194880B2 (ja) 2017-09-13
JPWO2013132695A1 (ja) 2015-07-30

Similar Documents

Publication Publication Date Title
CN106096582B (zh) 区分真人面部与平坦表面
KR102486699B1 (ko) 영상 인식 방법, 영상 검증 방법, 장치, 및 영상 인식 및 검증에 대한 학습 방법 및 장치
JP6921694B2 (ja) 監視システム
JP6268960B2 (ja) 画像認識装置及び画像認識装置に対するデータ登録方法
CN105868677B (zh) 一种活体人脸检测方法及装置
JP6194880B2 (ja) 情報処理装置、情報処理方法および記録媒体
JP4353246B2 (ja) 法線情報推定装置、登録画像群作成装置および画像照合装置ならびに法線情報推定方法
JP5910631B2 (ja) 情報処理装置、情報処理方法および情報処理プログラム
US10353954B2 (en) Information processing apparatus, method of controlling the same, and storage medium
JP2006338092A (ja) パタン照合方法、パタン照合システム及びパタン照合プログラム
JP4979480B2 (ja) 顔認証装置
JP4858612B2 (ja) 物体認識システム、物体認識方法および物体認識用プログラム
US10762133B2 (en) Information processing apparatus, method of controlling the same, and storage medium
US20090123077A1 (en) Coefficient determining method, feature extracting method, system, and program, and pattern checking method, system, and program
CN108596079B (zh) 手势识别方法、装置及电子设备
US9904843B2 (en) Information processing device, information processing method, and program
JP6164284B2 (ja) 認証装置、認証方法およびコンピュータプログラム
WO2012046426A1 (fr) Dispositif de détection d'objet, procédé de détection d'objet, et programme de détection d'objet
JP6065842B2 (ja) 辞書学習装置、パターン照合装置、辞書学習方法およびコンピュータプログラム
WO2021038788A1 (fr) Dispositif d'évaluation de robustesse, procédé d'évaluation de robustesse et support d'enregistrement
JP6763408B2 (ja) 情報処理装置、情報処理方法、及び、プログラム
WO2009096208A1 (fr) Système, procédé et programme de reconnaissance d'objet
JP2007140695A (ja) 不審顔検出システム、不審顔検出方法および不審顔検出プログラム
CN113158706B (zh) 脸部抓拍方法、装置、介质以及电子设备
US20220309704A1 (en) Image processing apparatus, image processing method and recording medium

Legal Events

Date Code Title Description
121 Ep: the epo has been informed by wipo that ep was designated in this application

Ref document number: 12870750

Country of ref document: EP

Kind code of ref document: A1

ENP Entry into the national phase

Ref document number: 2014503416

Country of ref document: JP

Kind code of ref document: A

NENP Non-entry into the national phase

Ref country code: DE

122 Ep: pct application non-entry in european phase

Ref document number: 12870750

Country of ref document: EP

Kind code of ref document: A1