JP2009116385A - Information processor, information processing method, image identification device, image identification method, and program

Info

Publication number: JP2009116385A
Authority: JP (Japan)
Prior art keywords: image, model, model image, identification, images
Legal status: Granted
Application number: JP2007285164A
Other languages: Japanese (ja)
Other versions: JP5018404B2 (en), JP2009116385A5 (en)
Inventors: Takayuki Ashigahara, Yoshiaki Iwai, Akira Nakamura
Original Assignee: Sony Corp
Application filed by Sony Corp
Priority to JP2007285164A
Publication of JP2009116385A
Publication of JP2009116385A5
Application granted; publication of JP5018404B2
Status: Expired - Fee Related

Classifications

    • G PHYSICS
    • G06 COMPUTING; CALCULATING; COUNTING
    • G06K RECOGNITION OF DATA; PRESENTATION OF DATA; RECORD CARRIERS; HANDLING RECORD CARRIERS
    • G06K 9/00 Methods or arrangements for reading or recognising printed or written characters or for recognising patterns, e.g. fingerprints
    • G06K 9/62 Methods or arrangements for recognition using electronic means
    • G06K 9/6217 Design or setup of recognition systems and techniques; Extraction of features in feature space; Clustering techniques; Blind source separation
    • G06K 9/6228 Selecting the most significant subset of features
    • G06K 9/6201 Matching; Proximity measures
    • G06K 9/6202 Comparing pixel values or logical combinations thereof, or feature values having positional relevance, e.g. template matching
    • G06K 9/6203 Shifting or otherwise transforming the patterns to accommodate for positional errors
    • G06K 9/6211 Matching configurations of points or features, e.g. constellation matching

Abstract

An object of the present invention is to identify an object correctly even when similar model images are registered.
An identification feature point storage unit stores information on identification feature points, which are feature points whose correlation values between similar model images are low. When the images 11 to 15 are stored as model images in the model image dictionary storage unit and the query image 111 is supplied, the images 11 to 15 are extracted as model images highly likely to correspond to the identification target. The images 11 to 15 are affine-transformed into the images 11a to 15a. Then, correlation values are calculated between the feature amounts at the identification feature points of the affine-transformed images 11a to 15a and the feature amounts at the identification feature points of the query image 111, and an identification result for the query image 111 is obtained based on the correlation values at the identification feature points. The present invention can be applied to a recognition device or to a device corresponding to a determination processing unit.
[Selected figure] FIG. 4

Description

  The present invention relates to an information processing device, an information processing method, an image identification device, an image identification method, and a program, and more particularly to an information processing device, an information processing method, an image identification device, an image identification method, and a program suitable for performing image identification processing.

  A texture-based general object recognition method using local features is known in which model images are registered in advance, feature points are extracted from each of the model images and from a query image input for recognition processing, the local feature amounts around those points are matched, and mismatches are then removed using geometric constraints (see, for example, Patent Document 1).

  This technique extracts feature points from the model images and from the query image, describes the local feature amounts around them as vectors, matches the feature amounts, and removes outliers (mismatches) using the Hough transform or RANSAC; the object in the query image is then identified by the number of remaining match pairs.
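
  As a rough illustration, the following is a minimal sketch of this baseline pipeline, assuming OpenCV's SIFT stands in for the local feature description and RANSAC homography fitting stands in for the geometric-constraint step; the function name, the ratio test, and the thresholds are illustrative assumptions, not taken from the patent.

    import cv2
    import numpy as np

    def count_match_pairs(model_img, query_img, ratio=0.75, min_pairs=10):
        # Extract feature points and local feature amounts from both images.
        sift = cv2.SIFT_create()
        kp_m, desc_m = sift.detectAndCompute(model_img, None)
        kp_q, desc_q = sift.detectAndCompute(query_img, None)
        if desc_m is None or desc_q is None:
            return 0
        # k-NN matching of the local feature amounts, with a ratio test.
        knn = cv2.BFMatcher(cv2.NORM_L2).knnMatch(desc_m, desc_q, k=2)
        good = [p[0] for p in knn
                if len(p) == 2 and p[0].distance < ratio * p[1].distance]
        if len(good) < min_pairs:
            return len(good)
        # Remove outliers (mismatches) with a RANSAC-fitted geometric model.
        src = np.float32([kp_m[m.queryIdx].pt for m in good]).reshape(-1, 1, 2)
        dst = np.float32([kp_q[m.trainIdx].pt for m in good]).reshape(-1, 1, 2)
        _, mask = cv2.findHomography(src, dst, cv2.RANSAC, 5.0)
        # The object is identified by the number of surviving match pairs --
        # the step that breaks down when several models look alike.
        return int(mask.sum()) if mask is not None else 0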

JP 2004-326693 A

  However, when a plurality of model images are similar, for example when some textures of the models are the same, a general object recognition method that determines the object in a query image by the number of match pairs, as in Patent Document 1 described above, may observe match pairs exceeding the threshold for a plurality of model images and recognize a plurality of models as "correct".

  The present invention has been made in view of such a situation, and makes it possible to correctly recognize an object even when a plurality of model images are similar to each other.

  An information processing apparatus according to a first aspect of the present invention is an information processing apparatus that generates information used to identify an image, and includes: model image acquisition means for acquiring a plurality of model images; model image feature amount extraction means for extracting the feature amounts of the model images acquired by the model image acquisition means; matching means for performing matching processing on the feature amounts of the model images extracted by the model image feature amount extraction means; and identification feature point extraction means for extracting, in a similar model image, which is a model image in which a predetermined number or more of the feature amounts match a predetermined model image as a result of matching by the matching means, feature points having a low correlation with the predetermined model image as identification feature points used for identification between the similar model images.

  The matching means may include outlier removal means that removes outliers from the result of the feature amount matching processing between the model images.

  The apparatus may further include first conversion means for affine-transforming the similar model image so as to match the predetermined model image, and the identification feature point extraction means can extract the identification feature points using the similar model image converted by the first conversion means.

  The apparatus may further include: recognition image acquisition means for acquiring a recognition image; recognition image feature amount extraction means for extracting the feature amounts of the recognition image acquired by the recognition image acquisition means; and identification means for identifying the similar model image corresponding to the recognition image by obtaining, at the identification feature points extracted by the identification feature point extraction means, a correlation between the feature amounts of the similar model image and those of the recognition image.

  The apparatus may further include second conversion means for affine-transforming the similar model image so as to match the recognition image, and the identification means can identify the similar model image corresponding to the recognition image using the similar model image converted by the second conversion means.

  The identification feature points extracted by the identification feature point extraction means can be classified into first identification feature points, for which a larger number of matches indicates the corresponding similar model image, and second identification feature points, for which a smaller number of matches indicates the corresponding similar model image.

  The identification feature point extracting means may extract the identification feature points hierarchically.

  An information processing method according to the first aspect of the present invention is an information processing method of an information processing apparatus that generates information used to identify an image, and includes the steps of: acquiring a plurality of model images; extracting the feature amounts of the acquired model images; performing matching processing on the extracted feature amounts of the model images; and extracting, in a similar model image, which is a model image in which a predetermined number or more of the feature amounts match a predetermined model image as a result of the matching, feature points having a low correlation with the predetermined model image as identification feature points used for identification between the similar model images.

  A program according to the first aspect of the present invention is a program for causing a computer to execute processing for generating information used to identify an image, the processing including the steps of: acquiring a plurality of model images; extracting the feature amounts of the acquired model images; performing matching processing on the extracted feature amounts of the model images; and extracting, in a similar model image, which is a model image in which a predetermined number or more of the feature amounts match a predetermined model image as a result of the matching, feature points having a low correlation with the predetermined model image as identification feature points used for identification between the similar model images.

  In the first aspect of the present invention, a plurality of model images are acquired, the feature amounts of the acquired model images are extracted, and matching processing is performed on the extracted feature amounts. In a similar model image, which is a model image in which a predetermined number or more of the feature amounts match a predetermined model image, feature points having a low correlation with the predetermined model image are extracted as identification feature points used for identification between the similar model images.

  An image identification device according to a second aspect of the present invention is an image identification device that stores a plurality of model images and their feature amounts, receives an input of a recognition image, and identifies the model image corresponding to the recognition image, and includes: storage means for storing identification feature points, which are feature points having a low correlation between similar model images, the similar model images being a plurality of the model images in which a predetermined number or more of the feature amounts match any of the other model images; recognition image acquisition means for acquiring the recognition image; recognition image feature amount extraction means for extracting the feature amounts of the recognition image acquired by the recognition image acquisition means; and identification means for identifying the similar model image corresponding to the recognition image by obtaining, at the identification feature points stored by the storage means, a correlation between the feature amounts of the similar model image and those of the recognition image.

  The device may further include conversion means for affine-transforming the similar model image so as to match the recognition image, and the identification means can identify the similar model image corresponding to the recognition image using the similar model image converted by the conversion means.

  The device may further include matching means for performing feature amount matching processing between the recognition image acquired by the recognition image acquisition means and the stored model images. When, as a result of matching by the matching means, there are a predetermined number or more of model images for which the number of feature amount matches with the recognition image is equal to or greater than a predetermined threshold, the identification means can treat those model images as the similar model images and identify the similar model image corresponding to the recognition image by obtaining, at the identification feature points stored by the storage means, a correlation between the feature amounts of the similar model images and those of the recognition image.

  The matching means may include outlier removal means that removes outliers from the result of the feature amount matching processing.

  The identification feature points stored by the storage means can be classified into first identification feature points, for which a larger number of matches indicates the corresponding similar model image, and second identification feature points, for which a smaller number of matches indicates the corresponding similar model image.

  The identification feature points stored by the storage means may have a hierarchical structure.

  An image identification method according to the second aspect of the present invention is an image identification method of an image identification device that includes a first storage unit storing a plurality of model images and their feature amounts and a second storage unit storing identification feature points, which are feature points having a low correlation between similar model images, the similar model images being a plurality of the model images in which a predetermined number or more of the feature amounts match any of the other model images, and that receives an input of a recognition image and identifies the model image corresponding to the recognition image. The method includes the steps of: acquiring the recognition image; extracting the feature amounts of the acquired recognition image; and identifying the similar model image corresponding to the recognition image by obtaining, at the identification feature points, a correlation between the feature amounts of the similar model image and those of the recognition image.

  A program according to the second aspect of the present invention is a program for causing a computer to execute processing for identifying the model image corresponding to an input recognition image using information stored in a first storage unit that stores a plurality of model images and their feature amounts and in a second storage unit that stores identification feature points, which are feature points having a low correlation between similar model images, the similar model images being a plurality of the model images in which a predetermined number or more of the feature amounts match any of the other model images. The program causes the computer to execute processing including the steps of: acquiring the recognition image; extracting the feature amounts of the acquired recognition image; and identifying the similar model image corresponding to the recognition image by obtaining, at the identification feature points, a correlation between the feature amounts of the similar model image and those of the recognition image.

  In the second aspect of the present invention, the recognition image is acquired, the feature amounts of the acquired recognition image are extracted, and the similar model image corresponding to the recognition image is identified by obtaining, at the identification feature points, a correlation between the feature amounts of the similar model image and those of the recognition image.

  The network is a mechanism in which at least two devices are connected and information can be transmitted from one device to another device. The devices that communicate via the network may be independent devices, or may be internal blocks that constitute one device.

  The communication includes not only wireless communication and wired communication, but also communication in which wireless and wired communication are mixed, that is, communication in which wireless communication is performed in one section and wired communication in another. Further, communication from one device to another device may be performed by wire, while communication from the other device to the first device is performed wirelessly.

  The recognition device, the device corresponding to the identification feature point selection unit, and the device corresponding to the determination processing unit may each be an independent device, or may each be a single block within an information processing device.

  According to the first aspect of the present invention, identification feature points, which are feature points having a low correlation between similar model images, can be extracted from similar model images, which are a plurality of model images in which a predetermined number or more of the feature amounts match any of the other model images. By using these identification feature points in identification processing, a correct model image can be identified while maintaining robustness even when similar model images exist.

  Further, according to the second aspect of the present invention, identification processing can be performed using identification feature points, which are feature points having a low correlation between similar model images, the similar model images being a plurality of model images in which a predetermined number or more of the feature amounts match any of the other model images. Therefore, even when similar model images exist, the correct model image can be identified while maintaining robustness.

  Embodiments of the present invention will be described below. The correspondence between the constituent elements of the present invention and the embodiments described in the specification or the drawings is exemplified as follows. This description is intended to confirm that embodiments supporting the present invention are described in the specification or the drawings. Therefore, even if an embodiment is described in the specification or the drawings but is not described here as corresponding to a constituent element of the present invention, this does not mean that the embodiment does not correspond to that constituent element. Conversely, even if an embodiment is described here as corresponding to a constituent element, this does not mean that the embodiment does not correspond to constituent elements other than that one.

  The information processing device according to the first aspect of the present invention (for example, the recognition device 51 in FIG. 2, or a device corresponding to the identification feature point selection unit 61) generates information used to identify an image, and includes: model image acquisition means (for example, the model image acquisition unit 71 in FIG. 2) that acquires a plurality of model images; model image feature amount extraction means (for example, the feature amount extraction unit 72 in FIG. 2) that extracts the feature amounts of the model images acquired by the model image acquisition means; matching means (for example, the matching processing unit 74 and the outlier removal unit 75 in FIG. 2) that performs matching processing on the feature amounts of the model images extracted by the model image feature amount extraction means; and identification feature point extraction means (for example, the correlation image generation unit 78 and the identification feature point extraction unit 79 in FIG. 2) that extracts, in a similar model image, which is a model image in which a predetermined number or more of the feature amounts match a predetermined model image (for example, an attention model image) as a result of matching by the matching means, feature points having a low correlation with the predetermined model image as identification feature points used for identification between the similar model images.

  The matching means may include outlier removal means (for example, the outlier removal unit 75 in FIG. 2) that removes outliers from the result of the feature amount matching processing between the model images.

  The apparatus may further include first conversion means (for example, the affine parameter acquisition unit 76 and the affine transformation processing unit 77 in FIG. 2) that affine-transforms the similar model image so as to match the predetermined model image, and the identification feature point extraction means may extract the identification feature points using the similar model image converted by the first conversion means.

  The apparatus may further include: recognition image acquisition means (for example, the query image acquisition unit 93 in FIG. 2) that acquires a recognition image (for example, a query image); recognition image feature amount extraction means (for example, the feature amount extraction unit 94 in FIG. 2) that extracts the feature amounts of the recognition image acquired by the recognition image acquisition means; and identification means (for example, the identification feature point comparison unit 99 and the identification processing unit 100 in FIG. 2) that identifies the similar model image corresponding to the recognition image by obtaining, at the identification feature points extracted by the identification feature point extraction means, a correlation between the feature amounts of the similar model image and those of the recognition image.

  The apparatus may further include second conversion means (for example, the affine parameter acquisition unit 97 and the affine transformation processing unit 98 in FIG. 2) that affine-transforms the similar model image so as to match the recognition image, and the identification means may identify the similar model image corresponding to the recognition image using the similar model image converted by the second conversion means.

  The information processing method according to the first aspect of the present invention is an information processing method of an information processing apparatus (for example, the recognition device 51 in FIG. 2, or a device corresponding to the identification feature point selection unit 61) that generates information used to identify an image. A plurality of model images are acquired (for example, the processing of step S11 in FIG. 13 or step S21 in FIG. 4), the feature amounts of the acquired model images are extracted (for example, the processing of step S12 in FIG. 13 or step S22 in FIG. 4), and matching processing is performed on the extracted feature amounts of the model images (for example, the processing of step S14 in FIG. 13 or step S24 in FIG. 4). Then, in a similar model image, which is a model image in which a predetermined number or more of the feature amounts match a predetermined model image (for example, an attention model image) as a result of the matching, feature points having a low correlation with the predetermined model image are extracted as identification feature points used for identification between the similar model images (for example, the processing of steps S15 to S19 in FIG. 13 or steps S28 to S31 in FIG. 4).

  The program according to the first aspect of the present invention is a program for causing a computer to execute processing for generating information used to identify an image. A plurality of model images are acquired (for example, the processing of step S11 in FIG. 13 or step S21 in FIG. 4), the feature amounts of the acquired model images are extracted (for example, the processing of step S12 in FIG. 13 or step S22 in FIG. 4), and matching processing is performed on the extracted feature amounts of the model images (for example, the processing of step S14 in FIG. 13 or step S24 in FIG. 4). Then, in a similar model image, which is a model image in which a predetermined number or more of the feature amounts match a predetermined model image (for example, an attention model image) as a result of the matching, feature points having a low correlation with the predetermined model image are extracted as identification feature points used for identification between the similar model images (for example, the processing of steps S15 to S19 in FIG. 13 or steps S28 to S31 in FIG. 4).

  The image identification device according to the second aspect of the present invention (for example, a device corresponding to the determination processing unit 62 in FIG. 2) stores a plurality of model images and their feature amounts, receives an input of a recognition image, and identifies the model image corresponding to the recognition image. The device includes: storage means (for example, the identification feature point storage unit 92 in FIG. 2) that stores identification feature points, which are feature points having a low correlation between similar model images, the similar model images being a plurality of the model images in which a predetermined number or more of the feature amounts match any of the other model images; recognition image acquisition means (for example, the query image acquisition unit 93 in FIG. 2) that acquires the recognition image (for example, a query image); recognition image feature amount extraction means (for example, the feature amount extraction unit 94 in FIG. 2) that extracts the feature amounts of the recognition image acquired by the recognition image acquisition means; and identification means (for example, the identification feature point comparison unit 99 and the identification processing unit 100 in FIG. 2) that identifies the similar model image corresponding to the recognition image by obtaining, at the identification feature points stored by the storage means, a correlation between the feature amounts of the similar model image and those of the recognition image.

  The device may further include conversion means (for example, the affine parameter acquisition unit 97 and the affine transformation processing unit 98 in FIG. 2) that affine-transforms the similar model image so as to match the recognition image, and the identification means may identify the similar model image corresponding to the recognition image using the similar model image converted by the conversion means.

  The device may further include matching means (for example, the matching processing unit 95 and the outlier removal unit 96 in FIG. 2) that performs feature amount matching processing between the recognition image acquired by the recognition image acquisition means and the stored model images. When, as a result of matching by the matching means, there are a predetermined number or more of model images for which the number of feature amount matches with the recognition image is equal to or greater than a predetermined threshold, the identification means can treat those model images as the similar model images and identify the similar model image corresponding to the recognition image by obtaining, at the identification feature points stored by the storage means, a correlation between the feature amounts of the similar model images and those of the recognition image.

  The matching means may include outlier removal means (for example, the outlier removal unit 96 in FIG. 2) that removes outliers from the result of the feature amount matching processing.

  The image identification method according to the second aspect of the present invention is an image identification method of an image identification device (for example, a device corresponding to the determination processing unit 62 in FIG. 2) that includes a first storage unit storing a plurality of model images and their feature amounts and a second storage unit storing identification feature points, which are feature points having a low correlation between similar model images, the similar model images being a plurality of the model images in which a predetermined number or more of the feature amounts match any of the other model images, and that receives an input of a recognition image and identifies the model image corresponding to the recognition image. The recognition image (for example, a query image) is acquired (for example, the processing of step S41 in FIG. 15), the feature amounts of the acquired recognition image are extracted (for example, the processing of step S42 in FIG. 15), and the similar model image corresponding to the recognition image is identified by obtaining, at the identification feature points, a correlation between the feature amounts of the similar model image and those of the recognition image (for example, the processing of steps S47 to S50 in FIG. 15).

  The program according to the second aspect of the present invention causes a computer to execute processing for identifying the model image corresponding to an input recognition image using information stored in a first storage unit that stores a plurality of model images and their feature amounts and in a second storage unit that stores identification feature points, which are feature points having a low correlation between similar model images, the similar model images being a plurality of the model images in which a predetermined number or more of the feature amounts match any of the other model images. The recognition image (for example, a query image) is acquired (for example, the processing of step S41 in FIG. 15), the feature amounts of the acquired recognition image are extracted (for example, the processing of step S42 in FIG. 15), and the similar model image corresponding to the recognition image is identified by obtaining, at the identification feature points, a correlation between the feature amounts of the similar model image and those of the recognition image (for example, the processing of steps S47 to S50 in FIG. 15).

  Hereinafter, embodiments of the present invention will be described with reference to the drawings.

  For example, suppose that the model images corresponding to the models to be identified are the images 11 to 15 shown in FIG. 1. The respective images differ, but as shown in the simplified image 21, their shapes are basically the same except for the characters written in the double circle occupying roughly the left half, the star, and the presence, absence, or type of the fruit picture indicated by the cloud.

  Image data corresponding to the images 11 to 15 shown in FIG. 1 (hereinafter simply referred to as the images 11 to 15) are registered as data corresponding to model images (hereinafter simply referred to as model images). When an image corresponding to the image 11 is supplied as data corresponding to a query image (hereinafter simply referred to as a query image), only the image 11 should be extracted as the identification result. However, in the technique of identifying the object in a query image by the number of match pairs, depending on the threshold on the number of match pairs used for identification, any of the images 11 to 15 may be identified as correct.

  In particular, when there is a wide variety of model images, if the threshold on the number of match pairs is raised to a level at which the images 11 to 15 can be distinguished, the identification becomes too strict, and images that should actually be recognized as the same are likely to be misrecognized as different. In other words, it is difficult to correctly identify the images 11 to 15 at the threshold setting level of general matching processing.

  Therefore, when similar images such as the images 11 to 15 in FIG. 1 exist at the time of model image registration, it is preferable to obtain the correlation between these images, extract the feature points of the dissimilar portions, and use them so that the identification processing can distinguish similar images such as the images 11 to 15 in FIG. 1.

  FIG. 2 is a block diagram illustrating the configuration of a recognition device 51 that obtains the correlation between similar images in advance when registering model images and can use the matching result of the dissimilar portions in identification processing.

  The recognition device 51 includes an identification feature point selection unit 61 and a determination processing unit 62. Here, the recognition device 51 is described as a single device, but it goes without saying that the identification feature point selection unit 61 and the determination processing unit 62 may each be configured as a separate device.

  The identification feature point selection unit 61 includes a model image acquisition unit 71, a feature amount extraction unit 72, a model image dictionary storage unit 73, a matching processing unit 74, an outlier removal unit 75, an affine parameter acquisition unit 76, an affine transformation processing unit 77, a correlation image generation unit 78, and an identification feature point extraction unit 79.

  The model image acquisition unit 71 acquires a model image for identification processing and supplies the model image to the model image dictionary storage unit 73 and the feature amount extraction unit 72. Assume that the supplied model images include a set of very similar model images, such as images 11 to 15 shown in FIG.

  The feature amount extraction unit 72 extracts feature points from each model image supplied from the model image acquisition unit 71, extracts local feature amounts from around the feature points, and describes them as vectors. The feature quantity extraction unit 72 supplies the extracted local feature quantity information to the model image dictionary storage unit 73, the matching processing unit 74, and the outlier removal unit 75.

  For example, a feature point may be a pixel with a large difference from its neighboring pixels, or a pixel having a strong high-frequency component obtained by applying an FFT (Fast Fourier Transform) to the image data; it may also be extracted by any other method. As the local feature amount, for example, Gabor jets, Haar wavelets, Gaussian derivatives, SIFT features, and the like can be used.

  The model image dictionary storage unit 73 stores each model image supplied from the model image acquisition unit 71 and local feature amounts corresponding to each model image supplied from the feature amount extraction unit 72. The model image stored in the model image dictionary storage unit 73 and the local feature amount corresponding to the model image are supplied to the model image dictionary storage unit 91 of the determination processing unit 62 and used for matching with the query image.

  The matching processing unit 74 matches the local feature amounts supplied from the feature amount extraction unit 72 against the local feature amounts of each model image currently stored in the model image dictionary storage unit 73, using, for example, K-NN (k-Nearest Neighbor, a nearest neighbor classification method), to find match pairs. Hereinafter, the model image from which the latest local feature amounts were extracted by the feature amount extraction unit 72, and which serves as the reference for the matching processing, is referred to as the attention model image, and is distinguished from the plurality of other model images already stored in the model image dictionary storage unit 73 that are used for comparison with the local feature amounts of the attention model image.

  Then, the matching processing unit 74 supplies the attention model image, the similar model images having a predetermined number of match pairs or more with respect to the attention model image, and the information on those match pairs to the outlier removal unit 75. For example, when the images 12 to 15 of FIG. 1 are already stored in the model image dictionary storage unit 73 and the image 11 of FIG. 1 is supplied as a model image, the images 12 to 15 of FIG. 1 are extracted as similar model images having a predetermined number of match pairs or more with respect to the attention model image.
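
  The registration-side selection of similar model images can be sketched as a loop over the already registered dictionary, reusing count_match_pairs() from the earlier sketch; the SIMILARITY_PAIRS threshold and the function name are assumed placeholders, not values from the patent.

    SIMILARITY_PAIRS = 30  # assumed threshold on the number of match pairs

    def find_similar_models(attention_img, model_dictionary):
        # model_dictionary maps a model id to an already registered model image.
        similar = {}
        for model_id, stored_img in model_dictionary.items():
            pairs = count_match_pairs(stored_img, attention_img)
            if pairs >= SIMILARITY_PAIRS:  # enough match pairs: similar model
                similar[model_id] = pairs
        return similar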

  In addition, since the matching processing unit 74 obtains match pairs using local feature amounts, matching can be performed correctly even when the attention model image and the other model images being compared were captured from different viewpoints or differ in size.

  The outlier removal unit 75 removes mismatched pairs that do not satisfy geometric constraints from the match pairs detected by the matching processing unit 74, using, for example, the Hough transform or RANSAC (Random Sample Consensus); outlier removal may also be performed using any other technique. The outlier removal unit 75 supplies the attention model image, the similar model images having a predetermined number of match pairs or more with respect to the attention model image, and the match pair information after outlier removal to the affine parameter acquisition unit 76.

  The affine parameter acquisition unit 76 obtains the affine parameters between the attention model image and each similar model image from the remaining match pairs. An affine transformation is a similarity transformation, that is, a translation and rotation (Euclidean transformation) plus scaling, extended to allow shear deformation; it preserves geometric properties such that points on a straight line in the original figure remain on a straight line after transformation, and parallel lines remain parallel. In other words, even when the attention model image and a similar model image were captured from different viewpoints or differ in size, the affine parameter acquisition unit 76 obtains the affine parameters, namely the rotation, scaling, shear deformation, and translation needed to correct the distortion of the similar model image so that it appears as if captured at the same size and from the imaging direction corresponding to the attention model image.

  The affine parameter acquisition unit 76 supplies the attention model image, the similar model images having a predetermined number of match pairs or more with respect to the attention model image, the match pair information after outlier removal, and the calculated affine parameters to the affine transformation processing unit 77.

  The affine transformation processing unit 77 transforms each supplied similar model image with its affine parameters, correcting it so that it appears as if captured at the same size and from the imaging direction corresponding to the attention model image. The affine transformation processing unit 77 supplies the attention model image, the affine-transformed similar model images having a predetermined number of match pairs or more with respect to the attention model image, and the match pair information after outlier removal to the correlation image generation unit 78.
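
  A sketch of this alignment step follows, assuming the outlier-free match pairs are available as corresponding point arrays and that OpenCV's cv2.estimateAffine2D stands in for the affine parameter acquisition; the function and argument names are illustrative.

    import cv2
    import numpy as np

    def align_to_attention(similar_img, pts_similar, pts_attention, out_shape):
        # pts_similar / pts_attention: corresponding (x, y) points of the
        # match pairs that survived outlier removal.
        src = np.float32(pts_similar).reshape(-1, 1, 2)
        dst = np.float32(pts_attention).reshape(-1, 1, 2)
        # Fit the 2x3 affine parameters (rotation, scaling, shear, translation).
        M, _ = cv2.estimateAffine2D(src, dst)
        if M is None:
            return None  # not enough consistent pairs to fit the parameters
        h, w = out_shape[:2]
        # Warp so the similar model appears at the same size and from the
        # same imaging direction as the attention model image.
        return cv2.warpAffine(similar_img, M, (w, h))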

  The correlation image generation unit 78 calculates, for each feature point of the attention model image, the correlation value (feature amount distance) between its feature amount and that of the affine-transformed similar model image at the same position, and supplies the correlation value obtained for each feature point to the identification feature point extraction unit 79. When the obtained correlation values are mapped onto the image, a correlation image indicating the correlation between the attention model image and the similar model image is obtained.

  Based on the correlation value with the similar model image at each feature point of the attention model image, the identification feature point extraction unit 79 extracts the feature points whose correlation value is lower than a certain threshold, or a predetermined number of feature points taken in order from the lowest correlation value, as identification feature points, that is, feature points useful for identification among similar model images, and supplies them to the identification feature point storage unit 92 of the determination processing unit 62.
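
  The selection of identification feature points can be sketched as follows, assuming a helper describe_at() that returns the local feature amount at a given position of an image (SIFT, Gabor jets, or any of the descriptors mentioned above); the helper name and the point count are assumptions for illustration.

    import numpy as np

    def select_identification_points(attention_kps, attention_descs,
                                     aligned_similar_img, describe_at,
                                     num_points=50):
        # At each feature point of the attention model image, measure the
        # feature amount distance to the aligned similar model image.
        distances = []
        for kp, desc in zip(attention_kps, attention_descs):
            other = describe_at(aligned_similar_img, kp.pt)
            distances.append(np.linalg.norm(np.asarray(desc) - np.asarray(other)))
        # Largest distance = lowest correlation between the two models.
        order = np.argsort(distances)[::-1]
        return [attention_kps[i] for i in order[:num_points]]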

  For example, suppose that the images 12 to 15 of FIG. 1, or images that would become the images 12 to 15 of FIG. 1 when affine-transformed, are already stored in the model image dictionary storage unit 73, and that the image 11 of FIG. 1 is supplied. The images 12 to 15 of FIG. 1, or the images corresponding to them, are then extracted as similar model images with respect to the image 11, which is the attention model image. The correlation image generation unit 78 then obtains, at each feature point of the attention model image, the correlation value of the feature amounts between the image 11 and each of the affine-transformed images 12 to 15, which are the similar model images. Then, as shown for example in FIG. 3, the identification feature point extraction unit 79 extracts the feature points of the portions having low correlation values.

  For example, among the images 12 to 15 described with reference to FIG. 1, when the image 12 is supplied to the identification feature point selection unit 61 earliest as a model image, the image 12 is registered as a similar model image corresponding to the images 13 to 15, but identification feature points may not be extracted from the image 12 itself, because at the time of its registration it was not processed as a model image having similar images. Therefore, for example, when the matching processing unit 74 detects a similar model image having a predetermined number of match pairs or more, and no similar model image is yet associated with the model image detected as the similar model image, it is preferable to execute the identification feature point detection processing with the attention model image and the similar model image swapped.

  Also, instead of inputting one model image at a time and performing the above-described processing sequentially, a set of similar model images may be generated in advance, and the identification feature points may be extracted based on, for example, the average correlation over all combinations of the images included in the set.

  In such a case, for example, for the images 11 to 15 described with reference to FIG. 1, matching processing is performed by the matching processing unit 74 to obtain match pairs, the outliers of the match pairs are removed by the outlier removal unit 75, and the images 11 to 15 are thereby extracted as a set of similar model images.

  The affine parameter acquisition unit 76 then acquires the affine parameters for all combinations, and the affine transformation processing unit 77 performs the affine transformation for each combination. The correlation image generation unit 78 generates the correlation images for all combinations and obtains their average value. Based on the average of the plurality of correlation images, the identification feature point extraction unit 79 can then extract the feature points whose correlation value is lower than a threshold, or a predetermined number of feature points taken in order from the lowest correlation value, as identification feature points, that is, feature points useful for identification among similar model images, and supply them to the identification feature point storage unit 92 of the determination processing unit 62.
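
  A sketch of this set-based variant for one attention model image, averaging the per-point feature amount distance over every aligned similar model image in the set; describe_at() is the same assumed helper as in the earlier sketch, and the point count is again illustrative.

    import numpy as np

    def select_points_vs_set(attention_kps, attention_descs, aligned_similars,
                             describe_at, num_points=50):
        # Average the feature amount distance at each attention-model feature
        # point over every aligned similar model image in the set.
        mean_d = np.zeros(len(attention_kps))
        for img in aligned_similars:
            for i, (kp, desc) in enumerate(zip(attention_kps, attention_descs)):
                other = describe_at(img, kp.pt)
                mean_d[i] += np.linalg.norm(np.asarray(desc) - np.asarray(other))
        mean_d /= len(aligned_similars)
        order = np.argsort(mean_d)[::-1]  # lowest average correlation first
        return [attention_kps[i] for i in order[:num_points]]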

  Next, the determination processing unit 62 of the recognition device 51 in FIG. 2 will be described. The determination processing unit 62 includes a model image dictionary storage unit 91, an identification feature point storage unit 92, a query image acquisition unit 93, a feature amount extraction unit 94, a matching processing unit 95, an outlier removal unit 96, an affine parameter acquisition unit 97, an affine transformation processing unit 98, an identification feature point comparison unit 99, an identification processing unit 100, and an identification result output unit 101.

  The model image dictionary storage unit 91 receives and stores the model image and the local feature corresponding to the model image from the model image dictionary storage unit 73.

  The identification feature point storage unit 92 receives, from the identification feature point extraction unit 79, the information on the identification feature points, that is, the feature points of an attention model image having similar model images whose correlation values with those similar model images are low, and stores it.

  The query image acquisition unit 93 acquires a query image that is an identification target.

  The feature amount extraction unit 94 extracts feature points from the query image supplied from the query image acquisition unit 93, extracts local feature amounts from around the feature points, and describes them as vectors. The feature point and feature amount extraction method is the same as that of the feature amount extraction unit 72. The feature quantity extraction unit 94 supplies the extracted local feature quantity information to the matching processing unit 95 and the outlier removal unit 96.

  The matching processing unit 95 matches the local feature amounts supplied from the feature amount extraction unit 94 against the local feature amounts corresponding to the model images stored in the model image dictionary storage unit 91, using, for example, a technique such as K-NN, to find match pairs. The matching processing unit 95 then supplies the query image, the model images having a predetermined number of match pairs or more with respect to the query image, and the information on those match pairs to the outlier removal unit 96. For example, when the images 11 to 15 of FIG. 1 are stored as model images in the model image dictionary storage unit 91 and the query image 111 shown in FIG. 4 is supplied, the matching processing unit 95 extracts the images 11 to 15 as model images highly likely to correspond to the identification target and supplies them to the outlier removal unit 96.

  The outlier removal unit 96 removes mismatched pairs that do not satisfy geometric constraints from the match pairs detected by the matching processing unit 95, using the Hough transform, RANSAC, or the like. When there is no model image for which the number of remaining match pairs exceeds a predetermined threshold, the outlier removal unit 96 supplies that result to the identification processing unit 100. When the number of model images whose remaining match pairs exceed the predetermined threshold is smaller than a predetermined number, the outlier removal unit 96 supplies the model images remaining without being removed, together with their match pair counts, to the identification processing unit 100. Further, when the number of model images whose remaining match pairs exceed the predetermined threshold is equal to or greater than the predetermined number, those model images are considered to be the similar model images described above, and the outlier removal unit 96 supplies the similar model images, the query image, and the match pair information after outlier removal to the affine parameter acquisition unit 97.

  From the remaining match pairs, the affine parameter acquisition unit 97 obtains the affine parameters between the query image and each similar model image, and supplies the query image, the affine parameters, the similar model images whose remaining match pairs exceed the predetermined threshold, and the match pair information after outlier removal to the affine transformation processing unit 98.

  The affine transformation processing unit 98 transforms each supplied similar model image with its affine parameters, correcting it so that it appears as if captured at the same size and from the imaging direction corresponding to the query image. As illustrated in FIG. 4, when the images 11 to 15 are extracted as similar model images for the query image 111, the affine transformation processing unit 98 converts the images 11 to 15 into the affine-transformed images 11a to 15a. The affine transformation processing unit 98 supplies the query image, the affine-transformed similar model images, and the match pair information after outlier removal to the identification feature point comparison unit 99.

  Based on the identification feature points of each model image stored in the identification feature point storage unit 92, the identification feature point comparison unit 99 locates the identification feature points in the affine-transformed similar model images, calculates the feature amounts there, and calculates the correlation values between the feature amounts at those identification feature points and the feature amounts at the corresponding identification feature points on the query image. That is, the identification feature point comparison unit 99 calculates the correlation values between the feature amounts at the identification feature points of the affine-transformed images 11a to 15a of FIG. 4 and the feature amounts at the identification feature points of the query image 111. The identification feature point comparison unit 99 supplies the calculated correlation values to the identification processing unit 100.

  The identification processing unit 100 executes identification processing based on the information supplied from the outlier removal unit 96 or the identification feature point comparison unit 99. That is, when the outlier removal unit 96 detects that the number of model images whose match pairs remaining after outlier removal exceed the predetermined threshold is smaller than the predetermined number, the identification processing unit 100 determines, based on the match pair counts supplied from the outlier removal unit 96, that the query image corresponds to the model image having the largest number of match pairs, or to one or more model images having more match pairs than a predetermined threshold, and outputs the identification result to the identification result output unit 101. When notified by the outlier removal unit 96 that there is no model image whose remaining match pairs exceed the predetermined threshold, the identification processing unit 100 outputs to the identification result output unit 101 the determination result that the query image does not match any model image.

  Further, when the outlier removal unit 96 detects that the number of similar model images whose match pairs remaining after outlier removal exceed the predetermined threshold is equal to or greater than the predetermined number, the identification processing unit 100 is supplied, from the identification feature point comparison unit 99, with the correlation values of the feature amounts at the identification feature points between the similar model images and the query image. Based on the supplied correlation values, the identification processing unit 100 determines that the query image corresponds to the similar model image having the highest average correlation value at the identification feature points, or to one or more similar model images whose average correlation value exceeds a predetermined threshold, and outputs the identification result to the identification result output unit 101.
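
  The decision among the candidate similar model images can be sketched as follows, assuming each candidate has already been affine-aligned to the query image and that a helper correlation_at() returns the feature amount correlation at one point; the helper name and the acceptance threshold are illustrative assumptions.

    import numpy as np

    def identify(query_img, candidates, correlation_at, min_corr=0.6):
        # candidates maps a model id to (aligned model image, identification
        # feature points stored for that model).
        best_id, best_avg = None, -1.0
        for model_id, (img, id_points) in candidates.items():
            scores = [correlation_at(img, query_img, pt) for pt in id_points]
            avg = float(np.mean(scores))
            if avg > best_avg:
                best_id, best_avg = model_id, avg
        # Accept only when the average correlation at the identification
        # feature points is high enough; otherwise report no match.
        return best_id if best_avg >= min_corr else None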

  The identification result output unit 101 outputs the identification result supplied from the identification processing unit 100, for example by displaying it on a display unit, outputting it as audio data, or outputting it to another device via a predetermined transmission path or by recording it on a predetermined recording medium.

  The similar-image identification processing in the recognition device 51 can be applied to the identification of images that have substantially the same format and size and differ only in their characters, such as telops of the same program in television broadcasting.

  For example, a case where the # 1 model image 121 and the # 2 model image 122 shown in FIG. 5 are compared with the query image 123 will be described.

  The #2 model image 122 and the query image 123 are images that should be identified as the same, whereas the #1 model image 121 has substantially the same format and size as the #2 model image 122 and differs only in its characters.

  Therefore, when the #1 model image 121 and the #2 model image 122 are registered as model images and the query image 123 is supplied, only the #2 model image 122 should be extracted as the identification result. However, in the conventional technique of identifying the object in a query image by the number of match pairs, as shown in FIG. 6, a predetermined number or more of match pairs are extracted for both the #1 model image 121 and the #2 model image 122, so the correct answer cannot be reliably identified.

  Suppose, then, that the identification feature point selection unit 61 described above extracts the identification feature points of the #1 model image 121 and the #2 model image 122.

  The image 131 in FIG. 7 is an image obtained by overlaying the affine-transformed images when one of the #1 model image 121 and the #2 model image 122 is taken as the attention model image and the other as the similar model image. From the image 131, it can be seen that the #1 model image 121 and the #2 model image 122 differ in the characters "local supporters are discouraged" and "the cause of the defeat of the Japanese national team", but are otherwise almost the same.

  The image 132 in FIG. 7 is the correlation image between the #1 model image 121 and the #2 model image 122; portions with high correlation are displayed in white and portions with low correlation in black.

  The image 133 in FIG. 7 shows the identification feature points obtained from the #1 model image 121 and the #2 model image 122. The determination processing unit 62 in FIG. 2 stores the identification feature points shown in the image 133 in the identification feature point storage unit 92.

  As described with reference to FIG. 6, in the conventional technique of identifying the object in a query image by the number of match pairs, a predetermined number or more of match pairs are extracted for both the #1 model image 121 and the #2 model image 122 against the query image 123, and both may be identified as images corresponding to the query image 123. In contrast, the determination processing unit 62 in FIG. 2 performs identification processing by obtaining the correlation values with the query image 123 at the identification feature points of the #1 model image 121 and the #2 model image 122 shown in the image 133 of FIG. 7, and can therefore correctly identify the image corresponding to the query image 123 as the #2 model image 122.

  That is, the determination processing unit 62 in FIG. 2 performs the determination using the many identification feature points included in the portions reading "the local supporters are discouraged" and "the cause of the defeat of the Japanese national team" in the #1 model image 121 and the #2 model image 122, and can therefore correctly identify the image corresponding to the query image 123 as the #2 model image 122.

  Further, suppose for example that, in the simplified image 21 of FIG. 1, there are N model images whose portions indicated by the cloud are similar, that the N images can be classified by the double-circle portion, and that the similar model images within a class can then be completely classified by the star-shaped portion. In this way, when the similarity between model images is represented by a hierarchical structure, or when there are a plurality of similar portions between model images, a similar model image can be determined by distinguishing the identification feature points or by layering the identification feature points, as in the sketch below.
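
  One possible sketch of such a hierarchy, purely illustrative and not taken from the patent: each level holds the identification feature points that separate its subgroups (for example, the double-circle portion first, then the star portion), and identification descends the tree instead of testing every model; avg_correlation() is an assumed helper.

    class IdPointNode:
        # A node in the layered identification-feature-point structure.
        def __init__(self, children=None, model_id=None):
            self.children = children or []  # list of (id_points, IdPointNode)
            self.model_id = model_id        # set only at leaf nodes

    def identify_hierarchical(root, query_img, avg_correlation):
        # avg_correlation(query_img, id_points) is assumed to return the mean
        # feature amount correlation of the query at the given points.
        node = root
        while node.children:
            # Descend into the subgroup whose distinguishing identification
            # feature points correlate best with the query image.
            _, node = max(node.children,
                          key=lambda c: avg_correlation(query_img, c[0]))
        return node.model_id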

  For example, a case will be described in which, as shown in FIG. 8, the model image dictionary storage unit 91 of the determination processing unit 62 to which the query image 151 is input stores the #11 model image 161 to the #14 model image 164 as similar model images.

  Since each of the #11 model image 161 to the #14 model image 164 has portions similar to the query image 151, in the above-described technique that identifies an object in a query image by the number of match pairs, any of the #11 model image 161 to the #14 model image 164 may be identified as corresponding to the query image 151.

  Each of the #11 model image 161 to the #14 model image 164 has similar portions and dissimilar portions. FIG. 9 shows the correlation images among the #11 model image 161 to the #14 model image 164; in the correlation images of FIG. 9, portions with large correlation are displayed in white and portions with small correlation in black.

  The difference between the #11 model image 161 and the #12 model image 162 is the heart mark portion at the upper right and the crescent moon mark portion at the lower right. The difference between the #11 model image 161 and the #13 model image 163 is the heart mark portion at the upper right and the lightning bolt mark portion at the lower left. The difference between the #11 model image 161 and the #14 model image 164 is the lightning bolt mark portion at the lower left. The difference between the #12 model image 162 and the #13 model image 163 is the crescent moon mark portion at the lower right and the lightning bolt mark portion at the lower left. The difference between the #12 model image 162 and the #14 model image 164 is the heart mark portion at the upper right, the crescent moon mark portion at the lower right, and the lightning bolt mark portion at the lower left. The difference between the #13 model image 163 and the #14 model image 164 is the heart mark portion at the upper right.

  In the #11 model image 161 to the #14 model image 164, which are similar model images, the smile mark at the upper left is common to all of them and therefore does not become an identification feature point. On the other hand, since a portion without a mark has few features as a model image, feature points should not be extracted from such a portion in either the query image or the model image. In other words, when there is a match pair of feature amounts between the query image and the model image in a portion without a mark, it can be estimated that the query image and the model image are not similar.

  Also, in the #11 model image 161 to the #14 model image 164, the portions where marks other than the smile mark at the upper left are drawn are portions with strong features as model images, so when there is a feature-amount match pair with the query image at such a portion, it can be estimated that the query image and the model image are similar.

  Therefore, the identification feature point extraction unit 79 of the identification feature point selection unit 61 classifies the identification feature points into two types, as shown in FIG. 10. The portions indicated by + in the figure are identification feature points at which a positive determination, estimating that the query image and the model image are similar, is made when the correlation value between the model image and the query image is high, that is, when there is a match pair at the corresponding feature point. The portions indicated by x in the figure are identification feature points at which a positive determination, estimating that the query image and the model image are similar, is made when the correlation value between the model image and the query image is low, that is, when there is no match pair at the corresponding feature point.

  In FIG. 10, the classified identification feature points of the #11 model image 161 are indicated as the identification feature points 181, those of the #12 model image 162 as the identification feature points 182, those of the #13 model image 163 as the identification feature points 183, and those of the #14 model image 164 as the identification feature points 184.

  By using the identification feature points classified in this way, the determination processing unit 62 can perform the identification process while distinguishing the portions that are important for identification from those that are not, even when the differing portions vary among the similar model images.

  That is, the identification feature point comparison unit 99 of the determination processing unit 62, which calculates the correlation values at the identification feature points using the identification feature points 181 to 184 of FIG. 10, can detect that the correlation between the #11 model image 161 and the query image is positive only at the lower right portion of the identification feature points 181, that the correlation between the #12 model image 162 and the query image is positive only at the upper right portion of the identification feature points 182, that the correlation between the #13 model image 163 and the query image is positive at the lower left, upper right, and lower right portions of the identification feature points 183, and that the correlation between the #14 model image 164 and the query image is positive at the lower right and lower left portions of the identification feature points 184. As a result, it is determined that the model image corresponding to the query image 151 is the #13 model image 163.
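
  The decision logic just described can be summarized in code. The following is a minimal sketch, not taken from the patent itself: the data layout, the has_match() stand-in, and the fraction-based scoring are all assumptions for illustration. A '+' point votes for a model when the query matches there, and an 'x' point votes for it when the query does not.

```python
def identify(models, has_match):
    """models: {model_id: [(point, kind), ...]} with kind in {'+', 'x'}.
    has_match(model_id, point) -> True when the query's feature amount at
    `point` forms a match pair with the model's feature amount there."""
    scores = {}
    for model_id, id_points in models.items():
        positive = 0
        for point, kind in id_points:
            matched = has_match(model_id, point)
            # '+' points support the model on a match,
            # 'x' points support it on a non-match.
            if (kind == '+' and matched) or (kind == 'x' and not matched):
                positive += 1
        scores[model_id] = positive / len(id_points)
    # The model with the largest fraction of positive determinations wins.
    return max(scores, key=scores.get), scores
```

  With the identification feature points 181 to 184 as the models and the query image 151 as input, such a scoring would peak for the #13 model image 163, mirroring the determination described above.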

  Next, a case where similar model images are discriminated by classifying the identification feature points hierarchically will be described.

  The identification feature point extraction unit 79 of the identification feature point selection unit 61 classifies the similar model images into a plurality of similar model groups, and makes the classification hierarchical so that, by refining it, the similar models can finally be identified individually. Here, as shown in FIG. 11, the #11 model image 161 to the #14 model image 164 of FIG. 8 are classified into two layers: the first layer is a similar model identification layer 1 that sorts the four similar models into two similar model groups, and the second layer is a similar model discrimination layer 2 that discriminates the similar models into individual similar model images. When the number of similar models is larger, the number of similar model identification layers increases.

  First, the identification feature point extraction unit 79 of the identification feature point selection unit 61 selects a group of identification feature points suitable for identifying a certain model group, and holds them as layer 1 model identification feature points. For example, the #11 model image 161 to the #14 model image 164 of FIG. 8 are divided into two model groups: the #11 model image 161 and the #14 model image 164, and the #12 model image 162 and the #13 model image 163. What is most suitable for identifying these two model groups is the presence or absence of the heart mark at the upper right, so the heart mark portion at the upper right is set as the layer 1 model identification feature point. Then, when the correlation value at the upper right position between the #12 model image 162 or the #13 model image 163 and the query image is high, that is, when there is a match pair, the identification feature point comparison unit 99 makes a positive determination estimating that the query image and the model image are similar. Likewise, when the correlation value at the upper right position between the #11 model image 161 or the #14 model image 164 and the query image is low, that is, when there is no match pair, the identification feature point comparison unit 99 makes a positive determination estimating that the query image and the model image are similar.

  Then, the identification feature point extraction unit 79 of the identification feature point selection unit 61 selects identification feature points that can further distinguish the plurality of model images included in each model group, and holds them as layer 2 model identification feature points. What is most suitable for identifying the #11 model image 161 from the #14 model image 164, and the #12 model image 162 from the #13 model image 163, is the presence or absence of the lightning bolt mark at the lower left, so the lightning bolt mark portion at the lower left is set as the layer 2 model identification feature point. Then, when the correlation value at the lower left position between the #11 model image 161 or the #12 model image 162 and the query image is high, that is, when there is a match pair, the identification feature point comparison unit 99 makes a positive determination estimating that the query image and the model image are similar. Likewise, when the correlation value at the lower left position between the #13 model image 163 or the #14 model image 164 and the query image is low, that is, when there is no match pair, the identification feature point comparison unit 99 makes a positive determination estimating that the query image and the model image are similar.

  For the #11 model image 161 to the #14 model image 164 of FIG. 8, the layer 1 model identification feature points and the layer 2 model identification feature points having the hierarchical structure shown in FIG. 11 are as shown in FIG. 12. Also in FIG. 12, the portions indicated by + in the figure are identification feature points at which a positive determination, estimating that the query image and the model image are similar, is made when the correlation value between the model image and the query image is high, that is, when there is a match pair at the corresponding feature point, and the portions indicated by x in the figure are identification feature points at which a positive determination, estimating that the query image and the model image are similar, is made when the correlation value between the model image and the query image is low, that is, when there is no match pair at the corresponding feature point.

  Using the layer 1 model identification feature points and the layer 2 model identification feature points of FIG. 12, the identification feature point comparison unit 99 of the determination processing unit 62, which calculates the correlation values at the identification feature points, can calculate that, for the #11 model image 161, neither the layer 1 nor the layer 2 model identification feature point is positive; for the #12 model image 162, only the layer 1 model identification feature point is positive; for the #13 model image 163, both the layer 1 and the layer 2 model identification feature points are positive; and for the #14 model image 164, only the layer 2 model identification feature point is positive. That is, the identification feature point comparison unit 99 can detect in layer 1 that the #11 model image 161 and the #14 model image 164 do not correspond to the query image, and in layer 2 that the #12 model image 162 does not correspond to the query image. As a result, it is determined that the model image corresponding to the query image 151 is the #13 model image 163.
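
  The layered narrowing-down can likewise be sketched in code. The fragment below is an illustration under assumed data structures; the layer dictionaries and the is_positive() predicate are hypothetical stand-ins, not the patent's interfaces. Each layer keeps only the models whose identification feature points in that layer all give a positive determination.

```python
def identify_hierarchical(layers, candidates, is_positive):
    """layers: per-layer dict {model_id: [(point, kind), ...]}.
    is_positive(model_id, point, kind) -> True when the determination at
    `point` is positive for the model ('+': a match pair exists,
    'x': no match pair exists)."""
    for layer in layers:
        # A model survives a layer only if every one of its identification
        # feature points in that layer gives a positive determination.
        survivors = [
            model_id for model_id in candidates
            if all(is_positive(model_id, p, k)
                   for p, k in layer.get(model_id, []))
        ]
        if survivors:
            candidates = survivors
    return candidates  # ideally exactly one model id remains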

  In this way, even when the similarity between model images has a hierarchical structure, or even when there are a plurality of similar portions between model images, the extraction of identification feature points in similar model images can be reliably executed by distinguishing the similar portions or by layering the identification feature points.

  Next, with reference to FIG. 13, an identification feature point extraction process 1 that is a first example of a process executed by the identification feature point selection unit 61 of the recognition device 51 will be described.

  In step S11, the model image acquisition unit 71 acquires a model image for identification processing and supplies the model image to the model image dictionary storage unit 73 and the feature amount extraction unit 72.

  In step S12, the feature amount extraction unit 72 extracts feature points from each model image supplied from the model image acquisition unit 71, extracts local feature amounts from around the feature points, and describes them as vectors. The feature quantity extraction unit 72 supplies the extracted local feature quantity information to the model image dictionary storage unit 73, the matching processing unit 74, and the outlier removal unit 75.

  In step S13, the model image dictionary storage unit 73 stores each model image supplied from the model image acquisition unit 71 and the local feature amounts, supplied from the feature amount extraction unit 72, corresponding to each model image. The model images stored in the model image dictionary storage unit 73 and the local feature amounts corresponding to them are supplied to and stored in the model image dictionary storage unit 91 of the determination processing unit 62 after the identification feature point extraction process 1 ends.

  In step S14, the matching processing unit 74 matches the local feature amounts of the model image of interest supplied from the feature amount extraction unit 72 against the local feature amounts of each model image currently stored in the model image dictionary storage unit 73, for example by a technique such as K-NN, to obtain match pairs. Then, the matching processing unit 74 supplies the model image of interest, the similar model images having a predetermined number or more of match pairs with the model image of interest, and the information on those match pairs to the outlier removal unit 75.
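
  As one concrete illustration of the match-pair search in step S14, the brute-force nearest-neighbour sketch below may help; the patent only names "a technique such as K-NN", so the distance-ratio acceptance test used here is an assumption on our part.

```python
import numpy as np

def knn_match_pairs(desc_a, desc_b, ratio=0.8):
    """desc_a: (N, D) local feature vectors of the model image of interest;
    desc_b: (M, D), M >= 2, vectors of a stored model image. Returns index
    pairs (i, j) passing a nearest/second-nearest distance-ratio test."""
    pairs = []
    for i, d in enumerate(desc_a):
        dists = np.linalg.norm(desc_b - d, axis=1)  # distances to all of B
        j, j2 = np.argsort(dists)[:2]               # two nearest neighbours
        if dists[j] < ratio * dists[j2]:            # ratio acceptance test
            pairs.append((i, int(j)))
    return pairs
```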

  In step S15, the outlier removal unit 75 removes, using, for example, the Hough transform or RANSAC, the mismatched pairs that do not satisfy the geometric constraint from the match pairs detected by the matching processing unit 74. The outlier removal unit 75 extracts the set of similar model images that still have a predetermined number or more of match pairs with the model image of interest, and supplies the model image of interest, the similar model images, and the match pair information after outlier removal to the affine parameter acquisition unit 76.
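
  The outlier removal of step S15 can be sketched as a RANSAC loop over an affine model, as below. This is only one possible reading: the patent names the Hough transform and RANSAC merely as examples, and the iteration count and tolerance here are arbitrary illustrative values.

```python
import numpy as np

def ransac_affine_inliers(src, dst, iters=200, tol=3.0, seed=0):
    """src, dst: (N, 2) matched point coordinates, N >= 3.
    Returns a boolean mask of pairs consistent with one affine transform."""
    rng = np.random.default_rng(seed)
    n = len(src)
    src_h = np.hstack([src, np.ones((n, 1))])        # homogeneous coords
    best = np.zeros(n, dtype=bool)
    for _ in range(iters):
        idx = rng.choice(n, size=3, replace=False)   # minimal affine sample
        A, _, rank, _ = np.linalg.lstsq(src_h[idx], dst[idx], rcond=None)
        if rank < 3:
            continue                                 # degenerate (collinear)
        err = np.linalg.norm(src_h @ A - dst, axis=1)
        inliers = err < tol
        if inliers.sum() > best.sum():               # keep the best consensus
            best = inliers
    return best
```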

  In step S16, the affine parameter acquisition unit 76 obtains the affine parameters between the model image of interest and each similar model image from the remaining match pairs, and supplies the calculated affine parameters to the affine transformation processing unit 77, together with the model image of interest, the similar model images having a predetermined number or more of match pairs with the model image of interest, and the match pair information after outlier removal.

  In step S17, the affine transformation processing unit 77 transforms each supplied similar model image with the affine parameters, correcting it into an image as if captured in the same size and from the same imaging direction as the model image of interest. The affine transformation processing unit 77 supplies the model image of interest, the affine-transformed similar model images having a predetermined number or more of match pairs with the model image of interest, and the match pair information after outlier removal to the correlation image generation unit 78.
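
  Steps S16 and S17 together amount to estimating affine parameters from the surviving match pairs and warping the similar model image into the frame of the model image of interest. A minimal sketch using OpenCV (a library of our choosing; the patent does not name one) might look as follows.

```python
import cv2
import numpy as np

def align_to_target(similar_img, src_pts, dst_pts, target_shape):
    """src_pts: matched points in the similar model image; dst_pts: the
    corresponding points in the model image of interest ((N, 2) arrays)."""
    # Estimate the 2x3 affine parameters robustly from the match pairs.
    M, inliers = cv2.estimateAffine2D(
        np.asarray(src_pts, np.float32), np.asarray(dst_pts, np.float32),
        method=cv2.RANSAC)
    h, w = target_shape[:2]
    aligned = cv2.warpAffine(similar_img, M, (w, h))  # apply the affine warp
    return aligned, inliers
```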

  In step S18, the correlation image generation unit 78 calculates, for each feature point of the model image of interest, the correlation value (feature amount distance) between its feature amount and that of the affine-transformed similar model image at the same position, generates a correlation image between the model image of interest and the similar model image, and supplies the correlation image, that is, information indicating the correlation value obtained for each feature point, to the identification feature point extraction unit 79.

  In step S19, based on the correlation value of each feature point of the model image of interest with the similar model images, the identification feature point extraction unit 79 extracts, as identification feature points having a high capability of discriminating the model from the similar model images, the feature points whose correlation value is lower than a certain threshold, or a predetermined number of feature points in ascending order of correlation value, supplies them to the identification feature point storage unit 92 of the determination processing unit 62 for registration, and the process ends.
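
  Under assumed data layouts, the selection in steps S18 and S19 reduces to the sketch below: compute a per-point correlation between the target model image's local descriptors and those of the aligned similar model image at the same positions, then keep the least-correlated points. The cosine form of the correlation and both cut-off parameters are illustrative choices, not values from the patent.

```python
import numpy as np

def select_identification_points(points, desc_target, desc_similar,
                                 threshold=0.5, max_points=50):
    """points: (N, 2) feature point coordinates of the model image of
    interest; desc_target, desc_similar: (N, D) descriptors at those
    positions in the target image and in the aligned similar image."""
    a = desc_target / np.linalg.norm(desc_target, axis=1, keepdims=True)
    b = desc_similar / np.linalg.norm(desc_similar, axis=1, keepdims=True)
    corr = np.sum(a * b, axis=1)       # cosine correlation per feature point
    order = np.argsort(corr)           # ascending: least correlated first
    keep = [i for i in order[:max_points] if corr[i] < threshold]
    return points[keep], corr[keep]
```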

  By such processing, in addition to the feature amounts at the feature points of each model image, identification feature points having a high capability of discriminating each model image from its similar model images can be obtained. The determination processing unit 62 in which such identification feature points are registered can execute identification processing even for similar model images.

  In the process of step S19, the identification feature point extraction unit 79 may extract the identification feature points in the similar model images while distinguishing the similar portions, as described with reference to FIGS. 8 to 12, or while layering them.

  Next, with reference to FIG. 14, an identification feature point extraction process 2 that is a second example of a process executed by the identification feature point selection unit 61 of the recognition apparatus 51 will be described.

  In the identification feature point extraction process 1 described with reference to FIG. 13, model images are input sequentially, regardless of whether they are similar, and identification feature points are extracted every time one model image is input. In contrast, the identification feature point extraction process 2 generates sets of similar model images in advance and extracts the identification feature points based on, for example, the average of the correlation values over all combinations of the images included in each set.

  In steps S21 to S25, processing similar to that in steps S11 to S15 in FIG. 13 is executed. That is, the model image acquisition unit 71 acquires model images for identification processing, and the feature amount extraction unit 72 extracts feature points from each model image supplied from the model image acquisition unit 71, extracts local feature amounts from around the feature points, and describes them as vectors. The model image dictionary storage unit 73 then stores each model image supplied from the model image acquisition unit 71 and the local feature amounts, supplied from the feature amount extraction unit 72, corresponding to each model image. The matching processing unit 74 then matches the local feature amounts of the model image of interest against the local feature amounts of each model image currently stored in the model image dictionary storage unit 73 by a technique such as K-NN to obtain match pairs, and the outlier removal unit 75 removes, using, for example, the Hough transform or RANSAC, the mismatched pairs that do not satisfy the geometric constraint from the match pairs detected by the matching processing unit 74.

  In step S26, the outlier removal unit 75 determines whether the matching and outlier removal processing have been completed for all model images subject to the identification feature point extraction process. If it is determined in step S26 that they have not been completed for all model images, the process returns to step S21, and the subsequent processing is repeated.

  If it is determined in step S26 that the matching and outlier removal processing have been completed for all model images, in step S27 the outlier removal unit 75 generates sets of similar model images based on the match pairs remaining after outlier removal, and supplies the image data of the sets of similar model images and the match pairs after outlier removal to the affine parameter acquisition unit 76.

  In step S28, the affine parameter acquisition unit 76 obtains, for all combinations of similar model images, the affine parameters between the model image of interest and each similar model image from the match pairs remaining after outlier removal, and supplies the calculated affine parameters to the affine transformation processing unit 77, together with the model image of interest, the similar model images having a predetermined number or more of match pairs with the model image of interest, and the match pair information after outlier removal.

  In step S29, the affine transformation processing unit 77 transforms, for all combinations of similar model images, each supplied similar model image with the affine parameters, correcting it into an image as if captured in the same size and from the same imaging direction as the model image of interest. The affine transformation processing unit 77 supplies the model image of interest, the affine-transformed similar model images having a predetermined number or more of match pairs with the model image of interest, and the match pair information after outlier removal to the correlation image generation unit 78.

  In step S30, the correlation image generation unit 78 calculates, for all combinations of similar model images, the correlation value (feature amount distance) between the feature amount at each feature point of the model image of interest and that of the affine-transformed similar model image at the same position, generates the correlation images between the model image of interest and the similar model images, and obtains their average. The correlation image generation unit 78 then supplies the average of the correlation images, that is, information indicating the average correlation value obtained for each feature point, to the identification feature point extraction unit 79.

  In step S31, based on the average correlation value of each feature point of the model image of interest with the similar model images, the identification feature point extraction unit 79 extracts, as identification feature points having a high capability of discriminating the model from the similar model images, the feature points whose average correlation value is lower than a certain threshold, or a predetermined number of feature points in ascending order of average correlation value, supplies them to the identification feature point storage unit 92 of the determination processing unit 62 for registration, and the process ends.
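
  The overall flow of process 2 might be sketched as follows; all helper callables here (correlate, pick_lowest) are hypothetical stand-ins, since the patent specifies the behaviour rather than an interface. The point of difference from process 1 is that each model's identification feature points come from correlation values averaged over every other image in its similarity set.

```python
def extract_identification_points_2(group, correlate, pick_lowest):
    """group: list of (model_id, image) pairs already found to be similar.
    correlate(img_a, img_b) -> per-feature-point correlation array of img_a
    against the affine-corrected img_b; pick_lowest(avg) -> chosen indices."""
    result = {}
    for model_id, img in group:
        others = [b for mid, b in group if mid != model_id]
        per_point = [correlate(img, b) for b in others]  # all combinations
        avg = sum(per_point) / len(per_point)            # step S30: average
        result[model_id] = pick_lowest(avg)              # step S31: select
    return result
```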

  By such processing, in addition to the feature amounts at the feature points of each model image, identification feature points having a high capability of discriminating each model image from its similar model images can be obtained. The determination processing unit 62 in which such identification feature points are registered can execute identification processing even for similar model images.

  Also in the process of step S31, the identification feature point extraction unit 79 may extract the identification feature points in the similar model images while distinguishing the similar portions, as described with reference to FIGS. 8 to 12, or while layering them.

  Next, the identification determination process executed by the determination processing unit 62 of the recognition device 51 will be described with reference to the flowchart of FIG.

  In step S41, the query image acquisition unit 93 acquires a query image for identification.

  In step S42, the feature amount extraction unit 94 extracts feature points from the query image supplied from the query image acquisition unit 93, extracts local feature amounts from around the feature points, and describes them as vectors. The feature quantity extraction unit 94 supplies the extracted local feature quantity information to the matching processing unit 95 and the outlier removal unit 96 together with the query image.

  In step S43, the matching processing unit 95 matches the local feature amounts supplied from the feature amount extraction unit 94 against the local feature amounts corresponding to the model images stored in the model image dictionary storage unit 91, for example by a technique such as K-NN, to obtain match pairs. Then, the matching processing unit 95 supplies the query image, the data of the similar model images, which are model images having a predetermined number or more of match pairs with the query image, and the information on the match pairs to the outlier removal unit 96.

  In step S44, the outlier removal unit 96 removes mismatched pairs that do not satisfy the geometric constraint from the match pairs detected by the matching processing unit 95 using Hough transform, RANSAC, or the like.

  In step S45, the outlier removal unit 96 determines whether there is a model image having a matching degree equal to or greater than a predetermined value. If it is determined in step S45 that there is no model image having a matching degree equal to or greater than the predetermined value, the outlier removal unit 96 supplies the determination result to the identification processing unit 100, and the process proceeds to step S51 described later.

  If the outlier removal unit 96 determines in step S45 that there is a model image having a matching degree equal to or greater than the predetermined value, in step S46 the outlier removal unit 96 determines whether a predetermined number or more of model images whose number of match pairs remaining after outlier removal exceeds a predetermined threshold have been extracted, in other words, whether similar model images exist. If it is determined in step S46 that a predetermined number or more of such model images have not been extracted, the outlier removal unit 96 supplies the model images having a matching degree equal to or greater than the predetermined value and the number of match pairs remaining after removal to the identification processing unit 100, and the process proceeds to step S50 described later.

  If it is determined in step S46 that a predetermined number or more of model images have been extracted, in step S47 the outlier removal unit 96 supplies the information on the model images whose number of remaining match pairs is equal to or greater than the predetermined number, that is, the similar model images, the query image, and the match pairs after outlier removal to the affine parameter acquisition unit 97. The affine parameter acquisition unit 97 obtains the affine parameters between the query image and each similar model image from the remaining match pairs, and supplies the query image, the affine parameters, the similar model images, and the match pair information after outlier removal to the affine transformation processing unit 98.

  In step S48, the affine transformation processing unit 98 transforms each supplied similar model image with the affine parameters, correcting it into an image as if captured in the same size and from the same imaging direction as the query image. The affine transformation processing unit 98 supplies the query image, the affine-transformed similar model images, and the match pair information after outlier removal to the identification feature point comparison unit 99.

  In step S49, based on the identification feature points of each model image stored in the identification feature point storage unit 92, the identification feature point comparison unit 99 obtains the feature amounts at the identification feature points of the affine-transformed similar model images, and calculates the correlation values with the feature amounts of the query image at those identification feature points.

  When, in the process of step S19, the identification feature point extraction unit 79 has extracted the identification feature points in the similar model images while distinguishing the similar portions or while layering them, as described with reference to FIGS. 8 to 12, the identification feature point comparison unit 99 calculates the correlation values according to each of the distinguished or layered identification feature points.

  If it is determined in step S46 that a predetermined number or more of model images have not been extracted, or after the processing in step S49 is completed, in step S50 the identification processing unit 100 identifies the model image corresponding to the query image based on the number of match pairs between each model image and the query image supplied from the outlier removal unit 96, or on the correlation values at the identification feature points between each similar model image and the query image supplied from the identification feature point comparison unit 99, and outputs the identification result to the identification result output unit 101.
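
  The branching through steps S45 to S51 can be condensed into the following sketch; the threshold names and score dictionaries are assumptions for illustration. When only one strong candidate remains, the match-pair count decides; when several similar models remain, the correlation at the identification feature points decides instead.

```python
def decide(match_counts, id_point_scores, min_matches=10, min_similar=2):
    """match_counts: {model_id: match pairs left after outlier removal};
    id_point_scores: {model_id: correlation score at the identification
    feature points}, computed only when similar model images were found."""
    candidates = {m: c for m, c in match_counts.items() if c >= min_matches}
    if not candidates:
        return None                                 # step S51: no model fits
    if len(candidates) < min_similar:               # no similar models:
        return max(candidates, key=candidates.get)  # match count decides
    # several similar models: the identification feature points decide
    return max(candidates, key=lambda m: id_point_scores.get(m, 0.0))
```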

  If it is determined in step S45 that there is no model image having a matching degree equal to or greater than the predetermined value, in step S51 the identification processing unit 100 determines, based on the notification from the outlier removal unit 96, that there is no model image corresponding to the query image, and outputs the determination result to the identification result output unit 101.

  After the processing of step S50 or step S51 is completed, in step S52 the identification result output unit 101 outputs the identification result supplied from the identification processing unit 100, for example, by displaying it on a display unit, by outputting it as audio data, or by outputting it to another device via a predetermined transmission path or recording it on a predetermined recording medium, and the processing ends.

  By such processing, even when, for example, the images 11 to 15 shown in FIG. 1 are registered as model images and the query image 111 described with reference to FIG. 4 is input, the image 12 can be identified from among the images 11 to 15, each of which has many feature points matching the query image.

  Through the processing described above, the recognition device 51 can correctly identify which of the similar models a query image corresponds to, even when similar models exist, while maintaining the characteristics of a general object recognizer that is robust to occlusion and the like.

  Here, although the recognition device 51 has been described as one device, the identification feature point selection unit 61 and the determination processing unit 62 may be configured as different devices.

  Further, the extraction of the identification feature points and the image identification processing do not have to be performed continuously, and needless to say, the identification feature point selection unit 61 and the determination processing unit 62 may each be configured as a separate device and installed apart from each other. In other words, even if the device corresponding to the determination processing unit 62, in whose identification feature point storage unit 92 the identification feature points generated by the device corresponding to the identification feature point selection unit 61 are stored, is installed at a location away from the device corresponding to the identification feature point selection unit 61, it can perform the image identification processing independently.

  The series of processes described above can be executed by hardware or by software. When the series of processes is executed by software, the program constituting the software is installed from a recording medium onto a computer incorporated in dedicated hardware or onto, for example, a general-purpose personal computer capable of executing various functions by installing various programs. In this case, the processing described above is executed by a personal computer 500 as shown in FIG. 16.

  In FIG. 16, a CPU (Central Processing Unit) 501 executes various processes according to a program stored in a ROM (Read Only Memory) 502 or a program loaded from a storage unit 508 into a RAM (Random Access Memory) 503. The RAM 503 also stores, as appropriate, data necessary for the CPU 501 to execute the various processes.

  The CPU 501, ROM 502, and RAM 503 are connected to each other via an internal bus 504. An input / output interface 505 is also connected to the internal bus 504.

  Connected to the input/output interface 505 are an input unit 506 including a keyboard and a mouse, an output unit 507 including a display such as a CRT or an LCD and a speaker, a storage unit 508 including a hard disk, and a communication unit 509 including a modem, a terminal adapter, and the like. The communication unit 509 performs communication processing via various networks including telephone lines and CATV.

  A drive 510 is also connected to the input/output interface 505 as necessary; a removable medium 521 such as a magnetic disk, an optical disk, a magneto-optical disk, or a semiconductor memory is mounted as appropriate, and a computer program read from it is installed in the storage unit 508 as necessary.

  When a series of processing is executed by software, a program constituting the software is installed from a network or a recording medium.

  As shown in FIG. 16, this recording medium is composed not only of a package medium consisting of the removable medium 521 on which the program is recorded, distributed separately from the computer itself in order to provide the program to the user, but also of the ROM 502 storing the program and the hard disk included in the storage unit 508, which are provided to the user pre-installed in the apparatus main body.

  Further, in this specification, the steps describing the program recorded on the recording medium include not only processing performed in chronological order according to the described order, but also processing executed in parallel or individually without necessarily being processed in chronological order.

  In the present specification, the system represents the entire apparatus constituted by a plurality of apparatuses.

  The embodiment of the present invention is not limited to the above-described embodiment, and various modifications can be made without departing from the gist of the present invention.

Brief Description of the Drawings

FIG. 1 is a diagram for explaining similar model images. FIG. 2 is a block diagram showing the configuration of a recognition device. FIG. 3 is a diagram for explaining identification feature points. FIG. 4 is a diagram for explaining the identification of a query image. FIGS. 5 to 7 are diagrams for explaining application to telops. FIGS. 8 and 9 are diagrams for explaining examples of similar model images whose similar portions differ. FIG. 10 is a diagram for explaining the classification of identification feature points. FIGS. 11 and 12 are diagrams for explaining the case where similar images are discriminated hierarchically. FIG. 13 is a flowchart for explaining the identification feature point extraction process 1. FIG. 14 is a flowchart for explaining the identification feature point extraction process 2. FIG. 15 is a flowchart for explaining the identification determination process. FIG. 16 is a block diagram showing the configuration of a personal computer.

Explanation of symbols

  DESCRIPTION OF SYMBOLS 51 recognition device, 61 identification feature point selection unit, 62 determination processing unit, 71 model image acquisition unit, 72 feature amount extraction unit, 73 model image dictionary storage unit, 74 matching processing unit, 75 outlier removal unit, 76 affine parameter acquisition unit, 77 affine transformation processing unit, 78 correlation image generation unit, 79 identification feature point extraction unit, 91 model image dictionary storage unit, 92 identification feature point storage unit, 93 query image acquisition unit, 94 feature amount extraction unit, 95 matching processing unit, 96 outlier removal unit, 97 affine parameter acquisition unit, 98 affine transformation processing unit, 99 identification feature point comparison unit, 100 identification processing unit, 101 identification result output unit

Claims (17)

  1. In an information processing apparatus that generates information used to identify an image,
    Model image acquisition means for acquiring a plurality of model images;
    Model image feature quantity extraction means for extracting the feature quantity of the model image acquired by the model image acquisition means;
    Matching means for performing matching processing of the feature amount of the model image extracted by the model image feature amount extraction means;
    and identification feature point extraction means for extracting, in a similar model image that is a model image in which a predetermined number or more of feature amounts match a predetermined model image as a result of the matching by the matching means, feature points having small correlation with the predetermined model image, as identification feature points used for identification between the similar model images; the information processing apparatus comprising these means.
  2. The information processing apparatus according to claim 1, wherein the matching unit includes an outlier removal unit that removes an outlier from a result of the process of matching the feature amount between the model images.
  3. A first conversion unit that performs affine transformation on the similar model image so as to match the predetermined model image;
    The information processing apparatus according to claim 1, wherein the identification feature point extraction unit extracts the identification feature point using the similar model image converted by the first conversion unit.
  4. Recognition image acquisition means for acquiring a recognition image;
    A recognized image feature amount extracting unit that extracts a feature amount of the recognized image acquired by the recognized image acquiring unit;
    and identification means for identifying the similar model image corresponding to the recognition image by obtaining the correlation between the feature amounts of the similar model image and the recognition image at the identification feature points extracted by the identification feature point extraction means; the information processing apparatus according to claim 1, further comprising these means.
  5. A second conversion means for performing an affine transformation on the similar model image so as to match the recognition image;
    The information processing apparatus according to claim 4, wherein the identification unit identifies the similar model image corresponding to the recognition image using the similar model image converted by the second conversion unit.
  6. The information processing apparatus according to claim 1, wherein the identification feature points extracted by the identification feature point extraction means are classified into first identification feature points, at which the model image is identified as a corresponding similar model image as the number of matches increases, and second identification feature points, at which the model image is identified as a corresponding similar model image as the number of matches decreases.
  7. The information processing apparatus according to claim 1, wherein the identification feature point extraction unit extracts the identification feature points hierarchically.
  8. In an information processing method of an information processing apparatus that generates information used to identify an image,
    Acquire multiple model images,
    Extracting the feature quantity of the acquired model image,
    Perform the matching process of the feature amount of the extracted model image,
    and extracting, as a result of the matching, in a similar model image that is a model image in which a predetermined number or more of feature amounts match a predetermined model image, feature points having small correlation with the predetermined model image as identification feature points used for identification between the similar model images; an information processing method including these steps.
  9. A program for causing a computer to execute processing for generating information used to identify an image,
    Acquire multiple model images,
    Extracting the feature quantity of the acquired model image,
    Perform the matching process of the feature amount of the extracted model image,
    and extracting, as a result of the matching, in a similar model image that is a model image in which a predetermined number or more of feature amounts match a predetermined model image, feature points having small correlation with the predetermined model image as identification feature points used for identification between the similar model images; a program that causes a computer to execute processing including these steps.
  10. In an image identification device that stores a plurality of model images and feature amounts of the model images, receives an input of a recognition image, and identifies the model image corresponding to the recognition image,
    Storage means for storing identification feature points, which are feature points having small correlation between similar model images, the similar model images being, among the model images, a plurality of model images whose feature amounts match any of the other model images at a predetermined number or more of points;
    Recognition image acquisition means for acquiring the recognition image;
    A recognized image feature amount extracting unit that extracts a feature amount of the recognized image acquired by the recognized image acquiring unit;
    and identification means for identifying the similar model image corresponding to the recognition image by obtaining the correlation between the feature amounts of the similar model image and the recognition image at the identification feature points stored by the storage means; the image identification device comprising these means.
  11. Further comprising conversion means for performing affine transformation on the similar model image so as to match the recognition image,
    The image identification device according to claim 10, wherein the identification unit identifies the similar model image corresponding to the recognition image using the similar model image converted by the conversion unit.
  12. The image identification device according to claim 10, further comprising matching means for performing feature amount matching processing between the recognition image acquired by the recognition image acquisition means and the stored model images,
    wherein, when, as a result of the matching by the matching means, the number of model images whose feature amounts match those of the recognition image at a predetermined threshold or more of points is equal to or greater than a predetermined number, the identification means identifies the similar model image corresponding to the recognition image by obtaining the correlation between the feature amounts of the similar model images and the recognition image at the identification feature points stored by the storage means.
  13. The image identification device according to claim 12, wherein the matching unit includes an outlier removal unit that removes an outlier from the result of the matching processing of the feature amount between the model images.
  14. The image identification device according to claim 10, wherein the identification feature points stored by the storage means are classified into first identification feature points, at which the model image is identified as a corresponding similar model image as the number of matches increases, and second identification feature points, at which the model image is identified as a corresponding similar model image as the number of matches decreases.
  15. The image identification device according to claim 10, wherein the identification feature points stored by the storage means have a hierarchical structure.
  16. In an image identification method of an image identification device that includes a first storage unit that stores a plurality of model images and feature amounts of the model images, and a second storage unit that stores identification feature points, which are feature points having small correlation between similar model images, the similar model images being a plurality of model images whose feature amounts match any of the other model images at a predetermined number or more of points, and that receives an input of a recognition image and identifies the model image corresponding to the recognition image,
    Obtaining the recognition image;
    Extracting the feature quantity of the acquired recognition image;
    An image identification method including the step of identifying the similar model image corresponding to the recognition image by obtaining the correlation between the feature amounts of the similar model image and the recognition image at the identification feature points.
  17. A program for causing a computer to execute processing for identifying the model image corresponding to an input recognition image, using information stored in a first storage unit that stores a plurality of model images and feature amounts of the model images, and in a second storage unit that stores identification feature points, which are feature points having small correlation between similar model images, the similar model images being a plurality of model images whose feature amounts match any of the other model images at a predetermined number or more of points,
    Obtaining the recognition image;
    Extracting the feature quantity of the acquired recognition image;
    A program that causes a computer to execute processing including the step of identifying the similar model image corresponding to the recognition image by obtaining the correlation between the feature amounts of the similar model image and the recognition image at the identification feature points.
JP2007285164A 2007-11-01 2007-11-01 Image identification apparatus, image identification method, and program Expired - Fee Related JP5018404B2 (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
JP2007285164A JP5018404B2 (en) 2007-11-01 2007-11-01 Image identification apparatus, image identification method, and program

Applications Claiming Priority (3)

Application Number Priority Date Filing Date Title
JP2007285164A JP5018404B2 (en) 2007-11-01 2007-11-01 Image identification apparatus, image identification method, and program
CN 200810168388 CN101425144B (en) 2007-11-01 2008-10-30 Information processing apparatus, information processing method, image identifying apparatus, image identifying method, and program
US12/261,190 US8374437B2 (en) 2007-11-01 2008-10-30 Information processing apparatus, information processing method, image identifying apparatus, image identifying method, and program

Publications (3)

Publication Number Publication Date
JP2009116385A true JP2009116385A (en) 2009-05-28
JP2009116385A5 JP2009116385A5 (en) 2010-11-04
JP5018404B2 JP5018404B2 (en) 2012-09-05

Family

ID=40615748

Family Applications (1)

Application Number Title Priority Date Filing Date
JP2007285164A Expired - Fee Related JP5018404B2 (en) 2007-11-01 2007-11-01 Image identification apparatus, image identification method, and program

Country Status (3)

Country Link
US (1) US8374437B2 (en)
JP (1) JP5018404B2 (en)
CN (1) CN101425144B (en)

Cited By (13)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
JP2011008507A (en) * 2009-06-25 2011-01-13 Kddi Corp Image retrieval method and system
WO2012014403A1 (en) * 2010-07-30 2012-02-02 パナソニック株式会社 Detection device and method for transition area in space
JP2013008093A (en) * 2011-06-22 2013-01-10 Canon Inc Image recognition device, image recognition method and program
WO2014030400A1 (en) * 2012-08-23 2014-02-27 日本電気株式会社 Object discrimination device, object discrimination method, and program
JP2014120156A (en) * 2013-11-12 2014-06-30 Team Lab Inc Medicine recognition device, medicine recognition method, and medicine recognition program
WO2014103297A1 (en) * 2012-12-28 2014-07-03 日本電気株式会社 Object identification device, method, and storage medium
KR101528017B1 (en) * 2010-03-11 2015-06-10 퀄컴 인코포레이티드 Image feature detection based on application of multiple feature detectors
EP2919163A2 (en) 2014-03-14 2015-09-16 Omron Corporation Image processing device, image processing method, and image processing program
JP2016014914A (en) * 2014-06-30 2016-01-28 キヤノン株式会社 Image processor, image processing method and program
JP2016038790A (en) * 2014-08-08 2016-03-22 Kddi株式会社 Image processor and image feature detection method thereof, program and device
JP2016110477A (en) * 2014-12-09 2016-06-20 エヌ・ティ・ティ・コムウェア株式会社 Information processor, information processing method, and program
JP2016541052A (en) * 2013-11-14 2016-12-28 シクパ ホルディング ソシエテ アノニムSicpa Holding Sa Image analysis to certify products
US10055670B2 (en) 2014-03-14 2018-08-21 Omron Corporation Image recognition device, image sensor, and image recognition method using feature

Families Citing this family (14)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
JP5018404B2 (en) * 2007-11-01 2012-09-05 ソニー株式会社 Image identification apparatus, image identification method, and program
US20110103965A1 (en) * 2009-10-30 2011-05-05 General Electric Company Wind turbine blades
CN101839722A (en) * 2010-05-06 2010-09-22 南京航空航天大学 Method for automatically recognizing target at medium and low altitudes and positioning carrier with high accuracy
ES2384928B1 (en) * 2010-06-25 2013-05-20 Telefónica, S.A. Method and system for the rapid and robust identification of specific products in images.
JP5131414B2 (en) * 2011-05-17 2013-01-30 パナソニック株式会社 Refrigerator dismantling method and refrigerator dismantling device
JP6098065B2 (en) * 2011-09-13 2017-03-22 株式会社リコー Image inspection apparatus, image inspection method, and program
JP2013156722A (en) * 2012-01-27 2013-08-15 Sony Corp Image processing device, image processing method, learning device, learning method and program
US9430876B1 (en) * 2012-05-10 2016-08-30 Aurasma Limited Intelligent method of determining trigger items in augmented reality environments
US9066200B1 (en) * 2012-05-10 2015-06-23 Longsand Limited User-generated content in a virtual reality environment
US9269022B2 (en) * 2013-04-11 2016-02-23 Digimarc Corporation Methods for object recognition and related arrangements
WO2014178241A1 (en) * 2013-05-02 2014-11-06 コニカミノルタ株式会社 Image processing device, image processing method, and image processing program
CA2922081C (en) * 2013-08-28 2016-12-06 Ricoh Company, Ltd. Image processing apparatus, image processing method, and imaging system
US10210427B2 (en) * 2014-07-09 2019-02-19 Slyce Acquisition Inc. Systems, methods, and devices for image matching and object recognition in images
US10121237B1 (en) * 2017-04-17 2018-11-06 Rohr, Inc. Component inspection method

Citations (8)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
JPH01102688A (en) * 1987-10-15 1989-04-20 Nec Corp Pattern recognition system
JP2000123186A (en) * 1998-10-14 2000-04-28 Canon Inc Object recognizing device
JP2002183667A (en) * 2000-12-12 2002-06-28 Ricoh Co Ltd Character-recognizing device and recording medium
JP2004185264A (en) * 2002-12-03 2004-07-02 Canon Inc Character recognition method
JP2004326693A (en) * 2003-04-28 2004-11-18 Sony Corp Image recognition device and method as well as robot device
JP2005196678A (en) * 2004-01-09 2005-07-21 Neucore Technol Inc Template matching method, and objective image area extracting device
JP2006190201A (en) * 2005-01-07 2006-07-20 Sony Corp Image processing system, learning device, method and program
JP2006252272A (en) * 2005-03-11 2006-09-21 Casio Comput Co Ltd Image identification device, image identification processing program, image identification processing method, and data structure of registerd image data

Family Cites Families (11)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US6453069B1 (en) * 1996-11-20 2002-09-17 Canon Kabushiki Kaisha Method of extracting image from input image using reference image
US7417645B2 (en) 2003-03-27 2008-08-26 Microsoft Corporation Markup language and object model for vector graphics
EP1697877B1 (en) * 2003-12-16 2009-09-16 Canon Kabushiki Kaisha Pattern identification method, apparatus, and program
US8050491B2 (en) * 2003-12-17 2011-11-01 United Technologies Corporation CAD modeling system and method
CA2455359C (en) 2004-01-16 2013-01-08 Geotango International Corp. System, computer program and method for 3d object measurement, modeling and mapping from single imagery
JP2005210526A (en) 2004-01-23 2005-08-04 Konica Minolta Photo Imaging Inc Image processing apparatus, method, and program, image pickup device, and image data outputting method and program
US7706603B2 (en) * 2005-04-19 2010-04-27 Siemens Corporation Fast object detection for augmented reality systems
JP4653606B2 (en) * 2005-05-23 2011-03-16 株式会社東芝 Image recognition apparatus, method and program
US7856125B2 (en) * 2006-01-31 2010-12-21 University Of Southern California 3D face reconstruction from 2D images
US8254690B2 (en) * 2007-10-19 2012-08-28 Sony Corporation Information processing apparatus, information processing method, and program
JP5018404B2 (en) * 2007-11-01 2012-09-05 ソニー株式会社 Image identification apparatus, image identification method, and program

Patent Citations (8)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
JPH01102688A (en) * 1987-10-15 1989-04-20 Nec Corp Pattern recognition system
JP2000123186A (en) * 1998-10-14 2000-04-28 Canon Inc Object recognizing device
JP2002183667A (en) * 2000-12-12 2002-06-28 Ricoh Co Ltd Character-recognizing device and recording medium
JP2004185264A (en) * 2002-12-03 2004-07-02 Canon Inc Character recognition method
JP2004326693A (en) * 2003-04-28 2004-11-18 Sony Corp Image recognition device and method as well as robot device
JP2005196678A (en) * 2004-01-09 2005-07-21 Neucore Technol Inc Template matching method, and objective image area extracting device
JP2006190201A (en) * 2005-01-07 2006-07-20 Sony Corp Image processing system, learning device, method and program
JP2006252272A (en) * 2005-03-11 2006-09-21 Casio Comput Co Ltd Image identification device, image identification processing program, image identification processing method, and data structure of registerd image data

Cited By (21)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
JP2011008507A (en) * 2009-06-25 2011-01-13 Kddi Corp Image retrieval method and system
KR101528017B1 (en) * 2010-03-11 2015-06-10 퀄컴 인코포레이티드 Image feature detection based on application of multiple feature detectors
US9064171B2 (en) 2010-07-30 2015-06-23 Panasonic Intellectual Property Management Co., Ltd. Detection device and method for transition area in space
WO2012014403A1 (en) * 2010-07-30 2012-02-02 パナソニック株式会社 Detection device and method for transition area in space
JP2013008093A (en) * 2011-06-22 2013-01-10 Canon Inc Image recognition device, image recognition method and program
US9002115B2 (en) 2011-06-22 2015-04-07 Canon Kabushiki Kaisha Dictionary data registration apparatus for image recognition, method therefor, and program
WO2014030400A1 (en) * 2012-08-23 2014-02-27 日本電気株式会社 Object discrimination device, object discrimination method, and program
US9679221B2 (en) 2012-08-23 2017-06-13 Nec Corporation Object identification apparatus, object identification method, and program
JPWO2014030400A1 (en) * 2012-08-23 2016-07-28 日本電気株式会社 Object identification device, object identification method, and program
JPWO2014103297A1 (en) * 2012-12-28 2017-01-12 日本電気株式会社 Object identification device, method and program
WO2014103297A1 (en) * 2012-12-28 2014-07-03 日本電気株式会社 Object identification device, method, and storage medium
US9633278B2 (en) 2012-12-28 2017-04-25 Nec Corporation Object identification device, method, and storage medium
JP2014120156A (en) * 2013-11-12 2014-06-30 Team Lab Inc Medicine recognition device, medicine recognition method, and medicine recognition program
JP2016541052A (en) * 2013-11-14 2016-12-28 Sicpa Holding SA Image analysis to certify products
JP2015176272A (en) * 2014-03-14 2015-10-05 オムロン株式会社 Image processor, image processing method, and image processing program
US9600893B2 (en) 2014-03-14 2017-03-21 Omron Corporation Image processing device, method, and medium for discriminating a type of input image using non-common regions
EP2919163A2 (en) 2014-03-14 2015-09-16 Omron Corporation Image processing device, image processing method, and image processing program
US10055670B2 (en) 2014-03-14 2018-08-21 Omron Corporation Image recognition device, image sensor, and image recognition method using feature
JP2016014914A (en) * 2014-06-30 2016-01-28 キヤノン株式会社 Image processor, image processing method and program
JP2016038790A (en) * 2014-08-08 2016-03-22 Kddi株式会社 Image processor and image feature detection method thereof, program and device
JP2016110477A (en) * 2014-12-09 2016-06-20 エヌ・ティ・ティ・コムウェア株式会社 Information processor, information processing method, and program

Also Published As

Publication number Publication date
CN101425144A (en) 2009-05-06
US8374437B2 (en) 2013-02-12
JP5018404B2 (en) 2012-09-05
CN101425144B (en) 2012-05-23
US20090141984A1 (en) 2009-06-04

Similar Documents

Publication Publication Date Title
Setlur et al. Automatic image retargeting
Cheng et al. Custom pictorial structures for re-identification.
US8750578B2 (en) Detecting facial expressions in digital images
Achanta et al. Salient region detection and segmentation
US8411910B2 (en) Computationally efficient feature extraction and matching iris recognition
US8582887B2 (en) Image processing system, learning device and method, and program
KR101808467B1 (en) Feature extraction and matching and template update for biometric authentication
Gllavata et al. Text detection in images based on unsupervised classification of high-frequency wavelet coefficients
US7031553B2 (en) Method and apparatus for recognizing text in an image sequence of scene imagery
US8897575B2 (en) Multi-scale, perspective context, and cascade features for object detection
US20070242856A1 (en) Object Recognition Method and Apparatus Therefor
JP4284288B2 (en) Pattern recognition apparatus and method
Lei et al. An efficient 3D face recognition approach based on the fusion of novel local low-level features
US20150294469A1 (en) Image processing device and image processing method, and program
DE60114469T2 (en) Method and device for determining interesting images and for image transmission
US7929765B2 (en) Video text processing apparatus
JP4739355B2 (en) Fast object detection method using statistical template matching
JP5054122B2 (en) Improved image identification
JP2008530701A (en) Face feature detection method
Nishiyama et al. Facial deblur inference using subspace analysis for recognition of blurred faces
US9076070B2 (en) Methods and apparatus to detect differences between images
US8126206B2 (en) Image processing apparatus, image processing method, and program
US7995805B2 (en) Image matching apparatus, image matching method, computer program and computer-readable storage medium
US20120045095A1 (en) Image processing apparatus, method thereof, program, and image capturing apparatus
JP2012038106A (en) Information processor, information processing method and program

Legal Events

Date | Code | Title | Description
2010-09-15 | A621 | Written request for application examination | JAPANESE INTERMEDIATE CODE: A621
2010-09-15 | A521 | Written amendment | JAPANESE INTERMEDIATE CODE: A523
2011-10-27 | A977 | Report on retrieval | JAPANESE INTERMEDIATE CODE: A971007
2011-11-17 | A131 | Notification of reasons for refusal | JAPANESE INTERMEDIATE CODE: A131
2011-12-16 | A521 | Written amendment | JAPANESE INTERMEDIATE CODE: A523
 | TRDD | Decision of grant or rejection written |
2012-05-15 | A01 | Written decision to grant a patent or to grant a registration (utility model) | JAPANESE INTERMEDIATE CODE: A01
2012-05-28 | A61 | First payment of annual fees (during grant procedure) | JAPANESE INTERMEDIATE CODE: A61
 | FPAY | Renewal fee payment (prs date is renewal date of database) | PAYMENT UNTIL: 2015-06-22; Year of fee payment: 3
 | LAPS | Cancellation because of no payment of annual fees |